We can probably credit Nintendo for kicking off the most recent push to advance motion control gaming. Indeed, the Nintendo Wii was the first gaming console to bring motion control to the mainstream public.
However, I think Microsoft significantly evolved the technology with its Kinect motion control system, which requires nothing but the player's own body to control a game.
Although Redmond has already fielded a Kinect for Windows adaptation based on the classic sensor, the company recently showcased a new Kinect-like system that uses a notebook's integrated speaker and microphone to operate.
Microsoft calls the new system SoundWave, as it uses the Doppler effect to detect a user's movement and gestures made near the computer. Essentially, the speaker is programmed to emit an inaudible ultrasonic tone in the 18 to 22 kHz range; when the user moves, the motion shifts the frequency of the reflected sound, and the microphone picks up that shift to infer position and gestures.
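To get a feel for the physics involved, here is a minimal sketch (not Microsoft's actual implementation) of the two-way Doppler shift a moving hand would impose on a reflected ultrasonic tone; the function name and numbers are illustrative assumptions:

```python
# Illustrative sketch: round-trip Doppler shift of an ultrasonic tone
# reflecting off a moving hand. Not based on SoundWave's actual code.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature


def doppler_shift(tone_hz: float, hand_velocity_ms: float) -> float:
    """Approximate frequency shift (Hz) of a tone reflected off a hand.

    For a reflector moving at speed v much smaller than the speed of
    sound c, the round-trip (speaker -> hand -> microphone) shift is
    approximately 2 * v / c * f.
    """
    return 2.0 * hand_velocity_ms / SPEED_OF_SOUND * tone_hz


# A hand moving toward the laptop at 0.5 m/s shifts a 20 kHz tone by
# a few tens of hertz -- small, but detectable in a frequency analysis
# of the microphone signal.
print(f"{doppler_shift(20_000, 0.5):.1f} Hz")
```

The takeaway is that even slow hand motion produces a measurable frequency shift, which is why an ordinary laptop speaker and microphone can suffice without dedicated sensors.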
The primary advantage of a system like this? Computer manufacturers could theoretically equip their notebooks with gesture sensing interfaces without having to add any additional sensors or peripherals.
One example of how such a system could potentially be leveraged? A laptop or desktop that automatically locks itself when the user moves away from the screen.
According to Microsoft, the accuracy of the system is between 90 and 100%, even in noisy environments.
However, one question remains - will people use a system like this?
Think about it: Why would the average person bother with gestures when the keyboard is literally beneath their hands?
Personally, I believe the gesture control system would ultimately have to be much more intuitive and easier to use than a keyboard for such a platform to be viable.