What’s trending now?
Natural User Interface (NUI) is technology that lets people use their natural senses and movements to control applications, making technology itself work more naturally with humans. The most famous fictional examples of natural user interfaces are probably the many seen in the Steven Spielberg film ‘Minority Report’.
While we are already used to touch screens and voice-command technology, current innovations are taking NUI to a whole new level. The New York Times reports that within a few years, simple movements, such as the way one shifts in a chair, might replace computer login passwords.
The iPhone 4S came out with Siri, the intelligent voice-command-based personal assistant. Samsung Mobile’s new GS3 phone presents users with its own version, called S-Voice. These voice-command applications are not merely voice-recognition software that requires users to programme their phones with particular keywords: Siri and S-Voice understand natural speech.
Motion, an app that comes with the Samsung Galaxy S3, simplifies the user experience by understanding your motions as a user. For example, if you are busy typing a text message but decide to call the recipient instead, all you need to do is lift the phone to your ear and voilà!
Smart Stay uses facial-recognition technology to detect the user’s eyes, preventing the screen from timing out as long as they remain fixed on it.
Why is it important?
NUI obviously makes using technology a much easier experience. For the first time ever, we don’t have to adapt to how technology works. Instead, technology is now adapting to human behaviour.
Microsoft has invested heavily in NUI research and says a future where technology is almost invisible is possible.
What’s the butterfly effect?
Beyond just sensory control, this technology can help improve education and healthcare. Microsoft’s chief research officer Craig Mundie said last year that the company envisions software that will allow computers to make simple health diagnoses, for example. He showcased a robotic triage nurse, which could help speed up the process of sorting patients according to the urgency of their cases, eliminating the long waits that patients often have to endure in clinic and hospital waiting rooms.
The software giant also reports that its ‘Microsoft Surface’ technology has been used in cerebral palsy therapy research. The tabletop touchscreen interface, first introduced in 2007, allows multiple users to interact with it at the same time. Researchers programmed the table with games and activities designed to encourage stretching and other movements, offering a fun approach to cerebral palsy therapy.
The Pioneers and Global Hotspots
Bill Buxton, a principal researcher at Microsoft, is widely credited as one of the earliest pioneers of NUI technology. Even he, however, acknowledges that touchscreen technology came before his time. Innovations such as the multi-touch interface found on gadgets like the iPhone trace back to the 60s, when IBM first built a touchscreen. In the 80s, Canadian researchers developed a system that used a frosted glass panel with a camera placed behind it. When a finger pressed on the glass, the camera would detect the action and register it as an input. This camera-based optical sensing was succeeded by a system that used capacitance, developed in 1985 by a University of Toronto group that Buxton was part of.
In recent history, California-based start-up Siri Inc developed the personal assistant programme now known as the iPhone’s Siri in 2010. Apple acquired the start-up soon after its first app was launched in the App Store.
By: Sandiso Ngubane
Sandiso is a 25-year-old Johannesburg-based writer and blogger who enjoys connecting the dots.