Deloitte University Press just released a report that discusses the evolution of interaction. The computer mouse was first invented in 1964 by a visionary World War II veteran named Douglas C. Engelbart, though it didn’t become the standard way to control a desktop computer until Apple released the Macintosh in 1984. The modern computer keyboard sprang from the invention of the typewriter by Christopher Latham Sholes in 1868.
Since then we have continued to evolve the human-computer interface. The iPad made a huge splash in bringing the touch interface into the mainstream. Logically, the next evolution was talk. With speech recognition software getting better and better, the voice platform that made the most noise (pun intended) was Apple's popular Siri.
Now the evolution of wearables has ushered in a whole new era of human-computer interfaces. Gesture interfaces, for example, are built into the HoloLens or available as add-ons through devices such as the Leap Motion or the Thalmic Labs Myo. Already we are looking at the next big evolution: body-worn biosensors that let computers respond to us. Sensors can pick up your mood by reading and interpreting data from your body, and eye-tracking sensors can detect where you are looking on the screen and help guide information to your focused field of view.
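To make that idea a little more concrete, here is a minimal, hypothetical sketch in plain Python of how gaze data might steer a user interface. The screen size, region grid, and gaze samples below are all made up for illustration; a real system would read coordinates from an eye-tracker SDK instead.

```python
# Hypothetical sketch: map simulated gaze samples to a screen region
# so a UI could emphasize content where the user is actually looking.
# No real eye-tracker SDK is used; the gaze data below is invented.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen size in pixels
GRID_COLS, GRID_ROWS = 3, 2       # divide the screen into 6 regions

def region_for_gaze(x, y):
    """Return the (col, row) region a gaze point (in pixels) falls into."""
    col = min(int(x / SCREEN_W * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / SCREEN_H * GRID_ROWS), GRID_ROWS - 1)
    return col, row

# Simulated gaze samples (x, y) in pixels -- stand-ins for sensor readings.
gaze_samples = [(1500, 300), (1480, 320), (1510, 290), (600, 800)]

# Count how often each region is looked at and pick the most-watched one.
counts = {}
for x, y in gaze_samples:
    r = region_for_gaze(x, y)
    counts[r] = counts.get(r, 0) + 1

focused_region = max(counts, key=counts.get)
print("Emphasize content in region:", focused_region)
```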
My prediction is that the next big evolution in the human-computer interface will come from augmented cognition. As we understand the brain better, we can start interpreting its signals and feeding them to computer algorithms that, in turn, interact with connected objects capable of data transmission.
Who is up for a game of Pong using just your brain?
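Just for fun, here is a purely hypothetical sketch of what that could look like: a made-up stream of EEG-style readings steers a Pong paddle left or right. None of this uses a real brain-computer interface API; the signal values, thresholds, and paddle units are invented for illustration.

```python
# Purely hypothetical: steer a Pong paddle from a stream of fake
# EEG-style readings. Real brain-computer interfaces are far more
# involved; this only illustrates signal -> command -> paddle position.

PADDLE_MIN, PADDLE_MAX = 0, 100   # paddle position range (arbitrary units)
STEP = 10                          # how far one "thought" moves the paddle

def command_from_signal(left_power, right_power, threshold=0.2):
    """Turn two made-up band-power values into a movement command."""
    if left_power - right_power > threshold:
        return "LEFT"
    if right_power - left_power > threshold:
        return "RIGHT"
    return "HOLD"

# Invented signal stream: (left_power, right_power) pairs.
fake_signals = [(0.9, 0.3), (0.8, 0.4), (0.2, 0.7), (0.5, 0.5), (0.1, 0.9)]

paddle = 50  # start in the middle
for left, right in fake_signals:
    cmd = command_from_signal(left, right)
    if cmd == "LEFT":
        paddle = max(PADDLE_MIN, paddle - STEP)
    elif cmd == "RIGHT":
        paddle = min(PADDLE_MAX, paddle + STEP)
    print(f"signal=({left:.1f}, {right:.1f}) -> {cmd}, paddle at {paddle}")
```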
Exciting times for sure.
Read more here:
https://dupress.deloitte.com/dup-us-en/focus/tech-trends/2016/augmented-and-virtual-reality.html
Check out the diagram below: