Heralding a wave of invisible computing
Updated: 15 Feb 2012, 09:17 PM IST
Mumbai: Imagine this. You slide a finger across your smartphone screen to copy a file onto the finger, literally turning the digit into a human storage device. You later copy that file from the finger onto any screen—be it a laptop, a liquid crystal display (LCD) monitor or any surface, for that matter—by simply touching it.
This isn’t a scene from a sci-fi movie. It’s a technology that India-born Pranav Mistry, a 31-year-old computer scientist doing his PhD with the Media Lab at the Massachusetts Institute of Technology (MIT), Cambridge, demonstrated on Wednesday while delivering the technology keynote address on the second day of the Nasscom India Leadership Summit in Mumbai.
“I often wondered why I couldn’t simply elongate my arms to open the door or switch off the lights of a lamp rather than walk and do these tasks. After all, Indian mythological figures could do that," Mistry said.
“The digital world—laptop, TV, smartphone, e-book reader—all rely upon the cloud (metaphor for the Internet) of information. Sparsh lets you transfer media from a device to your body and pass it to another device by simple touch gestures using the cloud," said Mistry. The lamp, for instance, is connected to the Internet (similar to the ‘Internet of Things’ concept wherein gadgets talk to each other).
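Conceptually, Sparsh behaves like a personal clipboard hosted in the cloud: a “copy" touch stores the media under the user's identity, and a “paste" touch on another device retrieves it. A minimal sketch of that idea—the function names and the dictionary standing in for the cloud are illustrative, not Mistry's actual implementation:

```python
# Illustrative sketch of the Sparsh idea: a per-user "cloud clipboard".
# The dict standing in for the cloud, and all names here, are hypothetical.

CLOUD = {}  # maps user identity -> the item that user last "copied"

def copy_with_touch(user_id, media):
    """A copy gesture on device A stores the media in the cloud under the user."""
    CLOUD[user_id] = media

def paste_with_touch(user_id):
    """A paste gesture on device B fetches whatever the same user copied last."""
    return CLOUD.get(user_id)

# The user "carries" the file between devices; the body is just the key.
copy_with_touch("alice", "holiday_photo.jpg")   # touch on the smartphone
print(paste_with_touch("alice"))                # touch on the laptop screen
```

The point of the sketch is that nothing physically travels through the finger: the body serves as the lookup key, and the network does the transfer.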
Mistry has also developed ‘TeleTouch,’ which allows users to see through the smartphone’s camera and control home appliances like the television, alarm system and music players by simply manipulating them on a screen.
Mistry did not provide further details; the technology is patented and is being developed by some electronics companies for commercial use.
The MIT scientist, who prefers to call himself a designer rather than a computer scientist, is also the creator of the ‘SixthSense’ digital prototype, which he developed under the guidance of associate professor Pattie Maes. The device consists of a pocket projector, mirror and web camera bundled in a wearable, pendant-like gadget.
The projector can turn anything into a touch screen. The webcam (and colour-coded finger-gloves worn on the index finger and thumb) can recognize the movements of a user’s hands, which enables gesture commands. A “square frame" gesture, for instance, will prompt the device to take a photograph.
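The gesture step can be pictured as simple geometry on the tracked fingertip positions. A toy sketch of the “square frame" check—the coordinates and tolerance are invented for illustration, and SixthSense's real vision pipeline is far richer:

```python
# Toy check for the "square frame" gesture: thumb and index fingertips of
# both hands roughly forming the four corners of a rectangle.
# The tolerance value and coordinates are illustrative, not from SixthSense.

def is_frame_gesture(corners, tolerance=10):
    """corners: four (x, y) fingertip positions from the marker tracker."""
    xs = sorted(p[0] for p in corners)
    ys = sorted(p[1] for p in corners)
    # A rough axis-aligned rectangle: two pairs of similar x's and two of y's.
    return (xs[1] - xs[0] <= tolerance and xs[3] - xs[2] <= tolerance
            and ys[1] - ys[0] <= tolerance and ys[3] - ys[2] <= tolerance)

# Fingertips near the corners of a rectangle: trigger the camera.
print(is_frame_gesture([(100, 100), (302, 98), (99, 251), (300, 250)]))  # True
```

In the real device, a recognized frame gesture would then fire the photograph command.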
The device can also recognize a book that the user selects from a bookstore—either by image recognition or radio frequency identification (RFID) tags—and project information, like an Amazon rating, onto it. The system can also project a keyboard to type on, detect items on grocery shelves and compare online prices. A newspaper can prompt the device to search for news video clips (the device’s smartphone uses an Internet connection to retrieve information).
“The possibilities are immense but it’s a work in progress," said Mistry.
The “Sixth Sense" device, which MIT has patented, helped him get closer to his childhood dream of melding the flexibility of the digital world with the physical one. The current prototype system costs some $350.
Mistry has created other technologies too. Consider this. You and your spouse want to watch a movie on a TV screen or in the theatre. She loves romantic movies but you prefer action ones. So you enter the theatre with a special pair of glasses each, and both come out content—you having watched a martial arts movie while she has had her fill of romantic cinema.
Called “thirdEye", the technique enables multiple viewers to see different things on the same display screen simultaneously.
“With thirdEye, we can have a public sign board where a tourist at the New Delhi airport sees all the instructions in his language while others see it in their own languages. We don’t need to have the split screen in games now. Each player can see his/her personal view of the game on the TV screen. Two people watching TV can watch their favourite channel on a single TV screen. A public display can show secret messages or patterns," said Mistry, adding that imagination is the only limitation.
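The article does not say how thirdEye works internally, but a common way to build such a display is time multiplexing: the screen rapidly interleaves frames from several sources, and each pair of glasses passes only the frames meant for its wearer. A hedged sketch of that general idea (not thirdEye's actual mechanism):

```python
# Hypothetical sketch of frame interleaving for a multi-viewer display.
# thirdEye's real mechanism isn't detailed in the article; this only
# illustrates the generic time-multiplexing approach.

def interleave(streams):
    """Merge per-viewer frame streams into one display sequence."""
    display = []
    for frames in zip(*streams):
        display.extend(frames)
    return display

def frames_for_viewer(display, viewer, n_viewers):
    """Glasses synced to slot `viewer` pass only every n-th frame."""
    return display[viewer::n_viewers]

action = ["kick1", "kick2", "kick3"]
romance = ["kiss1", "kiss2", "kiss3"]
shown = interleave([action, romance])
print(frames_for_viewer(shown, 0, 2))  # the action fan's view
print(frames_for_viewer(shown, 1, 2))  # the romance fan's view
```

Because the interleaving runs faster than the eye can follow, each viewer perceives only their own continuous stream.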
Brought up in Gujarat, Mistry did his master’s in design in visual communication from the Indian Institute of Technology-Bombay, and later worked with the Microsoft India Development Centre—first as an intern, then as an employee—on several projects including Akshar, which was basically an attempt to create a mechanism for inputting Indic scripts in digital devices like mobile phones, kiosks, interactive TVs and personal computers.
One of Mistry’s earlier projects, Sandesh, attempted to bridge the digital divide. It comprises a message-receiving unit in villages and kiosks in cities with visual aids, and uses print- or sound-based media to convey messages. And his “mouseless" technology does away with the need for an external computer mouse. The device consists of an infrared (IR) laser beam and an IR camera—both embedded in the computer. Users cup their hand, just as they would if a physical mouse were present underneath, and the laser beam lights up the parts of the hand in contact with the surface.
The IR camera detects the bright IR blobs using computer vision. Changes in the position and arrangement of these blobs are interpreted as mouse cursor movements and clicks. When the user taps an index finger, the size of the blob changes and the camera recognizes the intended mouse click. It costs around $20 to build a fully functional prototype of “Mouseless".
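The blob-tracking step described above can be sketched in a few lines: threshold the IR frame, take the centroid of the bright pixels as the cursor position, and treat a sudden jump in blob size as a click. The frames and thresholds below are made up for illustration; the real Mouseless pipeline works on an actual IR camera feed:

```python
# Toy version of the Mouseless blob tracking described above.
# Frames are small grids of IR intensities (0-255); all values are invented.

def blob_stats(frame, threshold=200):
    """Return (centroid, size) of pixels brighter than the threshold."""
    bright = [(x, y) for y, row in enumerate(frame)
                     for x, v in enumerate(row) if v >= threshold]
    if not bright:
        return None, 0
    cx = sum(x for x, _ in bright) / len(bright)
    cy = sum(y for _, y in bright) / len(bright)
    return (cx, cy), len(bright)

def interpret(prev, curr, size_jump=3):
    """Blob movement -> cursor delta; sudden size growth -> click."""
    (p_pos, p_size), (c_pos, c_size) = prev, curr
    delta = (c_pos[0] - p_pos[0], c_pos[1] - p_pos[1])
    click = (c_size - p_size) >= size_jump
    return delta, click

frame1 = [[0, 0, 0, 0],
          [0, 255, 0, 0],
          [0, 0, 0, 0]]
frame2 = [[0, 0, 0, 0],          # finger tapped: blob moved and grew
          [0, 0, 255, 255],
          [0, 0, 255, 255]]
delta, click = interpret(blob_stats(frame1), blob_stats(frame2))
print(delta, click)  # (1.5, 0.5) True
```

The cursor delta moves the pointer, and the size jump registers the tap as a click—the same interpretation the article attributes to the IR camera.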
Mistry said all these technologies highlight a trend called “invisible computing". “When any industry becomes mature, you move to the next level. Similarly computers will stay but move into the background becoming invisible," he said.