Many futurists envision a world where computing isn’t limited to desktops and mobile devices but rather a ubiquitous function of everyday items — appliances, cars, coffee mugs, clothing, sprinkler systems — all networked into an “Internet of everything.”
Prototypes of the software-based electronic devices that will define this emerging world are being built and tested at the Texas A&M Embodied Interaction Laboratory (TEIL) in the College of Architecture. Established with a $326,000 grant from the National Science Foundation, the new lab is headed by Francis Quek, professor of visualization.
Consider this scenario: While returning home from work, you consult your home computer for dinner recommendations, which will be based on the contents of your fridge and food pantry. Your home network “knows” what’s available because each food item has an identifying tag. And, because all the food items are networked to the oven and microwave, they can provide preparation and cooking time estimates for each option.
Twenty minutes prior to your arrival, your home air conditioner, apprised of your destination and sensing your approach, activates to optimize the temperature.
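At its core, the scenario above is a matching problem: tagged pantry items feed a recipe matcher that reports preparation-time estimates. The following Python sketch is purely illustrative; the names, recipes, and times are invented for this example and are not taken from the lab's work.

```python
# Hypothetical sketch of the networked-pantry scenario.
# PantryItem, RECIPES, and suggest_dinner are illustrative names only.

from dataclasses import dataclass

@dataclass
class PantryItem:
    """A food item carrying an identifying tag, as in the scenario."""
    tag_id: str
    name: str

# Each recipe maps a dish to its required items and an estimated
# cooking time in minutes (values are made up for illustration).
RECIPES = {
    "pasta primavera": ({"pasta", "zucchini", "tomato"}, 25),
    "omelette": ({"eggs", "cheese"}, 10),
}

def suggest_dinner(pantry):
    """Return (dish, cook_minutes) options whose ingredients are all on hand."""
    available = {item.name for item in pantry}
    return [(dish, minutes)
            for dish, (needed, minutes) in RECIPES.items()
            if needed <= available]  # set inclusion: all ingredients present

pantry = [PantryItem("tag-01", "eggs"), PantryItem("tag-02", "cheese"),
          PantryItem("tag-03", "tomato")]
print(suggest_dinner(pantry))  # only the omelette's ingredients are all present
```

In a real deployment the pantry list would be populated from the item tags over the home network rather than typed in by hand.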
Interconnectivity like this, among objects, machines and people, will require electronic sensors and devices like those Quek and his students are creating.
These gadgets will be developed for a very broad range of applications, including teaching and learning, “smart” home technology, aids for people with visual or hearing impairments, and many more.
“It’s hard to answer the question ‘What is the one thing we are going to make?’” said Quek. “The lab is going to be a wild and woolly place, a space for students to imagine, build and test anything.”
TEIL faculty and students will work in an environment that draws its inspiration from the maker movement, which joins individuals in engineering-oriented pursuits in a social environment that features prototyping, invention and creativity.
The movement seeks to upend the notion that only giant companies are capable of creating the devices needed for ubiquitous computing. One example is the Wiimote, the motion-sensing controller for Nintendo’s Wii video game console, which since its release has been widely repurposed for uses unrelated to the game system.
In the past, said Quek, only after a device such as the Wiimote was released could students explore and test how it might be used in other applications, such as creating digital art.
“Students should think, ‘I can make that too … I can build a quick prototype and test it,’” said Quek. “With maker movement thinking and the commodity electronics available in the lab or online, it can be done really quickly and inexpensively.”
The lab is also developing assistive technologies for people with disabilities, such as the visually and hearing impaired, who often cannot readily take advantage of rapidly emerging technologies.
With $302,000 from another NSF grant, Quek is developing a new method to aid sight-impaired readers using an iPad with a tactile overlay on its screen. The overlay, called a Spatial Touch Audio Annotator and Reader, allows users to navigate an electronic book and hear its contents read aloud.
The device, he said, promises to be a better reading aid to the visually impaired than Braille, which can be cumbersome in an academic setting.
“We had a blind student whose physics textbook, when converted to Braille, took up 20 volumes of 11-by-14-inch pages,” he said. “What kind of warehouse would such a student need just to go to school?”
Such a device also promises to aid students in primary and secondary school settings where visually impaired students, fewer than 10 percent of whom read Braille, are no longer separated from the rest of the school population, said Quek.