Creating a considerate context
Ted Selker, CMU Silicon Valley
Notice: Hosted by Richard Waldinger
Date: Thursday March 11, 2010 at 16:00
Location: Building A, Conference Room B (note the unusual location) (Directions)
We are now poised to create a world where objects with computers in them can recognize our looks, feelings, and actions to simplify how we work with them.
Keyboards and mice will not continue to dominate computer user interfaces. Keyboard input will be replaced in large measure by systems that know what we want and require less explicit communication. Sensors are gaining the fidelity and ubiquity to record presence and actions; they will notice when we enter a space, sit down, lie down, pump iron, and so on.
This talk will present examples in which our intentions can be understood and acted on by computers. The work reaches across domains to demonstrate that human intentions can be recognized and respected in many complex natural scenarios. Instead of relying solely on explicit input, systems are starting to improve their actions by watching the implicit actions of their users. We will discuss how scenario, design, and careful use of sensors and effectors produced a helmet that mediates communication between a bicyclist and the environment. We will discuss how the use of knowledge enabled cell phones to improve the shopping experience for a person bringing a child along to a grocery store. We will discuss how a simple data-gathering approach allowed cell phones in the Mobile Essence system to improve meeting memory. We will discuss how experiments with cell phones show the value of a disruption manager that uses AI and machine learning.
Creating a considerate world
No longer will the sensors we develop simply be part of a control system. New systems will have to build social awareness into their feedback and into their attempts to actuate and affect things in the world. We are beginning to create a world where objects with computers in them can recognize our feelings and actions to simplify how we work with them. Context-aware systems can recognize human intentions, making capabilities available as needed and reducing interruptions and disruption when they aren't. My Context Aware Computing and intelligent-kitchen work show new ways in which design can take human needs and abilities into account. Examples will show how everything from beds to kitchens to the way we acquire things can be changed to be more appropriate as we work to improve our lives without complicating them further.
People have always loved to do as much as they can and to complain about it being nerve-wracking. Today, the video, email, IM, and other communication channels we keep open seem to make it harder than ever to concentrate. For many years, my students and I have been exploring how to model and modulate communication to improve concentration. We have built and tested systems in the kitchen, in the car, and across many approaches to desktop communication. We experiment with allowing people to choose preferences for being interrupted in an interruption manager. We have learned that different modalities of feedback have different effects on people. Systems such as CarCoach show us that we can improve people's performance by paying attention to the timing and quantity of system feedback. Disruption Manager demonstrates that automatically mediating communication based on a cognitive model can improve human performance. With such new tools, people can begin to create a world where systems recognize our needs through our actions to reduce unnecessary complexity in how we work.
Dr. Ted Selker came to the MIT Media Lab in September of 1998. From 1999 until he left in June 2008, he was an Associate Professor and the Director of the Context Aware Computing Lab. His Context Aware Computing group created 48 research platforms to demonstrate that systems can recognize and respect human desires and intentions across many natural scenarios. The group is recognized for its work in creating environments that use sensors and artificial intelligence to create so-called virtual sensors: adaptive models of users that enable keyboardless computer scenarios. Ted also directed Counter Intelligence, a forum discussing kitchens and domestic technology, lifestyles, and supply chains as a result of technology. Ted created the Industrial Design Intelligence forum to discuss the need to understand cognitive science and quantitative experiments in doing product design. Additionally, from March 2004 to June 2008, Ted served as co-Director of the MIT/Caltech Voting Project.
Prior to joining the MIT faculty, Ted directed the User Systems Ergonomics Research Lab at IBM Research, where he became an IBM Fellow in 1996. He has served as a consulting professor at Stanford University; taught at Hampshire College, the University of Massachusetts at Amherst, and Brown University; and worked at Xerox PARC and Atari Research Labs. Ted's research has contributed to products ranging from notebook computers to operating systems. For example, his design of the TrackPoint in-keyboard pointing device is used in many notebook computers, his visualizations have been responsible for performance and usability improvements in products, and his adaptive help system was the basis of products as well. Ted's work has resulted in numerous awards, patents, and papers and has often been featured in the press.
Ted was co-recipient of the Computer Science Policy Leader Award for the Scientific American 50 in 2004 and of the American Association for People with Disabilities' Thomas Paine Award for his work on voting technology in 2006.
Please arrive at least 10 minutes early in order to sign in and be escorted to the conference room. SRI is located at 333 Ravenswood Avenue in Menlo Park. Visitors may park in the visitors' lot in front of Building E and should follow the instructions by the lobby phone to be escorted to the meeting room. Detailed directions to SRI, as well as maps, are available from the Visiting AIC web page.