OAA-Based Applications

More than 35 applications have been implemented using the Open Agent Architecture. Here is a partial list to give you ideas!

Perhaps the most exciting aspect of OAA applications is the cross-application possibilities: for instance, the agent community making up CHeF, the smart fridge, can communicate with agents in your car, your house, your office, and so forth. Where others build standalone applications, these CHIC! applications use OAA to explore a truly connected world!
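The pattern behind this cross-application communication can be sketched as a facilitator that routes requests to whichever agent has registered the needed capability. This is an illustrative toy, not the actual OAA API (OAA uses the Interagent Communication Language); all class, method, and capability names here are hypothetical.

```python
# Minimal sketch of facilitator-style delegation (illustrative only;
# not the real OAA API or its Interagent Communication Language).

class Facilitator:
    """Routes requests to whichever registered agent can solve them."""

    def __init__(self):
        self._registry = {}  # capability name -> handler function

    def register(self, capability, handler):
        # An agent declares a capability it can solve.
        self._registry[capability] = handler

    def solve(self, capability, **params):
        # A requester delegates a goal without knowing which agent handles it.
        handler = self._registry.get(capability)
        if handler is None:
            raise LookupError(f"no agent can solve {capability!r}")
        return handler(**params)


# Hypothetical agents: a fridge inventory agent and a recipe agent.
facilitator = Facilitator()
facilitator.register("list_ingredients", lambda: ["eggs", "milk", "flour"])
facilitator.register(
    "suggest_recipe",
    lambda ingredients: "pancakes"
    if {"eggs", "milk", "flour"} <= set(ingredients)
    else "salad",
)

# Cross-application request: the recipe agent builds on the fridge agent's answer.
ingredients = facilitator.solve("list_ingredients")
print(facilitator.solve("suggest_recipe", ingredients=ingredients))  # -> pancakes
```

The point of the pattern is that the requester names a goal, not an agent: new appliances or services can join the community simply by registering additional capabilities.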

Automated Office

Control your office from remote locations -- agents provide access to and monitoring of your calendar, email, or database applications over the telephone, a laptop, a web browser, or a wireless PDA.

Multimodal Maps

A teacher at a blackboard communicates with a classroom of students by drawing, speaking, underlining, circling, writing -- all simultaneously. Communicating with a network of distributed software agents should be this easy! Using a map-style interface, users can naturally combine writing, speaking, and gesturing to retrieve information and issue commands to distributed agents.
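One ingredient of such an interface is fusing a spoken phrase with a pen gesture into a single command. The sketch below is a simplified illustration of that idea under assumed names and data structures; it is not SRI's implementation.

```python
# Illustrative sketch of multimodal fusion: resolving a deictic word
# ("here") in a spoken command against the most recent pen gesture.
# All names and structures here are hypothetical, not SRI's code.
from dataclasses import dataclass


@dataclass
class PenGesture:
    kind: str        # e.g. "circle", "arrow"
    location: tuple  # map coordinates


def fuse(utterance, gesture):
    """Combine speech and gesture into one fully grounded command."""
    if "here" in utterance and gesture is not None:
        return utterance.replace("here", f"at {gesture.location}")
    return utterance


command = fuse("show hotels here", PenGesture("circle", (37.46, -122.18)))
print(command)  # -> show hotels at (37.46, -122.18)
```

Real multimodal systems must also handle timing (which gesture goes with which utterance) and ambiguity, but the core idea is the same: each modality fills in what the other leaves underspecified.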


CommandTalk

Related to the multimodal map project, CommandTalk is a spoken-language interface that allows commanders to use natural, spoken English commands to control simulated forces. This system is in use at the Marine Corps Air Ground Combat Center at Twentynine Palms, California.

Speech Recognition over the Web

Try it!
Use your telephone and a Java-enabled web browser to talk to an OAA-based demonstration application for scheduling airline reservations.

Agent Development Tools

A set of Agent Development Tools, themselves implemented as OAA agents, guides a programmer through the tasks involved in implementing new OAA agents and applications.

Multi-Robot Control

OAA-powered robots won first place at the 1996 AAAI Robotics Competition and Exhibition (Office Navigation Task).

Spoken Dialog Summarization

A real-time system for summarizing human-human spontaneous spoken dialogues (Japanese).


An animated, voice-interactive system for learning about SRI.


MVIEWS

MVIEWS enables an analyst to annotate a live video stream using pen and voice, and incorporates various technologies (pen and speech recognition, image processing, content-based indexing, collaboration) to enhance the analyst's ability to monitor a situation intelligently.


A system for coordinating multiple recognition technologies (speech, OCR, NL Extraction, etc.) for indexing broadcast news videos.

SUO Robots

This project focused on a wireless multimodal interface for controlling teams of robots and their sensors (including video) in Small Unit Operations (SUO) missions.


CHeF

Among other things, this smart fridge knows what's inside and can recommend recipes that use the ingredients on hand.


The objective is to give SRI visitors a wireless tablet computer that augments their experience during the day. Position tracking helps the user navigate SRI hallways and resources, and information is updated on the display based on the user's location and interests.

TravelMATE and CARS

TravelMATE uses GPS, a compass, a 3D virtual reality terrain database and a smart windshield to augment a tourist's driving experience with context-relevant information. CARS provides a speech-enabled interface to systems and services provided by the car and by the Internet.


EMCE

EMCE, the Enhanced Multimodal Collaborative Environment, plays host during augmented meetings.


Surf

Surf provides an OAA-based interface for your TV: you can task your community of agents (home appliances and information services) by speaking into your remote control. Events (e.g., the phone ringing) can also trigger informational updates on the TV screen (e.g., showing the caller's number via caller ID).

Copyright 1999 - 2001, SRI International