CommandTalk is a spoken-language interface to synthetic forces in entity-based battlefield simulations, developed by SRI International under our DARPA-sponsored project on Improved Spoken-Language Understanding. The principal goal of CommandTalk is to let simulation operators interact with synthetic forces by voice, in a manner as similar as possible to the way commanders control live forces. CommandTalk currently interfaces to the ModSAF battlefield simulator and allows synthetic forces to be controlled with ordinary English commands.

CommandTalk was initially developed for LeatherNet, a simulation and training system for the Marine Corps developed under the direction of the Naval Command, Control and Ocean Surveillance Center, RDT&E Division (NRaD). Subsequently, CommandTalk was extended to the Navy, Air Force, and Army versions of ModSAF to provide control of all synthetic forces in DARPA's STOW '97 Advanced Concept Technology Demonstration. The Advanced Simulation Technology Thrust (ASTT) program supported further development of CommandTalk, adding a dialogue component and funding research on robustness techniques for interpreting user input.

CommandTalk is implemented within the DASLING (Distributed-Agent Spoken-Language Interfaces based on Nuance and Gemini) framework for interactive spoken-language interface applications. DASLING combines the Nuance speech recognizer (a commercially available system based on SRI-developed technology) with SRI's Gemini natural-language understanding system, using SRI's Open Agent Architecture.
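To make the agent-based design concrete, the pipeline above can be sketched as a facilitator that routes each utterance through a recognizer agent, an understanding agent, and a simulator-interface agent. This is a minimal illustrative sketch only: the class and function names below are hypothetical and do not reflect the actual Open Agent Architecture, Nuance, or Gemini APIs.

```python
class Facilitator:
    """Toy stand-in for an OAA-style facilitator: routes requests
    to registered agents by capability name."""

    def __init__(self):
        self.agents = {}

    def register(self, capability, handler):
        self.agents[capability] = handler

    def solve(self, capability, data):
        return self.agents[capability](data)


def recognizer(audio):
    # Stand-in for the Nuance speech recognizer: audio -> text.
    return audio["transcript"]


def nlu(text):
    # Stand-in for Gemini: text -> a structured command frame.
    verb, _, rest = text.partition(" ")
    return {"action": verb, "args": rest}


def simulator(frame):
    # Stand-in for the agent that forwards commands to ModSAF.
    return f"ModSAF executes: {frame['action']} {frame['args']}"


facilitator = Facilitator()
facilitator.register("recognize", recognizer)
facilitator.register("understand", nlu)
facilitator.register("execute", simulator)

# One spoken utterance flowing through the pipeline.
text = facilitator.solve("recognize", {"transcript": "advance to checkpoint one"})
frame = facilitator.solve("understand", text)
result = facilitator.solve("execute", frame)
print(result)  # -> ModSAF executes: advance to checkpoint one
```

The point of the sketch is the decoupling: each component is a separately registered agent, so the recognizer, the language understander, and the simulator interface can be distributed and replaced independently.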

For more information on the CommandTalk system, see our publications and web resources. For additional information, see NRaD's CommandTalk site.