   TECHNICAL DESIGN: Robots
  1. Introduction 
  2. The robots
    1. - Amigobot 
    2. - Pioneer 2 (AT or DX) 
  3. Sensors 
    1. - Laser Range Finder SICK 200 
    2. - Sonar 
    3. - Cameras 
      1. - Logitech Pro 3000 
      2. - Logitech Pro 4000 
      3. - Unibrain Fire-1
  4. Computers 
    1. - VersaLogic VSBC-8 
    2. - VersaLogic VSBC-6 
    3. - Epia M9000 
    4. - Epia 5000
    5. - Wireless 
      1. Orinoco card 
      2. Netgear M101
  5. Software
    1. - Aria 
    2. - Saphira 
    3. - Localization 
    4. - Path planning (gradient) 
    5. - Map exploration
    6. - DCAM 
    7. - Video compression 
    8. - Quick Mono Camera Tracker 
    9. - Color Tracker ACTS 
    10. - Behaviors 
    11. - PacketHop 
    12. - Jini services
      1. - Centibots Operator Interface
      2. - Map Publisher
      3. - Spatial Reasoning Tool
      4. - Taxi dispatcher
  6. Breakdown log
  Introduction
For this project we had a lot of tools already working, but not a single one was at the level this project required. First, the robots: we had to choose a robot (or several types) that could achieve the tasks. Then we had to select the computers that would run on the robots. We decided that every robot should be capable of at least self-localization, path planning, video processing and negotiating with its team members; we want them to be self-contained agents. The last piece of this gigantic puzzle is the software, and we have a lot of it. The plan for the project is quite simple: we developed the main infrastructure for the first demo (January 2003, done, see the movie). Then we will add the team behaviors and geometric reasoning for the second demo (50 robots in July 2003). We hope to have most of our software finished for the second demo and only to correct or improve it for the last and final demo (December 2003) with 100 robots.
  The robots
We have selected two types of robots:
  • The larger ones: P2 AT or DX from ActivMedia Robotics. Each robot is equipped with a Laser Range Finder (LRF) from SICK (see the section on sensors), an on-board computer, an odometer, a sonar array, a camera and an INS. This robot can carry the LRF without trouble, but battery life is considerably shortened by using the laser.
  • P2 AT with the Laser Range Finder (LRF) mounted.
  • The smaller ones are Amigobots, also from ActivMedia Robotics. The trouble with this robot is that it is not self-contained: there is no on-board computer, it is just a mobile platform with sonars. They are perfect for indoor environments, with an excellent turning radius, a small footprint, good battery life, and CHEAP. Remember that we need 100 robots; the Pioneers are expensive (especially with the LRF) and we could not afford too many of them, so our main fleet will be Amigobots. We still need to modify them to accommodate a computer; the section on computers describes the computer we chose. The white box on top of our Amigobots is the on-board computer.
Top picture: the original Amigobot as we bought them. Bottom pictures: an Amigobot in the Centibots configuration (computer, USB wireless and USB camera).
  Sensors
  • Sonar: each robot is equipped with a set of at least 6 sonars. We use these sonars for obstacle avoidance and for localization (on the Amigobot). In the movie section, there is a movie of an Amigobot making a lap of our K wing at 1 m/s, fully localized using only sonar. The sonars are the golden disks on each robot.
  • The Laser Range Finder SICK 200 is a very precise laser that produces high-quality maps. The drawbacks of this sensor are its weight (10 pounds !!!) and its power consumption (12 V at 1.6 A). The LRF is too heavy and uses too much power for the Amigobot, so the Amigobots rely on the map made by the first team, composed only of Pioneers, to navigate. The idea is that once the map is made using the LRF, it is uploaded to all robots so they can navigate. This sensor is the workhorse of our project.
  • SICK Laser Range Finder
  • Cameras: we tried several. Considering price, power consumption and driver support, we use the following sets:
    • The Logitech QuickCam 3000 Pro uses the PWC driver, which supports both the Pro 3000 and the Pro 4000. The 3000 is better supported, but production stopped before we could order all 100 of them, so we had to use the 4000 as well. We can get a frame rate of 5 Hz uncompressed at 320x240 resolution.
    • Logitech 4000 Pro
    • The Logitech QuickCam 4000 Pro is less well supported, and we still have trouble with some of them (don't ask me why, I'm still trying to figure that out).
    • The Unibrain Fire-1 is our FireWire camera. This camera works perfectly and is very well supported under Linux. We chose this one because you can power the camera from an external source with standard 12 V DC. Of course, this camera can deliver 640x480 at 30 Hz. The cameras are currently mounted on top of the LRF.
    • Unibrain Fire-1 on top of the LRF
  • INS: some Pioneers have a Crossbow unit as an Inertial Navigation System (INS); it helps a lot when closing maps. Using the odometer alone is a problem if you want a precise notion of how much you have turned; the INS solves that problem.
  Computers
A robot without a computer is about as good as a doorstop. This is a critical element: with not enough computing power the robot will have trouble navigating the environment; with too much power you need more batteries, a better cooling system, and you run into stability issues. We have two main systems: the Pioneers come with a VersaLogic board, while the Amigobots have a Mini-ITX board from VIA:
  • The VersaLogic VSBC-6 has a Pentium II at 400 MHz; our older robots have this board. It is really at the limit of the computation we need.
  • On the most recent Pioneers, we have a VersaLogic VSBC-8 with a Pentium III at 850 MHz. This gives us plenty of computing power.
  • VSBC-8 mounted inside the P2AT
  • The Amigobots have an Epia 5000 Mini-ITX board. This is a full computer in 17 cm by 17 cm; we only had to figure out how to mount it on top of the Amigobot. It comes with a serial port (handy for connecting to the robot), 4 USB ports on 2 USB buses (so we can connect the USB wireless adapter on one and the USB camera on the other), a 10/100 Ethernet port (really handy for uploads and synchronization), and an 800 MHz C3 processor (roughly equivalent to a 600 MHz Pentium III). Best of all, this board supports network booting (read our section on how we replicate and maintain 100 robots). Fortunately for us, ActivMedia engineers built the custom case (photo .....) and adapted all our Amigobots for the processor. The battery was also replaced to provide more power. In photo ...., you can see that we use an IBM 20 GB Travelstar as our main disk (it's really getting hard to find 20 GB disks nowadays :-)
  • Custom computer case for the EPIA 5000 and M9000. Right picture: notice the custom power board for running the computer off 12 V DC.
  • The improved version of the previous mainboard, the Epia M9000, was released in 2003; we jumped on it and some of our Amigobots are running with it. The main benefit for us was not so much the processor speed (933 MHz instead of 800 MHz) but the built-in FireWire (oh yes !!) and USB 2.0. The FireWire camera can deliver a better frame rate than the USB one.
  • EPIA M9000
Our ad hoc wireless system runs on standard 802.11b cards:
  • The Pioneers use an Orinoco card with a PC/104 PCMCIA adapter mounted inside the robot. We use an external outdoor antenna to get the signal out; with the outdoor antenna we get far better range on our wireless network.
  • Left picture: Orinoco wireless card in the PC/104 PCMCIA adapter. Right picture: the Netgear USB wireless adapter for the Amigobot.
  • The Amigobots use the Netgear M101 USB wireless adapter. We tried the D-Link, but we could not get it to work reliably: each time we ordered the D-Link we ended up with a different chipset, making our Linux support more complicated than necessary. The Netgear seems consistent across several orders and its driver also seems more stable (i.e. it works for us!)
  Software
  1. - Aria
    From the ActivMedia Web site: ARIA (ActivMedia Robotics Interface Application) is designed for professional developers. It is a powerful Object Oriented (OO) interface to ActivMedia mobile robots, usable under Linux or Win32 OS in C++. 
    ARIA is an API (Application Programming Interface), designed and written entirely in the OO paradigm.  ARIA communicates with the robot via a client/server relationship, either using a serial connection (to talk to the robot) or a TCP/IP connection (to talk to a simulator).
    ARIA is the low-level sensor and actuator abstraction; we get ARIA from ActivMedia.
  2. - Saphira
    SAPHIRA is a robot control system developed at SRI International's Artificial Intelligence Center. It was first developed in conjunction with the Flakey mobile robot project, as an integrated architecture for robot perception and action. That is where we are doing most of our development, which includes new algorithms for localization, map exploration, people tracking using the USB camera, etc.
  3. - Localization
  4. - Path planning (gradient)
    We use Kurt Konolige's gradient path planning (which is distributed in Saphira 8). We have done some tuning of the released version, and our changes will propagate into the main Saphira distribution when we have time :-). Here is the original paper on the gradient method by Kurt:
    K. Konolige. A Gradient Method for Real time Robot Control, IROS 2000
    "Despite many decades of research into mobile robot control, reliable, high-speed motion in complicated, uncertain environments remains an unachieved goal. In this paper we present a solution to real time motion control that can competently maneuver a robot at optimal speed even as it explores a new region or encounters new obstacles. The method uses a navigation function to generate a gradient field that represents the optimal (lowest-cost) path to the goal at every point in the workspace. Additionally, we present an integrated sensor fusion system that allows incremental construction of an unknown or uncertain environment. Under modest assumptions, the robot is guaranteed to get to the goal in an arbitrary static unexplored environment, as long as such a path exists. We present preliminary experiments to show that the gradient method is better than expert human controllers in both known and unknown environments."
  5. - Map exploration
    Constructing maps: to construct a map, we deploy a team of robots, each equipped with the SICK Laser Range Finder (LRF), into an environment. They need not start at the same point, nor do they need to know where the other robots are located. At regular intervals, each mapping robot records both its encoder position (where the robot thinks it is based on wheel rotations) and the current laser scan. Using the encoder information, the scans are pieced together to form a map of the environment.
    Combining maps: when two robots believe that their maps overlap, they verify the hypothesis by arranging to meet at a location that they believe to be one and the same in both maps. If the robots do not meet up, the hypothesis is false and they continue mapping. If the robots do meet up, they have found a match and piece the two maps together into a larger map.
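    As a sketch of the scan-assembly step just described (illustrative only; a real mapper also has to correct odometry drift), each recorded laser reading is projected into the map frame using the encoder pose at which it was taken:
      // Illustrative sketch: project each recorded laser scan into the global map
      // frame using the encoder pose (x, y, theta) stored with that scan.
      public class ScanAssembler {
          public static class Scan {
              public double x, y, theta;      // encoder pose when the scan was taken
              public double[] bearings;       // beam angles relative to the robot (rad)
              public double[] ranges;         // measured ranges (m)
          }

          /** Returns the scan end points in global coordinates as {gx, gy} pairs. */
          public static double[][] toGlobal(Scan s) {
              double[][] points = new double[s.ranges.length][2];
              for (int i = 0; i < s.ranges.length; i++) {
                  double a = s.theta + s.bearings[i];          // beam angle in the map frame
                  points[i][0] = s.x + s.ranges[i] * Math.cos(a);
                  points[i][1] = s.y + s.ranges[i] * Math.sin(a);
              }
              return points;
          }

          public static void main(String[] args) {
              Scan s = new Scan();
              s.x = 1.0; s.y = 2.0; s.theta = 0.0;             // pose from the encoders
              s.bearings = new double[] { 0.0 };               // one beam straight ahead
              s.ranges = new double[] { 3.0 };
              double[][] g = toGlobal(s);
              System.out.println(g[0][0] + ", " + g[0][1]);    // prints 4.0, 2.0
          }
      }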
  6. - DCAM
  7. - Video compression
    We use a library called libcu30, made by Aron Rosenberg and Derek Smithies; it was on SourceForge as an open source project but has now moved to a more commercial endeavour. You can still get the library here. The reference paper is this one:
    Yi-Jen Chiu and Toby Berger, A Software-only Videocodec using Pixelwise Conditional Differential Replenishment and Perceptual Enhancements, IEEE Transactions on Circuits and Systems for Video Technology, April 1999.
    All we have done is integrate this method into a network service so we can visualize any camera on any robot in our network.
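    The core of conditional replenishment is simple: compare each new frame against a reference frame, transmit only the pixels (or blocks) that changed by more than a threshold, and update the reference with what was sent. The sketch below illustrates that idea only; it is not the libcu30 code, and the flat grayscale frame format and the threshold value are assumptions made for the example.
      import java.util.ArrayList;
      import java.util.List;

      // Illustrative sketch of pixelwise conditional replenishment (not libcu30):
      // only pixels that differ from the reference frame by more than a threshold
      // are emitted, and the reference is updated with the transmitted values.
      public class ConditionalReplenishment {
          private final int[] reference;       // last transmitted frame (grayscale)
          private final int threshold;

          public ConditionalReplenishment(int pixels, int threshold) {
              this.reference = new int[pixels];
              this.threshold = threshold;
          }

          /** Returns (index, value) pairs for the pixels that need to be sent. */
          public List<int[]> encode(int[] frame) {
              List<int[]> updates = new ArrayList<int[]>();
              for (int i = 0; i < frame.length; i++) {
                  if (Math.abs(frame[i] - reference[i]) > threshold) {
                      updates.add(new int[] { i, frame[i] });
                      reference[i] = frame[i];                 // replenish the reference
                  }
              }
              return updates;
          }

          public static void main(String[] args) {
              ConditionalReplenishment cr = new ConditionalReplenishment(4, 10);
              System.out.println(cr.encode(new int[] { 0, 0, 0, 0 }).size());   // 0 changes
              System.out.println(cr.encode(new int[] { 0, 50, 0, 0 }).size());  // 1 change
          }
      }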
  8. - Quick Mono Camera Tracker
  9. - Color Tracker ACTS
  10. - Behavior
  11. - TBRPF
  12. - Jini services
    Jini isn't an acronym, it's a name. Jini is a programming model and a runtime infrastructure that can offer "network plug and play". A Jini system consists of a collection of clients and services communicating by the Jini protocols, using the Java Remote Method Invocation mechanism (RMI).
    Although Jini is written in Java, neither clients nor services are constrained to be in Java. They may include native code methods, act as wrappers around non-Java objects, or even be written in some other language altogether. Jini supplies a middleware layer to link services and clients from a variety of sources.
    The Jini API defines conventions that support leasing, distributed events, and distributed transactions. The API makes it easy to add, locate, access and remove devices and services on the network. The Jini infrastructure provides a reliable distributed system even though the underlying network is unreliable.
    Jini services are network-aware. They are discovered, represented and used automatically as they become available, and no longer when they disappear, either because they were stopped or because they are no longer reachable. This is handled by Jini mechanisms called discovery and leases. Jini can be used for mobile computing tasks where a service may only be connected to a network for a short time, but more generally it can be used in any network where there is some degree of change. Services discover each other using broadcasts, which are not routed outside the local network, and ServiceLocators (IP addresses) to explicitly name a computer in another network.
    Jini services are meant to keep running, but a service does not need to use resources all the time to do so. While it is only waiting for incoming requests it can use RMI to be suspended to disk and woken up automatically when a new request comes in. This works even across restarts.
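    As a rough illustration of the discovery and leasing machinery described above (a generic Jini-style snippet, not our actual service code; the "centibots" group name is made up for the example), a service joins the lookup services it discovers and registers its proxy under a lease:
      // Generic Jini-style registration sketch (not the actual Centibots code).
      import net.jini.core.lease.Lease;
      import net.jini.core.lookup.ServiceItem;
      import net.jini.core.lookup.ServiceRegistrar;
      import net.jini.discovery.DiscoveryEvent;
      import net.jini.discovery.DiscoveryListener;
      import net.jini.discovery.LookupDiscovery;

      public class RegisterSketch {
          public static void main(String[] args) throws Exception {
              final Object proxy = new Object();   // stand-in for the real service proxy
              // Discover lookup services in a (hypothetical) "centibots" group.
              LookupDiscovery discovery = new LookupDiscovery(new String[] { "centibots" });
              discovery.addDiscoveryListener(new DiscoveryListener() {
                  public void discovered(DiscoveryEvent ev) {
                      for (ServiceRegistrar registrar : ev.getRegistrars()) {
                          try {
                              // Register the proxy; the lease must be renewed or the
                              // registration expires and the service disappears.
                              registrar.register(new ServiceItem(null, proxy, null),
                                                 Lease.FOREVER);
                          } catch (Exception e) {
                              e.printStackTrace();
                          }
                      }
                  }
                  public void discarded(DiscoveryEvent ev) { /* a lookup service went away */ }
              });
              Thread.sleep(Long.MAX_VALUE);        // keep the service process alive
          }
      }
    In practice a helper class such as Jini's JoinManager usually takes care of this bookkeeping, including lease renewal.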
    Services are made available through a ServiceEntry. This entry contains a proxy implementing a service interface, thus hiding and abstracting away the actual way the functionality of that interface is implemented. It also contains a world-wide unique identifier (a UID, partly randomly generated and very long, 128 bits, to avoid conflicts) and a set of Attributes. These attributes can be attached not only by the service itself but also by users and other services. They describe the placement of a service (for example, a printer service's printer), give a description, the version and the implementer/vendor of a service, and also the representation.
    The representations of a service can run on different computers, and even multiple representations can run for one service. Services, UIs and representations communicate with each other using messages. They register with each other to receive a number of different message types and are granted that right for a specified time, until they either renew this contract or time out. This timeout is used everywhere remote resources are used, as a guard against partial network failures. Otherwise such registrations could accumulate and be stored forever while the receiver of the messages is no longer reachable.
    Services and UIs (service browsers) need to belong to at least one common group to see each other. With such groups it is possible to make services available to only a part of the network. Here is the Jini architecture of the Centibots system:
    1. Centibots Operator Base Station (COBS)
      For the GUI we chose the NetBeans platform. NetBeans provides us with a plug and play IDE with services common to almost all desktop applications – windows, menus, settings management and storage, file access and more.
      The JGraph graph library was chosen for displaying network topology, spatial information graphs and taxi dispatching graphs. JGraph is a powerful, lightweight and open source graph component.
      Figure 1. Screen shot of COBS.
      In COBS the operator has a complete overview of the system. COBS is designed to help the operator control 100 robots. A hierarchical team management system is implemented to manage the robots (a small sketch of this hierarchy follows the task list below).
      Team:
      • Contains 1 or more robots (Amigobots or Pioneers)
      • Contains 1 or more teams
      Task:
      • Can be assigned to 1 or more robots
      • Can be assigned to 1 or more teams
      The following tasks are defined in COBS:
        • Map environment: Map the environment.
        • Find object: Use dispatcher service to coordinate searching for object among robots.
        • Guard object: Use dispatcher service to coordinate guarding of object.
        • Go To: Set a position for the robot to go to.
        • Go home: Return to start position.
        • Set position: Resets the robot's internal position to a specified position.
        • Done: Stop all activity.
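      A minimal sketch of that hierarchy (class and method names here are illustrative, not COBS source): a team is a composite that can contain robots and other teams, so assigning a task to a team fans it out to every member:
        // Illustrative sketch of a COBS-style team hierarchy (names are made up):
        // a team is a composite of robots and sub-teams, so assigning a task to a
        // team fans it out to every member below it.
        import java.util.ArrayList;
        import java.util.List;

        interface Assignable {
            void assign(String task);            // e.g. "Map environment", "Go home"
        }

        class Robot implements Assignable {
            final String name;
            Robot(String name) { this.name = name; }
            public void assign(String task) { System.out.println(name + " <- " + task); }
        }

        class Team implements Assignable {
            final String name;
            final List<Assignable> members = new ArrayList<Assignable>();  // robots and sub-teams
            Team(String name) { this.name = name; }
            void add(Assignable member) { members.add(member); }
            public void assign(String task) {    // fan the task out to all members
                for (Assignable m : members) m.assign(task);
            }
        }

        class TeamDemo {
            public static void main(String[] args) {
                Team all = new Team("all");
                Team mappers = new Team("mappers");
                mappers.add(new Robot("pioneer-1"));
                all.add(mappers);
                all.add(new Robot("amigobot-17"));
                all.assign("Map environment");   // one command reaches every robot
            }
        }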
    2. Jini Services
      The Centibots system consists of the following jini services:
      • Robot: General robot service
      • Amigobot service: Service running on Amigobots
      • Pioneer service: Service running on Pioneers
        Each robot runs Saphira, and depending on the type of robot (Amigobot or Pioneer) a Jini service also runs on each robot. Saphira is written in C++ and the Jini service is written in Java. To pass information between Saphira and Jini we have implemented a local asynchronous socket layer, SaphiraProxy.
        When any part of the robot's state is updated, a message is sent between Saphira and Jini through the SaphiraProxy layer.
      • Publisher service: The Publisher service listens to scan events from the Pioneer services and combines the scans into a vector map and an occupancy map. When scans are received, their sequence numbers are checked; if a sequence number is missing, the publisher asks for the missing scans (a sketch of this sequence-number check follows this list).
        The vector and occupancy maps are then sent as Jini events to listening services.
      • Recorder service: Records all events in the system into a relational database. The recorder service supports searching and replay of events.
      • Spatial Reasoning service: The spatial reasoning service listens for “FinalMapEvent”. From the final map, the service builds a spatial graph used for dispatching the robots (see the dispatcher service).
      • Server service: The server batches and caches all events from the robot, publisher, spatial reasoning and dispatcher services and forwards them to COBS. The server also collects requests from COBS to the system and sends the appropriate events to the services. A schedule manager schedules when cached information needs to be updated and queries the right services to update it.
      • Taxi Dispatcher service: The taxi dispatcher listens for “FinalGlobalGraph” and, based on the final global graph, computes an optimal dispatching of the robots based on their positions and tasks.
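      As mentioned in the Publisher entry above, scan events carry sequence numbers so lost events can be detected and re-requested. A minimal sketch of that bookkeeping (names are illustrative, not the actual service code):
        import java.util.HashMap;
        import java.util.Map;

        // Illustrative sketch of the sequence-number bookkeeping: detect gaps in a
        // robot's scan stream and return the range of scans to ask for again.
        public class ScanSequenceTracker {
            private final Map<String, Long> lastSeen = new HashMap<String, Long>();

            /** Handles a scan event; returns {first, last} missing sequence numbers, or null. */
            public long[] onScan(String robotId, long sequence) {
                Long previous = lastSeen.get(robotId);
                lastSeen.put(robotId, sequence);
                if (previous != null && sequence > previous + 1) {
                    // Scans previous+1 .. sequence-1 never arrived: request them again.
                    return new long[] { previous + 1, sequence - 1 };
                }
                return null;                     // no gap in this robot's stream
            }

            public static void main(String[] args) {
                ScanSequenceTracker t = new ScanSequenceTracker();
                t.onScan("pioneer-1", 1);
                long[] gap = t.onScan("pioneer-1", 4);        // scans 2 and 3 were lost
                System.out.println(gap[0] + ".." + gap[1]);   // prints 2..3
            }
        }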


  Breakdown log
Here is the log of the different breakdowns we have had on these robots:
  • 05/14 robot 7 left axle broke.
  • 05/25 robot 15 back from repair.
  • 05/25 robot 7 sent for repair (we got a number of spare axles).
  • 05/27 power board failed on robot 22.
  • 05/29 LRF serial cable snapped.
  • 06/12 sonar broke on robot 7.
  • 06/20 shell broke after a fall of robot 5
  • 06/28 USB cables were incorrectly set on robots 28 and 31
  • 07/02 robots 24 and 25 seem to have battery charge problems.
  • 07/22 robots 32, 46, 48 have Firewire issues, cause not yet known
  • 07/27 robot 5 has some hardware issue (motherboard most likely)
  • 07/31 mapper 3 is having an overhaul (new motherboard (VIA one), new control)
  • 08/04 robots 32,46,48,37 have firewire issues (likely due to bad connectors)
  • 09/05 to 09/15 we had issues with the USB wireless driver for the Netgear. We solved it by switching drivers.
  • 09/25 Today is a great day: all our robots are working fine, no hardware problems.....
  • 10/14 starting to receive the 55 new ones from Achievement. We have some switch, battery and sonar problems on the new ones.
  • 11/20 we have isolated about 12 of our robots that have issues; we will use them as parts for the working ones.
  • 12/14 That's it, we have 102 robots: 97 Amigobots (82 in working order) and 5 mappers. UPS lost one mapper (2 boxes out of 3 arrived at SRI).
  • 01/05 During the loading of the truck, one cart containing 18 Amigobots was dropped. The box broke open; we will deal with it at Fort AP Hill.
  • 01/11 Time to check all those robots and get ready.
  • 01/19 The 2 SRI mappers are down: one has broken connectors which prevent two of the batteries from connecting to the robot; the other one drains its batteries way too fast.
  • 01/20 The experiment is over, and so, technically, is the project. I'm happy.