What is CENG499?

CENG499 is a course offered twice a year by the University of Victoria. It is a pure project course, in which students work in small teams to complete all aspects of a design. The teams are made up of students from many different disciplines, and the projects range in both scope and complexity.


What is the Unit00 Project?

The Unit00 project was one of these CENG499 projects, completed by two students during the summer term of 2003. The main goal was to develop a robust robotics research platform. This platform will be used in the next 499 term to carry out experiments in image processing, path planning, and obstacle avoidance.


Summary of project

The field of robotics research is currently very active, with everything from search-and-rescue robots to autonomous vehicles being investigated. To conduct this research, a robust and expandable platform is required. To this end, the goal of the Unit00 project was to develop such a platform.

Over the course of the term, our team tackled the electrical, software, and mechanical hurdles of developing a robust, expandable research platform. The final result is a platform that can autonomously navigate within a given space, avoiding both static and dynamic obstacles. The platform runs the Linux operating system, allowing the use of multiple processes and more advanced programming techniques than a simple microcontroller would permit.

To keep the main processor free for advanced navigation heuristics, many mundane, CPU-intensive tasks have been moved to smaller, satellite processors. Tasks such as motor power control, speed monitoring, and infrared proximity sensing are handled this way.

The platform also includes a wireless link that allows for remote, real-time telemetry monitoring and control.

All of these factors combine to create a robust platform that can perform basic navigational tasks, with ample opportunity for future expansion and additional features.


Purpose of the Unit00 project.

The intent of this project is to develop a low-cost robotics research platform suitable for future research into autonomous control algorithms. This initial development will also include a basic control algorithm that allows the platform to traverse the engineering building in a random fashion, avoiding obstacles as it goes.
The platform itself will be designed in a way that makes later modifications or additions simpler. This means modularity needs to be the primary concern in the mechanical, electrical, and software designs.
Because the platform is funded by the development team, it must be developed at low cost. For this reason, custom hardware will be used wherever possible to keep the total project cost down.


How does it work?

Please see the final report in the documents section for a much more detailed discussion of the design work behind this project.

Sensors.

Infrared Modules

Close-range sensing was achieved using four infrared modules placed around the front of the chassis. Each module operates by sending out an invisible beam of light; when the beam hits an object, the light is reflected back onto an array of photodiodes. The result is a voltage output that varies non-linearly with the distance to the object.
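As an illustration of how such a reading might be used, the sketch below converts a raw analog-to-digital reading into an approximate distance, assuming a Sharp-style module whose output voltage falls off roughly as the inverse of distance. The calibration constant, ADC resolution, and reference voltage are hypothetical values chosen for illustration, not figures from the actual Unit00 hardware.

    #include <stdio.h>

    /* Hypothetical calibration constant (volts x centimetres) that would
     * be found by measuring the module's output at known distances. */
    #define IR_CAL_K      27.0
    #define ADC_MAX       1023    /* assumed 10-bit converter  */
    #define ADC_REF_VOLTS 5.0     /* assumed 5 V reference     */

    /* Convert a raw ADC reading from an IR module into an approximate
     * distance in centimetres, using the inverse voltage/distance
     * relationship described above. */
    static double ir_distance_cm(int adc_reading)
    {
        double volts = (double)adc_reading * ADC_REF_VOLTS / ADC_MAX;
        if (volts < 0.1)               /* below this the signal is noise  */
            return -1.0;               /* treat as "no object in range"   */
        return IR_CAL_K / volts;
    }

    int main(void)
    {
        printf("reading 512 -> %.1f cm\n", ir_distance_cm(512));
        return 0;
    }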

Sonar Modules

For long-range sensing, Polaroid sonar modules were employed. These operate by transmitting a high-power burst of sound in the ultrasonic band (approximately 40 kHz). The sound pulse travels out from the sensor at the speed of sound and bounces back from any object it encounters, effectively producing an echo. By detecting these reflections and measuring the time of flight, the distance to an obstacle can be measured.
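The time-of-flight calculation itself is straightforward: the pulse travels to the object and back, so the one-way distance is half the round trip at the speed of sound. A minimal sketch follows, using an approximate speed of sound; the echo time in the example is illustrative only.

    #include <stdio.h>

    #define SPEED_OF_SOUND_M_S 343.0   /* approximate, dry air at ~20 C */

    /* Convert an echo time in seconds to a one-way distance in metres.
     * The pulse travels out and back, so the round trip covers twice
     * the distance to the obstacle. */
    static double sonar_distance_m(double echo_time_s)
    {
        return SPEED_OF_SOUND_M_S * echo_time_s / 2.0;
    }

    int main(void)
    {
        /* A 12 ms echo corresponds to roughly 2 m. */
        printf("12 ms echo -> %.2f m\n", sonar_distance_m(0.012));
        return 0;
    }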

Processors.

In addition to the PC/104 board that acts as the central controller, two microcontrollers were used as satellite processors. These were given tasks that took some of the load off the main CPU module. The diagram below shows the organization of the whole system.
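As a rough sketch of how this split might look from the main controller's side, the example below polls a satellite microcontroller for a sensor reading over a serial port. The device path, baud rate, and one-byte command protocol are purely hypothetical and are not the actual Unit00 interface.

    #include <fcntl.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    /* Hypothetical one-byte command understood by the satellite MCU. */
    #define CMD_READ_IR 0x10

    int main(void)
    {
        /* Device path and baud rate are assumptions for illustration. */
        int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        struct termios tio;
        tcgetattr(fd, &tio);
        tio.c_lflag &= ~(ICANON | ECHO);   /* raw, non-echoing input       */
        tio.c_cc[VMIN]  = 1;               /* block until one byte arrives */
        tio.c_cc[VTIME] = 0;
        cfsetispeed(&tio, B9600);
        cfsetospeed(&tio, B9600);
        tcsetattr(fd, TCSANOW, &tio);

        /* Ask the satellite processor for its latest proximity reading. */
        unsigned char cmd = CMD_READ_IR, reply = 0;
        if (write(fd, &cmd, 1) == 1 && read(fd, &reply, 1) == 1)
            printf("IR reading: %u\n", reply);

        close(fd);
        return 0;
    }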

Control System.

The subsumption approach to implementing a behavioral system allows us to develop a pattern of behavior abstracted from the environment we intend to interact with. In this way we can generate a more generic behavioral algorithm, while avoiding developing an expert system, which is incapable of dealing with unprogrammed stimuli. The subsumption approach also allows for easier algorithm development and coding, as each behavior is discrete and can therefore be implemented and tested independently of any other module. The image below shows a rough block diagram of how the subsumption architecture in Unit00 works.
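As an illustration of the idea (not the actual Unit00 code), the sketch below arbitrates between two hypothetical behaviors: a base wander behavior that drives forward, and a higher-priority avoid behavior that subsumes it whenever a sensor reports a nearby obstacle. The sensor format, threshold, and motor command values are invented for the example.

    #include <stdio.h>

    /* A behavior either declines to act (returns 0) or claims the
     * actuators by filling in a motor command (returns 1). */
    typedef struct { int left_speed; int right_speed; } motor_cmd;
    typedef int (*behavior_fn)(const int *sensors, motor_cmd *out);

    /* Base behavior: drive forward at a constant speed. */
    static int wander(const int *sensors, motor_cmd *out)
    {
        (void)sensors;
        out->left_speed = out->right_speed = 50;
        return 1;
    }

    /* Higher-priority behavior: if an obstacle is close, spin away. */
    static int avoid(const int *sensors, motor_cmd *out)
    {
        if (sensors[0] < 30) {            /* hypothetical threshold, cm */
            out->left_speed  =  40;
            out->right_speed = -40;       /* turn in place              */
            return 1;
        }
        return 0;                         /* nothing to do, defer       */
    }

    int main(void)
    {
        /* Behaviors listed from highest to lowest priority; the first one
         * that claims the actuators suppresses everything below it. */
        behavior_fn behaviors[] = { avoid, wander };
        int sensors[] = { 25 };           /* fake reading: obstacle at 25 cm */
        motor_cmd cmd = { 0, 0 };
        size_t i;

        for (i = 0; i < sizeof behaviors / sizeof behaviors[0]; i++)
            if (behaviors[i](sensors, &cmd))
                break;

        printf("motors: left=%d right=%d\n", cmd.left_speed, cmd.right_speed);
        return 0;
    }

Because each behavior only sees the sensor data and either claims or declines the actuators, new behaviors can be slotted into the priority list without modifying the existing ones.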

Subsumption, an analogy:
Our daily behaviors can be modeled using the subsumption architecture. Take, for example, walking to the store to buy milk. When we decide to walk to the store, what is the first thing we do (assuming we have our wallet, keys, shoes, etc.)? We walk out the door and begin to walk down the street. This is our base behavior, and we can call it Walk.

As we're walking along we run into a friend we haven't seen in a while. We can wave, or stop and say hello; either way, this can be modeled in our system by adding a new behavior at a higher priority than Walk, which we'll call Greet. Now, Greet doesn't have to know what Walk does. We don't care how we were walking when we say hello; we just care that we have stopped walking and are now talking with our friend. To review: our simple system now has a base behavior, Walk, which we act on until a higher-priority behavior wishes to take over, in this example Greet. So we will continue to walk to the store until we see someone we know, we will then stop and greet them, and once we are finished talking we will continue on our way.

Just to make things a little more interesting, let's say the store is on the other side of the street and we have to cross to reach it. Being safety-minded individuals (and this close to finals), we want to check both ways before we cross: if there are no cars coming it's safe to cross, but if there are cars coming we'll wait for them to pass. We'll call this new behavior Cross, and assign it a priority higher than either Walk or Greet. If Cross were at a lower priority than Greet we could run into a problem: if we were to see a friend as we crossed the street, according to our model we would have to stop in the middle of the street to say hello (a very dangerous proposition). This is because the highest-priority behavior always has full control of the system, suppressing any lower-priority behaviors. This simple subsumption model could be implemented directly in hardware, with each of the behaviors being triggered by stimuli produced by one or a combination of sensors, yielding a system capable of making a trip to the local corner store.

Final Results.

The final result of this project was a system that met or exceeded all of the original design requirements. The final platform was able to autonomously navigate within a space, avoiding both static obstacles (walls, boxes, etc.) and dynamic obstacles (people).