Navigating Rural Areas Through the Eyes of Autonomous Sensors

Published: March 04, 2019

Massachusetts Institute of Technology Research Team Working to Solve Autonomy-Related Perception and Navigation Issues

Photo courtesy of MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).

Autonomous vehicles see the environment through the digital eyes of multiple sensors strategically placed around the vehicle for a 360° view. Using radar, infrared cameras, and ultrasonic sensors, the vehicle collects a constant influx of raw data and transmits it to an electronic control unit (ECU), where it is processed in a rapid, time-correlated manner to determine what action the car should take.
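As a rough illustration of that time-correlation step, the Python sketch below pairs each camera frame with the radar reading closest to it in time before both are handed to downstream logic. The article does not show the actual fusion code; every name and value here is hypothetical.

```python
from bisect import bisect_left

def nearest_index(timestamps, t):
    """Index of the stored timestamp closest to t (timestamps sorted)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer in time.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Hypothetical buffers of (timestamp_s, reading) pairs from two sensors.
radar = [(0.00, "r0"), (0.05, "r1"), (0.10, "r2"), (0.15, "r3")]
camera = [(0.02, "c0"), (0.07, "c1"), (0.13, "c2")]

radar_times = [t for t, _ in radar]
for t_cam, frame in camera:
    j = nearest_index(radar_times, t_cam)
    print(f"fuse {frame} (t={t_cam:.2f}) with {radar[j][1]} (t={radar_times[j]:.2f})")
```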
Perception, as you can imagine, is everything when it comes to autonomous driving. But how can safety be guaranteed? Faced with a slew of variable road conditions, how is the best vehicle path determined? How do you find the optimal route when you factor in the rules of the road, multiple destinations, and other variables? 
These are the kinds of questions that student engineers are working to solve at the Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL). The students are in year four of a five-year research project to find novel approaches to solving autonomy-related issues.
The 12-member team, made up of postdoctoral scholars, PhD and Master’s students, is being led by Daniela Rus, Director of the Computer Science and Artificial Intelligence Laboratory and Professor in the Department of Electrical Engineering and Computer Science at MIT. She and Sertac Karaman, Associate Professor of Aeronautics and Astronautics at MIT, are serving jointly as lead principal investigators of the autonomous car research group. 

“This team is very strong technically and well-integrated. The team is pushing the boundaries of what self-driving cars can do, in order to shape the future of personal mobility.”

Professor Daniela Rus, Massachusetts Institute of Technology (MIT) 

The projects that the team is currently working on include:

  • Navigating a self-driving car in a rural area
  • Guiding a car to steer using a neural network
  • Engineering a safe, autonomously operated wheelchair

“We are trying to solve problems confronting autonomous cars, and there are many,” said MIT Robotics Software Engineer Thomas Balch. “Our research is focused on such things as navigating on difficult rural roads and developing robust perception methods that will work in extreme weather conditions.”

MIT Robotics Software Engineer Thomas Balch shows the various hardware components used to autonomously drive their Prius V. The equipment includes a dSPACE MicroAutoBox to interface between the computers and the ECUs, a Pilot Systems relay box to switch between manual and computer control, NVIDIA Drive PX2 for machine learning and deep neural networks, a GPS unit and power distribution boxes.

Guiding Autonomous Vehicles Through Unmarked Roads

Self-driving cars are largely dependent on the availability of 3-D maps, which meticulously pinpoint the landscape of the open road: driving lanes, exit ramps, road signs, intersections, and so on. To prepare for driverless technology, roadways across major cities are being labeled and 3-D-mapped. But how are autonomous cars supposed to navigate unmarked roads? There are millions of miles of country roads in rural areas that will not be 3-D-mapped for quite some time.

To address this problem, the team has developed a system named MapLite, which enables autonomous navigation without 3-D maps. MapLite uses GPS data to place the car roughly on a topological map, and lidar sensors to detect the boundaries of the road surface and generate a trajectory that tracks them. It combines these road boundary detections with the vehicle’s odometry (data from motion sensors used to estimate changes in position) so that a self-driving vehicle can follow the road reliably and at high speed.
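A highly simplified sketch of the idea (illustrative Python, not the team’s MapLite code): steer toward the midpoint of the lidar-detected road edges while biasing toward the rough GPS heading to the next topological waypoint. Every function name, gain, and value below is an assumption.

```python
import math

def road_center_offset(left_edge_m, right_edge_m):
    """Lateral offset (m) of the road center from the vehicle.

    left_edge_m / right_edge_m are lateral distances to the detected
    road boundaries (positive = left of the vehicle), as a hypothetical
    lidar edge detector might report them.
    """
    return 0.5 * (left_edge_m + right_edge_m)

def steering_command(offset_m, heading_to_goal_rad, k_lat=0.4, k_head=0.2):
    """Blend 'stay centered on the road' with 'head toward the GPS goal'."""
    return k_lat * offset_m + k_head * heading_to_goal_rad

# Example: road edges detected 3.2 m to the left and 2.4 m to the right,
# with the next GPS waypoint about 5 degrees left of the current heading.
offset = road_center_offset(3.2, -2.4)
command = steering_command(offset, math.radians(5.0))
print(f"lateral offset {offset:+.2f} m -> steering command {command:+.3f} rad")
```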

“We are trying to find a way to use sensors to drive in a place where you may not have a dense feature map,” said Balch. “At most, you may have a phone GPS, or you may have nothing at all. How do you stay on the road and follow it smoothly? What if the pavement ends? What if there are intersections? Those are some of the things MapLite is trying to solve.”

A technical paper on MapLite was presented at the International Conference on Robotics and Automation (ICRA) in Brisbane, Australia, in 2018. The team is optimistic that MapLite technology will be used in future autonomous vehicles.

Training a Neural Network to Steer a Car

To study various autonomy-related scenarios, the research team uses two 2016 Prius V cars equipped with an array of sensors, including lidar, an IMU, GPS, cameras, and encoders. dSPACE MicroAutoBox prototyping units are used as the interface between the autonomy software and the vehicle’s control systems.

The team uses the cars to build algorithms that learn to steer from raw camera input. They are developing these end-to-end techniques to teach the car, from scratch, how to navigate at different times of day and in different environments.
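The article does not detail the team’s network, but a minimal end-to-end steering model in this spirit, loosely patterned after published camera-to-steering architectures such as NVIDIA’s PilotNet, might look like the PyTorch sketch below. Every layer size here is illustrative.

```python
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Minimal end-to-end steering network: raw camera frame in, one
    steering value out. Layer sizes are illustrative only."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 8)),  # fixed-size features for any input
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(48 * 4 * 8, 100), nn.ReLU(),
            nn.Linear(100, 1),  # predicted steering angle
        )

    def forward(self, img):
        return self.regressor(self.features(img))

# Training would regress the output against steering angles recorded
# from a human driver, paired with the corresponding camera frames.
model = SteeringNet()
frame = torch.randn(1, 3, 120, 320)  # one normalized RGB frame (N, C, H, W)
print(model(frame).shape)            # torch.Size([1, 1])
```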

“The underlying autonomy software on the platforms enables them to function as fully autonomous vehicles,” said Balch. “This software has to stay modular and easy to interface with so that any team members can bring their research and quickly load it to the car. They depend on the rest of the car functioning autonomously so that any results we see during experiments can be used to validate or troubleshoot their research.”

Nearly every piece of the team’s research thus far has come with its own set of challenges and a continuous need for improvement. Almost every time new code is written to improve one part of an underperforming software stack, another area in need of optimization is revealed.

Balch shared an example of a single glitch that affected the entire system. The Linux computer used to control the car was being updated when, suddenly, only a few sensor messages, and none of the output commands to the PCB, were being sent faster than 1 Hz. After some searching, the team found that a Linux kernel update had changed the computer’s USB rules, capping the USB transfer rate. As a result, the computer could no longer control the car or receive some of its critical sensor inputs.

“Once we found the bug, we made a patch to our rules, which reset the value to what it was before,” Balch said. “This was not necessarily a hurdle for autonomy, but definitely a very difficult bug to track down.” 
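One way to catch this kind of regression early is to watch message rates directly. The Python sketch below is a hypothetical monitor, not the team’s tooling, that warns when a stream drops below a minimum frequency.

```python
import time
from collections import deque

class RateMonitor:
    """Warn when a message stream's rate falls below a floor (Hz).

    Illustrative only; the team's actual diagnostic tooling is not
    described in the article.
    """

    def __init__(self, name, min_hz=50.0, window=20):
        self.name = name
        self.min_hz = min_hz
        self.stamps = deque(maxlen=window)

    def tick(self):
        """Call once per received message."""
        self.stamps.append(time.monotonic())
        if len(self.stamps) >= 2:
            span = self.stamps[-1] - self.stamps[0]
            rate = (len(self.stamps) - 1) / span if span > 0 else float("inf")
            if rate < self.min_hz:
                print(f"WARN: {self.name} at {rate:.1f} Hz (expected >= {self.min_hz} Hz)")

monitor = RateMonitor("steering_cmd", min_hz=50.0)
for _ in range(30):
    monitor.tick()   # in real use, called from the receive callback
    time.sleep(0.1)  # simulate a stream throttled to ~10 Hz
```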

Engineering a Self-Driving Wheelchair

Another project the team is focused on is the development of an autonomous wheelchair. Using two electric wheelchairs controlled with a custom PCB and fitted with onboard computers and sensors, the team runs autonomy software that navigates the wheelchairs without human intervention. The sensors create a 3-D map of the surrounding environment so that obstacles in the wheelchair’s path can be detected and avoided.
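As a minimal sketch of the obstacle check, assuming the 3-D map is collapsed into a 2-D occupancy grid (the team’s actual representation is not described in the article), a planned path can be rejected if any waypoint comes too close to an occupied cell:

```python
import numpy as np

# Hypothetical 2-D occupancy grid at 0.1 m per cell, derived from the
# sensors' 3-D map: 1 = obstacle, 0 = free space.
grid = np.zeros((100, 100), dtype=np.uint8)
grid[40:45, 50:70] = 1  # e.g., a table edge across the corridor

def path_is_clear(grid, path_cells, margin=2):
    """Reject a path if any waypoint comes within `margin` cells of an obstacle."""
    for r, c in path_cells:
        r0, r1 = max(r - margin, 0), min(r + margin + 1, grid.shape[0])
        c0, c1 = max(c - margin, 0), min(c + margin + 1, grid.shape[1])
        if grid[r0:r1, c0:c1].any():
            return False
    return True

straight_ahead = [(r, 55) for r in range(30, 60)]
print(path_is_clear(grid, straight_ahead))  # False -> replan around the obstacle
```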

The team is gathering research on how to navigate through cluttered spaces. They are looking into questions such as: What is the best way to navigate those spaces to stay safe, avoid collisions, and drive smoothly? How should an autonomously operated wheelchair respond if an aggressive person tries to run the chair into a wall or off its path?

The team is hopeful that one day the autonomous wheelchairs can be used in hospitals to aid patients with mobility issues.

The Role of the dSPACE MicroAutoBox

The research team uses laptops to run the autonomy software, process sensor data, and output control signals. The Prius V cars have been retrofitted to send voltages to the steering, brake, and accelerator ECUs in a way that mimics the signals that the vehicle’s own sensors would send. This setup essentially tricks the car into driving itself.

Balch said the dSPACE MicroAutoBox units perform two significant roles as the interface between the computers and the car’s ECUs. The MicroAutoBox reads the CAN bus, the output of the car’s actual sensors, and the output of encoders. It packages the data into messages that can be sent to the computers. This information, along with the data from other sensor inputs, is then used to provide feedback on the state of the car. Control signals are then sent out to tell the car what to do. The MicroAutoBox receives those signals in a message and translates them into the appropriate voltage, which it then outputs to the corresponding ECU.
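The sketch below illustrates both directions of this flow in miniature: packing decoded sensor values into a message for the computers, and mapping a normalized control command back to an ECU voltage. The message layout, voltage range, and scaling are invented for illustration; the actual dSPACE protocol is not described in the article.

```python
import struct

# Hypothetical message layout between the interface unit and the laptops;
# the actual dSPACE format and scaling are not described in the article.
SENSOR_FMT = "<dff"  # timestamp (s), wheel speed (m/s), steering angle (rad)

def pack_sensor_state(timestamp_s, wheel_speed, steering_angle):
    """Bundle decoded CAN/encoder values into one message for the computers."""
    return struct.pack(SENSOR_FMT, timestamp_s, wheel_speed, steering_angle)

def command_to_voltage(cmd, v_min=0.5, v_max=4.5):
    """Map a normalized control command in [-1, 1] onto the voltage range a
    target ECU expects (the range here is illustrative)."""
    cmd = max(-1.0, min(1.0, cmd))
    return v_min + (cmd + 1.0) * 0.5 * (v_max - v_min)

msg = pack_sensor_state(12.345, 8.2, 0.03)
print(len(msg), "bytes;", f"steer cmd 0.25 -> {command_to_voltage(0.25):.2f} V")
```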

“The MicroAutoBox has performed very well!” said Balch. “Before, we were struggling to send our messages to our custom board over a USB connection and at frequencies that are sufficiently high. But now we are easily sending and receiving at over 100 Hz, and the firmware running on the MicroAutoBox has a 1 ms run time, so we could conceivably send and receive commands at up to 1 kHz, which is great. The voltage levels are also very smooth and consistent, so we aren’t nearly as likely to trip an error in the car because the voltage dips unexpectedly.”

Balch said the desire to improve the performance of the steering controller on the cars was one of the main reasons the team decided to upgrade its hardware to the MicroAutoBox. Before that, the team used a custom PCB that an electrical engineer developed in their lab. It worked, but there were some issues that resulted in poor control of the steering.

“The improved voltage control, command resolution, and communication speed of the MicroAutoBox give us much more constrained and faster voltage control, which results in much smoother steering,” said Balch.

Guidance from Pilot Systems

Pilot Systems built and installed a relay box (upper left) for the MIT CSAIL team. The relay box switches actual sensor signals (e.g., for acceleration, braking, and steering) to the MicroAutoBox, where they are converted into inputs and sent to the vehicle’s electronic control units for autonomous control. The relay box also serves as a signal conditioner for sensors, such as axle encoders, to accurately calculate distance and speed.
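The distance and speed calculation from conditioned encoder signals is simple to illustrate. In the Python sketch below, the tick resolution and wheel diameter are hypothetical values, not the team’s.

```python
import math

# Hypothetical axle encoder parameters; the article gives no actual values.
TICKS_PER_REV = 1024
WHEEL_DIAMETER_M = 0.62

def ticks_to_distance(ticks):
    """Distance rolled (m) for a given encoder tick count."""
    return ticks / TICKS_PER_REV * math.pi * WHEEL_DIAMETER_M

def speed_from_ticks(ticks, dt_s):
    """Average speed (m/s) over a sampling interval dt_s."""
    return ticks_to_distance(ticks) / dt_s

# 512 ticks in 0.1 s is half a wheel revolution per 100 ms.
print(f"{ticks_to_distance(512):.3f} m, {speed_from_ticks(512, 0.1):.2f} m/s")
```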

To aid the research team, MIT receives engineering support in several areas from Pilot Systems, a US-based systems design and consulting firm specializing in multiple areas, including vehicle-level systems engineering.

Pilot Systems rewrote the team’s firmware in Simulink® and added a framework to help simplify future upgrades. The consulting firm also built and delivered a hardware assembly and relay box. Additionally, Pilot Systems assisted the team with acceptance tests and provided insight into ways to integrate new features and good Simulink and ControlDesk coding practices to help keep the firmware clean and organized for easy use.

“Pilot Systems has been extremely helpful,” said Balch. “They were, and continue to be, very responsive to any questions or requests we have, and they were very flexible in the design process when we were trying to figure out the best way to transfer the functionality we had over to the MicroAutoBox. They are very friendly, professional, easy to work with, and they deliver a quality product.”

Over the past four years, the team has completed a considerable amount of research and documented their lessons learned. They are hopeful that their work will have a meaningful impact on the future development of autonomous cars.

With the kind permission of Massachusetts Institute of Technology (MIT)
