Our simulation program — sensor fusion

Roboauto
Published in Roboauto - blog
Apr 27, 2018

When it comes to training self-driving cars, it is well known that the best option is to train them in software first. That means building a real-world interface on a PC that simulates actual road conditions and lets you inject the many unexpected situations that can occur while driving. There are many simulation programs that could help with this, and we recently faced the decision of which one to choose. The simulator from Udacity is, sadly, aimed more at learning how to code and train neural networks than at extracting data that is genuinely useful for a deep-learning pipeline. Microsoft has also developed a simulation program, but it still has some bugs and its licensing is not open source, which makes it quite difficult to build additional extensions or even pull raw data. We arrived at the realisation that we needed to build our own made-to-measure simulation program.

Udacity simulation program, source

We looked at many options for building a realistic model. Our first attempt was with Grand Theft Auto 5: we took the offline version, made some changes via Script Hook, and were able to export everything we needed to build an autonomous car. Read here for further information. However, we were not fully satisfied with the options it offered for building sensor fusion and representing it.

In the end, we decided to build our own simulation program. It enables us to create any given scenario with any given objects and lighting. Our focus is on sensor fusion and on the simulator's ability to process and display it well. Have a look at our recent video showing how we implemented basic sensor fusion from camera and radar data. We also implemented GPS, lane detection and YOLO object detection.

Our latest video from our simulation program

This video shows only the basics of what our simulator can actually accomplish: various camera angles, an editable landscape, and configurable road surfaces (including different road-marking colours for specific countries). It also shows how our sensor fusion locates the car within the blue area based on probability; one can see that the difference it makes is enormous. Beyond that, we are working on two more videos from our simulator which will show how our fusion can help prevent incidents such as the recent ones involving Tesla and Uber autonomous vehicles. Stay tuned for more blog posts and videos. It could be more interesting than you think.
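As a rough illustration of the idea behind probability-based localisation from two sensors, here is a minimal sketch (not our actual implementation) of fusing independent camera and radar estimates of the same quantity by inverse-variance weighting, the textbook way to combine two Gaussian measurements. The numbers are hypothetical:

```python
def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity
    (e.g. a car's distance ahead, from camera and from radar).

    Inverse-variance weighting: the more certain sensor pulls the
    fused estimate toward itself, and the fused variance is always
    smaller than either input variance."""
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    fused_mu = fused_var * (mu_a / var_a + mu_b / var_b)
    return fused_mu, fused_var


# Hypothetical readings: camera sees the car 10.0 m ahead (sigma = 2 m),
# radar sees it 10.8 m ahead (sigma = 0.5 m). Radar is far more certain,
# so the fused estimate lands close to the radar reading.
mu, var = fuse_gaussian(10.0, 2.0 ** 2, 10.8, 0.5 ** 2)
print(round(mu, 3), round(var, 3))  # prints "10.753 0.235"
```

The shrinking fused variance is what the blue area in the video corresponds to conceptually: the more sensors agree, the tighter the region the car is believed to occupy.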


Roboauto delivers safe and effective solutions for the automation of machines in warehouses and public transportation. Drive anything from anywhere.