To make self-driving smooth and safe in all weather conditions, the Sensible 4 autonomous vehicle fleet is test-driven hundreds of kilometers almost every day. Our office is located in Espoo, from where the vehicles can be quickly dispatched to collect test data in complex and dynamic city environments.
However, proper winters are rare in southern Finland, which makes it difficult to test our software in harsh winter conditions. For this reason, we spent two and a half weeks in Muonio in December, where the temperature was already well below -20 degrees Celsius.
The trip had three goals: gathering training data, hardware testing, and so-called full-stack testing.
To measure the weather conditions, we brought along a Vaisala PWD22 weather instrument. The PWD22 precisely measures the current weather parameters and helps us identify the threshold conditions in which our hardware and software can still operate.
Full-stack performance in harsh winter conditions
Sensible 4 is essentially a software company, providing a full-stack solution for autonomous vehicles. This means incorporating individual software components, such as motion planning, localization and object avoidance, into a coherent system in which every component works seamlessly with the others.
For full-stack testing, we had designed common traffic scenarios such as overtaking, emergency braking and adapting to the speed of a lead car. In Muonio, we were especially interested in running these tests while the road surface was slippery and during heavy snowfall. Both factors matter: when the wheels slip, the vehicle motion calculated from the wheel encoders no longer matches the real vehicle motion, which can confuse our localization. In addition to slippery surfaces, heavy snowfall impairs our vision-based localization and object detection by reducing visibility and deforming the landscape beyond recognition.
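To see why slip is a problem for localization, consider a minimal dead-reckoning step from wheel-encoder ticks. The sketch below is illustrative only: the wheel radius, encoder resolution and track width are made-up constants, not our vehicle's actual parameters, and the differential-drive model is a textbook simplification of what a real localization stack does.

```python
import math

# Hypothetical vehicle constants (illustrative, not Sensible 4's values).
WHEEL_RADIUS_M = 0.3
TICKS_PER_REV = 2048
TRACK_WIDTH_M = 1.6  # lateral distance between left and right wheels

def dead_reckon(pose, left_ticks, right_ticks):
    """One dead-reckoning update from wheel-encoder ticks.

    pose is (x, y, heading). Under wheel slip the encoders report more
    rotation than the vehicle actually travelled, so this estimate
    drifts away from the true pose until another sensor (lidar, camera,
    GNSS) corrects it.
    """
    x, y, th = pose
    per_tick = 2.0 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0          # distance of vehicle center
    d_theta = (d_right - d_left) / TRACK_WIDTH_M # heading change
    return (x + d_center * math.cos(th + d_theta / 2.0),
            y + d_center * math.sin(th + d_theta / 2.0),
            th + d_theta)

# On dry asphalt, 1000 ticks per wheel corresponds to roughly 0.92 m of
# straight travel with these constants; on ice the same tick count may
# correspond to far less real motion, so the estimate overshoots.
pose = dead_reckon((0.0, 0.0, 0.0), 1000, 1000)
```

Because the encoder-only estimate overshoots on ice, the localization system has to down-weight odometry whenever slip is suspected.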
Vision dataset on adverse weather conditions
Our Obstacle Detection and Tracking System (ODTS) uses artificial neural networks, currently the dominant technique in vision-guided robotics, for its vision-related tasks.
However, most of these techniques are data-hungry: they require a large number of example images of objects and scenarios from realistic environments to work properly.
For instance, one of the most typical tasks for autonomous vehicles is road lane detection, which serves as a building block for motion planning and control. In Muonio, we were interested in collecting image data from road conditions where the lane markings are not visible because of the amount of snow on the road.
In addition to identifying the drivable path, one of the most crucial tasks is object detection. During winter, people wear different clothes and vehicles may be covered in snow, making their appearance very different from summertime. It was important to collect samples of these cases in order to train our perception systems for winter conditions.
The third objective of the test trip was hardware testing. The main goal was to evaluate the performance and robustness of various lidars, radars and cameras in Arctic weather conditions.
For instance, there is today a huge number of lidar manufacturers, each trying to outdo the others. Although prices have come down steadily over the last couple of years, a single lidar sensor can still cost tens of thousands of dollars, and a single vehicle typically carries several of them.
The sensors have to be robust, as autonomous vehicles operate outdoors, exposed to natural phenomena that can deteriorate sensor performance, such as direct sunlight, fog or water accumulating on the sensor surface. In Muonio we faced even more extreme conditions, such as freezing temperatures and snow whirling up from the road.
One of our main findings was that some sensors lose significant performance when the temperature drops below -10 degrees Celsius. In addition, the combination of high humidity and low temperature can form a thin layer of ice on the lidar surface, eventually obstructing its field of view and effectively blinding it to surrounding obstacles.
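A blinded lidar should at least be detected at runtime. One simple heuristic is to watch the fraction of dropped returns per sweep; the sketch below assumes a hypothetical driver that reports each sweep as a list of ranges where a non-positive value marks a lost return, and the 40% threshold is illustrative rather than a tuned value from our stack.

```python
# Hypothetical threshold: flag the sensor if more than 40% of returns
# in a sweep are missing. A tuned system would also consider where in
# the field of view the dropouts cluster.
DROPOUT_RATIO_LIMIT = 0.4

def lidar_blocked(ranges, limit=DROPOUT_RATIO_LIMIT):
    """Return True if the share of missing returns suggests the lidar
    window is obstructed, e.g. by a thin layer of ice."""
    if not ranges:
        return True  # no data at all counts as blocked
    dropped = sum(1 for r in ranges if not r or r <= 0.0)
    return dropped / len(ranges) > limit

clear_sweep = [5.0] * 90 + [12.3] * 10  # all returns valid
iced_sweep = [0.0] * 70 + [8.1] * 30    # 70% of returns lost
```

A check like this can trigger a fallback, such as slowing down or handing control to a remote operator, before the blinded sensor causes a missed detection.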
We documented this abnormal behavior, and in the future we will rely on the sensors that performed well even in the hardest conditions. Another option is to create custom solutions, such as protective covers, to avoid these problems.
Main takeaways from the trip
Muonio provided us with a wealth of information for future development and testing. The collected data can be used to improve our ODTS stack so that it copes with winter conditions in which common landmarks and objects are not visible or look completely different than in summer.
In the hardware tests, we were surprised by how differently the various sensors developed for autonomous driving performed in freezing and snowy conditions. Some sensors went completely blind or otherwise acted strangely, whereas others worked perfectly even at the lowest temperatures.
Finally, the full-stack tests gave us a big picture of the current state of our autonomous driving software in harsh weather. Based on the results, we should be more than ready in 2022, when we will launch Dawn, the world's first commercial all-weather Level 4 autonomous driving software.
Antti Hietanen works as a Senior Autonomous Vehicle Engineer at Sensible 4. Less than a month ago he successfully defended his PhD thesis, Safe and Autonomous Robots using Computer Vision.