In the coming years, autonomous driving applications will become increasingly common in public transport. Self-driving electric shuttle buses will be driving in areas where passenger numbers aren’t large enough to support rail connections, but population density still calls for some type of public transport. These areas often include inner and outer suburbs, with people using various means of transport for moving around.
Whenever people and vehicles are on the move, you have to expect the unexpected. A car may stall in the middle of the road, traffic lights may go out of order, or familiar routes may be blocked due to roadwork. These kinds of special circumstances are particularly challenging for autonomous vehicles. Typically, a robot can navigate around an incident, but what the machine lacks is an ability to read the overall situation and communicate with other road users. Traffic is, after all, mostly about people interacting.
Different levels of vehicle autonomy
According to widely recognised industry definitions (the SAE classification), there are five levels of driving automation beyond Level 0, which means no automation at all. Regular cruise control is an example of low-level autonomy – a machine that maintains a set speed regardless of road topography.
Adaptive cruise control and lane-keeping assist are examples of Level 1 or Level 2 autonomy. At Level 3, advanced driver assistance systems are used to maximum effect, with the vehicle driven by a machine most of the time. Still, the human driver remains responsible for safe progress.
Level 5 autonomy means full autonomy with the human completely out of the equation. Up until a few years ago, the autonomous driving industry was mainly focused on this type of fully self-driving vehicle technology. Since then, awareness of the technical challenges involved has gradually increased and it has become clear that special circumstances in traffic are extremely difficult, if not impossible, for a machine to handle.
Remote control is used in Level 4 autonomy
Sensible 4 is developing vehicle autonomy for Level 4. At this level, autonomous driving technology handles driving without continuous human driver monitoring. Level 4 autonomy can’t handle all special circumstances, but it can always bring the vehicle to a safe stop. The driver – or operator – works from a remote location and takes control when necessary.
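The fallback behaviour described above – drive autonomously, stop safely when something can’t be handled, then hand over to a remote operator – can be sketched as a small state machine. This is purely an illustration; the state and event names are hypothetical and do not reflect Sensible 4’s actual software:

```python
from enum import Enum, auto

class DriveState(Enum):
    AUTONOMOUS = auto()     # software drives without continuous human monitoring
    SAFE_STOP = auto()      # vehicle has brought itself to a safe stop
    REMOTE_ASSIST = auto()  # remote operator has taken control

def next_state(state: DriveState, event: str) -> DriveState:
    """Level 4 fallback logic: the vehicle always reaches a safe stop
    first, and only then can a remote operator assume control."""
    if state is DriveState.AUTONOMOUS and event == "unhandled_situation":
        return DriveState.SAFE_STOP
    if state is DriveState.SAFE_STOP and event == "operator_takes_control":
        return DriveState.REMOTE_ASSIST
    if state is DriveState.REMOTE_ASSIST and event == "situation_cleared":
        return DriveState.AUTONOMOUS
    return state  # all other events leave the state unchanged
```

Note that there is no direct transition from autonomous driving to remote assistance: the safe stop always comes first.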
“The operator receives sensor data and video from inside and around the vehicle,” says Matthieu Myrsky, team lead for development of Sensible 4’s Remote Control Centre (RCC).
The operator can also monitor vehicle telemetry, including speed, location, and battery charge level, and, in demand-based transportation, rides that have been requested or are in progress.
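A single telemetry sample of the kind the operator console receives might be modelled like this. The field names and values are illustrative only, not Sensible 4’s actual data schema:

```python
from dataclasses import dataclass, field

@dataclass
class TelemetrySample:
    """One telemetry update as an operator console might display it.
    All field names are hypothetical, for illustration only."""
    vehicle_id: str
    speed_kmh: float
    latitude: float
    longitude: float
    battery_pct: float
    # Ride identifiers for requested or in-progress demand-based rides
    active_rides: list = field(default_factory=list)

# Example sample from a shuttle in central Helsinki (made-up values)
sample = TelemetrySample(
    vehicle_id="shuttle-01",
    speed_kmh=18.5,
    latitude=60.1987,
    longitude=24.9354,
    battery_pct=76.0,
    active_rides=["ride-1042"],
)
```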
Going forward, the RCC under development also enables voice communication between operator and passengers for handling typical customer service scenarios. If a robot bus encounters an obstacle it cannot navigate within the parameters of a predetermined route, the vehicle stops and the operator assumes control.
Operator commands for navigating obstacles are important, for example, when police are directing traffic and the bus computer system doesn’t receive ‘stop’ or ‘go’ information from traffic lights.
No remote steering wheel for autonomous vehicles
There is no virtual steering wheel or pedals, however, for controlling the vehicle in Sensible 4’s RCC.
“The delay in 4G communications between the shuttle bus and Remote Control Centre is the main obstacle preventing direct remote control,” Myrsky says. A telecommunications breakdown in the middle of overtaking another vehicle could also be a problem.
It’s difficult for the operator to remain fully in control when the vehicle’s video feed reaches the control centre with a slight delay and, similarly, when there is a delay between the operator giving a command and the control data reaching the vehicle’s steering.
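A back-of-the-envelope calculation (my own illustration, not a figure from Sensible 4) shows why even a modest delay matters: the distance a shuttle covers while a command is still in flight adds up quickly.

```python
def distance_during_delay(speed_kmh: float, delay_s: float) -> float:
    """Metres travelled before a remote command takes effect.
    Converts km/h to m/s (divide by 3.6) and multiplies by the delay."""
    return speed_kmh / 3.6 * delay_s

# At 40 km/h with a 0.5 s round-trip delay, the shuttle has already
# moved roughly 5.6 m before the operator's input reaches the steering.
blind_distance = distance_during_delay(40, 0.5)
```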
The improved data communications of gradually expanding 5G networks will reduce latency in vehicle remote control. Specifically, this will make it possible to deliver a richer and more detailed snapshot from the vehicle to the operator, and to provide road infrastructure information to the vehicle.
Sensible 4’s core competence lies in self-driving car and autonomous shuttle bus technology – the software in the vehicle computer system designed to interpret the sensor feed and make driving-related decisions accordingly. For remote control technology, Sensible 4 has partnered with the Japanese company BOLDLY, previously known as SoftBank Drive. Among other things, BOLDLY delivers the RCC dispatcher interface used in the FABULOS project, which receives information from vehicles driving in Pasila, Helsinki via Sensible 4’s programming interfaces.
Similarly, future demand-based services, various route planners, and mobile ticket apps developed by other parties are linked to Sensible 4’s systems through these interfaces.