The company provides driving simulation software and 3D content for Deep Learning, Autonomous Driving, ADAS and Vehicle Dynamics testing and validation.
Its software is already used by OEMs and top-tier suppliers to train, test and validate Deep Learning systems for ADAS and autonomous applications.
By developing a physical model of the real world, known as the Ground Truth, the company can accurately test a vehicle’s perception of its surroundings, something that has not previously been possible.
This will enable legislators to define an approval process for a vehicle within a completely virtual environment, certifying it as safe to use on real roads.
The technique is already being used to validate vehicle safety in Euro NCAP tests.
“Most system modelling begins with ideal sensor models in order to validate the algorithms and control systems of the vehicle, but this bypasses any limitations in the sensors themselves,” said rFpro technical director Chris Hoyle.
“Difficult lighting conditions or the reflections in a shop window can corrupt a sensor’s perception of the vehicle’s surroundings, leading to potentially catastrophic errors. Thorough validation of a CAV or ADAS-equipped vehicle must include the sensors’ ability to recognise and characterise the features of its environment,” he added.
The ability to evaluate the sensors’ perception during simulation matters because future legislation is likely to dictate the virtual testing and approval of any autonomous system before its use on public roads is permitted.
The whole system must, therefore, be tested in a fully representative virtual environment, not just elements of it.
Validating an autonomous vehicle requires covering a vast number of miles across a huge range of environments, which is not feasible in the real world.
Sensor perception is the most challenging aspect because it requires a physically accurate virtual world with high levels of correlation to the real world and physically modelled sensors.
“Physical modelling means simulating the materials and properties of every object encountered by the vehicle and its onboard sensors, rather than just an abstract representation of it as used by most systems,” said Hoyle.
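The distinction Hoyle draws can be sketched in code. The following is a minimal illustration, with entirely hypothetical property names and values (not rFpro's data model), of the difference between an abstract semantic label and a physically modelled material that a sensor model can actually interact with:

```python
from dataclasses import dataclass

# Abstract representation: a semantic label only, as used by many systems.
@dataclass
class AbstractObject:
    label: str  # e.g. "pedestrian", "shop_window"

# Physically modelled material: properties a sensor model can interact with.
@dataclass
class PhysicalMaterial:
    diffuse_reflectance: float   # fraction of light scattered diffusely (0-1)
    specular_reflectance: float  # mirror-like reflection (glass ~0.08 at normal incidence)
    radar_cross_section: float   # m^2, how strongly the object returns radar energy
    surface_roughness: float     # affects scattering of both light and radar

shop_window = PhysicalMaterial(
    diffuse_reflectance=0.05,
    specular_reflectance=0.08,
    radar_cross_section=1.0,
    surface_roughness=0.001,
)

def camera_return(material: PhysicalMaterial, incident_lux: float) -> float:
    """Simplified: the light reaching a camera from a surface depends on its
    physical reflectance, not just its semantic label."""
    return incident_lux * (material.diffuse_reflectance + material.specular_reflectance)
```

With only an abstract label, a simulated camera has no way to reproduce the strong specular reflection of a shop window, which is exactly the corrupting effect Hoyle describes.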
“With several years’ experience of creating digital twins of city streets, rural roads, proving grounds and test tracks, we understand the complexities of modelling features like changing weather conditions or road surfaces.
“Our engineers are constantly being challenged by our customers to bridge the gap between simulated and real-world testing, whether that is eight stereo 4K cameras with live exposure control and real motion blur modelled, or a radar model picking up the micro-Doppler from a pedestrian moving their arm. All of this must be possible for successful simulation, and it can all be done using rFpro.”
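The micro-Doppler example rests on standard radar physics: a monostatic radar sees a two-way Doppler shift of f_d = 2·v_r/λ, so a swinging arm superimposes a small time-varying shift on the pedestrian's bulk return. A rough sketch, with illustrative parameters rather than rFpro's sensor model:

```python
import math

RADAR_FREQ_HZ = 77e9            # typical automotive radar band
C = 3.0e8                       # speed of light, m/s
WAVELENGTH = C / RADAR_FREQ_HZ  # ~3.9 mm

def doppler_shift_hz(radial_velocity_mps: float) -> float:
    """Two-way Doppler shift for a monostatic radar: f_d = 2 * v_r / lambda."""
    return 2.0 * radial_velocity_mps / WAVELENGTH

def arm_radial_velocity(t: float, swing_amplitude_m: float = 0.3,
                        swing_period_s: float = 1.0) -> float:
    """Radial velocity of a swinging arm modelled as simple harmonic motion
    (invented parameters for illustration)."""
    omega = 2.0 * math.pi / swing_period_s
    return swing_amplitude_m * omega * math.cos(omega * t)

# Sample the micro-Doppler signature over one swing.
for t in (0.0, 0.25, 0.5):
    print(f"t={t:.2f}s  f_d={doppler_shift_hz(arm_radial_velocity(t)):.0f} Hz")
```

The arm's shift oscillates around zero over the swing period, which is the time-varying signature a physically modelled radar must reproduce for a simulated pedestrian to look realistic to the perception system.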
Hoyle believes the company is unique in providing a closed-loop end-to-end simulation of autonomous systems, with a vital element of this process being the inclusion of an accurate vehicle model that responds in a fully representative manner to road surface changes and control inputs.
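A closed-loop setup of the kind described feeds sensor output through perception and control back into the vehicle dynamics, which in turn changes what the sensors see on the next tick. A toy one-dimensional sketch, with all component names and dynamics invented for illustration (this is not rFpro's API):

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    position: float = 0.0   # 1-D position along the road, m
    speed: float = 0.0      # m/s

    def step(self, throttle: float, grip: float, dt: float) -> None:
        """Toy longitudinal dynamics: acceleration limited by road grip."""
        accel = min(throttle * 5.0, grip * 9.81)  # m/s^2
        self.speed += accel * dt
        self.position += self.speed * dt

class Environment:
    def render_sensor(self, position: float) -> float:
        """Pretend sensor frame: distance to an obstacle placed at 100 m."""
        return 100.0 - position

    def grip_at(self, position: float) -> float:
        return 1.0  # dry asphalt everywhere in this toy world

def perceive(frame: float) -> float:
    return frame  # ideal perception, for the sketch only

def control(distance_to_obstacle: float) -> float:
    """Lift off (zero throttle) when close, otherwise accelerate."""
    return 0.0 if distance_to_obstacle < 20.0 else 1.0

def run(steps: int = 100, dt: float = 0.1) -> Vehicle:
    env, vehicle = Environment(), Vehicle()
    for _ in range(steps):
        frame = env.render_sensor(vehicle.position)        # sensors
        estimate = perceive(frame)                         # perception
        throttle = control(estimate)                       # planning/control
        vehicle.step(throttle, env.grip_at(vehicle.position), dt)  # dynamics
    return vehicle
```

The point of the loop structure is that no stage can be validated in isolation: a perception error changes the control input, which changes the vehicle's motion and therefore every subsequent sensor frame.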
“Perception is critical not just from a safety point of view but also for consumer satisfaction,” he said. “For example, with our system, minute road surface differences are accurately modelled, which means ride comfort can be assessed. We can explore the CAV’s ability to identify and avoid potholes like a human would. Without this, passengers are unlikely to want to ride in an autonomous vehicle again.”
A current UK government project, dRISK, uses the company's software to validate sensor models against the real world so they can be correlated with actual sensors. This is seen as an essential stepping stone towards end-to-end validation of autonomous systems through simulation.
“Due to the infinite range of possible inputs from these sensors, there is a big emerging need for autonomous system test engineers making use of simulation software,” concluded Hoyle.
“Previously there has been around a 1:1 ratio of software engineers to test engineers in the automotive sector. We believe this is moving closer towards the avionics sector where there are around five test engineers per software engineer to ensure the safety of passengers.”
When developing systems based on machine learning from sensor feeds such as camera, LiDAR and radar, the quality of the 3D environment model is very important.
The more accurate the virtual world is, the greater the correlation will be when progressing to real-world testing.
The company's HiDef models are built around a graphics engine that includes a physically modelled atmosphere, weather and lighting, as well as physically modelled materials for every object in the scene.
Hundreds of kilometres of public road models are available off-the-shelf from rFpro, spanning North America, Asia and Europe and including multi-lane highways and urban, rural and mountain routes, all copied faithfully from the real world.
The company's platform scales from a desktop workstation to a massively parallel real-time test environment, connecting to customers' autonomous driver models and human test drivers.