Simulation specialist develops data farming technique to synchronise multiple sensor data streams and remove the need for manual test data annotation.
Simulation specialists at rFpro have developed a method to cut the cost of simulation hardware, potentially removing the existing dependence on manual, frame-by-frame annotation of test data, a process that is both time-consuming and error-prone.
According to rFpro’s Matt Daley, many autonomous vehicle technology companies employ an army of people to manually annotate each video frame, LiDAR point cloud or radar return, identifying objects in the scene (such as other vehicles, pedestrians, road markings and traffic signals) to create training data.
“Our new approach provides a digital method of creating the same data completely error-free and 10,000 times quicker compared to manual annotation, which takes around 30 minutes per frame with a 10% error rate. This step-change will enable deep learning to fulfil its potential because it significantly reduces the cost and time of generating useful training data,” he says.
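The article does not describe rFpro's implementation, but the underlying idea of digital annotation is that a simulator already knows the identity and position of every object it renders, so per-frame labels can be emitted directly rather than drawn by hand. A minimal illustrative sketch (all names here, such as `SceneObject` and `annotate_frame`, are hypothetical, not rFpro's API):

```python
from dataclasses import dataclass

# Hypothetical sketch: inside a simulator, every rendered object's class and
# screen-space bounding box are known, so ground-truth annotations can be
# written out alongside each frame with no human in the loop.

@dataclass
class SceneObject:
    label: str                       # e.g. "pedestrian", "vehicle"
    bbox: tuple                      # (x, y, w, h) in image pixels

def annotate_frame(frame_id: int, objects: list) -> dict:
    """Emit a ground-truth annotation record for one rendered frame."""
    return {
        "frame": frame_id,
        "annotations": [{"label": o.label, "bbox": o.bbox} for o in objects],
    }

record = annotate_frame(0, [SceneObject("pedestrian", (120, 80, 40, 90))])
```

Because the labels come from the simulator's own scene state, they are exact by construction, which is the source of the "error-free" claim above.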
rFpro calls the new approach Data Farming and compares it to Render Farming, which has revolutionised the economics of popular animation. It enables customers to build complete datasets that cover the full vehicle system where every sensor is simulated at the same time. The data is synchronised across all sensors, even with the most complex hardware designs. This is essential where customers are employing sensor fusion to bring together data, for example from multiple 8K HDR stereo cameras, LiDAR and radar sensors at the same time.
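One way to picture the synchronisation described above is a single simulation clock driving every sensor model, so that samples from sensors running at different rates always share a common timebase. A hedged sketch, assuming a simple integer-tick clock (the `simulate` function and tick periods are invented for illustration):

```python
# Hypothetical sketch: one shared clock drives all sensor models, so every
# emitted sample carries the same timestamp even when sensors run at
# different rates -- the property sensor fusion depends on.

def simulate(ticks: int, sensors: dict) -> list:
    """sensors maps sensor name -> sample period in clock ticks.
    Returns a log of synchronised samples, one entry per tick with activity."""
    log = []
    for t in range(ticks):
        due = sorted(name for name, period in sensors.items() if t % period == 0)
        if due:
            log.append({"t": t, "sensors": due})
    return log

# Camera sampling every 2 ticks, LiDAR every 5: both coincide at t = 0 and 10.
log = simulate(11, {"camera": 2, "lidar": 5})
```

At the ticks where periods coincide, all sensors report against the identical timestamp, so fused datasets line up exactly rather than needing post-hoc interpolation.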
Data Farming is already being used by existing rFpro customers, including global Tier 1 supplier DENSO ADAS Engineering Services. “Through rFpro’s Data Farming we can create an extensive number of driving scenarios, allowing the generation of very large variations in scenes, all through the investment in a single platform,” said Francisco Eslava-Medina, Project Manager at DENSO ADAS. “This allows us to quickly and cost-effectively generate the vast quantity of quality training data that is essential for certain product development phases of computer vision technologies, especially for neural networks for our autonomous vehicle technologies.”
The new approach allows customers to start with even a single PC to perform a complex simulation involving multiple sensors. “Simulations don’t have to be run in real-time, offering flexibility to the user around the computing power required,” added Daley. “For engineers, this puts it within a typical departmental budget, rather than requiring senior approval. High-quality training and test data is now far more accessible.” Data Farming is fully scalable, allowing customers to expand across multiple hardware resources when they are ready to accelerate their data production.