The system combines embedded in-vehicle software with cloud software.
Nisenbaum said the company's embedded software uses proprietary algorithms and artificial intelligence to generate real-time insights into vehicle-road dynamics.
"About 50 to 60 self-driving vehicles equipped with our embedded software will drive on the streets of a particular city and collect information from existing nonvisual sensors," Nisenbaum said.
The embedded software analyzes sensor data such as wheel spin, wheel angle, rotations per minute and gear positions.
The results of all this data-crunching are fed back into the vehicle computer, providing better context for driving decisions.
"Vehicles embedded with this software are provided with insight on the road ahead, improving safety and user experience reaction time," Nisenbaum said.
Meanwhile, Tactile's cloud software aggregates data gathered by multiple vehicles on multiple roads to create constantly updated maps of road conditions and hazards, including bumps, cracks and potholes. The resulting crowdsourced 3D model, which the company calls SurfaceDNA, can be used by customers such as municipalities, fleet managers and insurers.
"Imagine the thought of taking multiple trips of different vehicles on the same road and overlaying the signals collected from each vehicle," Nisenbaum added. "Data collected are processed into machine-learning algorithms that tell a narrative about the road, and with that, we are able to create maps of the roads."