So far, these screens and signals have existed only in a virtual-reality world, but last week workers in Dearborn, Mich., began installing prototypes on select vehicles in the fleet of Argo AI, the Pittsburgh startup building the virtual driver system that will power Ford's autonomous vehicles. Those cars will begin testing in Ford's Miami pilot project within a matter of weeks, according to the company.
The three light signals are designed to be understood by all road users -- Shutko opted against text because it would be limited to a single language and couldn't communicate with young children or adults who can't read. The lights are designed to meet regulations governing how lights may be displayed on vehicles, and to be easily comprehended across cultures and on a global scale.
Just as important as what the light signals are designed to convey is what they're not designed to do: Shutko says the light bars communicate the status and intent of a vehicle, much like a turn signal. What Ford's system intentionally avoids is telling others how they should act.
"We basically provide the status of the vehicle to others," he said. "We're not suggesting, 'We're turning, so you can now cross over there.' We didn't want to go down the road of recommendations."
That's not necessarily the position of others in the self-driving space. Bay Area startup Drive.ai, for example, which serves dedicated routes in the Dallas suburbs with self-driving Nissan NV200 vans, has developed external screens capable of displaying more than a dozen text-based messages and icons, one of which may prompt a pedestrian to start moving by saying, "Waiting for You to Cross."