Here’s an idea: rather than work the bugs out of self-driving vehicles on real-world roads, why not use a “simulated urban environment,” complete with faux buildings and traffic lights?
That’s the concept behind Mcity, a full-scale simulated urban environment at the University of Michigan. The 32-acre site – which opened in July and is part of the school’s Mobility Transformation Center – is packed with street lights, crosswalks, lane delineators, curb cuts, bike lanes, trees, hydrants, sidewalks, signs, traffic control devices, even construction barriers.
The facility also offers a range of surfaces – concrete, asphalt, simulated brick and dirt – along with two-, three- and four-lane road configurations, as well as ramps, roundabouts and tunnels.
“The goal of Mcity is that we get a scaling factor,” noted Ryan Eustice, an associate professor with the university. “Every mile driven there can represent 10, 100 or 1,000 miles of on-road driving in terms of our ability to pack in the occurrences of difficult events.”
Eustice also serves as principal investigator for Ford Motor Co.’s work at the Mcity site, helping the automaker put its self-driving technologies through real-world road scenarios – such as running a red light – that can’t be safely replicated on public roads.
“Testing our autonomous vehicle fleet at Mcity provides another challenging – yet safe – urban environment to repeatedly check and hone these new technologies,” noted Raj Nair, Ford’s group vice president for global product development, in a statement.
He added that Ford revealed its Fusion Hybrid Autonomous Research Vehicle with the University of Michigan and State Farm Insurance in 2013 in an effort to advance sensing systems so self-driving technologies could be integrated into Ford’s next-generation vehicles.
Nair also referenced Ford’s announced plans to move its autonomous vehicle research to the next step in development – what the company calls the “advanced engineering phase” – in which engineers work to make sensing and computing technologies feasible for production while continuing to test and refine algorithms.
For example, Ford’s Fusion Hybrid Autonomous Research Vehicle merges current driver-assist technologies – front-facing cameras, radar and ultrasonic sensors – with four LiDAR (Light Detection and Ranging) sensors to generate a real-time, three-dimensional map of the vehicle’s surrounding environment – a map essential for dynamic driverless vehicle performance.
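To give a rough sense of what building such a map involves, here is a minimal Python sketch of fusing point clouds from multiple LiDAR units into a single voxel occupancy grid. The sensor layout, voxel resolution and function names are illustrative assumptions for this article, not Ford’s actual implementation.

```python
# Illustrative sketch only: fuse LiDAR point clouds into a coarse 3D
# occupancy grid. Voxel size and sensor data are assumed values.
from collections import defaultdict

VOXEL_SIZE = 0.5  # meters per grid cell (assumed resolution)

def voxelize(point, size=VOXEL_SIZE):
    """Snap an (x, y, z) point to the integer index of its voxel."""
    return tuple(int(c // size) for c in point)

def fuse_scans(scans):
    """Merge point clouds from several LiDAR units into one occupancy map.

    scans: a list of point lists, one per sensor, already transformed
    into a common vehicle-centered coordinate frame.
    Returns a dict mapping voxel index -> number of supporting points.
    """
    occupancy = defaultdict(int)
    for scan in scans:
        for point in scan:
            occupancy[voxelize(point)] += 1
    return dict(occupancy)

# Example: two sensors both detect an obstacle roughly 2 m ahead.
front_lidar = [(2.1, 0.0, 0.4), (2.2, 0.1, 0.5)]
rear_lidar = [(2.15, 0.05, 0.45)]
grid = fuse_scans([front_lidar, rear_lidar])
```

A production system would of course run this at sensor frame rates, calibrate each unit’s mounting transform, and filter noise – but the core idea of merging overlapping scans into one shared model of the surroundings is the same.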
“This is an important step in making millions of people’s lives better and improving their mobility,” Nair noted.
Will ironing out the “bugs” in self-driving vehicles make them more attractive? That remains to be seen – though some recent surveys seem to cast doubt on the willingness of drivers to let go of the wheel.
[Then there’s the ticklish problem of what to do if a self-driving vehicle gets cited for a traffic violation – especially if no human is in the car.]
Yet other research indicates that more widespread adoption of “driver assistance” technology could significantly reduce vehicle crash rates – and save a ton of money, too.
Maybe such safety benefits will be the key to wider acceptance of self-driving vehicle systems. We’ll see.