I was reminded of it when I saw this story about how crime data will be used to recommend safe parking areas. Of course that's a cool feature. When I was at INRIX we looked at similar advanced routing ideas for everything from scenic drives to fuel efficiency and accident frequency.
We’ve all relied on mobile apps for the “fastest” route with the least traffic, even if that route might otherwise be difficult or dangerous to drive. Waze already has the ability to avoid high crime areas. It will even alert you when you stray off its recommended routing and into an area it considers high crime.
There are regions around the world where kidnapping and auto crimes are much more prevalent than they are in the US. These online mapping apps can help keep anyone who’s in unfamiliar territory safer.
Having these features built in is convenient but nothing new. Police activity and crime statistics are available from many sources. Everyone can use that data today to stay informed before getting into a car. Apps and algorithms place this information in your smartphone and at your fingertips.
What happens when we delegate even more decisions to our autonomous cars?
Imagine a future where you call for a shared autonomous vehicle to pick you up. You're not sure what you'd like for dinner, and it's easy to ask the car's built-in virtual concierge for recommendations. The results will reflect your past preferences as well as settings from your service profile.
Your service profile includes a "safety" setting. You've told the service to avoid driving through any area not marked as "safe" on the city map. It's not clear what the difference between safe and unsafe actually is. Still, better safe than sorry?
There is no standard for things like this. The red/yellow/green colors for traffic on your car or app map are set by each company. What passes for bad traffic in your app might be 10 miles per hour below the normal average speed. The car manufacturer could decide that 20 miles per hour below average is the right "red" setting and 10 miles per hour should be "yellow".
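As a rough sketch of how arbitrary those cutoffs can be, here is a hypothetical comparison (the provider names and numbers are invented) in which the same slowdown gets a different color depending on whose thresholds are in play:

```python
# Hypothetical traffic-color thresholds: each provider picks its own cutoffs.
# Values are how many mph below the road's typical speed counts as congestion.
PROVIDER_THRESHOLDS = {
    "app": {"red": 10, "yellow": 5},         # the app calls 10 mph below average "red"
    "car_maker": {"red": 20, "yellow": 10},  # the car maker waits until 20 mph below
}

def traffic_color(slowdown_mph, thresholds):
    """Map a slowdown (mph below normal speed) to a color using one provider's cutoffs."""
    if slowdown_mph >= thresholds["red"]:
        return "red"
    if slowdown_mph >= thresholds["yellow"]:
        return "yellow"
    return "green"

# The same 12 mph slowdown is "red" in one product and only "yellow" in the other.
for provider, cutoffs in PROVIDER_THRESHOLDS.items():
    print(provider, traffic_color(12, cutoffs))
```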
Now apply that to the idea of safe areas or neighborhoods. What's the definition of safe? How much crime is OK in this city versus another? Cities across the US have differing levels of crime. What might be considered average or safe in Seattle might be thought of differently in other parts of the country. It's unlikely you'll know the exact data the algorithm used to determine the "safe" neighborhoods.
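Here is a minimal sketch, again with invented numbers, of how a "safe" label can depend entirely on which city's average the algorithm normalizes against:

```python
# Invented incident rates per 1,000 residents; the real data and cutoffs
# behind a "safe" label are usually invisible to the rider.
CITY_AVERAGE_RATE = {"city_a": 40.0, "city_b": 25.0}

def marked_safe(neighborhood_rate, city, cutoff_ratio=1.0):
    """Label a neighborhood safe if its rate is at or below the city average times cutoff_ratio."""
    return neighborhood_rate <= cutoff_ratio * CITY_AVERAGE_RATE[city]

# The identical neighborhood rate lands on opposite sides of the line
# depending only on which city it is compared against.
print(marked_safe(30.0, "city_a"))  # True: below city_a's average
print(marked_safe(30.0, "city_b"))  # False: above city_b's average
```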
Excessive personalization can create an isolating “filter bubble”. You know what you like and choose it. The algorithm learns your choices and keeps providing answers to your queries that fit your past preferences. You end up trapped in a bubble of your own biases. New information and ideas can’t break through. The system is tuned to only show results that fit your past choices.
Back to our trip to dinner: the car's concierge algorithm is trying to pick a restaurant for you. It knows you have certain safety preferences in your profile along with your past choices. You gave the service a bad rating on the last trip, so now the algorithm will try to optimize for an improved result. The programmers have designed it to reduce member churn and keep customers happy.
What restaurant options will it present to you? It's likely that whole neighborhoods and cuisines will be discarded before you're presented with the results. There's a highly rated new place in an up-and-coming part of town that still struggles with nighttime crime. Eliminated. The neighborhood surrounding an old favorite has started to decline. Doesn't make the cut. Those restaurants won't be top options, if they are presented at all.
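One plausible, highly simplified version of that filtering is sketched below; the restaurant names, ratings, and "safety scores" are all made up, and a real service would be far more complex. The point is that the safety floor removes options before any ranking happens:

```python
from dataclasses import dataclass

@dataclass
class Restaurant:
    name: str
    rating: float               # 0-5 stars, invented
    neighborhood_safety: float  # hypothetical 0-1 "safety score" for its neighborhood

def recommend(restaurants, safety_floor, top_n=3):
    """Drop anything below the profile's safety floor, then rank the rest by rating."""
    allowed = [r for r in restaurants if r.neighborhood_safety >= safety_floor]
    allowed.sort(key=lambda r: r.rating, reverse=True)
    return [r.name for r in allowed[:top_n]]

candidates = [
    Restaurant("new place uptown", 4.8, 0.55),  # great reviews, up-and-coming area
    Restaurant("old favorite", 4.5, 0.60),      # neighborhood starting to decline
    Restaurant("downtown standby", 4.1, 0.85),
    Restaurant("safe-bet chain", 3.9, 0.90),
]

# With a strict safety floor, the two best-rated options never even appear.
print(recommend(candidates, safety_floor=0.7))  # ['downtown standby', 'safe-bet chain']
```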
Autonomous cars are going to be a boon to cities worldwide. They will enhance transportation for everyone. On-demand mobility will be more accessible regardless of economics or ability. That "great power" comes with an even greater responsibility. It's not enough that the next generation of transportation is safe on the roads. It has to treat all roads and all destinations equitably.
Cities must be vigilant in how these systems are built and deployed. It would be all too easy to perpetuate economic isolation by simply routing people away from "danger." That could unintentionally hurt many vulnerable communities.
Autonomous cars and the intelligent algorithms that power them should not become another isolating bubble. Sensors protect autonomous cars from obstacles. Planning and designing for inclusion will protect our communities: in the car and in our neighborhoods.