The family of a girl who was killed when the car she was in was rear-ended by a driver using his iPhone’s FaceTime app has sued not only the driver, but also Apple. The family says iPhones should disable video and other distracting apps when they are being used by a driver. Should it be a company’s responsibility to make social media and other distracting apps unworkable when they are used in a moving car?
Mangu-Ward believes the responsibility rests with each of us as individuals, and that it is impossible to outsource our decision-making:
“You almost certainly already rely on technology to help you be a moral, responsible human being. From old-fashioned tech like alarm clocks and calendars to newfangled diet trackers or mindfulness apps, our devices nudge us to show up to work on time, eat healthy, and do the right thing. But it’s nearly impossible to create a technological angel on your right shoulder without also building in a workaround that is vulnerable to the devil on your left. Put another way: Any alarm clock user who denies that he has heard the siren song of the snooze button is lying.
Technology can help us make good decisions, but outsourcing good decision-making to technology, tech companies or the government isn’t just a bad idea — it’s impossible.
People already know that distracted driving is dangerous. They tell pollsters so all the time. Because of this clear customer demand, smartphone makers offer safety-conscious drivers a variety of ways to minimize distraction, from hands-free headsets and voice commands to mute buttons and airplane mode.
But automatically disabling certain apps in a fast-moving vehicle — as the grieving family of 5-year-old distracted driving victim Moriah Modisette is suing to force Apple to do — won’t work. One of the great glories of the smartphone era is the ability to work, chat and read while on mass transit or riding shotgun, so there’s no way to build an accelerometer-based shut-down unless you also add an opt-out. And if there’s an opt-out, then fallible, foolish humans will always use it to thwart the original intent.
What’s more, legally mandated technological fixes tend to be even less effective than their market-driven counterparts: Think of the “Are You 18?” queries that pop up on sites peddling liquor, cigarettes or other adult products. (Has anyone in the history of the internet ever clicked “No”?) Judges and regulators consistently overvalue their ability to prevent catastrophe and undervalue the costs they impose on innocent users. The most wide-reaching effect of any kind of mandatory distracted driving safety provision will simply be to force every user of every smartphone, on every bus, train and plane to click “I am not the driver” every day unto eternity, without actually dissuading the kind of jerks who are determined to FaceTime while driving down the interstate.”
Mars argues the opposing view, pointing to technology that could be developed to detect whether or not the user is behind the wheel:
“While the untimely death of an innocent 5-year-old is tragic, it’s clear that Apple shouldn’t be legally responsible for the irresponsible driver who killed her. Almost any distraction can lead to an accident. If a driver slammed his car into someone because he took his hands off the steering wheel to unwrap a taco, surely we wouldn’t hold Taco Bell responsible, or outlaw the eating of tacos while driving.
That being said, companies do have a social responsibility to be mindful of hazards that arise from misuse of their products and take sensible precautions. In the case of Apple, it would be absolutely reasonable for it to use a non-intrusive mechanism to detect with near-perfect accuracy when a user is driving, to prevent hazardous distractions.
The challenge that arises here is whether the technology can achieve near-perfect accuracy in driver detection. From a technical standpoint, it’s straightforward to sense the rate at which a phone is moving. For example, Apple provides a framework called CoreMotion that lets programmers glean insights about the phone’s movement and even has an “automotive” property to predict whether the user is in a vehicle. However, detecting whether the user or owner of the phone is the driver or a passenger is trickier with just this approach. In the case of FaceTime and other apps involving a camera, there is an opportunity to use the camera, along with deep-learning algorithms, to literally look at the user and environment and discern whether the user in view is driving. There has been a wealth of research on detecting driver fatigue and other attributes, some of which has been discussed at the IEEE Intelligent Vehicles Symposium. I would expect such a solution to be readily adopted by users if the accuracy is high enough, as mispredictions can create frustration and discourage use.
The state of deep learning technology is at a place where companies like Apple should explore its use for safety purposes. While a staunch libertarian would be opposed to the infringement on freedom, I simply can’t think of a situation where someone should be FaceTiming and driving, ever.”
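The vehicle-detection step Mars describes is real: Apple’s CoreMotion framework exposes a `CMMotionActivityManager` whose activity reports include an `automotive` flag. A minimal sketch of that first step might look like the following (iOS-only code; the function name and print messages are illustrative, not from any shipping product), which also makes Mars’s caveat concrete: the API can say “in a vehicle,” but not “is the driver.”

```swift
import CoreMotion

// Hypothetical sketch: subscribe to motion-activity updates and check the
// `automotive` property, as Mars describes. This detects that the phone is
// in a moving vehicle -- it cannot, by itself, distinguish a driver from
// a passenger.
let activityManager = CMMotionActivityManager()

func monitorVehicleState() {
    // Motion activity is unavailable on some devices and simulators.
    guard CMMotionActivityManager.isActivityAvailable() else { return }

    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.automotive && activity.confidence == .high {
            // App could now escalate: e.g., prompt the user, or (with a
            // camera-based check) try to determine who is driving.
            print("Phone appears to be in a moving vehicle")
        }
    }
}
```

Note that the `confidence` check matters for the adoption point Mars raises: acting only on high-confidence predictions trades some coverage for fewer false positives, and false positives are exactly the mispredictions that frustrate passengers and discourage use.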