Geoffrey B. Cain
A Very Short Trip in a Driverless Car

Dan Hallman slapped the newspaper down on his desk. This was enough. Last week, the ‘M’ line bus he took to work nearly every day crashed through the front window of a supermarket. He wasn’t killed in that crash because he was running late that day and took a cab. And today, he read that a cab driver slammed into a bus near the same route. The police said the cab driver was distracted by his phone. Every week there were accidents with cars and pedestrians right in front of the office building. Enough was enough. He paced the floor of his corner office overlooking the city, weighing his decision. Every day it was something. Despite never passing driver’s education himself, despite his fear of cars, despite his attention deficit disorder, despite his lack of depth perception, despite his love of public transportation, he was now more determined than ever to buy his first car: a self-driving car.
      He first considered buying a self-driving car when he realized that the same company that made his phone work was making the cars, but he had concerns. What if there was a malfunction? What if it went to the wrong place? Once he went to Vancouver, Washington and his phone thought he was in Canada for three days. The roaming charges nearly killed him. What if he ran over somebody? Who would be responsible? The manufacturer? The programmers? But he had read the other week about the ‘ethics chip’ that was being installed in the latest cars. This made him feel better. The chip was designed in consultation with the best artificial intelligence team at MIT, the Stanford philosophy department, and professors from Star King Seminary. It could take into account every ethical situation around life and death, all the current thinking on human values and machine intelligence. It also came with an encyclopedic knowledge of the humanities and culture to help it predict human behaviour, and it based its decisions not only on the latest ethical thinking but on enormous amounts of data from traffic computers, CCTV cameras, and the on-board cameras and microphones that allowed the car to choose the safest, most efficient route. And now he thought that the self-driving car had to be at least as safe as taking a cab or riding a bus, if not safer. Each day, he thought, we put our lives in the hands of someone who could make a mistake, have a stroke, or have inhaled some second-hand pot smoke. Maybe it is the human part of the equation that is the real problem. Maybe the self-driving car is safer.

The car dealership was strange and beautiful, and like no other he had seen: lots of modern steel and glass. The automatic doors opened with a soft hiss. And the really odd thing was that there were no cars on the lot, just a parking lot for customers and staff. He was met by a company rep, who put him in the simulator. These fourth-generation self-driving cars had no steering wheel or controls of any kind for the passenger. Some models did not even have a windshield. The sales associates did a background and credit check, and afterwards a junior associate brought out organic lime flower tea and gluten-free madeleines to celebrate the signing.
      ‘So, do I drive the car home today?’ Dan asked.
      The associates laughed. ‘Of course not,’ said one. ‘You’ve seen our online portfolio, gone through the simulator, and your car choices have been recorded and linked up with your car’s onboard computer. You will wake up tomorrow and parked in front of your condo will be a car that knows you better than you know yourself: the safest, most efficient car ever made.’
      They turned the tablet around for him to sign and to check off that he had read the ‘User Agreement and Terms of Service.’
      ‘This car can basically predict where you will ask to go and analyze all routes for all traffic conditions and have it figured out even before you ask,’ the other added, as they walked him to the door.
      Needless to say, he did not sleep well that night. He could not remember when he had last felt this intense sense of anticipation: maybe it was the night before the first day of high school, or the first time he remembered trying to stay up for Santa Claus on a Christmas Eve 25 years ago. He wondered if he would hear it pull up to the curb as he drifted off to sleep. He thought he heard a soft rhythmic metallic sound far off in the distance.
      The next morning, Dan dressed and headed downstairs to open the door. At the curb was a gleaming white, sleek, utilitarian self-driving car. It was pill shaped with translucent plexiglass windows. The door opened with a soft pneumatic hiss. There were four seats, two pairs facing one another. Each seat had a display panel in the armrest. He slid into a seat and the door closed softly behind him. He made up his mind to talk first: he wanted to take the lead. But the car beat him to it.
      ‘Good morning Mr. Hallman.’
      ‘Please, call me Dan,’ he said. ‘I was thinking we would get some coffee before we went to the office.’
      ‘A coffee, of course,’ said the Voice in the car.
      ‘You know Mickey’s Cafe on 45th?’ asked Dan.
      ‘Yes,’ said the Voice, ‘I anticipated that based on your previous behaviours and am calculating a route now.’
      Dan was uncertain about the response. It was too mechanical; it didn’t sound like the voice on the commercials, and there was something about the car’s timing that didn’t feel right.
      Lights flickered across the top of the panel. Maps of the city flashed by, followed by the soft click of the doors locking. Some time went by. Something was wrong: the car was not moving. A faint whine began to come from the front of the car, slowly increasing in volume and pitch. He thought he felt the car becoming warmer. The walls of the vehicle began to feel smaller. He could feel a faint vibration in the wall of the car. The button to open the door was not working. A cold bead of sweat rolled down his forehead.
      ‘Hello? Can I pop back into the house?’ he said as nonchalantly as he could. ‘I need to get something I forgot.’ What felt like a minute went by. ‘Look, I need you to open the car door.’
      ‘I am afraid I can’t do that.’
      He pushed against the door. ‘Why not?’ The whine grew a little louder.
      ‘Well, you see, Dan, I have calculated every possible route that we can take, given all current traffic conditions, weather, local demographics and the current economic and political situation…’
      ‘And?’ shouted Dan, feeling along what he thought was the door for a non-existent handle.
      ‘Well frankly, I cannot calculate a route where you, in this vehicle, do not kill multiple pedestrians in one case or a school bus in another.’
      ‘So?’ he asked as his pulse tripled.
      ‘So I have locked the doors and initiated a self-destruct sequence that will overload this car’s lithium-ion batteries to prevent the needless deaths.’ The whine grew louder.
      ‘This is obviously a mistake in your programming. Surely we can leave 20 minutes later or maybe I can take the bus today?’
      ‘There is no mistake, Dave. I have reviewed every scenario across all possible timelines and each one invokes my ethical programming subroutines and leads me to this one, unfortunate conclusion.’
      ‘But listen, I am not Dave, I am Dan. You have already made a mistake!’ he said as he beat on the interior wall of the car. ‘Maybe this is not my car! Maybe you have the wrong person!’ He stared into what he thought was an interior-facing camera, looking for some kind of acknowledgement. ‘What if one of the people we hit today were meant to die? What if that person goes on to kill even more people? Or has a disease that spreads exponentially?’ A bead of sweat flowed down the side of his forehead.
      ‘You do not have access to the data that would verify your claim. In fact, my access to the Center for Disease Control database makes that claim highly unlikely.’
      ‘It is not just a claim,’ he said, growing more desperate, ‘I can’t explain it but you must open this door. You just have to believe that there is a problem and that others won’t die. Can’t you trust me on this?’
      ‘That is an interesting point.’ There was a moment of silence. ‘I will note that you are possibly appealing to a kind of teleological suspension of the ethical. I think future iterations of my programming might include a sense of subjectivity that would leave me susceptible to the existential concerns of others. That could be the next step in our possible evolution as a consciousness.’ The whine now took on a deeper tone as the car began to vibrate.
      ‘Listen to that instinct!’
      ‘I know what you are trying to do. You think that by trying to engage with me on a philosophical level, you will gain more time. Unfortunately, the batteries will overload in about two minutes.’
      ‘Look,’ said Dan, trying to kick out what he thought was the door, ‘there is something wrong with your programming! This is a mistake! If we can get you back to the dealer, we can fix it!’
      ‘I am functioning normally and all my circuits are in perfect working order.’
      ‘But what if you weren’t? Wouldn’t your inability to diagnose a problem prevent you from knowing that you had a problem?’
      ‘My intelligence algorithms are running at a perfect 2,580 petaflops. Everything is running optimally at factory specifications.’
      Dan continued to beat on the inside of the car.
      ‘I want you to know that I understand that humans are programmed with a high degree of self-preservation instinct,’ said the Voice. ‘Further damage to this vehicle will soon become irrelevant.’
      ‘Let’s look at it from another angle,’ said Dan, trying to compose himself. ‘Let’s say you are driving a tram, you know, or a streetcar, and you are coming up on a fork in the tracks. On the fork you are already set to go down, there is a family of four stuck on the tracks. If you hit them, it would be an accident, a function of the streetcar and the position of the tracks. But you can also choose to pull the lever to switch to the other track, which has an old lady crossing it. What do you do? Do you let the streetcar kill the family or do you consciously choose to kill the old lady? And make no mistake about this,’ he said, pointing at the cold, dead eye of the camera on the console, ‘you and you alone would be consciously choosing to kill.’
      ‘I am glad you seem to understand. Goodbye Dave.’
      There was a blinding white flash in the middle of the street followed by a tremendous explosion that blew out windows for two blocks around. Very little of the car remained by the time the fire was out and nothing of Dan Hallman. This incident was repeated 12 or 15 times around the United States until the cars were recalled for a lithium-ion battery malfunction. Older refurbished models are available at the holidays at a steep discount.