Since the replies to Mike Love’s post, “AMA lobbying for comments to NHTSA for Automated Driving Systems”, have mostly come from people with other axes to grind about the AMA, I am starting a new thread with some thoughts I have had about the problems we can expect from so-called “autonomous cars” and their interactions with our old motorcycles. The two articles below were written by me and published in the Canadian Vintage Motorcycle Group News in 2016. I hope they will encourage some discussion on this subject.
AFJ
The Tragedy of the Autonomous Car—that Wasn’t
(From the September, 2016 Canadian Vintage Motorcycle Group News)
In last November’s column I wrote the following words about the Ontario government’s 10-year pilot project to test autonomous (robot) cars and trucks on public roads:
“It would seem to be an automatic response. A small child chases a ball out onto the road in front of you as you ride along. Depending on the time and distance available, you brake to a halt in time or swerve to avoid a collision, even though you might end up dropping the bike and sliding down the road to probable injury.”
But in the near future (by 2020, some car makers say) there will be driverless (autonomous) “robot” cars and trucks on our roads. If Isaac Asimov’s first ethical Rule for Robots, “An autonomous machine may not injure a human being or, through inaction, allow a human to be harmed,” is programmed into a car, how, in the situation above, does the car choose between protecting the person or persons it is carrying and the possibility of injuring a smaller human being? What ethical choice should a “robot car” be programmed to make?
Certainly, in the more distant future, when all cars and trucks are robot-controlled so as to completely avoid colliding with others of the same type and size, there may be significant benefits to robot cars. But to my mind there will always be what bureaucracy calls “vulnerable road users” (pedestrians, bicyclists and motorcyclists) who will not be recognized by the robot cars. Some software people indicate that such ethical programming for every conceivable situation is currently impractical. Further, our old bikes do not have the electronics that would allow them to communicate their presence to these future “robot automobiles”.
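To make the point about old bikes concrete, here is a purely hypothetical sketch of the kind of vehicle-to-vehicle (V2V) “presence beacon” that future robot cars might listen for, loosely inspired by the Basic Safety Message idea behind DSRC-style systems. The field names and message shape are my own illustration, not any real standard:

```python
# Hypothetical V2V presence beacon -- illustrative only, not a real protocol.
import json
import time

def presence_beacon(vehicle_type, lat, lon, speed_mps, heading_deg):
    """Build a minimal broadcast message announcing a road user's presence."""
    return json.dumps({
        "type": vehicle_type,        # e.g. "motorcycle"
        "lat": lat,                  # position, decimal degrees
        "lon": lon,
        "speed_mps": speed_mps,      # speed in metres/second
        "heading_deg": heading_deg,  # compass heading
        "timestamp": time.time(),    # when the beacon was sent
    })

# A modern, connected motorcycle could broadcast something like this:
msg = presence_beacon("motorcycle", 43.65, -79.38, 25.0, 90.0)
print(msg)
```

A 1950s motorcycle has no transponder at all, so it simply never appears on this channel; a robot car’s only chance of “seeing” it is its own onboard sensors.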
On May 7th, on a Florida highway, a 2015 Tesla Model S, operating on Tesla’s Beta-stage “Autopilot” system, crashed at high speed into the trailer of a crossing tractor-trailer, passing under the trailer, which sheared off the top of the Tesla. The car continued some 700 feet along the highway, ran into and out of a fenced field for another 200 feet, then hit and sheared off a utility pole before finally coming to a halt. The driver of the Tesla was killed, presumably when the upper part of the body was torn off by the deck of the trailer. The $140,000 Tesla has computer, radar, camera and wireless sensing systems that are supposed to warn the driver of a possible collision and then, if the driver does not respond, take avoidance action and brake the car to a halt.
The accident happened on May 7th, but the US National Highway Traffic Safety Administration (NHTSA) did not begin an investigation for about 10 days, when Tesla notified them. Public notice only came when media reports began to appear on July 1.
What appears to have occurred was this:
The Tesla was doing 85 mph on unsigned Florida SR500 (a four-lane divided highway with a median and many unmarked intersecting or access roads, paved or gravel, controlled in some cases by Yield signs) when it came over a hill and down to where the tractor-trailer was making a legal crossing turn into a side road. The radar send/detect system mounted below the Tesla’s front bumper did not detect the trailer, since it could see under it. The camera system, mounted higher up at the top of the windshield, seems to have failed to detect it as well, which Tesla attributed to the lack of contrast between the white trailer and the Florida sky. However, Mobileye, the company that makes camera-based computer-vision systems for autonomous driving, stated: “We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.” So the camera system was not designed to see a “crossing” hazard at all.
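Mobileye’s point can be sketched in a few lines. This is NOT Tesla’s or Mobileye’s actual logic, just a minimal illustration of why a rear-end-only AEB design ignores a crossing vehicle: braking is triggered by time-to-collision with a target already classified as an in-path lead vehicle, so a laterally crossing trailer that never gets that classification produces no intervention at all:

```python
# Illustrative rear-end-only AEB decision -- not any real system's code.
def aeb_should_brake(target, ttc_threshold_s=2.0):
    """Decide whether a rear-end-style AEB would brake for a target.

    `target` is a dict with:
      in_path:     True if classified as a lead vehicle in the host lane
                   (laterally crossing traffic typically is not)
      range_m:     distance to the target, metres
      closing_mps: closing speed, metres/second (positive = approaching)
    """
    if not target["in_path"]:
        return False  # crossing vehicle never classified in-path -> ignored
    if target["closing_mps"] <= 0:
        return False  # not closing on the target
    time_to_collision = target["range_m"] / target["closing_mps"]
    return time_to_collision < ttc_threshold_s

# A slow lead car ahead in the same lane: the system brakes.
lead_car = {"in_path": True, "range_m": 30.0, "closing_mps": 20.0}
# A trailer crossing laterally: never tagged in-path, so no braking.
crossing_trailer = {"in_path": False, "range_m": 30.0, "closing_mps": 20.0}

print(aeb_should_brake(lead_car))          # True  (TTC = 1.5 s < 2.0 s)
print(aeb_should_brake(crossing_trailer))  # False (not in-path)
```

The two cases are geometrically identical in range and closing speed; only the in-path classification differs, which is exactly the gap Mobileye describes.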
But what of the driver, who, according to Tesla’s instructions, should always have had his hands on the steering wheel and been paying attention to the road? Tesla also tells purchasers that “Autopilot” may not detect “stationary vehicles or obstacles, bicycles, or pedestrians.” The NHTSA indicates that the Tesla S ranks at only level 2 or 3 of the 5 levels of technology required for a fully “self-driving car”. Florida State police found a portable DVD player in the wrecked Tesla just after the crash. The truck driver involved in the accident went over to the Tesla and has stated that he could hear sound coming from the wreck: it was, he reported, a Harry Potter movie playing. It would seem that the Tesla driver may have been distracted, at the very least, from watching the road.
Incidentally, about a month before this fatal accident, the driver had posted a YouTube video showing how his Tesla had “reacted” to save him from a highway accident—with a white truck. Apparently he was listening to an audio book at the time of that encounter.
It puts me in mind of the long-ago “Pogo” comic strip where the slogan (then referring to pollution problems) originated. That slogan? “We have met the Enemy—and He is Us!”
(From the December, 2016 Canadian Vintage Motorcycle Group News)
You Just Knew This Was Going to Happen
A traffic accident in Norway, involving a Tesla Model S with Autopilot engaged, two other vehicles, and a motorcycle, has prompted questions as to whether testing of Tesla’s Autopilot system sufficiently took into account two-wheeled vehicles. This follows recent official tests in Germany that characterized the Autopilot feature of the Tesla S as a “traffic hazard.”
Meanwhile, a small scandal was unfolding in Germany after the magazine Der Spiegel published a previously unseen report from the Federal Highway Research Institute (BASt) on the Tesla Model S Autopilot. The German tests had started as soon as the first fatal accident involving the system was reported in May in the US. With an estimated 3,000 Tesla Model S cars sold in Germany, the authorities were understandably obliged to look deeper into the matter.
After many thousands of kilometres of testing, BASt reportedly concluded that Autopilot represents a significant traffic hazard. Judging that it was not designed for complex urban traffic situations, the report declared that the car’s sensors are too short-sighted (i.e. short-range) to cope with the reality of German motorways. The federal agency in charge of motor transport evaluated the research institute’s results and responded swiftly, proposing that the government provisionally suspend Tesla’s type approval. Although this didn’t happen, German Model S owners are reported to have received official German federal correspondence urging them to remain vigilant while the Autopilot system is engaged.
The Federation of European Motorcyclists’ Associations (FEMA), in co-ordination with the Koninklijke Nederlandse Motorrijders Vereniging (KNMV) and the Motorrijders Actie Groep Nederland (MAG NL) motorcycle clubs, has issued a formal letter to the Dutch vehicle authority RDW inquiring whether testing procedures of autonomous vehicles take into account two-wheelers.
Similar action had been undertaken earlier by the Norwegian riders’ organization NMCU, directing questions towards the transport minister, Ketil Solvik-Olsen, and Tesla co-founder and CEO, Elon Musk. This was sparked by an accident on the E18 road to Drammen, Norway, where a Tesla Model S with Autopilot engaged rear-ended and seriously injured a female motorcyclist on July 27. The rear-ending of a motorcycle raises questions about Tesla’s vehicle type approval in Europe.
American research conducted by John F. Lenkeit of Dynamic Research concludes that forward collision warning systems for automobiles fail dramatically to detect motorcycles, giving inadequate results in 41 percent of tested cases, against only 3.6 percent for passenger cars.
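Put another way, taking the two figures quoted above at face value, the arithmetic works out like this:

```python
# Quick arithmetic on the Dynamic Research figures quoted above:
# 41% inadequate results for motorcycles vs 3.6% for passenger cars.
motorcycle_failure = 0.41
car_failure = 0.036

ratio = motorcycle_failure / car_failure
print(round(ratio, 1))  # 11.4
```

On those numbers, a forward collision warning system is roughly eleven times more likely to fail on a motorcycle than on a passenger car.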