Self-driving Cars May Not Be as Safe as Conventional Cars [STUDY]

Self-driving cars may not be safer than conventional cars, according to a report by the Centre for Data Ethics and Innovation (CDEI).

(Photo: S. Hermann / F. Richter / Pixabay)

Self-Driving Cars on Public Roads

According to the BBC, self-driving cars are not currently permitted on UK roads, but the government has said the first ones could appear by 2025. Some cars, coaches, and trucks with self-driving capabilities could even be traveling on highways in the coming year, according to the Department for Transport.

Under government plans, self-driving vehicles would need to be at least as safe as a qualified human driver. This standard would help define the requirements for allowing self-driving vehicles on public roads, and manufacturers could face penalties if it is not met.

Automated Vehicle Safety

The report says that even if driverless cars are generally safer, the public may not tolerate collisions involving them.

According to the CDEI, the government's expert body on trustworthy innovation in data and AI, crashes blamed on faceless technology companies or lax regulation may not be well received by the public, even if automated vehicles remain, on average, safer than human drivers.

It also cautions that if the public expects self-driving cars to be as safe as trains or planes, they would need to be roughly 100 times safer on average than manually driven cars.

Professor Jack Stilgoe of University College London, who advised the CDEI, said: "What we wanted to do was say there's not an easy answer to this question." He added that how safe self-driving cars should be ought to be decided through a democratic process.

The CDEI says it is crucial to consider how risk is distributed across different groups. Even if safety improves overall, some groups may see significant gains while others see none, or even face new dangers.

ALSO READ: A Pair of Googly Eyes on the Front of a Self-Driving Car Helps Pedestrians Decide When Crossing the Road

Biased Judgment

The report cautions that additional hazards will need to be closely examined as the technology is rolled out. One is the potential for bias in the algorithms that control the vehicles.

It warns that bias could arise if some groups, such as wheelchair users, are underrepresented in the data used to train the software that operates the vehicles.

According to the report, self-driving vehicles should be clearly marked, because people have the right to know what kind of agents they are sharing the road with. A survey of UK public attitudes found that 86% of the general public agreed.

Testing self-driving vehicles on public roads, Professor Stilgoe said, raises serious ethical issues, because other road users become participants in those experiments whether they want to or not.

He said the ethical principle of informed consent has significant implications here. Accommodating self-driving vehicles may also require changes to road infrastructure and traffic rules, which, according to Professor Stilgoe, should be subject to open discussion and deliberation.

The risk, he said, is sleepwalking into a world in which these changes are made to suit one mode of transport while the benefits are not shared very widely.

Check out more news and information on Technology in Science Times.
