
Hackers Forced a Tesla to Enter the Wrong Lane, Elon Musk Tweets 'It's Fixed'

Hackers have demonstrated how they could trick a Tesla Model S into entering the wrong lane by using a method called an "adversarial attack," a way of manipulating a machine learning (ML) model.

IANS

Updated: April 8, 2019, 3:01 PM IST
Tesla Model S. (Image: Tesla)

Hackers have demonstrated how they could trick a Tesla Model S into entering the wrong lane by using a method called an "adversarial attack," a way of manipulating a machine learning (ML) model. Tesla's Autopilot recognizes lanes and assists steering by identifying road traffic markings.

Researchers from the Keen Security Lab of Chinese tech giant Tencent showed that by placing interference stickers on the road, they could feed the Autopilot system misleading information, forcing it to make an abnormal judgment and steer the vehicle into the wrong lane.
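For readers curious what "manipulating a machine learning model" means in practice, the sketch below shows the fast gradient sign method (FGSM), one common textbook way to craft an adversarial example. Everything in it is an illustrative assumption: the tiny linear classifier, the random image, and the epsilon value are toy placeholders, not Tesla's Autopilot or the researchers' actual sticker attack, which perturbed the physical road surface rather than pixels.

```python
# Minimal FGSM sketch: nudge input pixels in the direction that most
# increases the model's loss, producing a near-invisible perturbation
# that can flip the prediction. Toy model and data, for illustration only.
import torch
import torch.nn as nn

# Hypothetical stand-in for a lane-recognition classifier (two classes).
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 2))
model.eval()

image = torch.rand(1, 3, 32, 32, requires_grad=True)  # toy "road" image
label = torch.tensor([0])                             # true lane class

# Compute the loss gradient with respect to the input pixels.
loss = nn.functional.cross_entropy(model(image), label)
loss.backward()

# Take a small step on every pixel in the sign of that gradient.
epsilon = 0.03
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

# The perturbed image looks almost identical to a human, yet the model's
# output can change, the same principle behind the stickers on the road.
print(model(adversarial).argmax(dim=1))
```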

"In this demonstration, the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when the autopilot is in use," a Tesla spokesperson was quoted as saying in a Keen Security Lab blog.

"This is not a real-world concern given that a driver can easily override autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times," the spokesperson said.

According to a report this month in The Download, MIT Technology Review's newsletter, adversarial attacks could become more common as machine learning is used more widely, especially in areas like network security.


Edited by: Abhinav Jakhar