Hackers Forced a Tesla to Enter the Wrong Lane, Elon Musk Tweets 'It's Fixed'

Hackers have demonstrated how they could trick a Tesla Model S into entering the wrong lane using a method called an "adversarial attack," a way of manipulating a machine learning (ML) model.

IANS

Updated: April 8, 2019, 3:01 PM IST
Tesla Model S. (Image: Tesla)
Hackers have demonstrated how they could trick a Tesla Model S into entering the wrong lane using a method called an "adversarial attack," a way of manipulating a machine learning (ML) model. Tesla's Autopilot recognizes lanes and assists steering by identifying road markings.

The researchers, from the Keen Security Lab of Chinese tech giant Tencent, showed that by placing small interference stickers on the road they could feed the Autopilot system misleading information, forcing it to make an abnormal judgment and steer the vehicle into the wrong lane.
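The report does not detail the exact perturbation the researchers used. As a rough illustration of the general idea behind adversarial attacks, the sketch below implements the Fast Gradient Sign Method (FGSM), a textbook technique that nudges each input pixel in the direction that most increases a model's loss. The model, inputs, and step size here are hypothetical stand-ins, and a physical sticker attack works differently from this purely digital example.

    # Illustrative only: FGSM, a textbook adversarial attack on an image model.
    # All names (model, x, y, epsilon) are hypothetical stand-ins; this is not
    # the method Tencent's researchers used against Autopilot.
    import torch

    def fgsm_perturb(model, x, y, loss_fn, epsilon=0.01):
        """Return a copy of input batch x nudged to raise the model's loss on labels y."""
        x = x.clone().detach().requires_grad_(True)
        loss = loss_fn(model(x), y)   # how wrong is the model right now?
        loss.backward()               # gradient of the loss w.r.t. each input pixel
        # Step every pixel slightly in the direction that increases the loss,
        # producing an input that looks nearly identical but misleads the model.
        return (x + epsilon * x.grad.sign()).detach()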

"In this demonstration, the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when the autopilot is in use," a Tesla spokesperson was quoted as saying in a Keen Security Lab blog.

"This is not a real-world concern given that a driver can easily override autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times," the spokesperson said.

According to a report in The Download - MIT Technology Review this month, adversarial attacks could become more common as machine learning is used more widely, especially in areas like network security.
Edited by: Abhinav Jakhar