
Ladies, Your iPhone is Auto Saving Semi-Nude Selfies in a New Folder

Apple's AI, built into its photo-gallery app 'Photos', has sparked alarm among women iPhone users after it was found categorising semi-nude photos under the tag 'brassiere'.

News18.com

Updated: November 1, 2017, 1:18 PM IST
(Image: AP)

An Apple Photos feature has sparked widespread alarm on social media, especially among women. Apple's machine learning, embedded in its photo-library app 'Photos', powers a feature that many women now find 'disturbing'. A recent tweet brought to light the 'categorisation' feature of the Apple Photos app: searching for "brassiere" in Photos surfaces sensitive photos, essentially the ones in which users are wearing bras, lingerie or bikinis. While women around the globe are alarmed by this, the behaviour comes down to a fairly simple aspect of how Apple's AI works.

The machine learning used in Apple Photos groups similar pictures under descriptive tags, so that a user can find a particular photo in the phone's library more easily. Working on the same principle, the Photos app categorises semi-nude pictures under the tag "Brassiere" and surfaces them when a search is made using that tag. The feature is nothing new and has been part of the app for a long time.
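To illustrate the general idea, here is a minimal Swift sketch using Apple's public Vision framework, which performs on-device image classification and returns labelled observations; filtering and storing those labels is essentially how a gallery could build searchable tags. This is an assumption-laden stand-in, not Apple's private Photos implementation, and the API shown (VNClassifyImageRequest) is a later public addition used here purely for illustration.

```swift
import Vision
import CoreGraphics

/// Classify an image on-device and return the labels that pass a confidence
/// threshold. This mirrors, in spirit, how a photo library could index each
/// picture under searchable categories (e.g. "beach", "dog", "brassiere").
func tags(for image: CGImage, minimumConfidence: Float = 0.3) throws -> [String] {
    let request = VNClassifyImageRequest()                 // built-in on-device classifier
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])                         // runs locally, no network call

    let observations = (request.results ?? []).compactMap { $0 as? VNClassificationObservation }
    return observations
        .filter { $0.confidence >= minimumConfidence }     // keep only reasonably sure labels
        .map { $0.identifier }
}
```

A gallery app would run something like this once per photo at import time and store the resulting labels in a local search index; it is that on-device index, not any upload of the picture itself, that makes a search for a term like "brassiere" return matching photos.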

This does raise a bigger, more serious question, though: is Apple collecting the data (in this case, pictures) to further develop its machine learning? If it were, that would be a serious privacy concern for iPhone users. That, however, is not the case. Time and again, Apple has stated that most of the learning done by its software, including facial recognition and object and scene detection, happens natively on the device, and that no data is sent back to Apple for this purpose, meaning users' photos remain private.

Apple has been using such image-recognition AI since the launch of iOS 10. A similar AI-based categorisation of pictures can also be seen in 'Google Photos', which is arguably more concerning, considering it automatically backs photos up to Google's cloud unless the user turns off auto-sync.

Watch: Apple iPhone 8 Plus Review | 3 Reasons To Buy & 2 To Skip It | Ft. The Unbiased Blog

 


Edited by: Sarthak Dogra