After Google, Microsoft Showcases AI That Can Make Phone Calls to Humans
Microsoft's Xiaoice interacts in text conversations but now the company has started allowing the chat bot to call people on their phones.
Google Duplex, which lets AI mimic a human voice to make appointments and book tables over the phone, has both mesmerised people with its capabilities and drawn flak on ethical grounds. Now Microsoft has showcased a similar technology it has been testing in China.
At an AI event in London on Tuesday, Microsoft CEO Satya Nadella revealed that the company's Xiaoice social chat bot has 500 million "friends" and more than 16 channels through which Chinese users can interact with it, including WeChat and other popular messaging services.
"Microsoft has turned Xiaoice, which is Chinese for 'little Bing', into a friendly bot that has convinced some of its users that the bot is a friend or a human being. Xiaoice has her own TV show, it writes poetry and it does many interesting things," The Verge quoted Nadella as saying.
Xiaoice interacts in text conversations, but the company has now started allowing the chat bot to call people on their phones. The bot does not work exactly like Google Duplex, which uses the Assistant to make calls on a user's behalf; instead, Xiaoice holds the phone conversation with the user directly.
"One of the things we started doing earlier this year is having full duplex conversations. So now Xiaoice can be conversing with you in WeChat and stop and call you. Then you can just talk to it using voice," Nadella was quoted as saying.
Microsoft has already learned the hard way that humans will be humans. Two years ago, the company launched an artificial intelligence (AI)-powered bot on Twitter, named Tay, for playful chat with people, only to silence it within 24 hours as users started feeding the bot racist and offensive comments.
Launched as an experiment in "conversational understanding" and a way to engage people through "casual and playful conversation", Tay was soon bombarded with racist remarks, and the bot began repeating them back to users with commentary of its own. Some of Tay's tweets referred to Hitler, denied the Holocaust, and voiced support for Donald Trump's immigration plans. A Microsoft spokesperson later confirmed to TechCrunch that the company was taking Tay off Twitter because people were posting abusive comments to the bot.