Facebook in 2018: How Much More Trust Can The Social Network Really Afford to Lose?

(Image for representation: Reuters)

Facebook’s terrible 2018 is coming to a close, with hopes of a better 2019 in which users don’t log off.

Vishal Mathur
  • News18.com
  • Last Updated: December 31, 2018, 9:09 AM IST

“Bringing People Closer Together,” they said. Back in January. If Facebook could turn back the clock, it would probably want to go back to a simpler time, before March 2018. That was when everyone loved Facebook. That was when it was the default communication platform for your friends. That was a time before the damaging Cambridge Analytica revelation. The biggest scandal in Facebook’s history. One of the biggest in the tech space, ever. A scandal that went beyond an app collecting your date of birth, email address or location. It was about data that was collected without user consent and used for political research, and Facebook knew about it yet did nothing. This was the data of as many as 87 million users, and it took Facebook two years to come clean about what had happened. Things have been going downhill ever since.

In the months after the world came to know about the Cambridge Analytica scandal, Facebook co-founder and CEO Mark Zuckerberg had to testify before the US Congress: the Senate Judiciary Committee, the Senate Committee on Commerce, Science and Transportation, and the House Energy and Commerce Committee. That chapter still continues, and with US federal regulators becoming even more interested in reining in tech companies, expect more of this as we head into 2019 and then the big election year in the US. Make no mistake, alleged Russian meddling in the previous US elections is still fresh in the minds of the stakeholders, and the heat will surely be turned up soon enough.

Through the year, controversy followed Facebook.

The social media network faced even more questions after it was reported that the Facebook app installed on Android phones had been collecting users’ call and SMS text message history. The data collected included the time of voice calls, names of contacts and numbers of recipients, along with call duration. There was also text message metadata, including recipient details and timestamps, even when you did not use the Facebook or Facebook Messenger app for any of these tasks. Facebook, in its statement at the time, denied that the data was collected covertly, and remained adamant that users gave the app permission to access contact details. “Call and text history logging is part of an opt-in feature for people using Messenger or Facebook Lite on Android. This helps you find and stay connected with the people you care about, and provide you with a better experience across Facebook. People have to expressly agree to use this feature,” said Facebook, in an official statement. The thing is, the End User Licensing Agreement (EULA) that Android phone owners agreed to when they installed the Facebook app and used it for the first time does indicate that Facebook had detailed the logging of the data in question as a requirement. But who reads EULA documentation pop-ups anyway? This was perhaps more a case of unclear representation of facts than of malice.

In July, Facebook came under fire for initially refusing to ban right-wing conspiracy theorists and propagandists like Alex Jones, despite a clear history of promoting fake news through his own and InfoWars pages on the social network. Eventually, it did shut down four of Jones’ pages, citing repeated violations of its hate-speech policies. Facebook clarified at the time that more and more users were reporting the content being posted by Jones, and its review identified "using dehumanizing language to describe people who are transgender, Muslims and immigrants," among other transgressions. Around the same time, Apple and Spotify blocked Jones on their platforms, and Google took action against Jones’ YouTube presence as well.

And it was not just Facebook. Even the apps it owned stumbled.

The United Nations said that Facebook played a role in inciting violence against the minority Rohingya Muslim population in Myanmar, and pointed to inflammatory posts by government and military officials as proof that Facebook did nothing to take down such communication. The very popular Facebook-owned instant messaging app WhatsApp was accused of enabling incitement in Myanmar by failing to identify and check fake messages, and was also blamed for the rise in lynching cases in India. Facebook’s response was to limit the number of users a message can be forwarded to on the app, now just 20, compared with 250 earlier. WhatsApp has since been constantly in the line of fire of the Indian Government for a perceived lack of concrete action in tackling the spread of fake news on the platform. This could gain even more relevance heading into the 2019 general elections.

If you thought Facebook had faced the worst of it, yet another surprise was in store.

In September, the social network revealed that it had suffered a security breach which impacted as many as 90 million user accounts. Facebook said that attackers exploited a vulnerability in its code linked to “View As”, a feature that lets Facebook users see what their own profile looks like to another user. This vulnerability allowed hackers to get access to login tokens, which could then be used to take over or control user accounts, without users realizing anything had happened.

In November, the Facebook accounts of as many as 120 million users were hacked, giving attackers access to timelines, shares and messages. Facebook confirmed that its own systems were not breached as part of the attack; instead, browser extensions loaded with malware were used.

The latest data privacy debate sees Facebook explaining how it shared user data with Netflix and Spotify as part of a now-defunct experiment. Facebook’s explanation focuses on the “read/write” permissions. In order for a user to write a message to a Facebook friend from within the Spotify app, for instance, Facebook needed to give Spotify “write access” to the user’s data. For the user to then be able to read messages back, Facebook needed Spotify to have “read access”. The third permission, called “delete access”, meant that if a user deleted a particular message sent on Facebook from within Spotify, that message would be deleted from the Facebook platform itself too.

While there is no evidence that Netflix or Spotify peeked into messages or gathered data from Facebook users, Facebook should perhaps have been more upfront about all these partnerships, whether long-term or experimental. Disclosures over time, particularly in the window after the Cambridge Analytica scandal, would have greatly improved Facebook’s standing with its user base. It missed that opportunity.

As things stand, 2019 will be a big year for Facebook. A lot will hinge on how it emerges from this mess and tries to rebuild its image, how it tries to earn back user confidence (if at all), and how governments around the world regulate Facebook, or at least attempt to regulate it and make it a bit more responsible. The other challenge for Facebook will be finding new users. It has pretty much reached every smartphone-toting human being in North America and parts of Europe, and there are very few growth markets left. But in a way, its biggest challenge will be the existing user base, which is either signing off or simply using Facebook much less than it used to.

Also Read | Data Mishandling, Complex News Feed & Millennials Turning Away: Where Does Facebook go From Here?

Also Read | Teens And Europe Are Giving up on Facebook; But Instagram is The Surprise Gainer
