When fiction turns to reality. Black Mirror, the popular Netflix show, started out in life as an outlet that cast a light on how our approach to technology and social media can open avenues of deeper surveillance, tracking, social ranking and potential misuse by the people in control. Potentially a scarier society than the one we live in today. All of that is coming true, slowly and steadily, unfortunately. The 2016 episode, Nosedive, follows Lacie (Bryce Dallas Howard) in a world where people can rate each other on a scale of one to five stars, based on every interaction they have. Needless to say, there is a craving for the highest possible ratings. Your socioeconomic standing depends on how many stars you currently have. And if you fall on the wrong side of the ratings, life becomes a pain.
That was TV. That was a Netflix show. Time to get back to studies, children? Not so fast, because the direction we are headed in isn't too different from the world that Black Mirror writer Charlie Brooker depicted in the show.
Facebook, as it turns out, is now assigning a reputation score to its users. The motive, at least the one we are being led to believe, is to identify and weed out malicious user accounts. A Facebook user's reputation score isn't designed to be the final verdict on a person's credibility. Instead, it is just one more metric among many that Facebook uses to build a virtual character sketch of a user and identify the risky ones. One of the reasons for this sort of metric is the growing intolerance in general, which has directly led to misuse of the option to report content as incorrect, dangerous, fake or malicious. As it turns out, it is "not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they're intentionally trying to target a particular publisher," said Facebook's Tessa Lyons, in an interview with The Washington Post.
The social network is also monitoring which users tend to actively report content as problematic, even when that content comes from publishers generally deemed trustworthy by other users. False reporting, when done as a coordinated effort, can be used to game the systems tech companies put in place to monitor and weed out genuinely unacceptable content. Take this example: if you report an article you saw on Facebook as false or incorrect, and Facebook's fact-checking process confirms that, it is a good mark on your record. In the future, your feedback about content on the social network will carry more weight than that of someone who has been known to report content which doesn't actually violate the platform's content-sharing policies.
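The mechanic described above can be sketched as a simple weighting function. To be clear, this is an invented illustration, not Facebook's actual system; the function name, formula and smoothing terms are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of a reporter-credibility weight. The formula is an
# assumption for illustration; Facebook has not published its actual method.

def reporter_weight(confirmed_reports: int, total_reports: int) -> float:
    """Weight a user's future reports by their past accuracy.

    A user whose reports are usually confirmed by fact-checkers gets a
    weight near 1.0; a user who mostly files false reports trends toward 0.
    The +1/+2 terms (Laplace smoothing) keep a brand-new user at a
    neutral 0.5 instead of an extreme value.
    """
    return (confirmed_reports + 1) / (total_reports + 2)

# A reliable reporter: 9 of 10 past reports were confirmed as valid.
print(reporter_weight(9, 10))   # ~0.83

# A user who mass-reports content they merely disagree with.
print(reporter_weight(1, 20))   # ~0.09
```

Under a scheme like this, a coordinated mass-reporting campaign by low-weight accounts would add up to far less signal than a handful of reports from users with a track record of accurate flags.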
Twitter is now also factoring in the behaviour of other accounts in your immediate network to assess the risk of spreading your tweets. If you happen to be on the same lists as, follow, get followed by, or interact directly or indirectly with accounts that are deemed untrustworthy, your behaviour sketch also gets impacted negatively.
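The idea that your network's behaviour bleeds into your own score can be illustrated with a toy blend. Again, this is purely a sketch under assumed names and weights; Twitter has not disclosed how (or how much) network signals actually factor in.

```python
# Illustrative sketch (not Twitter's actual algorithm): a user's risk
# factor partly inherits from the accounts they follow and interact with.

def network_risk(own_risk: float, neighbour_risks: list,
                 network_weight: float = 0.3) -> float:
    """Blend a user's own risk score with the average risk of their network.

    network_weight controls how strongly the company lets your
    neighbours' reputation colour your own (an assumed value here).
    """
    if not neighbour_risks:
        return own_risk
    avg = sum(neighbour_risks) / len(neighbour_risks)
    return (1 - network_weight) * own_risk + network_weight * avg

# A well-behaved account (own risk 0.1) surrounded by untrustworthy
# accounts still sees its overall risk score rise noticeably.
print(network_risk(0.1, [0.9, 0.8, 0.95]))
```

The point of the sketch is the asymmetry: you cannot directly control your neighbours' behaviour, yet it moves your score anyway, which is exactly what makes this kind of metric feel opaque to the user.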
Facebook has around 2.2 billion active users, and Twitter, while its user base is not as massive, still has a sizeable 336 million users globally as of July, according to research firm Statista.
Tech companies have relied on algorithms to run the basic workings of their websites: the ability to report content you think is offensive, understanding your usage preferences to serve advertisements, right down to the ability to filter out posts. However, since it is now easy to game the system (such as mass reporting of a particular piece of content or a particular user), companies need to look for more clues to understand a user. Facebook insists that this is simply a tool to make sure the content reporting feature isn't misused, and has nothing to do with a personality or reputation score of any sort.
Can we be so sure?
As things stand, you will not know what rating Facebook has given you, or whether Twitter has marked you as someone with a questionable network of friends. We do not know how these systems are designed, how the calculations are done, what makes your rating improve or plummet, or how to improve it. We also don't know which users, in which geographical locations, are being given a rating, and whether these systems are even legal under the laws of different countries. Could that open Facebook up to lawsuits, and even put it on a collision course with governments around the world?
Does this ranking business remind you of something else?
Sure it does. Our neighbour. China. The country has been mulling the idea of a social credit system for its entire population of 1.4 billion, and growing, according to the United Nations Department of Economic and Social Affairs, Population Division. The seeds were planted in June 2014, when the State Council of China published a document called "Planning Outline for the Construction of a Social Credit System", which proposed a national trust score that rates you as a citizen. The system is expected to be made mandatory by the year 2020. Wait till you are actually living in the Black Mirror world.
“The scores will be based on an integrated database that includes a vast range of information sources, rating aspects like professional conduct, corruption, type of products bought, peers’ own scores and tax evasion,” says Dr. Zahy B Ramadan of the Lebanese American University in a report titled The gamification of trust: the case of China’s “social credit”, published in February. Under this system, every action you take on a daily basis, such as what you buy, how promptly you repay your loans, where you travel, where you work, who you are friends with, and where and how much you socialize, will add up to your rating. Not paying off debt, not paying bills on time, having a complaint filed against you, being found guilty of even a petty crime, or simply having friends with low credit ratings will have a negative impact on your rating. Regularly shopping for essential supplies for the home will count in your favour, compared with regularly spending on video games online or buying cigarettes, for instance. Social media habits will also be a criterion. Everything you do every day will be logged; that is how finely every action you take will be monitored and rated. This will directly have a bearing on whether you can take a loan, buy a property, what sort of public transport you can take, and whether you can hire a rental car or even get a hotel booking without an initial deposit.
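The aggregation described above can be pictured as a running tally of weighted signals. The categories, point values and baseline below are invented for illustration only; the real system's formula has not been made public.

```python
# A toy model of how a composite "trust score" could aggregate daily
# behaviour. All signal names, weights and the 600-point baseline are
# assumptions for illustration, not the actual Chinese system.

BASELINE = 600

SIGNALS = {
    "loan_repaid_on_time": +15,
    "bought_household_essentials": +5,
    "bill_paid_late": -20,
    "petty_offence": -50,
    "bought_video_games": -5,
    "low_scoring_friend": -10,   # peers' scores feed back into yours
}

def social_score(events):
    """Sum the weights of logged daily events on top of a baseline."""
    return BASELINE + sum(SIGNALS.get(event, 0) for event in events)

# A "model citizen" day versus a penalised one.
print(social_score(["loan_repaid_on_time", "bought_household_essentials"]))  # 620
print(social_score(["bill_paid_late", "low_scoring_friend", "petty_offence"]))  # 520
```

Even this crude model shows the design consequence: because every logged event shifts one cumulative number, a single bad mark, or a single low-scoring friend, persists in everything the score later gates, from loans to travel.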
The Chinese government has consistently pitched this as a system for building “trust” and a “culture of sincerity”. “It will forge a public opinion environment where keeping trust is glorious. It will strengthen sincerity in government affairs, commercial sincerity, social sincerity and the construction of judicial credibility,” states the Social Credit System policy.
This is not a game.
“If trust is broken in one place, restrictions are imposed everywhere,” the policy document further states. What does this mean? If you have a low rating, you could find yourself blocked from certain services or avenues. These could include restrictions on travel, restrictions on where you can shop or eat, the ability to get a loan or make purchases, and more. Chinese citizens with low credit scores may not even be hired by certain companies, and certain careers, such as civil service, law or media, may become out of bounds. If you have a low score, you will not be able to enrol your child in a top school. This is not a fantasy list from Black Mirror or some other dystopian TV show. The restrictions mentioned above are part of a more exhaustive list that Chinese citizens will have to deal with after the year 2020.
If you fear that the proposed credit system will be manipulated and used for furthering the policies of the people in power, to weed out dissent or opposition, and to enhance the use of big data for deep surveillance, that may come true. Or it may not. Luigi Tomba, a senior fellow at the Australian National University, describes how governance in China works down to the very micro level, considerably differently from most other countries, perhaps because of a culture of solidifying bonds and relations through transparency and trust. "Chinese residential communities are places of intense governing and an arena of active political engagement between state and society. Chinese neighborhoods reveal much about the changing nature of governing practices in the country. Government action is driven by the need to preserve social and political stability, but such priorities must adapt to the progressive privatization of urban residential space and an increasingly complex set of societal forces," he writes in his book The Government Next Door.
We are, in many ways, already being prepped for a world where we are judged for every single action we take, on a daily basis. Those judgements are linked to a series of further actions, which may grant us, or take away from us, the ability to receive certain things in life. And that in itself is the beginning of becoming part of a herd.