Partial map of the Internet based on January 15, 2005 data found on – a digital network resembling a social network. Credit: Wikimedia Commons

A study has found that online social networks are susceptible to forming ‘opinion bubbles’ very easily, opening the door to manipulating these networks for economic and political gains.

What’s common to the next book you’re going to read, the Boston Marathon bombings, India’s “newest vote-bank” and the price of gold?

They are all susceptible to a phenomenon called ‘herding’. “This is when people make decisions simply by following the crowd rather than basing their decision on any real evidence or information other than the decisions of others,” explained Sinan Aral, an associate professor of IT and Marketing at the MIT Sloan School of Management.

Herding is common offline – especially in the stock market, where people buy or sell simply because other people are buying or selling. This is what hit the price of gold in April 2013, when market experts had no explanation for the fluctuation other than people behaving like a herd.

However, the phenomenon hasn’t been explored much online, mostly because social networks such as Facebook or Reddit are only just becoming visible as entities of social, political or economic value. Both, in fact, played critical roles in the aftermath of the recent Boston Marathon bombings, when many users turned sleuths and wrongly identified suspects, actions that cascaded into the harassment of many innocent people.

The study

Now, researchers from three universities, including Prof. Aral, have evidence that influential herds exist online as well – and that they’re easier to set off than previously thought, more so in some ways than in others.

They rigged an existing social network (unnamed) that lets users vote up or down comments posted by other people on specific topics. The researchers randomly upvoted, downvoted or left alone 101,281 new comments, then observed how people’s behaviour toward them changed over five months.

Their findings are two-pronged. One: positive and negative sentiments are propagated to different extents on social networks. Two: the herding effect was more prominent when topics in business, culture and politics were being discussed.

They saw that comments that had been randomly upvoted ended up with ratings 25 per cent higher on average than comments they hadn’t interfered with. An upvoted comment was also 32 per cent more likely to be upvoted by the next user to see it.

The study, published in Science on August 8, notes, “The small manipulation of a single random up-vote when the comment was created resulted in significantly higher accumulated ratings due to social influence,” resulting in a ‘herding bubble’.

Bubbling optimism

This bubble behaviour, however, wasn’t seen with downvoted comments. Even though they were more likely to be downvoted further, the trend was offset by a larger ‘correction effect’, whereby users were moved to turn the negative ratings positive over time.
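The asymmetry can be illustrated with a toy simulation of my own – it is not the study’s model, and all the probabilities in it are made-up assumptions. Each voter herds behind the comment’s visible score, but a larger ‘correction’ pull counteracts downward herding:

```python
import random

def simulate_comment(seed_vote, n_voters=50, herd=0.15, correct=0.20, rng=None):
    """Toy model of one comment's rating under herding.

    seed_vote: +1 (manipulated upvote), -1 (manipulated downvote), 0 (control).
    Each voter upvotes with base probability 0.5. A positive score attracts
    more upvotes (herding); a negative score attracts more downvotes too,
    but a larger 'correction' pull more than offsets that. All numbers are
    illustrative assumptions, not parameters estimated by the study.
    """
    rng = rng or random.Random()
    score = seed_vote
    for _ in range(n_voters):
        p_up = 0.5
        if score > 0:
            p_up += herd                # herd behind visible approval
        elif score < 0:
            p_up += correct - herd      # downward herding, offset by correction
        score += 1 if rng.random() < p_up else -1
    return score

def mean_final_score(seed_vote, trials=2000):
    rng = random.Random(42)             # fixed seed for reproducibility
    return sum(simulate_comment(seed_vote, rng=rng) for _ in range(trials)) / trials

up, ctrl, down = mean_final_score(+1), mean_final_score(0), mean_final_score(-1)
print(f"seeded upvote: {up:.1f}, control: {ctrl:.1f}, seeded downvote: {down:.1f}")
# the seeded upvote ends above the control (a bubble), while the seeded
# downvote is pulled back into positive territory (the correction effect)
```

Even in this crude sketch, a single fake first vote shifts where the comment ends up – upward manipulation compounds, while downward manipulation is largely neutralised.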

That each user is affected by the opinions of users that came before is fairly common-sensical. However, the asymmetry in the treatment of positive and negative opinions could have important repercussions.

For example, if you expressed that a particular political opinion was disagreeable, that sentiment wouldn’t be visible for long as other users would have neutered it to reflect a less pessimistic point of view. Your reaction would be drowned out by a corrective tide.

If, on the other hand, you agreed with a political opinion, you are likely to become a point of social influence to people visiting the comment after you, leading to an exponential climb in its ratings.

An excited polity

There’s an important implication for countries like India, where, according to a report published by the Internet and Mobile Association of India in December 2012, the number of social media users was projected to be around 66 million by July 2013, and where politicians are slowly waking up to leveraging social media to spread their messages.

Another report published earlier this year by the IRIS Knowledge foundation, Mumbai, speculated that politically active people on Facebook could be ‘the newest vote-bank with the power to shape Indian politics.’

One of the effects the report couldn’t have taken into account was this uneven treatment of dissent, especially as Prof. Aral thinks “the results will likely generalise to most ratings systems.”

While Facebook isn’t a rating system, the platform still lets people express approval or disapproval, with the “Like” button and other options available via its Open Graph API, and parties or personalities with more “Likes” are likely to have their messages reach more people. In short, it’s a sort of approval system.

Given that Aral’s findings can’t yet be generalised without further research, that almost one-tenth of all Facebook accounts are illegitimate by the company’s own admission, and that online activity reflects (in my opinion) far less human agency than offline activity does, politicians and journalists must be wary of overestimating Facebookers’ clout in influencing elections.

The herd prevails

This is only one example of how a person in a social network behaves: by expressing opinions that are, to some extent, shaped by the opinions of those who came before. But the same tendency can be exploited by fraudsters who, as in the political campaigns example, could interfere with customers’ decision-making, garnering support for products or services the customers wouldn’t otherwise use and building up brands they’d otherwise have supported less.

Prof. Aral admits that the behaviours he studied are analogous to herding tendencies offline as well. Thus, as he puts it, “understanding precisely how herding works is the first step toward averting it in a multitude of settings.”

(My friend and colleague Anuj Srivas, an active Redditor, has this to add: “Reddit recognizes this effect all too well – and in fact they’ve implemented a new measure called the ‘comment score hider’ to tackle this. Now what happens is that for a medium-to-long period of time, users cannot see how many upvotes/downvotes a particular comment has received. This removes the herding effect in a sense, as users will have to rate the comment without knowing how their fellow users have rated it.”)
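The mechanism Anuj describes can be sketched in a few lines: while a comment is younger than some configured window, show a placeholder instead of its score, so voters can’t herd behind earlier votes. The function and variable names below are my own, not Reddit’s, and the window length is just an illustrative value:

```python
from datetime import datetime, timedelta, timezone

# Illustrative hiding window; the real setting is configured per community
HIDE_WINDOW = timedelta(minutes=60)

def display_score(comment_created, score, now=None):
    """Return the score shown to users: hidden while the comment is young."""
    now = now or datetime.now(timezone.utc)
    if now - comment_created < HIDE_WINDOW:
        return "[score hidden]"   # voters can't see prior votes yet
    return str(score)

created = datetime(2013, 8, 8, 12, 0, tzinfo=timezone.utc)
print(display_score(created, 42, now=created + timedelta(minutes=10)))  # [score hidden]
print(display_score(created, 42, now=created + timedelta(hours=2)))     # 42
```

The design choice is the point: the votes are still recorded as usual, but withholding the running tally during the early, most influence-prone period forces each user to judge the comment on its content alone.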
