How Social Media is Changing Us Without Us Knowing


    Writer’s Note: In writing this blog, I do not contend that I am portraying a balanced picture of the impact of social media on our lives. Oftentimes, social media is changing us in ways we do not see. After all, the eye of the storm is calm. My focus is on the negative because the social fabric of our society is getting weaker as a result of social media being misused. As our society becomes increasingly polarized, it is, in my opinion, important to understand the role of social media in subconsciously polarizing and radicalizing us.

    Introduction: Social Media & Us

    Social media platforms are products, but their users do not pay. So how do they earn money? If we’re not paying, someone else is. The primary source of revenue for social media sites is advertising: 99% of Facebook’s revenue and 80% of Google’s total revenue come from ads. For social media platforms, we’re not the consumers; we’re the product. Advertisers pay the platforms so that we see their ads.

    Social media platforms ensure that advertisements reach the right audiences through algorithms. Algorithms are opinions embedded in code, used to determine what content to deliver to us based on our behavior. We, wittingly or unwittingly, consent to this when we use social media platforms.
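    The idea of "opinions embedded in code" can be made concrete with a toy sketch. The following is purely illustrative (all names and numbers are my own assumptions, not any platform's real system): the feed scores each post by how likely this particular user is to engage with it, based on past behavior, and shows the highest-scoring posts first.

```python
# Illustrative sketch of engagement-based ranking; every name here is
# hypothetical, not a real platform's API.

def predicted_engagement(post_topic, user_history):
    """Fraction of the user's past clicks that match this post's topic."""
    if not user_history:
        return 0.0
    return user_history.count(post_topic) / len(user_history)

def rank_feed(posts, user_history):
    """Order posts so the most 'engaging' (i.e. most familiar) come first."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_history),
                  reverse=True)

history = ["politics", "politics", "sports", "politics"]  # past clicks
feed = rank_feed(["cooking", "sports", "politics"], history)
print(feed)  # ['politics', 'sports', 'cooking']
```

    Notice the embedded opinion: the code equates "what you clicked before" with "what you should see next". That design choice, not any neutral law, decides what reaches us.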

    Companies are willing to pay huge amounts for their products to be marketed on social media. Worldwide spending on digital advertising is expected to exceed $375 billion in 2021. However, companies aren’t only paying for your information and attention. This investment is justified by social media gradually, slightly, and imperceptibly changing your behavior and perception.

    Not Just What We Do But Who We Are

    Every functional and design element of a social media platform impacts who we are. Facebook introduced the ‘Like’ button in 2009, and we quickly began to base our self-worth on the likes we get. Scientists found that seeing many likes activates the same brain circuits as eating chocolate or winning money. The dangers associated with this are best described by Tristan Harris, who worked as a design ethicist at Google:

    “Never before in history have 50 designers, 20-25 year old white guys in California, made decisions that would have an impact on two billion people. Two billion people will have thoughts they did not intend to have because a designer at Google said this is how notifications work on that screen that you wake up to in the morning.”

    Social media platforms track all online activity. They monitor browsing history, physical proximity to things, and content we look at and for how long. All this data is used to build models that predict our actions and determine who we are.

    A/B testing, also known as split testing, is used across social media to increase advertisement engagement and conversion rates. These platforms constantly roll out small experiments to find the optimal way to tailor their users’ behavior. People’s real-world behavior and emotions can be changed without ever triggering their awareness.
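    An A/B test is simple enough to sketch in a few lines. The simulation below is entirely hypothetical (the group sizes and conversion probabilities are invented for illustration): two random groups of users see two ad variants, and whichever converts better gets shipped to everyone.

```python
import random

# Hypothetical A/B (split) test: simulate showing two ad variants to two
# equal-sized random user groups and compare conversion rates.
# All numbers are made up for illustration.
random.seed(42)

def run_variant(users, conversion_prob):
    """Simulate how many users in a group click the ad."""
    return sum(1 for _ in range(users) if random.random() < conversion_prob)

group_size = 10_000
clicks_a = run_variant(group_size, 0.030)  # variant A: baseline design
clicks_b = run_variant(group_size, 0.036)  # variant B: tweaked design

rate_a = clicks_a / group_size
rate_b = clicks_b / group_size
winner = "B" if rate_b > rate_a else "A"
print(f"A: {rate_a:.3%}  B: {rate_b:.3%}  -> ship variant {winner}")
```

    Run at platform scale, thousands of such experiments settle every detail of the interface, each one chosen because it made users click more, not because it served them better.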

    In 2010, Facebook carried out a massive-scale contagion experiment, using subliminal cues on Facebook pages to increase turnout in the US congressional elections. They found that this one campaign directly increased voter turnout by about 60,000 votes, and indirectly by about 280,000 more!

    Bubbles and Rabbit-Holes

    The information we consume shapes how we see the world. We’re consuming more and more of it through social media, accepting the reality we’re presented with. Social media pushes us towards content we already agree with, making it difficult for us to interact with information that contradicts our worldview. This gives us a false sense that everyone agrees with us, because our beliefs are reinforced by the bubble we live in and inadvertently helped create.

    Social media platforms have driven a shift from a tools-based technology environment to a manipulation-based technology environment. If users show interest in one conspiracy theory, they will be led to others. The algorithms draw people towards rabbit holes, like Alice getting lost in Wonderland.
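    The rabbit-hole dynamic is a feedback loop, and a toy model (my own simplification, not any real recommender) shows why it narrows so quickly: each recommendation that gets watched strengthens the very signal that produced it.

```python
# Toy model of a recommendation feedback loop. The topics and counts are
# invented for illustration; real recommenders are vastly more complex.

def recommend(interests):
    """Recommend the topic the user has engaged with most so far."""
    return max(interests, key=interests.get)

def watch(interests, topic):
    """Watching a video strengthens the user's measured interest in it."""
    interests[topic] = interests.get(topic, 0) + 1

# One extra click on conspiracy content is all it takes to seed the loop.
interests = {"news": 1, "music": 1, "conspiracy": 2}
for _ in range(5):
    topic = recommend(interests)
    watch(interests, topic)  # recommendation -> watch -> stronger signal

print(interests)  # {'news': 1, 'music': 1, 'conspiracy': 7}
```

    A small initial tilt compounds on every iteration; after a handful of cycles the user sees nothing else. That is the rabbit hole in miniature.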

    Guillaume Chaslot, who worked on YouTube’s recommendation algorithms, revealed in an interview that he was deeply concerned about them:

    “People think the algorithms are designed to give them what they really want, only it’s not. The algorithm is trying to find a few rabbit holes that are very powerful, trying to find which rabbit hole is the closest to your interest. And then if you start watching one of those videos, then it will recommend it over and over again.”

    These rabbit holes are a force to be reckoned with. First, it was ‘Pizzagate’. Pizzagate groups popped up on Facebook in 2016, and people interacting with posts about conspiracy theories were recommended #pizzagate groups. Millions of people picked up on the conspiracy that politicians from the Democratic party were running a pedophile ring out of a pizza shop in Washington.

    Taken in by this conspiracy, a man ended up firing a rifle inside a pizza restaurant in Washington. From the Pizzagate conspiracy grew QAnon, a conspiracy theory propagating even more outlandish claims than Pizzagate.

    In May 2020, Facebook purged pages and groups propagating the conspiracy. However, within 30 days, more than 10,000 people joined new groups. Investigations show that upwards of 3 million people follow the conspiracy on Facebook. Facebook’s own internal research in 2016 found that 64% of all extremist group joins are due to the recommendation tools.

    Misinformation: Hard to Escape

    64.5% of people say they receive breaking news from social media. However, a lot of what we consume as ‘news’ is fake news. An MIT study found that fake news on Twitter spreads six times faster than true news.

    Social media has fundamentally changed us. The average attention span of social media users is now only 8 seconds; that’s how long we’re willing to spend on obtaining information. An average visitor reads an article for 15 seconds or less, and the average online video watch time is 10 seconds. We skim an article for a few seconds and assume it’s true, without ever pausing to check whether it is.

    Disinformation for profit has become a business model, and social media platforms have little incentive to regulate messages that spread quickly.

    After Covid-19 spread across the world, social media was filled with misinformation. “Warm water will cure it.” “Chinese food is causing it.” “It is a hoax started by the government to hide something nefarious.” No one on social media is immune to misinformation.

    Conclusion: Is Social Media Bad?

    Absolutely not. Social media connects, inspires, and changes us. It’s a reflection of human inclinations, society, and nature. It’s not that the platforms set out to polarize and radicalize society; they built tools that were leveraged by malicious actors. The military in Myanmar capitalized on Facebook to incite violence against Rohingya Muslims. The Chinese government used Twitter to sow political discord in Hong Kong. And the stories go on.

    The solution isn’t the abandonment of social media platforms or boycotting those who’ve gone down dangerous rabbit holes. We can push for social media platforms to invest more into monitoring hate speech. If we make a conscious effort to monitor our own behavior, we’d be able to escape the darkest corners of the virtual world.
