This week, I was involved in a heated conversation about Facebook and whistleblower Frances Haugen. Although it was a small group discussion, I could not help but think that the discourse was a microcosm of conversations taking place all over the country.
Two of the participants in the small group discussion seemed adamant that their beliefs were the absolute truth. Any contrary information was met with a repetitive (and perhaps erroneous) argument delivered in an increasingly louder tone of voice. Unfortunately, such conversational tactics are common among individuals who have not done much in-depth research into the issue at hand. Such individuals resort to aggressive rhetoric instead of engaging in friendly, scholarly debate.
After a long period of remaining silent during the discussion, I felt obligated to interject in order to support someone who seemed better versed in the topic. Frankly, it felt like he was being bullied by the other members of the group, and I had reached the limit of my patience. Although I prefer to avoid political debates like the plague, the tipping point was when our conversation about Facebook somehow took a turn toward a conflict about whether or not the incident at the Capitol Building on January 6, 2021, was an insurrection.
It occurred to me that two of the gentlemen in our discussion were hell-bent on having their opinions be heard at the expense of all others. On a personal level, I was frustrated by yet another conversation without real substance. Based on the behavior of the loud-mouthed members of our group, I knew it was a conversation not really worth participating in. It was obvious that no one would be truly heard. On a professional level, I silently contemplated the reasons why individuals are swayed in one direction or another despite facts to the contrary.
A little bit of background is in order. Our group gathered to discuss the Facebook whistleblower’s contention that the social media giant prioritizes profit over public safety. The algorithms that Facebook uses to control what members see in their feeds are designed to optimize content that gets a reaction. Facebook’s own research shows that misinformation and hateful, polarizing content is more enticing to people and keeps them on the platform longer. Facebook makes more money when people consume more content. The more anger people are exposed to, the more they consume.
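The ranking logic described above can be illustrated with a toy sketch. To be clear, this is not Facebook's actual algorithm, and every name and number below is invented for illustration; it only captures the general idea the whistleblower described: content predicted to provoke reactions, especially polarizing content, is ranked higher in the feed.

```python
# Toy illustration of engagement-based ranking. NOT Facebook's real
# system -- all fields, scores, and the outrage_boost factor are
# hypothetical, chosen only to show how reaction-optimizing ranking
# can favor polarizing content.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_reactions: float  # hypothetical engagement prediction
    is_polarizing: bool         # hypothetical content flag

def rank_feed(posts, outrage_boost=1.5):
    """Sort posts by predicted engagement, weighting polarizing
    content more heavily, per the article's claim that such
    content keeps people on the platform longer."""
    def score(p):
        s = p.predicted_reactions
        if p.is_polarizing:
            s *= outrage_boost
        return s
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("Cute cat video", 80.0, False),
    Post("Inflammatory hot take", 60.0, True),
    Post("Local news update", 50.0, False),
]
for post in rank_feed(feed):
    print(post.title)
```

Note how the polarizing post, despite a lower raw engagement prediction, jumps to the top of the feed once the boost is applied: this is the dynamic in which more anger leads to more consumption.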
Facebook’s internal research conflicts with its public face. Internal research indicates that Instagram (owned by Facebook) harms teenage girls by amplifying suicidal ideation and disordered eating behavior. Their research shows that young women consume the content, get increasingly depressed, and use the Instagram app more. As they do so, they come to hate their bodies more and more.
The facts of the whistleblower case and her testimony are too many to list here. But I will add that Facebook is a publicly traded company. As such, it is required not to lie or make material misstatements and omissions to its investors. It is also noteworthy that Facebook is a $1 trillion company with 2.8 billion members who represent about 60% of all internet-connected people on earth (per 60 Minutes: https://www.cbsnews.com/news/facebook-whistleblower-sec-complaint-60-minutes-2021-10-04/).
Regulating such a massive social media platform would not be easy. Many factors (e.g., freedom of speech) would come into play. But, to my mind, our group discussion would have been more productive if we had been able to collectively acknowledge Facebook's obligation to exercise corporate and social responsibility. Instead, I was immediately attacked for uttering the words "corporate and social responsibility." It seems my cohorts were advocating a world in which everyone does whatever they want. That brought us full circle, right back to where we started: in an age of outrage and misinformation.
What, then, of my question as to why some people adopt misinformation as gospel while others are inclined to be more discerning? The truth is that we are all susceptible to something called the illusory truth effect: the tendency to accept misinformation as truth once it has been repeated over and over again. Even people who initially know the information is false may eventually come to accept it as true. In the Information Age, this is an important concept to be aware of, because we are all bombarded and influenced by countless messages each day. Buyer beware.
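The repetition dynamic behind the illusory truth effect can be sketched as a toy model. The update rule and every number here are invented purely for illustration, not drawn from any empirical study; the sketch only shows the qualitative pattern that perceived truth drifts upward with each repeated exposure.

```python
# Toy model of the illusory truth effect: each repetition nudges
# perceived truth toward 1.0. The drift rate and starting belief are
# invented for illustration; this is not an empirical model.
def perceived_truth(initial_belief, repetitions, drift=0.05):
    """Return perceived truth (0.0-1.0) after repeated exposure."""
    belief = initial_belief
    for _ in range(repetitions):
        # Each exposure closes a small fraction of the gap to 1.0.
        belief = min(1.0, belief + drift * (1.0 - belief))
    return belief

# A claim initially judged unlikely (0.2) creeps toward plausibility
# after twenty repetitions.
print(round(perceived_truth(0.2, 0), 3))
print(round(perceived_truth(0.2, 20), 3))
```

The point of the sketch is simply that no single exposure flips a belief; it is the accumulation of many small, repeated exposures that does the work, which is why daily bombardment by the same messages matters.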