Did Mark Zuckerberg — one of the darlings of Silicon Valley — inadvertently unleash a 21st-century Frankenstein when he created Facebook in his Harvard dorm room in 2004?
That is essentially the question Facebook’s top echelon pondered between 2016 and 2018 at Zuckerberg’s direction, in response to growing commentary that the world’s most far-reaching social media platform, with 1.9 billion users, was sowing the seeds of divisiveness.
At the center of the question was what drove Facebook’s unprecedented success and allowed it to rake in targeted-advertising money rivaling the value of the gold secured at Fort Knox.
The algorithms Facebook employs analyze the popularity of posts, the time people spend with them, and similar posts reflecting a user’s interests, and on that basis direct users to other postings.
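The ranking mechanics described above can be illustrated with a minimal sketch. The scoring weights, post fields, and function names here are hypothetical assumptions for illustration only, not Facebook’s actual system.

```python
# Illustrative sketch of engagement-based ranking as described in the text.
# All weights and field names are hypothetical, not Facebook's real system.

def score(post, user_interests):
    # Popularity: reactions and shares on the post
    popularity = post["likes"] + 2 * post["shares"]
    # Dwell time: how long people spend with the post, in seconds
    dwell = post["avg_seconds_viewed"]
    # Interest match: overlap between post topics and the user's interests
    overlap = len(set(post["topics"]) & set(user_interests))
    return popularity + 0.5 * dwell + 10 * overlap

def recommend(posts, user_interests, k=3):
    # Direct the user to the highest-scoring postings
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)[:k]

posts = [
    {"id": 1, "likes": 5,   "shares": 0,  "avg_seconds_viewed": 4,  "topics": ["gardening"]},
    {"id": 2, "likes": 120, "shares": 40, "avg_seconds_viewed": 30, "topics": ["politics"]},
    {"id": 3, "likes": 90,  "shares": 10, "avg_seconds_viewed": 25, "topics": ["politics", "conspiracy"]},
]

# A user who engages with divisive topics gets fed more of the same
top = recommend(posts, user_interests=["politics"], k=2)
```

The feedback loop the researchers flagged arises when a ranking like this is re-run on each new batch of engagement data: whatever holds attention scores higher, so it is shown more, so it gathers still more engagement.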
The danger of that is reflected in a presentation Facebook researchers made in 2016. It showed extremist postings accounted for over 30 percent of German political groups’ Facebook material. The researchers determined that a small subset of users with racist views and endless conspiracy theories was influencing other users disproportionately. Facebook’s own research showed “64 percent of all extremist group joins are due to our recommendation tools” driven by the platform’s algorithms. In short, the researchers concluded “our recommendation system grows the problem.”
That means the best recruiting tool available to radical organizations on the far left and far right, groups fueled by hatred that promote violence, is the set of algorithms Facebook put in place, the same algorithms that have made Zuckerberg and the other tech whizzes working for the company he founded rich.
A slide from a 2018 Facebook effort to make the platform less polarizing stated “our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked (users would be fed) more and more divisive content in an effort to gain user attention & increase time on the platform.”
The research effort spelled it out for Facebook senior executives. The algorithms that spurred Facebook’s growth in users and wealth capitalized on polarization and tribalism to drive people further apart. It was the exact opposite of Facebook’s carefully crafted altruistic goal of bringing people together.
Zuckerberg himself ultimately pulled the plug on the effort to make Facebook less polarizing through major shifts in its algorithm strategies.
In a way that keeps Facebook as superficial as the original prototype Zuckerberg and three others designed 17 years ago, which used the Internet to rate students’ photos as to who was hot and who wasn’t on the Harvard campus.
Zuckerberg hacked into Harvard’s security network, pilfered photos of students that dorms used to verify identity, put two up at a time to populate his website and then rolled out Facemash to the world.
Basically, a platform launched with the intent of attracting users who would judge who was hot and who was a reject based on their looks, using photos stolen in an invasion of their privacy, is now using a similarly shallow and ethically challenged strategy on a global level, taking bullying to new lows with deadly and racially divisive consequences, all in the name of the almighty dollar.
A case can be made that Zuckerberg’s creation doesn’t put a proverbial gun to people’s heads to make them use it. But consider the internal soul searching that was discarded once it was determined that the very reason for Facebook’s success would have to change to remedy the spread of divisiveness. Given that, you could argue the platform is pushing potent digital opioids that rot the heart, destroy the soul and bury reason, treating users as the equivalent of addicts.
That is what Facebook’s own internal words, stating “our algorithms exploit the human brain’s attraction to divisiveness,” confirm.
Take the incident in Oakdale that happened on Wednesday. Two groups gathered across the street from each other in that city’s downtown when a melee broke out. One side shouted “All Lives Matter.” The other side shouted “Black Lives Matter.” At one point a fight started in the middle of the street triggering a brawl with most people fleeing for safety. That led to a state of emergency being declared throughout Stanislaus County out of concern violent confrontations could spread to other cities.
What this has to do with Facebook — and even its kissing cousins in the social media world — can be found in what happened the days before the gatherings in Oakdale.
If you trace back some of the postings you will find a tie back to the City of Oakdale in Minnesota. It is a town of 27,000 where Derek Chauvin — the officer shown on video with his knee on George Floyd’s neck in the incident that led to Floyd’s death — lived.
There were protests in that Oakdale with a lot of references to how the Minnesota city was racist.
There were a few postings, made after plans were announced for the peaceful Black Lives Matter rally in the Oakdale in Stanislaus County, that linked to claims the Oakdale in Minnesota was a hotbed of racism.
The language quickly deteriorated in some social media chatter. It got to the point that if you “liked” Oakdale protest posts you were tied to fairly raw postings regarding the other Oakdale.
Facebook was not alone, obviously. Several days ago, before the planned rally in the self-proclaimed Cowboy Capital of the World took place, you could Google “Oakdale protest” and be directed to a prior protest in Oakdale, Minnesota. A 6-year-old could easily go to various forms of social media and get on threads, or be directed to them, in connection with the Minnesota Oakdale.
While most postings were not much more than strident, there were inflammatory ones on both sides.
None of this is to pass judgment either way on either Oakdale. Nor is it to blame what happens on the streets on tech giants.
A case can be made that Zuckerberg and others — while clearly motivated by money despite what they say — haven’t simply put tools of mass communication in the hands of people but have designed those tools so algorithms make “decisions” to direct people toward those who share their biases. In doing so they create a target for people with agendas to manipulate from the comfort of their domicile instead of trying to engage, or recruit, them in person.
Making it all worse is confirmation bias. If you get all of your “news” from sites that deliberately slant stories to the hard left or hard right, then you are going to buy a lot of things without thinking, especially when firms such as Facebook essentially feed them to you.
In their semi-defense, there are tools you can use to click down to the source. However, in a world social media techies created, one that boils what passes these days for give and take down to 140 characters, few people bother to use complete words, let alone double-check that whatever is fed to them via Facebook passes the smell test.