Former Facebook data scientist Sophie Zhang has released a 6,600-word memo exposing how the platform is ripe for abuse and is being used to help sway political elections around the world.
Zhang worked on Facebook's Site Integrity fake-engagement team for three years, where she discovered foreign governments attempting to abuse the platform.
This bombshell raises an interesting thought: we often get caught up in the latest trends and news stories on social media, but we rarely question why everyone is talking about certain topics.
So, what does the memo expose?
Zhang found evidence of coordinated campaigns aimed at boosting or hindering political candidates and outcomes. With enough money, it's possible to spread a talking point across a country: you can buy ads and leverage user profiles to create fake engagement on posts, giving them the appearance of popular opinion. Facebook is unable to identify some of these bad actors.
In the memo, some of the countries where Facebook was used to sway public opinion include Spain, Brazil, India, Ukraine, and Ecuador.
It highlights “10.5 million fake reactions and fans from high-profile politicians in Brazil and the US in the 2018 elections.”
Bots on social media became a topic of widespread discussion during the 2016 U.S. elections, amid claims that Russia was interfering. Bots are accounts flagged for demonstrating inauthentic activity. According to the memo, they were prevalent in countries such as Ecuador and Bolivia but were not prioritized because of Zhang's workload. She claims the company “simply didn’t care enough to stop them”.
Zhang didn’t appear to abuse her position as a self-described ‘part-time dictator’ at Facebook, but what happens when an employee is biased or corrupted by power? We see such abuses everywhere. Would they turn a blind eye, or focus on hurting certain political candidates?
Zhang also states that 672,000 fake accounts were removed after she discovered coordinated manipulation on the Spanish Health Ministry’s Facebook page during the COVID-19 pandemic.
This just scratches the surface of the memo, which is being published in parts.
How could Facebook change to help prevent global manipulation?
- Stop selling data to third parties. Anyone with a big budget can access users’ personal information and then bombard them with hyper-targeted ads to push an agenda.
- Remove advertising to block unwanted strangers from directly influencing and communicating with users on the platform.
- Add more privacy options so people can choose exactly what information they see and share with everyone on the platform. This means less data sharing, which conflicts with the interests of advertisers.
- Enhance ID verification at sign-up to deter scammers, predators, and ‘bots’, who typically create fake accounts to mask their dubious activity.
- Break up the platform and change its algorithms to reduce the amount of centralised manipulation that can take place. Viral content quickly becomes international news because everything is engineered to spread like wildfire, reaching so many people at once.
All of these actions would benefit Facebook users, but they would destroy the tech giant because its business depends on the free, ad-funded model of social media. myBubble is the first social media platform to offer digital ID verification with no ads and no data selling. Our revolutionary model provides many amazing features for all social interactions while keeping users safe and secure. To achieve this, we use an affordable paid model. When you sign up to be an early tester, you get your first 3 months free. It also gives you the chance to have your ideas implemented in the app, which launches in January 2021. Learn more here.