Have you seen me?
Probably not.
In fact, you're probably not even seeing this right now.
Though you may have read and enjoyed my articles in the past, and though you may want to see and enjoy my posts in the future, you probably aren't seeing them anymore.
The reason? Facebook has employed a new algorithm to determine exactly what you're allowed to see on your news feed.
Like a parent or a government censor, they are scanning your content for certain words, judging your posts based on interactions, and otherwise making choices on your behalf without your consent.
Unless someone pays them to do otherwise. Then they'll spam you with nonsense - fake news, lies, propaganda: it doesn't matter so long as money is changing hands.
Our democracy is a fading dream. Fascism is on the rise.
So homegrown blogs like this one are left in the dust while corporations and lobbyists get a megaphone to shout their ideas across social media.
Look, I don't mean to minimize what Facebook does. There's a ton of information that comes through the network that COULD be displayed on your screen. The company uses an algorithm - a complex set of steps - to determine exactly what to show you and when. But instead of basing that solely on who you've friended and what you're interested in, they've prioritized businesses and shut down the little guy.
Since Facebook made the change in January, my blog only gets about 40% of the hits it did in years past. And I'm not alone. Other edu-bloggers and organizations dedicated to fighting school privatization and standardization are reporting the same problems - our voices are being silenced.
And all this is happening after a series of Facebook scandals.
After the whole Cambridge Analytica outrage where Facebook gave the data of 87 million users - without their consent - to a political analysis firm that used it to help elect Trump...
After Facebook sold more than $100,000 in advertisements in 2016 to Russian bots, which used them to spread propaganda to help elect Trump...
After enabling the spread of hate speech in Myanmar which allowed the military to engage in "ethnic cleansing" of the Rohingya Muslim minority - which has forced 700,000 people from their homes and across the border into neighboring Bangladesh...
After all that, Facebook still pretends that changing its algorithm is simply a way to crack down on "fake news."
It's not.
They are controlling information.
They are policing free expression.
They are NOT cracking down on falsehoods and deception.
In fact, much of what they're doing is completely devoid of ideology. It's business - pure and simple.
They're monetizing the platform. They're finding new and creative ways to squeeze money out of content providers who want access to users' news feeds.
This won't stop propaganda and fabrications. It just charges a fee to propagate them.
It's the same thing that allowed those Russian bots to spread Trump-friendly lies in 2016.
It's pay-to-play. That's all.
Founder and CEO Mark Zuckerberg characterized the change in January of 2018 as prioritizing content from "friends, family and groups."
Zuckerberg admitted this means it will be harder for brands and publishers to reach an audience on the social media platform - unless they pay for the privilege. That's significant because even though organic reach had been diminishing for some time, this is the first time the company admitted it.
Zuckerberg wrote:
"As we roll this out, you'll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard--it should encourage meaningful interactions between people."
What are "meaningful interactions"?
Apparently, what the company calls active interactions are more important than passive ones. So commenting and sharing are more important than merely liking something.
In practice that means if you comment on someone's post, you're more likely to see things by that person in the future. And if they respond to your comment, their post gets seen by even more people.
Reactions matter, too, as does the intensity of those reactions. If people take the time to hit "Love" for a post, it will be seen by more people than if they hit "Like." But whatever you do, don't give a negative reaction like "Sad" or "Angry." That hurts a post's chances of being seen again.
I know it's weird. If someone shares a sad story about their mother battling cancer, the appropriate response is a negative reaction. But doing so will increase the chances the post will be hidden from other viewers. Facebook wants only happy little lab rats.
Sharing a post helps it be seen, but sharing it over Messenger is even better. And just sharing it is not enough. It also needs to be engaged with by others once you share it.
Video is also prioritized over text - especially live video. So pop out those cell phone cameras, Fellini, because no one wants to read your reasoned argument against school privatization. Or they may want to, but won't be given a chance. Better to clutter up your news feed with auto-playing videos about your trip to Disney World. I suppose we social justice activists need to become more comfortable reading our stuff on camera.
And if you do happen to write something, be careful of the words you use to describe it. The algorithm is looking for negative words and clickbait. For example, if you ask readers to like or comment on your posts, that increases the chances of Facebook hiding them from others. And God forbid you say something negative, even about injustice or civil rights violations. The algorithm will hide that faster than you can say "Eric Garner." So I guess try to be positive when writing about inequality?
Do you happen to know someone famous or someone who has a lot of Facebook followers? If they engage with your posts, your writing gets seen by even more folks. It's just like high school! Being seen with the cool kids counts.
One of the best things readers can do to make sure they see your content is to follow you or your page. Even better, they can click the "Following" tab and then select "See First." That guarantees your posts appear in their feeds and aren't hidden by the algorithm.
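To pull the signals above together, here's a toy sketch of how a ranking function like this might weigh a post. Everything in it - the weights, the signal names, the structure - is my own invention for illustration; Facebook's actual algorithm is proprietary and vastly more complex.

```python
# A toy sketch of the feed-ranking signals described above.
# All weights and field names are invented for illustration --
# Facebook's real algorithm is proprietary.

from dataclasses import dataclass

@dataclass
class Post:
    comments: int = 0
    shares: int = 0              # ordinary shares
    messenger_shares: int = 0    # shares over Messenger count extra
    likes: int = 0
    loves: int = 0               # "Love" outweighs "Like"
    negative_reactions: int = 0  # "Sad" / "Angry" hurt visibility
    is_video: bool = False
    is_live_video: bool = False
    engagement_bait: bool = False  # e.g. "like and share this post!"

def feed_score(post: Post) -> float:
    score = 0.0
    # Active interactions outweigh passive ones.
    score += 4.0 * post.comments
    score += 3.0 * post.shares
    score += 5.0 * post.messenger_shares
    # Reactions count, and their intensity matters.
    score += 1.0 * post.likes
    score += 2.0 * post.loves
    score -= 2.0 * post.negative_reactions
    # Video is prioritized over text, live video most of all.
    if post.is_live_video:
        score *= 1.5
    elif post.is_video:
        score *= 1.2
    # Begging for likes or comments gets a post demoted.
    if post.engagement_bait:
        score *= 0.5
    return score
```

Under these made-up weights, a text post with ten likes would lose out to a live video with just two comments - which is exactly the perverse incentive described above.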
I know. I know.
This is all kind of silly, but Facebook is a private corporation. It should be allowed to control speech however it likes. Right?
Wrong.
The social media giant collects a ton of data about its users and sells that to advertisers. As a user, you have to make that Faustian bargain in order to gain free access to the platform. However, as we've seen, that data can be used by political organizations for nefarious ends. Private business cannot be trusted with it.
Moreover, there is the echo chamber effect. Facebook controls what users see. As such, the company has tremendous power to shape public opinion and even our conception of reality. This used to be the province of a free and independent press, but after media conglomeratization and shrinking advertising revenues, our press has become a shadow of its former self.
In order to maintain a democratic system that is not under the sway of any one party, faction or special interest group, it is essential that social media providers like Facebook become public utilities.
They must be regulated and kept free from manipulation by those who would use them for their own ends.
The way things are going, this seems more unlikely than ever.
Our democracy is a fading dream. Fascism is on the rise.
But if we want even a chance of representative government, we need to reclaim social media for ourselves. We need control over what we get to see on Facebook - whether that be a school teacher's blog or your cousin's muffin recipe.
In the meantime, do what you can to take back your own news feed.
If you want to keep seeing this blog, follow me on Facebook and click "See First." Hit "Love" on my content. Comment and share.
The only thing standing in our way right now is a brainless computer algorithm. We can outsmart it, if we work together.
Hope to be seeing you again real soon.