Facebook, We Need A Break
For more than a month now, I’ve completely eliminated Facebook from daily life. I deleted the app and haven’t thought twice about it, only recently signing on for work-related purposes.
I’ve been disillusioned with Facebook for years, especially when it comes to privacy and the death of political discourse -- something that upsets me greatly.
Recently, when I signed on for work, I noticed Facebook had devolved even further since my last visit. This is likely my final Facebook post. I’m not trying to be holier-than-thou or act like I’m better than anyone else. I simply believe Facebook, and other social media platforms, are not the solution to inspiring trust in our institutions, whether in media or government.
Specifically, over the last few days I’ve seen commentary about a particular issue that’s been not only misleading, but completely and utterly false. It’s also become abundantly clear that many people have little regard for conducting independent research and will instead believe anything they see on this platform, even if it’s a predictably partisan meme or an intentionally misleading headline designed to stoke outrage. People share these ostensible pieces of “content” like digital artillery fire, hoping to weaken their purported enemies on the opposite end of the political spectrum. The fact that adults have little to no media literacy is depressing. I want to make clear that this is not a partisan message. However you feel about a given issue is entirely your prerogative. But there should at least be a fact-based debate on the merits of the issue, which is mostly absent from today’s discourse, starting on social media and bleeding into workspaces, dinner tables, and social gatherings.
Studies have shown that about 60 percent of social media users don’t actually read an article before they share it. That speaks volumes. It’s also distressing, since more than two-thirds of adults actively use Facebook and more than four in 10 adults get their news from Facebook.
Facebook, of course, isn’t the only tech company to blame. Google serves up information that fits a particular user’s personal views. And alas, Twitter has its own array of problems. In 2016, I interviewed Michael Patrick Lynch, professor of philosophy and director of the Humanities Institute at the University of Connecticut, who likened Google to a “desire machine.”
“What the internet is good at doing is keeping track of our preferences and predicting our preferences, our desires,” Lynch told me. “So in a sense, it’s a sort of desire machine and we get what we want from it. Of course, what we want and what’s true are two different things.”
That last piece is crucial, especially as it pertains to Facebook, which has one objective: keep people engaged for as long as possible. How does it accomplish this? By delivering content that you’ll enjoy and that reaffirms your preexisting biases. People are essentially getting their news in a partisan vacuum that churns uncontrollably as they eagerly swipe for the next piece of dopamine-filled eye candy.
The purpose of this Note is to encourage people -- whatever your political viewpoint -- to seek the truth, be skeptical of content on social media that seems too good to be true, and be more open to conducting research instead of reflexively clicking “share.”
I’m sure I’ve failed this test myself over the years. But we all learn from our mistakes, and we should all do better. And, for what it’s worth, I’ve been much happier since beginning my self-imposed Facebook exile.