(6) Web 2.0, 9/11 and Fox News

--

It was called Web 2.0: a shift from static, publisher-generated content to user-generated content, empowering the individual over the organization. By the early 2000s, online purchasing had begun to disrupt brick-and-mortar sales, online publishing was chipping away at print, and Web 2.0 promised to revolutionize the way people interacted with one another on the internet.

Opening the Information Floodgates

For all of the internet’s infancy, there was a minor obstacle to publishing content: the need for a small degree of technical proficiency. “Blogs” (shortened from “weblogs”) began appearing in the late 90s, letting a user fairly easily set up a site, publish their thoughts to a domain and solicit others to read them. Still, blogs tended to be narrow in their topics of focus and personal in tone. They were akin to hearing something on the street versus reading it in the morning paper. You might be intrigued by it, but until you saw it “in print” you didn’t put much stock in it.

While some news sources had begun building their online presence in the mid-90s, the trend wouldn’t really take hold until the early 2000s. As such, the weight of reading the printed word still lent some sense of trust to the underlying message. Additionally, a print publisher with a large readership and long history had a lot more at stake than one person photocopying a pamphlet. The grumbling that some ethereal “mainstream media” was left-leaning existed, but not yet at the howl you hear today.

Since the days of William F. Buckley Jr. and his founding of the National Review in 1955, there have long been periodicals dedicated to opinions from an array of political leanings. For a while, in the spirit of the Fairness Doctrine (which formally bound broadcasters), publications that espoused an opinion on some issue were expected to provide space for an opposing view as well; doing so was a point of pride, proof of confidence in your own argument. There was also a strict expectation of journalistic integrity when reporting news, something multiple journalists learned the hard way. In 1998, Stephen Glass’ career as a journalist ended when The New Republic discovered he’d fabricated several stories. Maintaining popularity as a journalist means more eyes and thus more scrutiny: if you assert something to be true, you’re expected to have evidence to prove it.

“Either write something worth reading or do something worth writing.” — Benjamin Franklin

The Drudge Report debuted in 1995 as an email-based gossip column focused on Hollywood and Washington, D.C. It exploded in fame in 1998 after breaking the Monica Lewinsky scandal, which led to President Clinton’s impeachment. It would later migrate to a website and become an online aggregator of news, with links to stories Drudge and his editors deemed most newsworthy, usually of a right-leaning tone.

Started by Matt Drudge, it also notably employed Andrew Breitbart, who would start his own namesake news site in 2005. Steve Bannon would run Breitbart’s site from 2012, following Breitbart’s death, through 2016, when he joined Donald Trump’s presidential campaign as its chief executive. Drudge was one of the first outlets to recognize the power of targeting internet content to a specific group and building its audience around a set of shared interests.

Fox News debuted in 1996, a year after Drudge. Created by Rupert Murdoch, it was largely the brainchild of Roger Ailes, the Republican media consultant who had driven the campaigns of many right-leaning politicians. The two men intended it to be a voice of the conservative movement, a right-wing counter to CNN. Ailes would later resign after being accused of sexual harassment by numerous women. Fox, Drudge, Breitbart and talk radio became the juggernauts of communicating an increasingly entrenched far-right worldview and agenda throughout the 2000s.

September 11 Attacks, 9/11/2001 — Reuters

An Attack to Divide the Country

I moved to New York in pursuit of a writing and acting career. In the late 1990s, I had graduated from grad school and was living with a girlfriend I was crazy about. Sadly, by 2000, the relationship was falling apart and I took a job programming computers in the hopes of saving it. That office was located in Tower One of the World Trade Center.

The job didn’t save the relationship, but I enjoyed it and the financial security it provided. I was soon able to finance the production of plays my friends and I would put on. My company moved to the AT&T building across Church Street in May of 2001. A few months later, a group of nineteen utterly delusional assholes boarded planes and attacked my former office, the Pentagon and an empty field in Pennsylvania. The last was likely intended for the White House, but some folks on the plane heard about the other attacks, likely via cellphones, and decided to take on their attackers, causing the plane to crash.

For the next month or two, it felt as though the world had joined arms in solidarity. About a month after the attacks, the U.S. and a coalition of allies began striking the Taliban and Al Qaeda inside Afghanistan. I’ll admit, I was gung-ho in favor of our response and hopeful we’d conquer the foe, capture Bin Laden and vanquish them all forever. I was angry. I recognize now that life is far more nuanced and complicated than all that. Unfortunately, reality is messy, the enemy gets a vote, and one likely impact of our response was to breed a new generation of religious fanatics, Islamic and Christian.

By the first anniversary of the 9/11 attacks, fringe elements had begun to question the authenticity of the official narrative. Generally referred to, unironically, as “truthers”, these skeptics hold views ranging from the belief that authorities knew of the impending attacks but let them happen to the absurd claim that the attacks were “an inside job” of the Bush administration.

hurr-durr

Though I didn’t personally vote for Bush nor like him much as a leader, I find the suggestion of his knowledge or involvement offensive and absurd. Bureaucracy tends to be a good antidote to the execution of conspiracies.

While Fox News had steadily increased its viewership through its first few years, the terrorist attacks provided a boon: viewership increased 26% through the end of 2001. Ailes immediately recognized the value of stoking viewers’ rage, steering coverage toward anything that would incite it and further drive viewer engagement, following reality TV’s formula.

In 2005, a homemade documentary titled “Loose Change” was released on DVD, sold through the creator’s website. Despite being a patchwork of network footage edited on a laptop, built on obscure speculation and oft-dubious research, it was a hit. Its theories quickly found a home on 4chan’s /b/, helping to seed the conspiracy culture that remains to this day. What was once relegated to bus stops and crack houses was taking root in the new public square.

Removing the Filters

With radio and TV, technical requirements meant only one channel or station could exist on a given frequency; they operated somewhat like rivers in which boats of content could travel. All of that changed with the internet. The capacity to communicate a message, or provide a forum for doing so, no longer had any limits.

Facebook debuted in 2004 and found traction by 2008–2009. Its messenger and mobile services started in 2008 and were revamped in 2010. The Like button was introduced in 2009, Groups in 2010. Over the next few years the userbase exploded, adding older adults and many who had previously rejected any online social engagement. Some didn’t even use computers. By the early 2010s, Facebook posts were making their way into legal proceedings. There was a lag between when people realized they could say something on social media and when they realized there might be consequences for doing so. The internet was still some kind of fantasy world that many thought had no crossover to real life.

Twitter began in 2006 with a 140-character limit, later expanded to 280. Intended for mobile, it found an audience quickly, becoming the third-highest-ranked social networking platform by 2009. The bookseller Amazon had gone public and now sold everything under the sun, becoming one of the largest companies in the world. Jeff Bezos, who also shares my damned birthday, was now one of the richest men in the world.

Instagram launched in 2010. It and other platforms like YouTube and Twitter would give birth to the concept of “influencers” and provide megaphones for people who, like those original Real World cast members, sought attention at any cost. Soon, people were making lots of money as influencers, leading more people to find increasingly extreme ways to follow suit.

The western world had completed its migration to Web 2.0, much to the pleasure of those who owned the machines running it. Mountains of data had already been collected, algorithms defined, interfaces designed, and people were clicking away for every new hit of dopamine, making the makers fabulously rich. As the data accumulated, thicker lines were drawn, gaps between groups grew and bridges were incinerated, their ashes washed away in their respective rivers, erasing any hint they’d ever existed.

Expanding the Divide

Indeed, to drive engagement, those algorithms and interfaces prefer strengthening existing groups, driving new users to their appropriate island where they’re more likely to click and engage. There’s little incentive for users to cross a metaphoric bridge, take a boat to a hostile island, or sit silently contemplating information new to their worldview and arbitrating fact from fiction. So silos of re-amplified opinions fester, and subjects of social importance that once required the airing of opposing opinions now just wither and die in dark corners.

While the original spam was akin to casting a wide net with only marginal ability to target individuals, social media made creating custom, targeted campaigns far easier. Additionally, state actors and well-financed corporate entities like Cambridge Analytica quickly collected data of their own and used those platforms to reach large audiences in uniquely personal ways, sculpting and entrenching opinions to achieve oft-nefarious political goals. The Matrix had become real, except the ones who thought they were battling the system had yet to notice the strings wrapped around their hands, legs and lips.

“Nearly all men can stand adversity, but if you want to test a man’s character, give him power.” -Abraham Lincoln

Barack Obama was the first presidential candidate to use social media as part of his campaign. His efforts were successful: he was elected the 44th President on November 4, 2008 and reelected in 2012. He has tweeted over 15,600 times in total, including during his eight years in office.

Donald Trump joined Twitter in March 2009, primarily to advertise an upcoming book. He would later learn to engage a growing audience on the platform by frequently tweeting derisive, occasionally offensive material about Obama, questioning his birthplace and airing other inflammatory opinions, many of them born on 4chan and carried along by its enthusiasm. He tweeted over 34,000 times during his four-year presidency before having his account suspended.

Donald Trump was the first 4chan President.

--
