The Incredible Lightness of Quitting Twitter

As a literary novelist, I find it amusing that my college students regard my practice of writing first drafts by hand as practically Victorian, as if I insisted on also writing by candlelight when I could just switch on a desk lamp. Writing in spiral notebooks has been part of my creative process since I was 9 years old, and I don’t expect it to change now that I’m in my 50s. Not to mention that Victorian novelists have had a huge influence on my work.

The Victorians didn’t have Twitter, but if they had, I bet Dickens would have, like me, hopped onto the platform as soon as he heard about it. Many of us who write about culture and politics found that we got wind of things faster there than anywhere else. Regardless of political leanings, the people who observed via Twitter what was happening in America in the lead-up to the 2016 election were the ones not surprised that Trump won. That was the beauty of it: being exposed to a diversity of ideas, including ones that were shocking, racist, homophobic, and untrue. Watching, in real time, the idolatrous worship of Donald Trump by evangelicals offered a deeper peek inside a certain demographic than any newspaper article could. And yet the platform still aligned with my social justice leanings, especially when it gave people who often had no access to other tools a network by which to organize on the ground. The Arab Spring might not have happened without Twitter.

Then along came Elon Musk, and the site became something else entirely.

Buddhists know life is change. Tech likes to brag about “the singularity”: technology, especially artificial intelligence, skipping normal stages in nature and growing faster, perhaps, than the human ability to comprehend (and regulate) it. Adapting to fast-changing technology is necessary, but for someone whose creative process is adamantly low-tech, it is also a constant dilemma. What’s most important to me is hearing the still, small voice of the story, and that voice is covered by the thinnest, most brittle of membranes, easily cracked by outside influences.

When I decided to become a writer at age 9, I don’t think it was coincidental that I had also, despite growing up in a devoutly Christian household, started meditating and following Buddhist practice (even though I didn’t have the vocabulary to call it that). And, though sometimes an impatient person, I accept that art can’t be rushed. (My recently released novel took eighteen years to write.) Living in both worlds probably helped give me perspective: I enjoyed a childhood devoid of computers (where social media was two paper cups with a string stretched between them). I lived the messiness of Wite-Out and was overjoyed by the first word-processing programs. But my tech use is selective and deliberate: I use older models of hardware and software that have fewer bells and whistles and visual distractions. I also use social media–blocking software.

Yet social media like Twitter, when used mindfully, can enhance life—and work—and at one time, it did for me. 

One thing I particularly enjoyed about Twitter was how it provided writers from normally underrepresented populations—BIPOC, people who don’t live in New York City, people not related to the old money that populates much of publishing’s top editorial tier—a virtual ladder into what the writer Erin Somers, in The Atlantic, called the “walled garden” of publishing.

While art is art, selling it so the artist can create a life is of course a business. A painter friend told me that many visual artists, himself included, who work in large, wall-size murals are now creating work with an eye first to how it will look on Instagram. As shocking as it is to have technology impinge on practice like that, he said many artists liked the trade-off of a wider potential audience. Similarly, the platform helped expand my audience, especially when I started writing essays. An op-ed I wrote about North Korea for The New York Times occupied the paper’s “most read” list three days in a row in its digital form, largely because people were sharing it on Twitter. The platform also gave the piece longevity: three days after something comes out in a newspaper, it’s usually sitting under the cat food dish or in the recycling, but my essay was still being read; years later, I still hear about it.

Further, the platform’s “live” tweeting, when everyone is on at the same time and tweeting to one another, organized by the hashtags embedded in their tweets, was a unique and powerful way to create a virtual town square. This hearkened back, ironically, to my childhood in the Midwest, where everyone watched the same news, saw the same huge events (the moon landing, Nixon’s resignation, Dynasty’s cliffhanger finale), and talked about them the next day in the aisles of the Red Owl supermarket. My dim childhood memories of Walter Mondale’s 1984 campaign, which used the Wendy’s ad’s “Where’s the beef?” to call out Gary Hart’s thin policy proposals, gave way to live-tweeting Biden out-yelling Trump with “Will you shut up, man?” during the first 2020 presidential debate. The singularity!

But speaking of Trump, his ascension as a persona on Twitter was the first sign of how an unregulated version of the platform could be harmful to society. Instead of showing the kind of restraint and care expected of the highest office in the land, @POTUS and his minions, like Secretary of State Pompeo, tweeted “China virus” to millions of followers, igniting hate that resulted in lethal violence against Asian Americans. Fake followers, a.k.a. “bots,” made Trump look more popular than he was, and could be programmed to tweet support or opposition, skewing what was taken to be public opinion and even determining which issues were deemed important, which in turn shaped the news and even the government itself. It was dismaying to watch friends arguing with what I was sure were bots scripted to spout racist ideology. It reminded me of Toni Morrison’s line: “The very serious function of racism is distraction. It keeps you from doing your work.”

Was I being distracted from my work? During “Trump Twitter,” no matter what I wrote, there would be an automatic swarm of bots and actual racists and misogynists screaming at me about COVID-19 or about my being “a Marxist professor”—a miasma of hate so thick I stopped looking at replies altogether. And instead of the cute “fail whale” that early Twitter displayed when the system glitched, circa 2016 my entire feed a few times switched over to Cyrillic, similar to how a glitch on Facebook would expose endless lines of code: the guts of the soulless algorithms to which I was paying so much time and attention.

I became increasingly uncomfortable seeing harm potentiated: COVID-19 misinformation that undoubtedly killed Americans and prolonged the pandemic, or the January 6 riot and the lies that fueled it, which had innocent, hardworking Georgia election workers fleeing for their lives. Communications platforms are woefully Wild West–type spaces, but there are federal laws governing hate speech and incitement of violence, and Twitter has its own terms of use, which, when the company finally decided to enforce them, led to Trump’s permanent suspension for, among other things, spreading lies about the election and stirring up violence against the American government.

Musk, after his acquisition, lost no time passing along a homophobic lie that recast the attack targeting the then–Speaker of the House, in which her elderly husband was injured, as an encounter with a male prostitute. It seemed that overnight, Twitter became a toxic space generating more toxicity. He also wanted to single-handedly bring Trump back to the platform, as if it weren’t already bad enough.

Reader, I bailed. I deliberately violated a Musk mandate and got “permanently suspended,” same as Trump. If there’s a greater crime than attempting a coup, it’s changing your verified account’s name to “Elon Musk.” Musk was happy to let people buy blue “verified” badges and impersonate any person or company they wanted, but this self-styled “freedom of speech” turned out to be null and void if the name in question was Elon Musk. After more than a decade on the platform, having amassed thousands of followers and that verified blue check, I decided to yank away my own ladder into the walled garden. I knew it would take a hard stop, rather than hemming and hawing over the best way to delete my account, both to execute the break and to give me clarity.

As I (Elon Musk Official) waited, I observed Twitter with new eyes, noting how social media platforms have moved away from chronological feeds to algorithms that invisibly pick and choose “content.” TikTok literally aggregates a “For You” page (FYP). As much as the endless bits scrolling by seem random, or related to the people we follow, the feed is more like an algorithmic funnel squeezing out lines of data from what we willingly provide. While doomscrolling in front of a screen, we are forming opinions, worldviews, and prejudices while being invisibly shaped by the opinions, worldviews, and prejudices of whoever created the algorithm. People complain the nation is driven by people becoming radicalized after falling into loops of indoctrination; has anyone considered that maybe that’s the point? Twitter is always suggesting “who to follow,” but based on what? Wisely, if you notice, @DalaiLama is one of the few accounts that has millions of followers but follows 0.

How strange it was, maybe two weeks after I changed my account name to Elon Musk Official, to see “account not found,” as if I had never existed on the platform. And how strange to think that some employee, or even Musk himself, was combing through even small accounts like mine.

After “death by Elon,” I wondered if there would be withdrawal or a mourning period. While writing, I had developed a bad habit of toggling away to “check” Twitter (as if there were any consequences for missing something!). Suddenly there was no Twitter to check. I just stopped doing it. It was that easy. When I tested myself by going to Twitter.com, the site, now unplugged from the streams of my personal data, bored and confused rather than tempted me. Without the siren song of my personal algorithm dancing before my eyes, I faced an unorganized homepage and suggestions of “who to follow,” which used to lead me to interesting or provocative thinkers but now offered only two generic accounts: Elon Musk and The President. Seeing @ElonMusk listed before @POTUS made me slam down the lid of my laptop with a laugh.

Reader, I wrote! I scribbled in notebooks. I wrote essays on my laptop without once toggling away (admittedly, I used to have software that would block Twitter and save me from myself when I had immutable deadlines). Occasionally I’d have pangs, thinking of how the loneliness of the long-distance writer could be assuaged by a few minutes of random conversation on Twitter, or even of working my brain a bit by cruising #econtwitter or #medtwitter. In the three-dimensional world, I met people who exclaimed, “I follow you on Twitter! I love what you have to say!” and I thought about my audience, the “platform” of numbers that publishers take very seriously. Then I took my emotional temperature: I wasn’t too sad. Maybe I regretted not being able to say goodbye, but at least I did leave up my shingle:

[Screenshot: the “Marie Myung-Ok Lee” Twitter profile]

“But isn’t that going to be terrible for your career????” I was asked at my Saturday-night writing group after my private Twitter Armageddon. One writer even said they fantasized about having a blue check and had applied and appealed repeatedly to (old) Twitter to get one. But it made me realize that maybe Musk had been right, to some degree, when he called blue-check holders losers who put too much of their identity into a check. It made me wonder how much of my inner life and how many minutes of my time I did indeed lose merely by being too wedded to the idea of being a “notable” person on Twitter.

And the most notable change in my real life (IRL) is reclaiming my time. It’s a lot: not just the time I spent on Twitter but also the time I spent framing tweets in my head, time that might be better used jotting down thoughts in my notebook for potential writing ideas, putting it back into my creative bank rather than letting a for-profit platform have it for free.

Giving up Twitter also gave me my reading life back. If writing comes from my heart and brain, reading is the respiratory system that catches the oxygen that keeps it all going. My day had been so fragmented that often, instead of sitting down to read, I’d fill five- or ten-minute gaps with tweeting and scrolling through articles I found on Twitter, fragments that, in aggregate, were pushing out reading time. By reading, I don’t mean scrolling; I mean deep reading, which Stanford University’s Center for Teaching and Learning defines as reading that “uses higher-order cognitive skills such as the ability to analyze, synthesize, solve problems and reflect on preexisting knowledge.” When you read this way, you are building on everything you’ve read before, making connections. Deep reading isn’t just for writers; it’s for anyone who wants to live a rich life.

Digital reading is reading, but scrolling and other elements, like hyperlinks and ads blinking in the corners, encourage scattered attention.

My novel as published is 450 pages (down from more than 800 when it was sold). That’s a lot of words, themes, and characters to keep track of. Reviewers have praised its complex, time-traveling, five-part structure, but intuiting and building such a structure isn’t possible in a brain trained to go whichever way the digital winds blow. I remember bolting upright in the middle of the night, seized with anxiety that I couldn’t hold my 850-page draft in my head all at once, and I do think that anxiety was a subconscious protest against the erosion of my attention. But it’s also notable that the 850-page draft was completed before Twitter. I can’t help wondering what would have happened if I’d started it after I’d become so busy on Twitter. Would the pull of cat videos and shaking my virtual fist at Trump have drained motivation from the tough and delicate process of building the infrastructure of a first draft?

Further, to me, creativity is a closed system. To talk about it with someone releases energy but also removes it from the system. I don’t create art for any reason except that the urge to express myself has built up to a pressure point where it’s worse not to. As the novelist Thomas Mann famously observed, a writer is someone for whom writing is more difficult than it is for other people. Creating can be frustrating and painful, and if there is any sort of motivation, “I have to” is the primary one for me and, I suspect, for others. Twitter often provided too easy an out from the actual work of creation.

I also found myself losing my taste for all social media, going back to old, pre-internet habits of observation, like staring at the succulent on my desk. For a science article, I underwent a procedure called transcranial magnetic stimulation (TMS), which trains the brain in healthy focus, similar to the brain activity seen in Buddhist monks, but of course sped up and requiring no effort other than sitting in the chair (one CEO who indulged in this $12,000 treatment gleefully described it to me as “meditation for lazy people”). Most people paying for the treatment wanted to pep up their slowing brains while controlling anxiety. They were almost all corporate people, plus one infamous actor, all male, and the corporate types told me they were trying it to “remain competitive” in a youth market. But me? My baseline quantitative EEG revealed that I already had fast-moving alpha waves met by slow, deep pools of coolness (the average American brain is dominated by “hot” and “fast” waves, which translate to anxiety), and the scientist chuckled in wonder, telling me that with so little room for improvement, it was unlikely I’d feel any effects for the purposes of my piece.

My first brain-boosting treatment ran through lunchtime, so on the way home I stopped at a health food store for a package of Greek dolmas, a favorite easy-to-eat snack. Eating back at the apartment, I had a moment when the taste of the grape leaf, the grainy texture of the seasoned rice, and the heady scent of the olive oil melded with the slant of sun coming in the apartment window to put me in an almost druglike ten seconds of transcendence. Then the dolmas went back to tasting, well, like they always do. I couldn’t tell if this was my imagination, but I reported it to the TMS doctor-scientist, who said that, yes, this effect was called “sharpening”: a subtle but noticeable change in the intensity of sensory input, much like being on a microdose of psilocybin mushrooms. Before TMS, things like shrooms or LSD, repetitive physical movement like that of whirling dervishes, or long, slow meditation were among the few ways people could reach this kind of religious ecstasy.

As Twitter recedes in my mental map of priorities, I can look over the landscape of my online life with some objectivity. During the 2016 and 2020 election seasons, I was posting five or more times a day, especially if I was outraged, clicking on links as frantically as a pigeon in a Skinner box. I also noted that when I was writing or thinking through tough things, I tweeted a lot. I probably told myself I was salubriously blowing off steam, filling the interstitial pieces of time in my day in a positive way, i.e., building up my self-brand (I cringe watching myself write these words). But it’s undeniable that a day when I spent a lot of time on Twitter, whether I was excited that Marisa Tomei followed me or angry at the newest Trump malarkey, left me feeling depleted. Not tired in the way one is after a productive day of work. More like deficient, as if I hadn’t had enough time to read that day, too much of my psyche and cognitive capacity sucked into 140-character tweets. Instead of sitting with my thoughts, I had let them be carried away in whatever flotsam and jetsam was enraging or engaging everyone that day. I noticed that instead of facing whatever was bothering me, especially if it was inchoate, unprocessed, I would doomscroll and get mad at Trump.

It’s been a month now. Do I miss it? Not really. Does it miss me? Of course not: it’s software.  Will my career suffer for it? I guess it depends on what you call my career.  

However: I’ve read four long novels, when normally I can barely manage one a month. My internet time dropped so precipitously that my phone’s algorithm sent me a worried notice. I felt OK again getting “lost” in a book, as one should. I’ve finished three 3,000-word essays (four, including this one), which, with my normal schedule as a parent and a college professor, is four times my usual pace. Yes, I am writing more.

Previously, I had felt it a kind of duty to tweet out my resistance to social injustice, hypocrisy, and the manipulations of late-stage capitalism. But I am realizing I was manipulated, too. I am a professional writer. So why was I giving away hours of my writing to a corporation run by a megalomaniac who uses the platform to amplify antisocial messages? As digital technology experts have pointed out, the internet has grown in the absence of “algorithmic governance”: there is no central authority regulating or keeping tabs on what algorithms are doing to us.

Buddhism helps us see that all things are interconnected. Twitter in its current state is tearing away at those connections. Musk’s irresponsibility and incompetence (including losing, through his terrible management, key staff responsible for user security and compliance) have made the platform unsafe. Friends report that shocking hate tweets on the new “freedom of speech according to Musk” platform have proliferated like mushrooms. The tech world has us brainwashed that “disruption” is progress, but no, often it’s just chaos. 

The early computer scientist John von Neumann (born Neumann János Lajos) was the first to use the word “singularity” in the context of technology, especially artificial intelligence, in the 1950s. Tech people have subsequently come to view the singularity as something that will rocket us into a Jetsons-like future—not unlike half of Elon Musk’s aspirational ideas—or cure all diseases so that people (rich people, at least) will live forever. Who needs Darwinism when you can have your brain electromagnetically sharpened for $12K? But natural selection takes generations to create adaptive traits, and it leans toward improving survival, while artificial intelligence has no mandate to improve human life. Similarly, working on my book, even just a little bit each day, creates progress over time. Being on Twitter looks the same—but no matter how long I sit on Twitter, I’ll never have a finished book at the end of it.




