Welcome to the Curiosity Economy
By Serghei Sadohin | 51łÔąĎ | Mon, 27 Jul 2020

In 1954, when television sets were just becoming widespread in American homes, Alfred Hitchcock made “Rear Window.” The film portrays L. B. “Jeff” Jeffries, a magazine photographer played by James Stewart, stuck at home in a wheelchair with a broken leg, whose only entertainment is gazing through his window and observing his neighbors’ private lives in their homes. “First I watched them to kill time,” Stewart tells the viewer, “but then I couldn’t take my eyes off them, just as you wouldn’t be able to.”

The film is a subtle allusion to the new medium of television that would soon change the world. Using the window as a metaphor, Hitchcock depicts the effects of television on privacy and, most importantly, on people’s curiosity about each other. Watching one’s neighbors through a rectangular window is not much different from watching TV from a couch. In this sense, Hitchcock anticipated the cultural shifts television would bring to American society, including the figure of the couch potato, which has since become a common English idiom.


But Hitchcock’s artistic foresight went further than that. He foresaw something more profound, namely the intersection between human curiosity and the new communication technology of television. Today, in the age of the smartphone and social networks, this intersection is even more visible and can rightly be called the curiosity economy.

Global Village

Since the heyday of television, human curiosity has driven technology further. The post-TV age made entertainment portable and put it in our pockets. There is a reason smartphone screens have been getting larger, not smaller, over the years. If television meant watching random, unknown people on a wide screen, the smartphone allows us to observe our neighbors on social media. This makes social media much more engaging and personal than television. Teens today can survive without TV, but they can barely do without their smartphones. A study last year found that people would have to be paid around $1,000 to quit Facebook, even after the Cambridge Analytica scandal exposed the internet giant’s abuse of user data.

The smartphone era is sometimes referred to as the attention economy, in which tech companies treat human attention as a scarce commodity and bombard us with push notifications and updates. But our attention is fueled first and foremost by our curiosity and the desire to know what others do and say. The curiosity economy is at the heart of our “global village,” a phrase coined by the media philosopher Marshall McLuhan. Already in the 1960s, McLuhan presciently observed that “the global village is a world in which you don’t necessarily have harmony. You have extreme concern with everybody else’s business and much involvement in everybody else’s life.”

Reading a book used to be the most private and discreet way of accessing and interpreting information. The book is not a public medium like TV, social media, radio or even the newspaper; it is a fully private dialogue between the writer and the reader, completely desynchronized from the public. But even the book is losing this privacy in the curiosity economy. Amazon’s e-reader Kindle shows the most popular highlights throughout a book and recently introduced a button for Goodreads, a social media website for rating books. In the curiosity economy, it is no longer enough to interpret the content of a book for oneself: Kindle now allows you to do it “together” with other people, giving clues as to what the public, not the individual reader, perceives as worth noting. Even listening to music is not always private. The music-streaming platform Spotify has its “social” option on by default, so friends can always see what music you are listening to. If you want to listen in complete privacy, you have to keep switching the function off.

If privacy is one of the biggest concerns of our times, then curiosity is the other side of the same coin: The former is under threat because the latter has no limits.

Who Killed the Video Star?

The shift from passive TV viewing to more engaged smartphone use is also visible on the political scene. If the telegenic JFK was the first TV president, then Donald Trump is the first social media president. The TV age was mostly about presentability and image; the social media age is more about engagement and entertainment. In the TV age, one had to be physically present in a specific place at a specific time to tune in to a lengthy political debate or a presidential address to the nation, which could sometimes last for hours. This made the engagement between voter and politician less frequent but more substantial.

The portable smartphone changes that. The curious and impatient smartphone voter expects more frequent updates from his politician than the TV voter did. And the line between the politician and the influencer is increasingly blurred.

President Trump is known for starting his day by tweeting and admits he uses Twitter mainly to “keep people interested.” And he is not alone. The former Democratic presidential candidate Beto O’Rourke once live-streamed his visit to the dentist, suggesting that “if it is not on Instagram, it didn’t happen.” Ukraine’s President Volodymyr Zelensky successfully used Instagram for his electoral campaign, and it is not unusual for him to address voters on Instagram directly from the gym.

In Brazil, President Jair Bolsonaro largely avoids TV and focuses instead on engaging with voters on social media. One analyst told The Economist that Bolsonaro is perceived as more sincere on social networks because there he is usually seen among friends and family. The former Italian Deputy Prime Minister Matteo Salvini used Facebook as part of an “every selfie a vote” strategy, as described by The Atlantic. Even the old-guard presidential candidate, former New York Mayor Michael Bloomberg, ran his campaign almost entirely on social media in an effort to defeat Trump, using the president’s own tactics of simplistic communication through memes.

In short, in the post-TV age the politician has to drop the suit and tie and behave more like your next-door neighbor. The public figure must provide the voter with a constant stream of information where the emphasis often is on quantity rather than quality.

Survival Instincts

In the Information Age, this constant stream of updates does not go to waste. It is now a valued resource we call data. Just as a river’s stream is converted into energy, we now use this flow of information to create intelligence — artificial intelligence. The enormous amount of information we generate is turned into a commodity that is now officially more valuable than oil. We have become the hunter-gatherers of information, and this new gold is no longer found underground but on the servers of tech companies.

But how did we arrive at this global village? How could the value of Facebook — a website originally intended for college students to look at each other’s pictures — become larger than the GDP of Argentina? What drove curiosity to become such an important pillar of today’s tech-based society?

One answer is that curiosity is deeply ingrained in our survival instincts. It is a human trait interlinked with prudence and the fear of the unknown. The word comes from the Latin cūriōsus — a careful, diligent person — with the root word cura, or care. Since prehistoric times, human beings were never really safe in their village or cave, so they had to explore their immediate surroundings and expand the known territory, scanning for possible threats from outside. Attack was always the best form of defense. Exploring and conquering distant lands was a form of protection from the unknown. So was conquering nature: Driven by curiosity, every scientific discovery exposed nature’s secrets and, as a result, its threats. It is telling that NASA’s rover currently exploring Mars is called Curiosity.

It seems that we are curious about each other for the sake of connection as much as protection. If, as the French philosopher Jean-Paul Sartre famously quipped, hell is other people, then being curious about each other also means keeping a close eye on each other. In the end, it is Stewart’s curiosity in “Rear Window” that saves the day when he eventually discovers a murderer among his neighbors.

In “Man and Technics,” the historian Oswald Spengler observes that in the animal kingdom, keeping a close eye on each other is essential for survival. Carnivores higher up the food chain usually have their eyes fixed at the front of the skull so they can track their moving prey, the herbivores. In turn, many herbivores have their eyes set sideways, which allows them to spot lurking predators while they graze.

The human eye is even more complex. Scientists have found that humans are the only living creatures with large white areas — the sclera — around the irises, which allows us to track the direction of each other’s gaze with remarkable precision. Just like animals, we rely on information accessed by sight, smell or hearing. As the saying goes, information is power. But in addition, we have something animals don’t have: speech. This makes us information predators, preying on each other in our own particular way.

Curiosity and the need to stay informed have pushed humanity to constantly improve its communication methods by preserving and extending speech across space and time. When speech evolved into writing, we could, via a piece of paper, put spoken words in our pocket or send a message overseas. The technology of writing made speech portable across space and durable across time. We did something similar with the smartphone: We put the stationary TV set, the typewriter and the telephone into a single device that fits in our pockets. Just like the piece of paper containing speech, the smartphone encapsulates all our communication devices in a portable form, easily accessible at all times. The ancient project of extending our communication capability across space and time has remained the same.

But the question that arises more and more often these days is whether we now have too much information. If information is our new oil, then a simple rule of economics says that an increase in quantity means a decrease in value. It is not the tech industry that experiences this depreciation, since the more input the AI machine has, the better. It is rather in the socio-political sphere that the depreciation is most visible. The desire for information for information’s sake risks turning politics into entertainment. One cannot have one’s cake and eat it too. “He knows a lot about them by now,” the narrator of “Rear Window” sums up Stewart’s curiosity. “Too much perhaps.”

The views expressed in this article are the author’s own and do not necessarily reflect 51łÔąĎ’s editorial policy.

Is Twitter Killing the Written Word?
By Serghei Sadohin | 51łÔąĎ | Fri, 14 Feb 2020

If Donald Trump had anything in common with a poet, it would be Homer. Homer was a pre-literate poet from Ancient Greece, while Trump is a post-literate US president who prefers to be briefed orally, gets his news from television and boasts that he likes to read as little as possible. Homer refers to speech as “winged words” because the spoken word flies away in the blink of an eye. Trump likes to use Twitter — the digital equivalent of the “winged word,” bearing the logo of a bird.

Homer uses formulaic epithets over and over again, such as “clever Odysseus” or “wise Nestor,” for easy memorization. Trump is a master of repetitive catchphrases and nicknames like “Crooked Hillary” or “Sleepy Joe,” for easy “hashtagization.”


Even though Homer lived in times when information was transmitted only orally, Trump managed to become president with as little recourse to the written word as possible. More precisely, Twitter’s 280 characters were enough. But is the issue bigger than Trump and Twitter as far as mass communication is concerned today? Is the Twitter age, in fact, a post-written age? To understand the character of the written word as well as its implications, a little history could be helpful.

The Written Age

Canadian media scholar Marshall McLuhan and his colleagues from the Toronto School of Communication traced the transition from the “acoustic age” of Homer to the “written age” of Plato, who banned oral poets Homer and Hesiod from his “Republic.” Plato was concerned that information transmitted orally limits the acquisition of new knowledge, since for hundreds of years the pre-literate Greeks could only recite by heart the same oral epics instead of moving on to acquiring new information.

By writing down the oral Socratic dialogues and praising the eye that “radiates with intelligence,” Plato set forth a new era in the history of communication. Literacy replaced tribal memorization with private analysis and scrutiny, which would eventually become the bedrock of Western liberal democracy and thought. The acoustic age of the ear gave way to the written age, dominated by the eye. As McLuhan says in “The Gutenberg Galaxy,” the eye detribalized the Greek because he could now have a private “point of view” on what he had merely heard from the minstrel.

A few centuries after Gutenberg, Nietzsche would say that “the German does not read aloud, does not read for the ear, but merely with his eyes: he has put his ears away in the drawer.” A few years ago, Google illustrated McLuhan’s understanding of the different epochs in the history of communication.

But the 20th century brought back the ear from the drawer with radio, television and the internet. The electronic age allowed mass communication to take on an acoustic form once again. As with any technology, the aim is always facilitation, speed and ease of use and, as far as communication is concerned, there is nothing easier than the spoken word. For example, Siri and Alexa were designed for that very reason.

Paradoxically, the faster and more efficient mass communication becomes in the digital age, the closer it gets to the acoustic age of Homer. “We are marching backwards into the future,” McLuhan warned several decades before the internet. Or, as Harold Innis writes in “The Bias of Communication,” “Improvements in communication … make for increased difficulties of understanding.”

Today, the written word is losing its dominance, and books are no longer the dominant medium of communication, either digitally or on paper. Several studies show that reading is in decline in the West, as opposed to digital and social media use. In the electronic age of fast information, the book is relegated to the status of a feel-good beach companion rather than a primary source of information.

The simple act of reading has almost turned into an exercise in slow reading. Books are so “slow” in the digital age that they have to be actively revived. In this new “acoustic” environment dominated by TV, podcasts, audiobooks, YouTube and Netflix, the written word has to catch up and adapt. It has to come closer to the spoken word in terms of speed and facilitation. And this brings us to Twitter, the equivalent of Siri and Alexa in the public sphere of mass communication.

Short, Inconsequential Bursts of Information

Twitter’s acoustic features are already implied in the name itself. According to its CEO Jack Dorsey, the name evokes “a short inconsequential burst of information, chirps from birds.” Trump is the first to admit he uses Twitter like the spoken word: “So when somebody says something about me, I am able to go bing, bing, bing, and I take care of it … The other way, I would never get the word out.” The US president even speaks in hashtags. The #CrookedHillary hashtag became so popular that he even attempted to buy it from Twitter and turn it into an emoji.

The hashtag is a catchphrase that has to be recognizable and memorable — a trick already used in Homer’s acoustic days. Present on all social media by now, the hashtag works like the Homeric poem: One needs to keep repeating it until it has an effect. The more widely a hashtag is used, the more influential it becomes in public discourse. Very complex issues are crammed into a simple cluster of words and “fly” across global cyberspace like the spoken word.

The hashtag was not intended by its originators to function the way it does today. A study by the think tank Bruegel found that analyzing the #Brexit hashtag predicted the outcome of the 2016 EU referendum more accurately than opinion polls, betting odds and political pundits did. Once something becomes a hashtag, it morphs into a thing of its own by the power of popular use.

And this is where sophisticated democratic debate suffers. Politicians and the media are less in control of political debate and are instead led by hashtags themselves, just like the Greek listeners who were swayed by the Homeric epics. Is technology bringing us back to Homer’s acoustic age? If so, then we should perhaps study the ancient heroes to better understand our current leaders.

*[Correction: An earlier version of this article incorrectly named the Bruegel think tank as Brueghel. Updated 2/18/2020 at 11 a.m. GMT.]

The views expressed in this article are the author’s own and do not necessarily reflect 51łÔąĎ’s editorial policy.
