Friday Column: Is Google Making Us Read Worse?

I tried very hard to take seriously Nicholas Carr’s article in The Atlantic, which has the provocative, and lately rather fashionable, thesis that the Internet is changing the way we read. Google is making us all info-snackers in search of the quick answer; there’s so much content at hand that we can barely stand to get halfway through something before we’re jumping off to the next thing.

I’ll admit, certain aspects of Carr’s argument feel intuitively correct. And I’m seeing an awful lot of books lately about how dumb Americans are becoming.

But when an idea becomes this popular, when it begins to develop that plasticized reek of conventional wisdom, it’s almost begging to be refuted. This is an oblique way of saying that, at this stage in the Google-is-ruining-information debate, someone looking to write an article on how the Internet is killing our attention spans needs something more substantial than the bland assertions Carr brings to the table.

Or to take on this essay from another angle, when someone gets a basic fact like this incorrect, it’s an indication that he’s not being especially rigorous in his theorizing:

Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet.

One problem: Chinese doesn’t consist of ideograms. No, it consists of characters that stand for morphemes, units roughly analogous to the syllables found in languages written with the Roman alphabet. That this small fact completely subverts Carr’s example is emblematic of the problems confronting the essay as a whole. For more on this, just wait till we get to Nietzsche’s typewriter.

I picked up the information about the Chinese language while reading a book (one about the deciphering of ancient Mayan, another character-based language that doesn’t consist of ideograms), and the fact that I read said book all the way to the end makes me a sort of rarity, at least according to Carr’s anecdotal research into his friends’ Internet-ravaged reading habits. I maintain the ability to read lengthy texts despite regular exposure to the Internet, and among Carr’s circle that makes me pretty special:

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether.

Okay, a confession: I’m not special. I’m just normal, or maybe a little too smart for my own good. I’m not sure, but what I will state with full confidence is that anyone who uses the Internet regularly retains full capacity to read a book. It’s not very hard. What’s hard is leaping from Carr’s stories about his friends to any meaningful warning about the Internet’s effects on our reading habits.

Similarly, Carr’s tale about the miraculous transformation of Nietzsche’s style after he bought a typewriter is simply too good to believe. We are asked to accept that as soon as Nietzsche acquired the machine, "his already terse prose had become even tighter, more telegraphic." We might say with equal authority that Nietzsche’s meeting Lou Andreas-Salomé in 1882, falling in love with her, and sinking into suicidal despair after the relationship ended badly curtailed his time and forced him to write more epigrammatically.

But even if there is a grain of truth to Carr’s Nietzsche story, what does it prove? Most of the great Modernist texts were composed on a typewriter, and many of them are neither particularly short nor particularly light. And what to make of the authors writing on PCs, a quantum leap over typewriters? They’ve managed to write incredibly long, complex novels.

My point is, yes the medium will have some effect, but people aren’t automatons. (Well, at least not the ones worth talking to.) We can overcome whatever the medium dictates to us.

I do agree with Carr’s assertion that the Internet is changing the way previous media are used; that is, the Internet is swallowing up radio, TV, magazines, and newspapers whole and regurgitating Internet-digested versions of them that are quickly becoming the norm. This is quite clearly true, and it will transform the way things are presented in these media. It’s already happening and, as Carr demonstrates, it’s completely normal.

I’ll even go so far as to agree with Carr that the fabric of the human mind is malleable, and that certain things we do, like learning to speak and read a second language, can make permanent changes to our neural pathways.

But it’s a pretty big leap from there to saying that the Internet is making us incapable of reading book-length, or even essay-length, texts. Over the past two years I’ve become proficient in Spanish, and, yes, I can feel the rewiring when I occasionally bring Spanish grammar into an English sentence. But, clearly, I maintain my ability to speak and function in English just as well as before I rewired my brain. I haven’t lost any of my previous English ability just by learning something new; if anything, Spanish has enriched my English in ways I never would have anticipated.

Similarly, I understand that the Internet has changed the way I look at a text, and staring at a screen full of tantalizing essays can make it difficult to pay attention. Heck, I’ve got 11 tabs open in Firefox right now with untold thousands of eruditely arranged words screaming for my attention. I know what it feels like to want to read it all right this second.

I might add that I felt that way long, long before I ever became addicted to the Internet. In fact, I felt, and still feel, this dangerous sensation from books. That is, I feel the anxiety any booklover feels when contemplating a "to be read" stack just as I feel the information overload of a full feed of content. This is nothing new for me and, I suspect, for many others.

What I’m saying is that responsible adults have been and will continue to be threatened by multiple noisy, seductive distractions. And just as responsible adults have done for a long, long time, I’m fully able to turn down the noise, take my books and articles one by one, and give each the attention it deserves.

Or to put it another way, if we’re all a bunch of info-junkies tripping our way toward Internet-ADD, then why was one of the longest, most challenging texts to emerge in English in years the "big book" of BEA?

In the end, I think Carr ends up committing the very same mistake that he chides Google for:

Still, [Google’s] easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

Obviously the human brain isn’t just another machine. And since it’s not, Carr should know that the brain isn’t so easy to mess around with. I recently read that there are more potential neural pathways in the average brain than there are particles in the universe. Lots more.

The brain is so huge and amazing and enormously complex that it’s far, far off base to think that a few years of Internet media or the acquisition of a typewriter can fundamentally rewire it. Yes, it’s true that our machines will have an impact on our lives, but that doesn’t mean that we’re just machines too.



Your comment that ‘people aren’t automatons. (Well, at least not the ones worth talking to.)’ reveals the truly difficult task of building relationships – finding people with whom a friendship is worthwhile. Intelligent conversation is rare, because people with enough of the right qualities to hold them are rare.
This isn’t Google’s fault. The ‘fault’ lies with our contemporary view that all people are equally smart, equally capable, that flaws such as inattention and lack of rigor are acceptable; the ‘I’m ok, you’re ok’ attitude that undermines our ability to differentiate smart from dumb, quality from flawed, and so forth, in other people. It’s a version of the democratic value of human worth, improperly revised and made interpersonal.

Scott, I agree. And may I add that this is a wonderful blog.
I don’t buy that a decline in attention spans is Google’s fault. I’d blame television. If, indeed, any decline has occurred – what’s the concrete evidence? Book sales? They’re still going up.
For every challenge that the internet presents to the world-as-we-know-it, there is an opportunity. Google has made blogging so easy that even a technical idiot such as myself can do it. I’ve chosen to use blogging as a vehicle for publishing short stories, since there is such a lack of outlets for them. The good news: Now anybody can publish their short fiction. The bad news: Now anybody can publish their short fiction.
And – hey – maybe it dovetails with shorter attention spans.

A note on the Chinese:
Linguistically, Chinese characters do work more or less like morphemes (units of meaning). They often come in two-character sets that would roughly equate to what we Teutonic/Romance language speakers would think of as a conventional “word.” And yes, while ideograms are one of the six basic categories of Chinese character etymology, 90% of all modern characters were created by a phono-semantic process (semantic “radical” + character that is used for its sound). That said, though one can make an educated guess about how an unknown phono-semantic compound character may be pronounced in modern Mandarin, there is a certain level of uncertainty until the character and its corresponding pronunciation have been memorized. This problem is exacerbated in Cantonese, various Chinese dialects, and Japanese – all of which use (to one extent or another) Chinese characters in their writing systems.
In my opinion, both you and Carr are correct as far as the basic reading is concerned. Ideograms and pictograms do exist. That 90% for phono-semantic compounds includes ALL characters currently written down in a dictionary somewhere. The basic, day-to-day language is rife with older characters that have been carried through time as is. Some examples:
pictograms: 人 (person), 雨 (rain), 木 (tree)
ideograms: 三 (3), 上/下 (above/below)
Not to mention the associative compounds that physically combine two characters to create a new meaning:
休 ‘person’ + ‘tree’ = to rest (a person sitting under a tree)
etc. etc.
What this all boils down to is, many Chinese characters are very likely stored by language users (including Cantonese and Japanese) differently than purely phonetic/syllabic alphabets. Because the Chinese writing system has been in continuous usage for so long, its characters do not fit, etymologically, into the neat little categories most European languages do. So while conflating all Chinese characters into ideograms is surely incorrect, the pronunciation-meaning divide in the language does lead to different linguistic storage and retrieval methods than those that exist for languages where sound and meaning are inextricably linked in the written system (like our own).
What I find truly amazing, and which I believe fits into the aim of your commentary here, are Chinese websites. Some of them are so character-heavy there is little space for anything else. I wonder: do Chinese Internet users experience the same ADD that is under discussion here when they’re looking at one of these pages? Or does the non-phonetic aspect of their written language help them damp some of that extraneous ‘noise’?

“I recently read that there are more potential neural pathways in the average brain than there are particles in the universe. Lots more.”
Say what?
How could this possibly be true? And what do you mean by a particle?

A nice essay, and a worthwhile rebuke to the soon-to-be-inescapable notion of internet-induced-attention-deficit-disorder.
The Japanese use of Chinese characters is even more problematic: there are multiple possible morphological readings. This isn’t much of a problem in everyday vocabulary, but when encountering a Japanese name, one often does not know how it is pronounced until one is told.
It makes for wonderful possibilities in making puns.
However, your last two paragraphs do not follow. That the human brain has enormous potential complexity does not mean it does not also have enormous plasticity. Yes, a few years’ exposure to typewriters or the internet can change the way it functions. And does. As almost anyone who picks up a new skill knows.

Blake Emerson: the key word in Mr. Carr’s sentence is potential.
There are maybe 10-to-the-63 particles in the universe. Give me 63 (numbered) bags and ten colors of marbles to choose from, and I have more ways I can distribute marbles to the bags than there are particles in the universe. Give me 63 neurons, each able to make ten one-way connections, and there are more ways I could wire up the neurons than there are particles in the universe.
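The marble arithmetic in this comment can be checked in a few lines. The sketch below is illustrative only: the 10-to-the-63 particle figure is the comment’s own rough estimate, not an established physical count, and strictly speaking the two quantities come out equal, so “at least as many” is the precise claim:

```python
# Check the commenter's combinatorics. The particle count (10**63) is
# the comment's own rough figure, assumed here purely for illustration.

bags, colors = 63, 10

# Each numbered bag independently receives one of ten marble colors,
# so the number of distinct arrangements is colors ** bags.
arrangements = colors ** bags

particles = 10 ** 63  # the comment's estimate of particles in the universe

print(arrangements >= particles)  # True: the counts are of the same order
```

The general point survives the quibble: combinatorial explosion means a modest number of elements with a modest number of states each quickly yields astronomically many configurations.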

Well, you lost me on that one. Then again, I have 23 Firefox tabs open. Might even be a couple of duplicates in there.

You are correct that the immense complexity of our brains doesn’t preclude great plasticity. That was not my argument. My argument was that the great complexity of our brains makes them much more than a machine, and that theories that take them as such will be hopelessly flawed. Carr is correct in saying that if brain-as-machine is Google’s premise, then it’s wrong; but I think his arguments imply that he shares more of Google’s idea of the brain than he’d like to think.
I acknowledge that I could have stated all this a little more clearly than as appears in the column.

It seems to me that one neural pathway is made up of a number of neurons, and one neuron is made up of particles. So for every neural pathway, there is a correspondingly large number of particles.

As others have noted, the brain’s complexity surely does not preclude plasticity (indeed, complexity may in this case correlate with plasticity). See the comment from neuroscientist James Old about your post here.
As to “ideogram,” I agree that there’s debate on terminology, but in my article I decided to use the common term. The Oxford American Dictionary defines ideogram in this way: “a written character symbolizing the idea of a thing without indicating the sounds used to say it, e.g., numerals and Chinese characters.” The more important question is whether or not the brain circuits used for reading differ between the Chinese and those who read with alphabets. There is considerable evidence, from fMRI experiments, to indicate that they do, which in turn indicates that the brain adapts to the media used for taking in information. I’m not sure from your post whether you are disputing this point or not.
As to your fundamental point – that the brain “isn’t just another machine” – I could not agree more. Let’s hope that, as our minds continue to adapt to the Net, that remains true.

The Internet has given rise to infinite creativity as writers young and old pursue, with fierce determination, a strategy to make their mark in the tough fiction writing sweepstakes. There will be failures, disappointments, and, hopefully, a breakout or two, but I love the challenge of confronting this marriage of technology and fiction writing.

I can’t comment on any scientific aspects of this discussion, Scott, but here is the way that I feel successful in the process of reading anything.

It goes like this: By reading I enter a discourse with a voice I hear through the text that is utterly original to my ears. I myself make many associations both mental and emotional, but the voice itself is relatively simple and direct even when what it talks about is profoundly complex. I am never distracted or invited to distraction and I’m often guided by a pervasive lyricism in the voice that makes my reading closely related to the way I listen to music.

The original musician Orpheus used his lyre to support the words of his poetry–“prima la parole” in the language of opera. When Orpheus was torn into shreds for his disobedience, words and music also separated into separate arts. It seems to me that when writing reinvents or recovers what music once gave to it, it becomes successful. That is what happens with the great writers I know well, Proust above all.

I trust I make myself obscure, as Thomas More said in “A Man For All Seasons.”

