Not really a "blog", strictly speaking; more of an on-line notebook. A sort of commonplace book , where I can collect short excerpts, and related links, from books that I am reading (and the occasional on-line article). This is mostly for my benefit; things that I want to remember. Sounds dull? Yeah, maybe, but no one is twisting your arm, and besides, there's some good stuff down there...after all, there are certainly worse ways for you to waste fifteen or twenty minutes on the internet.

15.1.11

The Shallows: What the Internet Is Doing to Our Brains - Nicholas Carr

- Buy this book.

- About the Author (Wikipedia)
 
- Author's Blog

- Article by Author: "Is Google Making Us Stupid?" (Atlantic)

- Wikipedia Page for "Is Google Making Us Stupid?"

- Article: "Scientific Study Indicates That Internet Addiction May Cause Brain Damage" (Time)


"The reading of a sequence of printed pages was valuable not just for the knowledge readers acquired from the author's words but from the way those words set off intellectual vibrations within their own minds. In the quiet spaces opened up by the prolonged, undistracted reading of a book, people made their own associations, drew their own inferences and analogies, fostered their own ideas. They thought deeply as they read deeply.
    Even the earliest silent readers recognized the striking change in their consciousness that took place as they immersed themselves in the pages of a book. The medieval bishop Isaac of Syria described how, whenever he read to himself, "as in a dream, I enter a state when my sense and thoughts are concentrated. Then, when with prolonging of this silence the turmoil of memories is stilled in my heart, ceaseless waves of joy are sent me by inner thoughts, beyond expectation suddenly arising to delight my heart." Reading a book was a meditative act, but it didn't involve a clearing of the mind. It involved a filling, or replenishing, of the mind.  Readers disengaged their attention from the outward flow of passing stimuli in order to engage it more deeply with an inward flow of words, ideas, and emotions. That was - and is - the essence of the unique mental process of deep reading. It was the technology of the book that made this "strange anomaly" in our psychological history possible. The brain of the book reader was more than a literate brain. It was a literary brain."    (64-5)
    "The process of our mental and social adaptation to new intellectual technologies is reflected in, and reinforced by, the changing metaphors we use to portray and explain the workings of nature. Once maps had become common, people began to picture all sorts of natural and social relationships as cartographic, as a set of fixed, bounded arrangements in real or figurative space. We began to "map" our lives, our social spheres, even our ideas. Under the sway of the mechanical clock, people began thinking of their brains and their bodies - of the entire universe, in fact - as operating "like clock-work." In the clock's tightly interconnected gears, turning in accord with the laws of physics and forming a long and traceable chaim of cause and effect, we found a mechanistic metaphor that seemed to explain the workings of all things, as well as the relations between them. God became the Great Clockmaker. His creation was no longer a mystery to be accepted. It was a puzzle to be worked out. Wrote Descartes in 1646. "Doubtless when the swallows come in spring, they operate like clocks."    (50)


    "Differences in brain activity have even been documented among readers of different alphabetic languages. Readers of English, for instance, have been found to draw more heavily on areas of the brain associated with deciphering visual shapes than do readers of Italian. The difference stems, it's believed, from the fact that English words often look very different from the way they sound, whereas Italian words tend to be spelled exactly as they are spoken."    (52)

    "Once a means to an end, a way to identify information for deeper study, scanning is becoming an end in itself - our preferred way of gathering and making sense of information of all sorts. We've reached the point where a Rhodes Scholar like Florida State's John O'Shea - a philosophy major, no less - is comfortable admitting not only does he not read books but that he doesn't see any particular need to read them. Why bother, when you can Google the bits and pieces you need in a fraction of a second? What we are experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest."   (138) 


    "...it would be a serious mistake to look narrowly at the Net's benefits and conclude that the technology is making us more intelligent. Jordan Grafman, head of the cognitive neuroscience unit at the National Institute of Neurological Disorders and Stroke, explains that the constant shifting of our attention when we are online may make our brains nimble when it comes to multitasking, but improving our ability to multitask actually hampers our ability to think deeply and creatively. "Does optimizing for multitasking result in better functioning - that is, creativity, inventiveness, productiveness? The answer is, in more cases than not, no," says Grafman. "The more you multitask, the less deliberate you become; the less able to think and reason out a problem." You become, he argues, more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought. David Meyer, a University of Michigan neuroscientist and one of the leading experts on multitasking, makes a similar point. As we gain more experience in rapidly shifting our attention, we may "overcome some of the inefficiencies: inherent in multitasking, he says, "but except in rare circumstances, you can train until you are blue in the face and you'd never be as good as if you had just focused on one thing at a time."What we are doing when we multitask "is learning to be skillful at a superficial level." The Roman philosopher Seneca may have put it best two thousand years ago: "To be everywhere is to be nowhere."    
    In an article published in Science in early 2009, Patricia Greenfield, a prominent developmental psychologist who teaches at UCLA, reviewed more than fifty studies of the effects of different types of media on people's intelligence and learning ability. She concluded that "every medium develops some cognitive skills at the expense of others."  Our growing use of the Net and other screen-based technologies has led to the "widespread and sophisticated development of visual-spatial skills." We can, for example, rotate objects in our minds better than we used to be able to. But our "new strength in visual-spatial intelligence" goes hand in hand with a weakening of our capacities for the kind of "deep processing" that underpins "mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection." The Net is making us smarter, in other words, only if we define intelligence by the Net's own standards. If we take a broader, more traditional view of intelligence - if we think about the depth of our thought rather than just its speed - we have to come to a different and considerably darker conclusion."    (140)

    "The mental functions that are losing the "survival of the busiest" brain cell battle are those that support calm, linear thought - the ones we use in traversing a lengthy narrative or an involved argument, the ones we draw on when we reflect on our experience or contemplate an outward or inward phenomenon. The winners are those functions that help us speedily locate, categorize, and asses disparate bits of information in a variety of forms, that let us maintain our mental bearings while being bombarded by stimuli. These functions are, not coincidentally, very similar to the ones performed by computers, which are programmed for the high speed transfer of data in and out of memory. Once again, we seem to be taking on the characteristics of a popular new information technology."    (142)


    "Google is neither God nor Satan, and if there are shadows in the Googleplex they're no more than the delusions of grandeur. What's disturbing about the company's founders is not their boyish desire to create an amazingly cool machine that will be able to outhink its creators, but the pinched conception of the human mind that gives rise to such a desire."    (176)

    "The Dutch humanist Desiderius Erasmus, in his 1512 text book De Copia, stressed the connection between memory and reading. He urged students to annotate their books, using " an appropriate little sign" to mark "occurrences of striking words, archaic or novel diction, brilliant flashes of style, adages, examples, and pithy remarks worth memorizing." He also suggested that every student and teacher keep a notebook, organized by subject, "so that whenever he lights on anything worth noting down, he may write it in the appropriate section." Transcribing the excerpts in longhand, and rehearsing them regularly, would help to insure that they remained fixed in the mind. The passages were to be viewed as "kinds of flowers" which, plucked from the pages of books, could be preserved in the pages of memory.

    Erasmus, who as a schoolboy had memorized great swathes of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization's sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one's reading. He believed, as the classical historian Erika Rummel explains, that a person should "digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author." Far from being a mechanical, mindless process, Erasmus's brand of memorization engaged the mind fully. It required, Rummel writes, "creativeness and judgment."
    Erasmus's advice echoed that of the Roman Seneca, who also used a botanical metaphor to describe the essential role that memory plays in reading and thinking. "We should imitate bees," Seneca wrote, "and we should keep in separate compartments whatever we have collected from our diverse reading, for things conserved separately keep better. Then, diligently applying all the resources of our native talent, we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state." Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.
    Erasmus's recommendation that every reader keep a notebook of memorable quotations was widely and enthusiastically followed. Such notebooks, which came to be called "commonplace books," or just "commonplaces," became fixtures of Renaissance schooling. Every student kept one. By the seventeenth century, their use had spread beyond the schoolhouse. Commonplaces were viewed as necessary tools for the cultivation of the educated mind. In 1623, Francis Bacon observed that "there can hardly be anything more useful" as "a sound help for the memory" than "a good and learned Digest of Common Places." By aiding the recording of written works in memory, he wrote, a well-maintained commonplace "supplies matter to invention." Through the eighteenth century, according to American University linguistics professor Naomi Baron, "a gentleman's commonplace book" served "both as a vehicle for and a chronicle of his intellectual development."
    The popularity of commonplace books ebbed as the pace of life quickened in the nineteenth century, and by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century - audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives - greatly expanded the scope and availability of "artificial memory." Committing information to one's own mind seemed ever less essential. The Net quickly came to be seen as a replacement for, rather than just a supplement to, personal memory. Today, people routinely talk about artificial memory as though it's indistinguishable from biological memory.
    Clive Thompson, the Wired writer, refers to the Net as an "outboard brain" that is taking over the role previously played by inner memory. "I've almost given up making an effort to remember anything," he says, "because I can instantly retrieve the information online." He suggests that "by offloading data onto silicon, we free our own gray matter for more germanely 'human' tasks like brainstorming and daydreaming." David Brooks, the popular New York Times columnist, makes a similar point. "I had thought that the magic of the information age was that it allowed us to know more," he writes, "but then I realized that the magic of the information age is that it allows us to know less. It provides us with external cognitive servants - silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves."
    Peter Suderman, who writes for the American Scene, argues that, with our more or less permanent connections to the internet, "it's no longer terribly efficient to use our brains to store information." Memory, he says, should now function like a single index, pointing us to places on the Web where we can locate the information we need at the moment we need it: "Why memorize the content of a single book when you can be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored." As the Web "teaches us to think like it does," he says, we'll end up keeping "rather little deep knowledge" in our own heads. Don Tapscott, the technology writer, puts it more bluntly. Now that we can look up anything "with a click on Google," he says, "memorizing long passages or historical facts" is obsolete. Memorization is "a waste of time."
    Our embrace of the idea that computer databases provide an effective and even superior substitute for personal memory is not particularly surprising. It culminates a century-long shift in the popular view of the mind. As the machines we use to store data have become more voluminous, flexible, and responsive, we've grown accustomed to the blurring of artificial and biological memory. But it's an extraordinary development nonetheless. The notion that memory can be "outsourced," as Brooks put it, would have been unthinkable at any earlier moment in our history.
    For the ancient Greeks, memory was a goddess, Mnemosyne, mother of the Muses. To Augustine, it was "a vast and infinite profundity," a reflection of the power of God in man. The classical view remained the common view through the Middle Ages, the Renaissance, and the Enlightenment - up to, in fact, the close of the nineteenth century. When, in an 1892 lecture before a group of teachers, William James declared that "the art of remembering is the art of thinking," he was stating the obvious. Now, his words seem old-fashioned. Not only has memory lost its divinity; it's well on its way to losing its humanness. Mnemosyne has become a machine.
    The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer. If biological memory functions like a hard drive, storing bits of data in fixed locations and serving them up as inputs to the brain's calculations, then offloading that storage capacity to the Web is not just possible but, as Thompson and Brooks argue, liberating. It provides us with a much more capacious memory while clearing out space in our brains for more valuable and even more "human" computation. The analogy has a simplicity that makes it compelling, and it certainly seems more "scientific" than the suggestion that our memory is like a book of pressed flowers or the honey in a beehive's comb. But there's a problem with our new, post-Internet conception of human memory. It's wrong.    (178-82)