# The Information
![rw-book-cover](https://images-na.ssl-images-amazon.com/images/I/514ToTCeWNL._SL200_.jpg)
## Metadata
- Author: [[James Gleick]]
- Full Title: The Information
- Category: #books
## Highlights
- The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning. —Claude Shannon (1948) ([Location 47](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=47))
- Tags: [[pink]]
- An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure. But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information. ([Location 61](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=61))
- Tags: [[pink]]
- AT&T at midcentury did not demand instant gratification from its research division. It allowed detours into mathematics or astrophysics with no apparent commercial purpose. ([Location 76](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=76))
- Tags: [[pink]]
- Campbell’s solution was partly mathematics and partly electrical engineering. His employers learned not to worry much about the distinction.
Shannon himself, as a student, had never been quite able to decide whether to become an engineer or a mathematician. For Bell Labs he was both, willy-nilly, practical about circuits and relays but happiest in a realm of symbolic abstraction. ([Location 93](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=93))
- Tags: [[pink]]
- he worked with the mathematician and logician Hermann Weyl, who taught him what a theory was: “Theories permit consciousness to ‘jump over its own shadow,’ to leave behind the given, to represent the transcendent, yet, as is self-evident, only in symbols.” ([Location 100](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=100))
- Tags: [[pink]]
- For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature. It was the same with information. A rite of purification became necessary. ([Location 118](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=118))
- Tags: [[pink]]
- when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. ([Location 126](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=126))
- Tags: [[pink]]
- even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment. ([Location 135](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=135))
- Tags: [[pink]]
- Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.” This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine. ([Location 156](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=156))
- Tags: [[pink]]
- “Tomorrow,” Wheeler declares, “we will have learned to understand and express all of physics in the language of information.” ([Location 178](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=178))
- Tags: [[pink]]
- In the beginning was the word, according to John. We are the species that named itself Homo sapiens, the one who knows—and then, after reflection, amended that to Homo sapiens sapiens. The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.” ([Location 185](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=185))
- Tags: [[pink]]
- Samuel F. B. Morse was struggling with his own percussive code, the electromagnetic drumbeat designed to pulse along the telegraph wire. Inventing a code was a complex and delicate problem. He did not even think in terms of a code, at first, but “a system of signs for letters, to be indicated and marked by a quick succession of strokes or shocks of the galvanic current.” The annals of invention offered scarcely any precedent. How to convert information from one form, the everyday language, into another form suitable for transmission by wire taxed his ingenuity more than any mechanical problem of the telegraph. It is fitting that history attached Morse’s name to his code, more than to his device. ([Location 307](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=307))
- Tags: [[pink]]
- It is a useful concept but an imperfect one: linguists have found it surprisingly difficult to agree on an exact inventory of phonemes for English or any other language (most estimates for English are in the vicinity of forty-five). The problem is that a stream of speech is a continuum; a linguist may abstractly, and arbitrarily, break it into discrete units, but the meaningfulness of these units varies from speaker to speaker and depends on the context. Most speakers’ instincts about phonemes are biased, too, by their knowledge of the written alphabet, which codifies language in its own sometimes arbitrary ways. In any case, tonal languages, with their extra variable, contain many more phonemes than were first apparent to inexperienced linguists. ([Location 391](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=391))
- Tags: [[pink]]
- For every village and every tribe, the drum language began with the spoken word and shed the consonants and vowels. That was a lot to lose. The remaining information stream would be riddled with ambiguity. A double stroke on the high-tone lip of the drum [– –] matched the tonal pattern of the Kele word for father, sango, but naturally it could just as well be songe, the moon; koko, fowl; fele, a species of fish; or any other word of two high tones. Even the limited dictionary of the missionaries at Yakusu contained 130 such words. Having reduced spoken words, in all their sonic richness, to such a minimal code, how could the drums distinguish them? The answer lay partly in stress and timing, but these could not compensate for the lack of consonants and vowels. Thus, Carrington discovered, a drummer would invariably add “a little phrase” to each short word. Songe, the moon, is rendered as songe li tange la manga—“the moon looks down at the earth.” Koko, the fowl, is rendered koko olongo la bokiokio—“the fowl, the little one that says kiokio.” The extra drumbeats, far from being extraneous, provide context. Every ambiguous word begins in a cloud of possible alternative interpretations; then the unwanted possibilities evaporate. This takes place below the level of consciousness. Listeners are hearing only staccato drum tones, low and high, but in effect they “hear” the missing consonants and vowels, too. For that matter, they hear whole phrases, not individual words. “Among peoples who know nothing of writing or grammar, a word per se, cut out of its sound group, seems almost to cease to be an intelligible articulation,” Captain Rattray reported. ([Location 401](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=401))
- Tags: [[pink]]
- Note: The redundancy added to phrases does two things:
  1. It ensures that we reduce the ambiguities of words, because longer collections of drumbeats have a lower probability of an exact match than do shorter ones
  2. It improves communication reliability by ensuring that listeners who miss a few drumbeats can still understand the phrase based on the remaining context
- The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.
([Location 418](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=418))
- Tags: [[pink]]
- Odysseus wept when he heard the poet sing of his great deeds abroad because, once sung, they were no longer his alone. They belonged to anyone who heard the song. —Ward Just (2004) ([Location 464](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=464))
- Tags: [[pink]]
- Oral literature was generally treated as a variant of writing; this, Ong said, was “rather like thinking of horses as automobiles without wheels.” You can, of course, undertake to do this. Imagine writing a treatise on horses (for people who have never seen a horse) which starts with the concept not of “horse” but of “automobile,” built on the readers’ direct experience of automobiles. It proceeds to discourse on horses by always referring to them as “wheelless automobiles,” explaining to highly automobilized readers all the points of difference.… Instead of wheels, the wheelless automobiles have enlarged toenails called hooves; instead of headlights, eyes; instead of a coat of lacquer, something called hair; instead of gasoline for fuel, hay, and so on. In the end, horses are only what they are not. When it comes to understanding the preliterate past, we modern folk are hopelessly automobilized. The written word is the mechanism by which we know what we know. It organizes our thought. We may wish to understand the rise of literacy both historically and logically, but history and logic are themselves the products of literate thought. ([Location 488](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=488))
- Tags: [[pink]]
- One unlikely Luddite was also one of the first long-term beneficiaries. Plato (channeling the nonwriter Socrates) warned that this technology meant impoverishment: For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom. External characters which are no part of themselves—this was the trouble. The written word seemed insincere. Ersatz scratchings on papyrus or clay were far abstracted from the real, the free-flowing sound of language, intimately bound up with thought so as to seem coterminous with it. Writing appeared to draw knowledge away from the person, to place their memories in storage. It also separated the speaker from the listener, by so many miles or years. ([Location 504](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=504))
- Tags: [[pink]]
- the new channel does more than extend the previous channel. It enables reuse and “re-collection”—new modes. It permits whole new architectures of information. Among them are history, law, business, mathematics, and logic. Apart from their content, these categories represent new techniques. The power lies not just in the knowledge, preserved and passed forward, valuable as it is, but in the methodology: encoded visual indications, the act of transference, substituting signs for things. And then, later, signs for signs. ([Location 530](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=530))
- Tags: [[pink]]
- Chinese script began this transition between 4,500 and 8,000 years ago: signs that began as pictures came to represent meaningful units of sound. Because the basic unit was the word, thousands of distinct symbols were required. This is efficient in one way, inefficient in another. Chinese unifies an array of distinct spoken languages: people who cannot speak to one another can write to one another. It employs at least fifty thousand symbols, about six thousand commonly used and known to most literate Chinese. ([Location 541](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=541))
- Tags: [[pink]]
- Considering scripts in terms of how many symbols are required and how much meaning each individual symbol conveys, Chinese thus became an extreme case: the largest set of symbols, and the most meaningful individually. Writing systems could take alternative paths: fewer symbols, each carrying less information. An intermediate stage is the syllabary, a phonetic writing system using individual characters to represent syllables, which may or may not be meaningful. A few hundred characters can serve a language. The writing system at the opposite extreme took the longest to emerge: the alphabet, one symbol for one minimal sound. The alphabet is the most reductive, the most subversive of all scripts. ([Location 549](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=549))
- Tags: [[pink]]
- The alphabet spread by contagion. The new technology was both the virus and the vector of transmission. It could not be monopolized, and it could not be suppressed. Even children could learn these few, lightweight, semantically empty letters. Divergent routes led to alphabets of the Arab world and of northern Africa; to Hebrew and Phoenician; across central Asia, to Brahmi and related Indian script; and to Greece. The new civilization arising there brought the alphabet to a high degree of perfection. Among others, the Latin and Cyrillic alphabets followed along. ([Location 567](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=567))
- Tags: [[pink]]
- Milman Parry, a structural linguist who studied the living tradition of oral epic poetry in Bosnia and Herzegovina, proposed that the Iliad and the Odyssey not only could have been but must have been composed and sung without benefit of writing. The meter, the formulaic redundancy, in effect the very poetry of the great works served first and foremost to aid memory. Its incantatory power made of the verse a time capsule, able to transmit a virtual encyclopedia of culture across generations. ([Location 572](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=572))
- Tags: [[pink]]
- Aristotle himself, son of the physician to the king of Macedonia and an avid, organized thinker, was attempting to systematize knowledge. The persistence of writing made it possible to impose structure on what was known about the world and, then, on what was known about knowing. As soon as one could set words down, examine them, look at them anew the next day, and consider their meaning, one became a philosopher, and the philosopher began with a clean slate and a vast project of definition to undertake. Knowledge could begin to pull itself up by the bootstraps. For Aristotle the most basic notions were worth recording and were necessary to record: ([Location 595](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=595))
- Tags: [[pink]]
- The multitude cannot accept the idea of beauty in itself rather than many beautiful things, nor anything conceived in its essence instead of the many specific things. Thus the multitude cannot be philosophic. For “the multitude” we may understand “the preliterate.” They “lose themselves and wander amid the multiplicities of multifarious things,” declared Plato, looking back on the oral culture that still surrounded him. They “have no vivid pattern in their souls.” ([Location 607](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=607))
- Tags: [[pink]]
- The written word—the persistent word—was a prerequisite for conscious thought as we understand it. It was the trigger for a wholesale, irreversible change in the human psyche—psyche being the word favored by Socrates/Plato as they struggled to understand.
Plato, as Havelock puts it, is trying for the first time in history to identify this group of general mental qualities, and seeking for a term which will label them satisfactorily under a single type.… He it was who hailed the portent and correctly identified it. In so doing, he so to speak confirmed and clinched the guesses of a previous generation which had been feeling its way towards the idea that you could “think,” and that thinking was a very special kind of psychic activity, very uncomfortable, but also very exciting, and one which required a very novel use of Greek. Taking the next step on the road of abstraction, Aristotle deployed categories and relationships in a regimented order to develop a symbolism of reasoning: logic—from λόγος, logos, the not-quite-translatable word from which so much flows, meaning “speech” or “reason” or “discourse” or, ultimately, just “word.” Logic might be imagined to exist independent of writing—syllogisms can be spoken as well as written—but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently. Logic turns the act of abstraction into a tool for determining what is true and what is false: truth can be discovered in words alone, apart from concrete experience. Logic takes its form in chains: sequences whose members connect one to another. Conclusions follow from premises. These require a degree of constancy. They have no power unless people can examine and evaluate them. In contrast, an oral narrative proceeds by accretion, the words passing by in a line of parade past the viewing stand, briefly present and then gone, interacting with one another via memory and association. There are no syllogisms in Homer. Experience is arranged in terms of events, not categories. Only with writing does narrative structure come to embody sustained rational argument. Aristotle crossed another level, by seeing the study of such argument—not just the use of argument, but its study—as a tool. His logic expresses an ongoing self-consciousness about the words in which arguments are composed. When Aristotle unfurls premises and conclusions—If it is possible for no man to be a horse, it is also admissible for no horse to be a man; and if it is admissible for no garment to be white, it is also admissible for nothing white to be a garment. For if any white thing must be a garment, then some garment will necessarily be white—he neither requires nor implies any personal experience of horses, garments, or colors. He has departed that realm. Yet he claims through the manipulation of words to create knowledge anyway, and a superior brand of knowledge at that. ([Location 617](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=617))
- Tags: [[pink]]
- The same criticism was made of other constrained channels, created by later technologies—the telegraph, the telephone, radio, and e-mail. Jonathan Miller rephrases McLuhan’s argument in quasi-technical terms of information: “The larger the number of senses involved, the better the chance of transmitting a reliable copy of the sender’s mental state.”* In the stream of words past the ear or eye, we sense not just the items one by one but their rhythms and tones, which is to say their music. We, the listener or the reader, do not hear, or read, one word at a time; we get messages in groupings small and large. Human memory being what it is, larger patterns can be grasped in writing than in sound. The eye can glance back. McLuhan considered this damaging, or at least diminishing.
“Acoustic space is organic and integral,” he said, “perceived through the simultaneous interplay of all the senses; whereas ‘rational’ or pictorial space is uniform, sequential and continuous and creates a closed world with none of the rich resonance of the tribal echoland.” ([Location 802](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=802))
- Tags: [[pink]]
- The availability—the solidity—of the printed book inspired a sense that the written word should be a certain way, that one form was right and others wrong. First this sense was unconscious; then it began to rise toward general awareness. ([Location 871](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=871))
- Tags: [[pink]]
- Gottfried Wilhelm Leibniz made this distinction explicit: Let me mention that the words or names of all things and actions can be brought into a list in two different ways, according to the alphabet and according to nature.… The former go from the word to the thing, the latter from the thing to the word. ([Location 967](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=967))
- Tags: [[pink]]
- When Galileo pointed his first telescope skyward and discovered sunspots in 1611, he immediately anticipated controversy—traditionally the sun was an epitome of purity—and he sensed that science could not proceed without first solving a problem of language: So long as men were in fact obliged to call the sun “most pure and most lucid,” no shadows or impurities whatever had been perceived in it; but now that it shows itself to us as partly impure and spotty; why should we not call it “spotted and not pure”? For names and attributes must be accommodated to the essence of things, and not the essence to the names, since things come first and names afterwards. ([Location 1026](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1026))
- Tags: [[pink]]
- The dictionary ratifies the persistence of the word. It declares that the meanings of words come from other words. It implies that all words, taken together, form an interlocking structure: interlocking, because all words are defined in terms of other words. This could never have been an issue in an oral culture, where language was barely visible. Only when printing—and the dictionary—put the language into separate relief, as an object to be scrutinized, could anyone develop a sense of word meaning as interdependent and even circular. ([Location 1095](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1095))
- Tags: [[pink]]
- What makes cyberspace different from all previous information technologies is its intermixing of scales from the largest to the smallest without prejudice, broadcasting to the millions, narrowcasting to groups, instant messaging one to one. ([Location 1275](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1275))
- Tags: [[pink]]
- He might have been described as a professional mathematician, yet here he was touring the country’s workshops and manufactories, trying to discover the state of the art in machine tools. He noted, “Those who enjoy leisure can scarcely find a more interesting and instructive pursuit than the examination of the workshops of their own country, which contain within them a rich mine of knowledge, too generally neglected by the wealthier classes.” ([Location 1305](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1305))
- Tags: [[pink]]
- Like the looms, forges, naileries, and glassworks he studied in his travels across northern England, Babbage’s machine was designed to manufacture vast quantities of a certain commodity. The commodity was numbers. The engine opened a channel from the corporeal world of matter to a world of pure abstraction. ([Location 1323](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1323))
- Tags: [[pink]]
- Why?
Besides the obsession and the ebullience, the creators of number tables had a sense of their economic worth. Consciously or not, they reckoned the price of these special data by weighing the difficulty of computing them versus looking them up in a book. Precomputation plus data storage plus data transmission usually came out cheaper than ad hoc computation. ([Location 1373](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1373))
- Tags: [[pink]]
- Logarithmes are Numbers invented for the more easie working of questions in Arithmetike and Geometrie. The name is derived of Logos, which signifies Reason, and Arithmos, signifying Numbers. By them all troublesome Multiplications and Divisions in Arithmetike are avoided, and performed onely by Addition in stead of Multiplication, and by Subtraction in stead of Division. ([Location 1386](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1386))
- Tags: [[pink]]
- To multiply two numbers, a calculator could just look up their logarithms and add those. For example: 100 × 1,000,000 = 10² × 10⁶ = 10²⁺⁶ = 10⁸. Looking up and adding are easier than multiplying. ([Location 1408](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1408))
- Tags: [[pink]]
- Napier did not express his idea this way, in terms of exponents. He grasped the thing viscerally: he was thinking in terms of a relationship between differences and ratios. ([Location 1411](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1411))
- Tags: [[pink]]
- In Napier’s mind was an analogy: differences are to ratios as addition is to multiplication. His thinking crossed over from one plane to another, from spatial relationships to pure numbers. Aligning these scales side by side, he gave a calculator a practical means of converting multiplication into addition—downshifting, in effect, from the difficult task to the easier one. In a way, the method is a kind of translation, or encoding. The natural numbers are encoded as logarithms. The calculator looks them up in a table, the code book. In this new language, calculation is easy: addition instead of multiplication, or multiplication instead of exponentiation. When the work is done, the result is translated back into the language of natural numbers. ([Location 1418](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1418))
- Tags: [[pink]]
- From that time until the arrival of electronic machines, the majority of human computation was performed by means of logarithms. ([Location 1439](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1439))
- Tags: [[pink]]
- Newton and Leibniz knew how similar their work was—enough that each accused the other of plagiarism. But they had devised incompatible systems of notation—different languages—and in practice these surface differences mattered more than the underlying sameness. Symbols and operators were what a mathematician had to work with, after all. Babbage, unlike most students, made himself fluent in both—“the dots of Newton, the d’s of Leibnitz”—and felt he had seen the light. “It is always difficult to think and reason in a new language.” ([Location 1492](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1492))
- Tags: [[pink]]
- It was odd even so that Babbage thought to exert this potent force in a weightless realm—applying steam to thought and arithmetic. Numbers were the grist for his mill. Racks would slide, pinions would turn, and the mind’s work would be done. ([Location 1542](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1542))
- Tags: [[pink]]
- Babbage, for his own purposes, devised a new formal tool, a system of “mechanical notation” (his term). This was a language of signs meant to represent not just the physical form of a machine but its more elusive properties: its timing and its logic. It was an extraordinary ambition, as Babbage himself appreciated.
In 1826 he proudly reported to the Royal Society “On a Method of Expressing by Signs the Action of Machinery.” In part it was an exercise in classification. He analyzed the different ways in which something—motion, or power—could be “communicated” through a system. There were many ways. A part could receive its influence simply by being attached to another part, “as a pin on a wheel, or a wheel and pinion on the same axis.” Or transmission could occur “by stiff friction.” A part might be driven constantly by another part “as happens when a wheel is driven by a pinion”—or not constantly, “as is the case when a stud lifts a bolt once in the course of a revolution.” Here a vision of logical branching entered the scheme: the path of communication would vary depending on the alternative states of some part of the machine. ([Location 1677](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1677))
- Tags: [[pink]]
- Inspiring him, as well, was the loom on display in the Strand, invented by Joseph-Marie Jacquard, controlled by instructions encoded and stored as holes punched in cards. What caught Babbage’s fancy was not the weaving, but rather the encoding, from one medium to another, of patterns. The patterns would appear in damask, eventually, but first were “sent to a peculiar artist.” This specialist, as he said, punches holes in a set of pasteboard cards in such a manner that when those cards are placed in a Jacquard loom, it will then weave upon its produce the exact pattern designed by the artist. The notion of abstracting information away from its physical substrate required careful emphasis. ([Location 1775](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1775))
- Tags: [[pink]]
- She was fearless about drilling down to first principles. Where she felt difficulties, real difficulties lay. ([Location 1824](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1824))
- Tags: [[pink]]
- She pondered her growing powers of mind. They were not strictly mathematical, as she saw it. She saw mathematics as merely a part of a greater imaginative world. Mathematical transformations reminded her “of certain sprites & fairies one reads of, who are at one’s elbows in one shape now, & the next minute in a form most dissimilar; and uncommonly deceptive, troublesome & tantalizing are the mathematical sprites & fairies sometimes; like the types I have found for them in the world of Fiction.” Imagination—the cherished quality. She mused on it; it was her heritage from her never-present father. ([Location 1835](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1835))
- Tags: [[pink]]
- I believe myself to possess a most singular combination of qualities exactly fitted to make me pre-eminently a discoverer of the hidden realities of nature.… The belief has been forced upon me, & most slow have I been to admit it even. She listed her qualities: Firstly: Owing to some peculiarity in my nervous system, I have perceptions of some things, which no one else has; or at least very few, if any.… Some might say an intuitive perception of hidden things;—that is of things hidden from eyes, ears & the ordinary senses.… Secondly;—my immense reasoning faculties; Thirdly;… the power not only of throwing my whole energy & existence into whatever I choose, but also bring to bear on any one subject or idea, a vast apparatus from all sorts of apparently irrelevant & extraneous sources. I can throw rays from every quarter of the universe into one vast focus. ([Location 1850](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1850))
- Tags: [[pink]]
- They offered a vision of the future more general and more prescient than any expressed by Babbage himself. How general? The engine did not just calculate; it performed operations, she said, defining an operation as “any process which alters the mutual relation of two or more things,” and declaring: “This is the most general definition, and would include all subjects in the universe.” The science of operations, as she conceived it, is a science of itself, and has its own abstract truth and value; just as logic has its own peculiar truth and value, independently of the subjects to which we may apply its reasonings and processes.… One main reason why the separate nature of the science of operations has been little felt, and in general little dwelt on, is the shifting meaning of many of the symbols used. ([Location 1895](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1895))
- Tags: [[pink]]
- Symbols and meaning: she was emphatically not speaking of mathematics alone. The engine “might act upon other things besides number.” Babbage had inscribed numerals on those thousands of dials, but their working could represent symbols more abstractly. The engine might process any meaningful relationships. It might manipulate language. It might create music. ([Location 1903](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1903))
- Tags: [[pink]]
- We may say most aptly, that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves. ([Location 1916](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1916))
- Tags: [[pink]]
- A core idea was the entity she and Babbage called the variable. Variables were, in hardware terms, the machine’s columns of number dials. But there were “Variable cards,” too. In software terms they were a sort of receptacle or envelope, capable of representing, or storing, a number of many decimal digits. (“What is there in a name?” Babbage wrote. “It is merely an empty basket until you put something in it.”) Variables were the machine’s units of information.
([Location 1935](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1935)) - Tags: [[pink]] - Babbage’s interests, straying so far from mathematics, seeming so miscellaneous, did possess a common thread that neither he nor his contemporaries could perceive. His obsessions belonged to no category—that is, no category yet existing. His true subject was information: messaging, encoding, processing. ([Location 1988](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1988)) - Tags: [[pink]] - In tree rings he saw nature encoding messages about the past. A profound lesson: that a tree records a whole complex of information in its solid substance. “Every shower that falls, every change of temperature that occurs, and every wind that blows, leaves on the vegetable world the traces of its passage; slight, indeed, and imperceptible, perhaps, to us, but not the less permanently recorded in the depths of those woody fabrics.” ([Location 1996](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1996)) - Tags: [[pink]] - Is it a fact—or have I dreamt it—that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence! Or, shall we say, it is itself a thought, nothing but thought, and no longer the substance which we deemed it! —Nathaniel Hawthorne (1851) ([Location 2065](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2065)) - Tags: [[pink]] - In their earliest days these inventions inspired exhilaration without precedent in the annals of technology. The excitement passed from place to place in daily newspapers and monthly magazines and, more to the point, along the wires themselves. 
A new sense of futurity arose: a sense that the world was in a state of change, that life for one’s children and grandchildren would be very different, all because of this force and its uses. “Electricity is the poetry of science,” an American historian declared in 1852. ([Location 2095](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2095)) - Tags: [[pink]] - Chappe took it for granted that the telegraph network of which he dreamed would be a department of the state, government owned and operated. He saw it not as an instrument of knowledge or of riches, but as an instrument of power. “The day will come,” he wrote, “when the Government will be able to achieve the grandest idea we can possibly have of power, by using the telegraph system in order to spread directly, every day, every hour, and simultaneously, its influence over the whole republic.” ([Location 2168](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2168)) - Tags: [[pink]] - “Telegraphy is an element of power and order,” Abraham Chappe had said, but the rising financial and mercantile classes were the next to grasp the value of information leaping across distance. Only two hundred miles separated the Stock Exchange on Threadneedle Street in London from the Bourse at the Palais Brongniart, but two hundred miles meant days. Fortunes could be made by bridging that gap. For speculators a private telegraph would be as useful as a time machine. ([Location 2286](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2286)) - Tags: [[pink]] - As he described his epiphany, it was an insight not about lightning but about signs: “It would not be difficult to construct a system of signs by which intelligence could be instantaneously transmitted.” ([Location 2340](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2340)) - Tags: [[pink]] - Morse had a great insight from which all the rest flowed. 
Knowing nothing about pith balls, bubbles, or litmus paper, he saw that a sign could be made from something simpler, more fundamental, and less tangible—the most minimal event, the closing and opening of a circuit. Never mind needles. The electric current flowed and was interrupted, and the interruptions could be organized to create meaning. ([Location 2345](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2345)) - Tags: [[pink]] - Information that just two years earlier had taken days to arrive at its destination could now be there—anywhere—in seconds. This was not a doubling or tripling of transmission speed; it was a leap of many orders of magnitude. It was like the bursting of a dam whose presence had not even been known. ([Location 2413](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2413)) - Tags: [[pink]] - the telegraph was “annihilating” time and space. It “enables us to send communications, by means of the mysterious fluid, with the quickness of thought, and to annihilate time as well as space,” announced an American telegraph official in 1860. ([Location 2432](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2432)) - Tags: [[pink]] - “Distance and time have been so changed in our imaginations,” said Josiah Latimer Clark, an English telegraph engineer, “that the globe has been practically reduced in magnitude, and there can be no doubt that our conception of its dimensions is entirely different to that held by our forefathers.” ([Location 2437](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2437)) - Tags: [[pink]] - A message had seemed to be a physical object. That was always an illusion; now people needed consciously to divorce their conception of the message from the paper on which it was written. Scientists, Harper’s explained, will say that the electric current “carries a message,” but one must not imagine that anything—any thing—is transported. 
There is only “the action and reaction of an imponderable force, and the making of intelligible signals by its means at a distance.” No wonder people were misled. “Such language the world must, perhaps for a long time to come, continue to employ.” ([Location 2484](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2484)) - Tags: [[pink]] - Entirely because of the telegraph, by the late nineteenth century people grew comfortable, or at least familiar, with the idea of codes: signs used for other signs, words used for other words. Movement from one symbolic level to another could be called encoding. ([Location 2507](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2507)) - Tags: [[pink]] - Those who used the telegraph codes slowly discovered an unanticipated side effect of their efficiency and brevity. They were perilously vulnerable to the smallest errors. Because they lacked the natural redundancy of English prose—even the foreshortened prose of telegraphese—these cleverly encoded messages could be disrupted by a mistake in a single character. By a single dot, for that matter. ([Location 2611](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2611)) - Tags: [[pink]] - The legal battle dragged on for six years, until finally the Supreme Court upheld the fine print on the back of the telegraph blank, which spelled out a procedure for protecting against errors: To guard against mistakes or delays, the sender of a message should order it REPEATED; that is telegraphed back to the originating office for comparison.… Said company shall not be liable for mistakes in … any UNREPEATED message … nor in any case for errors in cipher or obscure messages. ([Location 2617](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2617)) - Tags: [[pink]] - Two symbols. In groups of five. “Yield thirty two Differences.” That word, differences, must have struck Wilkins’s readers (few though they were) as an odd choice. 
But it was deliberate and pregnant with meaning. Wilkins was reaching for a conception of information in its purest, most general form. Writing was only a special case: “For in the general we must note, That whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.” A difference could be “two Bells of different Notes”; or “any Object of Sight, whether Flame, Smoak, &c.”; or trumpets, cannons, or drums. Any difference meant a binary choice. Any binary choice began the expressing of cogitations. Here, in this arcane and anonymous treatise of 1641, the essential idea of information theory poked to the surface of human thought, saw its shadow, and disappeared again for three hundred years. ([Location 2664](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2664)) - Tags: [[pink]] - Lardner had said of the mechanical notation, “The various parts of the machinery being once expressed on paper by proper symbols, the enquirer dismisses altogether from his thoughts the mechanism itself and attends only to the symbols … an almost metaphysical system of abstract signs, by which the motion of the hand performs the office of the mind.” Two younger Englishmen, Augustus De Morgan and George Boole, turned the same methodology to work on an even more abstract material: the propositions of logic. ([Location 2700](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2700)) - Tags: [[pink]] - As the century turned, Bertrand Russell paid George Boole an extraordinary compliment: “Pure mathematics was discovered by Boole, in a work which he called the Laws of Thought.” It has been quoted often. 
What makes the compliment extraordinary is the seldom quoted disparagement that follows on its heels: He was also mistaken in supposing that he was dealing with the laws of thought: the question how people actually think was quite irrelevant to him, and if his book had really contained the laws of thought, it was curious that no one should ever have thought in such a way before. One might almost think Russell enjoyed paradoxes. ([Location 2764](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2764)) - Tags: [[pink]] - Three great waves of electrical communication crested in sequence: telegraphy, telephony, and radio. People began to feel that it was natural to possess machines dedicated to the sending and receiving of messages. These devices changed the topology—ripped the social fabric and reconnected it, added gateways and junctions where there had only been blank distance. ([Location 2803](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2803)) - Tags: [[pink]] - Bush, like Babbage, hated the numbing, wasteful labor of mere calculation. “A mathematician is not a man who can readily manipulate figures; often he cannot,” Bush wrote. “He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment.” ([Location 2839](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2839)) - Tags: [[pink]] - Unlike Babbage’s machine, it did not manipulate numbers. It worked on quantities—generating curves, as Bush liked to say, to represent the future of a dynamical system. We would say now that it was analog rather than digital. Its wheels and disks were arranged to produce a physical analog of the differential equations. ([Location 2849](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2849)) - Tags: [[pink]] - For the telegraph, the point was to reach across long distances by making a chain. 
For Shannon, the point was not distance but control. A hundred relays, intricately interconnected, switching on and off in particular sequence, coordinated the Differential Analyzer. The best experts on complex relay circuits were telephone engineers; relays controlled the routing of calls through telephone exchanges, as well as machinery on factory assembly lines. Relay circuitry was designed for each particular case. No one had thought to study the idea systematically, but Shannon was looking for a topic for his master’s thesis, and he saw a possibility. In his last year of college he had taken a course in symbolic logic, and, when he tried to make an orderly list of the possible arrangements of switching circuits, he had a sudden feeling of déjà vu. In a deeply abstract way, these problems lined up. The peculiar artificial notation of symbolic logic, Boole’s “algebra,” could be used to describe circuits. ([Location 2858](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2858)) - Tags: [[pink]] - This was an odd connection to make. The worlds of electricity and logic seemed incongruous. Yet, as Shannon realized, what a relay passes onward from one circuit to the next is not really electricity but rather a fact: the fact of whether the circuit is open or closed. If a circuit is open, then a relay may cause the next circuit to open. But the reverse arrangement is also possible, the negative arrangement: when a circuit is open, a relay may cause the next circuit to close. It was clumsy to describe the possibilities with words; simpler to reduce them to symbols, and natural, for a mathematician, to manipulate the symbols in equations. 
([Location 2865](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2865)) - Tags: [[pink]] - The invention of writing had catalyzed logic, by making it possible to reason about reasoning—to hold a train of thought up before the eyes for examination—and now, all these centuries later, logic was reanimated with the invention of machinery that could work upon symbols. In logic and mathematics, the highest forms of reasoning, everything seemed to be coming together. ([Location 2926](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2926)) - Tags: [[pink]] - “It was in the air,” Douglas Hofstadter has written, “that truly peculiar things could happen when modern cousins of various ancient paradoxes cropped up inside the rigorously logical world of numbers,… a pristine paradise in which no one had dreamt paradox might arise.” ([Location 2959](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2959)) - Tags: [[pink]] - “This statement is false” is meta-language: language about language. Russell’s paradoxical set relies on a meta-set: a set of sets. So the problem was a crossing of levels, or, as Russell termed it, a mixing of types. His solution: declare it illegal, taboo, out of bounds. No mixing different levels of abstraction. No self-reference; no self-containment. The rules of symbolism in Principia Mathematica would not allow the reaching-back-around, snake-eating-its-tail feedback loop that seemed to turn on the possibility of self-contradiction. This was his firewall. Enter Kurt Gödel. 
([Location 2982](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2982)) - Tags: [[pink]] - Gödel praised the Russell and Whitehead project before he buried it: mathematical logic was, he wrote, “a science prior to all others, which contains the ideas and principles underlying all sciences.” Principia Mathematica, the great opus, embodied a formal system that had become, in its brief lifetime, so comprehensive and so dominant that Gödel referred to it in shorthand: PM. ([Location 2994](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2994)) - Tags: [[pink]] - This slight young man turned his doubt into a great and horrifying discovery. He found that lurking within PM—and within any consistent system of logic—there must be monsters of a kind hitherto unconceived: statements that can never be proved, and yet can never be disproved. There must be truths, that is, that cannot be proved—and Gödel could prove it. ([Location 3006](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3006)) - Tags: [[pink]] - He accomplished this with iron rigor disguised as sleight of hand. He employed the formal rules of PM and, as he employed them, also approached them metamathematically—viewed them, that is, from the outside. As he explained, all the symbols of PM—numbers, operations of arithmetic, logical connectors, and punctuation—constituted a limited alphabet. Every statement or formula of PM was written in this alphabet. Likewise every proof comprised a finite sequence of formulas—just a longer passage written in the same alphabet. This is where metamathematics came in. Metamathematically, Gödel pointed out, one sign is as good as another; the choice of a particular alphabet is arbitrary. One could use the traditional assortment of numerals and glyphs (from arithmetic: +, −, =, ×; from logic: ¬, ∨, ⊃, ∃), or one could use letters, or one could use dots and dashes. 
It was a matter of encoding, slipping from one symbol set to another. Gödel proposed to use numbers for all his signs. Numbers were his alphabet. And because numbers can be combined using arithmetic, any sequence of numbers amounts to one (possibly very large) number. So every statement, every formula of PM can be expressed as a single number, and so can every proof. Gödel outlined a rigorous scheme for doing the encoding—an algorithm, mechanical, just rules to follow, no intelligence necessary. It works forward and backward: given any formula, following the rules generates one number, and given any number, following the rules produces the corresponding formula. Not every number translates into a correct formula, however. Some numbers decode back into gibberish, or formulas that are false within the rules of the system. The string of symbols “0 0 0 = = =” does not make a formula at all, though it translates to some number. The statement “0 = 1” is a recognizable formula, but it is false. The formula “0 + x = x + 0” is true, and it is provable. This last quality—the property of being provable according to PM—was not meant to be expressible in the language of PM. It seems to be a statement from outside the system, a metamathematical statement. But Gödel’s encoding reeled it in. In the framework he constructed, the natural numbers led a double life, as numbers and also as statements. A statement could assert that a given number is even, or prime, or a perfect square, and a statement could also assert that a given number is a provable formula. Given the number 1,044,045,317,700, for example, one could make various statements and test their truth or falsity: this number is even, it is not a prime, it is not a perfect square, it is greater than 5, it is divisible by 121, and (when decoded according to the official rules) it is a provable formula. Gödel laid all this out in a little paper in 1931. 
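The encoding described here can be played with directly. A toy sketch: the four-symbol alphabet and its numeric codes below are illustrative assumptions, but the mechanism (raise successive primes to each symbol's code and multiply, then decode by unique factorization) is close in spirit to Gödel's own scheme:

```python
# Toy Goedel numbering. A formula with symbol codes c1, c2, c3, ...
# becomes the single number 2^c1 * 3^c2 * 5^c3 * ...; the uniqueness
# of prime factorization guarantees the formula can be recovered.

SYMBOLS = {"0": 1, "=": 2, "+": 3, "x": 4}   # assumed symbol codes
DECODE = {code: sym for sym, code in SYMBOLS.items()}

def primes():
    """Yield 2, 3, 5, 7, ... by trial division (fine at toy scale)."""
    n = 2
    while True:
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            yield n
        n += 1

def encode(formula):
    """Map a string of symbols to one (possibly very large) number."""
    number = 1
    for p, sym in zip(primes(), formula):
        number *= p ** SYMBOLS[sym]
    return number

def decode(number):
    """Recover the formula by stripping out prime factors in order."""
    symbols = []
    for p in primes():
        if number == 1:
            break
        exponent = 0
        while number % p == 0:
            number //= p
            exponent += 1
        symbols.append(DECODE[exponent])
    return "".join(symbols)
```

Under these assumed codes, `encode("0=0")` is 2^1 * 3^2 * 5^1 = 90, and `decode(90)` returns the formula. The algorithm is exactly as described: mechanical, just rules to follow, working forward and backward.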
Making his proof watertight required complex logic, but the basic argument was simple and elegant. Gödel showed how to construct a formula that said A certain number, x, is not provable. That was easy: there… ([Location 3009](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3009)) - Tags: [[pink]] - Gödel’s conclusion sprang not from a weakness in PM but from a strength. That strength is the fact that numbers are so flexible or “chameleonic” that their patterns can mimic patterns of reasoning.… PM’s expressive power is what gives rise to its incompleteness. The long-sought universal language, the characteristica universalis Leibniz had pretended to invent, had been there all along, in the numbers. Numbers could encode all of reasoning. They could represent any form of knowledge. ([Location 3046](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3046)) - Tags: [[pink]] - At Bell Labs, the last part of this problem looked familiar. It resembled an issue that plagued communication by telephone. The noisy data looked like static on the line. “There is an obvious analogy,” Shannon and his colleagues reported, “between the problem of smoothing the data to eliminate or reduce the effect of tracking errors and the problem of separating a signal from interfering noise in communications systems.” The data constituted a signal; the whole problem was “a special case of the transmission, manipulation, and utilization of intelligence.” Their specialty, at Bell Labs. ([Location 3100](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3100)) - Tags: [[pink]] - The advantages were obvious—but not to everyone. Elisha Gray, a telegraph man who came close to trumping Alexander Graham Bell as inventor of the telephone, told his own patent lawyer in 1875 that the work was hardly worthwhile: “Bell seems to be spending all his energies in [the] talking telegraph. 
While this is very interesting scientifically it has no commercial value at present, for they can do much more business over a line by methods already in use.” ([Location 3110](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3110)) - Tags: [[pink]] - Note: Very common oversight in business. “This other business that services the same customers is entrenched, so you’ll never catch up.” But the telephone represented a complete change in KIND for communication because users got to use voice (which transfers more information than text) and made it so non-technical people could use it - One reason for these misguesses was just the usual failure of imagination in the face of a radically new technology. The telegraph lay in plain view, but its lessons did not extrapolate well to this new device. The telegraph demanded literacy; the telephone embraced orality. A message sent by telegraph had first to be written, encoded, and tapped out by a trained intermediary. To employ the telephone, one just talked. A child could use it. For that very reason it seemed like a toy. In fact, it seemed like a familiar toy, made from tin cylinders and string. The telephone left no permanent record. The Telephone had no future as a newspaper name. Business people thought it unserious. Where the telegraph dealt in facts and numbers, the telephone appealed to emotions. ([Location 3121](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3121)) - Tags: [[pink]] - Note: All these features are exactly what made the telephone successful over the telegraph - something was carried along the wire in the form of electricity. In the case of the telephone, that thing was sound, simply converted from waves of pressure in the air to waves of electric current. 
([Location 3133](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3133)) - Tags: [[pink]] - In a lecture at Cambridge, the physicist James Clerk Maxwell offered a scientific description of the telephone conversation: “The speaker talks to the transmitter at one end of the line, and at the other end of the line the listener puts his ear to the receiver, and hears what the speaker said. The process in its two extreme states is so exactly similar to the old-fashioned method of speaking and hearing that no preparatory practice is required on the part of either operator.” He, too, had noticed its ease of use. ([Location 3139](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3139)) - Tags: [[pink]] - the most prescient comments came from those who focused on the exponential power of interconnection. Scientific American assessed “The Future of the Telephone” as early as 1880 and emphasized the forming of “little clusters of telephonic communicants.” The larger the network and the more diverse its interests, the greater its potential would be. What the telegraph accomplished in years the telephone has done in months. One year it was a scientific toy, with infinite possibilities of practical use; the next it was the basis of a system of communication the most rapidly expanding, intricate, and convenient that the world has known.… Soon it will be the rule and not the exception for business houses, indeed for the dwellings of well-to-do people as well, to be interlocked by means of telephone exchange, not merely in our cities, but in all outlying regions. The result can be nothing less than a new organization of society—a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications, of needless goings to and fro, of disappointments, delays, and a countless host of those great and little evils and annoyances. 
The time is close at hand when the scattered members of civilized communities will be as closely united, so far as instant telephonic communication is concerned, as the various members of the body now are by the nervous system. ([Location 3151](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3151)) - Tags: [[pink]] - The telephone was already thought, correctly, to be responsible for rapid industrial progress. The case could hardly be overstated. The areas depending on “instantaneous communication across space” were listed by the United States Commerce Department in 1907: “agriculture, mining, commerce, manufacturing, transportation, and, in fact, all the various branches of production and distribution of natural and artificial resources.” Not to mention “cobblers, cleaners of clothing, and even laundresses.” In other words, every cog in the engine of the economy. ([Location 3164](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3164)) - Tags: [[pink]] - To enable the fast expansion of this extraordinary network, the telephone demanded new technologies and new science. They were broadly of two kinds. One had to do with electricity itself: measuring electrical quantities; controlling the electromagnetic wave, as it was now understood—its modulation in amplitude and in frequency. Maxwell had established in the 1860s that electrical pulses and magnetism and light itself were all manifestations of a single force: “affections of the same substance,” light being one more case of “an electromagnetic disturbance propagated through the field according to electromagnetic laws.” These were the laws that electrical engineers now had to apply, unifying telephone and radio among other technologies. Even the telegraph employed a simple kind of amplitude modulation, in which only two values mattered, a maximum for “on” and a minimum for “off.” To convey sound required far stronger current, far more delicately controlled. 
([Location 3179](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3179)) - Tags: [[pink]] - The engineers also discovered how to modulate individual currents so as to combine them in a single channel—multiplexing—without losing their identity. By 1918 they could get four conversations into a single pair of wires. But it was not currents that preserved identity. Before the engineers quite realized it, they were thinking in terms of the transmission of a signal, an abstract entity, quite distinct from the electrical waves in which it was embodied. ([Location 3189](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3189)) - Tags: [[pink]] - A second, less well defined sort of science concerned the organizing of connections—switching, numbering, and logic. This branch descended from Bell’s original realization, dating from 1877, that telephones need not be sold in pairs; that each individual telephone could be connected to many other telephones, not by direct wires but through a central “exchange.” ([Location 3192](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3192)) - Tags: [[pink]] - That innovation came the next year in Lowell, Massachusetts, where by the end of 1879 four operators managed the connections among two hundred subscribers by shouting to one another across the switchboard room. An epidemic of measles broke out, and Dr. Moses Greeley Parker worried that if the operators succumbed, they would be hard to replace. He suggested identifying each telephone by number. He also suggested listing the numbers in an alphabetical directory of subscribers. These ideas could not be patented and arose again in telephone exchanges across the country, where the burgeoning networks were creating clusters of data in need of organization. ([Location 3202](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3202)) - Tags: [[pink]] - human operators could not sustain a network on the scale now arising. 
Switching would have to be performed automatically. This meant a mechanical linkage to take from callers not just the sound of their voice but also a number—identifying a person, or at least another telephone. The challenge of converting a number into electrical form still required ingenuity: first push buttons were tried, then an awkward-seeming rotary dial, with ten finger positions for the decimal digits, sending pulses down the line. Then the coded pulses served as an agent of control at the central exchange, where another mechanism selected from an array of circuits and set up a connection. Altogether this made for an unprecedented degree of complexity in the translations between human and machine, number and circuitry. ([Location 3224](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3224)) - Tags: [[pink]] - Its first director, Thornton C. Fry, enjoyed the tension between theory and practice—the clashing cultures. “For the mathematician, an argument is either perfect in every detail or else it is wrong,” he wrote in 1941. “He calls this ‘rigorous thinking.’ The typical engineer calls it ‘hair-splitting.’ ” The mathematician also tends to idealize any situation with which he is confronted. His gases are “ideal,” his conductors “perfect,” his surfaces “smooth.” He calls this “getting down to essentials.” The engineer is likely to dub it “ignoring the facts.” ([Location 3244](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3244)) - Tags: [[pink]] - mathematicians and engineers could not do without each other. Every electrical engineer could now handle the basic analysis of waves treated as sinusoidal signals. But new difficulties arose in understanding the action of networks; network theorems were devised to handle these mathematically. 
Mathematicians applied queuing theory to usage conflicts; developed graphs and trees to manage issues of intercity trunks and lines; and used combinatorial analysis to break down telephone probability problems. ([Location 3250](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3250)) - Tags: [[pink]] - The theorem was an extension of Nyquist’s formula, and it could be expressed in words: the most information that can be transmitted in any given time is proportional to the available frequency range (he did not yet use the term bandwidth). Hartley was bringing into the open a set of ideas and assumptions that were becoming part of the unconscious culture of electrical engineering, and the culture of Bell Labs especially. ([Location 3303](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3303)) - Tags: [[pink]] - It seemed intuitively clear that the amount of information should be proportional to the number of symbols: twice as many symbols, twice as much information. But a dot or dash—a symbol in a set with just two members—carries less information than a letter of the alphabet and much less information than a word chosen from a thousand-word dictionary. The more possible symbols, the more information each selection carries. How much more? The equation, as Hartley wrote it, was this: H = n log s where H is the amount of information, n is the number of symbols transmitted, and s is the size of the alphabet. In a dot-dash system, s is just 2. A single Chinese character carries so much more weight than a Morse dot or dash; it is so much more valuable. In a system with a symbol for every word in a thousand-word dictionary, s would be 1,000. The amount of information is not proportional to the alphabet size, however. That relationship is logarithmic: to double the amount of information, it is necessary to square the alphabet size. 
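Hartley's relation H = n log s can be checked numerically. A minimal sketch in Python, using base-2 logarithms so the unit comes out in bits (the choice of base and the sample alphabet sizes are mine, for illustration — Hartley himself left the base unspecified):

```python
import math

def hartley(n, s):
    """Hartley's measure: n symbols drawn from an alphabet of size s."""
    return n * math.log2(s)

# A dot-dash system: an alphabet of just 2 symbols.
print(hartley(1, 2))       # 1.0 bit per symbol
# A symbol per word from a 1,000-word dictionary carries far more.
print(hartley(1, 1000))    # ≈ 9.97 bits per symbol
# The relationship is logarithmic: to double the information
# per symbol, the alphabet size must be squared.
assert hartley(1, 32 * 32) == 2 * hartley(1, 32)
```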
([Location 3323](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3323)) - Tags: [[pink]] - AT THE HEIGHT OF THE WAR, in early 1943, two like-minded thinkers, Claude Shannon and Alan Turing, met daily at teatime in the Bell Labs cafeteria and said nothing to each other about their work, because it was secret. Both men had become cryptanalysts. Even Turing’s presence at Bell Labs was a sort of secret. ([Location 3363](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3363)) - Tags: [[pink]] - Shannon was working on the X System, used for encrypting voice conversations between Franklin D. Roosevelt at the Pentagon and Winston Churchill in his War Rooms. It operated by sampling the analog voice signal fifty times a second—“quantizing” or “digitizing” it—and masking it by applying a random key, which happened to bear a strong resemblance to the circuit noise with which the engineers were so familiar. ([Location 3367](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3367)) - Tags: [[pink]] - It bordered on impudence to talk about thinking machines in 1943, when both the transistor and the electronic computer had yet to be born. The vision Shannon and Turing shared had nothing to do with electronics; it was about logic. ([Location 3378](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3378)) - Tags: [[pink]] - Note: Extrapolate beyond current limitations - what is the limit imposed on this process by logic and/or physics? - The Entscheidungsproblem was to find a rigorous step-by-step procedure by which, given a formal language of deductive reasoning, one could perform a proof automatically. This was Leibniz’s dream revived once again: the expression of all valid reasoning in mechanical rules. ([Location 3398](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3398)) - Tags: [[pink]] - Then came the final flourish. 
Turing’s machines, stripped down to a finite table of states and a finite set of input, could themselves be represented as numbers. Every possible state table, combined with its initial tape, represents a different machine. Each machine itself, then, can be described by a particular number—a certain state table combined with its initial tape. Turing was encoding his machines just as Gödel had encoded the language of symbolic logic. This obliterated the distinction between data and instructions: in the end they were all numbers. For every computable number, there must be a corresponding machine number. ([Location 3461](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3461)) - Tags: [[pink]] - Few could follow it. It seems paradoxical—it is paradoxical—but Turing proved that some numbers are uncomputable. (In fact, most are.) Also, because every number corresponds to an encoded proposition of mathematics and logic, Turing had resolved Hilbert’s question about whether every proposition is decidable. He had proved that the Entscheidungsproblem has an answer, and the answer is no. An uncomputable number is, in effect, an undecidable proposition. ([Location 3482](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3482)) - Tags: [[pink]] - Once again, the paradoxes come to life when numbers gain the power to encode the machine’s own behavior. That is the necessary recursive twist. The entity being reckoned is fatally entwined with the entity doing the reckoning. As Douglas Hofstadter put it much later, “The thing hinges on getting this halting inspector to try to predict its own behavior when looking at itself trying to predict its own behavior when looking at itself trying to predict its own behavior when …” A conundrum that at least smelled similar had lately appeared in physics, too: Werner Heisenberg’s new uncertainty principle. 
When Turing learned about that, he expressed it in terms of self-reference: “It used to be supposed in Science that if everything was known about the Universe at any particular moment then we can predict what it will be through all the future.… More modern science however has come to the conclusion that when we are dealing with atoms and electrons we are quite unable to know the exact state of them; our instruments being made of atoms and electrons themselves.” ([Location 3489](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3489)) - Tags: [[pink]] - This meant building a machine to invert the enciphering of any number of Enigmas. Where his first machine was a phantasm of hypothetical tape, this one, dubbed the Bombe, filled ninety cubic feet with a ton of wire and metal leaking oil and effectively mapping the rotors of the German device onto electric circuitry. The scientific triumph at Bletchley—secret for the duration of the war and for thirty years after—had a greater effect on the outcome than even the Manhattan Project, the real bomb. By the war’s end, the Turing Bombes were deciphering thousands of military intercepts every day: processing information, that is, on a scale never before seen. ([Location 3521](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3521)) - Tags: [[pink]] - The code breakers see a stream of data that looks like junk. They want to find the real signal. “From the point of view of the cryptanalyst,” Shannon noted, “a secrecy system is almost identical with a noisy communication system.” ([Location 3546](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3546)) - Tags: [[pink]] - Every language has a certain statistical structure, Shannon argued, and with it a certain redundancy. Let us call this (he suggested) D. 
“D measures, in a sense, how much a text in the language can be reduced in length without losing any information.” ([Location 3561](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3561)) - Tags: [[pink]] - With the simplest early substitution ciphers, this redundancy provided the point of first weakness. Edgar Allan Poe knew that when a cryptogram contained more z’s than any other letter, then z was probably the substitute for e, since e is the most frequent letter in English. As soon as q was solved, so was u. A code breaker looked for recurring patterns that might match common words or letter combinations: the, and, -tion. ([Location 3565](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3565)) - Tags: [[pink]] - Shannon, meanwhile, removed himself to the most distant, most general, most theoretical vantage point. A secrecy system comprised a finite (though possibly very large) number of possible messages, a finite number of possible cryptograms, and in between, transforming one to the other, a finite number of keys, each with an associated probability. ([Location 3576](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3576)) - Tags: [[pink]] - Information is uncertainty, surprise, difficulty, and entropy: “Information is closely associated with uncertainty.” Uncertainty, in turn, can be measured by counting the number of possible messages. If only one message is possible, there is no uncertainty and thus no information. Some messages may be likelier than others, and information implies surprise. Surprise is a way of talking about probabilities. If the letter following t (in English) is h, not so much information is conveyed, because the probability of h was relatively high. “What is significant is the difficulty in transmitting the message from one point to another.” Perhaps this seemed backward, or tautological, like defining mass in terms of the force needed to move an object. 
But then, mass can be defined that way. Information is entropy. This was the strangest and most powerful notion of all. Entropy—already a difficult and poorly understood concept—is a measure of disorder in thermodynamics, the science of heat and energy. ([Location 3598](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3598)) - Tags: [[pink]] - Where a layman might have said that the fundamental problem of communication is to make oneself understood—to convey meaning—Shannon set the stage differently: The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. “Point” was a carefully chosen word: the origin and destination of a message could be separated in space or in time; information storage, as in a phonograph record, counts as a communication. Meanwhile, the message is not created; it is selected. It is a choice. It might be a card dealt from a deck, or three decimal digits chosen from the thousand possibilities, or a combination of words from a fixed code book. ([Location 3626](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3626)) - Tags: [[pink]] - He could hardly overlook meaning altogether, so he dressed it with a scientist’s definition and then showed it the door: Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. Nonetheless, as Weaver took pains to explain, this was not a narrow view of communication. On the contrary, it was all-encompassing: “not only written and oral speech, but also music, the pictorial arts, the theatre, the ballet, and in fact all human behavior.” Nonhuman as well: why should machines not have messages to send? 
([Location 3633](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3633)) - Tags: [[pink]] - A communication system must contain the following elements: The information source is the person or machine generating the message, which may be simply a sequence of characters, as in a telegraph or teletype, or may be expressed mathematically as functions—f(x, y, t)—of time and other variables. In a complex example like color television, the components are three functions in a three-dimensional continuum, Shannon noted. The transmitter “operates on the message in some way”—that is, encodes the message—to produce a suitable signal. A telephone converts sound pressure into analog electric current. A telegraph encodes characters in dots, dashes, and spaces. More complex messages may be sampled, compressed, quantized, and interleaved. The channel: “merely the medium used to transmit the signal.” The receiver inverts the operation of the transmitter. It decodes the message, or reconstructs it from the signal. The destination “is the person (or thing)” at the other end. ([Location 3642](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3642)) - Tags: [[pink]] - Every engineer, when asked to push more information through a channel, knew what to do: boost the power. Over long distances, however, this approach was failing, because amplifying a signal again and again leads to a crippling buildup of noise. Shannon sidestepped this problem by treating the signal as a string of discrete symbols. Now, instead of boosting the power, a sender can overcome noise by using extra symbols for error correction—just as an African drummer makes himself understood across long distances, not by banging the drums harder, but by expanding the verbosity of his discourse. Shannon considered the discrete case to be more fundamental in a mathematical sense as well. 
And he was considering another point: that treating messages as discrete had application not just for traditional communication but for a new and rather esoteric subfield, the theory of computing machines. ([Location 3656](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3656)) - Tags: [[pink]] - A new unit of measure was needed. Shannon said: “The resulting units may be called binary digits, or more briefly, bits.” As the smallest possible quantity of information, a bit represents the amount of uncertainty that exists in the flipping of a coin. The coin toss makes a choice between two possibilities of equal likelihood: in this case p₁ and p₂ each equal ½; the base 2 logarithm of ½ is −1; so H = 1 bit. A single character chosen randomly from an alphabet of 32 conveys more information: 5 bits, to be exact, because there are 32 possible messages and the logarithm of 32 is 5. A string of 1,000 such characters carries 5,000 bits—not just by simple multiplication, but because the amount of information represents the amount of uncertainty: the number of possible choices. With 1,000 characters in a 32-character alphabet, there are 32¹⁰⁰⁰ possible messages, and the logarithm of that number is 5,000. ([Location 3749](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3749)) - Tags: [[pink]] - Quantifying predictability and redundancy in this way is a backward way of measuring information content. If a letter can be guessed from what comes before, it is redundant; to the extent that it is redundant, it provides no new information. If English is 75 percent redundant, then a thousand-letter message in English carries only 25 percent as much information as one thousand letters chosen at random. Paradoxical though it sounded, random messages carry more information. The implication was that natural-language text could be encoded more efficiently for transmission or storage.
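The numbers in the highlight above follow directly from base-2 logarithms; a quick Python check (the 26-letter alphabet at the end is my own extra illustration of the 75 percent redundancy claim, not a figure from the book):

```python
import math

coin_bits = math.log2(2)          # a fair coin toss: 1 bit
char_bits = math.log2(32)         # one of 32 equally likely characters: 5 bits
message_bits = 1000 * char_bits   # log2(32**1000) = 1000 * 5 = 5000 bits

# If English is 75 percent redundant, 1,000 English letters carry only
# 25 percent of the information of 1,000 letters chosen at random.
english_bits = 0.25 * 1000 * math.log2(26)

print(coin_bits, char_bits, message_bits, round(english_bits))
```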
([Location 3771](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3771)) - Tags: [[pink]] - MOST MATHEMATICAL THEORIES take shape slowly; Shannon’s information theory sprang forth like Athena, fully formed. ([Location 3816](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3816)) - Tags: [[pink]] - Wiener was as worldly as Shannon was reticent. He was well traveled and polyglot, ambitious and socially aware; he took science personally and passionately. His expression of the second law of thermodynamics, for example, was a cry of the heart: We are swimming upstream against a great torrent of disorganization, which tends to reduce everything to the heat death of equilibrium and sameness.… This heat death in physics has a counterpart in the ethics of Kierkegaard, who pointed out that we live in a chaotic moral universe. In this, our main obligation is to establish arbitrary enclaves of order and system.… Like the Red Queen, we cannot stay where we are without running as fast as we can. ([Location 3881](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3881)) - Tags: [[pink]] - Either way, the key to the process is information. What governs the antiaircraft gun, for example, is information about the plane’s coordinates and about the previous position of the gun itself. Wiener’s friend Bigelow emphasized this: “that it was not some particular physical thing such as energy or length or voltage, but only information (conveyed by any means).” ([Location 3900](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3900)) - Tags: [[pink]] - Negative feedback must be ubiquitous, Wiener felt. He could see it at work in the coordination of eye and hand, guiding the nervous system of a person performing an action as ordinary as picking up a pencil. He focused especially on neurological disorders, maladies that disrupted physical coordination or language. 
He saw them quite specifically as cases of information feedback gone awry: varieties of ataxia, for example, where sense messages are either interrupted in the spinal cord or misinterpreted in the cerebellum. His analysis was detailed and mathematical, with equations—almost unheard of in neurology. Meanwhile, feedback control systems were creeping into factory assembly lines, because a mechanical system, too, can modify its own behavior. Feedback is the governor, the steersman. ([Location 3903](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3903)) - Tags: [[pink]] - these devices would use the binary number system for simplicity. For advanced calculations they would need to employ a form of logic. What form? Shannon had answered that question in his master’s thesis of 1937, and Wiener offered the same answer: the algebra of logic par excellence, or the Boolean algebra. This algorithm, like the binary arithmetic, is based on the dichotomy, the choice between yes and no, the choice between being in a class and outside. The brain, too, he argued, is at least partly a logical machine. Where computers employ relays—mechanical, or electromechanical, or purely electrical—the brain has neurons. These cells tend to be in one of two states at any given moment: active (firing) or at rest (in repose). So they may be considered relays with two states. They are connected to one another in vast arrays, at points of contact known as synapses. They transmit messages. To store the messages, brains have memory; computing machines, too, need physical storage that can be called memory. (He knew well that this was a simplified picture of a complex system, that other sorts of messages, more analog than digital, seemed to be carried chemically by hormones.) Wiener suggested, too, that functional disorders such as “nervous breakdowns” might have cousins in electronics. 
Designers of computing machines might need to plan for untimely floods of data—perhaps the equivalent of “traffic problems and overloading in the nervous system.” Brains and electronic computers both use quantities of energy in performing their work of logic—“all of which is wasted and dissipated in heat,” to be carried away by the blood or by ventilating and cooling apparatus. But this is really beside the point, Wiener said. “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.” ([Location 3935](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3935)) - Tags: [[pink]] - “We are again in one of those prodigious periods of scientific progress—in its own way like the pre-Socratic period,” declared the gnomic, white-bearded neurophysiologist Warren McCulloch to a meeting of British philosophers. He told them that listening to Wiener and von Neumann put him in mind of the debates of the ancients. A new physics of communication had been born, he said, and metaphysics would never be the same: “For the first time in the history of science we know how we know and hence are able to state it clearly.” He offered them heresy: that the knower was a computing machine, the brain composed of relays, perhaps ten billion of them, each receiving signals from other relays and sending them onward. The signals are quantized: they either happen or do not happen. So once again the stuff of the world, he said, turns out to be the atoms of Democritus—“indivisibles—leasts—which go batting about in the void.” It is a world for Heraclitus, always “on the move.” I do not mean merely that every relay is itself being momentarily destroyed and re-created like a flame, but I mean that its business is with information which pours into it over many channels, passes through it, eddies within it and emerges again to the world. 
([Location 3951](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3951)) - Tags: [[pink]] - Wiener told them that all these sciences, the social sciences especially, were fundamentally the study of communication, and that their unifying idea was the message. ([Location 3971](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3971)) - Tags: [[pink]] - There was a difference in emphasis between Shannon and Wiener. For Wiener, entropy was a measure of disorder; for Shannon, of uncertainty. Fundamentally, as they were realizing, these were the same. The more inherent order exists in a sample of English text—order in the form of statistical patterns, known consciously or unconsciously to speakers of the language—the more predictability there is, and in Shannon’s terms, the less information is conveyed by each subsequent letter. When the subject guesses the next letter with confidence, it is redundant, and the arrival of the letter contributes no new information. Information is surprise. ([Location 4043](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4043)) - Tags: [[pink]] - The existence (or nonexistence, but at least near existence) of Babbage’s engine allows Turing to rebut a superstition he senses forming in the zeitgeist of 1950. People seem to feel that the magic of digital computers is essentially electrical; meanwhile, the nervous system is also electrical. But Turing is at pains to think of computation in a universal way, which means in an abstract way. He knows it is not about electricity at all: Since Babbage’s machine was not electrical, and since all digital computers are in a sense equivalent, we see that this use of electricity cannot be of theoretical importance.… The feature of using electricity is thus seen to be only a very superficial similarity. ([Location 4170](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4170)) - Tags: [[pink]] - The behaviorists, particularly John B. 
Watson in the United States and then, most famously, B. F. Skinner, made a science based on stimulus and response: food pellets, bells, electric shocks; salivation, lever pressing, maze running. Watson said that the whole purpose of psychology was to predict what responses would follow a given stimulus and what stimuli could produce a given behavior. Between stimulus and response lay a black box, known to be composed of sense organs, neural pathways, and motor functions, but fundamentally off limits. In effect, the behaviorists were saying yet again that the soul is ineffable. For a half century, their research program thrived because it produced results about conditioning reflexes and controlling behavior. ([Location 4222](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4222)) - Tags: [[pink]] - by midcentury, frustration had set in. The behaviorists’ purity had become a dogma; their refusal to consider mental states became a cage, and psychologists still wanted to understand what the mind was. ([Location 4230](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4230)) - Tags: [[pink]] - A behaviorist running a rat through a maze would discuss the association between stimulus and response but would refuse to speculate in any way about the mind of the rat; now engineers were building mental models of rats out of a few electrical relays. They were not just prying open the black box; they were making their own. Signals were being transmitted, encoded, stored, and retrieved. Internal models of the external world were created and updated. Psychologists took note. From information theory and cybernetics, they received a set of useful metaphors and even a productive conceptual framework. ([Location 4232](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4232)) - Tags: [[pink]] - Ears and eyes were to be understood as message channels, so why not test and measure them like microphones and cameras? 
“New concepts of the nature and measure of information,” wrote Homer Jacobson, a chemist at Hunter College in New York, “have made it possible to specify quantitatively the informational capacity of the human ear,” and he proceeded to do so. Then he did the same for the eye, arriving at an estimate four hundred times greater, in bits per second. ([Location 4242](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4242)) - Tags: [[pink]] - Recoding, he declared, “seems to me to be the very lifeblood of the thought processes.” The concepts and measures provided by the theory of information provide a quantitative way of getting at some of these questions. The theory provides us with a yardstick for calibrating our stimulus materials and for measuring the performance of our subjects.… Informational concepts have already proved valuable in the study of discrimination and of language; they promise a great deal in the study of learning and memory; and it has even been proposed that they can be useful in the study of concept formation. A lot of questions that seemed fruitless twenty or thirty years ago may now be worth another look. ([Location 4284](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4284)) - Tags: [[pink]] - “Those who take the informational turn see information as the basic ingredient in building a mind,” writes Frederick Adams. “Information has to contribute to the origin of the mental.” As Miller himself liked to say, the mind came in on the back of the machine. ([Location 4292](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4292)) - Tags: [[pink]] - One of Shannon’s essential results, the noisy coding theorem, grew in importance, showing that error correction can effectively counter noise and corruption. At first this was just a tantalizing theoretical nicety; error correction requires computation, which was not yet cheap. 
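Shannon's point — spend extra symbols rather than extra power — can be illustrated with the simplest error-correcting scheme, a three-fold repetition code (a toy example of mine, far weaker than the codes later work produced):

```python
def encode(bits, r=3):
    """Repeat each bit r times -- extra symbols spent to fight noise."""
    return [b for b in bits for _ in range(r)]

def decode(coded, r=3):
    """Majority vote over each group of r repeats."""
    return [1 if sum(coded[i:i + r]) > r // 2 else 0
            for i in range(0, len(coded), r)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                 # the channel flips one symbol (noise)
assert decode(sent) == msg   # a single flip per group is corrected
```

The verbosity costs a factor of three in channel use, but the receiver recovers the message without any increase in signal power — the tradeoff at the heart of the noisy coding theorem.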
But during the 1950s, work on error-correcting methods began to fulfill Shannon’s promise, and the need for them became apparent. One application was exploration of space with rockets and satellites; they needed to send messages very long distances with limited power. Coding theory became a crucial part of computer science, with error correction and data compression advancing side by side. Without it, modems, CDs, and digital television would not exist. For mathematicians interested in random processes, coding theorems are also measures of entropy. ([Location 4321](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4321)) - Tags: [[pink]] - What all these flights of fancy had in common was an extension of algorithmic processes into new realms—the abstract mapping of ideas onto mathematical objects. Later, he wrote thousands of words on scientific aspects of juggling—with theorems and corollaries—and included from memory a quotation from E. E. Cummings: “Some son-of-a-bitch will invent a machine to measure Spring with.” ([Location 4357](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4357)) - Tags: [[pink]] - Thermodynamics arose hand in hand with steam engines; it was at first nothing more than “the theoretical study of the steam engine.” It concerned itself with the conversion of heat, or energy, into work. As this occurs—heat drives an engine—Clausius observed that the heat does not actually get lost; it merely passes from a hotter body into a cooler body. On its way, it accomplishes something. ([Location 4405](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4405)) - Tags: [[pink]] - Carnot kept pointing out in France: water begins at the top and ends at the bottom, and no water is gained or lost, but the water performs work on the way down. Carnot imagined heat as just such a substance. 
The ability of a thermodynamic system to produce work depends not on the heat itself, but on the contrast between hot and cold. ([Location 4408](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4408)) - Tags: [[pink]] - The problem was not just in choosing between positive and negative. It was subtler than that. Maxwell had first considered entropy as a subtype of energy: the energy available for work. On reconsideration, he recognized that thermodynamics needed an entirely different measure. Entropy was not a kind of energy or an amount of energy; it was, as Clausius had said, the unavailability of energy. Abstract though this was, it turned out to be a quantity as measurable as temperature, volume, or pressure. ([Location 4425](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4425)) - Tags: [[pink]] - With entropy, the “laws” of thermodynamics could be neatly expressed: First law: The energy of the universe is constant. Second law: The entropy of the universe always increases. ([Location 4429](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4429)) - Tags: [[pink]] - Disorder seemed strangely unphysical. It implied that a piece of the equation must be something like knowledge, or intelligence, or judgment. “The idea of dissipation of energy depends on the extent of our knowledge,” Maxwell said. “Available energy is energy which we can direct into any desired channel. Dissipated energy is energy which we cannot lay hold of and direct at pleasure, such as the energy of the confused agitation of molecules which we call heat.” What we can do, or know, became part of the definition. It seemed impossible to talk about order and disorder without involving an agent or an observer—without talking about the mind: Confusion, like the correlative term order, is not a property of material things in themselves, but only in relation to the mind which perceives them. 
A memorandum-book does not, provided it is neatly written, appear confused to an illiterate person, or to the owner who understands it thoroughly, but to any other person able to read it appears to be inextricably confused. Similarly the notion of dissipated energy could not occur to a being who could not turn any of the energies of nature to his own account, or to one who could trace the motion of every molecule and seize it at the right moment. Order is subjective—in the eye of the beholder. ([Location 4444](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4444)) - Tags: [[pink]] - Suppose the box of gas is divided by a diaphragm. The gas on side A is hotter than the gas on side B—that is, the A molecules are moving faster, with greater energy. As soon as the divider is removed, the molecules begin to mix; the fast collide with the slow; energy is exchanged; and after some time the gas reaches a uniform temperature. The mystery is this: Why can the process not be reversed? In Newton’s equations of motion, time can have a plus sign or a minus sign; the mathematics works either way. In the real world past and future cannot be interchanged so easily. “Time flows on, never comes back,” said Léon Brillouin in 1949. “When the physicist is confronted with this fact he is greatly disturbed.” Maxwell had been mildly disturbed. He wrote to Lord Rayleigh: If this world is a purely dynamical system, and if you accurately reverse the motion of every particle of it at the same instant, then all things will happen backwards to the beginning of things, the raindrops will collect themselves from the ground and fly up to the clouds, etc, etc, and men will see their friends passing from the grave to the cradle till we ourselves become the reverse of born, whatever that is. His point was that in the microscopic details, if we watch the motions of individual molecules, their behavior is the same forward and backward in time. We can run the film backward. 
But pan out, watch the box of gas as an ensemble, and statistically the mixing process becomes a one-way street. We can watch the fluid for all eternity, and it will never divide itself into hot molecules on one side and cool on the other. The clever young Thomasina says in Tom Stoppard’s Arcadia, “You cannot stir things apart,” and this is precisely the same as “Time flows on, never comes back.” Such processes run in one direction only. ([Location 4462](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4462)) - Tags: [[pink]] - For the box of gas to come unmixed is not physically impossible; it is just improbable in the extreme. So the second law is merely probabilistic. Statistically, everything tends toward maximum entropy. ([Location 4480](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4480)) - Tags: [[pink]] - The improbability of heat passing from a colder to a warmer body (without help from elsewhere) is identical to the improbability of order arranging itself from disorder (without help from elsewhere). Both, fundamentally, are due only to statistics. Counting all the possible ways a system can be arranged, the disorderly ones far outnumber the orderly ones. There are many arrangements, or “states,” in which molecules are all jumbled, and few in which they are neatly sorted. The orderly states have low probability and low entropy. For impressive degrees of orderliness, the probabilities may be very low. Alan Turing once whimsically proposed a number N, defined as “the odds against a piece of chalk leaping across the room and writing a line of Shakespeare on the board.” ([Location 4486](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4486)) - Tags: [[pink]] - Eventually physicists began speaking of microstates and macrostates. A macrostate might be: all the gas in the top half of the box. The corresponding microstates would be all the possible arrangements of all particles—positions and velocities. 
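- Note: the macrostate/microstate counting can be made concrete with a toy model (my illustration, not Gleick's): N labeled particles, each in either the top or bottom half of the box. A macrostate is just "how many are on top"; its microstates are the ways of choosing which particles those are.

```python
from math import comb, log

N = 50  # a toy gas of 50 labeled particles

# Macrostate "k particles in the top half" has C(N, k) microstates.
sorted_count = comb(N, N)      # all 50 in the top half: exactly 1 arrangement
mixed_count = comb(N, N // 2)  # evenly mixed: C(50, 25) arrangements

print(sorted_count, mixed_count)  # 1 vs. 126410606437752

# Boltzmann's recipe (constants dropped): entropy ~ log(microstate count),
# so the mixed macrostate sits about 32.5 natural-log units higher.
print(log(sorted_count), log(mixed_count))
```

Even in this fifty-particle toy, the mixed macrostate outnumbers the perfectly sorted one by fourteen orders of magnitude. That is the whole content of the one-way street: the gas is not forbidden to unmix, it just almost never stumbles onto the vanishingly few ways of doing so.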
Entropy thus became a physical equivalent of probability: the entropy of a given macrostate is the logarithm of the number of its possible microstates. The second law, then, is the tendency of the universe to flow from less likely (orderly) to more likely (disorderly) macrostates. ([Location 4492](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4492)) - Tags: [[pink]] - The demon sees what we cannot—because we are so gross and slow—namely, that the second law is statistical, not mechanical. At the level of molecules, it is violated all the time, here and there, purely by chance. The demon replaces chance with purpose. It uses information to reduce entropy. ([Location 4518](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4518)) - Tags: [[pink]] - Note: Is Maxwell’s demon potentially realizable with nanotechnology? Can identify and move specific molecules to specific places, creating temperature gradients out of thin air - “We must therefore seek for evidence that the organism can control the, otherwise, un-co-ordinated motions of the individual molecules.” Is it not strange that while we see that most of our human effort is that of directing natural agencies and energies into paths which they would not otherwise take, we should yet have failed to think of primitive organisms, or even of the tissue elements in the bodies of the higher organisms, as possessing also the power of directing physico-chemical processes? ([Location 4540](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4540)) - Tags: [[pink]] - Szilárd showed that even this perpetual motion machine would have to fail. What was the catch? Simply put: information is not free. Maxwell, Thomson, and the rest had implicitly talked as though knowledge was there for the taking—knowledge of the velocities and trajectories of molecules coming and going before the demon’s eyes. They did not consider the cost of this information. 
They could not; for them, in a simpler time, it was as if the information belonged to a parallel universe, an astral plane, not linked to the universe of matter and energy, particles and forces, whose behavior they were learning to calculate. But information is physical. Maxwell’s demon makes the link. The demon performs a conversion between information and energy, one particle at a time. Szilárd—who did not yet use the word information—found that, if he accounted exactly for each measurement and memory, then the conversion could be computed exactly. So he computed it. He calculated that each unit of information brings a corresponding increase in entropy—specifically, by k log 2 units. Every time the demon makes a choice between one particle and another, it costs one bit of information. The payback comes at the end of the cycle, when it has to clear its memory (Szilárd did not specify this last detail in words, but in mathematics). Accounting for this properly is the only way to eliminate the paradox of perpetual motion, to bring the universe back into harmony, to “restore concordance with the Second Law.” ([Location 4562](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4562)) - Tags: [[pink]] - To the physicist, entropy is a measure of uncertainty about the state of a physical system: one state among all the possible states it can be in. These microstates may not be equally likely, so the physicist writes S = −Σ pᵢ log pᵢ. To the information theorist, entropy is a measure of uncertainty about a message: one message among all the possible messages that a communications source can produce. The possible messages may not be equally likely, so Shannon wrote H = −Σ pᵢ log pᵢ. It is not just a coincidence of formalism: nature providing similar answers to similar problems. It is all one problem. To reduce entropy in a box of gas, to perform useful work, one pays a price in information. 
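- Note: the two sums are one computation. A minimal sketch (my illustration, using base-2 logs so the result comes out in Shannon's bits):

```python
from math import log2

def entropy(probs):
    """H = sum of -p_i * log2(p_i), in bits; zero-probability terms contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: one fair yes/no choice
print(entropy([1.0]))        # 0.0: a certain outcome resolves nothing
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely states
print(entropy([0.9, 0.1]))   # ~0.47 bits: a lopsided choice tells you less
```

The demon's one-particle decision is the first line: a fair binary choice, one bit, which expressed in thermodynamic units is Szilárd's k log 2 of entropy.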
Likewise, a particular message reduces the entropy in the ensemble of possible messages—in terms of dynamical systems, a phase space. ([Location 4577](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4577)) - Tags: [[pink]] - We all behave like Maxwell’s demon. Organisms organize. In everyday experience lies the reason sober physicists across two centuries kept this cartoon fantasy alive. We sort the mail, build sand castles, solve jigsaw puzzles, separate wheat from chaff, rearrange chess pieces, collect stamps, alphabetize books, create symmetry, compose sonnets and sonatas, and put our rooms in order, and to do all this requires no great energy, as long as we can apply intelligence. ([Location 4594](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4594)) - Tags: [[pink]] - Not only do living things lessen the disorder in their environments; they are in themselves, their skeletons and their flesh, vesicles and membranes, shells and carapaces, leaves and blossoms, circulatory systems and metabolic pathways—miracles of pattern and structure. It sometimes seems as if curbing entropy is our quixotic purpose in this universe. ([Location 4601](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4601)) - Tags: [[pink]] - heredity. This struck Schrödinger as requiring explanation. “When is a piece of matter said to be alive?” he asked. He skipped past the usual suggestions—growth, feeding, reproduction—and answered as simply as possible: “When it goes on ‘doing something,’ moving, exchanging material with its environment, and so forth, for a much longer period than we would expect an inanimate piece of matter to ‘keep going’ under similar circumstances.” Ordinarily, a piece of matter comes to a standstill; a box of gas reaches a uniform temperature; a chemical system “fades away into a dead, inert lump of matter”—one way or another, the second law is obeyed and maximum entropy is reached. 
Living things manage to remain unstable. Norbert Wiener pursued this thought in Cybernetics: enzymes, he wrote, may be “metastable” Maxwell’s demons—meaning not quite stable, or precariously stable. “The stable state of an enzyme is to be deconditioned,” he noted, “and the stable state of a living organism is to be dead.” Schrödinger felt that evading the second law for a while, or seeming to, is exactly why a living creature “appears so enigmatic.” The organism’s ability to feign perpetual motion leads so many people to believe in a special, supernatural life force. ([Location 4616](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4616)) - Tags: [[pink]] - the organism feeds upon negative entropy. “To put it less paradoxically,” he added paradoxically, “the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive.” In other words, the organism sucks orderliness from its surroundings. Herbivores and carnivores dine on a smorgasbord of structure; they feed on organic compounds, matter in a well-ordered state, and return it “in a very much degraded form—not entirely degraded, however, for plants can make use of it.” ([Location 4627](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4627)) - Tags: [[pink]] - No one could yet see these hypothetical genes, but surely the time was not far off. Microscopic observations made it possible to estimate their size: perhaps 100 or 150 atomic distances; perhaps one thousand atoms or fewer. Yet somehow these tiny entities must encapsulate the entire pattern of a living creature—a fly or a rhododendron, a mouse or a human. And we must understand this pattern as a four-dimensional object: the structure of the organism through the whole of its ontogenetic development, every stage from embryo to adult. 
([Location 4647](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4647)) - Tags: [[pink]] - Living creatures confound the usual computation of entropy. More generally, so does information. “Take an issue of The New York Times, the book on cybernetics, and an equal weight of scrap paper,” suggested Brillouin. “Do they have the same entropy?” If you are feeding the furnace, yes. But not if you are a reader. There is entropy in the arrangement of the ink spots. ([Location 4673](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4673)) - Tags: [[pink]] - What lies at the heart of every living thing is not a fire, not warm breath, not a “spark of life.” It is information, words, instructions. If you want a metaphor, don’t think of fires and sparks and breath. Think, instead, of a billion discrete, digital characters carved in tablets of crystal. —Richard Dawkins (1986) ([Location 4685](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4685)) - Tags: [[pink]] - “The personal qualities of any individual organism do not at all cause the qualities of its offspring; but the qualities of both ancestor and descendent are in quite the same manner determined by the nature of the ‘sexual substances’—i.e., the gametes—from which they have developed.” What is inherited is more abstract, more in the nature of potentiality. To banish the fallacious thinking, he proposed a new terminology, beginning with gene: “nothing but a very applicable little word, easily combined with others.”* It hardly mattered that neither he nor anyone else knew what a gene actually was; “it may be useful as an expression for the ‘unit-factors,’ ‘elements,’ or ‘allelomorphs.’… As to the nature of the ‘genes’ it is as yet of no value to propose a hypothesis.” Gregor Mendel’s years of research with green and yellow peas showed that such a thing must exist. 
([Location 4698](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4698)) - Tags: [[pink]] - When Schrödinger contemplated the gene, he faced a problem. How could such a “tiny speck of material” contain the entire complex code-script that determines the elaborate development of the organism? To resolve the difficulty Schrödinger summoned an example not from wave mechanics or theoretical physics but from telegraphy: Morse code. He noted that two signs, dot and dash, could be combined in well-ordered groups to generate all human language. Genes, too, he suggested, must employ a code: “The miniature code should precisely correspond with a highly complicated and specified plan of development and should somehow contain the means to put it into action.” ([Location 4708](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4708)) - Tags: [[pink]] - Codes, instructions, signals—all this language, redolent of machinery and engineering, pressed in on biologists like Norman French invading medieval English. In the 1940s the jargon had a precious, artificial feeling, but that soon passed. The new molecular biology began to examine information storage and information transfer. Biologists could count in terms of “bits.” ([Location 4713](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4713)) - Tags: [[pink]] - The growth of the bacterium could be analyzed as a reduction in the entropy of its part of the universe. Quastler himself wanted to take the measure of higher organisms in terms of information content: not in terms of atoms (“this would be extremely wasteful”) but in terms of “hypothetical instructions to build an organism.” This brought him, of course, to genes. The whole set of instructions—situated “somewhere in the chromosomes”—is the genome. 
This is a “catalogue,” he said, containing, if not all, then at least “a substantial fraction of all information about an adult organism.” ([Location 4728](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4728)) - Tags: [[pink]] - They dispensed with the timidity in another paper a few weeks later. In each chain the sequence of bases appeared to be irregular—any sequence was possible, they observed. “It follows that in a long molecule many different permutations are possible.” Many permutations—many possible messages. Their next remark set alarms sounding on both sides of the Atlantic: “It therefore seems likely that the precise sequence of the bases is the code which carries the genetical information.” In using these terms, code and information, they were no longer speaking figuratively. ([Location 4770](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4770)) - Tags: [[pink]] - The macromolecules of organic life embody information in an intricate structure. A single hemoglobin molecule comprises four chains of polypeptides, two with 141 amino acids and two with 146, in strict linear sequence, bonded and folded together. Atoms of hydrogen, oxygen, carbon, and iron could mingle randomly for the lifetime of the universe and be no more likely to form hemoglobin than the proverbial chimpanzees to type the works of Shakespeare. Their genesis requires energy; they are built up from simpler, less patterned parts, and the law of entropy applies. For earthly life, the energy comes as photons from the sun. The information comes via evolution. ([Location 4775](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4775)) - Tags: [[pink]] - The DNA molecule was special: the information it bears is its only function. 
([Location 4779](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4779)) - Tags: [[pink]] - Closely linked to DNA, its single-stranded cousin, RNA, appeared to play the role of messenger or translator. Gamow said explicitly that the underlying chemistry hardly mattered. He and others who followed him understood this as a puzzle in mathematics—a mapping between messages in different alphabets. If this was a coding problem, the tools they needed came from combinatorics and information theory. Along with physicists, they consulted cryptanalysts. ([Location 4794](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4794)) - Tags: [[pink]] - By now the word code was so deeply embedded in the conversation that people seldom paused to notice how extraordinary it was to find such a thing—abstract symbols representing arbitrarily different abstract symbols—at work in chemistry, at the level of molecules. The genetic code performed a function with uncanny similarities to the metamathematical code invented by Gödel for his philosophical purposes. Gödel’s code substitutes plain numbers for mathematical expressions and operations; the genetic code uses triplets of nucleotides to represent amino acids. Douglas Hofstadter was the first to make this connection explicitly, in the 1980s: “between the complex machinery in a living cell that enables a DNA molecule to replicate itself and the clever machinery in a mathematical system that enables a formula to say things about itself.” In both cases he saw a twisty feedback loop. “Nobody had ever in the least suspected that one set of chemicals could code for another set,” Hofstadter wrote. Indeed, the very idea is somewhat baffling: If there is a code, then who invented it? What kinds of messages are written in it? Who writes them? Who reads them? ([Location 4823](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4823)) - Tags: [[pink]] - DNA serves two different functions. 
First, it preserves information. It does this by copying itself, from generation to generation, spanning eons—a Library of Alexandria that keeps its data safe by copying itself billions of times. Notwithstanding the beautiful double helix, this information store is essentially one-dimensional: a string of elements arrayed in a line. In human DNA, the nucleotide units number more than a billion, and this detailed gigabit message must be conserved perfectly, or almost perfectly. Second, however, DNA also sends that information outward for use in the making of the organism. The data stored in a one-dimensional strand has to flower forth in three dimensions. This information transfer occurs via messages passing from the nucleic acids to proteins. So DNA not only replicates itself; separately, it dictates the manufacture of something entirely different. These proteins, with their own enormous complexity, serve as the material of a body, the mortar and bricks, and also as the control system, the plumbing and wiring and the chemical signals that control growth. ([Location 4834](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4834)) - Tags: [[pink]] - The replication of DNA is a copying of information. The manufacture of proteins is a transfer of information: the sending of a message. Biologists could see this clearly now, because the message was now well defined and abstracted from any particular substrate. If messages could be borne upon sound waves or electrical pulses, why not by chemical processes? ([Location 4842](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4842)) - Tags: [[pink]] - Even before the exact answer was reached, Crick crystallized its fundamental principles in a statement that he called (and is called to this day) the Central Dogma. 
It is a hypothesis about the direction of evolution and the origin of life; it is provable in terms of Shannon entropy in the possible chemical alphabets: Once “information” has passed into protein it cannot get out again. In more detail, the transfer of information from nucleic acid to nucleic acid, or from nucleic acid to protein may be possible, but transfer from protein to protein, or from protein to nucleic acid is impossible. Information means here the precise determination of sequence. ([Location 4862](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4862)) - Tags: [[pink]] - “All of biochemistry up to the fifties was concerned with where you get the energy and the materials for cell function,” Brenner said. “Biochemists only thought about the flux of energy and the flow of matter. Molecular biologists started to talk about the flux of information. Looking back, one can see that the double helix brought the realization that information in biological systems could be studied in much the same way as energy and matter.… ([Location 4881](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4881)) - Tags: [[pink]] - As molecular biology perfected its knowledge of the details of DNA and grew more skillful in manipulating these molecular prodigies, it was natural to see them as the answer to the great question of life: how do organisms reproduce themselves? We use DNA, just as we use lungs to breathe and eyes to see. We use it. “This attitude is an error of great profundity,” Dawkins wrote. “It is the truth turned crashingly on its head.” DNA came first—by billions of years—and DNA comes first, he argued, when life is viewed from the proper perspective. From that perspective, genes are the focus, the sine qua non, the star of the show. 
In his first book—published in 1976, meant for a broad audience, provocatively titled The Selfish Gene—he set off decades of debate by declaring: “We are survival machines—robot vehicles blindly programmed to preserve the selfish molecules known as genes.” He said this was a truth he had known for years. ([Location 4922](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4922)) - Tags: [[pink]] - They are past masters of the survival arts. But do not look for them floating loose in the sea; they gave up that cavalier freedom long ago. Now they swarm in huge colonies, safe inside gigantic lumbering robots, sealed off from the outside world, communicating with it by tortuous indirect routes, manipulating it by remote control. They are in you and in me; they created us, body and mind; and their preservation is the ultimate rationale for our existence. They have come a long way, those replicators. Now they go by the name of genes, and we are their survival machines. ([Location 4932](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4932)) - Tags: [[pink]] - Dawkins was purveying an even more radical shift of perspective. He was not just nudging aside the human (and the hen) but the organism, in all its multifarious glory. How could biology not be the study of organisms? If anything, he understated the difficulty when he wrote, “It requires a deliberate mental effort to turn biology the right way up again, and remind ourselves that the replicators come first, in importance as well as in history.” ([Location 4956](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=4956)) - Tags: [[pink]] - Ideas have retained some of the properties of organisms. Like them, they tend to perpetuate their structure and to breed; they too can fuse, recombine, segregate their content; indeed they too can evolve, and in this evolution selection must surely play an important role. 
([Location 5082](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5082)) - Tags: [[pink]] - The American neurophysiologist Roger Sperry had put forward a similar notion several years earlier, arguing that ideas are “just as real” as the neurons they inhabit. Ideas have power, he said. Ideas cause ideas and help evolve new ideas. They interact with each other and with other mental forces in the same brain, in neighboring brains, and thanks to global communication, in far distant, foreign brains. And they also interact with the external surroundings to produce in toto a burstwise advance in evolution that is far beyond anything to hit the evolutionary scene yet.… ([Location 5087](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5087)) - Tags: [[pink]] - Richard Dawkins made his own connection between the evolution of genes and the evolution of ideas. His essential actor was the replicator, and it scarcely mattered whether replicators were made of nucleic acid. His rule is “All life evolves by the differential survival of replicating entities.” Wherever there is life, there must be replicators. Perhaps on other worlds replicators could arise in a silicon-based chemistry—or in no chemistry at all. ([Location 5094](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5094)) - Tags: [[pink]] - For this bodiless replicator itself, Dawkins proposed a name. He called it the meme, and it became his most memorable invention, far more influential than his selfish genes or his later proselytizing against religiosity. “Memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation,” he wrote. They compete with one another for limited resources: brain time or bandwidth. They compete most of all for attention. 
([Location 5101](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5101)) - Tags: [[pink]] - Memes emerge in brains and travel outward, establishing beachheads on paper and celluloid and silicon and anywhere else information can go. They are not to be thought of as elementary particles but as organisms. The number three is not a meme; nor is the color blue, nor any simple thought, any more than a single nucleotide can be a gene. Memes are complex units, distinct and memorable—units with staying power. ([Location 5120](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5120)) - Tags: [[pink]] - “A wagon with spoked wheels carries not only grain or freight from place to place; it carries the brilliant idea of a wagon with spoked wheels from mind to mind.” ([Location 5126](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5126)) - Tags: [[pink]] - Dawkins’s way of speaking was not meant to suggest that memes are conscious actors, only that they are entities with interests that can be furthered by natural selection. Their interests are not our interests. “A meme,” Dennett says, “is an information packet with attitude.” When we speak of fighting for a principle or dying for an idea, we may be more literal than we know. “To die for an idea; it is unquestionably noble,” H. L. Mencken wrote. “But how much nobler it would be if men died for ideas that were true!” ([Location 5139](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5139)) - Tags: [[pink]] - Rhyme and rhythm help people remember bits of text. Or: rhyme and rhythm help bits of text get remembered. Rhyme and rhythm are qualities that aid a meme’s survival, just as strength and speed aid an animal’s. Patterned language has an evolutionary advantage. Rhyme, rhythm, and reason—for reason, too, is a form of pattern. 
([Location 5144](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5144)) - Tags: [[pink]] - Like genes, memes have effects on the wide world beyond themselves: phenotypic effects. In some cases (the meme for making fire; for wearing clothes; for the resurrection of Jesus) the effects can be powerful indeed. As they broadcast their influence on the world, memes thus influence the conditions affecting their own chances of survival. The meme or memes composing Morse code had strong positive feedback effects. “I believe that, given the right conditions, replicators automatically band together to create systems, or machines, that carry them around and work to favour their continued replication,” wrote Dawkins. ([Location 5147](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5147)) - Tags: [[pink]] - When Dawkins first floated the meme meme, Nicholas Humphrey, an evolutionary psychologist, said immediately that these entities should be considered “living structures, not just metaphorically but technically”: When you plant a fertile meme in my mind you literally parasitize my brain, turning it into a vehicle for the meme’s propagation in just the way that a virus may parasitize the genetic mechanism of a host cell. And this isn’t just a way of talking—the meme for, say, “belief in life after death” is actually realized physically, millions of times over, as a structure in the nervous systems of individual men the world over. ([Location 5156](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5156)) - Tags: [[pink]] - Language serves as culture’s first catalyst. It supersedes mere imitation, spreading knowledge by abstraction and encoding. ([Location 5173](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5173)) - Tags: [[pink]] - They had thirty-three, all variants of a single letter, with mutations in the form of misspellings, omissions, and transposed words and phrases. 
“These letters have passed from host to host, mutating and evolving,” they reported. Like a gene, their average length is about 2,000 characters. Like a potent virus, the letter threatens to kill you and induces you to pass it on to your “friends and associates”—some variation of this letter has probably reached millions of people. Like an inheritable trait, it promises benefits for you and the people you pass it on to. Like genomes, chain letters undergo natural selection and sometimes parts even get transferred between coexisting “species.” Reaching beyond these appealing metaphors, they set out to use the letters as a “test bed” for algorithms used in evolutionary biology. The algorithms were designed to take the genomes of various modern creatures and work backward, by inference and deduction, to reconstruct their phylogeny—their evolutionary trees. If these mathematical methods worked with genes, the scientists suggested, they should work with chain letters, too. In both cases the researchers were able to verify mutation rates and relatedness measures. ([Location 5250](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5250)) - Tags: [[pink]] - “The human world is made of stories, not people,” writes David Mitchell. “The people the stories use to tell themselves are not to be blamed.” Margaret Atwood writes: “As with all knowledge, once you knew it, you couldn’t imagine how it was that you hadn’t known it before. Like stage magic, knowledge before you knew it took place before your very eyes, but you were looking elsewhere.” ([Location 5283](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5283)) - Tags: [[pink]] - Fred Dretske, a philosopher of mind and knowledge, wrote in 1981: “In the beginning there was information. 
The word came later.” He added this explanation: “The transition was achieved by the development of organisms with the capacity for selectively exploiting this information in order to survive and perpetuate their kind.” Now we might add, thanks to Dawkins, that the transition was achieved by the information itself, surviving and perpetuating its kind and selectively exploiting organisms. ([Location 5290](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5290)) - Tags: [[pink]] - For von Neumann, Gödel’s proof was a point of no return: It was a very serious conceptual crisis, dealing with rigor and the proper way to carry out a correct mathematical proof. In view of the earlier notions of the absolute rigor of mathematics, it is surprising that such a thing could have happened, and even more surprising that it could have happened in these latter days when miracles are not supposed to take place. Yet it did happen. ([Location 5316](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5316)) - Tags: [[pink]] - Why? Chaitin asked. He wondered if at some level Gödel’s incompleteness could be connected to that new principle of quantum physics, uncertainty, which smelled similar somehow. Later, the adult Chaitin had a chance to put this question to the oracular John Archibald Wheeler. Was Gödel incompleteness related to Heisenberg uncertainty? Wheeler answered by saying he had once posed that very question to Gödel himself, in his office at the Institute for Advanced Study—Gödel with his legs wrapped in a blanket, an electric heater glowing warm against the wintry drafts. Gödel refused to answer. In this way, Wheeler refused to answer Chaitin. ([Location 5321](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5321)) - Tags: [[pink]] - The common element was randomness, Chaitin suddenly thought. Shannon linked randomness, perversely, to information. 
Physicists had found randomness inside the atom—the kind of randomness that Einstein deplored by complaining about God and dice. All these heroes of science were talking about or around randomness. ([Location 5328](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5328)) - Tags: [[pink]] - “Chance is only the measure of our ignorance,” Henri Poincaré famously said. “Fortuitous phenomena are by definition those whose laws we do not know.” Immediately he recanted: “Is this definition very satisfactory? When the first Chaldean shepherds watched the movements of the stars, they did not yet know the laws of astronomy, but would they have dreamed of saying that the stars move at random?” For Poincaré, who understood chaos long before it became a science, examples of randomness included such phenomena as the scattering of raindrops, their causes physically determined but so numerous and complex as to be unpredictable. In physics—or wherever natural processes seem unpredictable—apparent randomness may be noise or may arise from deeply complex dynamics. ([Location 5338](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5338)) - Tags: [[pink]] - Shannon’s first formulation of information theory treated messages statistically, as choices from the ensemble of all possible messages—in the case of A and B, 2^50 of them. But Shannon also considered redundancy within a message: the pattern, the regularity, the order that makes a message compressible. The more regularity in a message, the more predictable it is. The more predictable, the more redundant. The more redundant a message is, the less information it contains. The telegraph operator sending message A has a shortcut: he can transmit something like “Repeat ‘01’ twenty-five times.” For longer messages with easy patterns, the savings in keystrokes becomes enormous. Once the pattern is clear, the extra characters are free. 
The operator for message B must soldier on the hard way, sending every character, because every character is a complete surprise; every character costs one bit. This pair of questions—how random and how much information—turn out to be one and the same. They have a single answer. ([Location 5397](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5397)) - Tags: [[pink]] - The universal computer gave him a nice way to distinguish between numbers like Alice and Bob’s A and B. He could write a program to make a Turing machine print out “010101 …” a million times, and he could write down the length of that program—quite short. But given a million random digits—no pattern, no regularity, nothing special at all—there could be no shortcut. The computer program would have to incorporate the entire number. To make the IBM mainframe print out those million digits, he would have to put the whole million digits into the punched cards. To make the Turing machine do it, he would still need the million digits for input. ([Location 5426](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5426)) - Tags: [[pink]] - But why do we say π is not random? Chaitin proposed a clear answer: a number is not random if it is computable—if a definable computer program will generate it. Thus computability is a measure of randomness. ([Location 5439](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5439)) - Tags: [[pink]] - For Turing computability was a yes-or-no quality—a given number either is or is not. But we would like to say that some numbers are more random than others—they are less patterned, less orderly. Chaitin said the patterns and the order express computability. Algorithms generate patterns. So we can gauge computability by looking at the size of the algorithm. Given a number—represented as a string of any length—we ask, what is the length of the shortest program that will generate it? 
Using the language of a Turing machine, that question can have a definite answer, measured in bits. Chaitin’s algorithmic definition of randomness also provides an algorithmic definition of information: the size of the algorithm measures how much information a given string contains. ([Location 5440](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5440)) - Tags: [[pink]] - The eighteen-year-old Chaitin felt this was no accident. He ended this first paper by applying algorithmic information theory to the process of science itself. “Consider a scientist,” he proposed, “who has been observing a closed system that once every second either emits a ray of light or does not.” He summarizes his observations in a sequence of 0s and 1s in which a 0 represents “ray not emitted” and a 1 represents “ray emitted.” The sequence may start 0110101110 … and continue for a few thousand more bits. The scientist then examines the sequence in the hope of observing some kind of pattern or law. What does he mean by this? It seems plausible that a sequence of 0s and 1s is patternless if there is no better way to calculate it than just by writing it all out at once from a table giving the whole sequence. But if the scientist could discover a way to produce the same sequence with an algorithm, a computer program significantly shorter than the sequence, then he would surely know the events were not random. He would say that he had hit upon a theory. This is what science always seeks: a simple theory that accounts for a large set of facts and allows for prediction of events still to come. It is the famous Occam’s razor. ([Location 5447](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5447)) - Tags: [[pink]] - How much information is contained in a given “finite object”? An object could be a number (a series of digits) or a message or a set of data. He described three approaches: the combinatorial, the probabilistic, and the algorithmic. 
The first and second were Shannon’s, with refinements. They focused on the probability of one object among an ensemble of objects—one particular message, say, chosen from a set of possible messages. How would this work, Kolmogorov wondered, when the object was not just a symbol in an alphabet or a lantern in a church window but something big and complicated—a genetic organism, or a work of art? How would one measure the amount of information in Tolstoy’s War and Peace? “Is it possible to include this novel in a reasonable way in the set of ‘all possible novels’ and further to postulate the existence of a certain probability distribution in this set?” he asked. Or could one measure the amount of genetic information in, say, the cuckoo bird by considering a probability distribution in the set of all possible species? His third approach to measuring information—the algorithmic—avoided the difficulties of starting with ensembles of possible objects. It focused on the object itself.* Kolmogorov introduced a new word for the thing he was trying to measure: complexity. As he defined this term, the complexity of a number, or message, or set of data is the inverse of simplicity and order and, once again, it corresponds to information. The simpler an object is, the less information it conveys. The more complexity, the more information. And, just as Gregory Chaitin did, Kolmogorov put this idea on a solid mathematical footing by calculating complexity in terms of algorithms. The complexity of an object is the size of the smallest computer program needed to generate it. An object that can be produced by a short algorithm has little complexity. On the other hand, an object needing an algorithm every bit as long as the object itself has maximal complexity. A simple object can be generated—or computed, or described—with just a few bits. A complex object requires an algorithm of many bits. 
([Location 5511](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5511)) - Tags: [[pink]] - The Kolmogorov complexity of an object is the size, in bits, of the shortest algorithm needed to generate it. This is also the amount of information. And it is also the degree of randomness—Kolmogorov declared “a new conception of the notion ‘random’ corresponding to the natural assumption that randomness is the absence of regularity.” The three are fundamentally equivalent: information, randomness, and complexity—three powerful abstractions, bound all along like secret lovers. ([Location 5534](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5534)) - Tags: [[pink]] - For Kolmogorov, these ideas belonged not only to probability theory but also to physics. To measure the complexity of an orderly crystal or a helter-skelter box of gas, one could measure the shortest algorithm needed to describe the state of the crystal or gas. Once again entropy was the key. ([Location 5538](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5538)) - Tags: [[pink]] - he began laying the groundwork for the renaissance in chaos theory to come in the 1970s: analyzing dynamical systems in terms of entropy and information dimension. It made sense now to say that a dynamical system produces information. If it is unpredictable, it produces a great deal of information. ([Location 5543](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5543)) - Tags: [[pink]] - Chaitin and Kolmogorov revived Berry’s paradox in inventing algorithmic information theory. An algorithm names a number. “The paradox originally talks about English, but that’s much too vague,” Chaitin says. “I pick a computer-programming language instead.” Naturally he picks the language of a universal Turing machine. And then what does it mean, how do you name an integer? Well, you name an integer by giving a way to calculate it. 
A program names an integer if its output is that integer—you know, it outputs that integer, just one, and then it stops. Asking whether a number is interesting is the inverse of asking whether it is random. If the number n can be computed by an algorithm that is relatively short, then n is interesting. If not, it is random. The algorithm PRINT 1 AND THEN PRINT 100 ZEROES generates an interesting number (a googol). Similarly, FIND THE FIRST PRIME NUMBER, ADD THE NEXT PRIME NUMBER, AND REPEAT A MILLION TIMES generates a number that is interesting: the sum of the first million primes. It would take a Turing machine a long time to compute that particular number, but a finite time nonetheless. The number is computable. But if the most concise algorithm for n is “PRINT [n]”—an algorithm incorporating the entire number, with no shorthand—then we may say that there is nothing interesting about n. In Kolmogorov’s terms, this number is random—maximally complex. It will have to be patternless, because any pattern would provide a way to devise a shorthand algorithm. “If there is a small, concise computer program that calculates the number, that means it has some quality or characteristic that enables you to pick it out and to compress it into a smaller algorithmic description,” Chaitin says. “So that’s unusual; that’s an interesting number.” ([Location 5574](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5574)) - Tags: [[pink]] - “In spite of incompleteness and uncomputability and even algorithmic randomness,” he said, “mathematicians don’t want to give up absolute certainty. Why? Well, absolute certainty is like God.” ([Location 5621](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5621)) - Tags: [[pink]] - Among its lessons were these: Most numbers are random. Yet very few of them can be proved random. A chaotic stream of information may yet hide a simple algorithm. Working backward from the chaos to the algorithm may be impossible. 
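The interesting-versus-random test can be sketched concretely. The true shortest-program length is uncomputable, but a general-purpose compressor gives a computable upper bound on it; a minimal Python sketch, using `zlib` purely as a stand-in for "the shortest program," not as Chaitin's actual measure:

```python
import os
import zlib

# Chaitin's googol: "PRINT 1 AND THEN PRINT 100 ZEROES" -- a tiny recipe
# names a 101-digit number, so the number is interesting, not random.
googol = 10 ** 100
assert len(str(googol)) == 101

# Compression as a computable stand-in for program length: a patterned
# string collapses to a short description, a random one does not.
patterned = b"01" * 500_000          # a million characters of "0101..."
random_data = os.urandom(1_000_000)  # no pattern, hence no shortcut

print(len(zlib.compress(patterned, 9)))    # a few thousand bytes
print(len(zlib.compress(random_data, 9)))  # nearly the full million
```

The asymmetry matters: failure to compress is evidence, not proof, of randomness. zlib would also fail on the digits of π, which a short program does generate.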
Kolmogorov-Chaitin (KC) complexity is to mathematics what entropy is to thermodynamics: the antidote to perfection. Just as we can have no perpetual-motion machines, there can be no complete formal axiomatic systems. Some mathematical facts are true for no reason. They are accidental, lacking a cause or deeper meaning. ([Location 5627](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5627)) - Tags: [[pink]] - Joseph Ford, a physicist studying the behavior of unpredictable dynamical systems in the 1980s, said that Chaitin had “charmingly captured the essence of the matter” by showing the path from Gödel’s incompleteness to chaos. This was the “deeper meaning of chaos,” Ford declared: Chaotic orbits exist but they are Gödel’s children, so complex, so overladen with information that humans can never comprehend them. But chaos is ubiquitous in nature; therefore the universe is filled with countless mysteries that man can never understand. ([Location 5631](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5631)) - Tags: [[pink]] - the Information Packing Problem: how much information could one “pack” into a given number of bits, or conversely, given some information, how could one pack it into the fewest possible bits. ([Location 5650](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5650)) - Tags: [[pink]] - When humans or computers learn from experience, they are using induction: recognizing regularities amid irregular streams of information. From this point of view, the laws of science represent data compression in action. A theoretical physicist acts like a very clever coding algorithm. “The laws of science that have been discovered can be viewed as summaries of large amounts of empirical data about the universe,” wrote Solomonoff. “In the present context, each such law can be transformed into a method of compactly coding the empirical data that gave rise to that law.” A good scientific theory is economical. 
([Location 5661](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5661)) - Tags: [[pink]] - Random sequences are “normal”—a term of art meaning that on average, in the long run, each digit appears exactly as often as the others, one time in ten; and each pair of digits, from 00 to 99, appears one time in a hundred; and each triplet likewise, and so on. No string of any particular length is more likely to appear than any other string of that length. ([Location 5699](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5699)) - Tags: [[pink]] - In a way, then, the use of minimal program size to define complexity seems perfect—a fitting apogee for Shannon information theory. In another way it remains deeply unsatisfying. This is particularly so when turning to the big questions—one might say, the human questions—of art, of biology, of intelligence. According to this measure, a million zeroes and a million coin tosses lie at opposite ends of the spectrum. The empty string is as simple as can be; the random string is maximally complex. The zeroes convey no information; coin tosses produce the most information possible. Yet these extremes have something in common. They are dull. They have no value. If either one were a message from another galaxy, we would attribute no intelligence to the sender. If they were music, they would be equally worthless. Everything we care about lies somewhere in the middle, where pattern and randomness interlace. ([Location 5787](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5787)) - Tags: [[pink]] - The amount of work it takes to compute something had been mostly disregarded—set aside—in all the theorizing based on Turing machines, which work, after all, so ploddingly. Bennett brought it back. There is no logical depth in the parts of a message that are sheer randomness and unpredictability, nor is there logical depth in obvious redundancy—plain repetition and copying. 
Rather, he proposed, the value of a message lies in “what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation.” When we value an object’s complexity, or its information content, we are sensing a lengthy hidden computation. This might be true of music or a poem or a scientific theory or a crossword puzzle, which gives its solver pleasure when it is neither too cryptic nor too shallow, but somewhere in between. ([Location 5802](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5802)) - Tags: [[pink]] - In 1989 he offered his final catchphrase: It from Bit. His view was extreme. It was immaterialist: information first, everything else later. “Otherwise put,” he said, every it—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence … from bits. Why does nature appear quantized? Because information is quantized. The bit is the ultimate unsplittable particle. ([Location 5850](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5850)) - Tags: [[pink]] - IBM did not manufacture Turing machines, of course. But at some point it dawned on Bennett that a special-purpose Turing machine had already been found in nature: namely RNA polymerase. He had learned about polymerase directly from Watson; it is the enzyme that crawls along a gene—its “tape”—transcribing the DNA. It steps left and right; its logical state changes according to the chemical information written in sequence; and its thermodynamic behavior can be measured. ([Location 5913](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5913)) - Tags: [[pink]] - Landauer tried in 1961 to prove von Neumann’s formula for the cost of information processing and discovered that he could not. 
On the contrary, it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice-versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy. ([Location 5929](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5929)) - Tags: [[pink]] - He confirmed that a great deal of computation can be done with no energy cost at all. In every case, Bennett found, heat dissipation occurs only when information is erased. Erasure is the irreversible logical operation. When the head on a Turing machine erases one square of the tape, or when an electronic computer clears a capacitor, a bit is lost, and then heat must be dissipated. In Szilárd’s thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next. Forgetting takes work. ([Location 5935](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5935)) - Tags: [[pink]] - Information is physical. It is no use talking about quantum states without considering the information about the quantum states. Information is not just an abstract notion. It requires a physical carrier, and the latter is (approximately) localized. After all, it was the business of the Bell Telephone Company to transport information from one telephone to another telephone, in a different location. … When Alice measures her spin, the information she gets is localized at her position, and will remain so until she decides to broadcast it. 
Absolutely nothing happens at Bob’s location.… It is only if and when Alice informs Bob of the result she got (by mail, telephone, radio, or by means of any other material carrier, which is naturally restricted to the speed of light) that Bob realizes that his particle has a definite pure state. For that matter, Christopher Fuchs argues that it is no use talking about quantum states at all. The quantum state is a construct of the observer—from which many troubles spring. Exit states; enter information. “Terminology can say it all: A practitioner in this field, whether she has ever thought an ounce about quantum foundations, is just as likely to say ‘quantum information’ as ‘quantum state’…‘What does the quantum teleportation protocol do?’ A now completely standard answer would be: ‘It transfers quantum information from Alice’s site to Bob’s.’ What we have here is a change of mind-set.” ([Location 6013](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6013)) - Tags: [[pink]] - He knew very well what the problem was for computation—for simulating quantum physics with a computer. The problem was probability. Every quantum variable involved probabilities, and that made the difficulty of computation grow exponentially. “The number of information bits is the same as the number of points in space, and therefore you’d have to have something like N^N configurations to be described to get the probability out, and that’s too big for our computer to hold.… It is therefore impossible, according to the rules stated, to simulate by calculating the probability.” So he proposed fighting fire with fire. “The other way to simulate a probabilistic Nature, which I’ll call N for the moment, might still be to simulate the probabilistic Nature by a computer C which itself is probabilistic.” A quantum computer would not be a Turing machine, he said. It would be something altogether new. 
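Feynman's exponential blowup is easy to tabulate: a classical simulation must store an amplitude for every configuration at once, so each added two-state system doubles the memory. A rough sketch, assuming 16 bytes per complex amplitude (complex128):

```python
# Memory needed to hold the full state vector of n two-state quantum
# systems: 2**n complex amplitudes at ~16 bytes apiece (complex128).
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} systems: 2^{n} amplitudes, about {gib:g} GiB")
```

Fifty two-state systems already demand roughly sixteen million GiB, which is the sense in which a probabilistic computer "fights fire with fire": nature holds the superposition, so the simulator does not have to.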
“Feynman’s insight,” says Bennett, “was that a quantum system is, in a sense, computing its own future all the time. You may say it’s an analog computer of its own dynamics.” Researchers quickly realized that if a quantum computer had special powers in cutting through problems in simulating physics, it might be able to solve other types of intractable problems as well. The power comes from that shimmering, untouchable object the qubit. The probabilities are built in. Embodying a superposition of states gives the qubit more power than the classical bit, always in only one state or the other, zero or one, “a pretty miserable specimen of a two-dimensional vector,” as David Mermin says. ([Location 6046](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6046)) - Tags: [[pink]] - their power also makes them exceedingly difficult to work with. Extracting information from a system means observing it, and observing a system means interfering with the quantum magic. Qubits cannot be watched as they do their exponentially many operations in parallel; measuring that shadow-mesh of possibilities reduces it to a classical bit. Quantum information is fragile. The only way to learn the result of a computation is to wait until after the quantum work is done. ([Location 6109](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6109)) - Tags: [[pink]] - It was John Wheeler who left behind an agenda for quantum information science—a modest to-do list for the next generation of physicists and computer scientists together: “Translate the quantum versions of string theory and of Einstein’s geometrodynamics from the language of continuum to the language of bit,” he exhorted his heirs. 
“Survey one by one with an imaginative eye the powerful tools that mathematics—including mathematical logic—has won … and for each such technique work out the transcription into the world of bits.” And, “From the wheels-upon-wheels-upon-wheels evolution of computer programming dig out, systematize and display every feature that illuminates the level-upon-level-upon-level structure of physics.” And, “Finally. Deplore? No, celebrate the absence of a clean clear definition of the term ‘bit’ as elementary unit in the establishment of meaning.… If and when we learn how to combine bits in fantastically large numbers to obtain what we call existence, we will know better what we mean both by bit and by existence.” This is the challenge that remains, and not just for scientists: the establishment of meaning. ([Location 6126](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6126)) - Tags: [[pink]] - Edgar Allan Poe, following Babbage’s work eagerly, saw the point. “No thought can perish,” he wrote in 1845, in a dialogue between two angels. “Did there not cross your mind some thought of the physical power of words? Is not every word an impulse on the air?” Further, every impulse vibrates outward indefinitely, “upward and onward in their influences upon all particles of all matter,” until it must, “in the end, impress every individual thing that exists within the universe.” Poe was also reading Newton’s champion Pierre-Simon Laplace. “A being of infinite understanding,” wrote Poe, “—one to whom the perfection of the algebraic analysis lay unfolded” could trace the undulations backward to their source. ([Location 6181](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6181)) - Tags: [[pink]] - The universe, which others called a library or an album, then came to resemble a computer. 
Alan Turing may have noticed this first: observing that the computer, like the universe, is best seen as a collection of states, and the state of the machine at any instant leads to the state at the next instant, and thus all the future of the machine should be predictable from its initial state and its input signals. The universe is computing its own destiny. Turing noticed that Laplace’s dream of perfection might be possible in a machine but not in the universe, because of a phenomenon which, a generation later, would be discovered by chaos theorists and named the butterfly effect. ([Location 6220](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6220)) - Tags: [[pink]] - A useful term of art emerged from computer science: namespace, a realm within which all names are distinct and unique. The world has long had namespaces based on geography and other namespaces based on economic niche. You could be Bloomingdale’s as long as you stayed out of New York; you could be Ford if you did not make automobiles. ([Location 6439](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6439)) - Tags: [[pink]] - The information produced and consumed by humankind used to vanish—that was the norm, the default. The sights, the sounds, the songs, the spoken word just melted away. Marks on stone, parchment, and paper were the special case. It did not occur to Sophocles’ audiences that it would be sad for his plays to be lost; they enjoyed the show. Now expectations have inverted. Everything may be recorded and preserved, at least potentially: every musical performance; every crime in a shop, elevator, or city street; every volcano or tsunami on the remotest shore; every card played or piece moved in an online game; every rugby scrum and cricket match. Having a camera at hand is normal, not exceptional; something like 500 billion images were captured in 2010. YouTube was streaming more than a billion videos a day. 
([Location 6543](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6543)) - Tags: [[pink]] - It is finally natural—even inevitable—to ask how much information is in the universe. It is the consequence of Charles Babbage and Edgar Allan Poe saying, “No thought can perish.” Seth Lloyd does the math. He is a moon-faced, bespectacled quantum engineer at MIT, a theorist and designer of quantum computers. The universe, by existing, registers information, he says. By evolving in time, it processes information. How much? To figure that out, Lloyd takes into account how fast this “computer” works and how long it has been working. Considering the fundamental limit on speed, 2E/πℏ operations per second (“where E is the system’s average energy above the ground state and ℏ = 1.0545 × 10^−34 joule-sec is Planck’s reduced constant”), and on memory space, limited by entropy to S/k_B ln 2 (“where S is the system’s thermodynamic entropy and k_B = 1.38 × 10^−23 joules/K is Boltzmann’s constant”), along with the speed of light and the age of the universe since the Big Bang, Lloyd calculates that the universe can have performed something on the order of 10^120 “ops” in its entire history. Considering “every degree of freedom of every particle in the universe,” it could now hold something like 10^90 bits. And counting. ([Location 6551](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6551)) - Tags: [[pink]] - In 1963, reading the warnings of the president of the American Historical Association, Eisenstein found herself agreeing that the profession faced a crisis, of sorts. But she felt Bridenbaugh had it exactly backward. He thought the problem was forgetfulness: “As I see it,” he said dramatically, “mankind is faced with nothing short of the loss of its memory, and this memory is history.” Eisenstein, looking at the same new information technologies that so troubled older historians, drew the opposite lesson. 
The past is not receding from view but, on the contrary, becoming more accessible and more visible. “In an age that has seen the deciphering of Linear B and the discovery of the Dead Sea Scrolls,” she wrote, “there appears to be little reason to be concerned about ‘the loss of mankind’s memory.’ There are good reasons for being concerned about the overloading of its circuits.” As for the amnesia lamented by Bridenbaugh and so many of his colleagues: This is a misreading of the predicament confronting historians today. It is not the onset of amnesia that accounts for present difficulties but a more complete recall than any prior generation has ever experienced. Steady recovery, not obliteration, accumulation, rather than loss, have led to the present impasse. From her point of view, a five-centuries-old communications revolution was still gathering momentum. How could they not see this? ([Location 6609](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6609)) - Tags: [[pink]] - Deluge became a common metaphor for people describing information surfeit. There is a sensation of drowning: information as a rising, churning flood. Or it calls to mind bombardment, data impinging in a series of blows, from all sides, too fast. Fear of the cacophony of voices can have a religious motivation, a worry about secular noise overwhelming the truth. T. S. Eliot expressed that in 1934:           Knowledge of speech, but not of silence;           Knowledge of words, and ignorance of the Word.           All our knowledge brings us nearer to our ignorance,           All our ignorance brings us nearer to death,           But nearness to death no nearer to GOD. 
([Location 6642](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6642)) - Tags: [[pink]] - The humanist and philosopher of technology Lewis Mumford, for example, restated it in 1970: “Unfortunately, ‘information retrieving,’ however swift, is no substitute for discovering by direct personal inspection knowledge whose very existence one had possibly never been aware of, and following it at one’s own pace through the further ramification of relevant literature.” He begged for a return to “moral self-discipline.” There is a whiff of nostalgia in this sort of warning, along with an undeniable truth: that in the pursuit of knowledge, slower can be better. Exploring the crowded stacks of musty libraries has its own rewards. Reading—even browsing—an old book can yield sustenance denied by a database search. Patience is a virtue, gluttony a sin. ([Location 6665](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6665)) - Tags: [[pink]] - One worker in the area was Siegfried Streufert, who reported in a series of papers in the 1960s that the relation between information load and information handling typically looked like an “inverted U”: more information was helpful at first, then not so helpful, and then actually harmful. One of his studies took 185 university students (all male) and had them pretend to be commanders making decisions in a tactical game. They were told: The information you are receiving is prepared for you in the same way it would be prepared for real commanders by a staff of intelligence officers.… You may instruct these intelligence officers to increase or decrease the amount of information they present to you.… Please check your preference: I would prefer to: receive much more information; receive a little more information; receive about the same amount of information; receive a little less information; receive much less information. No matter what they chose, their preferences were ignored.
The experimenter, not the subjects, predetermined the amount of information. Streufert concluded from the data that “superoptimal” information loads caused poor performance, “yet it should be noted that even at highly superoptimal information loads (i.e., 25 messages per 30-minute period), the subjects are still asking for increased information levels.” Later, he used similar methodology to study the effects of drinking too much coffee. ([Location 6696](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6696)) - Tags: [[pink]] - It comes to us instantly, or at light speed. It is a symptom of omniscience. It is what the critic Alex Ross calls the Infinite Playlist, and he sees how mixed is the blessing: “anxiety in place of fulfillment, an addictive cycle of craving and malaise. No sooner has one experience begun than the thought of what else is out there intrudes.” The embarrassment of riches. Another reminder that information is not knowledge, and knowledge is not wisdom. ([Location 6761](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6761)) - Tags: [[pink]] - Strategies emerge for coping. There are many, but in essence they all boil down to two: filter and search. The harassed consumer of information turns to filters to separate the metal from the dross; filters include blogs and aggregators—the choice raises issues of trust and taste. The need for filters intrudes on any thought experiment about the wonders of abundant information. When Dennett imagined his Complete Poetry Network, he saw the problem. “The obvious counterhypothesis arises from population memetics,” he said. “If such a network were established, no poetry lover would be willing to wade through thousands of electronic files filled with doggerel, looking for good poems.” Filters would be needed—editors and critics. 
“They flourish because of the short supply and limited capacity of minds, whatever the transmission media between minds.” When information is cheap, attention becomes expensive. For the same reason, mechanisms of search—engines, in cyberspace—find needles in haystacks. By now we’ve learned that it is not enough for information to exist. A “file” was originally—in sixteenth-century England—a wire on which slips and bills and notes and letters could be strung for preservation and reference. Then came file folders, file drawers, and file cabinets; then the electronic namesakes of all these; and the inevitable irony. Once a piece of information is filed, it is statistically unlikely ever to be seen again by human eyes. Even in 1847, Augustus De Morgan, Babbage’s friend, knew this. For any random book, he said, a library was no better than a wastepaper warehouse. “Take the library of the British Museum, for instance, valuable and useful and accessible as it is: what chance has a work of being known to be there, merely because it is there? If it be wanted, it can be asked for; but to be wanted it must be known. Nobody can rummage the library.” Too much information, and so much of it lost. An unindexed Internet site is in the same limbo as a misshelved library book. This is why the successful and powerful business enterprises of the information economy are built on filtering and searching. Even Wikipedia is a combination of the two: powerful search, mainly driven by Google, and a vast, collaborative filter, striving to gather the true facts and screen out the false ones. Searching and filtering are all that stand between this world and the Library of Babel. ([Location 6764](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6764)) - Tags: [[pink]] - We are today as far into the electric age as the Elizabethans had advanced into the typographical and mechanical age. 
And we are experiencing the same confusions and indecisions which they had felt when living simultaneously in two contrasted forms of society and experience. ([Location 6810](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6810)) - Tags: [[blue]] - “Today,” he wrote, “we have extended our central nervous systems in a global embrace, abolishing both space and time as far as our planet is concerned. Rapidly, we approach the final phase of the extensions of man—the technological simulation of consciousness, when the creative process of knowing will be collectively and corporately extended to the whole of human society.” ([Location 6816](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6816)) - Tags: [[blue]] - On the contrary, the idea of perfection is contrary to the nature of language. Information theory has helped us understand that—or, if you are a pessimist, forced us to understand it. “We are forced to see,” Palmer continues, that words are not themselves ideas, but merely strings of ink marks; we see that sounds are nothing more than waves. In a modern age without an Author looking down on us from heaven, language is not a thing of definite certainty, but infinite possibility; without the comforting illusion of meaningful order we have no choice but to stare into the face of meaningless disorder; without the feeling that meaning can be certain, we find ourselves overwhelmed by all the things that words might mean. ([Location 6904](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6904)) - Tags: [[blue]] - Then came Google. Brin and Page moved their fledgling company from their Stanford dorm rooms into offices in 1998. Their idea was that cyberspace possessed a form of self-knowledge, inherent in the links from one page to another, and that a search engine could exploit this knowledge. 
As other scientists had done before, they visualized the Internet as a graph, with nodes and links: by early 1998, 150 million nodes joined by almost 2 billion links. They considered each link as an expression of value—a recommendation. And they recognized that all links are not equal. They invented a recursive way of reckoning value: the rank of a page depends on the value of its incoming links; the value of a link depends on the rank of its containing page. Not only did they invent it, they published it. Letting the Internet know how Google worked did not hurt Google’s ability to leverage the Internet’s knowledge. ([Location 6972](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6972)) - Tags: [[blue]]
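The recursive reckoning of value described above can be sketched in a few lines. This is a minimal illustration of the idea, not Google's implementation: the four-page "web" is invented, and the damping factor of 0.85 is the value commonly associated with the published PageRank method.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.

    A page's rank is split evenly among its outgoing links; the rank of a
    page depends on the value of its incoming links, and the value of a link
    depends on the rank of its containing page -- the recursion in the text.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start everyone equal
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = damping * rank[p] / len(outs)
            for q in outs:                      # each link is a recommendation
                new[q] += share
        rank = new                              # iterate until ranks stabilize
    return rank

# A tiny invented web: page C receives the most incoming links.
web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
ranks = pagerank(web)
```

Here C, the most-linked page, ends up with the highest rank, and the ranks sum to 1, so they behave like a probability distribution over pages a random surfer might visit.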