Friday, November 11, 2016

the invention of science

I am behind, and in particular I have a backlog of Kindle books in a folder (okay, Kindle calls them "collections") that I keep specifically for Kindle books I've read but haven't yet transcribed notes from -- some for work, some for here, some for fiction projects.

On top of that, I had an ongoing series of thoughts about the election building up, mostly in the form of "hoo geez" and "you gotta be kiddin me" thought balloons popping up in response to things other people said as the election first loomed and then happened. One of the purposes of this weirdly multipurpose blog is to be a safety valve for social media, so that I vent here instead of there, so I was going to rail about people who ask their Trump supporting friends to please stop saying their nasty things in front of them (instead of, you know, actually confronting those friends on their racism, xenophobia, and other abominable beliefs - today's "stop telling racist jokes in front of me, teehee"), and about the "Bernie coulda won it, I tells ya" narrative, and the prematurity of all the other hot takes.

But I think I am, as you must be, too weary of reading postmortem analysis and reactions. Which is not to say I am not engaged - just the opposite, but the last thing I feel like doing right now is swatting down nonsense just to vent about it, because it doesn't feel like it would serve any therapeutic point this time.

I will say that my instinct as a historian says that the dumbest thing you can do is to marry yourself to some model of "what happened" that you read or devise in the first week or two after the election, because you're just going to weigh new data and new models against the one that you've "picked," even though the only reason you've picked it is, ultimately, because of the imagined need for an immediate explanation -- the irrational belief that an incomplete or misleading explanation today is better than an accurate explanation tomorrow. This is a small part of what makes teaching history to people so hard. Among other things, history sometimes includes things they have lived through, and that gives them a remarkable capacity to believe that living as a bystander to an event with very, very small access to data makes them an expert, resistant to the view of the event that has developed in retrospect. This is, of course, a bigger problem than just in the realm of getting people to understand history, but everybody walks down different hallways of the house, and this is one of mine.

Moving away from the present, away from politics, let me start by finishing off this blog entry on a history of the scientific revolution.

When I started at Hampshire - before I started, actually, at the open house or the pitch at the interview or somewhere in that process - one of the ways they explained the whole approach to things there was by explaining the Div III, which is a mandatory thesis-like project. "See, because there are no majors at Hampshire, and no minors, you can take the things you're interested in and pursue them jointly instead of separately. You don't have to major in psychology and minor in art history. You can do your Div III on the history of the impact of psychological ideas on the visual arts."

I made up that example, but it's a representative one. It's a decent pitch, especially aimed at high school kids (and the parents thereof) who have probably been starved for any kind of serious intellectual or creative stimulus, at least within the bounds of their classrooms.

In the end, though, I had trouble living it out. I wanted to study psychology AND art history. Not just their intersection. I wanted to know about the parts of psychology that had absolutely no impact on art history, and the parts of art history that had nothing to do with psychology. The freedom to cross the streams was well and good, but I quickly took it for granted and wanted to be able to not have to cross the streams.

For instance, I wound up "majoring" - not that Hampshire has majors - in pop culture, but had strong interests in cognitive science and the then-novel study of online culture and communities. What I had zero interest in was combining any of those things for my Div III, which was -- at least in draft form, since I dropped out and transferred before completing it -- a long and rambling thesis on representations of superheroes in comic books and other media, which if it was grounded in anything was grounded more in literary theory and gender studies than cognitive science (though I led up to it with a fifty-page research paper on the role of nostalgia in the history and appeal of Batman, which is at least cogsci-adjacent), and had no overlap with the work I had done on online communities.

So like I've covered before, I have a variegated academic background, and as a grad student in an interdisciplinary program, I took a number of courses on the history and philosophy of science -- if there was such a thing as a "graduate minor," that would pretty much be mine, not because HPS and religious studies are two key branches in the history of ideas -- which would be a valid reason -- but simply because HPS also interested me, even though at that point in my academic career I was supposed to be narrowing my interests, not expanding them.

So it's something I try to keep up with, albeit not at as high a priority as religious studies.

David Wootton's The Invention of Science: A New History of the Scientific Revolution was published with a good deal of fanfare last fall. I'm generally reluctant to jump on the New Release section, for a number of reasons:

1: New releases on scholarly topics are, when they're good work, ultimately of the most value to working scholars in those areas, because those are the readers who are versed in the conversation. After all, if they have something significant to say, other scholars working in that area are going to need to respond to it, and won't have had time yet.

2: New releases on broad historical topics don't always have a particularly good reason to exist, apart from everyone in the field already having read the canonical works on the topic, and those works maybe being out of print. The existence of a new book on the topic is not evidence of the existence of new material. This is a constant source of frustration for me when I write about history, because publishers often stipulate in their style guide that at least half of your sources need to have been published within the last X years, where X is a fairly small number; it is very rarely the case with history that half of the good sources, or a tenth of the good sources, are that recent, especially when you're writing for a general audience and have no need to cite recent journal articles on minor points. (If you're wondering: yes, as a result I have to pad the bibliography with inferior recent work, or recent work that is good but not as directly related to what I'm writing about, in order to balance out the number of older but necessary sources I use.)

3: There is a ... boy, how do I not sound like an asshole here. There is a certain kind of reader I don't want to be. A certain kind of thinker I don't want to be, who's read Malcolm Gladwell and Jared Diamond, both of whom are moderately to completely awful, but nothing that's more than a couple decades old. That's the kind of reading diet that leads to a particularly shallow understanding of things -- but like Gladwell's work in general, to pick on him a little more, it's not a diet that's really designed for actual understanding so much as the satisfied feeling of the illusion of understanding, a junk food eureka.

I made an exception here because the reviews were enough to convince me that #2 and #3 weren't concerns, but that sort of puts a responsibility on me -- even if I'm the only one who perceives or cares about that responsibility -- to keep track of the Invention of Science conversation in the next few years, so I don't make the same mistake I discussed above in my discussion of politics.

Wootton's book traces what he calls the invention of modern science, "between 1572, when Tycho Brahe saw a nova, or new star, and 1704, when Newton published his Opticks... There were systems of knowledge we call 'sciences' before 1572, but the only one which functioned remotely like a modern science, in that it had sophisticated theories based on a substantial body of evidence and could make reliable predictions, was astronomy, and it was astronomy that was transformed in the years after 1572 into the first true science."

This idea, that science before this period was distinct from modern science, is key, and is part of a broader shift in thinking that affected not just the physical sciences but all scholarly pursuits. For that matter, it's not a coincidence that westerners don't really talk about "fiction" as such until the Scientific Revolution, when "nonfiction" becomes rigorously defined. I have banged my head against the wall repeatedly trying to get people to understand this. Obviously I'm not saying that no one wrote any fictitious stories before a certain point in time -- but the modern reader, who thinks of one set of shelves in the library as nonfiction, and another as fiction, is a fairly recent creature. When people describe the Bible or stories in the midrash, for instance, as "fiction," they're imposing a modern frame that didn't exist for the people who created and first received those texts. This isn't splitting hairs, because it's just as important to oppose the fundamentalists who insist that, if the Bible isn't fiction, it is therefore literally true. Again: it's not a coincidence that this claim became popular, and became politicized, at the point in time that it did. Both of these claims -- that the Bible is fiction, that the Bible is literally true -- are implicitly based on false premises about the nature of sacred texts in antiquity.

To use another example, history as we know it -- that is, the field of history as we know it -- is remarkably recent, in the sense that the idea that the goal of the historian should be to accurately record or recount the details of historical events, drawing on evidence wherever possible, dates to about the Enlightenment, at least in the West. This is kind of fucking crazy, and I feel like people who haven't taken historiography in grad school don't fully believe me when I talk about it, but it goes to Wootton's point. We take scientific thinking for granted now, to an amazing degree -- when you read fantasy novels and whatnot, a ridiculous level of scientific thinking is often ascribed to members of civilizations who would not necessarily be in a position to have developed it, for no apparent reason other than the fact that basic elements of this thinking have so permeated modern thought that it is taken for granted.

The layman often thinks of the history of science as a series of discoveries, rather than creations of different ways of thinking -- and doesn't usually have a good way to explain that most of the scientists famous now for major contributions also pursued wrong avenues (like Newton's extensive work in alchemy) or had no trouble accepting things that would be easily disproven (old beliefs in biology are full of this, and I don't just mean beliefs about race or gender that are motivated by politics and power structure -- there was a simple lack of rigor and attention to detail, by modern standards). If you think science is just a timeline of discoveries, you think the only difference between a clever person in 2016 and a clever person in 1316 is that the clever person in 1316 lives in a world where a bunch of shit hasn't been discovered yet, but that the two basically see their respective worlds the same way and deal with new information the same way, and this is enormously wrong. Even our ability to realize this is not an ability available to all the clever people in history.

One of the ideas that is ingrained in the modern mind that Wootton tackles straight off the bat is the idea of scientific progress. People may debate whether or not history itself "progresses" -- after all, everyone bitches and moans about trivial horseshit like "oh, the kids should still learn cursive" because they're attached to the first Polaroid of the world they watched come into focus -- but everyone today basically sees science as constantly moving forward.

This is a very new idea.

As Wootton points out, until the period he's talking about, not only did people not conceive of "the history of humanity ... as a history of progress," but the rate of technological advancement wasn't just slow, it sometimes went backwards. "The Romans were amazed by stories of what Archimedes had been able to do; and fifteenth-century Italian architects explored the ruined buildings of ancient Rome convinced that they were studying a far more advanced civilization than their own." Technologies were developed, lost, forgotten. This is inconceivable now -- so much so that the "lost technology" trope, when it pops up in popular culture, refers not to sophisticated architecture but to ancient electronics, pyramid magic, and other modern or futuristic technologies transposed to an ancient setting.

Wootton also points to the way Shakespeare and his contemporaries depict ancient Rome as technologically identical to Renaissance Europe, with mechanical clocks and nautical compasses. In Borges's words, "all characters are treated as if they were Shakespeare's contemporaries. Shakespeare felt the variety of men, but not the variety of historical eras. History did not exist for him." As Wootton points out, this is a misleading charge -- Shakespeare was well versed in history as history was understood in his day. What he lacked was an understanding of historical change of the sort that we now treat as synonymous with "history."

"We might think that gunpowder, the printing press, and the discovery of America in 1492 should have obliged the Renaissance to acquire a sense of the past as lost and gone for ever, but the educated only slowly became aware of the irreversible consequences that flowed from these crucial innovations. It was only with hindsight that they came to symbolize a new era; and it was the Scientific Revolution itself which was chiefly responsible for the Enlightenment's conviction that progress had become unstoppable. By the middle of the eighteenth century, Shakespeare's sense of time had been replaced by our own."

The term "the Scientific Revolution" is itself one that can be interrogated, and Wootton explains that it's only in the 20th century -- the mid-20th century, at that -- that the term came to mean the creation of modern science as exemplified by Newton's physics. The inspiration for the term is not the American or French Revolution -- both of which were referred to as revolutions as they occurred -- but the Industrial Revolution. Like the Scientific Revolution, the Industrial Revolution was named after the fact -- and although it occurred later, it was named first. Wootton correctly points out that any term introduced by historians after the fact is a term that will be challenged by later historians after THAT fact, which is one of those things that will make your eyes glaze over and have you unsubscribing from mailing lists. But anyway.

"In medieval universities, the core curriculum consisted of the seven liberal 'arts' and 'sciences': grammar, rhetoric, and logic; arithmetic, geometry, music, and astronomy." There is some digression about what was meant by "art" and "science" at the time, the upshot of which is that all seven were considered both, whereas philosophy and theology were sciences but not arts. Anyway. "Moreover, these sciences were organized into a hierarchy: the theologians felt entitled to order the philosophers to demonstrate the rationality of belief in an immortal soul; the philosophers felt entitled to order the mathematicians to prove that all motion in the heavens is circular ... A basic description of the Scientific Revolution is to say that it represented a successful rebellion by the mathematicians against the authority of the philosophers, and of both against the authority of the theologians."

Da Vinci, for instance, said, "No human investigation can be termed true science if it is not capable of mathematical demonstration. If you say that the sciences which begin and end in the mind are true, that is not to be conceded, but is denied for many reasons, and chiefly the fact that the test of experience is absent from the exercises of the mind, and without it nothing can be certain." The reason we no longer class philosophy and theology as sciences is that we think of science as dealing with not just theory but experiment: testable, verifiable, observable, repeatable results. To return to the refrain: this is a relatively new idea.

One of the most interesting parts of Wootton's book is something I'm still pondering: "before Columbus discovered America in 1492, there was no clear-cut and well-established idea of discovery; the idea of discovery is, as will become apparent, a precondition for the invention of science."

Now, maybe I don't need to point this out, but the idea of "Columbus discovering America" is not important to Wootton's claim here: that is, it is not important that other non-Americans came to the continent before he did, nor that, since there were people fucking living here, none of these non-Americans actually discovered the damn thing. What's key is the European world absorbing the idea of Columbus discovering America and subsequently engaging with America: the phenomenon, so to speak, of "Columbus discovered America." There are other books on the physical and cultural effects of Columbus's voyages; this isn't that. This is about the scientific and intellectual reverberation of the concept of "discovery."

(Wootton also argues that, while everyone correctly dismisses the "Columbus proved the Earth was round" nonsense sometimes taught in elementary schools, the voyages to America did nevertheless change the European conception of the globe, by proving the existence of antipodean land masses, which were believed to be impossible. The Columbian contact did change the understanding of the Earth, then, just not in as simplistic a way as changing it from flat to round.)

"It is discovery itself which has transformed our world," Wootton points out, "in a way that simply locating a new land mass could never do. Before discovery history was assumed to repeat itself and tradition to provide a reliable guide to the future; and the greatest achievements of civilization were believed to lie not in the present or the future but in the past, in ancient Greece and classical Rome. It is easy to say that our world has been made by science or by technology, but scientific and technological progress depend on a pre-existing assumption, the assumption that there are discoveries to be made. ... It is this assumption which has transformed the world, for it has made modern science and technology possible."

Wootton backs up his proposition of the 1492 (essentially) invention of the idea of discovery with linguistic evidence, investigating the various Romance language terms for discovery and related concepts, and how they were used. This is the area where I expect to see other historians responding, refuting, or amplifying -- his evidence is compelling enough, but I'm in no position to tell whether he's cherry-picked it, how much he's interpreting it to favor his argument, etc. See what I mean about the problem with recent scholarly works (at least when they're outside your area of expertise)?

This is a key part of his "discovery" argument: "Although there were already ways of saying something had been found for the first time and had never been found before, it was very uncommon before 1492 for people to want to say anything of the sort, because the governing assumption was that there was 'nothing new under the sun.' The introduction of a new meaning for descobrir implied a radical shift in perspective and a transformation in how people understood their own actions. There were, one can properly say, no voyages of discovery before 1486, only voyages of exploration. Discovery was a new type of enterprise which came into existence along with the word."

Wootton makes a distinction between "discovery" and words like "boredom" and "embarrass," in that we accept that before there was a word for it, people felt bored; before "embarrass" acquired its modern meaning in the 19th century, people could feel embarrassed. But "discovery" is "'an actor's concept' ... you have to have the concept in order to perform the action ... So although there were discoveries and inventions before 1486, the invention and dissemination of a word for 'discovery' marks a decisive moment, because it makes discovery an actor's concept: you can set out to make discoveries, knowing that is what you are doing."

There is an interesting digression about the great sociologist of science Robert K. Merton, to whom we owe the phrases "role model," "self-fulfilling prophecy," and "unintended consequence," and who wrote an entire book about the phrase "standing on the shoulders of giants." What Merton was unable to popularize was the idea of multiple discovery: the idea that "there are nearly always several people who can lay claim to a discovery, and that where there are not this is because one person has so successfully publicized his own claim that other claims are forestalled."

"We cannot give up the idea that discovery, like a race, is a game in which one person wins and everyone else loses. The sociologist's view is that every race ends with a winner, so that winning is utterly predictable. If the person in the lead trips and falls, the outcome is not that no one wins but that someone else wins. In each race there are multiple potential winners." Think of how many time travel stories revolve around, I don't know, going back in time to keep somebody from inventing the time machine. That's taking the Great Man view of history -- the view that prevailed in the 19th century, when history was portrayed as the result of specific heroic egos. Other than in fiction and, perhaps, biography, it is not a view that is looked on kindly anymore -- no one person, no one thousand people, can be said to be uniquely responsible for the major events of history, not because that person did not do the things they did, not because those things lacked significance, but because other people would have done other significant things to move history in largely the same way. If you go back in time and kill Li'l Lincoln when he's a toddler, you don't wake up in a 2016 that still has slavery -- you simply find out that slavery ended under some other presidency instead. We pretty much accept that -- like I said, except in fiction -- but this centrality of the individual in discovery persists, even when, as Wootton points out, numerous people independently discovered the sine law of refraction, the law of fall, Boyle's law, oxygen, the telescope.

I am skipping over lots and lots of detail about the effects of the telescope and the microscope, in part because it doesn't excerpt well.

Wootton also goes into a dispute with relativists, especially "the strong programme," which I think is too inside baseball to get into here, especially since as someone not employed by a university, I have no reason to be invested in the fight. (Okay, that's not entirely true. The Science Wars are important, and impact not only the work I do in educational publishing and reference books, but overall science literacy and public policy. But Wootton's comments on it are not the part of this book that will stay with me the longest, even as they frame the rest of it.) As a fan of Kuhn but not of many of the historians of science who have followed in Kuhn's footsteps -- the relativists, in other words -- but who has also been accused of relativism because I disagree with the way history is constructed by strict realists, I think I agree with Wootton when he says that "this book will look realist to relativists and relativist to realists."
