Explaining the Venn diagram of evangelical, charismatic, Pentecostal, born again, literalist, inerrantist, and fundamentalist is a complicated thing, and beyond the scope of this blog entry, but one of the things that has happened since midcentury is that "evangelical Christianity" has become synonymous in popular discourse with conservative (and especially ultra-conservative) fundamentalist- and literalist-leaning Christianity, and in particular the white churches fitting that description -- but this describes neither evangelical Christianity's origins nor all of evangelical Christianity (or even all of white evangelical Christianity) today.
After all, evangelical Christianity began more or less in the early 18th century -- or to consider it another way, it is more than twice as old as the fundamentalism with which it is now so closely associated. However, pointing that out doesn't mean fundamentalism has no place in evangelicalism, either; many of the characteristics now central to Christianity overall didn't develop for centuries, and this is just the way religion works.
But it raises the question: how did it get here? (where is that large automobile?)
Molly Worthen's Apostles of Reason: The Crisis of Authority in American Evangelicalism provides some of the answers. It's not a comprehensive history of American evangelicalism, nor even of American evangelicalism in the period of time it covers -- it focuses principally on white evangelicalism, for one thing -- and it seeks to trace the rise of the politicized conservatives among them, following evangelicals from the late 19th century to more or less the present day.
One of the themes that emerges quickly is that of recurring clashes between -- let's just use this terminology -- conservatives on one side, and liberals and moderates on the other, resulting in the backing down or attrition of the liberals or moderates, and the subsequent strengthening of the conservative position. One of the most emblematic examples of this for me involves faculty members at an evangelical college raising issues with the requirement that they pledge to uphold inerrancy -- what exactly does inerrancy mean here, the moderates asked? It means what it means, the conservatives said, that the Bible overrules reason, so there's no point thinking too much about it; the conservatives wound up resigning over the debate, but the moderates didn't really "win" as a result, because of the way these "losses" feed the conservative evangelical persecution complex and the need for enemies.
When I said that explaining that Venn diagram is a complicated thing, though, I wasn't blowing smoke. Just defining those terms is a tough thing. "The term evangelical," Worthen points out, "is so mired in adjectives and qualifiers, contaminated by politicization and stereotype, that many commentators have suggested it has outlived its usefulness. In America alone, the broad tent of evangelicalism includes a definition-defying array of doctrines, practices, and political persuasions. Perhaps no label is elastic enough to contain a flock that ranges from churchly Virginia Baptists to nondenominational charismatics in Los Angeles. At the same time, the mudslinging of the 1990s culture wars turned many conservative American Protestants away from a label now synonymous in the media with right-wing radicalism and prejudice. Yet we are stuck with it. Believers and atheist scholars, politicians and pundits, all continue to use the word evangelical. To observers and insiders alike there still seems to be a there there: a nebulous community that shares something, even if it is not always clear what that something is."
Nailing down what "evangelical" means is sort of like nailing down what "Christian" means -- both insiders and outsiders think they know, and yet the reality is that there are many different groups answering to that name, sharing some common ground but disagreeing about doctrine, practice, and in-group membership, and often unaware of how deep this disagreement runs or how many groups unlike them there are. This tends to be especially problematic with evangelicalism, since at least Christianity as a whole has actual named denominations and movements, whereas the differences within evangelicalism are not so clearly labeled, and not every evangelical church or group belongs to a larger ecumenical organization.
"... The trouble is that evangelicals differ widely in how they interpret and emphasize 'fundamental' doctrines. Even with the 'born again experience,' supposedly the quintessence of evangelicalism, is not an ironclad indicator. Some evangelicals have always viewed conversion as an incremental process rather than an instantaneous rebirth (and their numbers may be increasing)."
Evangelicalism has historical roots in European Pietism; "catchphrases like 'Bible-believing' and 'born again' are modern translations of the Reformers' slogan sola scriptura and Pietists' emphasis on internal spiritual transformation."
"Three elemental concerns unite [evangelicals]: how to repair the fracture between spiritual and rational knowledge; how to assure salvation and a true relationship with God; and how to solve the tension between the demands of personal belief and the constraints of a secularized public square." These are pretty broad concerns, and not all Christian groups with these concerns are evangelical, certainly; Worthen is just attempting to delineate the common ground among evangelicals without invoking specific terminology like "born-again" which, as pointed out, is treated differently in different groups. But as the subtitle of the book indicates, what she sees here is an overall concern with "problems of intellectual and spiritual authority."
This is one of the most important parts of evangelicalism: "American evangelicals have a strong primitivist bent. They often prefer to think their faith indistinguishable from the faith of Christ's apostles, and scoff at history's claims on them. But they are creatures of history like everyone else, whether they like it or not."
In what we now think of as the classic conservative evangelical community, this "primitivist bent" is not just integral to group identity, it's the basis of evangelicals' criticism of other Christians, which in turn insulates them from both criticism and analysis. Fundamentalism, for these folks, was not introduced at the dawn of the 20th century, it was revived at that point, returning Christian practice to the only form that true Christian practice could ever have taken. What I'm describing is, of course, the conservative view -- but the conservative view has become particularly important since, as I've talked about before, it became so dominant in religious discourse, influencing the way liberal Christians talked about their faith and non-Christians' views of Christianity.
Worthen's three keys again, rephrased: "three questions unite evangelicals: how to reconcile faith and reason; how to know Jesus; how to act publicly on faith after the rupture of Christendom."
Despite modern evangelicals' originalist claims, they act fairly divorced from history, and for that reason they often -- not just the conservatives -- forget just how close their origins are to the Protestant Reformation, which must seem to them like the distant past. But under Catholicism, certain questions -- especially about the role of faith in the public sphere -- had, if not easy answers as such, at least well-established arguments. Much as the formation of Christianity meant reevaluating Jewish thought and scripture to ask, what do we take, what do we leave behind, and what do we reinterpret, Protestant factions had to do likewise with the previous millennium-plus of Catholic thought. Some of it was dispensed with immediately because it was the source of the rift, but Catholic thought by that point in time was incredibly broad, and Catholic theology was built on generations upon generations of commentators. How much of doctrine went away? How much of that theology was no longer valid?
That too is too complicated an issue for this blog entry, but what matters is the big canvas that it created. As far as the issue of authority goes, for instance, for various reasons -- particularly the political power and political relationships of the Catholic Church at the time of the Reformation, as well as the need to protect the interests of Reformers, and the political interests of the states that flipped Protestant -- most of the early Protestant religions were state churches, affiliated with national governments. The antecedents to evangelicalism begin with a reaction against that, which goes a long way toward explaining evangelicals' embrace of home churches, unaffiliated churches, and churches that are only loosely affiliated with one another rather than being led by formal ecumenical hierarchies.
"Pietist preachers critiqued the state churches that emerged from the Reformation as overly formal and cerebral. They called on believers to study the Bible and strive for personal holiness." That could damn near describe evangelical churches today, but Worthen is describing the late seventeenth century.
As for fundamentalism, it grew out of "The Fundamentals," a series of pamphlets written in the 1910s by faculty at Princeton's seminary, controlled at the time by theological conservatives. The Princeton conservatives considered inerrancy -- the idea that the Bible is without error, which is not the same as the idea that the Bible is literally true, but the line between an inerrantist and a literalist can be fine, and laymen do not always pause to consider the difference, however important its implications -- to be "fundamental" to the Christian faith, and their defense of it was a reaction to the Biblical criticism coming out of the German universities. Generally moderate, even conservative by today's standards, the "modernist" approach to Biblical criticism seemed impious to these proto-fundamentalists because it did not take the Bible's inerrancy for granted. Keep in mind that although the scholars the conservatives took issue with certainly included non-believers and scholars challenging basic premises of religious belief, they also included numerous scholars who were themselves religious but simply didn't believe it was necessary to consider the Bible an error-free account of history, or to ignore the obvious parallels between the Old Testament and other Semitic religions, or to believe the traditional views that Moses had written the Torah and the Gospels had been written shortly after the death of Jesus, as first- or secondhand accounts. Scholars, again many of them religious (including clergy), were willing to question whether Jesus's miracles really occurred, not to mention what to make of the Earth being created in "six days." The basic dispute, from a conservative perspective especially, was whether the Bible overruled reason or vice versa.
(There is an extraordinarily large problem with taking the conservative side even apart from rejecting reason, which is the problem with "sola scriptura": despite hundreds of years of claims to the contrary, the Bible cannot be understood "by itself." Many readers certainly believe that, having read it -- or some part of it -- they come to a conclusion about what it means, but they do so because they bring to it preconceptions, viewpoints and perspectives impacted by religious teachings they have absorbed prior to their reading, and so on. Other readers claim that this problem can be alleviated by praying for guidance, and yet history clearly shows that the Bible and individual passages of the Bible have been interpreted in many different ways at many different times -- not just opportunistically, but by sincerely pious people who we must assume similarly prayed for guidance. In other words: whatever your religious beliefs, to make the claim that the products of human reason must be weighed against what it says in the Bible is nonsense, because "what it says in the Bible" cannot be ascertained except as a product of human reason. Indeed, belief in God, God's creation of humankind, and the divine guidance of the writing of the Bible should carry with it belief that this is one of the ends to which reason should be used.)
This is a dispute that continues in much of conservative Christianity today, obviously, and the dumbing down of Christianity, the sapping of religious literacy, has not helped. One reason conservatives are so convinced of the originalism of their Christianity is because some hundred years ago, they began constructing their echo chamber, expelling moderates and liberals from their churches, seminaries, and periodicals when they had the power to do so, starting their own when they did not. An echo chamber, a persecution complex, and a sense of superiority arising from the conviction that they were practicing the only real form of Christianity while everyone else readied themselves for the coals: that is the essence of conservative evangelical Christianity as it developed in the United States over the course of the 20th century.
(Even the pamphlets did not uniformly defend inerrancy, it's worth noting, despite being published by an inerrantist group and inspiring a largely inerrantist movement. They defended "traditional interpretations of the Bible," which is a slightly broader category, and the authors included some theologians who would soon, as The Fundamentals gave way to fundamentalism, be positioned more at the moderate edge of conservatism, notably James Orr, who argued stridently against inerrancy, but also against the excesses of German modernism. Fundamentalism wasn't defined by The Fundamentals so much as by the activity that surrounded and followed them, and the pamphlets themselves -- written by theologians, typically subtler and more nuanced in their arguments than either the laity or, for that matter, less intellectually-inclined clergy -- represent a wider range of theological positions than would later be tolerated.)
(A quick Worthen quote that goes far in explaining the difference between inerrancy and literalism: "Inerrantists often acknowledged scripture's inconsistencies, such as multiple, conflicting accounts of the same event. They freely admitted that when God inspired the biblical authors to set down his perfect revelation, he did not place them in a divine crow's nest peering over space and time. They asserted that simply because, in our finite judgment, the evangelists seem to disagree about how many times the cock crowed before Peter denied Jesus, there is no reason to conclude that the first chapter of Genesis is all metaphor, that the Marys did not find the tomb empty -- or, more fundamentally, that scriptura could truly stand sola, that the plain meaning of God's word somehow depended on human authorship or interpretation.")
(Both literalism and inerrancy are difficult to impossible theological positions to defend. But literalism approaches true indefensibility: the Bible cannot be literally true, because it contains too many contradictions. It is easier to argue that the Bible is "without error," and construct it as a series of divine revelations that may contain details that conflict with the details of other revelations in that series, but as moderate evangelical theologians themselves have pointed out, this requires incredibly detailed discussions of what exactly "inerrancy" means, such that the inerrancy claim becomes pointless. And yet.)
As Worthen points out, "Pastors who encountered the careful critiques of [theologians like] Orr second- or third-hand rarely preserved their prudence and intellectual agility. As conflict against modernists intensified in the 1920s and 1930s, fundamentalists lost interest in nuance. They refashioned a once-subtle doctrine into a shield to protect the Bible from the revisions of blasphemers. Orr and the scholars of old Princeton had understood themselves as explicating centuries of Christian widsom in modern terms. They held steady at the siren call of sola scriptura -- that problematic promise that every believer could grasp scripture's plain meaning -- by binding themselves to the mast of a venerable theological tradition. Later fundamentalists, however, became polemicists rather than apologists. The difference is subtle but crucial. Winning the war against modernism became more important than illuminating orthodoxy. Inerrancy came to represent not only a set of beliefs about creation or the reality of Jesus's miracles, but the pledge that human reason must always bow to the Bible. As fear of modernist theology and new science began to infect a wide range of Protestant churches, this new variety of fundamentalist deployed inerrancy as a simple shibboleth to separate sheep from the goats. It was no longer a doctrine with historical roots or an ongoing debate among theologians. Inerrancy was common sense."
Emphasis mine. Surely you have had an argument with a fundamentalist at some point -- even an atheist parroting fundamentalist views as an argument against religion -- who has made essentially these "points," since they have become so commonplace.
To Worthen's basic focus:
"From the beginning, [evangelicals'] concerns were existential and epistemological: They had to do not just with points of belief, but with how Christians accounted for human knowledge, how they lived in the world, and how they claimed to 'know' the divine in their minds and hearts. While many ancient Christians assented to the basic doctrines that scholars mark as 'evangelical,' that assent took on a different character after the seventeenth-century rebirth of reason and the invention of our present-day notions of 'religious' and 'secular.' The sundry believers who share the evangelical label have all lacked an extrabiblical authority powerful enough to guide them through these crises. Roman Catholics obey the Vatican (more or less). Liberal Protestants tend to allow the goddess of reason to rule over the Bible (or to rule, relatively untroubled, in her separate sphere). Evangelicals claim sola scriptura as their guide, but it is no secret that the challenge of determining what the Bible actually means finds its ultimate caricature in their schisming and squabbling. They are the children of estranged parents -- Pietism and the Enlightenment -- but behave like orphans. This confusion over authority is both their greatest affliction and their most potent source of vitality."
Out of the second generation of fundamentalists mentioned above came the neo-evangelicals (such as Billy Graham). "'Neo-evangelical' would come to describe a self-aware intellectual movement of pastors, scholars, and evangelists within the conservative Protestant community roughly (but not entirely) contained within the NAE. ... while evangelical connoted a broad swath of conservative Protestants averse to the old fundamentalist model of feuding separatism but still eager to defend the authenticity of religious experience and the authority of the Bible, neo-evangelical became a more precise label, embraced by a small circle of self-appointed leaders."
The NAE is the National Association of Evangelicals, formed in 1942. It is not a denomination but an association of evangelicals that includes both evangelical denominations (dozens now) and nondenominational churches, and it later (in the 1970s) sponsored the New International Version translation of the Bible. The NAE's founders had concrete problems to address, very similar to the concerns of today's conservatives -- increasing evangelical representation among the chaplains who served in the armed forces, fighting the influence of modernism on public school curriculums, guiding Sunday School curriculums, and increasing the presence of conservative voices on religious radio, which was then dominated by liberal denominations, believe it or not. The fight against modernism and the defense of "traditional Biblical beliefs" informed all of these concerns, and just as you hear from the radical Right today, the threat posed is one of corruption from within: "Without a firm defense of Biblical inerrancy ... America would fall to enemies within and without, as had imperial Rome. Western civilization was sick with secularism and socialism, the modern spores that had overrun their hosts in the Soviet Union. The Kingdom of Hell was at hand."
Furthermore, in the view of the neo-evangelicals, "prior to the advent of modern biblical criticism and the theory of evolution, all Westerners shared a Christian Weltanschauung -- an unqualified respect for biblical authority, even if corrupted in some regions by Catholic rule.... The neo-evangelicals were overfond of this word, Weltanschauung, and its English synonyms: worldview, world-and-life view. They intoned it like a ghostly incantation whenever they wrote of the decline of Christendom, the decoupling of faith and reason, and the needful pinprick of the gospel in every corner of thought and action."
"From the neo-evangelical point of view, if Christian civilization was to survive the twentieth century, then biblical inerrancy and a reenergized Christian Weltenschauung must form its bedrock. The neo-evangelicals championed other theological principles too, but they recognized that conservative Protestants might reasonably disagree on details of doctrine. The NAE had no business taking a firm stand on predestination or exactly when Christ was due to return. Biblical inerrancy and the totality of the Christian world-and-life view, on the other hand, were different. These were not really doctrines at all, but facts: facts that made sense in an age when everyone from Nazis and communists to Catholic theologians and U.S. Foreign Service officers were talking about worldviews and presuppositions."
"The trouble was that neo-evangelicals presumed an evangelical solidarity that did not exist. The call for cooperation that began with the NAE would expose discord and ambivalence -- not least because, as it turned out, the neo-evangelicals' instinctive response to debate was to turn a deaf ear and close ranks. They differed from their fundamentalist forefathers only in the degree of their separatist impulse." Many evangelical denominations declined to join; the Nazarenes didn't join until the 1980s, and the Southern Baptist Convention, not only the largest Protestant denomination in the country but several times the size of the entire NAE, stayed out of it entirely, as did most Restorationist churches and denominations -- the churches that had grown out of the Second Great Awakening.
"Neo-evangelicals assumed that the battles against modernists in the early decades of the twentieth century had left all evangelicals with the same experience and collective memory. Nothing could be further from the truth. Restorationists fought over the use of musical instruments and worship and the degree of bureaucratic organization permissible for a 'New Testament Church.' The Nazarenes and Mennonites argued about 'worldliness' and abandonment of older customs and styles of dress."
Southern Baptists, meanwhile, rejected the NAE because both evangelicalism and fundamentalism seemed like Yankee phenomena to them, even though the Southern Baptists had fought the same fight against modernism, and shared many of the same anxieties. However, while the Princeton fundamentalists had been concerned with biblical criticism, the Southern Baptists of the same era had been occupied with an internal battle between conservatives who supported the Convention's authority and moderates and Landmarkers who fought for the autonomy of individual congregations -- a battle informed by concerns over how those autonomous congregations would then interpret the Bible, in light of prevailing modernist trends, but nevertheless not solely concerned with interpretation.
The Mennonites are part of the Anabaptist tradition. Writing in 1955, young Mennonite scholar John Howard Yoder -- after corresponding with Carl Henry of the NAE -- articulated some of the problems with neo-evangelicals, from the point of view of Mennonites and like-minded evangelicals: "Yoder urged Henry to relinquish his obsession with doctrinal details and philosophical rationalism. The Fundamentals pamphlet series 'was a time-bound polemic strategy' that addressed issues pertinent to the early twentieth century, but could not meet the challenges of the 1950s. 'For instance, they included nothing about social ethics, nothing about what Christian unity is and is not, and further, the polemic strategy then chosen served better to build a barrier than to speak across the gap.'"
"Only a small minority of conservative American Protestants shared the neo-evangelicals' rationalist, Reformed heritage. Most churches continued to emphasize other themes -- such as personal holiness, internal transformation, or gifts of the Holy Spirit -- over intellectual assent to philosophical claims about the nature of God."
The NAE's striving for evangelical solidarity also suffered due to some of the alliances made by some of its more prominent members. Billy Graham, one of the most famous crusaders of the 20th century, came under fire from some conservative Protestants for allying himself both with Catholics and with liberal Protestants in the name of revivalism (and, arguably, in the service of promoting the Billy Graham brand).
Graham was a trustee at Fuller Theological Seminary, which is at the heart of one of the phenomena I mentioned earlier, the attrition of liberal and moderate evangelicals. Fuller was founded in 1947 by radio evangelist Charles Fuller and NAE co-founder Harold Ockenga. "In the late 1950s, a number of Fuller professors -- including the founder's son, Dan Fuller -- concluded that they could not abide by the seminary's statement of faith on the point of strict inerrancy. They had come to believe that while the Bible remained an 'infallible' guide on matters of doctrine, worship, and Christian life, it was not accurate in every scientific and historical fact. By 1961, the atmosphere at Fuller was poisonous."
One of the moderate board members pointed out the "complexity of the inerrancy debate, the 'very real problem of arriving at a precise meaning of the word inerrancy ... It would be literally impossible for you or anyone else who has a good knowledge of the Bible to sign our doctrinal statement without at least some degree of reservation. I -- along with others -- believe the statement should be carefully reviewed by our faculty in much prayer and in the Holy Spirit ... the Fuller faculty and board compose the only group I know of in evangelical circles who are honest enough to face this matter openly."
Consider the source here. This isn't a modernist. This isn't a secularist. This is a professor at an evangelical seminary founded by the neo-evangelicals, literally the intellectual center of the neo-evangelicals, the home of Billy Graham, arguing not to throw out evangelicalism but simply that inerrancy is an indefensible doctrine, and that the faculty should pray together and come to a decision about revising the statement of faith required of professors and students. This is not a liberal revolt.
But it was treated as one. The moderates "won," insofar as Fuller to this day admits both conservative and liberal Protestant students, and the most conservative faculty resigned over the inerrancy issue. But they made martyrs of themselves in resigning, and included many of the best-known names among the faculty, the heroes of the scene. Inerrancy was the hill they were willing to die on.
Billy Graham launched Christianity Today in 1956, one of many evangelical magazines in an age when most American households still subscribed to and read multiple magazines. The title is a deliberate counterpoint to The Christian Century, the most popular mainline Protestant magazine of the day. "The neo-evangelicals behind Christianity Today did not propose to modernize old-time religion. On the contrary, they were proud defenders of fundamentalism." But they still sought to engage with mainstream America more than the less worldly, separatist, less "neo" evangelicals -- what Worthen calls the unreconstructed evangelicals.
Christianity Today is the spiritual sibling, so to speak, of The National Review, founded the previous year by William F. Buckley Jr. "Nonevangelical conservatives felt just as embattled as the editors at Christianity Today. The editors of the National Review founded their magazine out of a similar desire to rally their cause in a hostile marketplace and overcome liberals' caricatures of the 'Neanderthal Right,' as Buckley put it."
"Midcentury American conservatism featured -- on a grander, more tumultuous scale -- the same insecurity and discord that the neo-evangelicals perceived in their conservative Protestant world. Its various factions formed a dysfunctional family, clamoring with clashing beliefs and pet obsessions, whose members spent as much time squabbling among themselves as they did lobbying for right-to-work laws or denouncing progressive rulings by the Supreme Court. They felt both marginalized in the corridors of power and exhilarated by their increasingly well-funded drive to take back America. Buckley, however, was a master coalition builder. He managed to keep secular libertarians and Catholic traditionalists on the same masthead and maintain a healthy distance from the John Birch Society and other radioactive characters in the movement. This uneasy alliance was the key to his ability to lead an intellectual resurgence that eventually penetrated Washington ... [In 1961], one journalist conducted a survey of the shifting climate on college campuses and gave Buckley 's organizations the lion's share of the credit for the burgeoning 'revolt not only against socialist welfare statism in government, but also against indoctrination by leftist professors ... The conservative student revolt is a campus phenomenon from Stanford and Berkeley on the West coast to the Ivy League, from the University of Washington to the University of Miami."
Christianity Today was dependent on the financial support of oil exec John Howard Pew, who skewed more conservative than some of the writers -- Pew had also provided funding for libertarian journals and political groups, the Liberty League (an anti-New Deal group), and the John Birch Society. In part because of Pew's support -- which kept the magazine afloat in the first decade, when it ran at a deficit of several hundred thousand dollars a year -- Christianity Today hewed close to the National Review's political conservatism, "toe[ing] the conservative line on every significant political and theological issue from foreign policy and civil rights to evolution and the ecumenical movement." (Pew called one of the CT founders, Carl Henry, a "socialist" for believing that evangelicals should be getting more involved in activism like the civil rights movement rather than doubling down on their opposition to it; their clashes contributed to Henry's forced resignation in 1968.)
Political conservatism and conservative Protestant theology seem synonymous now, inevitably linked hand in hand, but it need not be so, nor was it always so. For one thing, conservative Protestants -- evangelicals especially -- had often called for keeping churches out of politics, and explicitly contrasted themselves with Catholics and the Social Gospel movement among liberal and moderate Protestants when doing so. For another, conservative Protestants have also supported liberal political positions in the past -- some of them in alliances with the Social Gospel on some issues, for instance. The history of the politicization of the abortion issue is by now well-publicized; conservative evangelicals were largely uninterested in it until relatively recently in their history. Christianity Today's political positioning was part of the process that laid the groundwork for this politicized and politically conservative evangelicalism.
That said, it's interesting what form political conservatism took in those early days: after Henry resigned -- in 1968, remember -- Pew's further financial contributions were "contingent on the promise that Henry's successor, Harold Lindsell, would continue condemning 'the ecumenical church's political involvement.'" It's hard to imagine today's analogues to Pew having reservations about the church being politically involved. The assumption was that "political involvement" meant, at least to some degree, support for liberal causes, rather than the conservative activism that soon became common.
"These noisy internal quarrels concealed one remarkable silence: the dearth of conversation with conservatives outside the neo-evangelical bubble. In the magazine's early years, when nearly every issue featured essays lambasting communism, urging a retrenchment of conservative Christian values, and otherwise echoing many themes favored by William F Buckley and other writers in the rash of new conservative journals, the editors of Christianity Today gave little sign that they considered themselves comrades in arms with conservative Catholics, Jews, and repentant ex-socialists."
Fundamentalism and obsession with the end times had played roles in shifting evangelicalism away from social activism. "The belief that this world will fall into greater misery and chaos before Christ's Second Coming dampened evangelical enthusiasm for collaborating with secular authorities to reform society ... After all, social decay was a sign that Christ's return was drawing near. In the context of the fundamentalist-modernist controversy, large-scale social activism was contaminated by association with the enemy: those heterodox liberals who did not merely live out the gospel through good deeds, but seemed to believe that good deeds might replace the gospel altogether. In a massive shift that evangelical sociologist David Moberg later called 'The Great Reversal,' many conservative Protestants began focusing more energy on evangelism, personal moral crusades (pressing for Prohibition rather than fighting poverty), and denouncing modernism, all at the expense of social reform."
Both CT and the NR were intellectual outlets. However ... "In the 1950s, many conservative intellectuals were in the business of historical rediscovery, reconstructing an intellectual genealogy to support their critique of liberal theories of human progress and individual autonomy. Against modern secular liberalism, they asserted the sacralized and sin-stained worldview of medieval Christendom, the natural law of the ancient Greeks, the civic decrees of Roman philosophers."
"The editors at CT pondered much of the same history in these years ... yet the commitment to biblical inerrancy had warped neo-evangelicals' understanding of the past. Although no godly revivalist's teachings stood on par with scripture, the basic principle of inerrancy -- that historical circumstance does not influence human authorship or interpretation, when that human writes or thinks by God's will -- seeped into the way they interpreted history outside the Bible as well. They were less interested in understanding ancient thinkers in their own historical context than in thinking themselves to a succession of proto-fundamentalist torchbearers, Christians who 'believed in the Bible' as the neo-evangelicals themselves thought scripture should be read. Their ahistorical view of scripture, their overriding desire to defend their doctrine of inerrancy as ancient, immutable, and God-given, made sensitive scholarship impossible. In the hands of CT's editors history became a legal brief for inerrancy, a purity test for the present."
The intellectual commitment of CT -- and the failings of that commitment -- would have extended to Crusade University, "the first evangelical research university, an omnibus institution with undergraduate and graduate programs, churning out original scholarship in the Lord's name," the brainchild of Graham, Henry, and other CT founders. They were unable, though, either to secure funding or -- particularly given the neo-evangelicals' opposition to the separatism of unreconstructed fundamentalists represented by Bob Jones University (founded in the 1920s) -- to successfully address the balance of evangelical orthodoxy and mainstream academia.
Crusade University's failure to manifest did not keep evangelicals from going to school -- and grad school -- in large numbers, however. "Several neo-evangelical scholars earned PhDs and ThDs at Harvard in the 1940s and 1950s ... Fuller graduates were winning Fulbrights ... as Mark Noll has noted, the problem was not so much evangelicals' failure to excel at secular academic institutions, but rather their ability to compartmentalize their faith from new learning. They tended to position themselves in fields where no one would corner them too aggressively on their views about, for example, how one ought to interpret the creation narrative in Genesis."
That said, many evangelicals viewed higher education with suspicion, even as they availed themselves of it. Bob Jones and other "separatist" universities existed in part because of this suspicion, and evangelical colleges dragged their heels in seeking accreditation -- and were often criticized for doing so -- usually not seeking it until the late 1950s, about three to four decades after mainline Protestant schools and twenty years after Catholic ones. "Fundamentalists and conservative evangelicals understood the history of American higher education as a story of decline from holiness to heterodoxy. Their own institutions were oases where the Bible still reigned. Moreover, early Bible college leaders were unimpressed by a self-policing, credentialed elite. They exalted the common sense of the layman whose faith was unmuddled by the mystifications of the so-called experts."
Sound familiar?
I'm going to stop there because that's where I stopped marking pages -- I tended to mark fewer in the book's coverage of the 70s and beyond because, well, the part of history that I lived through, I'm more likely to remember.
Saturday, November 12, 2016
cuisine and empire
I think my first blog blog, as opposed to LiveJournal or whatever other platforms I may have forgotten, was a cooking blog. I have cooked for a fair bit. About thirty years, I suppose, but only on the regular for about ... well, twenty-five years, then. From the latter part of high school on, I've cooked the overwhelming majority of the meals I've eaten, and sometime long after people said I was good at it, I started actually being good at it. The Food Network came and went (I realize it's still broadcasting, but come on). I had a few blogs. At one point or another I've made most things from scratch that don't require a still or a grain mill.
I think about cooking a lot, is the thing. I wouldn't do it if it weren't something that intrigued me. Or I suppose I would do, but I'd do it differently, you know -- I wouldn't cook the way that I do, which is a whole thing we don't need to get into here. I think about flavor, I think about technique, I think about context, and because I'm a historian, I think about history.
There aren't a whole lot of book-length histories of cuisine out there. It's a slightly more popular topic for microhistory -- you can find a number of different histories of coffeehouses, tea, pizza -- and there has been a small but promising uptick (maybe too small, maybe I shouldn't spook it) in books about immigrant cuisines in the United States in the last decade or so, which is very very cool. But there are only a handful of broad histories of cuisine overall, of which Reay Tannahill's is probably still the canonical one.
Rachel Laudan's Cuisine and Empire is a valuable addition to the field. It differs from Tannahill quite deliberately in that Tannahill (writing in the 70s, perhaps relevantly) primarily organizes her work by country (or empire), while Laudan emphasizes the contacts between cultures.
This is a huge book, and by its nature not something that can be summarized, so there will be a lot of detail that I skip over here because I just didn't think to dogear it.
It's been some decades since Tannahill's book, and in that time there has been considerable activity on the matter of cooking in prehistory. Most famously, Richard Wrangham has proposed that cooking actually predates us -- us meaning H. sapiens, anyway -- and that Homo erectus first cooked its food nearly two million years ago. Further, Wrangham argues that, as the subtitle of his book Catching Fire would have it, Cooking Made Us Human -- that it was our discovery of cooking that drove human evolution on a path divergent from the other primates, one that led not only to less time foraging but less time eating. Chimpanzees need to spend five hours a day chewing their food in order to get enough energy to get through that day. Cooking not only softens food (and, in the case of many ingredients, increases the bioavailability of nutrients, and of course neutralizes many toxins); it's Wrangham's view that an early development of cooking contributed to numerous evolutionary advantages, including a more efficient digestive tract. This is not a universally held view, mainly because there is insufficient archaeological evidence to compel it, but it is more widely accepted that our various masteries of eating -- cooking, hunting, and much much later agriculture -- contributed to brain growth.
Our modern expectation to eat "fresh" and "natural" foods is possible only because we eat foods out of season -- radically out of season, in senses incomprehensible to the past: we not only rapidly transport food across the world from where it is grown or raised to where it is eaten, we not only refrigerate food to extend its freshness, we alter the natural life cycles of animals in order to have meat and dairy on demand, and we've spent thousands of years breeding both animals and plants for more desirable food traits. Plant-based foods take longer to spoil and are more resistant to pests; meat is more abundant; dairy is sweeter.
Humankind is possible only because of unfresh food, of course, preserved food, smoked, dried, salted, fermented. Grain that's been in the granary for months. Dried out meat you have to boil for a few hours before it's soft enough that you can eat it. Different peoples faced different challenges of climate, and had access to different food resources -- those differences, ultimately, account for the earliest cuisines, which is to say, sets of cooking methods, techniques, habits, and technologies characteristic of a given region or culture.
Laudan classifies cooking operations into four groups: "changing temperature (heating and cooling); encouraging biochemical activity (fermenting); changing chemical characteristics by treating with water, acids, and alkalis (leaching and marinating, for instance); and changing the size and shape of the raw materials using mechanical force (cutting, grinding, pounding, and grating, for example)." It's an important reminder in part because until we get to the fairly recent past, cooks had to do much more of this than they do now; most purchased ingredients are already heavily processed, though we don't think of them that way. Even at farmer's markets, for instance, many of the vegetables have been washed (even if not as efficiently as supermarket vegetables are), and possibly trimmed. But that's the most minor example compared to the preparation of grains -- which required hours of work for every day's worth of food -- or meat. "Take meat, for example. A carcass has to be skinned before meat can be cut from the bone and then into portions. These may then be eaten, or subjected to heat and then eaten, or frozen and dried or fermented so that they can be eaten at a later date." Food historians generally refer to those preliminary operations as "processing," although moderns tend to think of "processed food" as spray cheese and Tofurkey.
The stories of the earliest cities are the stories of cuisines based primarily on grains and roots -- and really, the Neolithic Revolution, the Agricultural Revolution, might better be called the Grain Revolution, because while it is sometimes simply described as "when people started planting crops," which led to permanent settlements instead of nomadic hunting and gathering, it was mastery of grain that made this possible, and it occurs relatively late in the history of cooking (especially if we accept Wrangham's view) because dealing with grain is so fucking difficult. There's some debate about whether we may have been grain-foragers before we were grain-planters -- I mean, presumably we had to have been, but the debate is about how long that went on -- but in the grand scale of things it doesn't make much difference. Grain is fucking difficult. The seeds are very small and very hard and even once you deal with them, you still need to process them further to eat them. (Keep in mind that even gathering fuel for cooking fires was a lot of work and time.)
As Laudan points out, "Cities, states, and armies appeared only in regions of grain cuisines. When they did, grain cuisine splintered into subcuisines for powerful and poor, town and country, settled populations and nomads. A feast following a sacrifice to the gods was the emblematic meal everywhere, the meal that represented and united the society, as Thanksgiving now does in the United States. It is not clear whether these global parallels reflect widespread contact between societies, the logic of emerging social organization, or a combination of the two."
That last sentence sums up a lot of history and anthropology, incidentally. Don't trust anyone who insists that when you find X and sort-of-X in two places, it must be because contact between the two places transmitted X. That recent study claiming ancient origins for Little Red Riding Hood et al based on phylogenetic analyses? Don't take it at its word.
Anyway, the crazy difficulty of grain (even apart from how much more difficult it is earlier in history at the dawn of the Neolithic Revolution): "Steamed broomcorn millet and foxtail millet, tiny round grains from disparate botanical genera, were the basis of the first cuisine we encounter in the Yellow River Valley in ancient China. There peasants lived in small villages, their dwellings half buried in the ground and roofed with thick thatch to protect against the freezing winters, and the interiors crammed with grain and preserved vegetables. Small patches of millet dotted the valley's fertile yellow soil, which was brought by floods and winds from the steppe. To prepare the millet, peasants lifted heavy pestles high above mortars and let them fall repeatedly until the inedible outer hulls were cracked. Beginning around the first century BCE, they used foot-trodden pestles to pound grain in a mortar buried in the ground, a less demanding method. When all the hulls were cracked, they tossed the grains in a basket, winnowing away the lighter hulls. Then they steamed the grains until they were light and fluffy in three-legged pots set over small fires, a method that conserved scarce fuel. Before dipping their fingers into the communal bowl, they offered a little to the gods and the ancestors. They accompanied the millet with bites of pickled vegetables, cabbage of various kinds, mallow, water-shield (an aquatic plant), or bamboo shoots, seasoned and preserved with costly salt. Sometimes, when they had trapped small wild animals, they had a bit of boiled or steamed meat, seasoned with Chinese chives, Chinese dates, or sour apricots."
By this point, other grains had been introduced to the region from the Fertile Crescent -- wheat and barley, collectively referred to as mai -- but were unpalatably tough and chewy when prepared like millet, and were usually eaten only in lean times, like the months before the new harvest, when last year's millet stock began to run low.
A more elaborate sacrificial feast:
"Servants set out mats of aromatic reeds, small stools to support the diners' elbows, and dishes of bronze, wood, bamboo, and pottery. Meat on the bone and grain went on the left of each setting, sliced meat, drinks, and syrups on the right, and around them minced and roast meats, onions, and drinks were arranged in a symmetrical pattern. After making an offering to the ancestors, the king and the nobles knelt to eat, each man's seniority and valor in battle determining where he knelt and what pieces of meat he was entitled to. The warriors took morsels of the drier dishes with their fingers: meats marinated in vinegar, fried, and served over millet or rice; jerky spiced with brown pepper; and jerky seasoned with ginger, cinnamon, and salt. They scooped up keng, a stew soured with vinegar or sour apricots (Prunus mume, the "plums" of plum sauce). They nibbled on small cubes of raw beef, cured in chiu, and served with pickles, vinegar, or the juice of sour apricots; on meatballs of rice and pork, mutton, or beef; and on the much-sought-after roasted, fat-wrapped dog's liver."
But up above, we mentioned roots too, not just grains. The tropical monsoon region begins a few hundred miles south of the Yellow River Valley, and included both a root cuisine and a rice cuisine, about which much less is known than the Yellow River Valley cuisine. "To begin with the root cuisine, taro, yam, and the cooking banana (the starchy, high-yielding fruit of Musa spp., as well as its root) were boiled or steamed, and most likely pounded to pastes that could be scooped up with the fingers. People on the oceanic side of New Guinea loaded outriggers with the basics of this culinary package and sailed east into the Pacific. To sustain themselves at sea, they stowed lightweight, long-lasting dried or fermented fish, breadfruit, and bananas for food. They filled gourds and bamboo sections with water, and drank the water inside coconuts. They packed slips, cuttings, young plants, and taro and yams in moist moss, then wrapped them in a covering such as leaves or bark cloth, tucked them into palm-leaf casings, and hung them out of reach of salt spray. Breeding pairs of pigs, chickens, and dogs, which, if worst came to worst, could be eaten on the way, were carried on board. Between 1400 and 900 BCE, they settled many of the South Pacific Islands."
Another sacrificial feast, in Mesopotamia (followed by some general detail):
"A sacrificial feast included sauces, sweets, and appetizers, hallmarks of high cuisine. Fried grasshoppers or locusts made tasty appetizers. Pickles and condiments concocted from seeds, sesame oil, vegetables, fruits, garlic, turnip, onion, nuts, and olives titillated the palate. Sauces were prepared from an onion-and-garlic flavoring base combined with a rich fatty broth thickened with breadcrumbs, the ancestors of sauces still served in the Middle East and even of present-day English bread sauce. Pomegranates, grapes, dates, and confections of milk, cheese, honey, and pistachios provided a sweet touch.
"Professional cooks labored in kitchens that were as large as three thousand square feet, much of the space devoted to making grain-based dishes, bread, and beer. From the coarse groats and fine flour provided by the grinders -- perhaps prisoners and convicts -- cooks prepared porridge, flatbreads, and slightly leavened breads, the latter in three hundred named varieties. Dough was shaped into the form of hearts, hands, and women's breasts, seasoned with spices, and filled with fruit, with the texture often softened by oil, milk, ale, or sweeteners. A flour-oil pastry was enlivened with dates, nuts, or spices such as cumin or coriander. Stuffed pastries were pressed into an oiled pottery mold with a design on the bottom before baking. Flatbreads were baked on the inside walls of large ceramic pots. There is some evidence that bulgur, an easy to cook food, was made by drying parboiled wheat.
"To feed the cities, barley was shipped along rivers and canals. Onions of various kinds, garlic, herbs such as rue, and fruits such as apples, pears, figs, pomegranates, and grapes came from the gardens of the wealthy. The animals were driven to the city, where they were slaughtered, the lambs and the kids going to the temples and noble houses, the male sheep and goats to the officials, royalty, and nobles, the tough ox and ewe meat to the army, and the carcasses of donkeys to the dogs, perhaps royal hunting dogs. Saltwater fish, turtles, and shellfish came from the salt marshes and the Persian Gulf. Dried fish, probably a specialized and regulated industry, came from the Persian Gulf and from as far away as Mohenjo-Daro on the Indus and the Arabian Sea. Salt, excavated from the mountains or evaporated from briny springs and brackish river water, was shipped to distribution centers and packed onto asses, probably in standard-sized, solid-footed goblets.
"Barley was wealth. It paid for the meat and cheeses. It paid for the lapis lazuli and carnelian dishes for the sacrifice, the gold and silver for jewelry, the boatloads of copper that came down the Euphrates or from Dilmun on the Persian Gulf, the metals from Oman and the Sinai, the granite and marble from Turkey and Persia, and the lumber from Lebanon used to build the temples.
"Nomads around the fringes of the irrigated and cultivated areas included the Hebrews, whose daily fare large comprised barley pottages flavored with greens and herbs and flatbreads of barley and wheat, which they farmed in oases during the growing season or acquired by bartering their barren ewes and young rams. They made yogurt and fresh cheese from the milk of their flocks, which they ate accompanied by olive or sesame oil, honey, and grape must and date sweeteners (both of which were also called honey). To conserve their flocks, the source of their wealth, they enjoyed meat only on special occasions following the sacrifice of the 'fruit of the ground' (barley and wheat) and the 'firstlings of the flock' (lambs and kids) to Jehovah."
On various uses of grain:
"The ancient Romans built their empire on barley porridge. The Chinese enjoy rice porridge, the Indians rice and lentil porridge. Polenta (millet and later maize porridge) has sustained generations of Italian peasants. Similarly, grits and mushes were staples of the American colonies. Turkish families commemorate Noah's rescue from the flood with a porridge of mixed grains, fruit, and nuts. Left to sour or ferment slightly, boiled grain dishes became tangy, a flavor much appreciated in eastern Europe, for example.
"Bread -- baked flour and water paste -- was much more portable, but it needed more fuel. Early bread was nothing like our puffy square loaf. Because so much of the bran had to be sifted out to make white flour, white bread was reserved for the very rich until the nineteenth century. Most bread was dark and flat, made of one or more of the hard grains, such as barley, wheat, oats, and later rye, often with some mixture of beans and the starchier nuts, such as chestnuts or acorns.
"To run a city-state or provision an army, rulers had to make sure that grains were extracted from those who worked the land, then transported to cities and put in storage. Sometimes they demanded grain as tribute; sometimes they operated what were in effect agribusinesses farmed by slaves, serfs, or other barely free labor to produce grain; and later they exacted taxes to be paid in grain. Grains, more important, if less glamorous, than the precious metals, exotic wild animals, and beautiful slave girls that they also collected, were processed and redistributed to the ruler's household and bodyguard as pay in kind. Kings, emperors, landlords, and the great religious houses continued to collect grain long after money was invented."
More on the sheer labor of cooking:
"Before beginning to cook, women had to gather scraps of brush, seaweed, dung, furze -- anything that would burn. Steaming and boiling, which use the least fuel, were the commonest ways of cooking. A hot meal was often prepared only once a day, other meals being cold. Water for cooking, drinking, and washing, enough for one to five gallons a day per person (contemporary Americans use about seventy-two gallons a day), had to be carried from a river or well; three gallons weigh about twenty-four pounds. Salt was a luxury, reserved for making salty preserves that accompanied salt-free porridge or bread."
On blood, beliefs about which inform ancient meat cuisines, and sacrifice:
"Blood congealed into flesh, according to the Chinese, the Hebrews, and the Greeks. It was what food finally turned into in animals, said Aristotle. Consequently few societies were neutral about blood as food: some valued it highly, others prohibited it. In the first group were nomads who harvested blood from their animals, Christians who drained the blood of carcasses and used it to make sausages or thicken sauces, and the Chinese. Even today many Hong Kong Chinese mothers feed their children blood soup to sharpen their minds before examination. In the second group were Jews and Muslims, who slaughtered animals so as to drain all blood from the body.
"The sacrifice was followed by the sacrificial feast -- humans eating the gods' leftovers, which were charged with divine power. This might mean eating the flesh of sacrificed humans, a practice motivated not by hunger but by the logic of sharing the gods' leftovers. At least some northern Europeans ate the brains of the sacrificed in the third millennium BCE. The Cocoma people of Brazil, when admonished by the Jesuits for eating their dead and drinking an alcohol laced with ground bones, reportedly said that it 'was better to be inside a friend than to be swallowed by the cold earth.' The Aztecs ate slivers of flesh from those who had been sacrificed on the pyramids. More commonly, however, the feast featured roast meat from sacrificed animals."
Eating human flesh, whether or not in the context of sacrifice, is one of those topics that's subject to a lot of controversy and misinformation. Depictions of the Aztecs as bloodthirsty cannibals, for instance, were obviously pulpy nonsense cooked up much later, but a rejection of that depiction led to a widespread rejection of the notion of any Aztec cannibalism, which is also -- from what I understand, though Mesoamerican history is not at all my area -- false. Cannibalism in times of desperation is obviously widespread in the sense that you find it in any culture, in any time or part of the world, where there is such desperation and famine. Sacrificial or ritual cannibalism is sort of a different thing, though of course some historians and anthropologists theorize that cultures that resorted to desperation-induced cannibalism frequently enough simply developed rituals around it.
Which brings us to theories of other food rituals and food rules, the best known of which in the Western world are the Jewish dietary restrictions:
"Jewish culinary rules were laid out in Leviticus and other books of the Old Testament. Blood, animals with cloven hooves unless they chewed their cud, pork, animals with both fins and scales who lived in the water, and (echoing Persian practice) insects were all forbidden as foods. So was cooking meat in milk and dining with non-Jews. Temple priests followed rules of purification before sacrifice, slaughtered animals so that the lifeblood drained out, and refrained from offering impure fermented (corrupted) foods.
"In the mid-twentieth century, scholars offered opposing interpretations of Jewish food rules, particularly the ban on pork. Marvin Harris argued that they were health measures to prevent trichinosis." The Harris theory was still widely disseminated when I was in grad school, incidentally, and vaguely familiar to a lot of people outside the field, even though it isn't very good (or perhaps because it doesn't require much information). "Mary Douglas and Jean Soler contended that they were designed to create a distinct Jewish identity. The latter interpretation squares better with the simultaneous proliferation of culinary rules in the Persian Empire and the Indian states. With limited culinary resources, identity is most easily established by banning certain foodstuffs, cooking methods, and ways of dining. Pigs, being difficult to herd, were not popular with peoples of nomadic origin, so the force of the rule probably only became fully felt centuries later when Jews became a minority in pork-eating Roman or Christian lands."
Speaking of Rome:
"Every morning and evening, Roman infantrymen prepared meals like those they would have eaten at home on the farm. They boiled wheat to make wheat porridge or wheat pottage (wheat cooked with dried peas, beans, or lentils, a bit of oil, salt, and a little salt pork), which they dipped into with wooden spoons. Or they mixed whole-wheat flour, water, and salt and baked coarse whole-wheat bread, probably in the ashes of the campfire, to eat with a bit of cheese. In the morning, these foot soldiers ate standing up like animals outside their goatskin tents. In the evening, they ate seated on the ground in the tents like children or slaves. They drank water cut with wine or vinegar. Sacrifices on festival days, before they went into battle, and to celebrate victory, added a treat of boiled or roast beef. On the move or near the enemy, biscuit -- twice cooked bread that lasted a long time -- made an instant meal."
Making that porridge or potage required soldiers to grind grain, for which pack mules carried grindstones. "One of the soldiers assembled the grindstone, placing first a skin or cloth on the ground to catch the flour, then the squat lower grooved cylindrical stone, then the top stone, which rotated over the lower one. He squatted like a woman or slave over the grindstone. With one hand he rotated the upper stone using a peg near the circumference as a handle; with the other he poured handfuls of grain into a hole in the upper stone. The grain dribbled onto the lower stone and was sheared by the movement of the upper. The flour moved toward the circumference along grooves cut in the lower stone. He could grind enough meal for an eight-man squad in about an hour and a half with this rotary grinder, compared to at least four or five hours had he used a simple grindstone.
"Adopting the rotary grindstone involved a series of tradeoffs. It ground faster. The weight of the upper stone, not the weight of the grinder, did the shearing, making the work less exhausting. On the other hand, the rotary grindstone was heavier, more expensive, and more difficult to make than a simple grindstone. Nor could it produce the fine gradations of flour that the simple grindstone could deliver. ... If every squad of eight men required a mill and if at its height, the army comprised half a million men, then some sixty thousand grindstones were lugged over the Roman roads. A millennium and a half was to pass before any other European army was as well fed."
Roman feasts during the Empire:
"The dinner included appetizers, sauced dishes, and desserts, all spurned by republicans. For appetizers, diners might have lettuce (perhaps served with an oil and vinegar dressing), sliced leeks (boiled, sliced in rounds, and dressed with oil, garum [like fish sauce], and wine), tuna garnished with eggs on rue leaves, eggs baked in the embers, fresh cheese with herbs, and olives with honeyed wine.
"For the main course, slaves brought in dishes such as red mullet roasted and served with a pine nut sauce; mussels cooked with wine, garum, and herbs; sow's udder, boiled until soft and then grilled and served with sauce; chicken with a stuffing of ground pork, boiled wheat, herbs, and eggs; and crane with turnips in an herb-flavored vinegar sauce. Exotic fare, such as a pea dish, a chicken stew, and baked lamb with a sweet-and-sour sauce, attributed to Persia, added a cosmopolitan touch.
"Typically, sauces were made by pulverizing hard spices, usually pepper or cumin, but also anise, caraway, celery seed, cinnamon, coriander, cardamom, cassia, dill, mustard, poppy, and sesame, in a mortar. Nuts, such as almonds, filberts, and pine nuts, or fruits, such as dates, raisins, and plums, were added and the mass was worked to a paste. To this mixture, fresh herbs such as basil, bay, capers, garlic, fennel, ginger, juniper, lovage, mint, onion, parsley, rosemary, rue, saffron, savory, shallot, thyme, or turmeric were added, followed by garum and perhaps wine, must, honey, olive oil, or milk. The mixture was warmed to blend the tastes and sometimes thickened with wheat starch, eggs, rice, or crumbled pastry."
Outside of the Roman Empire, grain processing was as much as four times more labor-intensive, and in many parts of the world, unleavened bread (and steamed and boiled doughs in the form of pasta and dumplings, as in China) continued to be the norm. Eventually some cultures caught up to the Romans' efficiency, but beyond that, "There was to be little change in grain processing until the Industrial Revolution, and little change in the final cooking of grains until the twentieth century."
"To supplement the staple grain, oil seeds and olives were crushed in a variety of mills and mortars and pressed in a variety of presses. Sweeteners continued to be produced by many different methods -- sprouting grains (malt sugar in China), boiling down sap (palm sugar in India), boiling down fruit juices (grape and other fruit juices in the Middle East), and taking honeycombs from hives (honey in the Roman Empire). Alcoholic and lactic fermentations in the western half of Eurasia and mold ferments in the eastern half were used to make staple dishes (raised bread) and alcoholic drinks (wine, beer, and chiu) as well as to preserve foods (cheese and sausage in the Roman Empire, milk in the Middle East) and create condiments (fermented beans in China). Autolysis (self-digestion) produced garum in the Mediterranean and probably fish sauce in Southeast Asia."
An important point I brought up earlier about food processing comes up again as we move through the next millennium and a half:
"Cooking was a form of alchemy, the most sophisticated understanding of changes in matter then available. Cooking and alchemy used the same tools and equipment. Both sought to find the real nature or essence of a natural substance by applying the purifying power of fire. Just as a crude ore had to be refined in the fire to release the pure shining metal, so raw wheat or sugarcane had to be similarly refined to extract the pure white flour (originally "flower") or gleaming sugar. Culinary processes such as sugar refining and bread baking were thus potent metaphors for spiritual progress. Unlike our contemporary understanding of natural food as having received only minimal processing, this earlier understanding was that processing and cooking were essential to reveal what was natural."
This all reflects one of the major principles of ancient culinary philosophy: the theory of the culinary cosmos, which led to the practice of eating only cooked food (even most fruits were not often eaten uncooked in Europe, for instance), and, for those who could afford the options, of eating food that "balances the temperament," an idea that trickles down today in a lot of horseshit diets.
This balancing the temperament stuff was grounded in the idea of the "humors," or maybe it's better to think of them as both coming from the same worldview, and it informed the view of what we would now term "healthy eating." "In preparing food for their noble employers, cooks were as aware of the need to balance the humors as we are today of, say, the need to have all food groups represented. Root vegetables such as turnips were by nature earthy (dry and cold) and thus better left to peasants. Chard, onions, and fish were cold and wet, so that frying was appropriate. Mushrooms were so cold and wet that they were best avoided entirely. Melons and other fresh fruit were not much better, being very moist and thus thought likely to putrefy in the stomach. Grapes were best served dried as raisins, quinces were dried and cooked with extra sugar -- warm in humoral theory -- to make quince paste. Red wine tended to be cold and dry, so it was best served warm with added sugar and spices."
Another major principle was the hierarchical principle, which broadly called for eating according to your station in life -- a high cuisine for the court, a humble cuisine for the poor. Roughly in this time period, that hierarchy was extended in many parts of the world to include a higher cuisine for holy men and intellectuals than for the unenlightened, rather than being based only on overt political power.
The third ancient culinary principle was that of sacrifice, which had largely been phased out in the Axial Age and replaced with new religious rules for eating: "these rules identified preferred ingredients and dishes, often ones believed to enhance contemplation, such as meat substitutes (fish, tofu, gluten), sweetened soft fruit and nut drinks, or stimulating drinks such as tea, coffee, and chocolate. They specified how to process and cook foods, including guidelines for slaughtering, and laid down rules about how cooks should purify themselves, whether fermented foods were acceptable, and which foods could and could not be combined. A third cluster of rules specified mealtimes, days of fasting and feasting, and who could dine with whom.
"The rules, stricter for religious elites than for ordinary believers, were formulated and reformulated for centuries because the founders of the religions, although they relied on culinary metaphors to explain beliefs and doctrines, rarely laid down clear and consistent regulations for cooking and eating. Christians, for example, were not required to fast until the fourth or fifth century. Then they were instructed to fast on about half the days of the year. Today, in the Roman Catholic Church, fasting has been reduced to a minimum."
"Even more important in the dissemination of the new cuisines were monasteries, shorthand for permanent religious houses. Like courts, they were places where all ranks of society met, from clerics to their servants and slaves. Like court kitchens, monastery kitchens were huge and complex, turning out different meals for different ranks: noble and aristocratic visitors; passing merchants, monks, and nuns; the poor and indigent; the sick; and students studying in the monastery school. ... Like courts, they invested in food-processing equipment like gristmills, oil presses, and sugar mills, processing and adding value to foodstuffs. These they sold or offered as gifts, thereby creating loyalty. Like courts, monasteries were part of networks that crossed state boundaries, in this case by the movement of religious orders and missionaries rather than marriage."
"As theocratic cuisines spread, so did their preferred raw materials: plants and sometimes animals. Particularly important were the transfers of southeastern and Chinese plants to Buddhist India, Indian plants to Buddhist China, Chinese plants to Korea and Japan, Indian plants to Islamic lands, and European plants to the Americas through the Columbian Exchange. Royal and monastic gardens and large estates transplanted, ennobled, and grew sugarcane, rice, grapevines, tea, coffee, and other crops essential to the new cuisines."
Here we come to one of my favorite topics in culinary history:
"Whereas culinary diffusion prior to world religions had primarily meant emulating or rejecting neighboring high cuisines, with world religions the relation between successive cuisines became more complex. 'Fusion,' the term so often used, does not do justice to the variety of interactions. One cuisine could be layered over another, as happened with the Spanish conquests in the Americas, the conquerors eating Catholic cuisine, the indigenous retaining their own cuisine. Specific dishes, techniques, plants, and animals might be adopted, as Europeans, for example, adopted distilling, confectionary, and citrus from Islam."
Oh, if I had a nickel for every dipshit going on about how some dish or approach isn't "authentic."
There is no authentic cuisine. All cuisines are in flux and ever have been. Lots of people carry around a sense of normalcy based on a sphere that extends for a couple hundred miles and a couple dozen years, and think that sense of normalcy reflects something real, something other than their memory of food they've experienced. That's not how it works. Italian food didn't suddenly become Italian food when tomatoes arrived on the boot, or when immigrants in the northeast US started making meatballs. And putting tomato sauce on that pasta for the first time, making those first giant meals of spaghetti and meatballs, didn't invalidate those meals either.
Nobody worried about this bullshit when they actually fucking cooked. It's the hobbyhorse of the dilettante.
Meanwhile! In the Mongol Empire!
"Twenty seven soups dominate the ninety-five food recipes [in Hu's Proper and Essential Things]. The centerpiece of Mongol cuisine, these soups could be quite liquid or thickened to become solid. The basic recipe went as follows:
"1: Chop meat on the bone (usually mutton, but also game such as curlew, swan, wolf, snow leopard) into pieces. Boil in a cauldron of water until tender. Strain the broth and cut up the meat.
"2: Boil the broth with a variety of thickeners, vegetables, and tsaoko cardamom.
"3: Add the meat.
"4: Season to taste with salt, coriander, and onions.
"For a traditional Mongol taste, the thickeners might be chickpeas, hulled barley, or barley meal. To give the soup a Persian touch, it was thickened with aromatic rice or chickpeas, seasoned with cinnamon, fenugreek seeds, saffron, turmeric, asafetida, attar of roses, or black pepper, and finished with a touch of wine vinegar. For a Chinese taste, it was thickened with wheat-flour dumplings and glutinous rice powder or rice-flour noodles, and flavorings of ginger, orange peel, soybean sauce, and bean paste. In this way, the soup of the khans could be adjusted to the preferences of the peoples they had conquered."
Authenticity my ass.
This is basically how pizza adapts to local culinary niches today, and of course what McDonald's does internationally.
Now coffee enters the scene, thank God:
"Coffee, like wine, was an aid to union with the divine. Long before the time of the Sufis, coffee beans, the fruit of a bush native to the highland forests of southwestern Ethiopia, had been chewed like a nut or mixed with animal fat to make a portable, satisfying, and stimulating food for warriors." If you haven't seen the way coffee grows, the bean is just the seed, and of course has a softer fruit surrounding it (which is also lightly caffeinated, and is sometimes used now in some coffee-growing regions to make a vaguely hibiscus-like drink). "Coffee plants were naturalized in Yemen perhaps as early as the sixth century BCE when the Abyssinians invaded Arabia. Later, a new way of preparing coffee by toasting the beans, grinding them, and brewing them with hot water was developed, perhaps in Iran. The Arabic word for coffee, qahwah, probably derives from a word meaning to have little appetite and hence to be able to do without. It had been first applied to wine and later to coffee (which suppressed the desire to sleep). Sufi pilgrims, traders, students, and travelers consumed coffee to keep awake during ceremonies and induce a sense of euphoria, spreading its use throughout the Islamic world between the thirteenth and fifteenth centuries."
The Islamic world introduced coffee to the West, as it did so many things, and that's not all! It also introduced stuff to have with coffee.
"Sugar cookery was introduced from Islam in the twelfth century by a physician known as Pseudo-Messue. The English words syrup, sherbet, and candy all have Arabic roots. Medicinal electuaries, pastes of spices and drugs, and comfits, sugar-coated spices, were the distant forerunners of candy. Sugared spices did not break the fast, Thomas Aquinas said, because 'although they are nutritious themselves, sugared spices are nonetheless not eaten with the end in mind of nourishment, but rather for ease in digestion.' It was an important decision, both because it gave medical respectability to sugar and because it foreshadowed later arguments about chocolate."
Arab fruit pastes became Portuguese quince marmelada, later inspiring the citrus marmalades that are more familiar to Americans, and the sweet fried doughs used to celebrate the end of Ramadan inspired similar fried doughs in Catholic traditions, eaten before the Lenten fast: doughnuts, beignets, etc. (The Brits have their pancakes.)
Along with all this came distillation and better booze. Not too shabby.
"In the early fourteenth century, cookbook manuscripts began appearing across Europe.... Rarely were these cookbooks step-by-step manuals, being, rather, testimonials to a ruler's fine cuisine or aide-memoires to professional cooks. With the invention of printing, the number increased again."
Medieval history is not at all my area of expertise, but this broadly fits my understanding of the ... history of professionalization, sort of, the history of procedural rigor, if you will.
The dissemination of cookbooks further contributed to the Westernization of Islamic dishes in Europe, in much the same way that nineteenth and twentieth century cookbooks Americanized immigrant and foreign cuisines:
"Al-sikbaj (meat cooked in a mixture of sweetener and vinegar) was transformed into fried or poached fish (or chicken, rabbit, or pork) in an acid marinade of vinegar or orange (escabeche), perhaps the origin of aspic." Al-sikbaj was a characteristic dish of the Moors who conquered Spain, but has since died out in the Muslim world. "Ruperto de Nola's Libre del coch included thin noodles, bitter oranges, fried fish, escabeche, almond sauces, and almond confections. Martinez Motino's Arte de cocina contained several recipes for meatballs and capirotada, and one for couscous. It also had one for Moorish hen, roast chicken cut into pieces, simmered with bacon, onion, broth, wine, and spices -- which were not named, but probably included pepper, cinnamon, and cloves -- and then enlivened with a final dash of vinegar. The bacon and wine were typically Christian, but the sour-spicy sauce justifies the name."
So here's the other thing about sugar: it used to be in fucking everything. The line between "sweet" and "savory" isn't just a recent thing; it's the defining characteristic of the modern palate. Candies and confectionery used to include not just candied oranges and cherries, but carrots and turnips. Meat dishes in high cuisines were regularly served in sweet sauces -- no, not like at that Chinese place, no, not like barbecue sauce, like really noticeably sweet, not tangy.
Then that changed.
If you have to pick a point where things start to change, it's 1651, when François Pierre de La Varenne published Le Cuisinier François, which was widely translated, and which inspired numerous imitators. The middle of the seventeenth century saw a significant shift in Western tastes characterized by two changes: "the disappearance of spices and sugar from savory dishes [notice how rarely we use 'baking' spices like cinnamon, clove, nutmeg, etc., in savory dishes, whereas they are still common in Middle Eastern, North African, and Central Asian cuisines] and the appearance of new fat-based sauces, many thickened with flour."
The traditional Catholic cuisine was displaced piecemeal across Europe. In England, "the aristocracy dined on the new French cuisine. The gentry, by contrast, rejected this in favor of a middling bread-and-beef cuisine optimistically described as the national cuisine." Across most of western Europe, sweet and sour were segregated to different dishes and usually different courses, while beef and bread became higher profile, as did dairy, and sauces using fat and flour. French cuisine informed other European cuisines while at the same time absorbing and reinterpreting elements of them, a process that continued for the next couple centuries.
One of the major innovations of the time period was "middling cuisines," a prerequisite to modern cuisine: "Middling in the sense of bridging high and low cuisine, rich in fats, sugar, and exotic foodstuffs, featuring sauces and sweets, and eaten with specialized utensils in dedicated dining areas, middling cuisine became available to an increasing proportion of the population in the following centuries. Changes in political and nutritional theory underwrote this closing of the gap between high and humble cuisines. As more nations followed the Dutch and British in locating the source of rulers' legitimacy not in hereditary or divine rights but in some form of consent or expression of the will of the people, it became increasingly difficult to deny to all citizens the right to eat the same kind of food. In the West, the appearance of middling cuisines ran in close parallel with the extension of the vote. Reinforcing this, nutritional theory abandoned the idea that cuisine determined and reflected rank in society in favor of a single cuisine appropriate for every class of people.
"The growth of middling cuisines is what nutritionists call the 'nutrition transition,' the sequential global shift from diets composed largely of grains to diets high in sugar, oils, and meat ... the nutrition transition increases food security [but] brings in its wake many associated health problems, including increased incidence of strokes, heart attacks, obesity, and diabetes, and with them increased costs for society." (Of course, poverty and malnutrition have also decreased, so there's that.)
These middling cuisines began before the Industrial Revolution, but that was a huge driver in really bringing all these trends together and forming what we would recognize as modern cuisine. The advances of the Industrial Revolution brought about more efficient and cheaper forms of food preservation, refrigeration and rapid transportation of fresh food, extraordinary advances in agriculture (among them new fertilizers and pesticides), and so on, transforming the quality, quantity, and price of food more dramatically than any development had since the mastery of grain cookery thousands of years earlier. Those advances in transportation also made more feasible the waves of immigration that repopulated the United States after Native American tribes were decimated, and the arrival of many, many different immigrant groups, all with their own cuisines -- but not always with access to ingredients from home, and sometimes finding it easier to adapt what was available -- contributed to what is erroneously called the "melting pot," an American cuisine that was and I think remains in flux. Americans were also the first to begin using ice in their drinks -- and in a million other ways -- to such a great extent, and pioneered the commercial ice business.
The influence of French cuisine on modern cooking remained strong, and in the nineteenth and twentieth centuries, numerous dishes in non-French cuisines were created or altered with distinctive French touches -- adding butter instead of oil, reducing the spices and herbs in Greek dishes, dressing cold cooked vegetables or meats with mayonnaise or raw vegetables with vinaigrette. Bechamel -- originally Italian! but popularized by La Varenne -- showed up everywhere, with Russians using it as a piroshki filling with mushrooms or to thicken soup, Mexican chefs using it to dress fish, Indian chefs using it to dress eggplant and pumpkin. Beef Stroganov, unsurprisingly, is one of the most famous Russian dishes attempting to emulate French cooking, while bechamel was repopularized in northern Italy and found its way into lasagna.
A middling cuisine means, by extension, that pretty much everyone eats pretty much the same thing, at least in the broad strokes. Inevitably that means the specifics invite criticism. "Religious groups, conservatives, socialists, and feminists attacked modern middling cuisines. Some wanted the egalitarianism of modern culinary philosophy but rejected other aspects. For example, many reformers turned their backs on meat, white bread, and alcohol, developing alternative physiologies and nutritional theories to explain why vegetarianism [a term coined in the 1840s] or whole grains were superior. Others attacked domesticity, liberalism, and free trade, proposing alternative ways of organizing modern cooking, commerce, and farming. Yet others hoped it might be possible to return to a [purely] imagined egalitarian past, invoking agrarian and romantic traditions to criticize modern, industrialized cuisines."
One key to remember with the historical development of these things, and when encountering new such things in the wild, is, you know, the rejection tends to come first, with the rationale developed shortly thereafter. By the time you hear about it, that may not be clear, because once the rationale is developed, it's all "so I was doing research on Youtube and I discovered, holy fuck, bananas cause butt cancer," but really it's just that this one guy didn't fucking like bananas, or the idea of bananas, or he really liked the idea of conspicuously avoiding consumption of something, and later he came up with the butt cancer thing.
Okay! That brings us close enough to the present day to wrap up there. One more book down.
"Nomads around the fringes of the irrigated and cultivated areas included the Hebrews, whose daily fare large comprised barley pottages flavored with greens and herbs and flatbreads of barley and wheat, which they farmed in oases during the growing season or acquired by bartering their barren ewes and young rams. They made yogurt and fresh cheese from the milk of their flocks, which they ate accompanied by olive or sesame oil, honey, and grape must and date sweeteners (both of which were also called honey). To conserve their flocks, the source of their wealth, they enjoyed meat only on special occasions following the sacrifice of the 'fruit of the ground' (barley and wheat) and the 'firstlings of the flock' (lambs and kids) to Jehovah."
On various uses of grain:
"The ancient Romans built their empire on barley porridge. The Chinese enjoy rice porridge, the Indians rice and lentil porridge. Polenta (millet and later maize porridge) has sustained generations of Italian peasants. Similarly, grits and mushes were staples of the American colonies. Turkish families commemorate Noah's rescue from the flood with a porridge of mixed grains, fruit, and nuts. Left to sour or ferment slightly, boiled grain dishes became tangy, a flavor much appreciated in eastern Europe, for example.
"Bread -- baked flour and water paste -- was much more portable, but it needed more fuel. Early bread was nothing like our puffy square loaf. Because so much of the bran had to be sifted out to make white flour, white bread was reserved for the very rich until the nineteenth century. Most bread was dark and flat, made of one or more of the hard grains, such as barley, wheat, oats, and later rye, often with some mixture of beans and the starchier nuts, such as chestnuts or acorns.
"To run a city-state or provision an army, rulers had to make sure that grains were extracted from those who worked the land, then transported to cities and put in storage. Sometimes they demanded grain as tribute; sometimes they operated what were in effect agribusinesses farmed by slaves, serfs, or other barely free labor to produce grain; and later they exacted taxes to be paid in grain. Grains, more important, if less glamorous, than the precious metals, exotic wild animals, and beautiful slave girls that they also collected, were processed and redistributed to the ruler's household and bodyguard as pay in kind. Kings, emperors, landlords, and the great religious houses continued to collect grain long after money was invented."
More on the sheer labor of cooking:
"Before beginning to cook, women had to gather scraps of brush, seaweed, dung, furze -- anything that would burn. Steaming and boiling, which use the least fuel, were the commonest ways of cooking. A hot meal was often prepared only once a day, other meals being cold. Water for cooking, drinking, and washing, enough for one to five gallons a day per person (contemporary Americans use about seventy-two gallons a day), had to be carried from a river or well; three gallons weigh about twenty-four pounds. Salt was a luxury, reserved for making salty preserves that accompanied salt-free porridge or bread."
On blood, beliefs about which inform ancient meat cuisines, and sacrifice:
"Blood congealed into flesh, according to the Chinese, the Hebrews, and the Greeks. It was what food finally turned into in animals, said Aristotle. Consequently few societies were neutral about blood as food: some valued it highly, others prohibited it. In the first group were nomads who harvested blood from their animals, Christians who drained the blood of carcasses and used it to make sausages or thicken sauces, and the Chinese. Even today many Hong Kong Chinese mothers feed their children blood soup to sharpen their minds before examination. In the second group were Jews and Muslims, who slaughtered animals so as to drain all blood from the body.
"The sacrifice was followed by the sacrificial feast -- humans eating the gods' leftovers, which were charged with divine power. This might mean eating the flesh of sacrificed humans, a practice motivated not by hunger but by the logic of sharing the gods' leftovers. At least some northern Europeans ate the brains of the sacrificed in the third millennium BCE. The Cocoma people of Brazil, when admonished by the Jesuits for eating their dead and drinking an alcohol laced with ground bones, reportedly said that it 'was better to be inside a friend than to be swallowed by the cold earth.' The Aztecs ate slivers of flesh from those who had been sacrificed on the pyramids. More commonly, however, the feast featured roast meat from sacrificed animals."
Eating human flesh, whether or not in the context of sacrifice, is one of those topics that's subject to a lot of controversy and misinformation. Depictions of the Aztecs as bloodthirsty cannibals, for instance, were obviously pulpy nonsense cooked up much later, but a rejection of that depiction led to a widespread rejection of the notion of any Aztec cannibalism, which is also -- from what I understand, though Mesoamerican history is not at all my area -- false. Cannibalism in times of desperation is obviously widespread in the sense that you find it in any culture, in any time or part of the world, where there is such desperation and famine. Sacrificial or ritual cannibalism is sort of a different thing, though of course some historians and anthropologists theorize that cultures that resorted to desperation-induced cannibalism frequently enough simply developed rituals around it.
Which brings us to theories of other food rituals and food rules, the best known of which in the Western world are the Jewish dietary restrictions:
"Jewish culinary rules were laid out in Leviticus and other books of the Old Testament. Blood, animals with cloven hooves unless they chewed their cud, pork, animals with both fins and scales who lived in the water, and (echoing Persian practice) insects were all forbidden as foods. So was cooking meat in milk and dining with non-Jews. Temple priests followed rules of purification before sacrifice, slaughtered animals so that the lifeblood drained out, and refrained from offering impure fermented (corrupted) foods.
"In the mid-twentieth century, scholars offered opposing interpretations of Jewish food rules, particularly the ban on pork. Marvin Harris argued that they were health measures to prevent trichinosis." The Harris theory was still widely disseminated when I was in grad school, incidentally, and vaguely familiar to a lot of people outside the field, even though it isn't very good (or perhaps because it doesn't require much information). "Mary Douglas and Jean Soler contended that they were designed to create a distinct Jewish identity. The latter interpretation squares better with the simultaneous proliferation of culinary rules in the Persian Empire and the Indian states. With limited culinary resources, identity is most easily established by banning certain foodstuffs, cooking methods, and ways of dining. Pigs, being difficult to herd, were not popular with peoples of nomadic origin, so the force of the rule probably only became fully felt centuries later when Jews became a minority in pork-eating Roman or Christian lands."
Speaking of Rome:
"Every morning and evening, Roman infantrymen prepared meals like those they would have eaten at home on the farm. They boiled wheat to make wheat porridge or wheat pottage (wheat cooked with dried peas, beans, or lentils, a bit of oil, salt, and a little salt pork), which they dipped into with wooden spoons. Or they mixed whole-wheat flour, water, and salt and baked coarse whole-wheat bread, probably in the ashes of the campfire, to eat with a bit of cheese. In the morning, these foot soldiers ate standing up like animals outside their goatskin tents. In the evening, they ate seated on the ground in the tents like children or slaves. They drank water cut with wine or vinegar. Sacrifices on festival days, before they went into battle, and to celebrate victory, added a treat of boiled or roast beef. On the move or near the enemy, biscuit -- twice cooked bread that lasted a long time -- made an instant meal."
Making that porridge or potage required soldiers to grind grain, for which pack mules carried grindstones. "One of the soldiers assembled the grindstone, placing first a skin or cloth on the ground to catch the flour, then the squat lower grooved cylindrical stone, then the top stone, which rotated over the lower one. He squatted like a woman or slave over the grindstone. With one hand he rotated the upper stone using a peg near the circumference as a handle; with the other he poured handfuls of grain into a hole in the upper stone. The grain dribbled onto the lower stone and was sheared by the movement of the upper. The flour moved toward the circumference along grooves cut in the lower stone. He could grind enough meal for an eight-man squad in about an hour and a half with this rotary grinder, compared to at least four or five hours had he used a simple grindstone.
"Adopting the rotary grindstone involved a series of tradeoffs. It ground faster. The weight of the upper stone, not the weight of the grinder, did the shearing, making the work less exhausting. On the other hand, the rotary grindstone was heavier, more expensive, and more difficult to make than a simple grindstone. Nor could it produce the fine gradations of flour that the simple grindstone could deliver. ... If every squad of eight men required a mill and if at its height, the army comprised half a million men, then some sixty thousand grindstones were lugged over the Roman roads. A millennium and a half was to pass before any other European army was as well fed."
Roman feasts during the Empire:
"The dinner included appetizers, sauced dishes, and desserts, all spurned by republicans. For appetizers, diners might have lettuce (perhaps served with an oil and vinegar dressing), sliced leeks (boiled, sliced in rounds, and dressed with oil, garum [like fish sauce], and wine), tuna garnished with eggs on rue leaves, eggs baked in the embers, fresh cheese with herbs, and olives with honeyed wine.
"For the main course, slaves brought in dishes such as red mullet roasted and served with a pine nut sauce; mussels cooked with wine, garum, and herbs; sow's udder, boiled until soft and then grilled and served with sauce; chicken with a stuffing of ground pork, boiled wheat, herbs, and eggs; and crane with turnips in an herb-flavored vinegar sauce. Exotic fare, such as a pea dish, a chicken stew, and baked lamb with a sweet-and-sour sauce, attributed to Persia, added a cosmopolitan touch.
"Typically, sauces were made by pulverizing hard spices, usually pepper or cumin, but also anise, caraway, celery seed, cinnamon, coriander, cardamom, cassia, dill, mustard, poppy, and sesame, in a mortar. Nuts, such as almonds, filberts, and pine nuts, or fruits, such as dates, raisins, and plums, were added and the mass was worked to a paste. To this mixture, fresh herbs such as basil, bay, capers, garlic, fennel, ginger, juniper, lovage, mint, onion, parsley, rosemary, rue, saffron, savory, shallot, thyme, or turmeric were added, followed by garum and perhaps wine, must, honey, olive oil, or milk. The mixture was warmed to blend the tastes and sometimes thickened with wheat starch, eggs, rice, or crumbled pastry."
Outside of the Roman Empire, grain processing was as much as four times more labor-intensive, and in many parts of the world, unleavened bread (and steamed and boiled doughs in the form of pasta and dumplings, as in China) continued to be the norm. Eventually some cultures caught up to the Romans' efficiency, but beyond that, "There was to be little change in grain processing until the Industrial Revolution, and little change in the final cooking of grains until the twentieth century."
"To supplement the staple grain, oil seeds and olives were crushed in a variety of mills and mortars and pressed in a variety of presses. Sweeteners continued to be produced by many different methods -- sprouting grains (malt sugar in China), boiling down sap (palm sugar in India), boiling down fruit juices (grape and other fruit juices in the Middle East), and taking honeycombs from hives (honey in the Roman Empire). Alcoholic and lactic fermentations in the western half of Eurasia and mold ferments in the eastern half were used to make staple dishes (raised bread) and alcoholic drinks (wine, beer, and chiu) as well as to preserve foods (cheese and sausage in the Roman Empire, milk in the Middle East) and create condiments (fermented beans in China). Autolysis (self-digestion) produced garum in the Mediterranean and probably fish sauce in Southeast Asia."
An important point I brought up earlier about food processing is mentioned as we moved through the next millennium and a half:
"Cooking was a form of alchemy, the most sophisticated understanding of changes in matter then available. Cooking and alchemy used the same tools and equipment. Both sought to find the real nature or essence of a natural substance by applying the purifying power of fire. Just as a crude ore had to be refined in the fire to release the pure shining metal, so raw wheat or sugarcane had to be similarly refined to extract the pure white flour (originally "flower") or gleaming sugar. Culinary processes such as sugar refining and bread baking were thus potent metaphors for spiritual progress. Unlike our contemporary understanding of natural food as having received only minimal processing, this earlier understanding was that processing and cooking were essential to reveal what was natural."
This all reflects one of the major principles of ancient culinary philosophy: the theory of the culinary cosmos, which led to the practice of eating only cooked food (even most fruits were not often eaten uncooked in Europe, for instance), and, for those who could afford the options, to eat food that "balances the temperament," an idea that trickles down today in a lot of horseshit diets.
This balancing the temperament stuff was grounded in the idea of the "humors," or maybe it's better to think of them as both coming from the same worldview, and it informed the view of what we would now term "healthy eating." "In preparing food for their noble employers, cooks were as aware of the need to balance the humors as we are today of, say, the need to have all food groups represented. Root vegetables such as turnips were by nature earthy (dry and cold) and thus better left to peasants. Chard, onions, and fish were cold and wet, so that frying was appropriate. Mushrooms were so cold and wet that they were best avoided entirely. Melons and other fresh fruit were not much better, being very moist and thus thought likely to putrefy in the stomach. Grapes were best served dried as raisins, quinces were dried and cooked with extra sugar -- warm in humoral theory -- to make quince paste. Red wine tended to be cold and dry, so it was best served warm with added sugar and spices."
Another major principle was the hierarchical principle, which broadly called for eating according to your station in life -- a high cuisine for the court, a humble cuisine for the poor. Roughly in this time period, that hierarchy was extended in many parts of the world to include a higher cuisine for holy men and intellectuals than for the unenlightened, rather than being based only on overt political power.
The third ancient culinary principle was that of sacrifice, which had largely been phased out in the Axial Age and replaced with new religious rules for eating: "these rules identified preferred ingredients and dishes, often ones believed to enhance contemplation, such as meat substitutes (fish, tofu, gluten), sweetened soft fruit and nut drinks, or stimulating drinks such as tea, coffee, and chocolate. They specified how to process and cook foods, including guidelines for slaughtering, and laid down rules about how cooks should purify themselves, whether fermented foods were acceptable, and which foods could and could not be combined. A third cluster of rules specified mealtimes, days of fasting and feasting, and who could dine with whom.
"The rules, stricter for religious elites than for ordinary believers, were formulated and reformulated for centuries because the founders of the religions, although they relied on culinary metaphors to explain beliefs and doctrines, rarely laid down clear and consistent regulations for cooking and eating. Christians, for example, were not required to fast until the fourth or fifth century. Then they were instructed to fast on about half the days of the year. Today, in the Roman Catholic Church, fasting has been reduced to a minimum."
"Even more important in the dissemination of the new cuisines were monasteries, shorthand for permanent religious houses. Like courts, they were places where all ranks of society met, from clerics to their servants and slaves. Like court kitchens, monastery kitchens were huge and complex, turning out different meals for different ranks: noble and aristocratic visitors; passing merchants, monks, and nuns; the poor and indigent; the sick; and students studying in the monastery school. ... Like courts, they invested in food-processing equipment like gristmills, oil presses, and sugar mills, processing and adding value to foodstuffs. These they sold or offered as gifts, thereby creating loyalty. Like courts, monasteries were part of networks that crossed state boundaries, in this case by the movement of religious orders and missionaries rather than marriage."
"As theocratic cuisines spread, so did their preferred raw materials: plants and sometimes animals. Particularly important were the transfers of southeastern and Chinese plants to Buddhist India, Indian plants to Buddhist China, Chinese plants to Korea and Japan, Indian plants to Islamic lands, and European plants to the Americas through the Columbian Exchange. Royal and monastic gardens and large estates transplanted, ennobled, and grew sugarcane, rice, grapevines, tea, coffee, and other crops essential to the new cuisines."
Here we come to one of my favorite topics in culinary history:
"Whereas culinary diffusion prior to world religions had primarily meant emulating or rejecting neighboring high cuisines, with world religions the relation between successive cuisines became more complex. 'Fusion,' the term so often used, does not do justice to the variety of interactions. One cuisine could be layered over another, as happened with the Spanish conquests in the Americas, the conquerors eating Catholic cuisine, the indigenous retaining their own cuisine. Specific dishes, techniques, plants, and animals might be adopted, as Europeans, for example, adopted distilling, confectionary, and citrus from Islam."
Oh, if I had a nickel for every dipshit going on about how some dish or approach isn't "authentic."
There is no authentic cuisine. All cuisines are in flux and ever have been. Lots of people carry around a sense of normalcy based on a sphere that extends for a couple hundred miles and a couple dozen years, and think that sense of normalcy reflects something real, something other than their memory of food they've experienced. That's not how it works. Italian food didn't suddenly become Italian food when tomatoes arrived on the boot, or when immigrants in the northeast US started making meatballs. And putting tomato sauce on that pasta for the first time, making those first giant meals of spaghetti and meatballs, didn't invalidate those meals either.
Nobody worried about this bullshit when they actually fucking cooked. It's the hobbyhorse of the dilettante.
Meanwhile! In the Mongol Empire!
"Twenty seven soups dominate the ninety-five food recipes [in Hu's Proper and Essential Things]. The centerpiece of Mongol cuisine, these soups could be quite liquid or thickened to become solid. The basic recipe went as follows:
"1: Chop meat on the bone (usually mutton, but also game such as curlew, swan, wolf, snow leopard) into pieces. Boil in a cauldron of water until tender. Strain the broth and cut up the meat.
"2: Boil the broth with a variety of thickeners, vegetables, and tsaoko cardamom.
"3: Add the meat.
"4: Season to taste with salt, coriander, and onions.
"For a traditional Mongol taste, the thickeners might be chickpeas, hulled barley, or barley meal. To give the soup a Persian touch, it was thickened with aromatic rice or chickpeas, seasoned with cinnamon, fenugreek seeds, saffron, turmeric, asafetida, attar of roses, or black pepper, and finished with a touch of wine vinegar. For a Chinese taste, it was thickened with wheat-flour dumplings and glutinous rice powder or rice-flour noodles, and flavorings of ginger, orange peel, soybean sauce, and bean paste. In this way, the soup of the khans could be adjusted to the preferences of the peoples they had conquered."
Authenticity my ass.
This is basically how pizza adapts to local culinary niches today, and of course what McDonald's does internationally.
Now coffee enters the scene, thank God:
"Coffee, like wine, was an aid to union with the divine. Long before the time of the Sufis, coffee beans, the fruit of a bush native to the highland forests of southwestern Ethiopia, had been chewed like a nut or mixed with animal fat to make a portable, satisfying, and stimulating food for warriors." If you haven't seen the way coffee grows, the bean is just the seed, and of course has a softer fruit surrounding it (which is also lightly caffeinated, and is sometimes used now in some coffee-growing regions to make a vaguely hibiscus-like drink). "Coffee plants were naturalized in Yemen perhaps as early as the sixth century BCE when the Abyssinians invaded Arabia. Later, a new way of preparing coffee by toasting the beans, grinding them, and brewing them with hot water was developed, perhaps in Iran. The Arabic word for coffee, qahwah, probably derives from a word meaning to have little appetite and hence to be able to do without. It had been first applied to wine and later to coffee (which suppressed the desire to sleep). Sufi pilgrims, traders, students, and travelers consumed coffee to keep awake during ceremonies and induce a sense of euphoria, spreading its use throughout the Islamic world between the thirteenth and fifteenth centuries."
Islam introduced coffee to the West, as it did so many things, and that's not all! It also introduced stuff to have with coffee.
"Sugar cookery was introduced from Islam in the twelfth century by a physician known as Pseudo-Messue. The English words syrup, sherbet, and candy all have Arabic roots. Medicinal electuaries, pastes of spices and drugs, and comfits, sugar-coated spices, were the distant forerunners of candy. Sugared spices did not break the fast, Thomas Aquinas said, because 'although they are nutritious themselves, sugared spices are nonetheless not eaten with the end in mind of nourishment, but rather for ease in digestion.' It was an important decision, both because it gave medical respectability to sugar and because it foreshadowed later arguments about chocolate."
Arab fruit pastes became Portuguese quince marmelada, later inspiring the citrus marmalades that are more familiar to Americans, and the sweet fried doughs used to celebrate the end of Ramadan inspired similar fried doughs in Catholic traditions, eaten before the Lenten fast: doughnuts, beignets, etc. (The Brits have their pancakes.)
Along with all this came distillation and better booze. Not too shabby.
"In the early fourteenth century, cookbook manuscripts began appearing across Europe.... Rarely were these cookbooks step-by-step manuals, being, rather, testimonials to a ruler's fine cuisine or aide-memoires to professional cooks. With the invention of printing, the number increased again."
Medieval history is not at all my area of expertise, but this broadly fits my understanding of the ... history of professionalization, sort of, the history of procedural rigor, if you will.
The dissemination of cookbooks further contributed to the Westernization of Islamic dishes in Europe, in much the same way that nineteenth and twentieth century cookbooks Americanized immigrant and foreign cuisines:
"Al-sikbaj (meat cooked in a mixture of sweetener and vinegar) was transformed into fried or poached fish (or chicken, rabbit, or pork) in an acid marinade of vinegar or orange (escabeche), perhaps the origin of aspic." Al-sikbaj was a characteristic dish of the Moors who conquered Spain, but has since died out in the Muslim world. "Ruperto de Nola's Libre del coch included thin noodles, bitter oranges, fried fish, escabeche, almond sauces, and almond confections. Martinez Motino's Arte de cocina contained several recipes for meatballs and capirotada, and one for couscous. It also had one for Moorish hen, roast chicken cut into pieces, simmered with bacon, onion, broth, wine, and spices -- which were not named, but probably included pepper, cinnamon, and cloves -- and then enlivened with a final dash of vinegar. The bacon and wine were typically Christian, but the sour-spicy sauce justifies the name."
So here's the other thing about sugar: it used to be in fucking everything. The line between "sweet" and "savory" isn't just a recent thing; it's the defining characteristic of the modern palate. Candies and confectionery used to include not just candied oranges and cherries, but carrots and turnips. Meat dishes in high cuisines were regularly served in sweet sauces -- no, not like at that Chinese place, no, not like barbecue sauce, like really noticeably sweet, not tangy.
Then that changed.
If you have to pick a point where things start to change, it's 1651, when Francois Pierre de La Varenne published Le Cuisinier Francois, which was widely translated, and which inspired numerous imitators. The middle of the seventeenth century saw a significant shift in Western tastes characterized by two changes: "the disappearance of spices and sugar from savory dishes [notice how rarely we use 'baking' spices like cinnamon, clove, nutmeg, etc., in savory dishes, whereas they are still common in Middle Eastern, North African, and Central Asian cuisines] and the appearance of new fat-based sauces, many thickened with flour."
The traditional Catholic cuisine was displaced piecemeal across Europe. In England, "the aristocracy dined on the new French cuisine. The gentry, by contrast, rejected this in favor of a middling bread-and-beef cuisine optimistically described as the national cuisine." Across most of western Europe, sweet and sour were segregated to different dishes and usually different courses, while beef and bread became higher profile, as did dairy, and sauces using fat and flour. French cuisine informed other European cuisines while at the same time absorbing and reinterpreting elements of them, a process that continued for the next couple of centuries.
One of the major innovations of the time period was "middling cuisines," a prerequisite to modern cuisine: "Middling in the sense of bridging high and low cuisine, rich in fats, sugar, and exotic foodstuffs, featuring sauces and sweets, and eaten with specialized utensils in dedicated dining areas, middling cuisine became available to an increasing proportion of the population in the following centuries. Changes in political and nutritional theory underwrote this closing of the gap between high and humble cuisines. As more nations followed the Dutch and British in locating the source of rulers' legitimacy not in hereditary or divine rights but in some form of consent or expression of the will of the people, it became increasingly difficult to deny to all citizens the right to eat the same kind of food. In the West, the appearance of middling cuisines ran in close parallel with the extension of the vote. Reinforcing this, nutritional theory abandoned the idea that cuisine determined and reflected rank in society in favor of a single cuisine appropriate for every class of people.
"The growth of middling cuisines is what nutritionists call the 'nutrition transition,' the sequential global shift from diets composed largely of grains to diets high in sugar, oils, and meat ... the nutrition transition increases food security [but] brings in its wake many associated health problems, including increased incidence of strokes, heart attacks, obesity, and diabetes, and with them increased costs for society." (Of course, poverty and malnutrition have also decreased, so there's that.)
These middling cuisines began before the Industrial Revolution, but that was a huge driver in really bringing all these trends together and forming what we would recognize as modern cuisine. The advances of the Industrial Revolution brought about more efficient and cheaper forms of food preservation, refrigeration and rapid transportation of fresh food, extraordinary advances in agriculture (among them new fertilizers and pesticides), and so on, transforming the quality, quantity, and price of food more dramatically than any development had since the mastery of grain cookery thousands of years earlier. Those advances in transportation also made more feasible the waves of immigration that repopulated the United States after Native American tribes were decimated, and the arrival of many, many different immigrant groups, all with their own cuisines -- but not always with access to ingredients from home, and sometimes finding it easier to adapt what was available -- contributed to what is erroneously called the "melting pot," an American cuisine that was and I think remains in flux. Americans were also the first to use ice -- in their drinks and in a million other ways -- on such a scale, and they pioneered the commercial ice business.
The influence of French cuisine on modern cooking remained strong, and in the nineteenth and twentieth centuries, numerous dishes in non-French cuisines were created or altered with distinctive French touches -- adding butter instead of oil, reducing the spices and herbs in Greek dishes, dressing cold cooked vegetables or meats with mayonnaise or raw vegetables with vinaigrette. Bechamel -- originally Italian! but popularized by La Varenne -- showed up everywhere, with Russians using it as a piroshki filling with mushrooms or to thicken soup, Mexican chefs using it to dress fish, Indian chefs using it to dress eggplant and pumpkin. Beef Stroganov, unsurprisingly, is one of the most famous Russian dishes attempting to emulate French cooking, while bechamel was repopularized in northern Italy and found its way into lasagna.
A middling cuisine means, by extension, that pretty much everyone eats pretty much the same thing, at least in the broad strokes. Inevitably that means the specifics invite criticism. "Religious groups, conservatives, socialists, and feminists attacked modern middling cuisines. Some wanted the egalitarianism of modern culinary philosophy but rejected other aspects. For example, many reformers turned their backs on meat, white bread, and alcohol, developing alternative physiologies and nutritional theories to explain why vegetarianism [a term coined in the 1840s] or whole grains were superior. Others attacked domesticity, liberalism, and free trade, proposing alternative ways of organizing modern cooking, commerce, and farming. Yet others hoped it might be possible to return to a [purely] imagined egalitarian past, invoking agrarian and romantic traditions to criticize modern, industrialized cuisines."
One key to remember with the historical development of these things, and when encountering new such things in the wild, is, you know, the rejection tends to come first, with the rationale developed shortly thereafter. By the time you hear about it, that may not be clear, because once the rationale is developed, it's all "so I was doing research on Youtube and I discovered, holy fuck, bananas cause butt cancer," but really it's just that this one guy didn't fucking like bananas, or the idea of bananas, or he really liked the idea of conspicuously avoiding consumption of something, and later he came up with the butt cancer thing.
Okay! That brings us close enough to the present day to wrap up there. One more book down.
Friday, November 11, 2016
the invention of science
I am behind, and in particular I have a backlog of Kindle books in a folder (okay, Kindle calls them "collections") that I keep specifically for Kindle books I've read but haven't yet transcribed notes from -- some for work, some for here, some for fiction projects.
On top of that, I had an ongoing series of thoughts about the election building up, mostly in the form of "hoo geez" and "you gotta be kiddin me" thought balloons popping up in response to things other people said as the election first loomed and then happened. One of the purposes of this weirdly multipurpose blog is to be a safety valve for social media, so that I vent here instead of there, so I was going to rail about people who ask their Trump supporting friends to please stop saying their nasty things in front of them (instead of, you know, actually confronting those friends on their racism, xenophobia, and other abominable beliefs - today's "stop telling racist jokes in front of me, teehee"), and about the "Bernie coulda won it, I tells ya" narrative, and the prematurity of all the other hot takes.
But I think I am, as you must be, too weary of reading postmortem analysis and reactions. Which is not to say I am not engaged - just the opposite, but the last thing I feel like doing right now is swatting down nonsense just to vent about it, because it doesn't feel like it would serve any therapeutic point this time.
I will say that my instinct as a historian says that the dumbest thing you can do is to marry yourself to some model of "what happened" that you read or devise in the first week or two after the election, because you're just going to weigh new data and new models against the one that you've "picked," even though the only reason you've picked it is, ultimately, because of the imagined need for an immediate explanation -- the irrational belief that an incomplete or misleading explanation today is better than an accurate explanation tomorrow. This is a small part of what makes teaching history to people so hard. Among other things, history sometimes includes things they have lived through, and that gives them a remarkable capacity to believe that living as a bystander to an event with very, very small access to data makes them an expert, resistant to the view of the event that has developed in retrospect. This is, of course, a bigger problem than just in the realm of getting people to understand history, but everybody walks down different hallways of the house, and this is one of mine.
Moving on, away from the present and away from politics, let me start by finishing off this blog entry on a history of the scientific revolution.
When I started at Hampshire - before I started, actually, at the open house or the pitch at the interview or somewhere in that process - one of the ways they explained the whole approach to things there was by explaining the Div III, which is a mandatory thesis-like project. "See, because there are no majors at Hampshire, and no minors, you can take the things you're interested in and pursue them jointly instead of separately. You don't have to major in psychology and minor in art history. You can do your Div III on the history of the impact of psychological ideas on the visual arts."
I made up that example, but it's a representative one. It's a decent pitch, especially aimed at high school kids (and the parents thereof) who have probably been starved for any kind of serious intellectual or creative stimulus, at least within the bounds of their classrooms.
In the end, though, I had trouble living it out. I wanted to study psychology AND art history. Not just their intersection. I wanted to know about the parts of psychology that had absolutely no impact on art history, and the parts of art history that had nothing to do with psychology. The freedom to cross the streams was well and good, but I quickly took it for granted and wanted to be able to not have to cross the streams.
For instance, I wound up "majoring" - not that Hampshire has majors - in pop culture, but had strong interests in cognitive science and the then-novel study of online culture and communities. What I had zero interest in was combining any of those things for my Div III, which was -- at least in draft form, since I dropped out and transferred before completing it -- a long and rambling thesis on representations of superheroes in comic books and other media, which if it was grounded in anything was grounded more in literary theory and gender studies than cognitive science (though I led up to it with a fifty-page research paper on the role of nostalgia in the history and appeal of Batman, which is at least cogsci-adjacent), and had no overlap with the work I had done on online communities.
So like I've covered before, I have a variegated academic background, and as a grad student in an interdisciplinary program, I took a number of courses on the history and philosophy of science -- if there was such a thing as a "graduate minor," that would pretty much be mine, not because HPS and religious studies are two key branches in the history of ideas -- which would be a valid reason -- but simply because HPS also interested me, even though at that point in my academic career I was supposed to be narrowing my interests, not expanding them.
So it's something I try to keep up with, albeit not at as high a priority as religious studies.
David Wootton's The Invention of Science: A New History of the Scientific Revolution was published with a good deal of fanfare last fall. I'm generally reluctant to jump on the New Release section, for a number of reasons:
1: New releases on scholarly topics are, when they're good work, ultimately of the most value to working scholars in those areas, because those are the readers who are versed in the conversation. After all, if they have something significant to say, other scholars working in that area are going to need to respond to it, and won't have had time yet.
2: New releases on broad historical topics don't always have a particularly good reason to exist, apart from everyone in the field already having read the canonical works on the topic, and those works maybe being out of print. The existence of a new book on the topic is not evidence of the existence of new material. This is a constant source of frustration for me when I write about history, because publishers often stipulate in their style guide that at least half of your sources need to have been published within the last X years, where X is a fairly small number; it is very rarely the case with history that half of the good sources, or a tenth of the good sources, are that recent, especially when you're writing for a general audience and have no need to cite recent journal articles on minor points. (If you're wondering: yes, as a result I have to pad the bibliography with inferior recent work, or recent work that is good but not as directly related to what I'm writing about, in order to balance out the number of older but necessary sources I use.)
3: There is a ... boy, how do I not sound like an asshole here. There is a certain kind of reader I don't want to be. A certain kind of thinker I don't want to be, who's read Malcolm Gladwell and Jared Diamond, both of whom are moderately to completely awful, but nothing that's more than a couple decades old. That's the kind of reading diet that leads to a particularly shallow understanding of things -- but like Gladwell's work in general, to pick on him a little more, it's not a diet that's really designed for actual understanding so much as the satisfied feeling of the illusion of understanding, a junk food eureka.
I made an exception here because the reviews were enough to convince me that #2 and #3 weren't concerns, but that sort of puts a responsibility on me -- even if I'm the only one who perceives or cares about that responsibility -- to keep track of the Invention of Science conversation in the next few years, so I don't make the same mistake I discussed above in my discussion of politics.
Wootton's book traces what he calls the invention of modern science, "between 1572, when Tycho Brahe saw a nova, or new star, and 1704, when Newton published his Opticks... There were systems of knowledge we call 'sciences' before 1572, but the only one which functioned remotely like a modern science, in that it had sophisticated theories based on a substantial body of evidence and could make reliable predictions, was astronomy, and it was astronomy that was transformed in the years after 1572 into the first true science."
This idea, that science before this period was distinct from modern science, is key, and is part of a broader shift in thinking that affected not just the physical sciences but all scholarly pursuits. For that matter, it's not a coincidence that westerners don't really talk about "fiction" as such until the Scientific Revolution, when "nonfiction" becomes rigorously defined. I have banged my head against the wall repeatedly trying to get people to understand this. Obviously I'm not saying that no one wrote any fictitious stories before a certain point in time -- but the modern reader, who thinks of one set of shelves in the library as nonfiction, and another as fiction, is a fairly recent creature. When people describe the Bible or stories in the midrash, for instance, as "fiction," they're imposing a modern frame that didn't exist for the people who created and first received those texts. This isn't splitting hairs, because it's just as important to oppose the fundamentalists who insist that, if the Bible isn't fiction, it is therefore literally true. Again: it's not a coincidence that this claim became as popular when it did, and became politicized, at the point in time that it did. Both of these claims -- that the Bible is fiction, that the Bible is literally true -- are implicitly based on false premises about the nature of sacred texts in antiquity.
To use another example, history as we know it -- that is, the field of history as we know it -- is remarkably recent, in the sense that the idea that the goal of the historian should be to accurately record or recount the details of historical events, drawing on evidence wherever possible, dates to about the Enlightenment, at least in the West. This is kind of fucking crazy, and I feel like people who haven't taken historiography in grad school don't fully believe me when I talk about it, but it goes to Wootton's point. We take scientific thinking for granted now, to an amazing degree -- when you read fantasy novels and whatnot, a ridiculous level of scientific thinking is often ascribed to members of civilizations who would not necessarily be in a position to have developed it, for no apparent reason other than the fact that basic elements of this thinking have so permeated modern thought that it is taken for granted.
The layman often thinks of the history of science as a series of discoveries, rather than creations of different ways of thinking -- and doesn't usually have a good way to explain that most of the scientists famous now for major contributions also pursued wrong avenues (like Newton's extensive work in alchemy) or had no trouble accepting things that would be easily disproven (old beliefs in biology are full of this, and I don't just mean beliefs about race or gender that are motivated by politics and power structure -- there was a simple lack of rigor and attention to detail, by modern standards). If you think science is just a timeline of discoveries, you think the only difference between a clever person in 2016 and a clever person in 1316 is that the clever person in 1316 lives in a world where a bunch of shit hasn't been discovered yet, but that the two basically see their respective worlds the same way and deal with new information the same way, and this is enormously wrong. Even our ability to realize this is not an ability available to all the clever people in history.
One of the ideas that is ingrained in the modern mind that Wootton tackles straight off the bat is the idea of scientific progress. People may debate whether or not history itself "progresses" -- after all, everyone bitches and moans about trivial horseshit like "oh, the kids should still learn cursive" because they're attached to the first Polaroid of the world they watched come into focus -- but everyone today basically sees science as constantly moving forward.
This is a very new idea.
As Wootton points out, until the period he's talking about, not only did people not conceive of "the history of humanity ... as a history of progress," but the rate of technological advancement wasn't just slow, it sometimes went backwards. "The Romans were amazed by stories of what Archimedes had been able to do; and fifteenth-century Italian architects explored the ruined buildings of ancient Rome convinced that they were studying a far more advanced civilization than their own." Technologies were developed, lost, forgotten. This is inconceivable now -- so much so that the "lost technology" trope, when it pops up in popular culture, refers not to sophisticated architecture but to ancient electronics, pyramid magic, and other modern or futuristic technologies transposed to an ancient setting.
Wootton also points to the way Shakespeare and his contemporaries depict ancient Rome as technologically identical to Renaissance Europe, with mechanical clocks and nautical compasses. In Borges's words, "all characters are treated as if they were Shakespeare's contemporaries. Shakespeare felt the variety of men, but not the variety of historical eras. History did not exist for him." As Wootton points out, this is a misleading charge -- Shakespeare was well versed in history as history was understood in his day. What he lacked was an understanding of historical change of the sort that we now treat as synonymous with "history."
"We might think that gunpowder, the printing press, and the discovery of America in 1492 should have obliged the Renaissance to acquire a sense of the past as lost and gone for ever, but the educated only slowly became aware of the irreversible consequences that flowed from these crucial innovations. It was only with hindsight that they came to symbolize a new era; and it was the Scientific Revolution itself which was chiefly responsible for the Enlightenment's conviction that progress had become unstoppable. By the middle of the eighteenth century, Shakespeare's sense of time had been replaced by our own."
The term "the Scientific Revolution" is itself one that can be interrogated, and Wootton explains that it's only in the 20th century -- the mid-20th century, at that -- that the term came to mean the creation of modern science as exemplified by Newton's physics. The inspiration for the term is not the American or French Revolution -- both of which were referred to as revolutions as they occurred -- but the Industrial Revolution. Like the Scientific Revolution, the Industrial Revolution was named after the fact -- and although it occurred later, it was named first. Wootton correctly points out that any term introduced by historians after the fact is a term that will be challenged by later historians after THAT fact, which is one of the things that will make your eyes glaze over and start unsubscribing from mailing lists. But anyway.
"In medieval universities, the core curriculum consisted of the seven liberal 'arts' and 'sciences': grammar, rhetoric, and logic; mathematics, geometry, music, and astronomy." There is some digression about what was meant by "art" and "science" at the time, the upshot of which is that all seven were considered both, whereas philosophy and theology were sciences but not arts. Anyway. "Moreover, these sciences were organized into a hierarchy: the theologians felt entitled to order the philosophers to demonstrate the rationality of belief in an immortal soul; the philosophers felt entitled to order the mathematicians to prove that all motion in the heavens is circular ... A basic description of the Scientific Revolution is to say that it represented a successful rebellion by the mathematicians against the authority of the philosophers, and of both against the authority of the theologians."
Da Vinci, for instance, said, "No human investigation can be termed true science if it is not capable of mathematical demonstration. If you say that the sciences which begin and end in the mind are true, that is not to be conceded, but is denied for many reasons, and chiefly the fact that the test of experience is absent from the exercises of the mind, and without it nothing can be certain." The reason we no longer class philosophy and theology as sciences is that we think of science as dealing not just with theory but with experiment: testable, verifiable, observable, repeatable results. To return to the refrain: this is a relatively new idea.
One of the most interesting parts of Wootton's book is something I'm still pondering: "before Columbus discovered America in 1492, there was no clear-cut and well-established idea of discovery; the idea of discovery is, as will become apparent, a precondition for the invention of science."
Now, maybe I don't need to point this out, but the idea of "Columbus discovering America" is not important to Wootton's claim here: that is, it is not important that other non-Americans came to the continent before he did, nor that, since there were people fucking living here, none of these non-Americans actually discovered the damn thing. What's key is the European world absorbing the idea of Columbus discovering America and subsequently engaging with America: the phenomenon, so to speak, of "Columbus discovered America." There are other books on the physical and cultural effects of Columbus's voyages; this isn't that. This is about the scientific and intellectual reverberation of the concept of "discovery."
(Wootton also argues that, while everyone correctly dismisses the "Columbus proved the Earth was round" nonsense sometimes taught in elementary schools, the voyages to America did nevertheless change the European conception of the globe, by proving the existence of antipodean land masses, which were believed to be impossible. The Columbian contact did change the understanding of the Earth, then, just not in as simplistic a way as changing it from flat to round.)
"It is discovery itself which has transformed our world," Wootton points out, "in a way that simply locating a new land mass could never do. Before discovery history was assumed to repeat itself and tradition to provide a reliable guide to the future; and the greatest achievements of civilization were believed to lie not in the present or the future but in the past, in ancient Greece and classical Rome. It is easy to say that our world has been made by science or by technology, but scientific and technological progress depend on a pre-existing assumption, the assumption that there are discoveries to be made. ... It is this assumption which has transformed the world, for it has made modern science and technology possible."
Wootton backs up his proposition of the 1492 (essentially) invention of the idea of discovery with linguistic evidence, investigating the various Romance language terms for discovery and related concepts, and how they were used. This is the area where I expect to see other historians responding, refuting, or amplifying -- his evidence is compelling enough, but I'm in no position to tell whether he's cherry-picked it, how much he's interpreting it to favor his argument, etc. See what I mean about the problem with recent scholarly works (at least when they're outside your area of expertise)?
This is a key part of his "discovery" argument: "Although there were already ways of saying something had been found for the first time and had never been found before, it was very uncommon before 1492 for people to want to say anything of the sort, because the governing assumption was that there was 'nothing new under the sun.' The introduction of a new meaning for descrobir implied a radical shift in perspective and a transformation in how people understood their own actions. There were, one can properly say, no voyages of discovery before 1486, only voyages of exploration. Discovery was a new type of enterprise which came into existence along with the word."
Wootton makes a distinction between "discovery" and words like "boredom" and "embarrass," in that we accept that before there was a word for it, people felt bored; before "embarrass" acquired its modern meaning in the 19th century, people could feel embarrassed. But "discovery" is "'an actor's concept' ... you have to have the concept in order to perform the action ... So although there were discoveries and inventions before 1486, the invention and dissemination of a word for 'discovery' marks a decisive moment, because it makes discovery an actor's concept: you can set out to make discoveries, knowing that is what you are doing."
There is an interesting digression about the great sociologist of science Robert K Merton, to whom we owe the phrases "role model," "self-fulfilling prophecy," and "unintended consequence," and who wrote an entire book about the phrase "standing on the shoulders of giants." What Merton was unable to popularize was the idea of multiple discovery: the idea that "there are nearly always several people who can lay claim to a discovery, and that where there are not this is because one person has so successfully publicized his own claim that other claims are forestalled."
"We cannot give up the idea that discovery, like a race, is a game in which one person wins and everyone else loses. The sociologist's view is that every race ends with a winner, so that winning is utterly predictable. If the person in the lead trips and falls, the outcome is not that no one wins but that someone else wins. In each race there are multiple potential winners." Think of how many time travel stories revolve around, I don't know, going back in time to keep somebody from inventing the time machine. That's taking the Great Man view of history -- the view that prevailed in the 19th century, when history was portrayed as the result of specific heroic egos. Other than in fiction and, perhaps, biography, it is not a view that is looked on kindly anymore -- no one person, no one thousand people, can be said to be uniquely responsible for the major events of history, not because that person did not do the things they did, not because those things lacked significance, but because other people would do other significant things to move history in largely the same way. If you go back in time and kill Li'l Lincoln when he's a toddler, you don't wake up in a 2016 that still has slavery -- you simply find out that slavery ended under some other presidency instead. We pretty much accept that -- like I said, except in fiction -- but this centrality of the individual in discovery persists, even when as Wootton points out, numerous people independently discovered the sine law of refraction, the law of fall, Boyle's law, oxygen, the telescope.
I am skipping over lots and lots of detail about the effects of the telescope and the microscope, in part because it doesn't excerpt well.
Wootton also goes into a dispute with relativists, especially "the strong programme," which I think is too inside baseball to get into here, especially since as someone not employed by a university, I have no reason to be invested in the fight. (Okay, that's not entirely true. The Science Wars are important, and impact not only the work I do in educational publishing and reference books, but overall science literacy and public policy. But Wootton's comments on it are not the part of this book that will stay with me the longest, even as they frame the rest of it.) As a fan of Kuhn but not of many of the historians of science who have followed in Kuhn's footsteps -- the relativists, in other words -- but who has also been accused of relativism because I disagree with the way history is constructed by strict realists, I think I agree with Wootton when he says that "this book will look realist to relativists and relativist to realists."
On top of that, I had an ongoing series of thoughts about the election building up, mostly in the form of "hoo geez" and "you gotta be kiddin me" thought balloons popping up in response to things other people said as the election first loomed and then happened. One of the purposes of this weirdly multipurpose blog is to be a safety valve for social media, so that I vent here instead of there, so I was going to rail about people who ask their Trump supporting friends to please stop saying their nasty things in front of them (instead of, you know, actually confronting those friends on their racism, xenophobia, and other abominable beliefs - today's "stop telling racist jokes in front of me, teehee"), and about the "Bernie coulda won it, I tells ya" narrative, and the prematurity of all the other hot takes.
But I think I am, as you must be, too weary of reading postmortem analysis and reactions. Which is not to say I am not engaged - just the opposite, but the last thing I feel like doing right now is swatting down nonsense just to vent about it, because it doesn't feel like it would serve any therapeutic point this time.
I will say that my instinct as a historian says that the dumbest thing you can do is to marry yourself to some model of "what happened" that you read or devise in the first week or two after the election, because you're just going to weigh new data and new models against the one that you've "picked," even though the only reason you've picked it is, ultimately, because of the imagined need for an immediate explanation -- the irrational belief that an incomplete or misleading explanation today is better than an accurate explanation tomorrow. This is a small part of what makes teaching history to people so hard. Among other things, history sometimes includes things they have lived through, and that gives them a remarkable capacity to believe that living as a bystander to an event with very, very small access to data makes them an expert, resistant to the view of the event that has developed in retrospect. This is, of course, a bigger problem than just in the realm of getting people to understand history, but everybody walks down different hallways of the house, and this is one of mine.
Moving on away from the present, away from politics, let me start by finishing off this blog entry on a history of the scientific revolution.
When I started at Hampshire - before I started, actually, at the open house or the pitch at the interview or somewhere in that process - one of the ways they explained the whole approach to things there was by explaining the Div III, which is a mandatory thesis-like project. "See, because there are no majors at Hampshire, and no minors, you can take the things you're interested in and pursue them jointly instead of separately. You don't have to major in psychology and minor in art history. You can do your Div III on the history of the impact of psychological ideas on the visual arts."
I made up that example, but it's a representative one. It's a decent pitch, especially aimed at high school kids (and the parents thereof) who have probably been starved for any kind of serious intellectual or creative stimulus, at least within the bounds of their classrooms.
In the end, though, I had trouble living it out. I wanted to study psychology AND art history. Not just their intersection. I wanted to know about the parts of psychology that had absolutely no impact on art history, and the parts of art history that had nothing to do with psychology. The freedom to cross the streams was well and good, but I quickly took it for granted and wanted to be able to not have to cross the streams.
For instance, I wound up "majoring" - not that Hampshire has majors - in pop culture, but had strong interests in cognitive science and the then-novel study of online culture and communities. What I had zero interest in was combining any of those things for my Div III, which was -- at least in draft form, since I dropped out and transferred before completing it -- a long and rambling thesis on representations of superheroes in comic books and other media, which if it was grounded in anything was grounded more in literary theory and gender studies than cognitive science (though I led up to it with a fifty-page research paper on the role of nostalgia in the history and appeal of Batman, which is at least cogsci-adjacent), and had no overlap with the work I had done on online communities.
So like I've covered before, I have a variegated academic background, and as a grad student in an interdisciplinary program, I took a number of courses on the history and philosophy of science -- if there was such a thing as a "graduate minor," that would pretty much be mine, not because HPS and religious studies are two key branches in the history of ideas -- which would be a valid reason -- but simply because HPS also interested me, even though at that point in my academic career I was supposed to be narrowing my interests, not expanding them.
So it's something I try to keep up with, albeit not at as high a priority as religious studies.
David Wootton's The Invention of Science: A New History of the Scientific Revolution was published with a good deal of fanfare last fall. I'm generally reluctant to jump on the New Release section, for a number of reasons:
1: New releases on scholarly topics are, when they're good work, ultimately of the most value to working scholars in those areas, because those are the readers who are versed in the conversation. After all, if they have something significant to say, other scholars working in that area are going to need to respond to it, and won't have had time yet.
2: New releases on broad historical topics don't always have a particularly good reason to exist, apart from everyone in the field already having read the canonical works on the topic, and those works maybe being out of print. The existence of a new book on the topic is not evidence of the existence of new material. This is a constant source of frustration for me when I write about history, because publishers often stipulate in their style guide that at least half of your sources need to have been published within the last X years, where X is a fairly small number; it is very rarely the case with history that half of the good sources, or a tenth of the good sources, are that recent, especially when you're writing for a general audience and have no need to cite recent journal articles on minor points. (If you're wondering: yes, as a result I have to pad the bibliography with inferior recent work, or recent work that is good but not as directly related to what I'm writing about, in order to balance out the number of older but necessary sources I use.)
3: There is a ... boy, how do I not sound like an asshole here. There is a certain kind of reader I don't want to be. A certain kind of thinker I don't want to be, who's read Malcolm Gladwell and Jared Diamond, both of whom are moderately to completely awful, but nothing that's more than a couple decades old. That's the kind of reading diet that leads to a particularly shallow understanding of things -- but like Gladwell's work in general, to pick on him a little more, it's not a diet that's really designed for actual understanding so much as the satisfied feeling of the illusion of understanding, a junk food eureka.
I made an exception here because the reviews were enough to convince me that #2 and #3 weren't concerns, but that sort of puts a responsibility on me -- even if I'm the only one who perceives or cares about that responsibility -- to keep track of the Invention of Science conversation in the next few years, so I don't make the same mistake I discussed above in my discussion of politics.
Wootton's book traces what he calls the invention of modern science, "between 1572, when Tycho Brahe saw a nova, or new star, and 1704, when Newton published his Opticks... There were systems of knowledge we call 'sciences' before 1572, but the only one which functioned remotely like a modern science, in that it had sophisticated theories based on a substantial body of evidence and could make reliable predictions, was astronomy, and it was astronomy that was transformed in the years after 1572 into the first true science."
This idea, that science before this period was distinct from modern science, is key, and is part of a broader shift in thinking that affected not just the physical sciences but all scholarly pursuits. For that matter, it's not a coincidence that westerners don't really talk about "fiction" as such until the Scientific Revolution, when "nonfiction" becomes rigorously defined. I have banged my head against the wall repeatedly trying to get people to understand this. Obviously I'm not saying that no one wrote any fictitious stories before a certain point in time -- but the modern reader, who thinks of one set of shelves on the library as nonfiction, and another as fiction, is a fairly recent creature. When people describe the Bible or stories in the midrash, for instance, as "fiction," they're imposing a modern frame that didn't exist for the people who created and first received those texts. This isn't splitting hairs, because it's just as important to oppose the fundamentalists who insist that, if the Bible isn't fiction, it is therefore literally true. Again: it's not a coincidence that this claim became as popular when it did, and became politicized, at the point in time that it did. Both of these claims -- that the Bible is fiction, that the Bible is literally true -- are implicitly based on false premises about the nature of sacred texts in antiquity.
To use another example, history as we know it -- that is, the field of history as we know it -- is remarkably recent, in the sense that the idea that the goal of the historian should be to accurately record or recount the details of historical events, drawing on evidence wherever possible, dates to about the Enlightenment, at least in the West. This is kind of fucking crazy, and I feel like people who haven't taken historiography in grad school don't fully believe me when I talk about it, but it goes to Wootton's point. We take scientific thinking for granted now, to an amazing degree -- when you read fantasy novels and whatnot, a ridiculous level of scientific thinking is often ascribed to members of civilizations who would not necessarily be in a position to have developed it, for no apparent reason other than the fact that basic elements of this thinking have so permeated modern thought that it is taken for granted.
The layman often thinks of the history of science as a series of discoveries, rather than creations of different ways of thinking -- and doesn't usually have a good way to explain that most of the scientists famous now for major contributions also pursued wrong avenues (like Newton's extensive work in alchemy) or had no trouble accepting things that would be easily disproven (old beliefs in biology are full of this, and I don't just mean beliefs about race or gender that are motivated by politics and power structure -- there was a simple lack of rigor and attention to detail, by modern standards). If you think science is just a timeline of discoveries, you think the only difference between a clever person in 2016 and a clever person in 1316 is that the clever person in 1316 lives in a world where a bunch of shit hasn't been discovered yet, but that the two basically see their respective worlds the same way and deal with new information the same way, and this is enormously wrong. Even our ability to realize this is not an ability available to all the clever people in history.
One of the ideas that is ingrained in the modern mind that Wootton tackles straight off the bat is the idea of scientific progress. People may debate whether or not history itself "progresses" -- after all, everyone bitches and moans about trivial horseshit like "oh, the kids should still learn cursive" because they're attached to the first Polaroid of the world they watched come into focus -- but everyone today basically sees science as constantly moving forward.
This is a very new idea.
As Wootton points out, until the period he's talking about, not only did people not conceive of "the history of humanity ... as a history of progress," but the rate of technological advancement wasn't just slow, it sometimes went backwards. "The Romans were amazed by stories of what Archimedes had been able to do; and fifteenth-century Italian architects explored the ruined buildings of ancient Rome convinced that they were studying a far more advanced civilization of their own." Technologies were developed, lost, forgotten. This is inconceivable now -- so much so that the "lost technology" trope, when it pops up in popular culture, refers not to sophisticated architecture but to ancient electronics, pyramid magic, and other modern or futuristic technologies transposed to an ancient setting.
Wootton also points to the way Shakespeare and his contemporaries depict ancient Rome technologically identical to Renaissance Europe, with mechanical clocks and nautical compasses. In Borges's words, "all characters are treated as if they were Shakespeare's contemporaries. Shakespeare felt the variety of men, but not the variety of historical eras. History did not exist for him." As Wootton points out, this is a misleading charge -- Shakespeare was well versed in history as history was understood in his day. What he lacked was an understanding of historical change of the sort that we now treat as synonymous with "history."
"We might think that gunpowder, the printing press, and the discovery of America in 1492 should have obliged the Renaissance to acquire a sense of the past as lost and gone for ever, but the educated only slowly became aware of the irreversible consequences that flowed from these crucial innovations. It was only with hindsight that they came to symbolize a new era; and it was the Scientific Revolution itself which was chiefly responsible for the Enlightenment's conviction that progress had become unstoppable. By the middle of the eighteenth century, Shakespeare's sense of time had been replaced by our own."
The term "the Scientific Revolution" is itself one that can be interrogated, and Wootton explains that it's only in the 20th century -- the mid-20th century, at that -- that the term came to mean the creation of modern science as exemplified by Newton's physics. The inspiration for the term is not the American or French Revolution -- both of which were referred to as revolutions as they occurred -- but the Industrial Revolution. Like the Scientific Revolution, the Industrial Revolution was named after the fact -- and although it occurred later, it was named first. Wootton correctly points out that any term introduced by historians after the fact is a term that will be challenged by later historians after THAT fact, which is one of the things that will make your eyes glaze over and start unsubscribing from mailing lists. But anyway.
"In medieval universities, the core curriculum consisted of the seven liberal 'arts' and 'sciences': grammar, rhetoric, and logic; mathematics, geometry, music, and astronomy." There is some digression about what was meant by "art" and "science" at the time, the upshot of which is that all seven were considered both, whereas philosophy and theology were sciences but not arts. Anyway. "Moreover, these sciences were organized into a hierarchy: the theologians felt entitled to order the philosophers to demonstrate the rationality of belief in an immortal soul; the philosophers felt entitled to order the mathematicians to prove that all motion in the heavens is circular ... A basic description of the Scientific Revolution is to say that it represented a successful rebellion by the mathematicians against the authority of the philosophers, and of both against the authority of the theologians."
Da Vinci, for instance, said, "No human investigation can be termed true science if it is not capable of mathematical demonstration. If you say that the sciences which begin and end in the mind are true, that is not to be conceded, but is denied for many reasons, and chiefly the fact that the test of experience is absent from the exercises of the mind, and without it nothing can be certain." The reason we no longer class philosophy and theology as sciences is that we think of science as dealing with not just theory but experiment: testable, verifiable, observable, repeatable results. To return to the refrain: this is a relatively new idea.
One of the most interesting parts of Wootton's book is something I'm still pondering: "before Columbus discovered America in 1492, there was no clear-cut and well-established idea of discovery; the idea of discovery is, as will become apparent, a precondition for the invention of science."
Now, maybe I don't need to point this out, but the idea of "Columbus discovering America" is not important to Wootton's claim here: that is, it is not important that other non-Americans came to the continent before he did, nor that, since there were people fucking living here, none of these non-Americans actually discovered the damn thing. What's key is the European world absorbing the idea of Columbus discovering America and subsequently engaging with America: the phenomenon, so to speak, of "Columbus discovered America." There are other books on the physical and cultural effects of Columbus's voyages; this isn't that. This is about the scientific and intellectual reverberation of the concept of "discovery."
(Wootton also argues that, while everyone correctly dismisses the "Columbus proved the Earth was round" nonsense sometimes taught in elementary schools, the voyages to America did nevertheless change the European conception of the globe, by proving the existence of antipodean land masses, which were believed to be impossible. The Columbian contact did change the understanding of the Earth, then, just not in as simplistic a way as changing it from flat to round.)
"It is discovery itself which has transformed our world," Wootton points out, "in a way that simply locating a new land mass could never do. Before discovery history was assumed to repeat itself and tradition to provide a reliable guide to the future; and the greatest achievements of civilization were believed to lie not in the present or the future but in the past, in ancient Greece and classical Rome. It is easy to say that our world has been made by science or by technology, but scientific and technological progress depend on a pre-existing assumption, the assumption that there are discoveries to be made. ... It is this assumption which has transformed the world, for it has made modern science and technology possible."
Wootton backs up his proposition of the 1492 (essentially) invention of the idea of discovery with linguistic evidence, investigating the various Romance language terms for discovery and related concepts, and how they were used. This is the area where I expect to see other historians responding, refuting, or amplifying -- his evidence is compelling enough, but I'm in no position to tell whether he's cherry-picked it, how much he's interpreting it to favor his argument, etc. See what I mean about the problem with recent scholarly works (at least when they're outside your area of expertise)?
This is a key part of his "discovery" argument: "Although there were already ways of saying something had been found for the first time and had never been found before, it was very uncommon before 1492 for people to want to say anything of the sort, because the governing assumption was that there was 'nothing new under the sun.' The introduction of a new meaning for descobrir implied a radical shift in perspective and a transformation in how people understood their own actions. There were, one can properly say, no voyages of discovery before 1486, only voyages of exploration. Discovery was a new type of enterprise which came into existence along with the word."
Wootton makes a distinction between "discovery" and words like "boredom" and "embarrass," in that we accept that before there was a word for it, people felt bored; before "embarrass" acquired its modern meaning in the 19th century, people could feel embarrassed. But "discovery" is "'an actor's concept' ... you have to have the concept in order to perform the action ... So although there were discoveries and inventions before 1486, the invention and dissemination of a word for 'discovery' marks a decisive moment, because it makes discovery an actor's concept: you can set out to make discoveries, knowing that is what you are doing."
There is an interesting digression about the great sociologist of science Robert K. Merton, to whom we owe the phrases "role model," "self-fulfilling prophecy," and "unintended consequences," and who wrote an entire book about the phrase "standing on the shoulders of giants." What Merton was unable to popularize was the idea of multiple discovery: the idea that "there are nearly always several people who can lay claim to a discovery, and that where there are not this is because one person has so successfully publicized his own claim that other claims are forestalled."
"We cannot give up the idea that discovery, like a race, is a game in which one person wins and everyone else loses. The sociologist's view is that every race ends with a winner, so that winning is utterly predictable. If the person in the lead trips and falls, the outcome is not that no one wins but that someone else wins. In each race there are multiple potential winners." Think of how many time travel stories revolve around, I don't know, going back in time to keep somebody from inventing the time machine. That's taking the Great Man view of history -- the view that prevailed in the 19th century, when history was portrayed as the result of specific heroic egos. Other than in fiction and, perhaps, biography, it is not a view that is looked on kindly anymore -- no one person, no one thousand people, can be said to be uniquely responsible for the major events of history, not because that person did not do the things they did, not because those things lacked significance, but because other people would do other significant things to move history in largely the same way. If you go back in time and kill Li'l Lincoln when he's a toddler, you don't wake up in a 2016 that still has slavery -- you simply find out that slavery ended under some other presidency instead. We pretty much accept that -- like I said, except in fiction -- but this centrality of the individual in discovery persists, even when as Wootton points out, numerous people independently discovered the sine law of refraction, the law of fall, Boyle's law, oxygen, the telescope.
I am skipping over lots and lots of detail about the effects of the telescope and the microscope, in part because it doesn't excerpt well.
Wootton also goes into a dispute with relativists, especially "the strong programme," which I think is too inside baseball to get into here, especially since, as someone not employed by a university, I have no reason to be invested in the fight. (Okay, that's not entirely true. The Science Wars are important, and impact not only the work I do in educational publishing and reference books, but overall science literacy and public policy. But Wootton's comments on the subject are not the part of this book that will stay with me the longest, even as they frame the rest of it.) As a fan of Kuhn but not of many of the historians of science who have followed in his footsteps -- the relativists, in other words -- and as someone who has also been accused of relativism because I disagree with the way history is constructed by strict realists, I think I agree with Wootton when he says that "this book will look realist to relativists and relativist to realists."