Monday, January 9, 2017

recent reading: apostles of reason

Explaining the Venn diagram of evangelical, charismatic, Pentecostal, born again, literalist, inerrantist, and fundamentalist is a complicated thing, and beyond the scope of this blog entry. But one of the things that has happened since midcentury is that "evangelical Christianity" has become synonymous in popular discourse with conservative (and especially ultra-conservative) fundamentalist- and literalist-leaning Christianity, and in particular with the white churches fitting that description -- and this describes neither evangelical Christianity's origins nor all of evangelical Christianity (or all of white evangelical Christianity) today.

After all, evangelical Christianity began more or less in the early 18th century -- or to consider it another way, it is more than twice as old as the fundamentalism with which it is now so closely associated. However, pointing that out doesn't mean fundamentalism has no place in evangelicalism, either; many of the characteristics now central to Christianity overall didn't develop for centuries, and this is just the way religion works.

But it raises the question: how did it get here? (where is that large automobile?)

Molly Worthen's Apostles of Reason: The Crisis of Authority in American Evangelicalism provides some of the answers. It's not a comprehensive history of American evangelicalism, nor even of American evangelicalism in the period it covers -- it focuses principally on white evangelicalism, for one thing -- and it seeks to trace the rise of the politicized conservatives among them by following evangelicals from the late 19th century to more or less the present day.

One of the themes that emerges quickly is that of recurring clashes between -- let's just use this terminology -- conservatives on one side, and liberals and moderates on the other, resulting in the backing down or attrition of the liberals or moderates, and the subsequent strengthening of the conservative position. One of the most emblematic examples of this for me involves faculty members at an evangelical college raising issues with the requirement that they pledge to uphold inerrancy -- what exactly does inerrancy mean here, the moderates asked? It means what it means, the conservatives said, that the Bible overrules reason, so there's no point thinking too much about it; the conservatives wound up resigning over the debate, but the moderates didn't really "win" as a result, because of the way these "losses" feed the conservative evangelical persecution complex and the need for enemies.

When I said that explaining that Venn diagram is a complicated thing, though, I wasn't blowing smoke. Just defining those terms is a tough thing. "The term evangelical," Worthen points out, "is so mired in adjectives and qualifiers, contaminated by politicization and stereotype, that many commentators have suggested it has outlived its usefulness. In America alone, the broad tent of evangelicalism includes a definition-defying array of doctrines, practices, and political persuasions. Perhaps no label is elastic enough to contain a flock that ranges from churchly Virginia Baptists to nondenominational charismatics in Los Angeles. At the same time, the mudslinging of the 1990s culture wars turned many conservative American Protestants away from a label now synonymous in the media with right-wing radicalism and prejudice. Yet we are stuck with it. Believers and atheist scholars, politicians and pundits, all continue to use the word evangelical. To observers and insiders alike there still seems to be a there there: a nebulous community that shares something, even if it is not always clear what that something is."

Nailing down what "evangelical" means is sort of like nailing down what "Christian" means -- both insiders and outsiders think they know, and yet the reality is that there are many different groups answering to that name, sharing some common ground but disagreeing about doctrine, practice, and in-group membership, and often unaware of how deep this disagreement runs or how many groups unlike them there are. This tends to be especially problematic with evangelicalism, since at least Christianity as a whole has actual named denominations and movements, whereas the differences within evangelicalism are not so clearly labeled, and not every evangelical church or group belongs to a larger ecumenical organization.

"... The trouble is that evangelicals differ widely in how they interpret and emphasize 'fundamental' doctrines. Even with the 'born again experience,' supposedly the quintessence of evangelicalism, is not an ironclad indicator. Some evangelicals have always viewed conversion as an incremental process rather than an instantaneous rebirth (and their numbers may be increasing)."

Evangelicalism has historical roots in European Pietism; "catchphrases like 'Bible-believing' and 'born again' are modern translations of the Reformers' slogan sola scriptura and Pietists' emphasis on internal spiritual transformation."

"Three elemental concerns unite [evangelicals]: how to repair the fracture between spiritual and rational knowledge; how to assure salvation and a true relationship with God; and how to solve the tension between the demands of personal belief and the constraints of a secularized public square." These are pretty broad concerns, and not all Christian groups with these concerns are evangelical, certainly; Worthen is just attempting to delineate the common ground among evangelicals without invoking specific terminology like "born-again" which, as pointed out, is treated differently in different groups. But as the subtitle of the book indicates, what she sees here is an overall concern with "problems of intellectual and spiritual authority."

This is one of the most important parts of evangelicalism: "American evangelicals have a strong primitivist bent. They often prefer to think their faith indistinguishable from the faith of Christ's apostles, and scoff at history's claims on them. But they are creatures of history like everyone else, whether they like it or not."

In what we now think of as the classic conservative evangelical community, this "primitivist bent" is not just integral to group identity, it's the basis of evangelicals' criticism of other Christians, which in turn insulates them from both criticism and analysis. Fundamentalism, for these folks, was not introduced at the dawn of the 20th century, it was revived at that point, returning Christian practice to the only form that true Christian practice could ever have taken. What I'm describing is, of course, the conservative view -- but the conservative view has become particularly important since, as I've talked about before, it became so dominant in religious discourse, influencing the way liberal Christians talked about their faith and non-Christians' views of Christianity.

Worthen's three keys again, rephrased: "three questions unite evangelicals: how to reconcile faith and reason; how to know Jesus; how to act publicly on faith after the rupture of Christendom."

Despite modern evangelicals' originalist claims, they operate largely divorced from history, and for that reason they often -- not just the conservatives -- forget just how close their origins are to the Protestant Reformation, which must seem to them like the distant past. But under Catholicism, certain questions -- especially about the role of faith in the public sphere -- had, if not easy answers as such, at least well-established arguments. Much as the formation of Christianity meant reevaluating Jewish thought and scripture to ask, what do we take, what do we leave behind, and what do we reinterpret, Protestant factions had to do likewise with the previous millennium-plus of Catholic thought. Some of it was dispensed with immediately because it was the source of the rift, but Catholic thought by that point in time was incredibly broad, and Catholic theology was built on generations upon generations of commentators. How much of doctrine went away? How much of that theology was no longer valid?

That too is too complicated an issue for this blog entry, but what matters is the big canvas that it created. On the issue of authority, for instance, for various reasons -- particularly the political power and political relationships of the Catholic Church at the time of the Reformation, as well as the need to protect the interests of Reformers, and the political interests of the states that flipped Protestant -- most of the early Protestant religions were state churches, affiliated with national governments. The antecedents to evangelicalism begin with a reaction against that, which goes a long way toward explaining evangelicals' embrace of home churches, unaffiliated churches, and churches that are only loosely affiliated with one another rather than being led by formal ecumenical hierarchies.

"Pietist preachers critiqued the state churches that emerged from the Reformation as overly formal and cerebral. They called on believers to study the Bible and strive for personal holiness." That could damn near describe evangelical churches today, but Worthen is describing the late seventeenth century.

As for fundamentalism, it grew out of "The Fundamentals," a series of pamphlets written in the 1910s by conservative theologians and closely associated with Princeton's seminary, which was controlled at the time by theological conservatives. The Princeton conservatives considered inerrancy -- the idea that the Bible is without error, which is not the same as the idea that the Bible is literally true, though the line between an inerrantist and a literalist can be fine, and laymen do not always pause to consider the difference, however important its implications -- to be "fundamental" to the Christian faith, and their defense of it was a reaction to the Biblical criticism coming out of the German universities. Generally moderate, even conservative by today's standards, the "modernist" approach to Biblical criticism seemed impious to these proto-fundamentalists because it did not take the Bible's inerrancy for granted. Keep in mind that although the scholars the conservatives took issue with certainly included non-believers and scholars challenging basic premises of religious belief, they also included numerous scholars who were themselves religious but simply didn't believe it was necessary to consider the Bible an error-free account of history, or to ignore the obvious parallels between the Old Testament and other Semitic religions, or to accept the traditional views that Moses had written the Torah and that the Gospels had been written shortly after the death of Jesus, as first- or secondhand accounts. Scholars, again many of them religious (including clergy), were willing to question whether Jesus's miracles really occurred, not to mention what to make of the Earth being created in "six days." The basic dispute, from a conservative perspective especially, was whether the Bible overruled reason or vice versa.

(There is an extraordinarily large problem with taking the conservative side even apart from rejecting reason, which is the problem with "sola scriptura": despite hundreds of years of claims to the contrary, the Bible cannot be understood "by itself." Many readers certainly believe that, having read it -- or some part of it -- they come to a conclusion about what it means, but they do so because they bring to it preconceptions, viewpoints and perspectives impacted by religious teachings they have absorbed prior to their reading, and so on. Other readers claim that this problem can be alleviated by praying for guidance, and yet history clearly shows that the Bible and individual passages of the Bible have been interpreted in many different ways at many different times -- not just opportunistically, but by sincerely pious people who we must assume similarly prayed for guidance. In other words: whatever your religious beliefs, to make the claim that the products of human reason must be weighed against what it says in the Bible is nonsense, because "what it says in the Bible" cannot be ascertained except as a product of human reason. Indeed, belief in God, God's creation of humankind, and the divine guidance of the writing of the Bible should carry with it belief that this is one of the ends to which reason should be used.)

This is a dispute that continues in much of conservative Christianity today, obviously, and the dumbing down of Christianity, the sapping of religious literacy, has not helped. One reason conservatives are so convinced of the originalism of their Christianity is because some hundred years ago, they began constructing their echo chamber, expelling moderates and liberals from their churches, seminaries, and periodicals when they had the power to do so, starting their own when they did not. An echo chamber, a persecution complex, and a sense of superiority arising from the conviction that they were practicing the only real form of Christianity while everyone else readied themselves for the coals: that is the essence of conservative evangelical Christianity as it developed in the United States over the course of the 20th century.

(Even the pamphlets did not uniformly defend inerrancy, it's worth noting, despite being published by an inerrantist group and inspiring a largely inerrantist movement. They defended "traditional interpretations of the Bible," which is a slightly broader category, and the authors included some theologians who would soon, as The Fundamentals gave way to fundamentalism, be positioned more at the moderate edge of conservatism, notably James Orr, who argued stridently against inerrancy, but also against the excesses of German modernism. Fundamentalism wasn't defined by The Fundamentals so much as by the activity that surrounded and followed them, and the pamphlets themselves -- written by theologians, typically subtler and more nuanced in their arguments than either the laity or, for that matter, less intellectually-inclined clergy -- represent a wider range of theological positions than would later be tolerated.)

(A quick Worthen quote that goes far in explaining the difference between inerrancy and literalism: "Inerrantists often acknowledged scripture's inconsistencies, such as multiple, conflicting accounts of the same event. They freely admitted that when God inspired the biblical authors to set down his perfect revelation, he did not place them in a divine crow's nest peering over space and time. They asserted that simply because, in our finite judgment, the evangelists seem to disagree about how many times the cock crowed before Peter denied Jesus, there is no reason to conclude that the first chapter of Genesis is all metaphor, that the Marys did not find the tomb empty -- or, more fundamentally, that scriptura could truly stand sola, that the plain meaning of God's word somehow depended on human authorship or interpretation.")

(Both literalism and inerrancy are difficult, if not impossible, theological positions to defend. But literalism approaches true indefensibility: the Bible cannot be literally true, because it contains too many contradictions. It is easier to argue that the Bible is "without error," and construe it as a series of divine revelations that may contain details conflicting with the details of other revelations in that series, but as moderate evangelical theologians themselves have pointed out, this requires such incredibly detailed discussions of what exactly "inerrancy" means that the inerrancy claim becomes pointless. And yet.)

As Worthen points out, "Pastors who encountered the careful critiques of [theologians like] Orr second- or third-hand rarely preserved their prudence and intellectual agility. As conflict against modernists intensified in the 1920s and 1930s, fundamentalists lost interest in nuance. They refashioned a once-subtle doctrine into a shield to protect the Bible from the revisions of blasphemers. Orr and the scholars of old Princeton had understood themselves as explicating centuries of Christian wisdom in modern terms. They held steady at the siren call of sola scriptura -- that problematic promise that every believer could grasp scripture's plain meaning -- by binding themselves to the mast of a venerable theological tradition. Later fundamentalists, however, became polemicists rather than apologists. The difference is subtle but crucial. Winning the war against modernism became more important than illuminating orthodoxy. Inerrancy came to represent not only a set of beliefs about creation or the reality of Jesus's miracles, but the pledge that human reason must always bow to the Bible. As fear of modernist theology and new science began to infect a wide range of Protestant churches, this new variety of fundamentalist deployed inerrancy as a simple shibboleth to separate sheep from the goats. It was no longer a doctrine with historical roots or an ongoing debate among theologians. Inerrancy was common sense."

Emphasis mine. Surely you have at some point had an argument with a fundamentalist -- or even with an atheist parroting fundamentalist views as an argument against religion -- who has made essentially these "points," since they have become so commonplace.

To Worthen's basic focus:

"From the beginning, [evangelicals'] concerns were existential and epistemological: They had to do not just with points of belief, but with how Christians accounted for human knowledge, how they lived in the world, and how they claimed to 'know' the divine in their minds and hearts. While many ancient Christians assented to the basic doctrines that scholars mark as 'evangelical,' that assent took on a different character after the seventeenth-century rebirth of reason and the invention of our present-day notions of 'religious' and 'secular.' The sundry believers who share the evangelical label have all lacked an extrabiblical authority powerful enough to guide them through these crises. Roman Catholics obey the Vatican (more or less). Liberal Protestants tend to allow the goddess of reason to rule over the Bible (or to rule, relatively untroubled, in her separate sphere). Evangelicals claim sola scriptura as their guide, but it is no secret that the challenge of determining what the Bible actually means finds its ultimate caricature in their schisming and squabbling. They are the children of estranged parents -- Pietism and the Enlightenment -- but behave like orphans. This confusion over authority is both their greatest affliction and their most potent source of vitality."

Out of the second generation of fundamentalists mentioned above came the neo-evangelicals (such as Billy Graham). "'Neo-evangelical' would come to describe a self-aware intellectual movement of pastors, scholars, and evangelists within the conservative Protestant community roughly (but not entirely) contained within the NAE. ... while evangelical connoted a broad swath of conservative Protestants averse to the old fundamentalist model of feuding separatism but still eager to defend the authenticity of religious experience and the authority of the Bible, neo-evangelical became a more precise label, embraced by a small circle of self-appointed leaders."

The NAE is the National Association of Evangelicals, formed in 1942. It is not a denomination but an association of evangelicals that includes both evangelical denominations (dozens now) and nondenominational churches; it later (in the 1970s) sponsored the New International Version translation of the Bible. The NAE's founders had concrete problems to address, very similar to the concerns of today's conservatives -- increasing evangelical representation among the chaplains who served in the armed forces, fighting the influence of modernism in public school curriculums, guiding Sunday School curriculums, and increasing the presence of conservative voices on religious radio, which was then dominated by liberal denominations, believe it or not. The fight against modernism and the defense of "traditional Biblical beliefs" informed all of these concerns, and just as you hear from the radical Right today, the threat posed was one of corruption from within: "Without a firm defense of Biblical inerrancy ... America would fall to enemies within and without, as had imperial Rome. Western civilization was sick with secularism and socialism, the modern spores that had overrun their hosts in the Soviet Union. The Kingdom of Hell was at hand."

Furthermore, in the view of the neo-evangelicals, "prior to the advent of modern biblical criticism and the theory of evolution, all Westerners shared a Christian Weltanschauung -- an unqualified respect for biblical authority, even if corrupted in some regions by Catholic rule.... The neo-evangelicals were overfond of this word, Weltanschauung, and its English synonyms: worldview, world-and-life view. They intoned it like a ghostly incantation whenever they wrote of the decline of Christendom, the decoupling of faith and reason, and the needful pinprick of the gospel in every corner of thought and action."

"From the neo-evangelical point of view, if Christian civilization was to survive the twentieth century, then biblical inerrancy and a reenergized Christian Weltenschauung must form its bedrock. The neo-evangelicals championed other theological principles too, but they recognized that conservative Protestants might reasonably disagree on details of doctrine. The NAE had no business taking a firm stand on predestination or exactly when Christ was due to return. Biblical inerrancy and the totality of the Christian world-and-life view, on the other hand, were different. These were not really doctrines at all, but facts: facts that made sense in an age when everyone from Nazis and communists to Catholic theologians and U.S. Foreign Service officers were talking about worldviews and presuppositions."

"The trouble was that neo-evangelicals presumed an evangelical solidarity that did not exist. The call for cooperation that began with the NAE would expose discord and ambivalence -- not least because, as it turned out, the neo-evangelicals' instinctive response to debate was to turn a deaf ear and close ranks. They differed from their fundamentalist forefathers only in the degree of their separatist impulse." Many evangelical denominations declined to join; the Nazarenes didn't join until the 1980s, and the Southern Baptist Convention, not only the largest Protestant denomination in the country but several times the size of the entire NAE, stayed out of it entirely, as did most Restorationist churches and denominations -- the churches that had grown out of the Second Great Awakening.

"Neo-evangelicals assumed that the battles against modernists in the early decades of the twentieth century had left all evangelicals with the same experience and collective memory. Nothing could be further from the truth. Restorationists fought over the use of musical instruments and worship and the degree of bureaucratic organization permissible for a 'New Testament Church.' The Nazarenes and Mennonites argued about 'worldliness' and abandonment of older customs and styles of dress."

Southern Baptists, meanwhile, rejected the NAE because both evangelicalism and fundamentalism seemed like Yankee phenomena to them, even though the Southern Baptists had fought the same fight against modernism, and shared many of the same anxieties. However, while the Princeton fundamentalists had been concerned with biblical criticism, the Southern Baptists of the same era had been occupied with an internal battle between conservatives who supported the Convention's authority and moderates and Landmarkers who fought for the autonomy of individual congregations -- a battle informed by concerns over how those autonomous congregations would then interpret the Bible, in light of prevailing modernist trends, but nevertheless not solely concerned with interpretation.

The Mennonites are part of the Anabaptist tradition. Writing in 1955, the young Mennonite scholar John Howard Yoder -- after corresponding with Carl Henry of the NAE -- articulated some of the problems with the neo-evangelicals, from the point of view of Mennonites and like-minded evangelicals: "Yoder urged Henry to relinquish his obsession with doctrinal details and philosophical rationalism. The Fundamentals pamphlet series 'was a time-bound polemic strategy' that addressed issues pertinent to the early twentieth century, but could not meet the challenges of the 1950s. 'For instance, they included nothing about social ethics, nothing about what Christian unity is and is not, and further, the polemic strategy then chosen served better to build a barrier than to speak across the gap.'"

"Only a small minority of conservative American Protestants shared the neo-evangelicals' rationalist, Reformed heritage. Most churches continued to emphasize other themes -- such as personal holiness, internal transformation, or gifts of the Holy Spirit -- over intellectual assent to philosophical claims about the nature of God."

The NAE's striving for evangelical solidarity also suffered due to the alliances made by some of its more prominent members. Billy Graham, one of the most famous crusaders of the 20th century, came under fire from some conservative Protestants for allying himself both with Catholics and with liberal Protestants in the name of revivalism (and, arguably, in the service of promoting the Billy Graham brand).

Graham was a trustee at Fuller Theological Seminary, which is at the heart of one of the phenomena I mentioned earlier, the attrition of liberal and moderate evangelicals. Fuller was founded in 1947 by radio evangelist Charles Fuller and NAE co-founder Harold Ockenga. "In the late 1950s, a number of Fuller professors -- including the founder's son, Dan Fuller -- concluded that they could not abide by the seminary's statement of faith on the point of strict inerrancy. They had come to believe that while the Bible remained an 'infallible' guide on matters of doctrine, worship, and Christian life, it was not accurate in every scientific and historical fact. By 1961, the atmosphere at Fuller was poisonous."

One of the moderate board members pointed out the "complexity of the inerrancy debate," the "very real problem of arriving at a precise meaning of the word inerrancy ... It would be literally impossible for you or anyone else who has a good knowledge of the Bible to sign our doctrinal statement without at least some degree of reservation. I -- along with others -- believe the statement should be carefully reviewed by our faculty in much prayer and in the Holy Spirit ... the Fuller faculty and board compose the only group I know of in evangelical circles who are honest enough to face this matter openly."

Consider the source here. This isn't a modernist. This isn't a secularist. This is a professor at an evangelical seminary founded by the neo-evangelicals, literally the intellectual center of the neo-evangelicals, the home of Billy Graham, arguing not to throw out evangelicalism but simply that inerrancy is an indefensible doctrine, and that the faculty should pray together and come to a decision about revising the statement of faith required of professors and students. This is not a liberal revolt.

But it was treated as one. The moderates "won," insofar as Fuller to this day admits both conservative and liberal Protestant students, and the most conservative faculty resigned over the inerrancy issue. But the conservatives made martyrs of themselves in resigning, and their ranks included many of the best-known names on the faculty, the heroes of the scene. Inerrancy was the hill they were willing to die on.

Billy Graham launched Christianity Today in 1956, one of many evangelical magazines in an age when most American households still subscribed to and read multiple magazines. The title is a deliberate counterpoint to The Christian Century, the most popular mainline Protestant magazine of the day. "The neo-evangelicals behind Christianity Today did not propose to modernize old-time religion. On the contrary, they were proud defenders of fundamentalism." But they still sought to engage with mainstream America more than the less worldly, separatist, less "neo" evangelicals -- what Worthen calls the unreconstructed evangelicals.

Christianity Today is the spiritual sibling, so to speak, of the National Review, founded the previous year by William F. Buckley Jr. "Nonevangelical conservatives felt just as embattled as the editors at Christianity Today. The editors of the National Review founded their magazine out of a similar desire to rally their cause in a hostile marketplace and overcome liberals' caricatures of the 'Neanderthal Right,' as Buckley put it."

"Midcentury American conservatism featured -- on a grander, more tumultuous scale -- the same insecurity and discord that the neo-evangelicals perceived in their conservative Protestant world. Its various factions formed a dysfunctional family, clamoring with clashing beliefs and pet obsessions, whose members spent as much time squabbling among themselves as they did lobbying for right-to-work laws or denouncing progressive rulings by the Supreme Court. They felt both marginalized in the corridors of power and exhilarated by their increasingly well-funded drive to take back America. Buckley, however, was a master coalition builder. He managed to keep secular libertarians and Catholic traditionalists on the same masthead and maintain a healthy distance from the John Birch Society and other radioactive characters in the movement. This uneasy alliance was the key to his ability to lead an intellectual resurgence that eventually penetrated Washington ... [In 1961], one journalist conducted a survey of the shifting climate on college campuses and gave Buckley 's organizations the lion's share of the credit for the burgeoning 'revolt not only against socialist welfare statism in government, but also against indoctrination by leftist professors ... The conservative student revolt is a campus phenomenon from Stanford and Berkeley on the West coast to the Ivy League, from the University of Washington to the University of Miami."

Christianity Today was dependent on the financial support of oil exec John Howard Pew, who skewed more conservative than some of the writers -- Pew had also provided funding for libertarian journals and political groups, the Liberty League (an anti-New Deal group), and the John Birch Society. In part because of Pew's support -- which kept the magazine afloat in its first decade, when it ran at a deficit of several hundred thousand dollars a year -- Christianity Today hewed close to the National Review's political conservatism, "toe[ing] the conservative line on every significant political and theological issue from foreign policy and civil rights to evolution and the ecumenical movement." (Pew called one of the CT founders, Carl Henry, a "socialist" for believing that evangelicals should be getting more involved in activism like the civil rights movement rather than doubling down on their opposition to it; their clashes contributed to Henry's forced resignation in 1968.)

Political conservatism and conservative Protestant theology seem synonymous now, inevitably linked hand in hand, but it need not be so, nor was it always so. For one thing, conservative Protestants -- evangelicals especially -- had often called for keeping churches out of politics, and explicitly contrasted themselves with Catholics and with the Social Gospel movement among liberal and moderate Protestants when doing so. For another, conservative Protestants have also supported liberal political positions in the past -- some of them in alliance with the Social Gospel on some issues, for instance. The history of the politicization of the abortion issue is by now well publicized; conservative evangelicals were largely uninterested in it until relatively recently in their history. Christianity Today's political positioning was part of the process that laid the groundwork for this politicized and politically conservative evangelicalism.

That said, it's interesting what form political conservatism took in those early days: after Henry resigned -- in 1968, remember -- Pew's further financial contributions were "contingent on the promise that Henry's successor, Harold Lindsell, would continue condemning 'the ecumenical church's political involvement.'" It's hard to imagine today's analogues to Pew having reservations about the church being politically involved. The assumption was that "political involvement" meant, at least to some degree, support for liberal causes, rather than the conservative activism that soon became common.

"These noisy internal quarrels concealed one remarkable silence: the dearth of conversation with conservatives outside the neo-evangelical bubble. In the magazine's early years, when nearly every issue featured essays lambasting communism, urging a retrenchment of conservative Christian values, and otherwise echoing many themes favored by William F Buckley and other writers in the rash of new conservative journals, the editors of Christianity Today gave little sign that they considered themselves comrades in arms with conservative Catholics, Jews, and repentant ex-socialists."

Fundamentalism and obsession with the end times had played roles in shifting evangelicalism away from social activism. "The belief that this world will fall into greater misery and chaos before Christ's Second Coming dampened evangelical enthusiasm for collaborating with secular authorities to reform society ... After all, social decay was a sign that Christ's return was drawing near. In the context of the fundamentalist-modernist controversy, large-scale social activism was contaminated by association with the enemy: those heterodox liberals who did not merely live out the gospel through good deeds, but seemed to believe that good deeds might replace the gospel altogether. In a massive shift that evangelical sociologist David Moberg later called 'The Great Reversal,' many conservative Protestants began focusing more energy on evangelism, personal moral crusades (pressing for Prohibition rather than fighting poverty), and denouncing modernism, all at the expense of social reform."

Both CT and the NR were intellectual outlets. However ... "In the 1950s, many conservative intellectuals were in the business of historical rediscovery, reconstructing an intellectual genealogy to support their critique of liberal theories of human progress and individual autonomy. Against modern secular liberalism, they asserted the sacralized and sin-stained worldview of medieval Christendom, the natural law of the ancient Greeks, the civic decrees of Roman philosophers."

"The editors at CT pondered much of the same history in these years ... yet the commitment to biblical inerrancy had warped neo-evangelicals' understanding of the past. Although no godly revivalist's teachings stood on par with scripture, the basic principle of inerrancy -- that historical circumstance does not influence human authorship or interpretation, when that human writes or thinks by God's will -- seeped into the way they interpreted history outside the Bible as well. They were less interested in understanding ancient thinkers in their own historical context than in thinking themselves to a succession of proto-fundamentalist torchbearers, Christians who 'believed in the Bible' as the neo-evangelicals themselves thought scripture should be read. Their ahistorical view of scripture, their overriding desire to defend their doctrine of inerrancy as ancient, immutable, and God-given, made sensitive scholarship impossible. In the hands of CT's editors history became a legal brief for inerrancy, a purity test for the present."

The intellectual commitment of CT -- and the failings of that commitment -- would have extended to Crusade University, "the first evangelical research university, an omnibus institution with undergraduate and graduate programs, churning out original scholarship in the Lord's name," the brainchild of Graham, Henry, and other CT founders. They were unable, though, either to secure funding or -- particularly given the neo-evangelicals' opposition to the separatism of unreconstructed fundamentalists represented by Bob Jones University (founded in the 1920s) -- to successfully strike a balance between evangelical orthodoxy and mainstream academia.

Crusade University's failure to manifest did not keep evangelicals from going to school -- and grad school -- in large numbers, however. "Several neo-evangelical scholars earned PhDs and ThDs at Harvard in the 1940s and 1950s ... Fuller graduates were winning Fulbrights ... as Mark Noll has noted, the problem was not so much evangelicals' failure to excel at secular academic institutions, but rather their ability to compartmentalize their faith from new learning. They tended to position themselves in fields where no one would corner them too aggressively on their views about, for example, how one ought to interpret the creation narrative in Genesis."

That said, many evangelicals viewed higher education with suspicion, even as they availed themselves of it. Bob Jones and other "separatist" universities existed in part because of this suspicion, and evangelical colleges dragged their heels in seeking accreditation -- and were often criticized for doing so -- usually not pursuing it until the late 1950s, about 3-4 decades after mainline Protestant schools and 20 years after Catholic ones. "Fundamentalists and conservative evangelicals understood the history of American higher education as a story of decline from holiness to heterodoxy. Their own institutions were oases where the Bible still reigned. Moreover, early Bible college leaders were unimpressed by a self-policing, credentialed elite. They exalted the common sense of the layman whose faith was unmuddled by the mystifications of the so-called experts."

Sound familiar?

I'm going to stop there because that's where I stopped marking pages -- I tended to mark fewer in the book's coverage of the 70s and beyond because, well, the part of history that I lived through, I'm more likely to remember.

Friday, December 23, 2016

here are some of the reasons you or some grade-A asshole you know will give for not helping other people

That charity's CEO makes too much

Give me a fucking break.

First of all, CEO compensation is not nearly as important as a charity's efficiency, as measured by, for instance, Charity Navigator. CEOs of very large charities may make amounts that sound large, but relative to the amount of money the charity takes in and redistributes, a seemingly large number doesn't necessarily mean anything.

Second of all ... look, all you're doing here is making it clear how little attention you pay to how much OTHER CEOs make, the ones who don't work for charities. The charity CEOs sure as fuck aren't making an amount the private sector would find significant. They're making more than you are, and more than I am. A lot more. But if you're outraged by that, I think it can only be because of a lack of perspective, a lack of understanding the real scale of income in the world. There may be a handful of charity executives who are overcompensated -- I certainly can't claim to have seen everyone's tax forms -- but far more common are the executives making much less than their private sector counterparts.

According to CharityWatch, the highest-paid charity CEO (who makes twice as much as the second-highest) makes about $3.5 million. Sure, it sounds like a lot. The average income of a CEO of an S&P 500 company is almost $14 million. The median income of charity CEOs is about 1.1% of that -- roughly $150,000. PLEASE PAY ATTENTION TO THAT DECIMAL POINT, YOU SHITHEAD.

You pay charity CEOs more than other workers for the same reason you pay other CEOs more than other workers -- because our culture expects and encourages salaries to work that way, and because you are competing for their labor with other organizations that could offer them more. Any legitimate problem you can raise with that is best addressed to the private sector, where the inequality is far greater, and where much, much more of the money from your wallet ends up in the offshore accounts of CEOs.

Third, thinking of CEO compensation purely in terms of income misunderstands a great deal about the world of the American wealthy, and charity CEOs aren't receiving the tremendous benefits of their private sector counterparts. Even that 1.1% figure is overselling the wealth of the heads of charities.

Fourth, what is it you're actually outraged about, when you come down to it?

Is it really the amount of money? You're really not contributing much to it, let's face it. The only charity CEOs making serious money are heading charities large enough that your donations are a drop in the bucket -- you may as well complain about the contribution of your sales tax to the salaries of the state government officials you don't like, if you're going to fixate on the unfair distribution of your every dime.

I think this complaint more frequently speaks to a conviction people have that people who work for charities should do so primarily out of a motivation to be charitable, and that there is therefore something inappropriate about them being paid for it. We are so fucking miserly in our approach to charity, our approach to helping others, that even when we're in essence hiring a service to do good works on our behalf, we don't want that service to retain any of the money we're giving them to do so. We want it to pass untouched directly to the recipients. We want our charities to be volunteer-run and incur no expenses beyond a postage stamp, while somehow managing to distribute our charity in more useful and efficient forms than we can do ourselves.

That doesn't say great things about you or the grade-A asshole you know.

Goodwill doesn't work the way I thought it did

Well Jesus Christ, tough shit, buttercup.

Goodwill does not redistribute donated goods to the poor.

Goodwill provides charitable services in the form of employment, training, and related community programs to its employees, and collects donated goods in order to keep its overhead costs low so that it can afford to fund those services and pay those incomes.

That has always been Goodwill's model, and they've been around for over a century.

If you don't think that's a model you want to fund, don't do business with Goodwill. But the number of people who think Goodwill is deceiving them and collecting donated goods under false pretenses is ridiculous. It isn't their fault that you don't pay attention to one of the best-known charities in the country.

Charities don't even do charity, man, they just keep all the money

cf. the CEO argument above.

There are some bad charities. There are a few different kinds, I guess: actual scams, de facto scams that nevertheless legally operate as charities, charities that are run incompetently, charities that are run incredibly inefficiently, and charities that allow political or religious motivations to impact the way they operate, without sufficiently disclosing that to donors.

But first of all: this is not the majority of charities.

Second: You are not a fucking nineteenth century street urchin with nothing to rely on but the life skills you earned at the School of Hard Knocks. You have the internet. You have Charity Navigator. You have the ability to Google for NYT articles about a given charity. It is not fucking difficult to figure out if a given charity is a) real, b) the subject of a recent or ongoing legitimate scandal that should give you pause, c) good at doing whatever it is you want to accomplish with your charitable donation. This takes five minutes at the most.

Third: Apart from the scams, most of those problems are not about "keeping all the money," they're organizational problems or problems of the charity's goals not matching your own. People jump to this idea of charities being secret profit monsters really quickly, because -- well, because they're assholes without much real compassion, and a lot of these arguments, you'll notice, have a foul core at the center of the onion that is all about who does and doesn't deserve your compassion.

Drug-test welfare recipients

There are so many reasons this is a horseshit idea, and only a horseshit person would support it:

1: Assuming your goal is to save money by denying assistance to those who test positive, it doesn't work: these programs consistently cost more money than they save.

This fight has already played out in decades of workplace drug-testing, which has declined since its 1990s peak not because drug use has declined but because employers discovered that spending money on drug testing employees or applicants was not actually producing gains in productivity. Outside of safety-relevant contexts, the purpose of drug testing is not really to improve performance or save money; it's to ostracize drug users.

2: The whole thing is predicated on a false premise. Welfare recipients are significantly less likely to use drugs than the general population.

There is a common image of the average drug user as a strung-out addict living on the streets, an image promoted by both Nixon's War on Drugs and, especially, Reagan's campaign against crack, but one that Democrats have bought into just as much. It's not true, of course. It's insane that I have to point out that it isn't true. You are in all likelihood a current or past drug user yourself.

Drugs cost money. People with more money are more likely to be able to afford drugs. It's not fucking rocket science, and it's not a secret. Study after study of illegal drug use -- and the War on Drugs has motivated many such studies -- has confirmed this for decades. Lower-income people just don't have the money to spare to buy drugs as often or in as great quantities as the rest of the population does. What is true, however, is that when they buy drugs they are more likely to buy them in public or semi-public places rather than from a classmate or co-worker like the more monied people who don't think of themselves as drug users, and they are more likely to be targeted and intercepted.

Every level of the War on Drugs already disproportionately impacts lower-income drug users (and accused drug users and their families): drug investigations, searches, drug arrests, drug convictions, asset forfeiture and civil forfeiture, parole denials, readmissions to prison for parole violations. Felony conviction -- and some misdemeanor convictions in some jurisdictions -- results in significant loss of rights, which can include the right to vote, to serve on juries, and to receive welfare or other public assistance. It seriously impacts job prospects in most states, and can disqualify applicants from college scholarships. And obviously felony conviction for drug offenses, once again, disproportionately impacts people of lower income.

So you're already accomplishing your goal: denying benefits to people who are both needy and drug users.

3: But why do you have that goal?

Because you are a suckhole piece of shit, is why.

Because the core premise of laws like this is that aid goes to "the deserving," and we have demonized drug use, but we have demonized it in a very specific way. Drug use is prevalent throughout all ranks of society, after all. We have primarily demonized it among people who are vulnerable to being caught doing it.

We don't propose drug testing kids applying for student loans, the CEOs of companies receiving corporate welfare, etc etc. We single out the small percentage of the population receiving a specific form of aid that has been subject to decades of demonization, because ultimately this argument is just about looking for an excuse to deny welfare benefits to people, because you don't think anyone deserves them.

Why should we spend money on foreign aid when we have homeless veterans right here

Porque no los dos, you fuckknuckle?

This "why help this cause when this other cause exists" argument is obviously one of the lowest forms of conversation, but I know you know some grade-A asshole who brings it up.

Why should we have welfare, just work hard

However hard you work, there are hundreds of thousands of people who work harder and have less to show for it.

However hard you work, luck is always a factor. I'm not even just talking about privilege here. Privilege is important: regardless of effort and personal achievement, the system benefits white people, benefits native English speakers, benefits men, benefits people who could afford a college education regardless of whether they truly needed the content of that education for their job. But I'm talking about luck. If you don't understand that luck has played a role in your every success, you have badly misread the story of your life. If you don't understand that it could have easily gone another way if the person who interviewed you for a job just happened to be in a worse mood that day, or if your parents had moved to Town X instead of Town Y, then you understand almost nothing about your own life, and you certainly don't understand anything about the circumstances anyone else faces.

You can work hard and have nothing to show for it. You can work hard and prosper. The problem comes when those who work and prosper assume that their prosperity is evidence of their character, and that by extension everyone else's successes or lack thereof reflects their own worth.

Very few people these days will come out and say "the tangible rewards you have in life reflect who you are as a person and your value" (though of course some come close, like the people who subscribe to abominable, morally toxic doctrines that correlate "positive or negative thinking" to the positive or negative events in your life). But you only have to talk to a few people in the course of any day to realize that it's what many of them believe.

And they are grade-A assholes.

People deserve welfare because they're people.

People deserve compassion because they're people.

Except for maybe your grade-A asshole friend over there.

But I don't really ...

No, I know you don't. Listen. This is the problem. This is what so much of this horseshit comes down to, and you or your grade-A asshole friend ought to shut the fuck up unless you want people to realize this about you: it comes down to a desire to withhold compassion.

A desire to use it as a reward. To give it -- whether "it" means caring about what happens to people, acting on that caring, or supporting policies and organizations that distribute material assistance and services to people -- only to "the deserving."

This desire to withhold compassion, to withhold charity, consistently overrides even pragmatic thinking: nevermind that it costs more to drug-test welfare recipients than you could ever save by doing so, as long as one drug user is denied some food stamps, it's all worth it, right? Nevermind that drug treatment programs are proven, again and again, to cost less and prevent more crime than prison sentences for drug users. Why should we help someone who doesn't deserve it, even if it benefits us to do so? Nevermind that it helps the economy to fund housing, education, healthcare, and job training for anyone who needs it -- what really matters here is, do we think they deserve the help?

If we try hard enough, I'm pretty sure we can always find a way to no. We are a pretty innovative fucking people, after all.

Saturday, November 12, 2016

cuisine and empire

I think my first blog blog, as opposed to LiveJournal or whatever other platforms I may have forgotten, was a cooking blog. I have cooked for a fair bit. About thirty years, I suppose, but only on the regular for about ... well, twenty-five years, then. From the latter part of high school on, I've cooked the overwhelming majority of the meals I've eaten, and sometime long after people said I was good at it, I started actually being good at it. The Food Network came and went (I realize it's still broadcasting, but come on). I had a few blogs. At one point or another I've made most things from scratch that don't require a still or a grain mill.

I think about cooking a lot, is the thing. I wouldn't do it if it weren't something that intrigued me. Or I suppose I would do, but I'd do it differently, you know -- I wouldn't cook the way that I do, which is a whole thing we don't need to get into here. I think about flavor, I think about technique, I think about context, and because I'm a historian, I think about history.

There aren't a whole lot of book-length histories of cuisine out there. It's a slightly more popular topic for microhistory -- you can find a number of different histories of coffeehouses, tea, pizza -- and there has been a small but promising uptick (maybe too small, maybe I shouldn't spook it) in books about immigrant cuisines in the United States in the last decade or so, which is very very cool. But there are only a handful of broad histories of cuisine overall, of which Reay Tannahill's is probably still the canonical example.

Rachel Laudan's Cuisine and Empire is a valuable addition to the field. It differs from Tannahill quite deliberately in that Tannahill (writing in the 70s, perhaps relevantly) primarily organizes her work by country (or empire), while Laudan emphasizes the contacts between cultures.

This is a huge book, and by its nature not something that can be summarized, so there will be a lot of detail that I skip over here because I just didn't think to dogear it.

It's been some decades since Tannahill's book, and in that time there has been considerable activity on the matter of cooking in prehistory. Most famously, Richard Wrangham has proposed that cooking actually predates us -- us meaning H. sapiens, anyway -- and that Homo erectus first cooked its food nearly two million years ago. Further, Wrangham argues that, as the subtitle of his book Catching Fire would have it, Cooking Made Us Human -- that it was our discovery of cooking that drove human evolution on a path divergent from the other primates, one that led not only to less time foraging but less time eating. Chimpanzees need to spend five hours a day chewing their food in order to get enough energy to get through that day. Cooking softens food (and, in the case of many ingredients, increases the bio-availability of nutrients, and of course neutralizes many toxins); beyond that, it's Wrangham's view that an early development of cooking contributed to numerous evolutionary advantages, including a more efficient digestive tract. This is not a universally held view, mainly because there is insufficient archaeological evidence to compel it, but it is more widely accepted that our various masteries of eating -- cooking, hunting, and much much later agriculture -- contributed to brain growth.

Our modern expectation to eat "fresh" and "natural" foods is possible only because we eat foods out of season -- radically out of season, in senses incomprehensible to the past: we not only rapidly transport food across the world from where it is grown or raised to where it is eaten, we not only refrigerate food to extend its freshness, we alter the natural life cycles of animals in order to have meat and dairy on demand, and we've spent thousands of years breeding both animals and plants for more desirable food traits. Plant-based foods take longer to spoil and are more resistant to pests; meat is more abundant; dairy is sweeter.

Humankind is possible only because of unfresh food, of course, preserved food, smoked, dried, salted, fermented. Grain that's been in the granary for months. Dried out meat you have to boil for a few hours before it's soft enough that you can eat it. Different peoples faced different challenges of climate, and had access to different food resources -- those differences, ultimately, account for the earliest cuisines, which is to say, sets of cooking methods, techniques, habits, and technologies characteristic of a given region or culture.

Laudan classifies cooking operations into four groups: "changing temperature (heating and cooling); encouraging biochemical activity (fermenting); changing chemical characteristics by treating with water, acids, and alkalis (leaching and marinating, for instance); and changing the size and shape of the raw materials using mechanical force (cutting, grinding, pounding, and grating, for example)." It's an important reminder in part because until we get to the fairly recent past, cooks had to do much more of this than they do now; most purchased ingredients are already heavily processed, though we don't think of them that way. Even at farmer's markets, for instance, many of the vegetables have been washed (even if not as efficiently as supermarket vegetables are), and possibly trimmed. But that's the most minor example compared to the preparation of grains -- which required hours of work for every day's worth of food -- or meat. "Take meat, for example. A carcass has to be skinned before meat can be cut from the bone and then into portions. These may then be eaten, or subjected to heat and then eaten, or frozen and dried or fermented so that they can be eaten at a later date." Food historians generally refer to those preliminary operations as "processing," although moderns tend to think of "processed food" as spray cheese and Tofurkey.

The stories of the earliest cities are the stories of cuisines based primarily on grains and roots -- and really, the Neolithic Revolution, the Agricultural Revolution, might better be called the Grain Revolution, because while it is sometimes simply described as "when people started planting crops," which led to permanent settlements instead of nomadic hunting and gathering, it was the mastery of grain that made this possible, and that mastery came relatively late in the history of cooking (especially if we accept Wrangham's view) because dealing with grain is so fucking difficult. There's some debate about whether we were grain-foragers before we were grain-planters -- I mean, presumably we had to have been, but the debate is about how long that went on -- but in the grand scale of things it doesn't make much difference. Grain is fucking difficult. The seeds are very small and very hard, and even once you deal with them, you still need to process them further to eat them. (Keep in mind that even gathering fuel for cooking fires took a lot of work and time.)

As Laudan points out, "Cities, states, and armies appeared only in regions of grain cuisines. When they did, grain cuisine splintered into subcuisines for powerful and poor, town and country, settled populations and nomads. A feast following a sacrifice to the gods was the emblematic meal everywhere, the meal that represented and united the society, as Thanksgiving now does in the United States. It is not clear whether these global parallels reflect widespread contact between societies, the logic of emerging social organization, or a combination of the two."

That last sentence sums up a lot of history and anthropology, incidentally. Don't trust anyone who insists that when you find X and sort-of-X in two places, it must be because contact between the two places transmitted X. That recent study claiming ancient origins for Little Red Riding Hood et al based on phylogenetic analyses? Don't take it at its word.

Anyway, the crazy difficulty of grain (even apart from how much more difficult it was at the dawn of the Neolithic Revolution): "Steamed broomcorn millet and foxtail millet, tiny round grains from disparate botanical genera, were the basis of the first cuisine we encounter in the Yellow River Valley in ancient China. There peasants lived in small villages, their dwellings half buried in the ground and roofed with thick thatch to protect against the freezing winters, and the interiors crammed with grain and preserved vegetables. Small patches of millet dotted the valley's fertile yellow soil, which was brought by floods and winds from the steppe. To prepare the millet, peasants lifted heavy pestles high above mortars and let them fall repeatedly until the inedible outer hulls were cracked. Beginning around the first century BCE, they used foot-trodden pestles to pound grain in a mortar buried in the ground, a less demanding method. When all the hulls were cracked, they tossed the grains in a basket, winnowing away the lighter hulls. Then they steamed the grains until they were light and fluffy in three-legged pots set over small fires, a method that conserved scarce fuel. Before dipping their fingers into the communal bowl, they offered a little to the gods and the ancestors. They accompanied the millet with bites of pickled vegetables, cabbage of various kinds, mallow, water-shield (an aquatic plant), or bamboo shoots, seasoned and preserved with costly salt. Sometimes, when they had trapped small wild animals, they had a bit of boiled or steamed meat, seasoned with Chinese chives, Chinese dates, or sour apricots."

By this point, other grains had been introduced to the region from the Fertile Crescent -- wheat and barley, collectively referred to as mai -- but were unpalatably tough and chewy when prepared like millet, and were usually eaten only in lean times, like the months before the new harvest, when last year's millet stock began to run low.

A more elaborate sacrificial feast:

"Servants set out mats of aromatic reeds, small stools to support the diners' elbows, and dishes of bronze, wood, bamboo, and pottery. Meat on the bone and grain went on the left of each setting, sliced meat, drinks, and syrups on the right, and around them minced and roast meats, onions, and drinks were arranged in a symmetrical pattern. After making an offering to the ancestors, the king and the nobles knelt to eat, each man's seniority and valor in battle determining where he knelt and what pieces of meat he was entitled to. The warriors took morsels of the drier dishes with their fingers: meats marinated in vinegar, fried, and served over millet or rice; jerky spiced with brown pepper; and jerky seasoned with ginger, cinnamon, and salt. They scooped up keng, a stew soured with vinegar or sour apricots (Prunus mume, the "plums" of plum sauce). They nibbled on small cubes of raw beef, cured in chiu, and served with pickles, vinegar, or the juice of sour apricots; on meatballs of rice and pork, mutton, or beef; and on the much-sought-after roasted, fat-wrapped dog's liver."

But up above, we mentioned roots too, not just grains. The tropical monsoon region begins a few hundred miles south of the Yellow River Valley, and it was home to both a root cuisine and a rice cuisine, about which much less is known than about the Yellow River Valley cuisine. "To begin with the root cuisine, taro, yam, and the cooking banana (the starchy, high-yielding fruit of Musa spp., as well as its root) were boiled or steamed, and most likely pounded to pastes that could be scooped up with the fingers. People on the oceanic side of New Guinea loaded outriggers with the basics of this culinary package and sailed east into the Pacific. To sustain themselves at sea, they stowed lightweight, long-lasting dried or fermented fish, breadfruit, and bananas for food. They filled gourds and bamboo sections with water, and drank the water inside coconuts. They packed slips, cuttings, young plants, and taro and yams in moist moss, then wrapped them in a covering such as leaves or bark cloth, tucked them into palm-leaf casings, and hung them out of reach of salt spray. Breeding pairs of pigs, chickens, and dogs, which, if worst came to worst, could be eaten on the way, were carried on board. Between 1400 and 900 BCE, they settled many of the South Pacific Islands."

Another sacrificial feast, in Mesopotamia (followed by some general detail):

"A sacrificial feast included sauces, sweets, and appetizers, hallmarks of high cuisine. Fried grasshoppers or locusts made tasty appetizers. Pickles and condiments concocted from seeds, sesame oil, vegetables, fruits, garlic, turnip, onion, nuts, and olives titillated the palate. Sauces were prepared from an onion-and-garlic flavoring base combined with a rich fatty broth thickened with breadcrumbs, the ancestors of sauces still served in the Middle East and even of present-day English bread sauce. Pomegranates, grapes, dates, and confections of milk, cheese, honey, and pistachios provided a sweet touch.

"Professional cooks labored in kitchens that were as large as three thousand square feet, much of the space devoted to making grain-based dishes, bread, and beer. From the coarse groats and fine flour provided by the grinders -- perhaps prisoners and convicts -- cooks prepared porridge, flatbreads, and slightly leavened breads, the latter in three hundred named varieties. Dough was shaped into the form of hearts, hands, and women's breasts, seasoned with spices, and filled with fruit, with the texture often softened by oil, milk, ale, or sweeteners. A flour-oil pastry was enlivened with dates, nuts, or spices such as cumin or coriander. Stuffed pastries were pressed into an oiled pottery mold with a design on the bottom before baking. Flatbreads were baked on the inside walls of large ceramic pots. There is some evidence that bulgur, an easy to cook food, was made by drying parboiled wheat.

"To feed the cities, barley was shipped along rivers and canals. Onions of various kinds, garlic, herbs such as rue, and fruits such as apples, pears, figs, pomegranates, and grapes came from the gardens of the wealthy. The animals were driven to the city, where they were slaughtered, the lambs and the kids going to the temples and noble houses, the male sheep and goats to the officials, royalty, and nobles, the tough ox and ewe meat to the army, and the carcasses of donkeys to the dogs, perhaps royal hunting dogs. Saltwater fish, turtles, and shellfish came from the salt marshes and the Persian Gulf. Dried fish, probably a specialized and regulated industry, came from the Persian Gulf and from as far away as Mohenjo-Daro on the Indus and the Arabian Sea. Salt, excavated from the mountains or evaporated from briny springs and brackish river water, was shipped to distribution centers and packed onto asses, probably in standard-sized, solid-footed goblets.

"Barley was wealth. It paid for the meat and cheeses. It paid for the lapis lazuli and carnelian dishes for the sacrifice, the gold and silver for jewelry, the boatloads of copper that came down the Euphrates or from Dilmun on the Persian Gulf, the metals from Oman and the Sinai, the granite and marble from Turkey and Persia, and the lumber from Lebanon used to build the temples.

"Nomads around the fringes of the irrigated and cultivated areas included the Hebrews, whose daily fare large comprised barley pottages flavored with greens and herbs and flatbreads of barley and wheat, which they farmed in oases during the growing season or acquired by bartering their barren ewes and young rams. They made yogurt and fresh cheese from the milk of their flocks, which they ate accompanied by olive or sesame oil, honey, and grape must and date sweeteners (both of which were also called honey). To conserve their flocks, the source of their wealth, they enjoyed meat only on special occasions following the sacrifice of the 'fruit of the ground' (barley and wheat) and the 'firstlings of the flock' (lambs and kids) to Jehovah."

On various uses of grain:

"The ancient Romans built their empire on barley porridge. The Chinese enjoy rice porridge, the Indians rice and lentil porridge. Polenta (millet and later maize porridge) has sustained generations of Italian peasants. Similarly, grits and mushes were staples of the American colonies. Turkish families commemorate Noah's rescue from the flood with a porridge of mixed grains, fruit, and nuts. Left to sour or ferment slightly, boiled grain dishes became tangy, a flavor much appreciated in eastern Europe, for example.

"Bread -- baked flour and water paste -- was much more portable, but it needed more fuel. Early bread was nothing like our puffy square loaf. Because so much of the bran had to be sifted out to make white flour, white bread was reserved for the very rich until the nineteenth century. Most bread was dark and flat, made of one or more of the hard grains, such as barley, wheat, oats, and later rye, often with some mixture of beans and the starchier nuts, such as chestnuts or acorns.

"To run a city-state or provision an army, rulers had to make sure that grains were extracted from those who worked the land, then transported to cities and put in storage. Sometimes they demanded grain as tribute; sometimes they operated what were in effect agribusinesses farmed by slaves, serfs, or other barely free labor to produce grain; and later they exacted taxes to be paid in grain. Grains, more important, if less glamorous, than the precious metals, exotic wild animals, and beautiful slave girls that they also collected, were processed and redistributed to the ruler's household and bodyguard as pay in kind. Kings, emperors, landlords, and the great religious houses continued to collect grain long after money was invented."

More on the sheer labor of cooking:

"Before beginning to cook, women had to gather scraps of brush, seaweed, dung, furze -- anything that would burn. Steaming and boiling, which use the least fuel, were the commonest ways of cooking. A hot meal was often prepared only once a day, other meals being cold. Water for cooking, drinking, and washing, enough for one to five gallons a day per person (contemporary Americans use about seventy-two gallons a day), had to be carried from a river or well; three gallons weigh about twenty-four pounds. Salt was a luxury, reserved for making salty preserves that accompanied salt-free porridge or bread."

On blood, beliefs about which inform ancient meat cuisines, and sacrifice:

"Blood congealed into flesh, according to the Chinese, the Hebrews, and the Greeks. It was what food finally turned into in animals, said Aristotle. Consequently few societies were neutral about blood as food: some valued it highly, others prohibited it. In the first group were nomads who harvested blood from their animals, Christians who drained the blood of carcasses and used it to make sausages or thicken sauces, and the Chinese. Even today many Hong Kong Chinese mothers feed their children blood soup to sharpen their minds before examination. In the second group were Jews and Muslims, who slaughtered animals so as to drain all blood from the body.

"The sacrifice was followed by the sacrificial feast -- humans eating the gods' leftovers, which were charged with divine power. This might mean eating the flesh of sacrificed humans, a practice motivated not by hunger but by the logic of sharing the gods' leftovers. At least some northern Europeans ate the brains of the sacrificed in the third millennium BCE. The Cocoma people of Brazil, when admonished by the Jesuits for eating their dead and drinking an alcohol laced with ground bones, reportedly said that it 'was better to be inside a friend than to be swallowed by the cold earth.' The Aztecs ate slivers of flesh from those who had been sacrificed on the pyramids. More commonly, however, the feast featured roast meat from sacrificed animals."

Eating human flesh, whether or not in the context of sacrifice, is one of those topics that's subject to a lot of controversy and misinformation. Depictions of the Aztecs as bloodthirsty cannibals, for instance, were obviously pulpy nonsense cooked up much later, but a rejection of that depiction led to a widespread rejection of the notion of any Aztec cannibalism, which is also -- from what I understand, though Mesoamerican history is not at all my area -- false. Cannibalism in times of desperation is obviously widespread in the sense that you find it in any culture, in any time or part of the world, where there is such desperation and famine. Sacrificial or ritual cannibalism is sort of a different thing, though of course some historians and anthropologists theorize that cultures that resorted to desperation-induced cannibalism frequently enough simply developed rituals around it.

Which brings us to theories of other food rituals and food rules, the best known of which in the Western world are the Jewish dietary restrictions:

"Jewish culinary rules were laid out in Leviticus and other books of the Old Testament. Blood, animals with cloven hooves unless they chewed their cud, pork, animals with both fins and scales who lived in the water, and (echoing Persian practice) insects were all forbidden as foods. So was cooking meat in milk and dining with non-Jews. Temple priests followed rules of purification before sacrifice, slaughtered animals so that the lifeblood drained out, and refrained from offering impure fermented (corrupted) foods.

"In the mid-twentieth century, scholars offered opposing interpretations of Jewish food rules, particularly the ban on pork. Marvin Harris argued that they were health measures to prevent trichinosis." The Harris theory was still widely disseminated when I was in grad school, incidentally, and vaguely familiar to a lot of people outside the field, even though it isn't very good (or perhaps because it doesn't require much information). "Mary Douglas and Jean Soler contended that they were designed to create a distinct Jewish identity. The latter interpretation squares better with the simultaneous proliferation of culinary rules in the Persian Empire and the Indian states. With limited culinary resources, identity is most easily established by banning certain foodstuffs, cooking methods, and ways of dining. Pigs, being difficult to herd, were not popular with peoples of nomadic origin, so the force of the rule probably only became fully felt centuries later when Jews became a minority in pork-eating Roman or Christian lands."

Speaking of Rome:

"Every morning and evening, Roman infantrymen prepared meals like those they would have eaten at home on the farm. They boiled wheat to make wheat porridge or wheat pottage (wheat cooked with dried peas, beans, or lentils, a bit of oil, salt, and a little salt pork), which they dipped into with wooden spoons. Or they mixed whole-wheat flour, water, and salt and baked coarse whole-wheat bread, probably in the ashes of the campfire, to eat with a bit of cheese. In the morning, these foot soldiers ate standing up like animals outside their goatskin tents. In the evening, they ate seated on the ground in the tents like children or slaves. They drank water cut with wine or vinegar. Sacrifices on festival days, before they went into battle, and to celebrate victory, added a treat of boiled or roast beef. On the move or near the enemy, biscuit -- twice cooked bread that lasted a long time -- made an instant meal."

Making that porridge or pottage required soldiers to grind grain, for which pack mules carried grindstones. "One of the soldiers assembled the grindstone, placing first a skin or cloth on the ground to catch the flour, then the squat lower grooved cylindrical stone, then the top stone, which rotated over the lower one. He squatted like a woman or slave over the grindstone. With one hand he rotated the upper stone using a peg near the circumference as a handle; with the other he poured handfuls of grain into a hole in the upper stone. The grain dribbled onto the lower stone and was sheared by the movement of the upper. The flour moved toward the circumference along grooves cut in the lower stone. He could grind enough meal for an eight-man squad in about an hour and a half with this rotary grinder, compared to at least four or five hours had he used a simple grindstone.

"Adopting the rotary grindstone involved a series of tradeoffs. It ground faster. The weight of the upper stone, not the weight of the grinder, did the shearing, making the work less exhausting. On the other hand, the rotary grindstone was heavier, more expensive, and more difficult to make than a simple grindstone. Nor could it produce the fine gradations of flour that the simple grindstone could deliver. ... If every squad of eight men required a mill and if at its height, the army comprised half a million men, then some sixty thousand grindstones were lugged over the Roman roads. A millennium and a half was to pass before any other European army was as well fed."

Roman feasts during the Empire:

"The dinner included appetizers, sauced dishes, and desserts, all spurned by republicans. For appetizers, diners might have lettuce (perhaps served with an oil and vinegar dressing), sliced leeks (boiled, sliced in rounds, and dressed with oil, garum [like fish sauce], and wine), tuna garnished with eggs on rue leaves, eggs baked in the embers, fresh cheese with herbs, and olives with honeyed wine.

"For the main course, slaves brought in dishes such as red mullet roasted and served with a pine nut sauce; mussels cooked with wine, garum, and herbs; sow's udder, boiled until soft and then grilled and served with sauce; chicken with a stuffing of ground pork, boiled wheat, herbs, and eggs; and crane with turnips in an herb-flavored vinegar sauce. Exotic fare, such as a pea dish, a chicken stew, and baked lamb with a sweet-and-sour sauce, attributed to Persia, added a cosmopolitan touch.

"Typically, sauces were made by pulverizing hard spices, usually pepper or cumin, but also anise, caraway, celery seed, cinnamon, coriander, cardamom, cassia, dill, mustard, poppy, and sesame, in a mortar. Nuts, such as almonds, filberts, and pine nuts, or fruits, such as dates, raisins, and plums, were added and the mass was worked to a paste. To this mixture, fresh herbs such as basil, bay, capers, garlic, fennel, ginger, juniper, lovage, mint, onion, parsley, rosemary, rue, saffron, savory, shallot, thyme, or turmeric were added, followed by garum and perhaps wine, must, honey, olive oil, or milk. The mixture was warmed to blend the tastes and sometimes thickened with wheat starch, eggs, rice, or crumbled pastry."

Outside of the Roman Empire, grain processing was as much as four times more labor-intensive, and in many parts of the world, unleavened bread (and steamed and boiled doughs in the form of pasta and dumplings, as in China) continued to be the norm. Eventually some cultures caught up to the Romans' efficiency, but beyond that, "There was to be little change in grain processing until the Industrial Revolution, and little change in the final cooking of grains until the twentieth century."

"To supplement the staple grain, oil seeds and olives were crushed in a variety of mills and mortars and pressed in a variety of presses. Sweeteners continued to be produced by many different methods -- sprouting grains (malt sugar in China), boiling down sap (palm sugar in India), boiling down fruit juices (grape and other fruit juices in the Middle East), and taking honeycombs from hives (honey in the Roman Empire). Alcoholic and lactic fermentations in the western half of Eurasia and mold ferments in the eastern half were used to make staple dishes (raised bread) and alcoholic drinks (wine, beer, and chiu) as well as to preserve foods (cheese and sausage in the Roman Empire, milk in the Middle East) and create condiments (fermented beans in China). Autolysis (self-digestion) produced garum in the Mediterranean and probably fish sauce in Southeast Asia."

An important point I brought up earlier about food processing comes up again as we move through the next millennium and a half:

"Cooking was a form of alchemy, the most sophisticated understanding of changes in matter then available. Cooking and alchemy used the same tools and equipment. Both sought to find the real nature or essence of a natural substance by applying the purifying power of fire. Just as a crude ore had to be refined in the fire to release the pure shining metal, so raw wheat or sugarcane had to be similarly refined to extract the pure white flour (originally "flower") or gleaming sugar. Culinary processes such as sugar refining and bread baking were thus potent metaphors for spiritual progress. Unlike our contemporary understanding of natural food as having received only minimal processing, this earlier understanding was that processing and cooking were essential to reveal what was natural."

This all reflects one of the major principles of ancient culinary philosophy: the theory of the culinary cosmos, which led to the practice of eating only cooked food (even most fruits were not often eaten uncooked in Europe, for instance) and, for those who could afford the options, of eating food that "balances the temperament," an idea that trickles down today in a lot of horseshit diets.

This balancing the temperament stuff was grounded in the idea of the "humors," or maybe it's better to think of them as both coming from the same worldview, and it informed the view of what we would now term "healthy eating." "In preparing food for their noble employers, cooks were as aware of the need to balance the humors as we are today of, say, the need to have all food groups represented. Root vegetables such as turnips were by nature earthy (dry and cold) and thus better left to peasants. Chard, onions, and fish were cold and wet, so that frying was appropriate. Mushrooms were so cold and wet that they were best avoided entirely. Melons and other fresh fruit were not much better, being very moist and thus thought likely to putrefy in the stomach. Grapes were best served dried as raisins, quinces were dried and cooked with extra sugar -- warm in humoral theory -- to make quince paste. Red wine tended to be cold and dry, so it was best served warm with added sugar and spices."

Another major principle was the hierarchical principle, which broadly called for eating according to your station in life -- a high cuisine for the court, a humble cuisine for the poor. Roughly in this period, in many parts of the world, that hierarchy was extended to include a higher cuisine for holy men and intellectuals than for the unenlightened, rather than being based only on overt political power.

The third ancient culinary principle was that of sacrifice, which had largely been phased out in the Axial Age and replaced with new religious rules for eating: "these rules identified preferred ingredients and dishes, often ones believed to enhance contemplation, such as meat substitutes (fish, tofu, gluten), sweetened soft fruit and nut drinks, or stimulating drinks such as tea, coffee, and chocolate. They specified how to process and cook foods, including guidelines for slaughtering, and laid down rules about how cooks should purify themselves, whether fermented foods were acceptable, and which foods could and could not be combined. A third cluster of rules specified mealtimes, days of fasting and feasting, and who could dine with whom.

"The rules, stricter for religious elites than for ordinary believers, were formulated and reformulated for centuries because the founders of the religions, although they relied on culinary metaphors to explain beliefs and doctrines, rarely laid down clear and consistent regulations for cooking and eating. Christians, for example, were not required to fast until the fourth or fifth century. Then they were instructed to fast on about half the days of the year. Today, in the Roman Catholic Church, fasting has been reduced to a minimum."

"Even more important in the dissemination of the new cuisines were monasteries, shorthand for permanent religious houses. Like courts, they were places where all ranks of society met, from clerics to their servants and slaves. Like court kitchens, monastery kitchens were huge and complex, turning out different meals for different ranks: noble and aristocratic visitors; passing merchants, monks, and nuns; the poor and indigent; the sick; and students studying in the monastery school. ... Like courts, they invested in food-processing equipment like gristmills, oil presses, and sugar mills, processing and adding value to foodstuffs. These they sold or offered as gifts, thereby creating loyalty. Like courts, monasteries were part of networks that crossed state boundaries, in this case by the movement of religious orders and missionaries rather than marriage."

"As theocratic cuisines spread, so did their preferred raw materials: plants and sometimes animals. Particularly important were the transfers of southeastern and Chinese plants to Buddhist India, Indian plants to Buddhist China, Chinese plants to Korea and Japan, Indian plants to Islamic lands, and European plants to the Americas through the Columbian Exchange. Royal and monastic gardens and large estates transplanted, ennobled, and grew sugarcane, rice, grapevines, tea, coffee, and other crops essential to the new cuisines."

Here we come to one of my favorite topics in culinary history:

"Whereas culinary diffusion prior to world religions had primarily meant emulating or rejecting neighboring high cuisines, with world religions the relation between successive cuisines became more complex. 'Fusion,' the term so often used, does not do justice to the variety of interactions. One cuisine could be layered over another, as happened with the Spanish conquests in the Americas, the conquerors eating Catholic cuisine, the indigenous retaining their own cuisine. Specific dishes, techniques, plants, and animals might be adopted, as Europeans, for example, adopted distilling, confectionary, and citrus from Islam."

Oh, if I had a nickel for every dipshit going on about how some dish or approach isn't "authentic."

There is no authentic cuisine. All cuisines are in flux and ever have been. Lots of people carry around a sense of normalcy based on a sphere that extends for a couple hundred miles and a couple dozen years, and think that sense of normalcy reflects something real, something other than their memory of food they've experienced. That's not how it works. Italian food didn't suddenly become Italian food when tomatoes arrived on the boot, or when immigrants in the northeast US started making meatballs. And putting tomato sauce on that pasta for the first time, making those first giant meals of spaghetti and meatballs, didn't invalidate those meals either.

Nobody worried about this bullshit when they actually fucking cooked. It's the hobbyhorse of the dilettante.

Meanwhile! In the Mongol Empire!

"Twenty seven soups dominate the ninety-five food recipes [in Hu's Proper and Essential Things]. The centerpiece of Mongol cuisine, these soups could be quite liquid or thickened to become solid. The basic recipe went as follows:

"1: Chop meat on the bone (usually mutton, but also game such as curlew, swan, wolf, snow leopard) into pieces. Boil in a cauldron of water until tender. Strain the broth and cut up the meat.

"2: Boil the broth with a variety of thickeners, vegetables, and tsaoko cardamom.

"3: Add the meat.

"4: Season to taste with salt, coriander, and onions.

"For a traditional Mongol taste, the thickeners might be chickpeas, hulled barley, or barley meal. To give the soup a Persian touch, it was thickened with aromatic rice or chickpeas, seasoned with cinnamon, fenugreek seeds, saffron, turmeric, asafetida, attar of roses, or black pepper, and finished with a touch of wine vinegar. For a Chinese taste, it was thickened with wheat-flour dumplings and glutinous rice powder or rice-flour noodles, and flavorings of ginger, orange peel, soybean sauce, and bean paste. In this way, the soup of the khans could be adjusted to the preferences of the peoples they had conquered."

Authenticity my ass.

This is basically how pizza adapts to local culinary niches today, and of course what McDonald's does internationally.

Now coffee enters the scene, thank God:

"Coffee, like wine, was an aid to union with the divine. Long before the time of the Sufis, coffee beans, the fruit of a bush native to the highland forests of southwestern Ethiopia, had been chewed like a nut or mixed with animal fat to make a portable, satisfying, and stimulating food for warriors." If you haven't seen the way coffee grows, the bean is just the seed, and of course has a softer fruit surrounding it (which is also lightly caffeinated, and is sometimes used now in some coffee-growing regions to make a vaguely hibiscus-like drink). "Coffee plants were naturalized in Yemen perhaps as early as the sixth century BCE when the Abyssinians invaded Arabia. Later, a new way of preparing coffee by toasting the beans, grinding them, and brewing them with hot water was developed, perhaps in Iran. The Arabic word for coffee, qahwah, probably derives from a word meaning to have little appetite and hence to be able to do without. It had been first applied to wine and later to coffee (which suppressed the desire to sleep). Sufi pilgrims, traders, students, and travelers consumed coffee to keep awake during ceremonies and induce a sense of euphoria, spreading its use throughout the Islamic world between the thirteenth and fifteenth centuries."

The Islamic world introduced coffee to the West, as it did so many other things, and that's not all! It also introduced stuff to have with coffee.

"Sugar cookery was introduced from Islam in the twelfth century by a physician known as Pseudo-Messue. The English words syrup, sherbet, and candy all have Arabic roots. Medicinal electuaries, pastes of spices and drugs, and comfits, sugar-coated spices, were the distant forerunners of candy. Sugared spices did not break the fast, Thomas Aquinas said, because 'although they are nutritious themselves, sugared spices are nonetheless not eaten with the end in mind of nourishment, but rather for ease in digestion.' It was an important decision, both because it gave medical respectability to sugar and because it foreshadowed later arguments about chocolate."

Arab fruit pastes became Portuguese quince marmelada, later inspiring the citrus marmalades that are more familiar to Americans, and the sweet fried doughs used to celebrate the end of Ramadan inspired similar fried doughs in Catholic traditions, eaten before the Lenten fast: doughnuts, beignets, etc. (The Brits have their pancakes.)

Along with all this came distillation and better booze. Not too shabby.

"In the early fourteenth century, cookbook manuscripts began appearing across Europe.... Rarely were these cookbooks step-by-step manuals, being, rather, testimonials to a ruler's fine cuisine or aide-memoires to professional cooks. With the invention of printing, the number increased again."

Medieval history is not at all my area of expertise, but this broadly fits my understanding of the ... history of professionalization, sort of, the history of procedural rigor, if you will.

The dissemination of cookbooks further contributed to the Westernization of Islamic dishes in Europe, in much the same way that nineteenth and twentieth century cookbooks Americanized immigrant and foreign cuisines:

"Al-sikbaj (meat cooked in a mixture of sweetener and vinegar) was transformed into fried or poached fish (or chicken, rabbit, or pork) in an acid marinade of vinegar or orange (escabeche), perhaps the origin of aspic." Al-sikbaj was a characteristic dish of the Moors who conquered Spain, but has since died out in the Muslim world. "Ruperto de Nola's Libre del coch included thin noodles, bitter oranges, fried fish, escabeche, almond sauces, and almond confections. Martinez Motino's Arte de cocina contained several recipes for meatballs and capirotada, and one for couscous. It also had one for Moorish hen, roast chicken cut into pieces, simmered with bacon, onion, broth, wine, and spices -- which were not named, but probably included pepper, cinnamon, and cloves -- and then enlivened with a final dash of vinegar. The bacon and wine were typically Christian, but the sour-spicy sauce justifies the name."

So here's the other thing about sugar: it used to be in fucking everything. The line between "sweet" and "savory" isn't just a recent thing; it's the defining characteristic of the modern palate. Candies and confectionery used to include not just candied oranges and cherries, but carrots and turnips. Meat dishes in high cuisines were regularly served in sweet sauces -- no, not like at that Chinese place, no, not like barbecue sauce -- like really noticeably sweet, not tangy.

Then that changed.

If you have to pick a point where things start to change, it's 1651, when Francois Pierre La Varenne published Le Cuisinier Francois, which was widely translated, and which inspired numerous imitators. The middle of the seventeenth century saw a significant shift in Western tastes characterized by two changes: "the disappearance of spices and sugar from savory dishes [notice how rarely we use 'baking' spices like cinnamon, clove, nutmeg, etc., in savory dishes, whereas they are still common in Middle Eastern, North African, and Central Asian cuisines] and the appearance of new fat-based sauces, many thickened with flour."

The traditional Catholic cuisine was displaced piecemeal across Europe. In England, "the aristocracy dined on the new French cuisine. The gentry, by contrast, rejected this in favor of a middling bread-and-beef cuisine optimistically described as the national cuisine." Across most of western Europe, sweet and sour were segregated into different dishes and usually different courses, while beef and bread became higher profile, as did dairy and sauces made with fat and flour. French cuisine informed other European cuisines while at the same time absorbing and reinterpreting elements of them, a process that continued for the next couple of centuries.

One of the major innovations of the time period was "middling cuisines," a prerequisite to modern cuisine: "Middling in the sense of bridging high and low cuisine, rich in fats, sugar, and exotic foodstuffs, featuring sauces and sweets, and eaten with specialized utensils in dedicated dining areas, middling cuisine became available to an increasing proportion of the population in the following centuries. Changes in political and nutritional theory underwrote this closing of the gap between high and humble cuisines. As more nations followed the Dutch and British in locating the source of rulers' legitimacy not in hereditary or divine rights but in some form of consent or expression of the will of the people, it became increasingly difficult to deny to all citizens the right to eat the same kind of food. In the West, the appearance of middling cuisines ran in close parallel with the extension of the vote. Reinforcing this, nutritional theory abandoned the idea that cuisine determined and reflected rank in society in favor of a single cuisine appropriate for every class of people.

"The growth of middling cuisines is what nutritionists call the 'nutrition transition,' the sequential global shift from diets composed largely of grains to diets high in sugar, oils, and meat ... the nutrition transition increases food security [but] brings in its wake many associated health problems, including increased incidence of strokes, heart attacks, obesity, and diabetes, and with them increased costs for society." (Of course, poverty and malnutrition have also decreased, so there's that.)

These middling cuisines began before the Industrial Revolution, but that was a huge driver in really bringing all these trends together and forming what we would recognize as modern cuisine. The advances of the Industrial Revolution brought about more efficient and cheaper forms of food preservation, refrigeration and rapid transportation of fresh food, extraordinary advances in agriculture (among them new fertilizers and pesticides), and so on, transforming the quality, quantity, and price of food more dramatically than any development had since the mastery of grain cookery thousands of years earlier. Those advances in transportation also made more feasible the waves of immigration that repopulated the United States after Native American tribes were decimated, and the arrival of many, many different immigrant groups, all with their own cuisines -- but not always with access to ingredients from home, and sometimes finding it easier to adapt what was available -- contributed to what is erroneously called the "melting pot," an American cuisine that was and I think remains in flux. Americans were also the first to use ice -- in their drinks and in a million other ways -- to such a great extent, and they pioneered the commercial ice business.

The influence of French cuisine on modern cooking remained strong, and in the nineteenth and twentieth centuries numerous dishes in non-French cuisines were created or altered with distinctive French touches -- adding butter instead of oil, reducing the spices and herbs in Greek dishes, dressing cold cooked vegetables or meats with mayonnaise and raw vegetables with vinaigrette. Bechamel -- originally Italian! but popularized by La Varenne -- showed up everywhere, with Russians using it as a piroshki filling with mushrooms or to thicken soup, Mexican chefs using it to dress fish, and Indian chefs using it to dress eggplant and pumpkin. Beef Stroganov, unsurprisingly, is one of the most famous Russian dishes attempting to emulate French cooking, while bechamel was repopularized in northern Italy and found its way into lasagna.

A middling cuisine means, by extension, that pretty much everyone eats pretty much the same thing, at least in the broad strokes. Inevitably that means the specifics invite criticism. "Religious groups, conservatives, socialists, and feminists attacked modern middling cuisines. Some wanted the egalitarianism of modern culinary philosophy but rejected other aspects. For example, many reformers turned their backs on meat, white bread, and alcohol, developing alternative physiologies and nutritional theories to explain why vegetarianism [a term coined in the 1840s] or whole grains were superior. Others attacked domesticity, liberalism, and free trade, proposing alternative ways of organizing modern cooking, commerce, and farming. Yet others hoped it might be possible to return to a [purely] imagined egalitarian past, invoking agrarian and romantic traditions to criticize modern, industrialized cuisines."

One key to remember with the historical development of these things, and when encountering new such things in the wild, is that, you know, the rejection tends to come first, with the rationale developed shortly thereafter. By the time you hear about it, that may not be clear, because once the rationale is developed, it's all "so I was doing research on YouTube and I discovered, holy fuck, bananas cause butt cancer," but really it's just that this one guy didn't fucking like bananas, or the idea of bananas, or he really liked the idea of conspicuously avoiding consumption of something, and later he came up with the butt cancer thing.

Okay! That brings us close enough to the present day to wrap up there. One more book down.