Friday, December 23, 2016

here are some of the reasons you or some grade-A asshole you know will give for not helping other people

That charity's CEO makes too much

Give me a fucking break.

First of all, CEO compensation is not nearly as important as a charity's efficiency, as measured by, for instance, Charity Navigator. CEOs of very large charities may make amounts that sound large, but relative to the amount of money the charity takes in and redistributes, a seemingly large number doesn't necessarily mean anything.

Second of all ... look, all you're doing here is making it clear how little attention you pay to how much OTHER CEOs make, the ones who don't work for charities. The charity CEOs sure as fuck aren't making an amount the private sector would find significant. They're making more than you are, and more than I am. A lot more. But if you're outraged by that, I think it can only be because of a lack of perspective, a lack of understanding the real scale of income in the world. There may be a handful of charity executives who are overcompensated -- I certainly can't claim to have seen everyone's tax forms -- but far more common are the executives making much less than their private sector counterparts.

According to CharityWatch, the highest paid charity CEO (who makes twice as much as the second-highest) makes about $3.5 million. Sure, it sounds like a lot. The average income of a CEO of an S&P 500 company is almost $14 million. The median income of charity CEOs is about 1.1% of that. PLEASE PAY ATTENTION TO THAT DECIMAL POINT, YOU SHITHEAD.

You pay charity CEOs more than other workers for the same reason you pay other CEOs more than other workers -- because our culture expects and encourages salaries to work that way, and because you are competing for their labor with other organizations that could offer them more. Any legitimate problem you can raise with that is best addressed to the private sector, where the inequality is far greater, and where much, much more of the money from your wallet ends up in the offshore accounts of CEOs.

Third, thinking of CEO compensation purely in terms of income misunderstands a great deal about the world of the American wealthy, and charity CEOs aren't receiving the tremendous benefits of their private sector counterparts. Even that 1.1% figure is overselling the wealth of the heads of charities.

Fourth, what is it you're actually outraged about, when you come down to it?

Is it really the amount of money? You're really not contributing much to it, let's face it. The only charity CEOs making serious money are heading charities large enough that your donations are a drop in the bucket -- you may as well complain about the contribution of your sales tax to the salaries of the state government officials you don't like, if you're going to fixate on the unfair distribution of your every dime.

I think this complaint more frequently speaks to a conviction people have that people who work for charities should do so primarily out of a motivation to be charitable, and that there is therefore something inappropriate about them being paid for it. We are so fucking miserly in our approach to charity, our approach to helping others, that even when we're in essence hiring a service to do good works on our behalf, we don't want that service to retain any of the money we're giving them to do so. We want it to pass untouched directly to the recipients. We want our charities to be volunteer-run and incur no expenses beyond a postage stamp, while somehow managing to distribute our charity in more useful and efficient forms than we can do ourselves.

That doesn't say great things about you or the grade-A asshole you know.

Goodwill doesn't work the way I thought it did

Well Jesus Christ, tough shit, buttercup.

Goodwill does not redistribute donated goods to the poor.

Goodwill provides charitable services in the form of employment, training, and related community programs to its employees, and collects donated goods in order to keep its overhead costs low so that it can afford to fund those services and pay those wages.

That has always been Goodwill's model, and they've been around for over a century.

If you don't think that's a model you want to fund, don't do business with Goodwill. But the number of people who think Goodwill is deceiving them and collecting donated goods under false pretenses is ridiculous. It isn't their fault that you don't pay attention to one of the best-known charities in the country.

Charities don't even do charity, man, they just keep all the money

cf. the CEO argument above.

There are some bad charities. There are a few different kinds, I guess: actual scams, de facto scams that nevertheless legally operate as charities, charities that are run incompetently, charities that are run incredibly inefficiently, and charities that allow political or religious motivations to impact the way they operate, without sufficiently disclosing that to donors.

But first of all: this is not the majority of charities.

Second: You are not a fucking nineteenth century street urchin with nothing to rely on but the life skills you earned at the School of Hard Knocks. You have the internet. You have Charity Navigator. You have the ability to Google for NYT articles about a given charity. It is not fucking difficult to figure out if a given charity is a) real, b) the subject of a recent or ongoing legitimate scandal that should give you pause, c) good at doing whatever it is you want to accomplish with your charitable donation. This takes five minutes at the most.

Third: Apart from the scams, most of those problems are not about "keeping all the money," they're organizational problems or problems of the charity's goals not matching your own. People jump to this idea of charities being secret profit monsters really quickly, because -- well, because they're assholes without much real compassion, and a lot of these arguments, you'll notice, have a foul core at the center of the onion that is all about who does and doesn't deserve your compassion.

Drug-test welfare recipients

There are so many reasons this is a horseshit idea, and only a horseshit person would support it:

1: Assuming your goal is to save money by denying assistance to those who test positive, it doesn't work: these programs consistently cost more money than they save.

This fight has already been played out in decades of workplace drug-testing, which has declined since its 1990s peak not because drug use has declined but because employers discovered that spending money on drug-testing employees or applicants was not actually producing productivity gains. Outside of safety-relevant contexts, the purpose of drug testing is not really to improve performance or save money; it's to ostracize drug users.

2: The whole premise is wrong. Welfare recipients are significantly less likely to use drugs than the general population.

There is a common image of the average drug user as a strung-out addict living on the streets, an image promoted by both Nixon's War on Drugs and, especially, Reagan's campaign against crack, but one that Democrats have bought into just as much. It's not true, of course. It's insane that I have to point out that it isn't true. You are in all likelihood a current or past drug user yourself.

Drugs cost money. People with more money are more likely to be able to afford drugs. It's not fucking rocket science, and it's not a secret. Study after study -- and the War on Drugs has motivated many such studies -- of illegal drug use has confirmed this for decades. Lower-income people just don't have the money to spare to buy drugs as often or in as great numbers as the rest of the population does. What is true, however, is that when they buy drugs they are more likely to buy them in public or semi-public places rather than from a classmate or co-worker like the more monied people who don't think of themselves as drug users, and they are more likely to be targeted and intercepted.

Every level of the War on Drugs already disproportionately impacts lower-income drug users (and accused drug users and their families): drug investigations, searches, drug arrests, drug convictions, asset forfeiture and civil forfeiture, parole denials, readmissions to prison for parole violations. Felony conviction -- and some misdemeanor convictions in some jurisdictions -- results in significant loss of rights, which can include the right to vote, to serve on juries, and to receive welfare or other public assistance. It seriously impacts job prospects in most states, and can disqualify applicants from college scholarships. And obviously felony conviction for drug offenses, once again, disproportionately impacts people of lower income.

So you're already accomplishing your goal: denying benefits to people who are both needy and drug users.

3: But why do you have that goal?

Because you are a suckhole piece of shit, is why.

Because the core premise of laws like this is that aid goes to "the deserving," and we have demonized drug use, but we have demonized it in a very specific way. Drug use is prevalent throughout all ranks of society, after all. We have primarily demonized it among people who are vulnerable to being caught doing it.

We don't propose drug testing kids applying for student loans, the CEOs of companies receiving corporate welfare, etc etc. We single out the small percentage of the population receiving a specific form of aid that has been subject to decades of demonization, because ultimately this argument is just about looking for an excuse to deny welfare benefits to people, because you don't think anyone deserves them.

Why should we spend money on foreign aid when we have homeless veterans right here

Porque no los dos, you fuckknuckle?

This "why help this cause when this other cause exists" argument is obviously one of the lowest forms of conversation, but I know you know some grade-A asshole who brings it up.

Why should we have welfare, just work hard

However hard you work, there are hundreds of thousands of people who work harder and have less to show for it.

However hard you work, luck is always a factor. I'm not even just talking about privilege here. Privilege is important: regardless of effort and personal achievement, the system benefits white people, benefits native English speakers, benefits men, benefits people who could afford a college education regardless of whether they truly needed the content of that education for their job. But I'm talking about luck. If you don't understand that luck has played a role in your every success, you have badly misread the story of your life. If you don't understand that it could have easily gone another way if the person who interviewed you for a job just happened to be in a worse mood that day, or if your parents had moved to Town X instead of Town Y, then you understand almost nothing about your own life, and you certainly don't understand anything about the circumstances anyone else faces.

You can work hard and have nothing to show for it. You can work hard and prosper. The problem comes when those who work and prosper assume that their prosperity is evidence of their character, and that by extension everyone else's successes or lack thereof reflects their own worth.

Very few people these days will come out and say "the tangible rewards you have in life reflect who you are as a person and your value" (though of course some close, like the people who subscribe to abominable, morally toxic doctrines that correlate "positive or negative thinking" to the positive or negative events in your life). But you only have to talk to a few people in the course of any day to realize that it's what many of them believe.

And they are grade-A assholes.

People deserve welfare because they're people.

People deserve compassion because they're people.

Except for maybe your grade-A asshole friend over there.

But I don't really ...

No, I know you don't. Listen. This is the problem. This is what so much of this horseshit comes down to, and you or your grade-A asshole friend ought to shut the fuck up unless you want people to realize this about you: it comes down to a desire to withhold compassion.

A desire to use it as a reward. To give it -- whether "it" means caring about what happens to people, acting on that caring, or supporting policies and organizations that distribute material assistance and services to people -- only to "the deserving."

This desire to withhold compassion, to withhold charity, consistently overrides even pragmatic thinking: nevermind that it costs more to drug-test welfare recipients than you could ever save by doing so, as long as one drug user is denied some food stamps, it's all worth it, right? Nevermind that drug treatment programs are proven, again and again, to cost less and prevent more crime than prison sentences for drug users. Why should we help someone who doesn't deserve it, even if it benefits us to do so? Nevermind that it helps the economy to fund housing, education, healthcare, and job training for anyone who needs it -- what really matters here is, do we think they deserve the help?

If we try hard enough, I'm pretty sure we can always find a way to no. We are a pretty innovative fucking people, after all.




Saturday, November 12, 2016

cuisine and empire

I think my first blog blog, as opposed to LiveJournal or whatever other platforms I may have forgotten, was a cooking blog. I have cooked for a fair bit. About thirty years, I suppose, but only on the regular for about ... well, twenty-five years, then. From the latter part of high school on, I've cooked the overwhelming majority of the meals I've eaten, and sometime long after people said I was good at it, I started actually being good at it. The Food Network came and went (I realize it's still broadcasting, but come on). I had a few blogs. At one point or another I've made most things from scratch that don't require a still or a grain mill.

I think about cooking a lot, is the thing. I wouldn't do it if it weren't something that intrigued me. Or I suppose I would do, but I'd do it differently, you know -- I wouldn't cook the way that I do, which is a whole thing we don't need to get into here. I think about flavor, I think about technique, I think about context, and because I'm a historian, I think about history.

There aren't a whole lot of book-length histories of cuisine out there. It's a slightly more popular topic for microhistory -- you can find a number of different histories of coffeehouses, tea, pizza -- and there has been a small but promising uptick (maybe too small, maybe I shouldn't spook it) in books about immigrant cuisines in the United States in the last decade or so, which is very very cool. But there are only a handful of broad histories of cuisine overall, of which Reay Tannahill's is probably still the canonical one.

Rachel Laudan's Cuisine and Empire is a valuable addition to the field. It differs from Tannahill quite deliberately in that Tannahill (writing in the 70s, perhaps relevantly) primarily organizes her work by country (or empire), while Laudan emphasizes the contacts between cultures.

This is a huge book, and by its nature not something that can be summarized, so there will be a lot of detail that I skip over here because I just didn't think to dogear it.

It's been some decades since Tannahill's book, and in that time there has been considerable activity on the matter of cooking in prehistory. Most famously, Richard Wrangham has proposed that cooking actually predates us -- us meaning H. sapiens anyway -- and that Homo erectus first cooked its food nearly two million years ago. Further, Wrangham argues that, as the subtitle of his book Catching Fire would have it, Cooking Made Us Human -- that it was our discovery of cooking that drove human evolution on a path divergent from the other primates, one that led not only to less time foraging but less time eating. Chimpanzees need to spend five hours a day chewing their food in order to get enough energy to get through that day. Cooking not only softens food (and in the case of many ingredients, increases the bio-availability of many nutrients, and of course neutralizes many toxins); it's Wrangham's view that an early development of cooking contributed to numerous evolutionary advantages, including a more efficient digestive tract. This is not a universally held view, mainly because there is insufficient archaeological evidence to compel it, but it is more widely accepted that our various masteries of eating -- cooking, hunting, and much much later agriculture -- contributed to brain growth.

Our modern expectation to eat "fresh" and "natural" foods is possible only because we eat foods out of season -- radically out of season, in senses incomprehensible to the past: we not only rapidly transport food across the world from where it is grown or raised to where it is eaten, we not only refrigerate food to extend its freshness, we alter the natural life cycles of animals in order to have meat and dairy on demand, and we've spent thousands of years breeding both animals and plants for more desirable food traits. Plant-based foods take longer to spoil and are more resistant to pests; meat is more abundant; dairy is sweeter.

Humankind is possible only because of unfresh food, of course, preserved food, smoked, dried, salted, fermented. Grain that's been in the granary for months. Dried out meat you have to boil for a few hours before it's soft enough that you can eat it. Different peoples faced different challenges of climate, and had access to different food resources -- those differences, ultimately, account for the earliest cuisines, which is to say, sets of cooking methods, techniques, habits, and technologies characteristic of a given region or culture.

Laudan classifies cooking operations into four groups: "changing temperature (heating and cooling); encouraging biochemical activity (fermenting); changing chemical characteristics by treating with water, acids, and alkalis (leaching and marinating, for instance); and changing the size and shape of the raw materials using mechanical force (cutting, grinding, pounding, and grating, for example)." It's an important reminder in part because until we get to the fairly recent past, cooks had to do much more of this than they do now; most purchased ingredients are already heavily processed, though we don't think of them that way. Even at farmer's markets, for instance, many of the vegetables have been washed (even if not as efficiently as supermarket vegetables are), and possibly trimmed. But that's the most minor example compared to the preparation of grains -- which required hours of work for every day's worth of food -- or meat. "Take meat, for example. A carcass has to be skinned before meat can be cut from the bone and then into portions. These may then be eaten, or subjected to heat and then eaten, or frozen and dried or fermented so that they can be eaten at a later date." Food historians generally refer to those preliminary operations as "processing," although moderns tend to think of "processed food" as spray cheese and Tofurkey.

The stories of the earliest cities are the stories of cuisines based primarily on grains and roots -- and really, the Neolithic Revolution, the Agricultural Revolution, might better be called the Grain Revolution, because while it is sometimes simply described as "when people started planting crops," which led to permanent settlements instead of nomadic hunting and gathering, it was mastery of grain that made this possible, and it occurs relatively late in the history of cooking (especially if we accept Wrangham's view) because dealing with grain is so fucking difficult. There's some debate about whether we may have been grain-foragers before we were grain-planters -- I mean, presumably we had to have been, but the debate is about how long that went on -- but in the grand scale of things it doesn't make much difference. Grain is fucking difficult. The seeds are very small and very hard and even once you deal with them, you still need to process them further to eat them. (Keep in mind that even gathering fuel for cooking fires was a lot of work and time.)

As Laudan points out, "Cities, states, and armies appeared only in regions of grain cuisines. When they did, grain cuisine splintered into subcuisines for powerful and poor, town and country, settled populations and nomads. A feast following a sacrifice to the gods was the emblematic meal everywhere, the meal that represented and united the society, as Thanksgiving now does in the United States. It is not clear whether these global parallels reflect widespread contact between societies, the logic of emerging social organization, or a combination of the two."

That last sentence sums up a lot of history and anthropology, incidentally. Don't trust anyone who insists that when you find X and sort-of-X in two places, it must be because contact between the two places transmitted X. That recent study claiming ancient origins for Little Red Riding Hood et al based on phylogenetic analyses? Don't take it at its word.

Anyway, the crazy difficulty of grain (even apart from how much more difficult it is earlier in history at the dawn of the Neolithic Revolution): "Steamed broomcorn millet and foxtail millet, tiny round grains from disparate botanical genera, were the basis of the first cuisine we encounter in the Yellow River Valley in ancient China. There peasants lived in small villages, their dwellings half buried in the ground and roofed with thick thatch to protect against the freezing winters, and the interiors crammed with grain and preserved vegetables. Small patches of millet dotted the valley's fertile yellow soil, which was brought by floods and winds from the steppe. To prepare the millet, peasants lifted heavy pestles high above mortars and let them fall repeatedly until the inedible outer hulls were cracked. Beginning around the first century BCE, they used foot-trodden pestles to pound grain in a mortar buried in the ground, a less demanding method. When all the hulls were cracked, they tossed the grains in a basket, winnowing away the lighter hulls. Then they steamed the grains until they were light and fluffy in three-legged pots set over small fires, a method that conserved scarce fuel. Before dipping their fingers into the communal bowl, they offered a little to the gods and the ancestors. They accompanied the millet with bites of pickled vegetables, cabbage of various kinds, mallow, water-shield (an aquatic plant), or bamboo shoots, seasoned and preserved with costly salt. Sometimes, when they had trapped small wild animals, they had a bit of boiled or steamed meat, seasoned with Chinese chives, Chinese dates, or sour apricots."

By this point, other grains had been introduced to the region from the Fertile Crescent -- wheat and barley, collectively referred to as mai -- but were unpalatably tough and chewy when prepared like millet, and were usually eaten only in lean times, like the months before the new harvest, when last year's millet stock began to run low.

A more elaborate sacrificial feast:

"Servants set out mats of aromatic reeds, small stools to support the diners' elbows, and dishes of bronze, wood, bamboo, and pottery. Meat on the bone and grain went on the left of each setting, sliced meat, drinks, and syrups on the right, and around them minced and roast meats, onions, and drinks were arranged in a symmetrical pattern. After making an offering to the ancestors, the king and the nobles knelt to eat, each man's seniority and valor in battle determining where he knelt and what pieces of meat he was entitled to. The warriors took morsels of the drier dishes with their fingers: meats marinated in vinegar, fried, and served over millet or rice; jerky spiced with brown pepper; and jerky seasoned with ginger, cinnamon, and salt. They scooped up keng, a stew soured with vinegar or sour apricots (Prunus mume, the "plums" of plum sauce). They nibbled on small cubes of raw beef, cured in chiu, and served with pickles, vinegar, or the juice of sour apricots; on meatballs of rice and pork, mutton, or beef; and on the much-sought-after roasted, fat-wrapped dog's liver."

But up above, we mentioned roots too, not just grains. The tropical monsoon region begins a few hundred miles south of the Yellow River Valley, and included both a root cuisine and a rice cuisine, about which much less is known than the Yellow River Valley cuisine. "To begin with the root cuisine, taro, yam, and the cooking banana (the starchy, high-yielding fruit of Musa spp., as well as its root) were boiled or steamed, and most likely pounded to pastes that could be scooped up with the fingers. People on the oceanic side of New Guinea loaded outriggers with the basics of this culinary package and sailed east into the Pacific. To sustain themselves at sea, they stowed lightweight, long-lasting dried or fermented fish, breadfruit, and bananas for food. They filled gourds and bamboo sections with water, and drank the water inside coconuts. They packed slips, cuttings, young plants, and taro and yams in moist moss, then wrapped them in a covering such as leaves or bark cloth, tucked them into palm-leaf casings, and hung them out of reach of salt spray. Breeding pairs of pigs, chickens, and dogs, which, if worst came to worst, could be eaten on the way, were carried on board. Between 1400 and 900 BCE, they settled many of the South Pacific Islands."

Another sacrificial feast, in Mesopotamia (followed by some general detail):

"A sacrificial feast included sauces, sweets, and appetizers, hallmarks of high cuisine. Fried grasshoppers or locusts made tasty appetizers. Pickles and condiments concocted from seeds, sesame oil, vegetables, fruits, garlic, turnip, onion, nuts, and olives titillated the palate. Sauces were prepared from an onion-and-garlic flavoring base combined with a rich fatty broth thickened with breadcrumbs, the ancestors of sauces still served in the Middle East and even of present-day English bread sauce. Pomegranates, grapes, dates, and confections of milk, cheese, honey, and pistachios provided a sweet touch.

"Professional cooks labored in kitchens that were as large as three thousand square feet, much of the space devoted to making grain-based dishes, bread, and beer. From the coarse groats and fine flour provided by the grinders -- perhaps prisoners and convicts -- cooks prepared porridge, flatbreads, and slightly leavened breads, the latter in three hundred named varieties. Dough was shaped into the form of hearts, hands, and women's breasts, seasoned with spices, and filled with fruit, with the texture often softened by oil, milk, ale, or sweeteners. A flour-oil pastry was enlivened with dates, nuts, or spices such as cumin or coriander. Stuffed pastries were pressed into an oiled pottery mold with a design on the bottom before baking. Flatbreads were baked on the inside walls of large ceramic pots. There is some evidence that bulgur, an easy to cook food, was made by drying parboiled wheat.

"To feed the cities, barley was shipped along rivers and canals. Onions of various kinds, garlic, herbs such as rue, and fruits such as apples, pears, figs, pomegranates, and grapes came from the gardens of the wealthy. The animals were driven to the city, where they were slaughtered, the lambs and the kids going to the temples and noble houses, the male sheep and goats to the officials, royalty, and nobles, the tough ox and ewe meat to the army, and the carcasses of donkeys to the dogs, perhaps royal hunting dogs. Saltwater fish, turtles, and shellfish came from the salt marshes and the Persian Gulf. Dried fish, probably a specialized and regulated industry, came from the Persian Gulf and from as far away as Mohenjo-Daro on the Indus and the Arabian Sea. Salt, excavated from the mountains or evaporated from briny springs and brackish river water, was shipped to distribution centers and packed onto asses, probably in standard-sized, solid-footed goblets.

"Barley was wealth. It paid for the meat and cheeses. It paid for the lapis lazuli and carnelian dishes for the sacrifice, the gold and silver for jewelry, the boatloads of copper that came down the Euphrates or from Dilmun on the Persian Gulf, the metals from Oman and the Sinai, the granite and marble from Turkey and Persia, and the lumber from Lebanon used to build the temples.

"Nomads around the fringes of the irrigated and cultivated areas included the Hebrews, whose daily fare largely comprised barley pottages flavored with greens and herbs and flatbreads of barley and wheat, which they farmed in oases during the growing season or acquired by bartering their barren ewes and young rams. They made yogurt and fresh cheese from the milk of their flocks, which they ate accompanied by olive or sesame oil, honey, and grape must and date sweeteners (both of which were also called honey). To conserve their flocks, the source of their wealth, they enjoyed meat only on special occasions following the sacrifice of the 'fruit of the ground' (barley and wheat) and the 'firstlings of the flock' (lambs and kids) to Jehovah."

On various uses of grain:

"The ancient Romans built their empire on barley porridge. The Chinese enjoy rice porridge, the Indians rice and lentil porridge. Polenta (millet and later maize porridge) has sustained generations of Italian peasants. Similarly, grits and mushes were staples of the American colonies. Turkish families commemorate Noah's rescue from the flood with a porridge of mixed grains, fruit, and nuts. Left to sour or ferment slightly, boiled grain dishes became tangy, a flavor much appreciated in eastern Europe, for example.

"Bread -- baked flour and water paste -- was much more portable, but it needed more fuel. Early bread was nothing like our puffy square loaf. Because so much of the bran had to be sifted out to make white flour, white bread was reserved for the very rich until the nineteenth century. Most bread was dark and flat, made of one or more of the hard grains, such as barley, wheat, oats, and later rye, often with some mixture of beans and the starchier nuts, such as chestnuts or acorns.

"To run a city-state or provision an army, rulers had to make sure that grains were extracted from those who worked the land, then transported to cities and put in storage. Sometimes they demanded grain as tribute; sometimes they operated what were in effect agribusinesses farmed by slaves, serfs, or other barely free labor to produce grain; and later they exacted taxes to be paid in grain. Grains, more important, if less glamorous, than the precious metals, exotic wild animals, and beautiful slave girls that they also collected, were processed and redistributed to the ruler's household and bodyguard as pay in kind. Kings, emperors, landlords, and the great religious houses continued to collect grain long after money was invented."

More on the sheer labor of cooking:

"Before beginning to cook, women had to gather scraps of brush, seaweed, dung, furze -- anything that would burn. Steaming and boiling, which use the least fuel, were the commonest ways of cooking. A hot meal was often prepared only once a day, other meals being cold. Water for cooking, drinking, and washing, enough for one to five gallons a day per person (contemporary Americans use about seventy-two gallons a day), had to be carried from a river or well; three gallons weigh about twenty-four pounds. Salt was a luxury, reserved for making salty preserves that accompanied salt-free porridge or bread."

On blood, beliefs about which inform ancient meat cuisines, and sacrifice:

"Blood congealed into flesh, according to the Chinese, the Hebrews, and the Greeks. It was what food finally turned into in animals, said Aristotle. Consequently few societies were neutral about blood as food: some valued it highly, others prohibited it. In the first group were nomads who harvested blood from their animals, Christians who drained the blood of carcasses and used it to make sausages or thicken sauces, and the Chinese. Even today many Hong Kong Chinese mothers feed their children blood soup to sharpen their minds before examination. In the second group were Jews and Muslims, who slaughtered animals so as to drain all blood from the body.

"The sacrifice was followed by the sacrificial feast -- humans eating the gods' leftovers, which were charged with divine power. This might mean eating the flesh of sacrificed humans, a practice motivated not by hunger but by the logic of sharing the gods' leftovers. At least some northern Europeans ate the brains of the sacrificed in the third millennium BCE. The Cocoma people of Brazil, when admonished by the Jesuits for eating their dead and drinking an alcohol laced with ground bones, reportedly said that it 'was better to be inside a friend than to be swallowed by the cold earth.' The Aztecs ate slivers of flesh from those who had been sacrificed on the pyramids. More commonly, however, the feast featured roast meat from sacrificed animals."

Eating human flesh, whether or not in the context of sacrifice, is one of those topics that's subject to a lot of controversy and misinformation. Depictions of the Aztecs as bloodthirsty cannibals, for instance, were obviously pulpy nonsense cooked up much later, but a rejection of that depiction led to a widespread rejection of the notion of any Aztec cannibalism, which is also -- from what I understand, though Mesoamerican history is not at all my area -- false. Cannibalism in times of desperation is obviously widespread in the sense that you find it in any culture, in any time or part of the world, where there is such desperation and famine. Sacrificial or ritual cannibalism is sort of a different thing, though of course some historians and anthropologists theorize that cultures that resorted to desperation-induced cannibalism frequently enough simply developed rituals around it.

Which brings us to theories of other food rituals and food rules, the best known of which in the Western world are the Jewish dietary restrictions:

"Jewish culinary rules were laid out in Leviticus and other books of the Old Testament. Blood, animals with cloven hooves unless they chewed their cud, pork, water creatures lacking both fins and scales, and (echoing Persian practice) insects were all forbidden as foods. So was cooking meat in milk and dining with non-Jews. Temple priests followed rules of purification before sacrifice, slaughtered animals so that the lifeblood drained out, and refrained from offering impure fermented (corrupted) foods.

"In the mid-twentieth century, scholars offered opposing interpretations of Jewish food rules, particularly the ban on pork. Marvin Harris argued that they were health measures to prevent trichinosis." The Harris theory was still widely disseminated when I was in grad school, incidentally, and vaguely familiar to a lot of people outside the field, even though it isn't very good (or perhaps because it doesn't require much information). "Mary Douglas and Jean Soler contended that they were designed to create a distinct Jewish identity. The latter interpretation squares better with the simultaneous proliferation of culinary rules in the Persian Empire and the Indian states. With limited culinary resources, identity is most easily established by banning certain foodstuffs, cooking methods, and ways of dining. Pigs, being difficult to herd, were not popular with peoples of nomadic origin, so the force of the rule probably only became fully felt centuries later when Jews became a minority in pork-eating Roman or Christian lands."

Speaking of Rome:

"Every morning and evening, Roman infantrymen prepared meals like those they would have eaten at home on the farm. They boiled wheat to make wheat porridge or wheat pottage (wheat cooked with dried peas, beans, or lentils, a bit of oil, salt, and a little salt pork), which they dipped into with wooden spoons. Or they mixed whole-wheat flour, water, and salt and baked coarse whole-wheat bread, probably in the ashes of the campfire, to eat with a bit of cheese. In the morning, these foot soldiers ate standing up like animals outside their goatskin tents. In the evening, they ate seated on the ground in the tents like children or slaves. They drank water cut with wine or vinegar. Sacrifices on festival days, before they went into battle, and to celebrate victory, added a treat of boiled or roast beef. On the move or near the enemy, biscuit -- twice cooked bread that lasted a long time -- made an instant meal."

Making that porridge or pottage required soldiers to grind grain, for which pack mules carried grindstones. "One of the soldiers assembled the grindstone, placing first a skin or cloth on the ground to catch the flour, then the squat lower grooved cylindrical stone, then the top stone, which rotated over the lower one. He squatted like a woman or slave over the grindstone. With one hand he rotated the upper stone using a peg near the circumference as a handle; with the other he poured handfuls of grain into a hole in the upper stone. The grain dribbled onto the lower stone and was sheared by the movement of the upper. The flour moved toward the circumference along grooves cut in the lower stone. He could grind enough meal for an eight-man squad in about an hour and a half with this rotary grinder, compared to at least four or five hours had he used a simple grindstone.

"Adopting the rotary grindstone involved a series of tradeoffs. It ground faster. The weight of the upper stone, not the weight of the grinder, did the shearing, making the work less exhausting. On the other hand, the rotary grindstone was heavier, more expensive, and more difficult to make than a simple grindstone. Nor could it produce the fine gradations of flour that the simple grindstone could deliver. ... If every squad of eight men required a mill and if at its height, the army comprised half a million men, then some sixty thousand grindstones were lugged over the Roman roads. A millennium and a half was to pass before any other European army was as well fed."

Roman feasts during the Empire:

"The dinner included appetizers, sauced dishes, and desserts, all spurned by republicans. For appetizers, diners might have lettuce (perhaps served with an oil and vinegar dressing), sliced leeks (boiled, sliced in rounds, and dressed with oil, garum [like fish sauce], and wine), tuna garnished with eggs on rue leaves, eggs baked in the embers, fresh cheese with herbs, and olives with honeyed wine.

"For the main course, slaves brought in dishes such as red mullet roasted and served with a pine nut sauce; mussels cooked with wine, garum, and herbs; sow's udder, boiled until soft and then grilled and served with sauce; chicken with a stuffing of ground pork, boiled wheat, herbs, and eggs; and crane with turnips in an herb-flavored vinegar sauce. Exotic fare, such as a pea dish, a chicken stew, and baked lamb with a sweet-and-sour sauce, attributed to Persia, added a cosmopolitan touch.

"Typically, sauces were made by pulverizing hard spices, usually pepper or cumin, but also anise, caraway, celery seed, cinnamon, coriander, cardamom, cassia, dill, mustard, poppy, and sesame, in a mortar. Nuts, such as almonds, filberts, and pine nuts, or fruits, such as dates, raisins, and plums, were added and the mass was worked to a paste. To this mixture, fresh herbs such as basil, bay, capers, garlic, fennel, ginger, juniper, lovage, mint, onion, parsley, rosemary, rue, saffron, savory, shallot, thyme, or turmeric were added, followed by garum and perhaps wine, must, honey, olive oil, or milk. The mixture was warmed to blend the tastes and sometimes thickened with wheat starch, eggs, rice, or crumbled pastry."

Outside of the Roman Empire, grain processing was as much as four times more labor-intensive, and in many parts of the world, unleavened bread (and steamed and boiled doughs in the form of pasta and dumplings, as in China) continued to be the norm. Eventually some cultures caught up to the Romans' efficiency, but beyond that, "There was to be little change in grain processing until the Industrial Revolution, and little change in the final cooking of grains until the twentieth century."

"To supplement the staple grain, oil seeds and olives were crushed in a variety of mills and mortars and pressed in a variety of presses. Sweeteners continued to be produced by many different methods -- sprouting grains (malt sugar in China), boiling down sap (palm sugar in India), boiling down fruit juices (grape and other fruit juices in the Middle East), and taking honeycombs from hives (honey in the Roman Empire). Alcoholic and lactic fermentations in the western half of Eurasia and mold ferments in the eastern half were used to make staple dishes (raised bread) and alcoholic drinks (wine, beer, and chiu) as well as to preserve foods (cheese and sausage in the Roman Empire, milk in the Middle East) and create condiments (fermented beans in China). Autolysis (self-digestion) produced garum in the Mediterranean and probably fish sauce in Southeast Asia."

An important point I brought up earlier about food processing comes up again as we move through the next millennium and a half:

"Cooking was a form of alchemy, the most sophisticated understanding of changes in matter then available. Cooking and alchemy used the same tools and equipment. Both sought to find the real nature or essence of a natural substance by applying the purifying power of fire. Just as a crude ore had to be refined in the fire to release the pure shining metal, so raw wheat or sugarcane had to be similarly refined to extract the pure white flour (originally "flower") or gleaming sugar. Culinary processes such as sugar refining and bread baking were thus potent metaphors for spiritual progress. Unlike our contemporary understanding of natural food as having received only minimal processing, this earlier understanding was that processing and cooking were essential to reveal what was natural."

This all reflects one of the major principles of ancient culinary philosophy: the theory of the culinary cosmos, which led to the practice of eating only cooked food (even most fruits were not often eaten uncooked in Europe, for instance) and, for those who could afford the options, of eating food that "balances the temperament," an idea that trickles down today into a lot of horseshit diets.

This balancing the temperament stuff was grounded in the idea of the "humors," or maybe it's better to think of them as both coming from the same worldview, and it informed the view of what we would now term "healthy eating." "In preparing food for their noble employers, cooks were as aware of the need to balance the humors as we are today of, say, the need to have all food groups represented. Root vegetables such as turnips were by nature earthy (dry and cold) and thus better left to peasants. Chard, onions, and fish were cold and wet, so that frying was appropriate. Mushrooms were so cold and wet that they were best avoided entirely. Melons and other fresh fruit were not much better, being very moist and thus thought likely to putrefy in the stomach. Grapes were best served dried as raisins, quinces were dried and cooked with extra sugar -- warm in humoral theory -- to make quince paste. Red wine tended to be cold and dry, so it was best served warm with added sugar and spices."

Another major principle was the hierarchical principle, which broadly called for eating according to your station in life -- a high cuisine for the court, a humble cuisine for the poor. Roughly in this time period, many parts of the world extended that hierarchy to include a higher cuisine for holy men and intellectuals than for the unenlightened, rather than basing it only on overt political power.

The third ancient culinary principle was that of sacrifice, which had largely been phased out in the Axial Age and replaced with new religious rules for eating: "these rules identified preferred ingredients and dishes, often ones believed to enhance contemplation, such as meat substitutes (fish, tofu, gluten), sweetened soft fruit and nut drinks, or stimulating drinks such as tea, coffee, and chocolate. They specified how to process and cook foods, including guidelines for slaughtering, and laid down rules about how cooks should purify themselves, whether fermented foods were acceptable, and which foods could and could not be combined. A third cluster of rules specified mealtimes, days of fasting and feasting, and who could dine with whom.

"The rules, stricter for religious elites than for ordinary believers, were formulated and reformulated for centuries because the founders of the religions, although they relied on culinary metaphors to explain beliefs and doctrines, rarely laid down clear and consistent regulations for cooking and eating. Christians, for example, were not required to fast until the fourth or fifth century. Then they were instructed to fast on about half the days of the year. Today, in the Roman Catholic Church, fasting has been reduced to a minimum."

"Even more important in the dissemination of the new cuisines were monasteries, shorthand for permanent religious houses. Like courts, they were places where all ranks of society met, from clerics to their servants and slaves. Like court kitchens, monastery kitchens were huge and complex, turning out different meals for different ranks: noble and aristocratic visitors; passing merchants, monks, and nuns; the poor and indigent; the sick; and students studying in the monastery school. ... Like courts, they invested in food-processing equipment like gristmills, oil presses, and sugar mills, processing and adding value to foodstuffs. These they sold or offered as gifts, thereby creating loyalty. Like courts, monasteries were part of networks that crossed state boundaries, in this case by the movement of religious orders and missionaries rather than marriage."

"As theocratic cuisines spread, so did their preferred raw materials: plants and sometimes animals. Particularly important were the transfers of southeastern and Chinese plants to Buddhist India, Indian plants to Buddhist China, Chinese plants to Korea and Japan, Indian plants to Islamic lands, and European plants to the Americas through the Columbian Exchange. Royal and monastic gardens and large estates transplanted, ennobled, and grew sugarcane, rice, grapevines, tea, coffee, and other crops essential to the new cuisines."

Here we come to one of my favorite topics in culinary history:

"Whereas culinary diffusion prior to world religions had primarily meant emulating or rejecting neighboring high cuisines, with world religions the relation between successive cuisines became more complex. 'Fusion,' the term so often used, does not do justice to the variety of interactions. One cuisine could be layered over another, as happened with the Spanish conquests in the Americas, the conquerors eating Catholic cuisine, the indigenous retaining their own cuisine. Specific dishes, techniques, plants, and animals might be adopted, as Europeans, for example, adopted distilling, confectionary, and citrus from Islam."

Oh, if I had a nickel for every dipshit going on about how some dish or approach isn't "authentic."

There is no authentic cuisine. All cuisines are in flux and ever have been. Lots of people carry around a sense of normalcy based on a sphere that extends for a couple hundred miles and a couple dozen years, and think that sense of normalcy reflects something real, something other than their memory of food they've experienced. That's not how it works. Italian food didn't suddenly become Italian food when tomatoes arrived on the boot, or when immigrants in the northeast US started making meatballs. And putting tomato sauce on that pasta for the first time, making those first giant meals of spaghetti and meatballs, didn't invalidate those meals either.

Nobody worried about this bullshit when they actually fucking cooked. It's the hobbyhorse of the dilettante.

Meanwhile! In the Mongol Empire!

"Twenty-seven soups dominate the ninety-five food recipes [in Hu's Proper and Essential Things]. The centerpiece of Mongol cuisine, these soups could be quite liquid or thickened to become solid. The basic recipe went as follows:

"1: Chop meat on the bone (usually mutton, but also game such as curlew, swan, wolf, snow leopard) into pieces. Boil in a cauldron of water until tender. Strain the broth and cut up the meat.

"2: Boil the broth with a variety of thickeners, vegetables, and tsaoko cardamom.

"3: Add the meat.

"4: Season to taste with salt, coriander, and onions.

"For a traditional Mongol taste, the thickeners might be chickpeas, hulled barley, or barley meal. To give the soup a Persian touch, it was thickened with aromatic rice or chickpeas, seasoned with cinnamon, fenugreek seeds, saffron, turmeric, asafetida, attar of roses, or black pepper, and finished with a touch of wine vinegar. For a Chinese taste, it was thickened with wheat-flour dumplings and glutinous rice powder or rice-flour noodles, and flavorings of ginger, orange peel, soybean sauce, and bean paste. In this way, the soup of the khans could be adjusted to the preferences of the peoples they had conquered."

Authenticity my ass.

This is basically how pizza adapts to local culinary niches today, and of course what McDonald's does internationally.

Now coffee enters the scene, thank God:

"Coffee, like wine, was an aid to union with the divine. Long before the time of the Sufis, coffee beans, the fruit of a bush native to the highland forests of southwestern Ethiopia, had been chewed like a nut or mixed with animal fat to make a portable, satisfying, and stimulating food for warriors." If you haven't seen the way coffee grows, the bean is just the seed, and of course has a softer fruit surrounding it (which is also lightly caffeinated, and is sometimes used now in some coffee-growing regions to make a vaguely hibiscus-like drink). "Coffee plants were naturalized in Yemen perhaps as early as the sixth century CE when the Abyssinians invaded Arabia. Later, a new way of preparing coffee by toasting the beans, grinding them, and brewing them with hot water was developed, perhaps in Iran. The Arabic word for coffee, qahwah, probably derives from a word meaning to have little appetite and hence to be able to do without. It had been first applied to wine and later to coffee (which suppressed the desire to sleep). Sufi pilgrims, traders, students, and travelers consumed coffee to keep awake during ceremonies and induce a sense of euphoria, spreading its use throughout the Islamic world between the thirteenth and fifteenth centuries."

Islam introduced coffee to the West, as with so many things, and that's not all! They also introduced stuff to have with coffee.

"Sugar cookery was introduced from Islam in the twelfth century by a physician known as Pseudo-Messue. The English words syrup, sherbet, and candy all have Arabic roots. Medicinal electuaries, pastes of spices and drugs, and comfits, sugar-coated spices, were the distant forerunners of candy. Sugared spices did not break the fast, Thomas Aquinas said, because 'although they are nutritious themselves, sugared spices are nonetheless not eaten with the end in mind of nourishment, but rather for ease in digestion.' It was an important decision, both because it gave medical respectability to sugar and because it foreshadowed later arguments about chocolate."

Arab fruit pastes became Portuguese quince marmelada, later inspiring the citrus marmalades that are more familiar to Americans, and the sweet fried doughs used to celebrate the end of Ramadan inspired similar fried doughs in Catholic traditions, eaten before the Lenten fast: doughnuts, beignets, etc. (The Brits have their pancakes.)

Along with all this came distillation and better booze. Not too shabby.

"In the early fourteenth century, cookbook manuscripts began appearing across Europe.... Rarely were these cookbooks step-by-step manuals, being, rather, testimonials to a ruler's fine cuisine or aide-memoires to professional cooks. With the invention of printing, the number increased again."

Medieval history is not at all my area of expertise, but this broadly fits my understanding of the ... history of professionalization, sort of, the history of procedural rigor, if you will.

The dissemination of cookbooks further contributed to the Westernization of Islamic dishes in Europe, in much the same way that nineteenth and twentieth century cookbooks Americanized immigrant and foreign cuisines:

"Al-sikbaj (meat cooked in a mixture of sweetener and vinegar) was transformed into fried or poached fish (or chicken, rabbit, or pork) in an acid marinade of vinegar or orange (escabeche), perhaps the origin of aspic." Al-sikbaj was a characteristic dish of the Moors who conquered Spain, but has since died out in the Muslim world. "Ruperto de Nola's Libre del coch included thin noodles, bitter oranges, fried fish, escabeche, almond sauces, and almond confections. Martinez Motino's Arte de cocina contained several recipes for meatballs and capirotada, and one for couscous. It also had one for Moorish hen, roast chicken cut into pieces, simmered with bacon, onion, broth, wine, and spices -- which were not named, but probably included pepper, cinnamon, and cloves -- and then enlivened with a final dash of vinegar. The bacon and wine were typically Christian, but the sour-spicy sauce justifies the name."

So here's the other thing about sugar: it used to be in fucking everything. The line between "sweet" and "savory" isn't just a recent thing, it's the defining characteristic of the modern palate. Candies and confectionery used to include not just candied oranges and cherries but candied carrots and turnips. Meat dishes in high cuisines were regularly served in sweet sauces -- no, not like at that Chinese place; no, not like barbecue sauce; like really noticeably sweet, not tangy.

Then that changed.

If you have to pick a point where things start to change, it's 1651, when François Pierre de La Varenne published Le Cuisinier françois, which was widely translated and inspired numerous imitators. The middle of the seventeenth century saw a significant shift in Western tastes characterized by two changes: "the disappearance of spices and sugar from savory dishes [notice how rarely we use 'baking' spices like cinnamon, clove, nutmeg, etc., in savory dishes, whereas they are still common in Middle Eastern, North African, and Central Asian cuisines] and the appearance of new fat-based sauces, many thickened with flour."

The traditional Catholic cuisine was displaced piecemeal across Europe. In England, "the aristocracy dined on the new French cuisine. The gentry, by contrast, rejected this in favor of a middling bread-and-beef cuisine optimistically described as the national cuisine." Across most of western Europe, sweet and sour were segregated into different dishes and usually different courses, while beef and bread became higher profile, as did dairy and sauces built on fat and flour. French cuisine informed other European cuisines while at the same time absorbing and reinterpreting elements of them, a process that continued for the next couple of centuries.

One of the major innovations of the time period was "middling cuisines," a prerequisite to modern cuisine: "Middling in the sense of bridging high and low cuisine, rich in fats, sugar, and exotic foodstuffs, featuring sauces and sweets, and eaten with specialized utensils in dedicated dining areas, middling cuisine became available to an increasing proportion of the population in the following centuries. Changes in political and nutritional theory underwrote this closing of the gap between high and humble cuisines. As more nations followed the Dutch and British in locating the source of rulers' legitimacy not in hereditary or divine rights but in some form of consent or expression of the will of the people, it became increasingly difficult to deny to all citizens the right to eat the same kind of food. In the West, the appearance of middling cuisines ran in close parallel with the extension of the vote. Reinforcing this, nutritional theory abandoned the idea that cuisine determined and reflected rank in society in favor of a single cuisine appropriate for every class of people.

"The growth of middling cuisines is what nutritionists call the 'nutrition transition,' the sequential global shift from diets composed largely of grains to diets high in sugar, oils, and meat ... the nutrition transition increases food security [but] brings in its wake many associated health problems, including increased incidence of strokes, heart attacks, obesity, and diabetes, and with them increased costs for society." (Of course, poverty and malnutrition have also decreased, so there's that.)

These middling cuisines began before the Industrial Revolution, but that was a huge driver in really bringing all these trends together and forming what we would recognize as modern cuisine. The advances of the Industrial Revolution brought about more efficient and cheaper forms of food preservation, refrigeration and rapid transportation of fresh food, extraordinary advances in agriculture (among them new fertilizers and pesticides), and so on, transforming the quality, quantity, and price of food more dramatically than any development had since the mastery of grain cookery thousands of years earlier. Those advances in transportation also made more feasible the waves of immigration that repopulated the United States after Native American tribes were decimated, and the arrival of many, many different immigrant groups, all with their own cuisines -- but not always with access to ingredients from home, and sometimes finding it easier to adapt what was available -- contributed to what is erroneously called the "melting pot," an American cuisine that was and I think remains in flux. Americans were also the first to begin using ice in their drinks -- and in a million other ways -- to such a great extent, and pioneered the commercial ice business.

The influence of French cuisine on modern cooking remained strong, and in the nineteenth and twentieth centuries, numerous dishes in non-French cuisines were created or altered with distinctive French touches -- substituting butter for oil, reducing the spices and herbs in Greek dishes, dressing cold cooked vegetables or meats with mayonnaise and raw vegetables with vinaigrette. Bechamel -- originally Italian! but popularized by La Varenne -- showed up everywhere, with Russians using it as a piroshki filling with mushrooms or to thicken soup, Mexican chefs using it to dress fish, and Indian chefs using it to dress eggplant and pumpkin. Beef Stroganov, unsurprisingly, is one of the most famous Russian dishes attempting to emulate French cooking, while bechamel was repopularized in northern Italy and found its way into lasagna.

A middling cuisine means, by extension, that pretty much everyone eats pretty much the same thing, at least in the broad strokes. Inevitably that means the specifics invite criticism. "Religious groups, conservatives, socialists, and feminists attacked modern middling cuisines. Some wanted the egalitarianism of modern culinary philosophy but rejected other aspects. For example, many reformers turned their backs on meat, white bread, and alcohol, developing alternative physiologies and nutritional theories to explain why vegetarianism [a term coined in the 1840s] or whole grains were superior. Others attacked domesticity, liberalism, and free trade, proposing alternative ways of organizing modern cooking, commerce, and farming. Yet others hoped it might be possible to return to a [purely] imagined egalitarian past, invoking agrarian and romantic traditions to criticize modern, industrialized cuisines."

One key to remember with the historical development of these things, and when encountering new such things in the wild, is, you know, the rejection tends to come first, with the rationale developed shortly thereafter. By the time you hear about it, that may not be clear, because once the rationale is developed, it's all "so I was doing research on Youtube and I discovered, holy fuck, bananas cause butt cancer," but really it's just that this one guy didn't fucking like bananas, or the idea of bananas, or he really liked the idea of conspicuously avoiding consumption of something, and later he came up with the butt cancer thing.

Okay! That brings us close enough to the present day to wrap up there. One more book down.

Friday, November 11, 2016

the invention of science

I am behind, and in particular I have a backlog of Kindle books in a folder (okay, Kindle calls them "collections") that I keep specifically for Kindle books I've read but haven't yet transcribed notes from -- some for work, some for here, some for fiction projects.

On top of that, I had an ongoing series of thoughts about the election building up, mostly in the form of "hoo geez" and "you gotta be kiddin me" thought balloons popping up in response to things other people said as the election first loomed and then happened. One of the purposes of this weirdly multipurpose blog is to be a safety valve for social media, so that I vent here instead of there, so I was going to rail about people who ask their Trump-supporting friends to please stop saying their nasty things in front of them (instead of, you know, actually confronting those friends on their racism, xenophobia, and other abominable beliefs - today's "stop telling racist jokes in front of me, teehee"), and about the "Bernie coulda won it, I tells ya" narrative, and the prematurity of all the other hot takes.

But I think I am, as you must be, too weary of reading postmortem analysis and reactions. Which is not to say I am not engaged - just the opposite, but the last thing I feel like doing right now is swatting down nonsense just to vent about it, because it doesn't feel like it would serve any therapeutic point this time.

I will say that my instinct as a historian says that the dumbest thing you can do is to marry yourself to some model of "what happened" that you read or devise in the first week or two after the election, because you're just going to weigh new data and new models against the one that you've "picked," even though the only reason you've picked it is, ultimately, because of the imagined need for an immediate explanation -- the irrational belief that an incomplete or misleading explanation today is better than an accurate explanation tomorrow. This is a small part of what makes teaching history to people so hard. Among other things, history sometimes includes things they have lived through, and that gives them a remarkable capacity to believe that living as a bystander to an event with very, very small access to data makes them an expert, resistant to the view of the event that has developed in retrospect. This is, of course, a bigger problem than just in the realm of getting people to understand history, but everybody walks down different hallways of the house, and this is one of mine.

Moving on away from the present, away from politics, let me start by finishing off this blog entry on a history of the scientific revolution.

When I started at Hampshire - before I started, actually, at the open house or the pitch at the interview or somewhere in that process - one of the ways they explained the whole approach to things there was by explaining the Div III, which is a mandatory thesis-like project. "See, because there are no majors at Hampshire, and no minors, you can take the things you're interested in and pursue them jointly instead of separately. You don't have to major in psychology and minor in art history. You can do your Div III on the history of the impact of psychological ideas on the visual arts."

I made up that example, but it's a representative one. It's a decent pitch, especially aimed at high school kids (and the parents thereof) who have probably been starved for any kind of serious intellectual or creative stimulus, at least within the bounds of their classrooms.

In the end, though, I had trouble living it out. I wanted to study psychology AND art history. Not just their intersection. I wanted to know about the parts of psychology that had absolutely no impact on art history, and the parts of art history that had nothing to do with psychology. The freedom to cross the streams was well and good, but I quickly took it for granted and wanted to be able to not have to cross the streams.

For instance, I wound up "majoring" - not that Hampshire has majors - in pop culture, but had strong interests in cognitive science and the then-novel study of online culture and communities. What I had zero interest in was combining any of those things for my Div III, which was -- at least in draft form, since I dropped out and transferred before completing it -- a long and rambling thesis on representations of superheroes in comic books and other media, which if it was grounded in anything was grounded more in literary theory and gender studies than cognitive science (though I led up to it with a fifty-page research paper on the role of nostalgia in the history and appeal of Batman, which is at least cogsci-adjacent), and had no overlap with the work I had done on online communities.

So like I've covered before, I have a variegated academic background, and as a grad student in an interdisciplinary program, I took a number of courses on the history and philosophy of science -- if there was such a thing as a "graduate minor," that would pretty much be mine, not because HPS and religious studies are two key branches in the history of ideas -- which would be a valid reason -- but simply because HPS also interested me, even though at that point in my academic career I was supposed to be narrowing my interests, not expanding them.

So it's something I try to keep up with, albeit not at as high a priority as religious studies.

David Wootton's The Invention of Science: A New History of the Scientific Revolution was published with a good deal of fanfare last fall. I'm generally reluctant to jump on the New Release section, for a number of reasons:

1: New releases on scholarly topics are, when they're good work, ultimately of the most value to working scholars in those areas, because those are the readers who are versed in the conversation. After all, if they have something significant to say, other scholars working in that area are going to need to respond to it, and won't have had time yet.

2: New releases on broad historical topics don't always have a particularly good reason to exist, apart from everyone in the field already having read the canonical works on the topic, and those works maybe being out of print. The existence of a new book on the topic is not evidence of the existence of new material. This is a constant source of frustration for me when I write about history, because publishers often stipulate in their style guide that at least half of your sources need to have been published within the last X years, where X is a fairly small number; it is very rarely the case with history that half of the good sources, or a tenth of the good sources, are that recent, especially when you're writing for a general audience and have no need to cite recent journal articles on minor points. (If you're wondering: yes, as a result I have to pad the bibliography with inferior recent work, or recent work that is good but not as directly related to what I'm writing about, in order to balance out the number of older but necessary sources I use.)

3: There is a ... boy, how do I not sound like an asshole here. There is a certain kind of reader I don't want to be. A certain kind of thinker I don't want to be, who's read Malcolm Gladwell and Jared Diamond, both of whom are moderately to completely awful, but nothing that's more than a couple decades old. That's the kind of reading diet that leads to a particularly shallow understanding of things -- but like Gladwell's work in general, to pick on him a little more, it's not a diet that's really designed for actual understanding so much as the satisfied feeling of the illusion of understanding, a junk food eureka.

I made an exception here because the reviews were enough to convince me that #2 and #3 weren't concerns, but that sort of puts a responsibility on me -- even if I'm the only one who perceives or cares about that responsibility -- to keep track of the Invention of Science conversation in the next few years, so I don't make the same mistake I discussed above in my discussion of politics.

Wootton's book traces what he calls the invention of modern science, "between 1572, when Tycho Brahe saw a nova, or new star, and 1704, when Newton published his Opticks... There were systems of knowledge we call 'sciences' before 1572, but the only one which functioned remotely like a modern science, in that it had sophisticated theories based on a substantial body of evidence and could make reliable predictions, was astronomy, and it was astronomy that was transformed in the years after 1572 into the first true science."

This idea, that science before this period was distinct from modern science, is key, and is part of a broader shift in thinking that affected not just the physical sciences but all scholarly pursuits. For that matter, it's not a coincidence that westerners don't really talk about "fiction" as such until the Scientific Revolution, when "nonfiction" becomes rigorously defined. I have banged my head against the wall repeatedly trying to get people to understand this. Obviously I'm not saying that no one wrote any fictitious stories before a certain point in time -- but the modern reader, who thinks of one set of shelves in the library as nonfiction, and another as fiction, is a fairly recent creature. When people describe the Bible or stories in the midrash, for instance, as "fiction," they're imposing a modern frame that didn't exist for the people who created and first received those texts. This isn't splitting hairs, because it's just as important to oppose the fundamentalists who insist that, if the Bible isn't fiction, it is therefore literally true. Again: it's not a coincidence that this claim became as popular when it did, and became politicized, at the point in time that it did. Both of these claims -- that the Bible is fiction, that the Bible is literally true -- are implicitly based on false premises about the nature of sacred texts in antiquity.

To use another example, history as we know it -- that is, the field of history as we know it -- is remarkably recent, in the sense that the idea that the goal of the historian should be to accurately record or recount the details of historical events, drawing on evidence wherever possible, dates to about the Enlightenment, at least in the West. This is kind of fucking crazy, and I feel like people who haven't taken historiography in grad school don't fully believe me when I talk about it, but it goes to Wootton's point. We take scientific thinking for granted now, to an amazing degree -- when you read fantasy novels and whatnot, a ridiculous level of scientific thinking is often ascribed to members of civilizations who would not necessarily be in a position to have developed it, for no apparent reason other than the fact that basic elements of this thinking have so permeated modern thought that it is taken for granted.

The layman often thinks of the history of science as a series of discoveries, rather than creations of different ways of thinking -- and doesn't usually have a good way to explain that most of the scientists famous now for major contributions also pursued wrong avenues (like Newton's extensive work in alchemy) or had no trouble accepting things that would be easily disproven (old beliefs in biology are full of this, and I don't just mean beliefs about race or gender that are motivated by politics and power structure -- there was a simple lack of rigor and attention to detail, by modern standards). If you think science is just a timeline of discoveries, you think the only difference between a clever person in 2016 and a clever person in 1316 is that the clever person in 1316 lives in a world where a bunch of shit hasn't been discovered yet, but that the two basically see their respective worlds the same way and deal with new information the same way, and this is enormously wrong. Even our ability to realize this is not an ability available to all the clever people in history.

One of the ideas that is ingrained in the modern mind that Wootton tackles straight off the bat is the idea of scientific progress. People may debate whether or not history itself "progresses" -- after all, everyone bitches and moans about trivial horseshit like "oh, the kids should still learn cursive" because they're attached to the first Polaroid of the world they watched come into focus -- but everyone today basically sees science as constantly moving forward.

This is a very new idea.

As Wootton points out, until the period he's talking about, not only did people not conceive of "the history of humanity ... as a history of progress," but the rate of technological advancement wasn't just slow, it sometimes went backwards. "The Romans were amazed by stories of what Archimedes had been able to do; and fifteenth-century Italian architects explored the ruined buildings of ancient Rome convinced that they were studying a far more advanced civilization than their own." Technologies were developed, lost, forgotten. This is inconceivable now -- so much so that the "lost technology" trope, when it pops up in popular culture, refers not to sophisticated architecture but to ancient electronics, pyramid magic, and other modern or futuristic technologies transposed to an ancient setting.

Wootton also points to the way Shakespeare and his contemporaries depict ancient Rome as technologically identical to Renaissance Europe, with mechanical clocks and nautical compasses. In Borges's words, "all characters are treated as if they were Shakespeare's contemporaries. Shakespeare felt the variety of men, but not the variety of historical eras. History did not exist for him." As Wootton points out, this is a misleading charge -- Shakespeare was well versed in history as history was understood in his day. What he lacked was an understanding of historical change of the sort that we now treat as synonymous with "history."

"We might think that gunpowder, the printing press, and the discovery of America in 1492 should have obliged the Renaissance to acquire a sense of the past as lost and gone for ever, but the educated only slowly became aware of the irreversible consequences that flowed from these crucial innovations. It was only with hindsight that they came to symbolize a new era; and it was the Scientific Revolution itself which was chiefly responsible for the Enlightenment's conviction that progress had become unstoppable. By the middle of the eighteenth century, Shakespeare's sense of time had been replaced by our own."

The term "the Scientific Revolution" is itself one that can be interrogated, and Wootton explains that it's only in the 20th century -- the mid-20th century, at that -- that the term came to mean the creation of modern science as exemplified by Newton's physics. The inspiration for the term is not the American or French Revolution -- both of which were referred to as revolutions as they occurred -- but the Industrial Revolution. Like the Scientific Revolution, the Industrial Revolution was named after the fact -- and although it occurred later, it was named first. Wootton correctly points out that any term introduced by historians after the fact is a term that will be challenged by later historians after THAT fact, which is the kind of observation that makes your eyes glaze over and has you unsubscribing from mailing lists. But anyway.

"In medieval universities, the core curriculum consisted of the seven liberal 'arts' and 'sciences': grammar, rhetoric, and logic; arithmetic, geometry, music, and astronomy." There is some digression about what was meant by "art" and "science" at the time, the upshot of which is that all seven were considered both, whereas philosophy and theology were sciences but not arts. Anyway. "Moreover, these sciences were organized into a hierarchy: the theologians felt entitled to order the philosophers to demonstrate the rationality of belief in an immortal soul; the philosophers felt entitled to order the mathematicians to prove that all motion in the heavens is circular ... A basic description of the Scientific Revolution is to say that it represented a successful rebellion by the mathematicians against the authority of the philosophers, and of both against the authority of the theologians."

Da Vinci, for instance, said, "No human investigation can be termed true science if it is not capable of mathematical demonstration. If you say that the sciences which begin and end in the mind are true, that is not to be conceded, but is denied for many reasons, and chiefly the fact that the test of experience is absent from the exercises of the mind, and without it nothing can be certain." The reason we no longer class philosophy and theology as sciences is because we think of science as dealing with not just theory but experiment: testable, verifiable, observable, repeatable results. To return to the refrain: this is a relatively new idea.

One of the most interesting parts of Wootton's book is something I'm still pondering: "before Columbus discovered America in 1492, there was no clear-cut and well-established idea of discovery; the idea of discovery is, as will become apparent, a precondition for the invention of science."

Now, maybe I don't need to point this out, but the idea of "Columbus discovering America" is not important to Wootton's claim here: that is, it is not important that other non-Americans came to the continent before he did, nor that, since there were people fucking living here, none of these non-Americans actually discovered the damn thing. What's key is the European world absorbing the idea of Columbus discovering America and subsequently engaging with America: the phenomenon, so to speak, of "Columbus discovered America." There are other books on the physical and cultural effects of Columbus's voyages; this isn't that. This is about the scientific and intellectual reverberation of the concept of "discovery."

(Wootton also argues that, while everyone correctly dismisses the "Columbus proved the Earth was round" nonsense sometimes taught in elementary schools, the voyages to America did nevertheless change the European conception of the globe, by proving the existence of antipodean land masses, which were believed to be impossible. The Columbian contact did change the understanding of the Earth, then, just not in as simplistic a way as changing it from flat to round.)

"It is discovery itself which has transformed our world," Wootton points out, "in a way that simply locating a new land mass could never do. Before discovery history was assumed to repeat itself and tradition to provide a reliable guide to the future; and the greatest achievements of civilization were believed to lie not in the present or the future but in the past, in ancient Greece and classical Rome. It is easy to say that our world has been made by science or by technology, but scientific and technological progress depend on a pre-existing assumption, the assumption that there are discoveries to be made. ... It is this assumption which has transformed the world, for it has made modern science and technology possible."

Wootton backs up his proposition of the 1492 (essentially) invention of the idea of discovery with linguistic evidence, investigating the various Romance language terms for discovery and related concepts, and how they were used. This is the area where I expect to see other historians responding, refuting, or amplifying -- his evidence is compelling enough, but I'm in no position to tell whether he's cherry-picked it, how much he's interpreting it to favor his argument, etc. See what I mean about the problem with recent scholarly works (at least when they're outside your area of expertise)?

This is a key part of his "discovery" argument: "Although there were already ways of saying something had been found for the first time and had never been found before, it was very uncommon before 1492 for people to want to say anything of the sort, because the governing assumption was that there was 'nothing new under the sun.' The introduction of a new meaning for descobrir implied a radical shift in perspective and a transformation in how people understood their own actions. There were, one can properly say, no voyages of discovery before 1486, only voyages of exploration. Discovery was a new type of enterprise which came into existence along with the word."

Wootton makes a distinction between "discovery" and words like "boredom" and "embarrass," in that we accept that before there was a word for it, people felt bored; before "embarrass" acquired its modern meaning in the 19th century, people could feel embarrassed. But "discovery" is "'an actor's concept' ... you have to have the concept in order to perform the action ... So although there were discoveries and inventions before 1486, the invention and dissemination of a word for 'discovery' marks a decisive moment, because it makes discovery an actor's concept: you can set out to make discoveries, knowing that is what you are doing."

There is an interesting digression about the great sociologist of science Robert K. Merton, to whom we owe the phrases "role model," "self-fulfilling prophecy," and "unintended consequence," and who wrote an entire book about the phrase "standing on the shoulders of giants." What Merton was unable to popularize was the idea of multiple discovery: the idea that "there are nearly always several people who can lay claim to a discovery, and that where there are not this is because one person has so successfully publicized his own claim that other claims are forestalled."

"We cannot give up the idea that discovery, like a race, is a game in which one person wins and everyone else loses. The sociologist's view is that every race ends with a winner, so that winning is utterly predictable. If the person in the lead trips and falls, the outcome is not that no one wins but that someone else wins. In each race there are multiple potential winners." Think of how many time travel stories revolve around, I don't know, going back in time to keep somebody from inventing the time machine. That's taking the Great Man view of history -- the view that prevailed in the 19th century, when history was portrayed as the result of specific heroic egos. Other than in fiction and, perhaps, biography, it is not a view that is looked on kindly anymore -- no one person, no one thousand people, can be said to be uniquely responsible for the major events of history, not because that person did not do the things they did, not because those things lacked significance, but because other people would do other significant things to move history in largely the same way. If you go back in time and kill Li'l Lincoln when he's a toddler, you don't wake up in a 2016 that still has slavery -- you simply find out that slavery ended under some other presidency instead. We pretty much accept that -- like I said, except in fiction -- but this centrality of the individual in discovery persists, even when, as Wootton points out, numerous people independently discovered the sine law of refraction, the law of fall, Boyle's law, oxygen, the telescope.

I am skipping over lots and lots of detail about the effects of the telescope and the microscope, in part because it doesn't excerpt well.

Wootton also goes into a dispute with relativists, especially "the strong programme," which I think is too inside baseball to get into here, especially since as someone not employed by a university, I have no reason to be invested in the fight. (Okay, that's not entirely true. The Science Wars are important, and impact not only the work I do in educational publishing and reference books, but overall science literacy and public policy. But Wootton's comments on it are not the part of this book that will stay with me the longest, even as they frame the rest of it.) As a fan of Kuhn but not of many of the historians of science who have followed in Kuhn's footsteps -- the relativists, in other words -- but who has also been accused of relativism because I disagree with the way history is constructed by strict realists, I think I agree with Wootton when he says that "this book will look realist to relativists and relativist to realists."



Wednesday, June 15, 2016

podcasts

I wasn't sure if this was a better fit for my TV blog, but since so many of the podcasts mentioned are about topics I discuss on this blog, well, here we are.

I finally got an iPhone this year, after being a late adopter of smartphones in general and then having a Windows Phone for a bit. Although I'd listened to podcasts before, the iPhone streamlines every part of the process so much that it made me really delve into them, and they've become a significant enough part of my media diet that, like with TV, movies, books, etc., I've had to make priorities because there isn't time to just take everything in.

So after exploring, expanding, and winnowing, this is what I listen to. I don't listen to every episode of everything. I listen to some or a lot of the following.

Interviews

WTF with Marc Maron. The first podcast I listened to on a regular basis. Longform interviews of the kind you don't really see anywhere anymore - Charlie Rose doesn't even do this much these days, Tavis Smiley is only half an hour. This is Dick Cavett stuff. Some of the guests you expect to be great certainly are -- Paul Thomas Anderson -- but you'd be surprised how great the Ed Begley Jr interview is, and Maron may arguably be at his best talking comedy with comics you might not have heard of.

On Being with Krista Tippett. Tippett's interview subjects cover a wide range of areas - anyone engaged in some way with the human condition. At times I think she could interrogate some of her softer subjects a little more instead of just letting them have the floor, but I suppose that's not the show.

Anna Faris is Unqualified. A newer podcast, Anna Faris and her guest answer relationship questions, often preceded by an informal interview with the guest.  

The Scholars' Circle Interviews. Hosted by Maria Armoudian.

Television

Kumail Nanjiani's The X-Files Files. My favorite TV podcast. Kumail -- you may (and should) know him from Silicon Valley -- goes deep on the X-Files, even going back and reading mid-90s alt.tv.x-files posts about the episodes being discussed. It's a labor of love about a show we all love, and you would think, is there anything new to say about the X-Files? There really is! This is one of the few podcasts I don't skip episodes of, but I've discovered it recently enough that I'm not at all caught up.

I love that podcast enough that I've tried to find similar TV podcasts, but ... well, it's a tough standard to live up to. I found a Friends podcast, for instance, hosted by twentysomethings who keep talking about how they were seven when they saw such and such an episode and how Friends was their first sitcom. My first sitcom was like, Taxi, or Welcome Back, Kotter. More power to them, but I can't relate to the conversation. There's a Buffy podcast that tries too hard to be polished and funny, and somehow even when it succeeds it's off-putting.

But there's also these two:

Better Call Saul Insider. Hosted by Better Call Saul editor Kelley Dixon, this is an indispensable accompaniment to one of the best shows on TV, a cut way above the after-shows that have proliferated on TV and frankly better than most audio commentaries these days (which have become so rushed and perfunctory).

The West Wing Weekly. This got a lot of press when it started, probably because it's co-hosted by West Wing co-star Joshua Malina (though he didn't join the show until a later season they haven't gotten to yet). So far so good.

Religion and Philosophy

History of Philosophy Without Any Gaps. The preeminent philosophy podcast, it is exactly what it sounds like, and it is mammoth.

The Religious Studies Project. Weekly discussions from scholars around the world.

Homebrewed Christianity. Probably the leading progressive Christianity podcast, started by Tripp Fuller and including interviews with big names like Crossan, Wright, Caputo, etc. Also hosts of the Theology Nerd Throwdown and other podcasts.

Seminary Dropout. An interview-focused podcast by young Texas pastor Shane Blackshear.

Nomad. "Two friends who like Jesus but dislike religion." Sound familiar?

Judaism Unbound. A relatively new podcast on Judaism hosted by, if I remember right, a Gen Xer and a Millennial. 

New Books in Religion. What it says on the tin.

Religious Studies News. From my peeps at AAR.

Miscellaneous

Radiolab. You know Radiolab. Radiolab is one of those shows everyone listens to.

Dan Carlin's Hardcore History. Very very long episodes about various topics in history. I usually have to make a block of time to listen to these, since they run to four hours.

The Brookings Cafeteria. As you know, Bonker, the Brookings Institution is a social sciences and public policy think tank on Think Tank Row. The Brookings Cafeteria is a Brookings-hosted podcast on a wide assortment of topics.

This Week in Law. Just what it sounds like! A weekly discussion of legal issues. I don't blog about legal issues here much because, hey, I'm not a lawyer. But it's a topic I keep an ear out for, in no small part because, as a religious studies scholar, I'm sensitive to the fact that the law, like religion and politics, is something that people talk and opine incessantly about without knowing shit about it. So I want to know more than shit about it.

The Psychology Podcast. Pretty general interest, I tend to skip around and look for the interesting ones.

Bon Appetit. Interviews and other brief discussions with the magazine's staff.

Jay and Miles X-Plain the X-Men. X-actly what it sounds like.

Decompressed. Phonogram creator Kieron Gillen's podcast about the craft of comics.


Podcasts I have on my iPhone that I have not listened to much yet but seriously I have been meaning to

NoSleep
Pseudopod
Escape Pod
PodCastle
Drunks and Dragons
Critical Hit
Philosophize This
How to Be a Person
Off Camera with Sam Jones
Black List Table Reads
Story Worthy
Sklarbro Country

Tuesday, June 14, 2016

let's talk about something more pleasant; or, looking back on six years of a five year reading plan

I'm about to turn 41, and I'm about to wrap up the sixth year of a five year reading plan (I had to tack on an extra year because, well, I wasn't fucking done). 

It started with two things as I headed into the middle of my mid-30s: reading John O'Hara, and the announcement of Penguin's publication of Malcolm and Ursula Lyons' fantastic translation of the Arabian Nights. 

John O'Hara is one of those authors who, while both bestselling and critically acclaimed in the past, has become mostly forgotten apart from a handful of works (Appointment in Samarra, Butterfield 8). He was, not that I knew this yet, absolutely the master of the novella. When I first read him, various aspects of his work reminded me of Fitzgerald -- which is misleading and has more to do with my frame of reference at the time -- but it made me reflect on the pattern that my reading tended to take, which was to read everything about or by XYZ, but little or nothing about or by the thing right next to it. I had read everything by Fitzgerald (except The Beautiful and Damned, for no real reason), and I mean everything - his novels, his short stories, the first draft of Gatsby called Trimalchio, his letters to Zelda and to Maxwell Perkins, all of Zelda's works, several biographies. 

But the REST of the Lost Generation?

Well, I'd read a little Hemingway, a little Sherwood Anderson, a little Faulkner. I'd obsessed over T.S. Eliot's poetry in high school before I even discovered Fitzgerald, though I had no historical context for it. But I hadn't read John Dos Passos, and of Steinbeck I'd only read Of Mice and Men and The Red Pony, both as a kid.

Thinking of Steinbeck reminded me of Salinger, the other large-looming JS, and the fact that I hadn't read Catcher in the Rye until I was in my twenties and felt too old for it to hit me the way it would have if I'd been younger -- I felt like I'd already read everything derivative of it, and that for that matter I'd read Franny and Zooey first, which might've been a mistake. But what I hadn't done was read the rest of Salinger.

So anyway, that line of thinking went on for a bit. And I thought, well Bill, there are a lot of things you know you've been MEANING to read -- a "decent translation of the Arabian Nights" being one of them, and here what do you know, Penguin goes and announces an enormous one, one Robert Irwin seems excited about -- and on top of the things you've been meaning to read, there are a lot of things you probably ought to mean to read. Like, conceptual gaps in your reading. 

So I came up with my first five-year curriculum. A conceptual reading list -- not a list of specific titles, or rather not JUST specific titles, though it did include some, but rather, just like a college curriculum is category-based but various specificities can satisfy those categories, so too with my curriculum. The main focus was on fiction -- the things I'd been meaning to read and the things I should have been meaning to read, which together constituted an unconscious kind of canon -- plus shoring up my reading on social and physical sciences and my field(s). Originally the nonfiction reading broke down more specifically than that -- but the books on that side of the curriculum are both more expensive and often longer, and over time that had an accretive effect on how things went. It's easy to get through five novels in a light week, for instance. The same is not true of mathematics books, even at my best or most flush.

Anyway, obviously given the amount of time I originally chose for the reading list here, the subtext was "this is the shit I want to have read before I'm forty," though like I said, I tacked that extra year on because I just wasn't done. So, with things winding down -- I am finishing up the last stack -- some notes on what I've read:

Some of those authors I'd read only a token amount of were absolutely worth revisiting -- Steinbeck's East of Eden is one of my favorite books, as it turns out, as are ... well, all of Salinger's books, but particularly Nine Stories. Similarly, I had only read Walker Percy's The Moviegoer before, but the rest of his books are just as fantastic. And although I had read some of Wallace Stegner's nonfiction, I'd never read Angle of Repose, which turns out to be one of the best novels I've ever read. Again: I'd always meant to get around to them. I finally made time to do so.

There was some genre stuff included here, because this wasn't a generic "things people should read" curriculum, this was "things I should read," and I'm a genre writer. I either hadn't read A Wrinkle in Time before or couldn't remember if I'd read it (I'd seen an adaptation of some kind - a play? a movie on PBS? I had a vague sense of the story, but only vague), so I read it, and it's fantastic. I read Ursula K. Le Guin's amazing Earthsea books, Thomas Tryon's The Other, Jim Thompson's The Killer Inside Me (and quickly discovered that as much as I like it, I shouldn't read too much other Thompson, because his misogyny and unpleasantness work best when the viewpoint character is MEANT to be a sociopath), T.H. White's Once and Future King, Octavia Butler's Kindred, David Gerrold's insane The Man Who Folded Himself.

I discovered some new favorite authors, chief among them Marilynne Robinson, whose Lila and Home are among my favorite books of all time; P.G. Wodehouse, who I am now obsessed with; and Graham Greene. I finally read Carson McCullers, and both The Heart is a Lonely Hunter and The Ballad of the Sad Cafe are just fucking brilliant books. Helen DeWitt's The Last Samurai is one of those amazing books that I should recommend to more people. But hopefully everyone knows about it anyway.

Some books were just straight up fucking fun, like John Barth's Last Voyage of Somebody the Sailor.

Having been familiar with Cheever mainly as a short story writer, I read his novels, and loved Falconer; the Wapshot books were all right but somewhat less compelling.

And I finally got to Dos Passos: the U.S.A. trilogy is ingenious, and it would have blown me away as a twentysomething.

There were disappointments, of course, too. Agatha Christie just isn't for me. I've never been that big a mystery fan, really, and certain styles of mysteries are even more inclined to leave me cold -- I suppose hers is that style. Nathanael West, so highly recommended by so many people, was a chore. I could not stand The Brief Wondrous Life of Oscar Wao. The two novellas I read by Jim Harrison left me uncertain whether to read more -- one was brilliant, the other absolutely tiresome.

So the question is obviously what's next, and to me the answer is just as obvious. Though I haven't drawn specific attention to it here because it wasn't much of a highlight, this reading plan involved reading a lot of those Old White Guys, the Updikes and Mailers and Yateses and so on (this is why I had so little patience for the second Jim Harrison novella), and I need an antidote. I didn't explicitly set out to read a bunch of old white guys, or a bunch of guys or a bunch of whites, but it worked out that way: when you're a well-read thirtysomething white guy and the reading list you cobble together is "stuff I've been meaning to read forever," it stands to reason that the list will reflect an American literary culture that's been dominated by and that has primarily canonized white men and the concerns of white men. At the time I had no aims beyond "getting around to reading the things I'd been meaning to read," and wasn't thinking of what "the list" represented in any sense beyond that -- nor do I regret reading anything on it.

But reading so much of that, even punctuated by the occasional Le Guin or Robinson, is a bit much. I didn't even read nearly as much Updike or Mailer as I intended -- and I like Updike! -- because this turned out to be a poor context in which to appreciate either. (In the end I'm not sure I would be a Mailer fan anyway.)

So the fiction component of my next five-year curriculum is to read primarily works by authors who are either not white or not men, or works in translation. A curriculum to balance out the last six years. Who knows, this one may take me seven.

For the record, my favorite books coming out of these six years:

Marilynne Robinson, Home, Lila, and Gilead - It's hard to pick one from this "trilogy" (it's not important what order you read them in). Lila is probably my favorite and the most ambitious, though perhaps not in the way that people always mean when they use that word for books.

John Steinbeck, East of Eden - The fucking epic.

J.D. Salinger, Nine Stories - I'm kind of glad I didn't read this as a kid so that I'm not sick of it.

Ray Bradbury, Dandelion Wine - A reread, but coming back to it as an adult made me realize I might love this book even more than Something Wicked This Way Comes.

P.G. Wodehouse, Leave it to Psmith - Wodehouse's best, though not necessarily his most representative. Wodehouse writes farce of a very specific tone; what makes this stand out is that it's also legit romantic comedy. However, one of the things that makes it better than his other books is the way that it plays with his usual structures, and that comes across better if you're familiar with those structures, especially with the other Blandings books.

Wallace Stegner, Angle of Repose - A novel about a middle-aged history professor reconstructing/narrating the life of his grandmother in frontier-era mining communities. AREN'T YOU EXCITED? But it is awesome.

Ursula K. Le Guin, the Earthsea books - I don't know how to pick one, and while they don't have to be read as a series in the same way that modern fantasy series like A Song of Ice and Fire do, they still work best informing one another.