Thursday, March 28, 2013

Diversity and Academic Open Mindedness

I had an interesting recent conversation with a fellow academic that I think worth a blog post. It started with my commenting that I thought support for "diversity" in the sense in which the term is usually used in the academic context—having students or faculty from particular groups, in particular blacks but also, in some contexts, gays, perhaps Hispanics, perhaps women—was in practice anticorrelated with support for the sort of diversity, diversity of ideas, that ought to matter to a university.

I offered my standard example. Imagine that a university department has an opening and is down to two or three well qualified candidates. They learn that one of them is an articulate supporter of South African Apartheid. Does the chance of hiring him go up or down? If the university is actually committed to intellectual diversity, the chance should go up—it is, after all, a position that neither faculty nor students are likely to have been exposed to. In fact, in any university I am familiar with, it would go sharply down.

The response was that he considered himself very open-minded, getting along with people across the political spectrum, but that that position was so obviously beyond the bounds of reasonable discourse that refusing to hire the candidate was the correct response.

The question I should have asked and didn't was whether he had ever been exposed to an intelligent and articulate defense of apartheid. Having spent my life in the same general environment—American academia—as he spent his, I think the odds are pretty high that he had not been. If so, he was in the position of a judge who, having heard the case for the prosecution, convicted the defendant without bothering to hear the defense. Worse still, he was not only concluding that the position was wrong—we all have limited time and energy, and so must often reach such conclusions on an inadequate basis—he was concluding it with a level of certainty so high that he was willing to rule out the possibility that the argument on the other side might be worth listening to.

An alternative question I might have put to him was whether he could make the argument for apartheid about as well as a competent defender of that system could. That, I think, is a pretty good test of whether one has an adequate basis to reject a position—if you don't know the arguments for it, you probably don't know whether those arguments are wrong, although there might be exceptions. I doubt that he could have. At least, in the case of political controversies where I have been a supporter of the less popular side, my experience is that those on the other side considerably overestimate their knowledge of the arguments they reject.

Which reminds me of something that happened to me almost fifty years ago—in 1964, when Barry Goldwater was running for President. I got into a friendly conversation with a stranger, probably set off by my wearing a Goldwater pin and his curiosity as to how someone could possibly support that position. 

We ran through a series of issues. In each case, it was clear that he had never heard the arguments I was offering in defense of Goldwater's position and had no immediate rebuttal. At the end he asked me, in a don't-want-to-offend-you tone of voice, whether I was taking all of these positions as a joke. 

I interpreted it, and still do, as the intellectual equivalent of "what is a nice girl like you doing in a place like this?" How could I be intelligent enough to make what seemed like convincing arguments for positions he knew were wrong, and yet stupid enough to believe them?

Monday, March 25, 2013

Could You Make a Living from RPG's?

Obviously, people at Blizzard (World of Warcraft) and Wizards of the Coast (D&D) do. What I have been wondering is whether there are ways in which people outside of such firms could do it. 

The immediate inspiration for this post was an event on my campus where, after a day of academic talks by high-tech law types, the evening entertainment included D&D games run by hired DM's—for players most of whom, I would guess, had never played before. But the original source of the idea was my observation of quite a lot of people who devote levels of thought, effort, and talent to running dungeons or leading raids in WoW comparable to what other people spend making a living. 

Three possible models occur to me:

1. A professional DM. Lots of D&D games nowadays are played online, with software by which the DM can control what appears on his players' screens. Suppose a DM is good enough that some players are willing to pay a modest fee, perhaps a few dollars an hour, to be in his game. That would not make him rich, but it might pay him ten or fifteen dollars an hour—enough to support a lifestyle at least one step above couch surfing.

2. A professional raid leader. I know someone in WoW who is a skilled player, a good charismatic leader, and good at running raids—groups of ten or twenty-five players working together to accomplish difficult things. I can imagine people paying to be in the raids of someone like that—most obviously people who don't have a good enough social network, or enough skills in and out of game, to be invited into a competently run raid. A four-hour raid with twenty-five players could plausibly earn the leader a hundred dollars or more.

3. A professional trainer. I am again thinking in the context of World of Warcraft. One of the two characters I currently play consistently underperforms, does less damage to enemy creatures than other players in the group he is with. I am pretty sure I am doing something wrong and I do not know what. If there were someone online with a reputation for doing a good job of coaching players in how best to play their characters, I might well pay for his services.

All three of these approaches face the same problem: A lot of people enjoy running D&D games, leading raids, or giving advice to friends or friends of friends on how to play their characters. It is hard to sell something when other people are giving it away for free. Possible solutions are either to provide something enough better that people are willing to pay for it, the approach by which private schools compete with public schools, or to sell the service to people who are unable to manage the non-monetary mechanisms by which others get the same service for free. The business model of the oldest profession.
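The earnings estimates in the three models above are simple arithmetic. Here is a minimal sketch in Python; the party size of five and the per-player fees are my own assumptions chosen to reproduce the post's round numbers, not figures stated in the post:

```python
def hourly_earnings(players, fee_per_player_hour):
    """Leader's hourly income if each player pays a flat hourly fee."""
    return players * fee_per_player_hour

# Model 1: a professional DM. An assumed online party of five players
# paying $2-$3 an hour each gives "ten or fifteen dollars an hour".
dm_low = hourly_earnings(5, 2.0)    # 10.0
dm_high = hourly_earnings(5, 3.0)   # 15.0

# Model 2: a raid leader. Twenty-five players over a four-hour raid;
# a fee of $1 per player per hour reproduces the "hundred dollars" figure.
raid_total = hourly_earnings(25, 1.0) * 4   # 100.0

print(dm_low, dm_high, raid_total)
```

At those fees the DM's income is modest and the raid leader's is respectable for four hours of work, which is roughly what the post argues.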

Something along these lines, people making a living by doing for money what others do for free, already exists for some older games. The example I am most familiar with is bridge. Sufficiently good bridge players can, and some do, make a living as paid partners, providing players a little less good a mixture of entertainment, teaching, and status. 

I am not sure it would work for modern role playing games, whether D&D and its relatives or MMORPG's such as WoW. But it does seem as though there should be some way in which the level of skill and commitment shown by the best participants could provide them with at least a modest income.

Monday, March 18, 2013

Does Easy Sell?

A criticism some reviewers offer of my two novels is that the protagonists have it too easy, that there is never any serious doubt that they will ultimately win. It is, on the whole, a fair criticism. Part of the reason may be that I am an optimist, and so find it hard to believe in really bad outcomes even in my fictional worlds. I tried to fix the problem in the final section of Harald, my first novel, by reorganizing the sequence of events so that the bad guys appear to be winning until near the end, but it does not seem to have worked very well.

Considered as a literary judgement, the criticism is legitimate, but I am wondering whether it is also a legitimate criticism if the objective is not to write well but to sell. Part of the reason people read fiction is to imagine themselves as the protagonist. Imagining yourself as smarter, stronger, faster, in all important ways better than all the people around you may be a base pleasure, but for a lot of people it is a pleasure. Make a protagonist like that, and it is hard to make it plausible to the reader that he might lose.

For a particularly successful example, consider Superman. Strong, bulletproof, fast—he can even fly. Surely part of his success came from readers wanting to imagine themselves like that. One consequence was that, in order to provide him with threats adequate to support a halfway interesting plot, the writers had to introduce a variety of kludges to the story line, of which the most famous was Kryptonite.

I have never studied the pattern of best selling fiction; my impression is that much of it—The Lord of the Rings would be a striking exception—consists of books I probably would not much enjoy reading. But I wonder how much of it shares my fault and, unlike my novels, is successful as a result.

Wednesday, March 13, 2013

The Kind of Teaching I Like to Do

My university has an adult education program and I have just finished teaching a course for it, an abbreviated version of my seminar on legal systems very different from ours done as four lectures, each two and a half hours long. Interested readers can find recordings of part of the third lecture and all of the fourth on the class web page. It was great fun.

There were two important differences between that class and most classes I have taught. The first was that nobody was there who was not interested in what I was teaching, since the class did not meet any requirement for getting a degree. The second was that I did not have to grade the students. It thus eliminated the two least attractive features of conventional teaching. My ideal class might have some sort of exam at the end to provide me feedback on how good or bad a job I was doing, how much of what I had taught the students had learned—but I would not be the one grading it.

Each year I teach about ten shorter classes of the same sort, under rather different circumstances. The setting is the Pennsic War, an annual two-week-long historical recreation event held in a private campground in Pennsylvania. Attendance at the event is upwards of ten thousand people. My classes, mostly an hour long, deal with medieval historical recreation—how to cook from a period recipe, make hardened leather armor, tell a period story in a way that creates the illusion of a medieval story teller entertaining a medieval audience. My classes are part of a Pennsic University that offers about a thousand classes each year, all taught by volunteer teachers to volunteer students. Nobody is there to get a degree and nobody has to give any grades.

All of which suggests that perhaps there is something wrong with the more conventional model employed for most teaching from kindergarten through college.

Thursday, March 7, 2013

Nutrition, Obesity, Cost

In a recent online exchange, a poster commented on how extraordinary it was that poor people in our society are fatter than not-poor people. I am not sure the claim is literally true, but I believe it is true that obesity in the U.S. is at least as common among the poor as in the general population, and perhaps more common.

Someone responded that the reason was that more nutritious food was more expensive. She did not go into details, but my guess is that she was thinking of fast food—I have seen other people make the argument in that form. I responded by offering home made bread and lentils as examples of inexpensive but nutritious foods. Another poster responded to that with the claim that home made bread, while tasty and nutritious, was more expensive than "the cheap and nasty supermarket bread."

So I did some price comparisons, getting my price and nutrition information off the web.

Flour, the main ingredient in home made bread, costs about $.50/lb and has about 2000 calories/lb, so about $.25/1000 calories.

Wonder Bread, the classic example of supermarket bread, has 1100 calories/loaf and costs $1.99/loaf on sale at Walgreens, normally more. So about $1.80/1000 calories, or seven times as expensive.

Comparing lentils to fast food, 1 kilo of lentils has about 3500 calories and costs a little over $2. So about $.60/1000 calories.

A McDonalds Quarter Pounder with cheese has 520 calories and costs $3.10. So about $6.00/1000 calories. For that comparison, fast food costs about ten times as much per calorie.

Not only are both claims wrong, each is wrong by close to an order of magnitude.
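The comparisons above all reduce to one formula: dollars per 1000 calories equals price divided by calories, times 1000. A quick check in Python using the prices and calorie counts quoted in the post (taking "a little over $2" for lentils as $2.10 per kilo, my reading of that figure):

```python
def cost_per_1000_cal(price_dollars, calories):
    """Dollars per 1000 calories for a given quantity of food."""
    return price_dollars / calories * 1000

flour   = cost_per_1000_cal(0.50, 2000)   # 0.25: a pound of flour
wonder  = cost_per_1000_cal(1.99, 1100)   # ~1.81: a loaf of Wonder Bread
lentils = cost_per_1000_cal(2.10, 3500)   # 0.60: a kilo of lentils
burger  = cost_per_1000_cal(3.10, 520)    # ~5.96: a Quarter Pounder with cheese

print(round(wonder / flour, 1))    # 7.2 — supermarket bread vs. flour
print(round(burger / lentils, 1))  # 9.9 — fast food vs. lentils
```

The ratios come out at roughly seven and roughly ten, confirming the "close to an order of magnitude" conclusion.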

I should add that I bake bread, as does my wife. My version is a sourdough bread, so does not require yeast. The only ingredients are flour, water, sourdough, salt, and raisins, so if I did a non-raisin version the only significant cost would be the flour. It does require an oven, but most Americans, including most poor Americans, have access to one.

My recipe does require some time, although not a lot—perhaps half an hour per pound of flour, counting only time spent actually doing things, not time waiting for bread to rise or bake. But there is a very low work recipe I sometimes use for a yeast bread, based on the recipes in Artisan Bread in Five Minutes a Day, that takes substantially less time than that.

Wednesday, March 6, 2013

The Scary Thing About Having Kids

For many couples considering parenthood, the most frightening risk is the possibility of a child with a serious birth defect. That is scary, but also unlikely, especially for couples willing to use amniocentesis to check for serious problems and if necessary abort. The risk that is both frightening and reasonably likely is the chance of having a child who does not like you.

I got along well with my parents, better than I did with most other people I knew; my wife got along well with her parents; and our children get along well with us. But my impression, largely from our children's accounts of people they know, is that a large fraction of teenagers and young adults, perhaps a majority, do not get along with their parents. That not only makes the process of bringing up children a lot harder and more unpleasant, it also eliminates one of the major long-term benefits—ending up with adults whom you love and trust and who love and trust you.

Which raises the question of whether we have just been lucky and if so in what way, or whether we, and our parents, did something right. Judith Harris' work, presented in her very interesting The Nurture Assumption, provides evidence that parental child rearing does not have a very large effect on the personality of the child when he becomes an adult, but it might have a large effect on the adult's relation with his parents. And even if child-rearing does not have a large effect, other environmental influences, especially, by her account, the peer group environment, do.

Harris mentions that in some families, although not many, the family is the peer group, creating an exception to her general rule. That, I think, describes my situation, my wife's situation, and the situation of our children. In each case, the child identified more strongly with the family culture than with the culture of his age peers. It might also describe a situation more common when population was much less dense, with the result that a smaller fraction of social interaction occurred outside the family. For an extreme version, consider the situation of the (fictional) Swiss Family Robinson. For a less extreme one, consider any family that is committed to a different view of the world than the surrounding culture—conservative Christians in a secular environment, atheists in a religious one, or immigrants from a very different society. It would be interesting to know whether such situations result in significantly less parent/child conflict.

Or maybe it is pure chance. There seems to be evidence that personality is to a significant degree genetic. There may be personality types that do not get along with each other, even types that do not get along with anyone. If that is the fundamental problem, we have been very lucky.

Tuesday, March 5, 2013

Official Scientific Truth

A pattern I have observed in a variety of public controversies is the attempt to establish some sort of official scientific truth, as proclaimed by a suitable authority—a committee of the National Academy of Sciences, the Centers for Disease Control, or the equivalent. It is, in my view, a mistake, one based on a fundamental misunderstanding of how science works. Truth is not established by an authoritative committee but by a decentralized process which (sometimes) results in everyone or almost everyone in the field agreeing.

Part of the problem with that approach is that, the more often it is followed, the less well it will work. You start out with a body that exists to let experts interact with each other, and so really does represent more or less disinterested expert opinion. It is asked to offer an authoritative judgement on some controversy: whether capital punishment deters murder, the effect on crime rates of permitting concealed carry of handguns, the effect of second hand smoke on human health. 

The first time it might work, although even then there is the risk that the committee established to give judgement will end up dominated not by the most expert but by the most partisan. But the more times the process is repeated, the greater the incentive of people who want their views to get authoritative support to get themselves or their friends into positions of influence within the organization, to keep those they disapprove of out of such positions, and so to divert it from its original purpose into a rubber stamp for their views. The result is to subvert both the organization and the scientific enterprise, especially if support by official truth becomes an important determinant of research funding.

The case which struck me most recently had to do with second hand smoke. A document defending a proposal for a complete smoking ban on my campus was supported by a claim cited to the Centers for Disease Control. Following the chain of citations, it turned out that the CDC was basing the claim on something published by the California EPA, which cited no source at all for it. As best I could determine, the claim originated with research that was probably fraudulent, using cherry-picked data to claim enormous and rapid effects from smoking bans. Pretty clearly, the person on my campus who was most responsible for the document had made no attempt to verify the claim himself, merely taken it on the authority of the CDC. For more details see my post on the case.

An interesting older case involved Cyril Burt, a very prominent British psychologist responsible for early studies of the heritability of I.Q., a highly controversial subject. After his death he was accused of academic fraud of various sorts. The official organization consulted was the British Psychological Society, which concluded that he was guilty, a conclusion that many people then took, and some still take, for gospel. Subsequently, two different authors published books arguing convincingly that some or all of the charges against him were bogus. Interested readers can find a detailed discussion of the case in Cyril Burt: Fraud or Framed?, which concludes that much, at least, of the case against Burt was in error. I am not certain, but I believe that the Society later reversed its judgement, withdrawing the claim that his work had been fraudulent. Perhaps one of my readers can confirm that—I did not manage to with a brief Google search.

It is natural enough that observers of such controversies want an authoritative answer from an authoritative source—quoting the CDC is much less work than actually looking at the research a claim is based on. But treating such answers as really authoritative is a mistake, and a pattern of treating them that way is a dangerous mistake.