Friday, August 31, 2007
A genetics/population bleg
In our small group session today, one member of the group, a very liberal, biracial (black/white) young woman, became reservedly yet visibly exercised when we broached the subject of genetic risk for disease being associated with race. She was quick to tell us that when working at the NIH (an experience she is obviously quite proud of--I'm sure a liberal half-black woman has no trouble getting a job at the NIH) she learned that racial risk factors, while they do exist, are really attributable to social factors correlated with race, such as low income, education, and living in a high-crime area, rather than to genetics. Naturally, I don't buy that. She also said that the only diseases that are legitimately attributable to genetics and correlate with race are single-gene disorders such as sickle cell anemia, and that even in those cases, since only one gene is involved, it's not really racial, since any person of any race can have that one mutation. She said that multifactorial disorders aren't really attributable to race at all, that there's more genetic variation within races than between races, and, of course, she repeated the ubiquitous leftist talking point that there's really no such thing as race anyway and it's merely a social construct. She made it sound as though at the NIH this stuff is considered firmly established scientific fact.
Now, it doesn't surprise me at all that the NIH has "discovered" the scientific "fact" that racial differences don't actually exist, though obviously, I'm highly skeptical. But I was unable to rebut any of these claims, because I'm really not well-read on this topic. And since this is obviously a subject near and dear to her, I figured she would probably have several studies or articles she could cite off the top of her head that she at least believes establish her view. That would make her look like she had science on her side while I was merely a quack engaging in speculation.
So I'd like to ask if anyone can recommend good books, articles, or other sources that may show that what this woman was saying is wrong. Specifically, I'm interested in knowing more about where this idea that there is more genetic variation within races than between races comes from, what its significance really is, and whether it really means what liberals seem to want it to mean. More generally, I'd also love to know if there's been anything solid refuting this incessant claim that there's no such thing as race and it's merely a social construct.
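For what it's worth, the "more variation within races than between races" line usually traces back to variance-apportionment statistics such as Wright's F_ST and Lewontin's 1972 analysis, which split the genetic variation at a single gene into a within-group piece and a between-group piece. Below is a minimal sketch of that calculation for one two-allele gene; this is my own illustration with made-up allele frequencies, not anything taken from her argument or from the NIH:

```python
# Illustration only: Wright's F_ST for a single biallelic locus, the kind of
# statistic behind "more variation within groups than between groups."
# The allele frequencies below are hypothetical, chosen just to show the math.

def fst_biallelic(p_groups):
    """Fraction of total heterozygosity lying between equal-sized groups."""
    n = len(p_groups)
    p_bar = sum(p_groups) / n                       # pooled allele frequency
    h_total = 2 * p_bar * (1 - p_bar)               # expected heterozygosity, pooled
    h_within = sum(2 * p * (1 - p) for p in p_groups) / n  # mean within-group value
    return (h_total - h_within) / h_total

# Two hypothetical groups whose allele frequencies differ by 20 points:
print(fst_biallelic([0.6, 0.4]))  # 0.04 -> about 4% between groups, ~96% within
```

Averaged over individual genes, the between-group fraction typically comes out small, which is the kernel of truth in her claim; whether that single-gene statistic means what she wants it to mean when many genes are considered together is precisely the question A. W. F. Edwards raised in his 2003 "Lewontin's fallacy" paper, so that may be one place to start reading.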
Tuesday, August 28, 2007
Ron Paul
In all seriousness, I think Ron Paul would make a fine President of the United States of America. He is one of the few politicians these days who actually understands and believes in our Constitution. Although my Crunchy Con sympathies put me somewhat at odds with some of his libertarian economic views, I know America can't be saved without being restored to true democracy and self-government, as opposed to an EU-style "democracy" of universal rights and equality imposed on the people against their will by an unaccountable bureaucracy, and Ron Paul makes this a priority. His reputation in the House as "Dr. No," earned through his consistent opposition to spending increases, is eminently commendable in an age when most politicians stay in office by voting for goodies for their constituents. (Though I suppose this really says something about the difference between the people of Paul's district in Texas and, say, Ted Kennedy's in Massachusetts.)
Tom Tancredo is still my first choice for the Republican nomination, because he, more so than Paul, correctly identifies immigration as the most important issue facing our nation right now. But if Tancredo doesn't make it, I will have no qualms about supporting Ron Paul for President.
Sunday, August 26, 2007
Woman knows best?
These assumptions--that men are inherently bad and women inherently good--are often present in the articles and blog posts at Boundless, the Focus on the Family-run site I mentioned in my post on how evangelicals love diversity. A few days ago, editor Ted Slater posted what amounted to a statement of contrition for the personal qualities he possessed while a bachelor and of gratitude toward marriage for absolving him of those sins and reforming him. Part of it read as follows:
"Habits" were a big part of who I was, pre-marriage. Habits like staying up late working on my audio or Web site projects, taking naps whenever I felt like it, eating whenever (and whatever) I wanted, spending money impulsively on new musical or computer equipment, enjoying flirt-tinged conversations with single women, hanging out late with my buddies after worship band practice, getting to work late and staying at the office late, and so on.
The way I prepared for married life was by telling myself, and my bride-to-be, that our wedding day marked the death of the single Ted. On Dec. 21, 2002, the single Ted would be no longer. He would be dead.
The truth is that it took years to shed some of my more self-centered habits, but I do think it was helpful to begin the process by having a specific time in mind where those habits were no longer what characterized me.
The single Ted is long dead. And the happily married Ted doesn't miss him.
Now, it may very well be that Slater was a rotten guy as a bachelor and that marriage forced him to clean himself up. I have to ask, though: if this kind of thinking is not part of a trend that sees men as inherently bad and women as inherently good, why do we never see similar thoughts expressed about women needing to clean themselves up? Indeed, try to imagine a post from Ted's wife Ashleigh along these lines:
"Habits" were a big part of who I was, pre-marriage. Habits like staying up late working on my school projects, taking naps whenever I felt like it, eating whenever (and whatever) I wanted, spending money impulsively on new clothes or shoes, enjoying flirt-tinged conversations with single men, hanging out late with my friends after worship band practice, getting to work late and staying at the office late, and so on.
The way I prepared for married life was by telling myself, and my husband-to-be, that our wedding day marked the death of the single Ashleigh. On Dec. 21, 2002, the single Ashleigh would be no longer. She would be dead.
The truth is that it took years to shed some of my more self-centered habits, but I do think it was helpful to begin the process by having a specific time in mind where those habits were no longer what characterized me.
The single Ashleigh is long dead. And the happily married Ashleigh doesn't miss her.
One simply doesn't hear this kind of thing in the evangelical world. And the funny thing is that most evangelicals will still claim to believe in traditional gender roles, even though this totally contradicts the notion that men are inherently bad and essentially need to "submit" themselves to women in order to be reformed. The aforementioned men's ministry met at 6:30 AM on Saturdays, and some of the men didn't have much time to stay and chat afterward because, they said, their wives wanted them home to relieve them of the kids. Excuse me, but I thought that when a woman stayed home, taking care of the kids was part of her job. Where in all this are the "submission" of wives to their husbands and the "rule" of husbands over their households of which the Bible speaks? Now we have "conservative" Christian men cowering in fear of their wives, or perhaps simply not wishing to provoke sexual rejection.
There's a lot of fretting in the evangelical subculture today about how the divorce rate among professing born-again Christians isn't any lower than that of society at large. That is certainly worrisome and wrong, but it is never going to change as long as evangelicals hold this view of women being morally superior to men.
A profession that doesn't look like America
Unfortunately, historical data on the demographics of medical students is hard to find. It may be that no one was keeping track of the race and sex distribution of medical students prior to about 1980. But let's make do with what we can. Consider a hypothetical doctor on the verge of retirement. If he turned 65 years old today, he was born in 1942, and thus most likely started medical school around 1964. At that time, more than 93% of American medical school graduates were men. While racial data was hard to find, I figured from this table published in 1990 that in our hypothetical physician's age group (45-54 at the time), more than 75% of physicians were non-Hispanic white, so we can reasonably extrapolate that this same demographic datum applied to his fellow medical students when he was in school. Furthermore, we know that in the years prior to 1965, when the infamous immigration bill was passed that was responsible for the ongoing transformation of America into a multicultural society, our nation was 89% white. So our 65 year old doctor was a white medical student in a white world.
Some people might not know, however, how drastically the profession is changing. Fortunately, the Association of American Medical Colleges (AAMC) has collected all sorts of demographic data about medical students for the past 15-20 years, so it's much easier to look at what's happening now. First of all, there has been a massive influx of women into the field: 2003 made headlines as the first year in which more women applied to medical school than men (though still ever so slightly more men were accepted), and for several years now the male-to-female split among matriculants has appeared to approach 50-50 asymptotically. While the implications of this are important, I'd like to confine a discussion of them to its own post and focus on the racial changes here.
In 2006, 61% of medical school matriculants were non-Hispanic white, 7% were black, 7% were Hispanic of any race, and 19% were Asian.* These proportions vary greatly by school. In many Southern and Midwestern state schools, the number of whites still dwarfs the number of Asians, but at East and West coast private schools, Asian students are admitted in numbers greatly out of proportion to their share of the general population, and at some California schools Asians outnumber whites. (At my own school, a fairly high-ranking private school, only 53% of this year's class is white while 30% is Asian. Interestingly, this means that the much-maligned and dreaded white males comprise less than 30% of our class.)
Now, everyone knows that for a long time, affirmative action was said to be necessary because "minorities" were at a historic disadvantage in America and needed a special boost in university admissions and job hiring to bring their average levels of education and representation in various professions up to their proportion of the general population. Originally, when we spoke of "minorities" in this sense we almost always meant blacks, though I suppose American Indians and non-white Hispanics may have been included under the banner as well. A funny thing happened on the way to equality, however. After the aforementioned 1965 immigration bill, we began admitting large numbers of Asian immigrants, who as a group have average IQs higher than not only blacks but whites as well. These high-IQ Asians naturally began rising to the top of our society, being admitted to prestigious universities and entering the "cognitive elite" professions in proportions vastly greater than their share of the general population. Suddenly, the word "minority" by itself was no longer useful to describe the groups supposedly needing affirmative action, since these Asians were and still are a minority. Hence, the name of the game these days in medical school admissions is "underrepresented minority," or URM.
The usual justification for affirmative action for URMs is that patients are better treated by physicians who are like them. White physicians, it is said, cannot understand blacks as well as blacks can, and black patients are less likely to feel comfortable with or confide in a white physician compared to a black one. The same is said to hold true for Hispanic patients. Therefore, just as Bill Clinton wanted to create a cabinet that "looks like America," we need to increase the number of URMs to serve the needs of society. But wait--whites are still 67% of the US population, but only 61% of last year's medical school matriculants. Who will meet the needs of that remaining 6% of the population? If black people need black physicians rather than white ones, don't white people need white physicians rather than Indian ones? By the liberals' logic, shouldn't we place some limits on the number of Asians and start practicing affirmative action for whites?
Maybe not. For one thing, large numbers of Asians increase "diversity," by which the left really means non-whiteness, and thus are just as useful as any other race in achieving the leftist goal of turning whites into a minority in our own country. But more significantly, America itself increasingly no longer looks like America. It's troubling enough that we've gone from 89% white in 1965 to only about 2/3 white in the early 2000's. Even worse, however, was the report earlier this year that only 55% of children under age 5 in America are white. So maybe we don't need more white doctors to serve the needs of the "white community." If whites become a minority in America, as we are on track to do unless we wake up, get off our duffs, and enact a moratorium on non-Western immigration soon, even 61% will be too high a proportion of white doctors. The only question that will remain is where we will get all of the Hispanic doctors needed to serve the swelling "Hispanic community," since most of the Mestizo peasants currently "immigrating" here, with their average IQ of 90, generally lack the level of intelligence considered necessary to make a good physician.
It doesn't end there, however. So far I have been speaking only of people admitted to and graduating from American medical schools. After medical school, however, in order to be licensed to practice medicine, one must complete a one-year internship; and while this was not true 50 years ago, nowadays one must become board-certified by completing a full residency in order to realistically make a living as a doctor, since no hospital will bring a non-board-certified physician on staff, nor will insurance companies reimburse for services provided by one. These residency programs, which exist at teaching hospitals across the country, are funded by Medicare, and, unlike the number of slots available in medical schools, their numbers are not controlled by the AAMC. In 2007, 15,206 US medical school seniors applied for residency positions, of which 21,845 were available. How did the remaining 6,639 positions get filled, you ask? Foreign medical graduates, who represent an ever-increasing share of the US physician workforce.
I'm sure I don't need to tell you that most of these foreign medical graduates don't come from Western Europe, Canada, or Australia. While some do, a great many are from South Asia and Eastern Europe. They tend to occupy the residency positions that American medical students find undesirable, like primary care fields and programs at community (as opposed to university) hospitals. My uncle, a private practice physician, told me recently of a patient satisfaction survey conducted by the community hospital where he is on staff. He said he suspected that a significant amount of the dissatisfaction patients expressed with the resident physicians could be attributed to their foreignness. "I mean, we've got guys wearing turbans," he said. "Imagine you're a 75-year-old woman who's lived all her life in Jamison. The only place you've ever seen someone like that is on the evening news!" Yes, some of them really do wear turbans. Suffice it to say that they don't represent the "white community," or the black or Hispanic communities for that matter, very well. I don't think they have much "cultural competence," either. Thirty years from now, when you need a primary care doctor, not only may it be impossible to find a white one--it may be impossible even to find an American-born one.
America is currently sick, with a disease called liberalism. Even though most Americans are still at a point where they are afraid to say it, America doesn't look like itself. Still, like many sick patients, it can get better. However, this cure is not going to come from doctors, who have been infected with liberalism themselves. What is needed is a behavioral approach rather than a biomedical one. The patient will need to find his own motivation and make some serious lifestyle changes, just as an overweight person who knows he is at risk for heart disease might go on a diet and start exercising. We as Americans can take this patient-centered approach and be cured if we really want to. Medicine and other institutions which represent only small segments of America can't cause top-down change; it has to come from the bottom up. Once we have set ourselves down the right path, medicine, along with all of our other societal institutions, will have no choice but to follow.
*It's important to note that many demographic surveys don't make a distinction between Orientals and South Asians; thus, the term Asian encompasses Orientals, Indians, Pakistanis, etc. From purely anecdotal observation I'd say that the number of Indian students at my school is equal to if not greater than the number of Orientals.
Friday, August 17, 2007
More evidence that evangelicals are going liberal
Boundless has a blog called Boundless Line. They recently had a post on Robert Putnam's new study, which has been making waves for its conclusions about the downsides of diversity. Author Candice Watters started off sounding conservative on the issue: she gave the post the title "Forced Diversity Has Opposite Effect," and wrote that "now a new study suggests maybe the glorification of diversity wasn't such a good idea after all." By the end of the post, however, it's clear that she's only against "forced diversity," and agrees with Daniel Henninger that evangelical megachurches are a good "assimilation model."
Where it really gets interesting is the comments section. None of the commenters questioned the notion that diversity is good. Several questioned Putnam's conclusion, and several raised the specter of racism. Now, it's possible that Boundless publishes some comments from nonbelievers, but I think we have to assume that evangelicals constitute a solid majority of its readership. Remember, these are evangelicals, those evil right-wing fascists whom the left thinks want to destroy basic civil liberties and purge the entire world of everyone who's not white, Christian, male, and wealthy. Look at some of the things they're saying:
- "I don't want this research to be used as an excuse to promote segregation."
- "I love my church. It's awesome to see all the people in it. You can find mohawks, perms, Jessica Simpson hair, and wash-and-wear styles all together in the same room! It's like 300 cultures becoming one!"
- "if admissions quotas and other efforts to diversify force us out of our comfort bubbles, than so be it. And if we're going to be political, I find it very plausible that admissions standards quotas are both necessary and just... I love diversity, especially ethnic and cultural diversity."
- "Unfortunately, arguments that call diversity a failed process only hinder racial tolerance and integration...To argue against diversity seems pointless...I guess that racism is not dead. Apparently, individuals still think that diversity is not necessary."
- "Putnam discovered that 'People in ethnically diverse settings don't want to have much of anything to do with each other,'
Am I the only person who thought 'racism' when I read this quote?"
For reference, here is the abstract of Putnam's paper:

Ethnic diversity is increasing in most advanced countries, driven mostly by sharp increases in immigration. In the long run immigration and diversity are likely to have important cultural, economic, fiscal, and developmental benefits. In the short run, however, immigration and ethnic diversity tend to reduce social solidarity and social capital. New evidence from the US suggests that in ethnically diverse neighbourhoods residents of all races tend to ‘hunker down’. Trust (even of one's own race) is lower, altruism and community cooperation rarer, friends fewer. In the long run, however, successful immigrant societies have overcome such fragmentation by creating new, cross-cutting forms of social solidarity and more encompassing identities. Illustrations of becoming comfortable with diversity are drawn from the US military, religious institutions, and earlier waves of American immigration.
I suppose Chris is referring to the statements that "In the long run immigration and diversity are likely to have important cultural, economic, fiscal, and developmental benefits" and "successful immigrant societies have overcome such fragmentation by creating new, cross-cutting forms of social solidarity and more encompassing identities." That may sound positive, but it is a statement that people are overcoming problems, not that problems don't exist in the first place. The real question is, where did the problems come from? Wouldn't it be better to prevent problems from existing in the first place, rather than finding ways to work around them? Aren't we interested in addressing, as they say, the "root causes"?
To make this point clear, let me rephrase the first sentence of the abstract in two different ways:
- Ethnic goodness is increasing in most advanced countries, driven mostly by sharp increases in immigration.
- Ethnic conflict is increasing in most advanced countries, driven mostly by sharp increases in immigration.
Each of those sentences is identical to the original sentence, except that the word "diversity" has been replaced with a word that reflects a value judgment about the nature of diversity. I submit that someone who thinks realistically about ethnic diversity reads the sentence the second way, and that the first way reflects an a priori assumption that diversity is good, a fundamentally liberal assumption--and it is the one the Boundless commenters are using. As Jared Taylor put it when addressing a Canadian audience:
Now, you probably think that every major Canadian institution from the federal government on down takes the view that racial diversity is a great strength for Canada. In fact, they all agree with me. They all assert most emphatically that racial diversity is not a source of strength but a source of conflict. The only difference is that instead of the word “conflict,” they use the word “racism.”
In other words, the liberal sees diversity as automatically good, and any conflict that results from it as racism, an evil reaction that must be rooted out. A true conservative sees true racism as bad, but at the same time sees conflict as the inevitable result of incompatible peoples trying to live side by side with one another. Therefore the true conservative will advocate the reduction of diversity as a means of minimizing the problem of ethnic and racial conflict.
That is not what these evangelicals are doing. They are taking the liberal side in the debate. In signing onto the diversity movement, they have willingly subscribed to a view that originated with secular leftists who hated traditional white Western societies for their particularism, and hated Christianity for the same reason. For this reason, evangelicals should find the liberal view revolting, but they don't realize what they're doing. They have decided that they look bad when they take conservative positions, and that leftists are right when they say Jesus was a liberal, and so in order to win people over they must attempt to out-liberal liberals, an effort doomed to failure.
I suppose liberal evangelicals think they (or rather, God working through them) are going to save our society by saving the world. They need to realize that unless our society is saved first, which involves making it more cohesive and unified--in other words, reducing diversity--they are never going to get the chance to save the world.
Wednesday, August 15, 2007
Getting off the ground
Wednesday, August 8, 2007
When "single" no longer means single
I had an interesting experience a few days ago, which brought home how significant a shift in the usage of a seemingly innocuous word can be--and how far outside the mainstream my traditional views are. I was with a group of several other first-year students and one second-year student, all male, and the second-year requested the straight dope on the appeal of the females in our class. (For those who don't know, the medical school population in the USA is now nearly 50% female, a topic which I intend to address in later entries.)
The consensus view emerging from my classmates--I kept silent, as I usually do in conversations like this--was that there are a few attractive women in our class, but they are few and far between, and furthermore, that almost all of them have boyfriends. So far, nothing surprising in the slightest. Then one man quipped that you could identify three desirable traits in women: cute, single, and nice, and you could choose two of the three. (I don't remember whether the third trait was actually "nice," but if not, it was unimportant.) And as the conversation progressed along those lines, I realized something. The word "single," in this context, has always simply meant not married. Fellow conservative Christians, no matter how culturally non-traditionalistic they may be, know what I'm talking about. If you are not legally married, you are single. It doesn't matter if you've been dating your boyfriend for two years and are about to go ring shopping tomorrow; the word "single" refers to nothing more than the absence of a marriage license.
Yet that was not what it meant to these young products of the 21st century world. To them, it meant without a boyfriend or girlfriend. A single girl is one who is not even dating; a girl who has a boyfriend is... what? I don't know what they would call it. Taken? Attached? Whatever it is, it seems to have become a semi-official state, drifting inchoate somewhere between being completely romantically uninvolved and being married.
As I've gotten older, the common concept we use the words "boyfriend" and "girlfriend" to describe has bothered me more and more, because there is no "official" basis for it. I'm not saying it should be done away with: I have participated in it myself, and obviously some kind of steady dating relationship is necessary before a couple progresses to engagement. What I am saying is that if you are married, you have participated in a rite which your surrounding society believes in: you have taken vows before witnesses, and declared yourselves to be attached to each other in an official legal relationship which has real, formal implications (inheritance; official sanctions on consorting with others, though these have fallen by the wayside in our debauched society; the assumption of parenting roles; etc.). In a traditional society, marriage is something the society as a whole believes in, and everyone knows what is meant by it. But in our society, where expressive individualism reigns, what is important is not what society thinks of your relationship, but what you think of it. And society's views must change to fit your needs and desires. If you say that you have a "boyfriend," that you are "taken" or "attached," who am I to say otherwise?
It occurred to me that the students using the word "single" this way didn't think there was anything remarkable about their usage. When they are asked to fill out a form which requests their marital status, are they surprised that it seems to consider "single" the opposite of "married?" I wonder, are we reaching the point where we need a compound term to describe someone's marital state; for example "unmarried and single" vs. "unmarried but taken?"
But the most troubling thing about this development in the usage of "single" is that not only does it replace the society-defined relationship with the couple-defined one, it elevates "dating" to the level of marriage. For all intents and purposes in our society, except among conservative Christians (and, I imagine, orthodox Jews), the belief that there is anything wrong or even remarkable about cohabitation before marriage is at present dead. The usual way of looking at this situation is to say that since people have abandoned traditional moral strictures, there is no reason not to live together before marriage: it's convenient, it's fun, and it gives people a chance to "test drive" marriage. But an oblique, and in my opinion more ominous, view is that, given the emphasis on individual expression and self-realization in the modern world, people have replaced the traditional, society-defined institution of marriage and its accompanying externally imposed strictures with the self- (or couple-) defined institution of... well, it doesn't have a name yet. But in a way, that's the point. Perhaps if it had a name, it would be too formal, too official, too externally imposed.
Now, when people recognize as legitimate these informal, self-defined relationships, as my fellow students and their peers certainly do, that takes away a certain amount of the impetus to get married. Part of the reason people used to get married was to have their relationships not merely legally recognized but socially taken seriously. Now that the opposite of "single" is not "married" but "dating," dating is taken as seriously as marriage. And when a person uses the word "single" in that sense, he reinforces that view--and when there is a trend of people doing this collectively, across the country, it is yet another of the many forces chipping away at the foundation of marriage in our society.
So there you have it. Words matter. They influence the way people think. What can one man do about it? Perhaps not much, but at least he can resist. If one of my classmates asks me if I am single, I will say yes, I am unmarried. And if, by the grace of God, I manage to convince a woman to start dating me, then when one of them refers to me as no longer single, I will correct him.
Welcome!
I hope you enjoy the blog, and please don't be turned off by its bare-bones appearance so far--I'm completely new to this, and expect to refine both the look and the content extensively as I gain experience.