Tuesday, August 30, 2011
A Picture is Worth a Thousand Blogs
I have been meaning to write a blog post as a generic response to every not-very-smart genetics blogger who thinks he or she is smart for having written that there is no single gene for something. The only thing holding me back has been that I do not want to seem too angry, because I am a nice guy in real life. You see, according to them, a behavior like having a fit of rage is so complex that it must be molded by thousands of genes, each supplying only the slightest tincture of that immense complexity. My response has been to ask, “What about Brunner syndrome?” Brunner syndrome causes fits of rage and is the product of just one gene. They instantly become indignant (fit of rage?) because Brunner syndrome is rare. That is true, but it proves that one rare gene can drive a “complex behavior,” so why can’t a few somewhat uncommon genes cause the somewhat uncommon behavior of violent delinquency? Or three genes and crime:
Labels:
biosocial criminology,
bloggers,
crime,
DAT1,
DRD2,
DRD4,
GWAS,
Hispanic,
Kevin Beaver,
MAOA
Saturday, August 27, 2011
Meet Towelie, the IQ Test of the Future
Popular culture warrants serious study. Take, for example, the 1991 movie Boyz N the Hood, from the short-lived black-whining-drama genre. The film was almost universally praised despite its predictability and clichés. A major plot point hinged upon the SAT, which used to stand for Scholastic Aptitude Test before it stood for Scholastic Assessment Test before it officially did not stand for anything. The omniscient, two-dimensional father figure in the movie gave us a pat summary of the SAT and IQ tests: “Most of those tests are culturally biased to begin with. The only part that is universal is the math.” However, data from the College Board disprove his assumption.
Below is the Black-White SAT score gap in standard deviations using College Board data that is available online from 1996 onwards and some additional data that Herrnstein and Murray obtained and published in The Bell Curve.
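For readers who want to check the units, the gaps in the chart are ordinary standardized mean differences. Below is a minimal sketch of that calculation in Python; the means, standard deviations, and participant counts are made up for illustration, not actual College Board figures.

```python
# A minimal sketch of a score gap in pooled standard deviation units.
# All numbers below are hypothetical, not actual College Board data.
import math

def standardized_gap(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Gap between two groups' mean scores in pooled-SD units."""
    pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical illustrative inputs: group means, SDs, and test-taker counts.
print(standardized_gap(mean_a=536, sd_a=102, n_a=737_000,
                       mean_b=429, sd_b=99, n_b=138_000))  # about 1.05 SD
```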
So, the “universal” math portion of the test produces a larger gap than the “culturally biased” verbal portion. The gap has clearly shrunk since the mid-’70s but is now flat and might even be slightly increasing. This could mean that racial egalitarianism has reached diminishing returns in Black college preparedness. A growing participant pool tends to put downward pressure on SAT scores because a rising proportion of people attending college adds students who would not have been prepared for college in previous eras. To test whether the growth of Black SAT participation is responsible for the lack of decline in the Black-White SAT gap, I added a bar graph of the percentage increase of Black participants minus the percentage increase of White participants.
This shows that the percentage of Black people taking the SAT has been increasing faster than the White percentage, but the size of these increases does not seem to clearly coincide with changes in the gap. Compare this to the gaps between Asians and Whites.
Asian participation in the SAT is also growing at an accelerated rate, but Asian scores are improving relative to Whites. Asians have reached parity with Whites on the verbal and written tests, while their math performance increasingly surpasses that of White students. Consequently, the notion that cultural bias influences scores would more logically support the conclusion that the strong performance of Asians still understates their potential than the conclusion that Black students are suffering from such a bias.
Perhaps in response to such data and the attention brought to it by the bestseller status of The Bell Curve, egalitarian psychologists proposed a concept in 1995 called “stereotype threat,” the idea that internalized stereotypes in test takers undermine performance. My initial reaction to this notion was that the egalitarians were making a desperate reach for an untestable hypothesis for the sake of plausible deniability. However, it appears that this notion has led to an interesting array of research and a number of applications beyond the Black-White score gap. They also traded relevance and subtlety to attain testability. After all, implying insults to Black students before a test is not exactly standard proctoring procedure.
Now that the evidence for a high heritability of intelligence is stronger, with corroborating lines of evidence, it is clear that the future of IQ testing will jettison the test experience altogether. In other words, the future of IQ testing will be examination of its physical underpinnings, including but not limited to genetic tests. A recent “debate” between two YouTube personalities featured an egalitarian who copied his questions from an interviewer speaking with controversial University of Western Ontario scientist JP Rushton. In the interview, the Black Al Jazeera journalist Rageh Omaar asked Rushton, “What then are the genes that determine intelligence? What are they? Can you name them?” The question was probably disingenuous, since it is easy enough to find out that genetic studies show no powerful IQ loci. However, the more up-to-date reply would be that naming most of them is possible, given time, because a recent study of 600,000 single nucleotide polymorphisms showed that they account for 51% of the variance in fluid intelligence. The study was not an exhaustive examination of human genome variation because many more differences remain to be studied among the less common single nucleotide polymorphisms, as well as copy number variants and variable numbers of tandem repeats. A more detailed understanding could come from studies like the one being led by University of Oregon scientist Steve Hsu, who is developing a genome-wide association study of people with an intelligence level of about three standard deviations above the mean (an IQ of 145) or higher. Tellingly, his study is supported by a third-world country that has a serious need for reform of its own scientific establishment. I would suggest that fear of understanding the genetics of IQ plays a role in the potential for China to overtake America in this area.
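To make the 51% figure concrete: “variance explained” means that a weighted sum of SNP genotypes statistically accounts for about half of the spread in test scores. Here is a toy simulation of that relationship, with the genetic share hard-coded near 51%; the counts of people and SNPs are arbitrary, and no real genetic data or effect sizes are involved.

```python
# Toy simulation of "variance explained": phenotype = genetic score + noise,
# with the genetic share set near the 51% figure cited above.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 5000, 600
genotypes = rng.binomial(2, 0.5, size=(n_people, n_snps))  # 0/1/2 allele counts
effects = rng.normal(0, 1, n_snps)                         # arbitrary SNP weights
g = genotypes @ effects
g = (g - g.mean()) / g.std()                               # standardized genetic score
noise = rng.normal(0, 1, n_people)
phenotype = np.sqrt(0.51) * g + np.sqrt(0.49) * noise      # 51% genetic variance
print(np.corrcoef(g, phenotype)[0, 1] ** 2)                # prints roughly 0.51
```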
Unlike twin studies, genome-wide association studies would advance race realism because most IQ variants probably affect IQ in all humans who carry them, and the distribution of these variants is not likely to be geographically uniform.
Therefore, the future of the popular critique of IQ will be less Boyz N the Hood and more Gattaca. In Gattaca, a space flight company surreptitiously obtains genetic information from its employees, making the film a depiction of a horrific dystopia in which unhealthy people are prevented from being astronauts. IQ geneticists need an equally powerful pop culture icon to humanize their research.
After the popping of the dot-com bubble, much speculation and excitement turned to nanotechnology. Drugs would be programmed to target pathology with no fear of side effects. Plates would be programmed to tell us about food content. Computers would be woven into garments. Self-replicating nanobots would perfect manufacturing and become a new type of pollution. Enter Towelie. Towelie first appeared on the program South Park ten years ago. He is an RG-100 Smart Towel designed by Tynacorp with a computer chip and a programmed “TNA” to sense a person’s body moisture, beat the average person at chess, and become a weapon of mass destruction, should he fall into the wrong hands. None of this mattered to the boys of South Park because they were busy trying to find their stolen Okama GameSphere, which Stan’s mom bought for “only $399.99.” (All in all, that is fairly useful advice for the computer industry, considering that it is coming from a cartoon.) I would nominate Towelie to be the mascot for tomorrow’s IQ tests. Towelie clones would isolate DNA from hair or skin cells, decode the genome, run an IQ gene algorithm, and wirelessly transmit an IQ range estimate. All liberal handwringing about IQ validity would be moot. If only we could get Towelie to lay off the weed.
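For what it is worth, the “IQ gene algorithm” part of Towelie’s job description is the least fantastical step. Below is a toy sketch of the idea: a polygenic score that sums allele counts weighted by per-variant effect sizes and reports an IQ range. The SNP names and effect sizes are entirely hypothetical, invented for illustration.

```python
# A toy sketch of an "IQ gene algorithm": a polygenic score summing allele
# counts weighted by per-SNP effect sizes, reported as an IQ range.
# The SNP ids and effects below are hypothetical; no real loci are implied.
EFFECTS = {"rs0000001": 0.4, "rs0000002": -0.3, "rs0000003": 0.2}

def iq_range(genotypes, mean=100.0, margin=10.0):
    """genotypes: dict of SNP id -> scored-allele count (0, 1, or 2)."""
    score = mean + sum(EFFECTS[snp] * count
                       for snp, count in genotypes.items()
                       if snp in EFFECTS)
    return (score - margin, score + margin)

print(iq_range({"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}))
# (90.5, 110.5)
```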
Labels:
BGI,
Charles Murray,
GWAS,
IQ,
SAT,
South Park,
Steve Hsu,
The Bell Curve,
Towelie
Wednesday, August 24, 2011
Community College Debate Resolves Issue of Race
(Part one of a continuing series.)
After millennia of controversies, passions, and subjugation, the issue of race was resolved over the weekend when a chain-smoking man with the equivalent of an associate degree won a BlogTV debate of over three hours against a former white nationalist who is part African-American. The winner of the debate, who goes by the pseudonym “Skeptical Heretic,” successfully used a debating style that was a cross between James Carville and Tony Soprano, despite a limited grasp of the primary subject, race and IQ. (This lack of proficiency was revealed in previous videos in which he confused anti-IQ polemicist Richard Nisbett with The Bell Curve co-author Charles Murray, and the neo-conservative think tank, the American Enterprise Institute, with the eugenics science non-profit, the Pioneer Fund.) Nevertheless, his opponent’s exasperated pauses and Al Gore-like sighs signaled to the approximately 300 viewers that the contest was decided, and doctors and scientists alike would no longer feel confident studying biological racial differences. Research oncologist Kathy Albain could not be reached for comment at the time of this writing, but presumably her groundbreaking investigations of the genetics of racial disparities in cancer mortality will be immediately shuttered.
Much of the victor’s case expectedly rested upon the common linguistic argument of the continuum fallacy applied to race, with the conclusion often summarized as “race is a social construct.” Anticipating this tack, I asked Heretic prior to the debate whether his disbelief in race extended to Neandertals. As I previously reported, Neandertals bore children with early non-African humans, and now people finally have access to a blood test that answers the eternal question, “Am I more Neanderthal than Ozzy Osbourne?” Thus, I was using the famous reductio ad absurdum argument to take his reductionist claims to their logical conclusion: Neandertals did not exist as a distinct group. Humanity is one! In fact, Heretic’s other major line of attack, commonly referred to as Lewontin’s fallacy, also has a rendition that applies to the distinction between humans and Neandertals, as University of Wisconsin-Madison professor John Hawks observed: “there are some human genes for which two human copies taken at random are more different from each other than one of them is from the Neandertal.”
To my inquiry, Heretic responded that Neandertals are not “established as an individual species or as a subspecies of modern mand [sic]. Right now, from what I can tell, the jury is still out, partially due to the mitochondrial observations.” For those unfamiliar, he is referring to the study by Green et al in 2008, which showed that humans and Neandertals do not share mitochondrial DNA. However, Neandertals cannot be a separate species by the classical definition of the term because Green et al in 2010 showed that Neandertals do share nuclear DNA with humans. I am not sure which deserves more blame for such a notable human as Heretic disowning his Neandertal relatives, public education or science reporting. Ironically, he cited a study to deconstruct race that also raised concerns about subspecies classifications for their “arbitrariness of criteria.”
During the debate, Heretic also disputed the validity and heritability of IQ. He never gave any indication that he had ever read a genome-wide association study on the topic (the first of which will celebrate its tenth anniversary next month) and admitted after the debate that he had not read Davies et al, which had shown a week prior that fluid intelligence has a lower-bound narrow-sense heritability of at least 51% in a Scottish population. (For the record, I personally left him a link to this study several days before the debate.)
Perhaps racial political correctness deserves a more qualified public intellectual. However, I plan to make analysis of his case a continuing series for the simple reason that it was bad enough to be good.
Tuesday, August 23, 2011
The Politically Incorrect Guide to Weight Loss
Currently I have something important in common with Arizona welfare mother Susanne Eman. We are both watching our weight. She is doing so in order to more than double her mass, so that she will weigh over 1,600 pounds and become the heaviest human in history. I, on the other hand, have the more modest goal of losing 40 pounds. I was pretty skinny most of my life, but after college my weight gradually rose until my BMI entered the low overweight range. I have been surprised at my success so far, so I thought I would share my tips and some provocative science.
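For reference, the “low overweight range” refers to the standard BMI cutoffs, where 25 to 30 counts as overweight. A quick sketch of the arithmetic, using hypothetical weight and height rather than my own numbers:

```python
# A quick sketch of the standard BMI arithmetic behind the "low overweight
# range" (BMI 25-30). The weights and height below are hypothetical.

def bmi(pounds, inches):
    """Body mass index from US units: 703 * lb / in^2."""
    return 703.0 * pounds / inches**2

current, goal = 215, 175  # hypothetical weights illustrating a 40-pound loss
height = 72               # hypothetical height in inches
print(round(bmi(current, height), 1))  # 29.2 -> overweight (25-30)
print(round(bmi(goal, height), 1))     # 23.7 -> normal (18.5-25)
```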
1. Love yourself, but do not accept your weight.
I have heard physiological arguments for fat acceptance. I do not doubt that weight has a complex endocrine regulation, and as obesity spreads, genetics influences who becomes obese first. However, personal psychology is what allows this shocking spread of obesity in America, which the CDC has illustrated with this animated map.
People tell themselves many excuses to relax their weight control expectations. Among these excuses are shifting blame to bad genes, previous pregnancy, or advancing age. I could help them by citing some additional causes of obesity. Race, work hours, stress, sleep, smoking, and various drugs affect weight. Some think that Helicobacter pylori eradication increased ghrelin production and, thereby, obesity. Two adenoviruses have been linked to obesity, but obesity primarily spreads like a virus because relaxation of weight control expectations is normative. Such was the conclusion of a 32-year Harvard study, which found that obesity increases the obesity risk of others in a social network out to three degrees of separation. Geographical distance had no effect, so the influence was exerted by knowledge of one’s associates becoming obese, not by living in the same place.
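“Three degrees of separation” just means friends, friends of friends, and friends of friends of friends. Here is a minimal sketch of how one would collect that set with a breadth-first search, over a made-up friendship graph invented purely for illustration:

```python
# A minimal sketch of "three degrees of separation": breadth-first search
# over a made-up friendship graph, collecting everyone within three hops
# of a given person. Illustration only.
from collections import deque

FRIENDS = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice", "erin"],
    "dave": ["bob", "frank"],
    "erin": ["carol"],
    "frank": ["dave", "grace"],
    "grace": ["frank"],
}

def within_degrees(graph, start, max_depth=3):
    """Return everyone reachable from start in at most max_depth hops."""
    seen, reached = {start}, set()
    queue = deque([(start, 0)])
    while queue:
        person, depth = queue.popleft()
        if depth == max_depth:
            continue  # do not expand beyond the hop limit
        for friend in graph.get(person, []):
            if friend not in seen:
                seen.add(friend)
                reached.add(friend)
                queue.append((friend, depth + 1))
    return reached

print(within_degrees(FRIENDS, "alice"))  # the five people within three hops
```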
Nothing better illustrates the truth of that study than the incredible success of Singapore’s “Trim and Fit” (TAF) program, which lasted from 1992 to 2007 and decreased childhood obesity from 14% to 9.5%. After its end, childhood obesity rose to 12%. Why would such a successful program come to an end? TAF received substantial criticism for using peer pressure to motivate children to lose weight and for stigmatizing children. A pair of studies (only one of which appears to have actually been published) claimed that TAF drove children to have eating disorders. Briefly, I should mention how remarkable the hysteria over anorexia nervosa is, given what we know about how rare and how heritable it is.
Another example of this dynamic is the success of fat acceptance in promoting obesity among African-American women. Whereas a third of non-Hispanic White women are obese, most non-Hispanic African-American women are obese, yet they think they are thinner than they are and are more satisfied with their size. This has forced the usually politically correct public health researchers to advocate “multifaceted intervention” to dispel these “overly optimistic beliefs.”
Clearly the feminist fat-acceptance movement is destroying lives, so I would be remiss to not thank Michelle Obama for her campaign against obesity. I find it very strange that conservatives have joined radical feminists in criticizing her just because she happens to be married to a Kenyan Muslim.
2. Buy a digital scale.
I had a spring scale that grossly underestimated my weight. Family practice doctors typically use balance beam scales, but those are expensive and less convenient than a simple digital scale. Digital scales are so precise that I can immediately see the consequences of my actions. If I overindulge at a meal, it shows. Water loss from a heavy workout shows. This feedback is great for overcoming a sense of weight inertia.
3. Do not play sports.
It should not be surprising that the country with the largest sports market has the highest obesity rate. Of course, some people have improved their health by playing sports, but I would advise against it for several reasons. First of all, I have met so many people who suffer from obesity and its co-morbidities and are helpless to do meaningful exercise because of an old sports injury. Australian researchers determined that, over a 2-week period, 5% of sports participants will suffer an injury. Injury prevents physical activity in 20% of 18-to-59-year-olds and 40% of those over 60. Sports injuries, therefore, contribute to obesity, which aggravates injuries. Many turn to prescription painkillers to mask the pain, leading to a downward spiral of opioid tolerance and addiction.
For those who avoid injury, sports can lead to obesity in other ways. American football actually encourages weight gain to produce massive players. In fact, many sports emphasize muscularity or other athletic qualities as more important than cardiovascular endurance. Sports participation is associated with excess alcohol consumption, which can cause obesity. Also, the mentality of sports deprives exercise of spontaneity. As people age, the infrastructure of organized sports leaves their lives, and they let their bodies go. Finally, sports create winners and losers. Those who repeatedly lose will feel discouraged from continuing, and those who always win might not feel a need to push themselves further.
4. Run instead.
When someone takes my pulse, they initially panic before they remember to ask me if I am a runner because runners have slow heart rates. Since heart disease is America’s number-one mass murderer, I find it comforting to know that my heart is not working too hard.
I suppose I should make the case for exercise itself before I further try to sell running. The straw-man critique of exercise, laid out last year in The Guardian, is that exercise alone does not reduce weight because people tend to compensate or reward themselves with additional calories or time of inactivity. So, while some might react to such an article by thinking that exercise is worthless, the studies it cited actually called for coupling exercise with dietary changes and increasing exercise intensity. One of the studies actually emphasized health benefits of exercise other than weight loss, including improved blood pressure and reduced abdominal fat.
So, here is my case for running. Running might be difficult for some people in inhospitable climates, but for the rest of us, it is a simple, time-efficient means to a lifelong exercise regime. Running is also a fun, stress-relieving activity that allows one time to think. At the end of a run, I have a renewed sense of well-being.
My routine is a 6-mile distance that allows me to minimize contact with traffic while still reaching a park for water, restroom access, and equipment suitable for toning exercises like pull-ups. USA Track and Field allows you to create a route map and measure its distance or look at others’ maps, which can include markers for water fountains.
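For the curious, route-measuring tools of this kind boil down to summing great-circle distances between waypoints. Here is a minimal sketch using the haversine formula; the coordinates are made up for illustration.

```python
# A minimal sketch of measuring a route's length from GPS waypoints with
# the haversine (great-circle) formula. Coordinates below are made up.
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance in miles between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

# A hypothetical three-waypoint route; sum the leg distances.
route = [(40.0000, -75.0000), (40.0200, -75.0100), (40.0400, -75.0000)]
total = sum(haversine_miles(*a, *b) for a, b in zip(route, route[1:]))
print(round(total, 2))  # total route distance in miles
```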
I realize that some people believe in swimming because it puts less strain on joints, but I have heard conflicting data on whether running really is bad for joints. Running is certainly good for bone strength, but, above all, I think running is more conducive to an intense, enjoyable workout. When I swim, I am tempted to just have fun rather than do laps, and I cannot lose myself as I can running because I would bump my noggin.
5. Music helps.
I do not believe in sweating to the oldies, folk, jazz, country, or Rebecca Black. If you decide to take up running, I challenge you to channel your inner Kurt Cobain. I would rank the motivation and enjoyment that music provides as a more important component of running than shoe type, and some forms of music have great artistic merit but do not fit a running habit. I generally prefer two styles of running music: energetic release music and runner’s high music. For the former, I need the uninhibiting verve of punk-inspired angst anthems and mechanized industrial torture devices. For the latter, I look for the atmospheric and sublime, like a good Radiohead song or an electronic instrumental tapestry. Of course, every athletic badass has a secret pop smash hit somewhere on his mp3 player.
I would recommend Bluetooth headphones for running, rather than earbuds. Some mp3 players do not support Bluetooth without an adapter, but Insignia used to make decent Bluetooth mp3 players, which can be bought used for very little.
6. Do not run marathons.
I know a number of very healthy people who strongly believe in running marathons. Some run so many marathons that preparing for them is their typical running regimen. I do not share this enthusiasm for marathons because I think they push some people to treat running as an occasion for an accomplishment rather than a steady exercise habit. Some studies show that marathon running is perfectly safe, but one MRI study found evidence of cartilage damage.
7. Exercise controlled shopping.
I have a sweet tooth. I love rich foods. Therefore, I am keenly aware that willpower starts at the supermarket. Many people who struggle with their weight unfortunately decorate their homes with bright advertisements for unhealthy snacks on the sides of bulk packages. Instead of conditioning oneself to sugar highs like Pavlov’s dog, one should shop to bolster willpower by purchasing fruit and sugar-free gum.
8. Do not overthink dieting.
Debates over diet types among advocates with vested interests are startlingly ugly. Any diet that truly decreases calorie intake will facilitate weight loss, and I think most people have a decent conception of the difference between healthy and unhealthy foods. Whole grains are better than refined. Soda increases weight, and I cannot understand why anyone ever drinks it. Fish is a good source of healthy fats. My problem with dieting plans is that they make people obsess about eating and create an illusion that picking the right diet will make weight loss easy. Eat less. Focus on something other than food.
9. Skip meals.
I believe in skipping meals because it has been helping me lose weight, but I am aware that nutritional studies warn against skipping meals and suggest benefits from small, frequent meals. These studies might be flawed by poor reporting of eating habits. The important factor is overall calorie intake and nutritional value. If breakfast is one’s most healthy meal, then skipping it could cause weight elevation, especially if one compensates with a larger dinner. I think a lot depends on how a person can best control his or her own appetite. I have found that by not eating for extended periods, hunger pangs decrease in intensity. People who let their bodies go will often develop diabetes, which requires them to control their blood sugar, making meal-skipping infeasible.
An extension of the fat acceptance movement is the contrarian view that weight is not important for health. This view flies in the face of recent progress in calorie restriction research, such as a new study showing that calorie restriction lowers core body temperature, which is believed to promote longevity. However, most people would not want to join the Calorie Restriction Society. A new study seemed to support the contrarian view when it found few lifestyle differences between a control group and people who have lived past 95. Although overweight women were significantly more likely to have longevity, long-lived men and women tended not to be obese, and had more men over 95 been in the study, perhaps their increased likelihood of being normal weight would have reached significance. A second recent study corroborated that weight control is especially important for men. In fact, the optimal male body mass index for reduced mortality was lower than the threshold for being overweight.
Finally, I think it is worth confronting the masculine ideal of the husky, thick-necked man. Some men might have a substantial neck circumference due to heavy muscularity, but that criticism is just as valid for body mass index. I think neck thickness is more associated with masculinity among the lower classes. Doctors are now considering neck circumference as an alternative to body mass index as an indicator of cardiovascular health.
Friday, August 19, 2011
Why YouTube Sucks
Before I started blogging here, I created videos on YouTube. It was kind of fun, and I was proud of the results. At the time, YouTube advertised videos by using a snapshot from the exact middle of the video, so I created a video detailing the scientific evidence that lesbianism is likely not biological in the way that male homosexuality is biological, but I advertised it as relating to a famous bisexual celebrity, Tila Tequila. The result was that about 70,000 people saw my video, and while no one did much to debate me on the science, I did have some interesting discussions with lesbians, and I may have convinced a few to question their sexuality again. Actually, I prefer to remember it as the time I reduced the world’s lesbian population by about 70,000. Another accomplishment of mine was that I think I coined the phrase “education bubble” in a video I created in 2007 or 2008 about higher education. Some anonymous people started asking me for my sources just prior to a series of mainstream journalism pieces speculating about the possibility of a higher education bubble. Such a bubble could have a profound impact on our economy and our society.
Most of my work on YouTube came to an end when I posted a video series on the genetics of black violence. Certainly it is a provocative title and subject, but the video contained nothing that I would consider legitimately controversial. It consisted almost entirely of direct quotations from peer-reviewed scientific literature on dopamine genes and monoamine oxidase A (MAOA). One person gave me feedback that it was too difficult to follow for this reason, but I wanted to help educate the public without using sources that anyone could impugn. I have taken pains in all of my work not only to cite science directly, but also to rely on scientists whose motivations and politics are not lambasted as are those of, for example, University of Western Ontario researcher J Philippe Rushton. For entertainment value and to highlight or counter the opposing viewpoint, I peppered the video with excerpts from liberal documentaries and television programs, including parts of the movie Bowling for Columbine by Michael Moore. Now, YouTube is wholly owned by Google, and all of the studies I used can be accessed through Google Scholar. However, YouTube saw fit to remove my video. More to the point, YouTube removed me. My “channel,” all of my videos, and every single comment that I had posted on any video anywhere on YouTube immediately vanished forever. In response, I reposted each of the three parts of the genetics of black violence video on three separate channels using a mix of L’s and I’s for channel names to complicate anyone’s attempt to remove them again. Those videos remain on YouTube and have received over 27,000 views, which is not bad for a hidden, extensive scroll of esoteric scientific jargon read by a computerized voice.
The view that YouTube tried to snuff out has become a burgeoning subfield of criminology, called biosocial criminology, which recently received a panel discussion at the annual National Institute of Justice conference, to which the New York Times chose to draw attention. In less academically rigorous news, the MAOA gene became the star of a sensational National Geographic documentary and an episode of the Dr. Phil television program.
What is obvious and hardly needs mentioning is that, even without the extreme, draconian censorship, YouTube would still have a dimwitted milieu. It is dominated by those with enough free time to express themselves in video format and with the kind of career prospects that do not demand anonymity when approaching taboo subjects. Two such individuals plan to stage a live debate this weekend on race and IQ, and I can only expect wreckage. YouTube comments, in particular, tend to be dominated by idiotic insults, forcing the video owners to choose between joining in the censorship or allowing endless bickering that would drive away anyone with a three-digit IQ.
Thursday, August 18, 2011
A Pretty College Girl Promotes Race Realism
The intellectual bastion of YouTube has been rocked by a shocking discovery: a pretty, young woman who believes in race. Anti-racists fumbled nervously in damage-control mode, as they attempted to persuade the YouTube community that this woman was not, in fact, attractive, her videos notwithstanding. They further explained that if too many hot women became Sarah Palin-like activists for such a paradigm, the outcome would be forced sterilizations and genocidal atrocities.
The weakness of the ten’s arguments only further evinced the fact that she was indeed a hottie unaccustomed to the rigors of needing to persuade with words. She flicked her hair as she asked races to move into separate countries, which immediately resulted in a ten-point approval bump for segregation, according to Rasmussen.
After the initial shock, anti-racists put together a list of talking points about races not being real, except for the Neandertal race, or something. Anyway, about a third of the way through her video, she tilted her head and smiled in a very feminine manner. I shall be live-blogging her interview on CNN’s Piers Morgan Tonight.
Tuesday, August 16, 2011
The Genetics of Violence
This page will serve as a permanent index of all of my writings and videos on the genetics of violence and biosocial criminology.
The Post-Racial Era is Over – January 8, 2010
Deus ex Machina Genetics – January 8, 2010
No Amore for MAOA from Maori – January 8, 2010
Epistemology & Endocrinology – January 8, 2010
The Racial Controversy of a Violent Gene – March 4, 2011
Revealed: “nooffensebut” is Walter Ward, Floridian retiree. – March 6, 2011
A Picture is Worth a Thousand Blogs – August 30, 2011
Pulling the Empty Chair on Dr. Kevin Beaver – September 6, 2011
Attack of the Warrior Gene Babies – September 13, 2011
Kill Popular Science – October 13, 2011
Blacks with Bullets Embedded in Bone – December 2, 2011
Just Say No Limit: Trayvon, Dextromethorphan, Marijuana, and MAOA – July 5, 2012
Scientists Rediscover the Violence Gene, MAOA-2R – December 30, 2012
Monoamine Oxidase A Bibliography – January 13, 2013
The Stupid Stupidity Surrounding the Warrior Gene, MAOA, is Stupid – December 8, 2013
Dr. Kevin Beaver the Apostle – December 23, 2013
The Warrior Gene, Back from the Grave – July 8, 2014
Christopher Irwin Smith is an Idiot – July 10, 2014
The Alondra Oubré Academic Fraud Exposed – August 3, 2014
Correcting the Critics of Nicholas Wade & MAOA – August 24, 2014
The Genetics of Black Violence – May 26, 2009
Part 1
Part 2
Part 3
Labels:
AR,
biosocial criminology,
cortisol,
DAT1,
DRD2,
DRD4,
estradiol,
genetics,
genetics of violence,
GR,
hormones,
HTTLPR,
IQ,
Kevin Beaver,
MAOA,
psychiatry,
race,
testosterone,
Warrior Gene