Category Archives: Nutrition
I enjoy reading what other bloggers write about testosterone and its supposed link to crime, aggression, and prostate cancer. I used to believe some of the same things they do, since I didn’t have a good understanding of the hormone or how the body produces it. Once you understand how it’s produced, though, much of what others say about it will seem like bullshit—because it is. I’ve recently read a few articles on testosterone from the HBD-blog-o-sphere and, of course, they contain a lot of misconceptions; some even cite studies I have used myself on this blog to show that testosterone does not cause crime! Most people don’t read the studies that are linked, so they take claims at face value: there’s a cite, so what he’s saying must be true, right? Wrong. I will begin by reviewing an article by someone at The Alternative Hypothesis and then review one article from Robert Lindsay on testosterone.
The Alternative Hypothesis
Faulk has great stuff here, but whoever wrote the article Testosterone, Race, and Crime 1) doesn’t know what he’s talking about and 2) clearly didn’t read the papers he cited. Read the article and you’ll see him make bold claims using studies I have used for my own arguments that testosterone doesn’t cause crime! Let’s take a look.
One factor which explains part of why Blacks have higher than average crime rates is testosterone. Testosterone is known to cause aggression, and Blacks are known to at once have more of it and, for genetic reasons, to be more sensitive to its effects.
- No it doesn’t.
- “Testosterone is known to cause aggression,” but that’s the thing: it is only assumed that it causes aggression; it really doesn’t.
- Evidence is mixed on blacks being “… for genetic reasons … more sensitive to its effects” (Update on Androgen Receptor gene—Race/History/Evolution Notes).
Testosterone activity has been linked many times to aggression and crime. Meta-analyses show that testosterone is correlated with aggression among humans and non human animals (Book, Starzyk, and Quinsey, 2001).
Why doesn’t he say what the correlation is? It’s .14, and Archer, Graham-Kevan, and Davies (2005) reanalyzed the studies used in that meta-analysis and found the correlation to be .08. Citing the correlation’s existence while omitting its size is a dishonest statement.
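Those correlation sizes are worth dwelling on. Squaring a Pearson correlation gives the share of variance in one variable statistically accounted for by the other; here is a quick sketch using the r values reported in the two meta-analyses above:

```python
# r**2 (the coefficient of determination) is the fraction of variance in
# aggression that covaries with testosterone at a given Pearson correlation r.
def variance_explained(r: float) -> float:
    return r ** 2

# Correlations reported in the meta-analyses discussed above.
for study, r in [
    ("Book, Starzyk, and Quinsey (2001)", 0.14),
    ("Archer, Graham-Kevan, and Davies (2005) reanalysis", 0.08),
]:
    print(f"{study}: r = {r:.2f} -> {variance_explained(r):.1%} of variance")
```

At r = .14 testosterone tracks about 2 percent of the variance in aggression, and at r = .08 well under 1 percent, which is exactly why quoting “testosterone is correlated with aggression” without the number is misleading.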
Women who suffer from a disease known as congenital adrenal hyperplasia are exposed to abnormally high amounts of testosterone and are abnormally aggressive.
Abnormal levels of androgens in the womb are associated with aggression in girls with CAH, while boys with and without CAH are similar in aggression/activity level (Pasterski et al, 2008). Yet black women, for instance, don’t have higher levels of testosterone than white women (Mazur, 2016). CAH girls show masculinized behavior, but testosterone doesn’t cause the aggression (see Archer, Graham-Kevan, and Davies, 2005).
Actually, no. Supraphysiological levels of testosterone administered to men (200 and 600 mg weekly) did not increase aggression or anger (Batrinos, 2012).
Finally, people in prison have higher than average rates of testosterone (Dabbs et al., 2005).
Dabbs et al don’t untangle correlation from causation. Environmental factors can explain the higher testosterone levels of inmates (Mazur, 2016), and even then, some studies show that socially dominant but unaggressive men have the same testosterone levels as aggressive men (Ehrenkranz, Bliss, and Sheard, 1974).
Thus, testosterone seems to cause both aggression and crime.
No, it doesn’t.
Furthermore, the studies I could find on testosterone in Africans show they have lower levels than Western men (Campbell, O’Rourke, and Lipson, 2003; Lukas, Campbell, and Ellison, 2004; Campbell, Gray, and Ellison, 2006). Along with the studies and articles cited above on testosterone, aggression, and crime, that’s another huge blow to the testosterone/crime/aggression hypothesis.
Richard et al. (2014) meta-analyzed data from 14 separate studies and found that Blacks have higher levels of free floating testosterone in their blood than Whites do.
They showed that blacks had 2.5 to 4.9 percent higher testosterone than whites, which could not explain the higher prostate cancer incidence (an incidence gap that meta-analyses call into question; Sridhar et al, 2010; Zagars et al, 1998). A difference that modest would not be enough to cause differences in aggression either.
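To see how little a 2.5-4.9 percent mean difference amounts to, here is a rough sketch. The mean of 600 ng/dL and SD of 200 ng/dL are illustrative assumptions on my part, not figures from Richard et al; the point is only the order of magnitude of the standardized effect:

```python
from statistics import NormalDist

# Hypothetical distribution of testosterone in the reference group:
# mean 600 ng/dL, SD 200 ng/dL (illustrative values, not from the paper).
mean_ref, sd = 600.0, 200.0

for pct in (0.025, 0.049):  # the 2.5% and 4.9% differences reported
    shift = mean_ref * pct
    d = shift / sd  # Cohen's d for the mean difference
    # Overlapping coefficient of two equal-SD normal distributions separated by d:
    overlap = 2 * NormalDist().cdf(-d / 2)
    print(f"{pct:.1%} higher mean -> d = {d:.2f}, distributions overlap ~{overlap:.0%}")
```

Even at the high end the two distributions overlap almost entirely, so a gap this size cannot plausibly drive large group differences in behavior.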
Exacerbating this problem even further is the fact that Blacks are more likely than Whites to have low repeat versions of the androgen receptor gene. The androgen reception (AR) gene codes for a receptor by the same name which reacts to androgenic hormones such as testosterone. This receptor is a key part of the mechanism by which testosterone has its effects throughout the body and brain.
The rest of the article talks about CAG repeats and aggressive/criminal behavior, but it seems that whites have fewer CAG repeats than blacks.
Robert Lindsay
This one is much more basic, and tiring to rebut, but I’ll do it anyway. Lindsay has a whole slew of articles on testosterone on his blog that show he doesn’t understand the hormone, but for now I’ll just address this one: Black Males and Testosterone: Evolution and Perspectives.
It was also confirmed by a recent British study (prostate cancer rates are somewhat lower in Black British men because a higher proportion of them have one White parent)
Jones and Chinegwundoh (2014) write: “Caution should be taken prior to the interpretation of these results due to a paucity of research in this area, limited accurate ethnicity data, and lack of age-specific standardisation for comparison. Cultural attitudes towards prostate cancer and health care in general may have a significant impact on these figures, combined with other clinico-pathological associations.”
This finding suggests that the factor(s) responsible for the difference in rates occurs, or first occurs, early in life. Black males are exposed to higher testosterone levels from the very start.
In a study of women in early pregnancy, Ross found that testosterone levels were 50% higher in Black women than in White women (MacIntosh 1997).
I used to believe this, but it’s much more nuanced than that. Black women don’t have higher levels of testosterone than white women (Mazur, 2016), and even then, Lindsay fails to point out that the MacIntosh study was of pregnant women.
According to Ross, his findings are “very consistent with the role of androgens in prostate carcinogenesis and in explaining the racial/ethnic variations in risk” (MacIntosh 1997).
Testosterone has been hypothesized to play a role in the etiology of prostate cancer, because testosterone and its metabolite, dihydrotestosterone, are the principal trophic hormones that regulate growth and function of epithelial prostate tissue.
Testosterone doesn’t cause prostate cancer (Stattin et al, 2003; Michaud, Billups, and Partin, 2015). Diet explains any risk that may be there (Hayes et al, 1999; Gupta et al, 2009; Kheirandish and Chinegwundoh, 2011; Williams et al, 2012; Gathirua-Mwangi and Zhang, 2014). However, in a small population-based study of blacks and whites from South Carolina, Sanderson et al (2017) “did not find marked differences in lifestyle factors associated with prostate cancer by race.”
Regular exercise, however, can decrease PCa incidence in black men (Moore et al, 2010). A lot of these differences can be at least partly ameliorated by environmental interventions such as diet and exercise.
Many studies have shown that young Black men have higher testosterone than young White men (Ellis & Nyborg 1992; Ross et al. 1992; Tsai et al. 2006).
Ellis and Nyborg (1992) found a 3 percent difference. Ross et al (1992) has the same problem as Ross et al (1986): both used university students (~50) for their samples, who are not representative of the population. Ross et al (1992) also write:
Samples were also collected between 1000 h and 1500 h to avoid confounding by any diurnal variation in testosterone concentrations.
Testosterone levels should be measured close to 8 am, so this study suffers from the same time-of-day confound, and I don’t take it seriously for that reason. Assays were collected “between” the hours of 10 am and 3 pm, which means whenever was convenient for the student. There were no controls on activities and no attempt to assay at 8 am. Subjects of any racial group could have come in at any time in that 5-hour window and skewed the results. Assaying “between” those times completely defeats the purpose of the study.
This advantage [the so-called testosterone advantage] then shrinks and eventually disappears at some point during the 30s (Gapstur et al., 2002).
This makes it very difficult if not impossible to explain differing behavioral variables, including higher rates of crime and aggression, in Black males over the age of 33 on the basis of elevated testosterone levels.
See above where I talk about crime/testosterone/aggression.
Critics say that more recent studies done since the early 2000’s have shown no differences between Black and White testosterone levels. Perhaps they are referring to recent studies that show lower testosterone levels in adult Blacks than in adult Whites. This was the conclusion of one recent study (Alvergne et al. 2009) which found lower T levels in Senegalese men than in Western men. But these Senegalese men were 38.3 years old on average.
Alvergne, Fauri, and Raymond (2009) show that the differences are due to environmental factors:
This study investigated the relationship between mens’ salivary T and the trade-off between mating and parenting efforts in a polygynous population of agriculturists from rural Senegal. The men’s reproductive trade-offs were evaluated by recording (1) their pair-bonding/fatherhood status and (2) their behavioral profile in the allocation of parental care and their marital status (i.e. monogamously married; polygynously married).
They also controlled for age, so his statement “But these Senegalese men were 38.3 years old on average” is useless.
These critics may also be referring to various studies by Sabine Rohrmann which show no significance difference in T levels between Black and White Americans. Age is poorly controlled for in her studies.
That is one study out of many that I referenced. Rohrmann et al (2007) controlled for age. He literally just asserts that “age is poorly controlled for in her studies,” when she did control for age.
That study found that more than 25% of the samples for adults between 30 and 39 years were positive for HSV-2. It is likely that those positive samples had been set aside, thus depleting the serum bank of male donors who were not only more polygamous but also more likely to have high T levels. This sample bias was probably worse for African American participants than for Euro-American participants.
Why would they use diseased samples? Do you even think?
Young Black males have higher levels of active testosterone than European and Asian males. Asian levels are about the same as Whites, but a study in Japan with young Japanese men suggested that the Japanese had lower activity of 5-alpha reductase than did U.S. Whites and Blacks (Ross et al 1992). This enzyme metabolizes testosterone into dihydrotestosterone, or DHT, which is at least eight to 10 times more potent than testosterone. So effectively, Asians have the lower testosterone levels than Blacks and Whites. In addition, androgen receptor sensitivity is highest in Black men, intermediate in Whites and lowest in Asians.
Ethnicmuse also showed that, contrary to popular belief, Asians have higher levels of testosterone than Africans who have higher levels of testosterone than Caucasians in his meta-analysis. (Here is his data.)
Let us look at one study (Ross et al 1986) to see what the findings of a typical study looking for testosterone differences between races shows us. This study gives the results of assays of circulating steroid hormone levels in white and black college students in Los Angeles, CA. Mean testosterone levels in Blacks were 19% higher than in Whites, and free testosterone levels were 21% higher. Both these differences were statistically significant.
Assay times were between 10 am and 3 pm, the sample of college men was unrepresentative, and there was no control for waist circumference. Horrible study.
A 15% difference in circulating testosterone levels could readily explain a twofold difference in prostate cancer risk.
No, it wouldn’t (if it were true).
Higher testosterone levels are linked to violent behavior.
Again, correlation is not untangled from causation.
Studies suggest that high testosterone lowers IQ (Ostatnikova et al 2007). Other findings suggest that increased androgen receptor sensitivity and higher sperm counts (markers for increased testosterone) are negatively correlated with intelligence when measured by speed of neuronal transmission and hence general intelligence (g) in a trade-off fashion (Manning 2007).
Who cares about correlations? Causes matter more, and high testosterone doesn’t lower IQ.
Racial differences in testosterone don’t exist or are extremely small in magnitude (as I’ve covered countless times). The article from TAH literally misrepresents studies and leaves out important figures on the testosterone differences between the two races to push a certain agenda; if you read the studies, you see something completely different. It’s the same with Lindsay: he misunderstood a few studies to push his agenda about testosterone, crime, and prostate cancer. They’re both wrong.
Racial differences in testosterone are tiring to talk about now, but there are still a few more articles I need to rebut. People read and write about things they don’t understand, which, along with misinterpreting studies, is the cause of these misconceptions about the hormone. Learn about the hormone and you won’t fear it. It doesn’t cause crime, prostate cancer, or aggression. The people who write these articles have one idea in their head and they just go with it; they don’t understand the intricacies of the endocrine system and how sensitive it is to environmental influence. I will cover more articles that others have written on testosterone and aggression to point out what they got wrong.
In part II, we will look at someone who is clueless about the data and uses whatever mental gymnastics possible to deny it. Well, shit doesn’t work like that, JayMan. I will review yet more studies on the effects of sitting, walking, and dieting on mortality, as well as behavioral therapy (BT) in regards to obesity. JayMan has removed two of my comments, so I assume the discussion is over. Good thing I have a blog so I can respond here; censorship is never cool. JayMan pushes very dangerous things, and they need to be nipped in the bud before someone takes this ‘advice’ who could really benefit from lifestyle alterations. Stop giving nutrition advice without credentials! It’s that simple.
JayMan published a new article on ‘The Five Laws of Behavioral Genetics‘ with this little blurb:
Indeed, we see this with health and lifestyle: people who exercise more have fewer/later health problems and live longer, so naturally conventional wisdom interprets this to mean that exercise leads to health and longer life, when in reality healthy people are driven to exercise and have better health due to their genes.
So, in JayMan’s world diet and exercise have no substantial impact on health, quality of life and longevity? Too bad the data says otherwise. Take this example:
Take two twins. Lock both of them in a metabolic chamber. Monitor them over their lives and they do not leave the chamber. They are fed different diets (one has a high-carb diet full of processed foods, the other a healthy diet for whatever activity he does); one exercises vigorously/strength trains (not on the same day though!) while the other does nothing and the twin who exercises and eats well doesn’t sit as often as the twin who eats a garbage diet and doesn’t exercise. What will happen?
JayMan then shows me Bouchard et al (1990), in which a dozen pairs of twins were overfed for three months, with each pair showing different weight gains despite being fed the same number of kcal. He also links to Bouchard et al (1996) (I can’t find the paper; the link on his site is dead), which shows that the twins returned to their pre-experiment weight almost effortlessly. This, of course, I do not deny.
This actually replicates a study done on prisoners in a Vermont prison (Salans, Horton, and Sims, 1971). “The astonishing overeating paradox” is something that’s well worth a look. Salans et al had prisoners overeat and also limited their physical activity. They started at 4,000 kcal per day and by the end of the study were eating about 10,000 kcal per day. But something weird happened: their metabolisms revved up by 50 percent in an attempt to get rid of the excess weight. After the study, the prisoners effortlessly returned to their pre-experiment weight, just like the twins in Bouchard et al’s studies.
The finding is nothing new, but it’s nice to have replication (on top of the replication it already had). That’s not what I was talking about, though. Of course being sedentary, eating like shit, and not exercising will lead to deleterious health outcomes. The fact of the matter is, the twin in my thought experiment who did not exercise, sat around all day, and ate whatever he wanted would die far sooner, have a lower quality of life, and suffer more disease from the shitty diet, while his co-twin would fare better since he ate right, exercised, and spent less time sitting.
JayMan says, in regards to studies showing that obese people who do even light physical activity have lower all-cause mortality, that “That’s not what large RCTs show.” I know the study he’s speaking of—the Look AHEAD (Action for Health in Diabetes) study (The Look AHEAD Research Group, 2009). The research group studied the effects of lifestyle interventions in type II diabetics. One group was given intensive diet and exercise counseling, the other only standard advice. However, the study was ended early, at 9.3 years, because there was no difference between the groups (Pi-Sunyer, 2015). JayMan uses this study as evidence that diet and exercise have no effect on the mortality of type II diabetics; in actuality, the results are much more nuanced.
Annuzzi et al (2014) write in their article The results of Look AHEAD do not row against the implementation of lifestyle changes in patients with type 2 diabetes:
The intervention aimed at weight loss by reducing fat calories, and using meal replacements and, eventually, orlistat, likely underemphasizing dietary composition. There is suggestive evidence, in fact, that qualitative changes in dietary composition aiming at higher consumption of foods rich in fiber and with a high vegetable/animal fat ratio favorably influence CV risk in T2D patients.
In conclusion, the Look AHEAD showed substantial health benefits of lifestyle modifications. Prevention of CV events may need higher attention to dietary composition, contributing to stricter control of CV risk factors. As a better health-related quality of life in people with diabetes is an important driver of our clinical decisions, efforts on early implementation of behavioral changes through a multifactorial approach are strongly justified.
They reduced fat calories and used meal replacements. This is the trial on which JayMan hangs his assertion. Type II diabetics need a higher-fat diet and don’t need the carbs, which spike their insulin. Eating a higher-fat diet will also lower the rate of CVD. This trial wasn’t very rigorous in terms of macronutrient composition, which is one of many reasons why type II diabetics should not discard dieting and exercise just yet.
Even modest weight loss of 5 to 10 percent is associated with significant improvements in cardiovascular disease (CVD) risk factors after one year, with larger weight losses showing better improvement (Wing et al, 2011). (Also read the article The Spinning of Look AHEAD.)
Telling diabetics not to eat right and exercise is, clearly, a recipe for disaster. This canard that dieting and exercise don’t decrease all-cause mortality is dangerous, especially for diabetics and others who need the lifestyle interventions.
Intentional weight loss needs to be separated from unintentional weight loss so that the effects of both variables can be studied. Kritchevsky et al (2015) meta-analyzed 15 RCTs that “reported mortality data either as an endpoint or as an adverse event, including study designs where participants were randomized to weight loss or non-weight loss, or weight loss plus a co-intervention (e.g. weight loss plus exercise) or the weight stable co-intervention (i.e. exercise alone).” They conclude that the risk of all-cause mortality in obese people who intentionally lose weight is 15 percent lower than in people not assigned to lose weight.
This replicates a meta-analysis by Harrington, Gibson, and Cottrell (2009) on weight loss and all-cause mortality. They noted that in unhealthy adults, weight loss accounted for a 13 percent decrease in all-cause mortality, while in the obese it accounted for a 16 percent decrease. Since the weights were self-reported and there are problems with self-reports of weight (Mann et al, 2007), a skeptic can rightfully bring that up. However, it would not be a problem here, since misreporting would imply that subjects weighed the same or more yet still showed a decrease in all-cause mortality.
Even light physical activity is associated with a decrease in all-cause mortality. People who go from no activity to 2.5 hours a week of moderate-intensity activity show a 19 percent decrease in all-cause mortality, while people who do 7 hours a week of moderate activity show a 24 percent decrease (Woodcock et al, 2011). Even something as simple as walking is associated with lower all-cause mortality, with the largest effect seen in individuals who go from no activity to light walking. Walking is inversely associated with disease incidence (Hamer and Chida, 2008), though their analysis indicated publication bias, so further study is needed. Nevertheless, the results line up with what is already known: low-to-moderate exercise is associated with lower all-cause mortality (as seen in Woodcock et al, 2011).
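To make those relative reductions concrete, here is a back-of-the-envelope conversion to absolute rates. The baseline of 10 deaths per 1,000 person-years is a made-up illustrative figure, not one from the cited papers:

```python
# Convert the relative risk reductions quoted above into absolute rates,
# against a hypothetical baseline of 10 deaths per 1,000 person-years.
BASELINE = 10.0  # illustrative baseline, not from the cited studies

reductions = {
    "intentional weight loss (Kritchevsky et al, 2015)": 0.15,
    "2.5 h/week moderate activity (Woodcock et al, 2011)": 0.19,
    "7 h/week moderate activity (Woodcock et al, 2011)": 0.24,
}

for intervention, reduction in reductions.items():
    rate = BASELINE * (1 - reduction)
    print(f"{intervention}: {BASELINE:.1f} -> {rate:.1f} deaths per 1,000 person-years")
```

Whatever the true baseline, relative reductions of 15 to 24 percent are large effects for interventions as cheap as walking.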
What is needed to change habits and behavior is behavioral therapy (BT) (Jacob and Isaac, 2012; Butryn, Webb, and Wadden, 2012; Wilfley, Kolko, and Kass, 2012). BT can also be used to increase adherence to exercise (Grave et al, 2011). BT has been shown to produce great outcomes in the behaviors of obese people: even when little or no weight loss (or only the 5-10 percent from Wing and Hill, 2001) is seen, better habits can be developed, and along with ‘training’ hunger hormones through lifestyle changes such as fasting, people can achieve better health and longevity, despite what naysayers may say. I am aware that outside of clinics and facilities BT does not have a good track record (Foster, Makris, and Bailer, 2005); however, BT is the most studied and effective intervention for managing obesity at present (Levy et al, 2007). This is why people need to join gyms and exercise around other people: they will get encouragement and can talk to others about their difficulties. People like JayMan who have no personal experience doing this would not understand it.
In regards to dieting, the effect of macronutrient composition on blood markers is well known. Type II diabetics need to eat a certain diet to manage their insulin/blood sugar, and doing the opposite of those recommendations will lead to disaster.
Low-carb ketogenic diets are best for type II diabetics. The benefits of having ketones circulating in the blood include (but are not limited to) weight loss, improved HbA1c levels, a reduced rate of kidney disease/damage, cardiac benefits, reversal of non-alcoholic fatty liver, and improvements in elevated insulin and abnormal blood cholesterol levels (Westman et al, 2008; Azar, Beydoun, and Albadri, 2016; Noakes and Windt, 2016; Saslow et al, 2017). These benefits, of course, carry over to the general non-diabetic population as well.
Of course, JayMan has reservations about these studies and wants to see follow-ups, but the fact of the matter is this: dieting and eating right are associated with good blood markers, exactly what type II diabetics want. In regards to food cravings, read this relevant article by Dr. Jason Fung: Food Cravings. Contrary to JayMan’s beliefs, it’s 100 percent possible to manage food cravings and hunger. The hormone ghrelin mediates hunger, and ghrelin varies over the course of the day (Natalucci et al, 2005), so if you’re feeling hungry, waiting a bit will often let the feeling pass. This lines up with most people’s personal experience of hunger. One would have to understand how the brain regulates appetite to know this, though.
JayMan also cannot answer simple yes-or-no questions such as: are you saying that people should not watch what they eat and should not make an effort to eat higher-quality foods? I don’t know why he is so anti-physical-activity. As if it’s so bad to get up, stop sitting so much, and do some exercise! People with more muscle mass and higher strength levels live longer (Ruiz et al, 2008). This anti-physical-activity crusade makes absolutely no sense given the data. If I were to stop eating well and strength training and become a couch potato, would my chance of dying early from a slew of maladies increase? Anyone who uses basic logic would infer that the answer is yes.
I also need to address JayMan’s last comment to me which he censored:
No intervention shows that lifestyle changes extend life – or even improve health. Even if they did, their generalizability would depend on their actual prescription. In any case, the point is moot, since they don’t even show such improvements in the first place.
You’re only saying that because you’re literally hand-waving away data. It’s clear that going from no exercise to some exercise will decrease all-cause mortality. I’m sorry that you have a problem reading and understanding things you don’t agree with, but this is reality. You don’t get to construct your own reality using cherry-picked studies that don’t mean what you think they mean (like Look AHEAD; Dr. Sharma states that we may never know whether weight reduction can save lives in type II diabetics, but the three studies on low-carb diets cited above lend credence to the idea that it can).
Please see my previously linked Obesity Facts page for more. Once you’ve read that, get back to me. Until then, I’m putting the brakes on this discussion.
Of course you’re putting the brakes on this discussion: you have no substantial replies other than one-liners. Censoring people when you have no substantial response is not intellectually honest.
All in all, JayMan is giving very dangerous ‘advice’ when the literature says otherwise in regards to lifestyle interventions and all-cause mortality. You can talk about genes for this or that all you want; you’re just appealing to genes. Even light physical exercise decreases mortality risk, and that’s not too hard for most people.
I know JayMan talks about genes for this and that, yet he does not understand that obesogenic environments drive this epidemic (Lake and Townshend, 2006; Powell, Spears, and Rebori, 2011; Fisberg et al, 2016). He doesn’t seem to know about the food reward hypothesis of obesity either. Think about obesogenic environments and food reward and how our brains change when we eat sugar and then things will begin to become clearer.
JayMan is giving out deadly ‘advice’, again, without the correct credentials. Clearly, as seen in both of my responses to him, taking that ‘advice’ will lead to lower quality of life and lower life expectancy. But I’m sure my readers are smart enough to not listen to such ‘advice’.
(Note: Diet and exercise under Doctor’s supervision only)
by Scott Jameson
An earlier post established that Omega 3 fatty acids are an important nutrient of which hardly anybody is getting enough, and that this deficiency is making us Westerners a bit dumber and a bit crazier on the whole. Description sometimes obligates prescription, so this post is where I spitball about possible solutions, and welcome you to join me in the comments.
I’m reminded of the gubbermint mandating the lacing of our salt with sorely needed iodine, or the enrichment of white flour with nutrients lost in the removal of the bran and germ as well as the bleaching of the endosperm.
For yourself and your family, fish oil pills are fine (kelp oil if you’re one of those people). But we need solutions that work for nearly everybody, and the brilliance of the examples listed above is that everybody eats that stuff (bread and salt), and now it’s laced with nutrients they’ve been needing. So we could produce N3s at low cost and legally mandate that certain foods contain them. Chia is a promising source: it’s cheap, it’s loaded with N3s, and it doesn’t taste fishy. Flax also works, but it’s loaded with phytoestrogens. Tons of seeds have those, I think chia as well, but I’ve been told (incorrectly?) that flax is a particularly bad offender. Anyway, how do we load chia or a similar seed into people’s diets?
Omega 3 eggs are one way to go. Chickens metabolize plant ALA (such as from chia or flax) into DHA and store both in their eggs. Just as we made use of the auroch and tarpan’s efforts to have a brain, we can hijack the chicken’s futile attempts to provide brain-material for her nonexistent offspring, using her eggs as a vehicle to get N3s into ourselves. It’s as simple as a mandate that a certain percentage of all chicken feed must be N3 rich seeds and/or insects.
Another obvious place to look is the plant oils that go into our food. Check out the table on the Wikipedia page for ALA: soybean and rapeseed oils have a pathetic showing for ALA content, and they’re put in absolutely everything. The State mandates that all gasoline contain a bit of ethanol: why not mandate that all soybean and canola oil be 10 or 20 percent chia or some comparably high-ALA crop?
It’s worth pointing out that you can genetically modify ALA-rich vegetable oil to be closer to par with fish oil in quality, with some of the ALA converted into the more useful EPA. Forget any concerns about GMOs you may have, because the oils I’m talking about modifying are already themselves GMOs.
We must also mandate that all infant formula be laced with N3s, EPA and DHA in particular, and tested for contaminants like mercury if it comes from fish. You probably know at least one person who is autistic because they were bottle-fed.
Comment your potential solutions below. I want to hear them. Double points for anyone playing the game on hard mode: free market solutions (libertarians) or animal-free solutions (vegans). If you try both, you’re a masochist and you need help.
Homo Neanderthalis vs. Homo Sapiens Sapiens: Who is Stronger? Implications for Racial Strength Differences
Unfortunately, soft tissue does not fossilize (which is a problem for facial reconstructions of hominins; Stephan and Henneberg, 2001; I will cover the recent ‘reconstructions’ of Neanderthals and Nariokotome boy soon). So saying that Neanderthals had X percent of Y fiber type is only conjecture. However, to make inferences about who was stronger, I do not need such data: I only need to look at the morphology of Neanderthals and Homo sapiens, and from there inferences can be made as to who was stronger. I will argue that Neanderthals were stronger, which is, of course, backed by solid data.
Neanderthals had wider pelves than Homo sapiens; wider pelves in colder climes are cold adaptations, since a wider body retains heat better. Although Neanderthals had wider pelves than ours, their infants were around the same size as ours, which implies that Neanderthals had the same obstetric difficulties that we do. The Neanderthal pelvis was also similar to that of Heidelbergensis; most of the pelvic traits once thought to be derived in Neanderthals are, in fact, ancestral, except for the cross-sectional shape of the pubic ramus (Gruss and Schmitt, 2015). Since Neanderthals had wide pelves and most of their pelvic traits were ancestral, a wide pelvis may have been a trait of ancestral Homo (Trinkaus, Holliday, and Auerbach, 2014).
Hominins do need wider pelves in colder climates, however, as they are better for heat retention (see East Asians and Northern Europeans). Also keep in mind that Neanderthals were shorter than us, with the men averaging around 5 feet 5 inches and the women about 5 feet, roughly 5 inches shorter than post-WWII Europeans (Helmuth, 1998).
So what does a wider pelvis mean? Since Neanderthals were shorter than us and had a wider pelvis, they had a lower center of gravity than we do. The Homo sapiens who came out of Africa had narrower pelves, since a narrow pelvis is better at dissipating heat (Gruss and Schmitt, 2015). Homo sapiens would thus have been better adapted to endurance running and athleticism than the wide-pelved Neanderthals.
People from tropical climates have longer limbs and are tall and narrow (which is also good for endurance running and sprinting), while people from colder climates are shorter and more ‘compact’ (Lieberman, 2015: 113-114), with a wide pelvis for heat retention (Gruss and Schmitt, 2015). So, clearly, the differences in pelvic anatomy between Homo sapiens and Neanderthals reflect two different builds: one adapted for endurance in the heat, the other for power and heat retention in the cold.
Furthermore, it was long thought that Neanderthals had long clavicles, which would have impeded strength. However, when the clavicles were reanalyzed and scaled to Neanderthal body size (rather than compared with humeral length), Neanderthals turned out to have a clavicular length, and by implication a shoulder breadth, similar to that of Homo sapiens (Trinkaus, Holliday, and Auerbach, 2014). This is another clue that Neanderthals were stronger.
Yet more evidence comes from comparing the bone density of Neanderthals to that of Homo sapiens. Denser bones imply that the body could handle a heavier load and thus generate more power. In adolescent humans, muscle power predicts bone strength (Janz et al, 2016). So if the same holds true for Neanderthals (and I don’t see why not), then Neanderthals would have had higher muscle power, since muscle power predicts bone strength.
Given the “heavy musculature” of Neanderthals, along with their high bone robusticity, they must have had denser bones than Homo sapiens (Friedlander and Jordan, 1994). Denser bones imply higher muscle power, and Neanderthals also had a lower center of gravity, owing to their wider pelvis and shorter stature, whereas the Homo sapiens body was heat-adapted. Putting this all together, the picture becomes clearer: Neanderthals were, in fact, way stronger than Homo sapiens.
Another cause for these anatomical differences between Neanderthals and Homo sapiens is completely independent of cold weather. Neanderthals had an enlarged thorax (rib cage), which evolved to hold an enlarged liver, which is responsible for metabolizing large amounts of protein. Since protein has the highest thermic effect of food (TEF), then they would have had a higher metabolism due to a higher protein diet which would also have resulted in an enlarged bladder and kidneys which are necessary to remove urea, which possibly would have also contributed to a wider pelvis for Neanderthals (Ben-Dor, Gopher, and Barkai, 2016).
During glacial winters, Neanderthals would have consumed 74-85 percent of their calories from fat, with the rest coming from protein (Ben-Dor, Gopher, and Barkai, 2016). Neanderthals also consumed around 3,360-4,480 kcal per day (Steegman, Cerny, and Holliday, 2002); let’s assume they averaged 3,800 kcal per day. Since the upper limit of protein intake is about 3.9 g per kg of body weight per day for erectus and 4.0 g per kg for Homo sapiens (Ben-Dor et al, 2011), Neanderthals would have had a theoretically higher upper limit, due to having larger organs for processing large amounts of protein. The protein intake for a Neanderthal male was estimated to be between 985 kcal (low end) and 1,170 kcal (high end) per day, the high end corresponding to about 292 grams of protein per day (Ben-Dor, Gopher, and Barkai, 2016: 370).
Assuming that Neanderthals did not eat carbohydrates during glacial winters (and even if a small amount were eaten, the model would not be affected) and an upper limit of protein intake of 300 grams per day for Neanderthal males, this implies that 74-85 percent of their diet came from animal fat—the rest being protein. Protein is the most thermogenic macro (Corvelli et al, 1997; Eisenstein et al, 2002; Buchholz and Schoeller, 2004; Halton and Hu, 2004; Gillingham et al, 2007; Binns, Grey, and Di Brezzo, 2014). So since Neanderthals ate a large amount of protein, along with their daily activities, they had to have had a high metabolic rate.
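As a back-of-the-envelope check on the numbers above (this is only an illustration, not Ben-Dor et al’s actual model), the standard Atwater factors of 4 kcal per gram of protein and 9 kcal per gram of fat give:

```python
# Sketch of the macro arithmetic above, using standard Atwater factors.
# The 292 g and 3,800 kcal figures are the text's estimates.

PROTEIN_KCAL_PER_G = 4
FAT_KCAL_PER_G = 9

def protein_kcal(grams):
    """kcal supplied by a given protein intake."""
    return grams * PROTEIN_KCAL_PER_G

def fat_share(total_kcal, protein_grams):
    """Fraction of daily kcal from fat, assuming zero carbohydrate."""
    return (total_kcal - protein_kcal(protein_grams)) / total_kcal

# ~292 g protein/day for a Neanderthal male:
print(protein_kcal(292))               # 1168 kcal, matching the ~1,170 figure
# On a 3,800 kcal/day carb-free diet, the remainder comes from fat:
print(round(fat_share(3800, 292), 2))  # 0.69
```

At the high-end protein estimate, roughly 70 percent of a 3,800 kcal day is left to be covered by fat, a bit under the 74-85 percent range cited for the glacial-winter extreme.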
To put into perspective how much protein Neanderthals ate: the average American man eats about 100 grams of protein per day. In an analysis of American protein intake from 2003-2004, young children ate about 56 grams of protein per day, adults aged 19-30 about 91 grams, and the elderly about 56 grams (Fulgoni, 2008). Neanderthals ate about 3 times the amount of protein that we do, which would lead to organ enlargement, since larger organs are needed to metabolize that much protein. Another factor increasing Neanderthal metabolism was the extreme cold: shivering increases metabolism (Tikuisis, Bell, and Jacobs, 1985; van Ooijen et al, 2005). So the Neanderthal metabolism would have been revved up close to a theoretical maximum capacity.
The high protein intake of Neanderthals is important because high amounts of protein are needed to build muscle. Neanderthals consumed a sufficient amount of kcal, along with 300 grams of protein per day on average for a Neanderthal male, which would have given Neanderthals yet another strength advantage.
I am also assuming that Neanderthals had an abundance of slow-twitch muscle fibers. Since they had wide pelves and evolved at high latitudes, they would likely have had more slow-twitch than fast-twitch fibers, as Europeans do today (see the muscle fiber distributions of Kenyans, East Asians, and Europeans), while African-Americans (of West African descent) have a higher proportion of fast-twitch fibers (Caesar and Henry, 2015). So now, thinking of everything I explained above and replacing Neanderthals with Europeans and Homo sapiens with Africans, who do you think would be stronger? Clearly Europeans, which is what I have argued extensively. African morphology (tall, lanky, high limb ratio) is not conducive to strength, whereas European morphology (wide pelvis, low limb ratio, an abundance of slow-twitch fibers) is.
The implications of these anatomical differences between Neanderthals and Homo sapiens, and how they translate into racial differences, will be explored more in the future. This was just to lay the anatomical and morphological groundwork in regards to strength and cold-weather adaptations. Nevertheless, the evidence that Neanderthals were stronger and more powerful than Homo sapiens stands on solid ground, and the same holds for the differences in strength between Africans and Europeans. The evolution of racial pelvic variation is extremely important to understand if you want to understand racial differences in sports.
by Scott Jameson
I’ve been thinking about Omega 3 fatty acids (N3s) recently. We’re clearly adapted to getting more of them than we’re actually getting.
Children whose mothers took fish oil (chock full of the N3s EPA and DHA) during pregnancy show better coordination and are smarter, although that difference may not persist later in development. RaceRealist has written about N3s and PISA math scores before.
Here’s a paper summarizing many of the known benefits of fish oil supplementation. Goes over some of the aforementioned results regarding kids, and also shows random improvements such as helping people with Alzheimer’s maintain weight.
A lack of N3s is associated with depression, coronary artery disease, maybe even autism. N3s are known to lessen autism symptoms. This is probably an immune thing. Sulforaphane, an anti-inflammatory agent, is known to reduce autism symptoms, and other research has elucidated the relationship between N3s and the immune system.
This is delicious fodder for a post of its own. There’s a mass of papers about the relationship between autism and the immune system; N3s being neuroprotective and correlating negatively with likelihood of autism/severity of autism symptoms vindicates the idea that autism happens when your immune system cooks your brain (neuroinflammation). You would expect males to be whacked harder by this because they’re not good at producing the important N3 docosahexaenoic acid (DHA). Thus, males ought to have higher autism rates- and they do.
Anyway, here’s what I’m getting at. Either N3s are a counter-intuitive miracle drug, or they’re just an important nutrient of which many of us do not get enough. The latter, I should think! The lemonade that can and will be made from these lemons is that widespread N3 deficiency gives us an opportunity to understand one of the ways that a human brain can get messed up. But we still want to fix the problem- more on that in my next post.
Our ancestors probably got more N3s than we do. They’ve been estimated to, anyway. (Second study mentioning our ancestors’ higher N3 intake.) If they didn’t, they might’ve been selected for better conversion of alpha-linolenic acid (ALA) into eicosapentaenoic acid (EPA) and subsequently docosahexaenoic acid (DHA), a conversion real-life humans aren’t great at. As it stands, humans need either a whole bunch of excess ALA to convert to EPA and DHA, or we could have EPA and DHA straight. The excess-ALA route probably isn’t something you can rely on for your N3 needs, but certainly ALA is better than nothing.
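To see why the excess-ALA route is unreliable, here is a rough sketch. The conversion efficiencies are ballpark figures commonly cited in the literature (on the order of 5 percent of ALA to EPA and well under 1 percent to DHA), not values taken from the studies linked above:

```python
# Rough sketch of why relying on ALA conversion is inefficient.
# Conversion rates below are commonly cited ballpark figures,
# not values from any particular study.

ALA_TO_EPA = 0.05    # ~5% of ingested ALA ends up as EPA
ALA_TO_DHA = 0.005   # well under 1% ends up as DHA

def ala_needed_for_dha(dha_target_mg):
    """mg of ALA needed to net a target amount of DHA via conversion alone."""
    return dha_target_mg / ALA_TO_DHA

# To net ~250 mg of DHA (a commonly suggested daily amount) from ALA alone:
print(ala_needed_for_dha(250))  # 50000.0 mg, i.e. ~50 g of ALA per day
```

Fifty grams of ALA a day is an enormous amount of flax or chia, which is the argument for getting EPA and DHA preformed.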
Seeds would’ve been a part of the diet back in the Paleolithic. Flax, an example of a seed rich in ALA, was known to humans quite a darned while ago, so seeds might have been a source of these critical nutrients. It’s worth noting that the first paleolithic diet estimation study linked in the previous paragraph pegs hunter gatherers as getting way more ALA than modern folks do.
Of course, many populations would’ve lived near oceans or at least rivers and lakes, where they could’ve gotten N3s from fish and seafood.
Some of it must have come from insects, themselves actually having a decent amount of N3s.
I’ve got an even weirder guess, though: bioaccumulation in land animal tissue. This possibility must have been important for peoples like the Botai culture with extremely narrow (Tarpan-based) diets, which likely didn’t include enough seeds, insects, and/or seafood. Animals can’t produce N3s of their own, but they eat ALA, convert it into EPA and DHA, and send a lot of it up to the brain, where it’s needed, with the result being that the brain has a lot more and higher quality N3s than any of the plants from which the N3s were derived.
Grog the caveman could’ve gotten a bunch of N3s from his favorite treat: scrambled auroch brains. Many of his progeny carry the practice into the present day, to the disgust of other living humans. Being that humans and our relatives are disgusting in general, Neanderthals occasionally even ate each other’s brains, and according to CNN, they’re still at it!
Nonhuman brains had to be the more popular option, then as now; mentioning the whole cannibalism thing is a bit of a non sequitur I shoved in for sheer entertainment value. But I suspect that there are several reasons we Westerners don’t get enough N3s: we don’t eat brains anymore, we don’t eat bugs anymore, and we don’t get as many ALA-rich plant oils as we used to. Perhaps we’re not getting enough fish, or perhaps the fish we’re getting are less likely to be fatty (e.g. salmonids, tuna, schooling fish) and more likely to be lean (cod, pollock).
Somehow or other we ought to get N3s back in our diet. Could save some kids from autism or, failing that, improve their prognosis. Again, it could increase our math scores, too, particularly in women. Closing part of the gap with other nations and between our sexes- two birds with one stone for our educators- and a boon for our engineering departments as well. Which is a boon for everyone.
The stakes are high. Who can save us from our collective state of starved brains? I’ll post my ideas soon.
Nutritional myths run amok everywhere. One of the most persistent is that ‘a calorie is a calorie’, that is, that every macronutrient will be processed the same way by the body. A related assumption is that the body doesn’t ‘care’ where the calories come from: fat, protein, or carbs, the response will supposedly be the same, and bodyweight will simply drop until one reaches one’s goal. However, it’s not as simple as that. Heartiste also assumes that ‘diets work’, when the best meta-analysis I know of on the matter shows the opposite (Mann et al, 2007, see especially table 1). They control for studies where weight was self-reported and conclude that dieting does not work. This is what, as Heartiste would say, “iScience!” says on the matter, so he should believe everything I state in this article, all of it backed by “iScience!”.
Chateau Heartiste published an article back in 2010 titled The Twinkies Diet Proves Fatty Fats Are Fat Because They Eat Too Much. He is referring to professor of human nutrition Mark Haub and his success on ‘the twinkies diet’, where 2/3rds of his caloric intake came from junk food such as Twinkies. He lost 27 pounds in a two month period while his LDL cholesterol decreased by 20 percent and his HDL cholesterol increased by 20 percent. His level of triglycerides also decreased by 37 percent, with his body fat decreasing from 33.4 to 24.9 percent. So he ate 1800 kcal per day—2/3rds of it being junk food—for two months and lost 27 pounds. Case closed, right? Eat junk food at a deficit and lose weight? A calorie is a calorie? There are a few problems with this contention which will be addressed below.
Big bottom line: Being fat itself is bad for your health. “Fat and fit” is a myth. The change that counts the most is losing the weight, which can only be done by PUSHING AWAY FROM THE TABLE.
Except that fit overweight and obese individuals have mortality rates similar to those of their normal-weight counterparts (Barry et al, 2014). More recently, however, a study was published purporting that ‘healthy obesity’ is a myth: the researchers state that, in a sample of millions of Britons, overweight and obese individuals had a higher risk of heart disease than their normal-weight counterparts. Unfortunately, I cannot locate the study, since it wasn’t published in a journal (and thus not peer-reviewed). I wonder whether variables such as diet, smoking, and other lifestyle factors were taken into account. Nevertheless, the debate on fitness and fatness continues.
Another large meta-analysis shows that grade 1 obesity (BMI of 30 to <35) carried the same mortality risk as normal weight, while grades 2 and 3 (BMI of 35 and up) carried a significantly higher risk of death (Flegal, Kit, and Orpana, 2013).
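For reference, BMI is just weight in kilograms over height in meters squared, and the obesity grades used in that meta-analysis can be sketched as:

```python
# BMI = weight (kg) / height (m)^2, with obesity grades as used by
# Flegal, Kit, and Orpana (2013).

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def obesity_grade(bmi_value):
    if bmi_value < 30:
        return None   # not obese
    if bmi_value < 35:
        return 1      # grade 1: mortality risk similar to normal weight
    return 2          # grades 2-3 combined (BMI 35+): elevated mortality risk

print(round(bmi(100, 1.75), 1))       # 32.7
print(obesity_grade(bmi(100, 1.75)))  # 1
```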
Heartiste claims that ‘a calorie is a calorie’. This is a common fallacy. This suggests that the body will process all foods the same way—that is, processing them the same metabolically. This, however, is not the case. Haub himself is a sample size of 1. If Heartiste can use a sample size of 1 to make a claim, then I can too.
Sam Feltham ate 5,000+ kcal per day for 21 days and gained only 1.3 kg, when he should have gained 7.3 kg based on the amount of kcal he ate. A calorie is a calorie, right? Wrong: the claim “a calorie is a calorie” violates the second law of thermodynamics (Feinman and Fine, 2004). Heartiste writes:
That first law of thermodynamics looms large over everything.
The first law of thermodynamics is irrelevant to human physiology: it only states that an organism gets bigger if it takes in more energy than it expends. It doesn’t state why this occurs, which comes down to the hormone insulin driving fat storage.
He does rightly state that an omega-3/6 imbalance is part of the reason, but then handwaves it away. Western-like high-fat diets (i.e., diets with an excess of linoleic acid, an n-6 fatty acid, relative to n-3) are sufficient to induce a gradual enhancement in fat mass across generations (Massiera et al, 2010). This obviously includes the average 55 percent carbohydrate diet that the AHA recommends (Eckel et al, 2014). The Standard American Diet (aptly named the “SAD diet”) has the n-3/n-6 imbalance along with being high in carbohydrates, which spike insulin, which in turn impedes fat from being unlocked from the adipocyte.
Heartiste doesn’t understand that if you reduce the ‘in’, the ‘out’ also decreases. This was noted in the famous starvation experiment headed by Ancel Keys. Thirty-six healthy men ate normally for three months while their behavior and personality were monitored. For the next six months they were reduced to eating half of their initial intake (dropping from about 2,000 kcal to 1,000 kcal, with some individuals going lower), and their metabolic rate decreased by about 40 percent (Keys et al, 1945). This is proof for the contention that the body adjusts its metabolic rate to what is ingested. A similar study was done on Vermont prisoners, except they were told to gorge on food; since they were in a controlled setting, the prisoners could be monitored to ensure they ate all of it.
At the end of the study, their metabolic rates had increased by 50 percent: evidence that the body was trying to get back to its original weight. Within six months of eating normally again, the prisoners returned to their normal weights (Salans, Horton, and Sims, 1971). One man gained only ten pounds despite all of those calories. Clearly, the body was resisting weight gain, and when the men were allowed to eat normally, they effortlessly regained their normal weights.
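The point can be made concrete with a toy model. A static calories-in/calories-out model assumes the ‘out’ never moves; if expenditure instead falls by the 40 percent seen in the Keys experiment, the predicted weight loss shrinks dramatically. All figures besides that 40 percent are illustrative:

```python
# Toy contrast between a static CICO model and one where expenditure
# adapts downward, as in the Keys semi-starvation experiment. The 40%
# metabolic drop is from the text; the other numbers are illustrative.

KCAL_PER_KG = 7700  # folk rule: ~7,700 kcal per kg of body mass

def weight_change_kg(intake_kcal, expenditure_kcal, days):
    return (intake_kcal - expenditure_kcal) * days / KCAL_PER_KG

# Static model: intake cut from 2,000 to 1,000 kcal, expenditure stays 2,000
static = weight_change_kg(1000, 2000, 180)
# Adaptive model: expenditure falls 40%, from 2,000 to 1,200 kcal
adaptive = weight_change_kg(1000, 2000 * 0.6, 180)

print(round(static, 1))    # -23.4 kg predicted over 6 months
print(round(adaptive, 1))  # -4.7 kg: far less loss once the 'out' adapts
```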
Finally, on the topic of Haub, Big Food shill, I will address a few things about him and his ‘research’ that recently came to light.
Intermittent fasting and obesity expert Dr. Jason Fung noted that in 2016 (six years after the ‘experiment’, and after criticism over transparency pushed Coca-Cola to release its funding reports), Mark Haub was revealed to be one of the many researchers backed by Coca-Cola. The point of such funding is, no doubt, to ‘prove’ that ‘a calorie is a calorie’ and that ‘all calories are created equal’, and to keep people gorging on high-carbohydrate, insulinogenic foods. This has been rebutted above: the human body is far too complex to reduce to simply calories in and calories out, which I have written about in depth.
People like Heartiste need to get an actual understanding of the literature and of what Coca-Cola has been trying to do for years, which is to make eating junk food ‘OK’ because ‘it doesn’t cause obesity’. Children consume 45 percent more food when exposed to advertisements (Harris, Bargh, and Brownell, 2009). So to begin to curb obesity rates we don’t need to make junk food ‘OK’; we need to not eat junk food and instead eat a diet more ancestral to us—that is, one lower in processed carbs and higher in animal fat and protein. Big Food shills like Haub need to be exposed for what they are: people who do ‘research’ for a quick buck, not to further our understanding of a complex issue as he would have you believe. Exercise also doesn’t induce weight loss, so the claim of ‘eat less and move more’ (eat less, per the 55 percent carbohydrate recommendations) is bound to fail.
If Heartiste can make a claim using one man as an example, then so can I. Read the above article by Sam Feltham, in which he writes about his experience eating 5,000 kcal per day for 21 days while gaining only 1.3 kg. I could use this example to say that eating low-carb, high-fat at 5,000 kcal per day leads to negligible weight gain; however, I don’t use n=1 sample sizes to make claims, and no one else should either.
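For the Feltham example, here is what the naive ‘a calorie is a calorie’ bookkeeping predicts, using the folk rule of ~7,700 kcal per kilogram of body mass. The maintenance figure is a hypothetical chosen for illustration, not Feltham’s measured number:

```python
# The naive 'calorie is a calorie' prediction: every ~7,700 surplus kcal
# becomes 1 kg of body mass. The maintenance kcal below is hypothetical.

KCAL_PER_KG = 7700

def naive_expected_gain_kg(intake_kcal, maintenance_kcal, days):
    surplus = (intake_kcal - maintenance_kcal) * days
    return surplus / KCAL_PER_KG

expected = naive_expected_gain_kg(5000, 2325, 21)
print(round(expected, 1))        # 7.3 kg predicted
print(round(expected - 1.3, 1))  # 6.0 kg gap vs the 1.3 kg actually gained
```

The gap between the prediction and the observed 1.3 kg is the whole point of the example: the static model ignores the body’s compensatory responses.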
by Scott Jameson
RaceRealist and I have been ruminating on a lot of stuff lately. Here’s a fun one: what economic system works best relative to what we know about human health? In my mind there are two approaches: the libertarian approach, and quasi-fascism.
In the libertarian approach, there’s no regulation of sugar placed in our food. That’s already the case. But here’s an improvement: you don’t have to pay for anyone’s gastric bypass after they overeat that sugar.
In the fascist approach, there is regulation of sugar, because a fascist state does not allow people to poison each other for profit. You still have to pay for others’ medical expenses, but those expenses will be lower.
Here’s an advantage to the libertarian approach. In that society, the people who stuff their faces and refuse to get off the couch- who are dumber and lazier on average, probably- will have a higher mortality rate on average. Eugenics need not cost a dime.
But you run into a snag, sand in the gears of your hands-off system, when Big Food kicks out a whole bunch of crappy dietary advice, at which point a minority of reasonably intelligent people will be led astray, perhaps to the grave. How could a libertarian society stop that from taking place? Would it even bother? Could the system broadly work in spite of this snag?
A libertarian society doesn’t pay for idiots to have children. That’s good, but half of your population (women) is unlikely ever to support it. Women don’t do libertarianism; observe Rand Paul’s demographic Achilles heel on page 25. When women asked men what to do about so-and-so’s eighth unpaid-for child, we’d have to look them in the eyes and give a deadpan “let’s hope private charity can handle it.” There was a time, before FDR, when women would’ve accepted that answer. They were still in the kitchen back then, and I don’t know how to put them back there.
A fascist society has more hands-on eugenics, possibly genome editing or embryo selection. Also good. Expensive, but obviously worth it.
We welcome your input on these issues.
As an aside, White men are well-known as the most conservative, small government, nationalist group out there in our current political atmosphere. I always hear people spewing the schmaltziest nonsense about the values of the Founding Fathers. They were, relative to our political compass, nationalist libertarians. Accordingly, modern nationalists and libertarians do best with the exact same demographics that used to vote on candidates back then: property-owning White men. The sole reason that Ron and Rand Paul couldn’t get elected is that they are too similar to the Founding Fathers. Any other candidate who blathers on about the Founding values is simply a liar, and their obvious lies show a disrespect of your intelligence.
If you’re a libertarian, but not an ethno-nationalistic and patriarchal thinker, then you simply haven’t gotten the memo: women and minorities do not want to create the same world that you do, nor will they ever. Evolution gave us women who want social safety nets and other races which are better off if they parasitize off of your tax dollars. All of the most libertarian societies that ever existed (early US, ancient Athens, Roman Republic) were entirely run by White men, and adding women to the electorate gave us the welfare state. Aristophanes was right.
We’re also ruminating on the difference between IQ and expertise. I know of no mentally complicated task of which one can be a master without being intelligent. Take the IQs of chess grandmasters and you will find no morons.
Contrast that with purely physical activities. I bet you there are some really stupid people out there who are great at dancing for example. A prodigiously capable cerebellum may not predict an equally capable frontal lobe.
Discounting tasks which exclusively require things like simple physical coordination, muscle memory, etc, I ought to think that IQ is the biggest component of expertise.
By Scott Jameson
I’ve been active in the blogosphere for around 24 hours now and I’ve already gotten a negative response from someone who happens to be wrong. That’s a win in my book.
The argument we’re having is, as best I can tell, why some populations out there just don’t have obesity as an observed phenotype amongst their members. TL;DR: Pumpkin Person and Robert Lindsay believe that genetics explain why there are no obese New Guineans. But it ain’t so.
The original context is an old Pumpkin Person post. Much of what he’s saying there doesn’t seem too off-base; for example, he says that behavioral genetics may explain much of the difference in BMI between individuals within the same population. True. It is possible that some people are genetically inclined to eat more or unhealthier foods, rather than simply being genetically inclined to put on weight regardless of what they do.
As an aside, genotypes that affect how you digest things also probably explain part of the BMI gap between skinny folks and fat folks within the first world. The APOA2 gene for example has a recessive allele that is associated with higher BMI in people who eat more saturated fats. The interactions between genes and environment which determine BMI are complicated and not yet fully understood, but I’m willing to bet that being genetically worse at processing certain nutrients is a part of the problem, and that being genetically inclined to stuff your face is a part of the problem as well. PP is probably right about that issue.
Where he and Lindsay get it wrong is using examples of people from Podunk, New Guinea as evidence for obesity “being genetic” (relative term). Obesity is a gene-environment interaction such that, without certain environmental inputs, you simply won’t get the phenotype. History tells us that that input is processed carbohydrates.
There was a time when people could have used Australian Aboriginals or Inuit or Pima Indians as examples of groups of people who just don’t have obese folks amongst their numbers, just as Lindsay did with a few populations. Homo sans lardicus. Then the White Devils showed up with their refined Einkorn wheat products and their firewater and so on. Now those populations have fat people in them.
There’s an ongoing debate as to whether some populations are more resistant to the fattening effects of processed carbs. My guess is that the answer’s yes (and you’d look at Europeans and East Asians for the more carb-resistant people, in theory), but that topic would merit its own post. That being said, every population in the world will almost assuredly have obese people in it after you introduce processed carbs. Every population that has been introduced to this diet now has fat people in it.
Heritability of BMI is high within the first world because the relevant environmental input is pretty uniform: everybody has access to potatoes, everybody has access to broccoli. As PP points out, which you’re likely to eat and how much you’re likely to eat likely depends on your genetics. As I point out, how your body processes the nutrients also has a likely genetic component. But the environmental contribution to our within-population differences in BMI is low (~20%) because we all have access to roughly the same stuff.
Rural New Guineans, lacking a bunch of processed carbs, could hardly get fat if they tried their best to. That’s a big between-population, nonheritable cause for a phenotypic difference, which means that environment probably explains most of the BMI gap between them and us. If I wanted evidence to refute Lindsay’s assertion that New Guineans are skinnier thanks to genetics, I’d find a population of urbanized New Guineans somewhere with a higher average BMI. Such a group would have New Guinean genetics but a “developed” environment vaguely similar to ours; if they were fatter than their rural kin, then Lindsay’s hypothesis that New Guineans are just genetically obesity-free would be falsified.
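The within-versus-between distinction can be illustrated with a toy simulation: heritability is high inside each population, yet the entire gap between populations is environmental. Every number here is made up for illustration:

```python
# Toy simulation: high within-population heritability is compatible with
# a fully environmental between-population gap. All numbers illustrative.

import random
random.seed(0)

def simulate(env_offset, n=10000):
    """Each person: BMI = 25 + population-wide env offset + genes + noise."""
    people = []
    for _ in range(n):
        g = random.gauss(0, 2.0)   # genetic contribution, sd 2
        e = random.gauss(0, 1.0)   # individual environmental noise, sd 1
        people.append((g, 25 + env_offset + g + e))
    return people

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def mean(xs):
    return sum(xs) / len(xs)

rural = simulate(env_offset=0)   # no processed carbs available
urban = simulate(env_offset=4)   # processed-carb food environment

# Within the urban population, the genetic share of variance is high:
# var(genes) / var(phenotype) is about 4 / 5 = 0.8
g_var = variance([g for g, _ in urban])
p_var = variance([p for _, p in urban])
print(round(g_var / p_var, 2))   # ~0.8

# Yet the entire ~4-point gap in mean BMI between the groups is environmental:
gap = mean([p for _, p in urban]) - mean([p for _, p in rural])
print(round(gap, 1))             # ~4
```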
There are many more interesting theories of the evolution of hominin intelligence than the tiring (yawn) cold winter theory. Last month I wrote on why men are attracted to a low waist-to-hip ratio in women. However, gluteofemoral fat (fat in the thighs and buttocks) is only part of the story of how DHA and fatty acids (FAs) drove our brain growth and our evolution as a whole. Tonight I will talk about how fatty acids predict ‘cognitive performance’ (it’s PISA, ugh) in a sample of 28 countries, particularly the positive relationship between n-3 (omega-3) intake and intelligence and the negative relationship between n-6 intake and intelligence. I will then talk about the traditional Standard American Diet (the SAD diet [apt name]) and how it affects American intelligence on a nationwide level. Finally, I will talk about the best diet to maximize cognition in growing babes and women.
Lassek and Gaulin (2013) used the 2009 PISA data to infer cognitive abilities for 28 countries (ugh, I’d like to see a study like this done with actual IQ tests). They also searched for studies providing “maternal milk DHA values as percentages of total fatty acids” in 50 countries. To control for SES influences on cognitive performance, they controlled for GDP/PC (gross domestic product per capita) and “educational expenditures per pupil.” To control for the possible effect of macronutrients on maternal milk DHA levels, they further included estimates for each country of the average kcal consumed from protein, fat, and carbohydrates. To explore the relationship between DHA and cognitive ability, they included foodstuffs high in n-3: fish, eggs, poultry, red meat, and milk, which contain varying amounts of DHA depending on the animal’s feed. There is also a ‘metabolic competition’ between n-3 and n-6 fatty acids, so they also included total animal and vegetable fat as well as vegetable oils.
Lassek and Gaulin (2013) found that GDP/PC, expenditures per student and DHA were significant predictors of (PISA) math scores, whereas macronutrient content showed no correlation.
The predictive value of milk DHA on cognitive ability weakens only when two of the SES variables are added to the multiple regression. When milk arachidonic acid (an omega-6 fatty acid) is added to the regression, it is negatively, but not significantly, correlated with math scores (so it wasn’t added to the table below).
So countries with lower maternal milk levels of DHA score lower on the math section of the PISA exam (not an IQ test, but ‘good enough’). Knowing what is known about the effects of DHA on cognitive ability, countries with higher maternal milk levels of DHA unsurprisingly score higher on the math section of the PISA exam.
Table 2 shows the correlations between grams per capita per day of food consumption in the data set they used and maternal milk DHA. As you can see, total fish and seafood consumption are substantially correlated with total milk DHA, while foods that are high in n-6 show medium negative correlations with maternal milk DHA. The combination of foods that explain the most of the variance in maternal milk DHA is total fat consumed and total fish consumed. This explained 61 percent of the variance in maternal milk DHA across countries.
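For readers unfamiliar with the statistic, ‘variance explained’ by a single predictor is just the squared Pearson correlation. A minimal sketch with made-up per-country numbers (not the study’s data):

```python
# 'Variance explained' = squared Pearson correlation. The per-country
# values below are fabricated for illustration, not from Lassek and
# Gaulin (2013).

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical values: fish intake (g/day) vs milk DHA (% of fatty acids)
fish = [10, 25, 40, 60, 80, 120]
dha = [0.20, 0.15, 0.28, 0.42, 0.30, 0.50]

r = pearson_r(fish, dha)
print(round(r ** 2, 2))  # share of milk-DHA variance 'explained' by fish
```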
Not surprisingly, foodstuffs high in n-6 showed significant negative correlations with maternal milk DHA. “Any regression including total fish or seafood, and vegetable oils, animal fat or milk consistently explains at least half of the variance in milk DHA, with fish or seafood having positive beta coefficients and the remainder having negative beta coefficients.”
The study showed that a country’s balance of n-3 and n-6 was strongly related to the students’ math performance on the PISA. The relationship between milk DHA and cognitive performance remains significant even after controlling for national wealth, macronutrient intake, and investment in education. The availability of DHA in populations is a better predictor of test scores than SES factors are (which I’ve covered here on Italian IQ); SES explains a considerable portion of the variance, but not as much as the overall DHA levels by country. Furthermore, maternal DHA levels are strongly correlated with per capita fish and seafood consumption, while a negative correlation was seen with the intake of vegetable oils, fat, and beef, which suggests ‘metabolic competition’ between the n-3 and n-6 fatty acids.
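To make the statistical logic here concrete, below is a minimal sketch of the kind of multiple regression the study ran: testing whether a DHA-like predictor still carries weight after SES controls are added. All the numbers are synthetic and the variable names are my own; this is an illustration of the method, not the study’s actual data or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (NOT the study's data): 28 "countries" with
# hypothetical milk-DHA levels, GDP per capita, and education spending.
n = 28
dha = rng.uniform(0.1, 1.0, n)   # % of total milk fatty acids (hypothetical)
gdp = rng.uniform(5, 50, n)      # GDP per capita, thousands (hypothetical)
edu = rng.uniform(1, 10, n)      # spending per pupil (hypothetical)
# Math scores generated from all three predictors plus noise.
math = 400 + 60 * dha + 1.5 * gdp + 4 * edu + rng.normal(0, 5, n)

# Ordinary least squares: does DHA still predict scores after
# "controlling" for the two SES variables (i.e., including them)?
X = np.column_stack([np.ones(n), dha, gdp, edu])
beta, *_ = np.linalg.lstsq(X, math, rcond=None)

fitted = X @ beta
r2 = 1 - np.sum((math - fitted) ** 2) / np.sum((math - math.mean()) ** 2)
print(f"DHA coefficient: {beta[1]:.1f}, R^2: {r2:.2f}")
```

If the DHA coefficient stays large and positive with the SES variables in the model, DHA predicts scores above and beyond national wealth, which is the pattern Lassek and Gaulin report.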
There are, of course, possible errors in the study: maternal milk DHA values may not reflect the total DHA in the population as a whole; methods of extracting milk fatty acids differed between studies; test results could be due to sampling error; and the per capita consumption of foods is based on food disappearance, not the amount of food actually consumed. Despite these pitfalls (the main one being the use of PISA to test ‘cognitive abilities’), this is a very interesting study, and I hope a similar one is eventually undertaken with actual measures of cognitive ability and not PISA scores.
We now know that n-6 is negatively linked with brain performance, and that n-3 is positively linked. What does this say about America?
As I’m sure all of you are aware, America is one of the fattest nations in the world. Not surprisingly, Americans consume extremely low levels of seafood (very high in DHA) and more foods high in n-6 (Papanikolaou et al, 2014). A diet low in n-3 (which we do not get enough of in America) and high in n-6 is correlated with obesity (Simopoulos, 2016). So not only do we have a current dysgenic effect in America due to decreased fertility of the more intelligent, obesity is also driven by high levels of n-6 in the Western diet, which then propagates obesity down the generations (Massiera et al, 2010).
I also previously wrote on agriculture and diseases of civilization. Our hunter-gatherer ancestors were all-around healthier than we are. This, clearly, is because they ate a more natural diet and not one full of processed, insulin-spiking carbohydrates, among other things. Our hunter-gatherer ancestors consumed n-3 and n-6 in equal amounts (1:1) (Kris-Etherton, et al 2000). As I documented in my article on agriculture and disease, HGs had low to nonexistent rates of the diseases that plague our modern societies today. However, around 140 years ago, we entered the Industrial Revolution, and the paradigm shift this caused was huge. We began consuming less n-3 (fish and other assorted seafood and nuts, among other foods) while n-6 intake increased (beef, grains, carbohydrates) (Kris-Etherton, et al 2000). Moreover, the ratio of n-6 to n-3 from 1935 to 1939 was 8.4 to 1, whereas by 1985 the ratio had increased to about 10 to 1 (Raper et al, 2013). We Americans also consume 20 percent of our daily kcal from one ‘food’ source—soybean oil—with almost 9 percent of the total kcal coming from n-6 linoleic acid (United States Department of Agriculture, 2007). The typical American diet contains about 26 percent more n-6 than n-3, and people wonder why we are slowly getting dumber (which is, obviously, a side effect of civilization). So our n-6 consumption is about 26 percent higher than it was when we were still hunter-gatherers. Does anyone still wonder why diseases of civilization exist and why hunter-gatherers have low to nonexistent rates of the diseases that plague us?
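The ratio arithmetic above can be made explicit. The gram figures below are hypothetical stand-ins chosen only to reproduce the ratios named in the text (roughly 1:1 for hunter-gatherers, roughly 10:1 for the modern Western diet); they are not measured intakes.

```python
# n-6:n-3 ratio from daily intakes in grams.
def ratio(n6_grams: float, n3_grams: float) -> float:
    """Return how many grams of n-6 are eaten per gram of n-3."""
    return n6_grams / n3_grams

# Hypothetical intakes, picked to match the ratios cited in the text.
hunter_gatherer = ratio(8.0, 8.0)   # ~1:1, per Kris-Etherton et al. (2000)
modern_western = ratio(17.0, 1.7)   # ~10:1, per the Raper et al. figures

print(f"HG ratio: {hunter_gatherer:.1f}:1, modern ratio: {modern_western:.1f}:1")
```

The point is that the ratio, not the absolute amount of either fatty acid, is what shifted so dramatically after the Industrial Revolution.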
The bioavailability of n-6 depends on the amount of n-3 in fatty tissue (Hibbeln et al, 2006), which goes back to the ‘metabolic competition’ mentioned earlier. N-3 also makes up about 10 percent of overall brain weight, since the first neurons evolved in an environment high in n-3. N-3 fatty acids were positively related to test scores in both men and women, while n-6 showed the reverse relationship (with a stronger effect in females). Furthermore, in female children, the effect of n-3 intake was twice as strong as in male children, and it exceeded the negative effects of lead exposure, suggesting that consuming more foods rich in n-3 and fewer foods rich in n-6 will improve cognitive abilities (Lassek and Gaulin, 2011).
The preponderance of evidence suggests that if parents want the healthiest and smartest babies, a pregnant woman should consume a lot of seafood while avoiding vegetable oils, total fat, and milk (fat, milk, and beef more so from animals that are grain-fed). Grass-fed beef has higher levels of n-3, which balances out the levels of n-6 in the beef. So if you want your family to have the highest cognition possible, eat more fish and less grain-fed beef and animal products.
In sum, if you want the healthiest, most intelligent family you can possibly have, the most important factor is…diet. Diets high in n-3 and low in n-6 are extremely solid predictors of cognitive performance, due to the ‘metabolic competition’ between the two fatty acids: n-6 accumulates in blood and tissue lipids, exacerbating the competition between linoleic acid (the most common form of n-6) and n-3 for metabolism and acylation into tissue lipids (Innis, 2014). Our HG ancestors had lower rates of n-6 in their diets than we do today, along with low to nonexistent disease rates; the difference is the abundance of n-6 in the modern diet, which was unknown to our ancestors. Yes, seafood intake had the biggest effect on the PISA math scores, which, in my opinion (I need to look at the data), is due in part to poverty. I’m very critical of PISA, especially as a measure of cognitive abilities, but this study is solid even with its pitfalls. I hope a study using an actual IQ test is done (and not Richard Lynn-style IQ tests that use children; a robust adult sample is the only thing that will satisfy me) to see if the results replicate.
I also think it’d be extremely interesting to get a representative sample from each country studied and somehow make it so that all maternal DHA levels are the same and then administer the tests. This way, we could see how all groups perform with the same amounts of DHA (and how much of an effect DHA really does have). Furthermore, nutritionally impoverished countries will not have access to the high-quality foods with more DHA and healthy fatty acids that lead to higher cognitive function.
It’s clear: if you want the healthiest family you could possibly have, consume more seafood.
It is assumed that since the advent of agriculture, we’ve been better nourished than our hunter-gatherer ancestors. This assumption stems from the past 130 years, since the advent of the Industrial Revolution and the increase in the quality of life of those who benefited from it. However, over a longer period of time, the advent of agriculture is linked to poorer health, vectors of disease, and lower quality of life (in terms of intractable disease). Despite what I have claimed in the past about hunter-gatherer societies, they do have lower or nonexistent rates of the diseases that currently plague our first-world societies. Why do we have such extremely high rates of disease that they don’t?
Contrary to popular belief, agriculture has caused decreases in many facets of our lives. These diseases, more aptly termed ‘diseases of civilization‘, are directly caused by agricultural and societal ways of living, which increase disease rates since it’s easier for diseases to spread through bigger populations. Moreover, we haven’t had time to evolve to the diet we now eat in first-world countries, which has led to what is termed an ‘evolutionary mismatch‘ between genes and environment. We evolved to eat a certain diet, and the introduction of easily digestible carbohydrates, which spike insulin the most, is one such mismatch. Since insulin causes weight gain, and carbohydrate intake has dramatically increased since the 70s, obesity has increased as countries industrialize and more processed foods become available to the populace.
However, since the Industrial Revolution, height has increased along with IQ. Researchers argue that in first-world countries, high rates of obesity are not preventable due to the excess of highly refined and processed foods, and there is data for this theory: in first-world countries, the heritability of BMI is between .76 and .85. Since first-world countries are industrialized, we would expect their populations to hit their ‘genetic height’ and have the ability to reach their IQ potential; with the excess of highly processed and refined foods, they would, in theory, also hit their ‘genetic weights’. This is what we see in first-world countries.
To see how first-world, industrialized societies cause these gene-environment mismatches, we can compare the disease rates, or lack thereof, of hunter-gatherers to those of Europeans eating an industrialized, first-world diet (high in carbohydrates).
In his 2013 book The Story of the Human Body: Evolution, Health, and Disease, paleoanthropologist Daniel Lieberman talks at length about evolutionary mismatches. The easiest way to think about this is to consider how a population evolved to fit its environment, and then how that population’s own activities alter the environment. A perfect example is African farmers: they may dig a trench to divert water to better irrigate their crops, but the increase in standing water breeds more mosquitoes, and genes that protect against malaria are then selected for. This is one example of an evolutionary mismatch turning into an advantage for a population. Most mismatch diseases are caused by changes in the environment that change how the body functions. In other words, the current first-world diet correlates very highly with diseases of civilization and drives most of the mismatch diseases. Most likely, you will die from one of these mismatch diseases.
If you’re born in a hotter environment, you will have more sweat glands than if you were born in a cooler environment. If you grow up eating soft, processed food, your face will be smaller than if you ate harder foods. These are two ways in which ‘cultural evolution’ (cultural change) have an effect on how the human body grows and adapts to certain stimuli based on the environment around it.
The largest cause of the difference in disease rates between industrialized peoples and those in hunter-gatherer societies is shifts in life history. As our life spans increased through modernization, so too did our chance of acquiring more diseases. Of course, living longer affects how many children you have, but it also raises your chances of acquiring an evolutionary mismatch disease and of dying from one.
Daniel Lieberman writes on page 190 of his book The Story of the Human Body:
A typical hunter-gatherer adult female will manage to collect 2,000 calories a day and a male can hunt between 3,000 and 6,000 calories a day. (24) A hunter-gatherer group’s combined efforts yield just enough food to feed small families. In contrast, a household of early Neolithic farmers from Europe using solely manual labor before the invention of the plow could produce an average of 12,800 calories per day over the course of a year, enough to feed families of six. (25) In other words, the first farmers could double their family size.
Thus you can see how evolutionary mismatches would occur with the advent of an agricultural diet that we didn’t evolve to be accustomed to. This is one of the biggest examples of the negative effects of agriculture: our inability to adapt quickly to our new diets, a problem that only accelerated after the Industrial Revolution. Further, hunter-gatherers will eat anything edible, while agricultural societies largely eat only what they grow. This has huge implications for farmers: since they rely on a few crops to survive, a few pests ruining those crops can be catastrophic.
The thing about farming is that the Agricultural Revolution increased population size while making populations essentially sedentary. This led to higher rates of disease, as larger populations foster new kinds of infectious diseases. Large populations didn’t exist until the advent of farming, and with them came the first plagues. The first farming villages were small, but “as the Reverend Malthus pointed out in 1798, even modest increases in a population’s birthrate will cause rapid increases in overall population size in just a few generations.” (Lieberman, 2013: 197) So even small increases in birthrate caused population booms in subsequent generations, which in turn drove disease acquisition and plagues in these new, stationary societies.
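Malthus’s point is just compound growth, and a few lines of arithmetic show how quickly it bites. The starting population and growth rate below are hypothetical, chosen only to illustrate the compounding; they are not figures from Lieberman.

```python
# Compound a starting population over `years` at a fixed annual growth rate.
def population(start: float, annual_growth: float, years: int) -> float:
    return start * (1 + annual_growth) ** years

# A hypothetical village of 50 growing at a modest 1% per year,
# versus a hunter-gatherer band at roughly zero growth.
village = 50
for generations in (1, 4, 10):              # ~25-year generations
    years = generations * 25
    print(f"after {generations:2d} generations: "
          f"{population(village, 0.01, years):.0f} farmers vs "
          f"{population(village, 0.00, years):.0f} foragers")
```

Even a 1 percent annual increase multiplies a village more than tenfold over ten generations, which is why sedentary farming populations crossed the density thresholds for epidemic disease so quickly while mobile foragers never did.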
Lieberman further writes on pages 199-200:
Not surprisingly, farming ushered in an era of epidemics, including tuberculosis, leprosy, syphilis, plague, smallpox and influenza. (44) This is not to say that hunter-gatherers did not get sick, but before farming, human societies primarily suffered from parasites such as lice, pinworms they acquired from contaminated food, and viruses or bacteria, such as herpes simplex, which they got from contact with mammals. (45) Diseases such as malaria and yaws (the nonvenereal precursor of syphilis) were probably also present among hunter-gatherers, but at much lower rates than in farmers. In fact, epidemics could not exist prior to the Neolithic because hunter-gatherer populations are below one person per square kilometer, which is below the threshold necessary for virulent diseases to spread. Smallpox, for example, is an ancient viral disease that humans apparently acquired from monkeys or rodents (the disease’s origins are unresolved) that was unable to spread appreciably until the growth of large, dense settlements. (46)
Moreover, another evolutionary mismatch is the lack of sanitation that comes with stationary societies. Hunter-gatherers could just go and defecate in a bush, whereas with the advent of civilization, waste and refuse began to pile up in the area. As noted above, when farmers clear space for irrigation to plant crops, this introduces mosquitoes into the area which then causes more disease. Furthermore, we have also acquired about 50 diseases from living near animals (Lieberman, 2013: 201). There are more than 100 evolutionary mismatch diseases that agriculture has brought to humanity.
We can compare the disease rates of people in industrialized societies and people in modern-day hunter-gatherer societies. In his 2008 book Good Calories, Bad Calories, Gary Taubes documents numerous instances of hunter-gatherer societies that have low to no rates of the same modern diseases that we have:
In 1914, Hoffman himself had surveyed physicians working for the Bureau of Indian Affairs. “Among some 63,000 Indians of all tribes,” he reported, “there occurred only 2 deaths from cancer as medically observed from the year 1914.” (Taubes, 2008: 92)
“There are no known reasons why cancer should not occasionally occur among any race of people, even though it be below the lowest degree of savagery and barbarism,” Hoffman wrote. (Taubes, 2008: 92)
“Granting the practical difficulties of determining with accuracy the causes of death among the non-civilized races, it is nevertheless a safe assumption that the large number of medical missionaries and other trained medical observers, living for years among native races throughout the world, would long ago have provided a substantial basis of fact regarding the frequency of malignant disease among the so-called “uncivilized” races, if cancer were met with among them to anything like the degree common to practically all civilized countries. Quite the contrary, the negative evidence is convincing that in the opinion of qualified medical observers cancer is exceptionally rare among the primitive peoples.” (Taubes, 2008: 92)
These reports, often published in the British Medical Journal, The Lancet or local journals like the East African Medical Journal, would typically include the length of service the author had undergone among the natives, the size of the local native population served by the hospital in question, the size of the local European population, and the number of cancers involved in both. F.P. Fouch, for instance, district surgeon of the Orange Free State in South Africa, reported to the BMJ in 1923 that he had spent six years at a hospital that served fourteen thousand natives. “I never saw a single case of gastric or duodenal ulcer, colitis, appendicitis, or cancer in any form in a native, although these diseases were frequently seen among the white or European population.” (Taubes, 2008: 92)
As a result of these modern processed foods, noted Hoffman, “far-reaching changes in bodily functioning and metabolism are introduced which, extending over many years, are the causes or conditions predisposing to the development of malignant new growths, and in part at least explain the observed increase in cancer death rate of practically all civilized and highly urbanized countries.” (Taubes, 2008: 96)
The preponderance of evidence shows that these people have low rates of disease that are endemic to our societies due to the advent of agriculture. There is one large difference between hunter-gatherer societies and industrialized ones: the type and amount of food we eat.
Along with the boom of agriculture, we see a slight decrease in height the longer people live in these types of societies. As the Neolithic began 11,500 years ago, height increased about 1.5 inches for males and slightly less for females. But around 7,500 years ago, stature began to decrease and we begin noticing evidence of nutritional stress and skeletal markers of disease. There is evidence that as maize was introduced into eastern Tennessee about 1,000 years ago, a decrease of .87 inches in men and 2.4 inches in women was seen. Further, the height of early farmers in China and Japan decreased by 3.1 inches as rice farming progressed, with similar height decreases being seen in Mesoamerica in men (2.2 inches) and women (3.1 inches).
Anti-hereditarian Jared Diamond asks the question “Was farming worth it?” in which he writes:
With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence.
The first two things he brings up are pretty Marxist in nature, though they are true. He implies that agriculture causes so-called ‘sexual inequalities’ in which women are made ‘beasts of burden’, made to do the work while men walk by ’empty handed’. This seems to be one negative to a society that is, supposedly, smarter than Europeans.
Regular readers may remember me criticizing Andrew Anglin and his stance on the paleo diet, with how it’s ‘how European man evolved to eat’. However, I am a data-driven person and I try not to let any bias get involved in my thought processes. I now do believe that we should eat a diet that closely mimics our hunter-gatherer ancestors’, and though we shouldn’t go overboard like certain people in the paleo community, we should be mindful of the quality of the food we do eat, as we will greatly increase our life expectancy along with our quality of life. Indeed, researchers have proposed that we adopt diets close in composition to what our hunter-gatherer ancestors ate in order to battle diseases of civilization. Based on what I’ve read over the past few months, I am inclined to agree. Evidence for this is seen in a sample of ten Australian Aborigines who were returned to their traditional lifestyle (O’Dea, 1984): over a 7-week period, they showed improvement in carbohydrate and lipid metabolism, effectively becoming diabetes-free in almost 2 months.
In sum, there were obviously both positive and negative effects on human life from the advent of agriculture (leaning more toward negative). These range from diseases to increased population size, to ‘social inequalities’, to higher rates of obesity (this evolutionary mismatch will be extensively covered in the future), to a whole myriad of other diseases, all of which lower the quality of life of the afflicted individual. However, the rates of these diseases are low to nonexistent in hunter-gatherer societies because they are nomadic and eat a more varied diet. Agricultural societies become dependent on a few staple crops, so when a crop failure or epidemic occurs, there is mass death, since they do not know how to subsist on anything but what they have become accustomed to. The advent of agriculture led to a decrease in stature as well as brain size. Further, agriculture and the processed foods that came with it made us more susceptible to obesity, which was further exacerbated by the Industrial Revolution and the ‘nutritional guidelines’ of the 60s and 70s that led to higher rates of coronary heart disease. It is the lifestyle change from agriculture, which we have not yet adapted to, that causes these diseases of civilization that shorten our life expectancies. I now believe that all people should eat a diet as close to a hunter-gatherer diet as possible, as that’s what the preponderance of evidence shows.
By the way, to my knowledge, contrary to what The Alternative Hypothesis says, there are no differences in carbohydrate metabolism between races (save for a few populations such as the Pima).