Genetic determinism is the belief that behavior and mental abilities are ‘controlled by’ genes. Gericke et al. (2017) note that “Genetic determinism can be described as the attribution of the formation of traits to genes, where genes are ascribed more causal power than what scientific consensus suggests”, which is similar to Oyama (1985), who writes, “Just as traditional thought placed biological forms in the mind of God, so modern thought finds ways of endowing genes with ultimate formative power.” Moore (2014: 15) notes that genetic determinism is “the idea that genes can determine the nature of our characteristics” or “the old idea that biology controls the development of characteristics like intelligence, height, and personality” (p. 39). (See my article DNA is not a Blueprint for more information.)
On the other hand, epigenetic determinism is “the belief that epigenetic mechanisms determine the expression of human traits and behaviors” (Wagoner and Uller, 2016). Moore (2014: 245) similarly notes that epigenetic determinism is “the idea that an organism’s epigenetic state invariably leads to a particular phenotype.” Both views are, of course, espoused in the scientific literature as well as in everyday social discourse. Both views, as well, are false.
The concept of genetic determinism was first proposed by Weismann in 1893 with his theory of the germplasm. This, in contemporary times, is contrasted with “blank slatism” (Pinker, 2002), or the Standard Social Science Model (SSSM; Tooby and Cosmides, 1992; see Richardson, 2008 for a response). Genes, genetic determinists hold, determine the ontogeny of traits, acting as a sort of “director.” But this betrays modern thinking on genes, what they are, and what they “do.” Genes do nothing on their own without input from the physiological system—that is, from the environment (Noble, 2011). Thus, gene-environment interaction is the rule.
This led to either-or thinking in regard to the origin of traits and their development—what we now call “the nature-nurture debate.” Nature (genes/biology) or nurture (experience, how one is raised), gene determinists hold, is the cause of certain traits, like, for example, IQ.
Plomin (2018) asserts that nature has won the battle over nurture—while also stating that they interact. So, which one is it? It’s obvious that they interact—if there were no genes there would still be an environment but if there were no environment there would be no genes. (See here and here for critiques of his book Blueprint.)
This belief that genes determine traits goes back to Galton—one of the first hereditarians. Indeed, Galton was the one to coin the phrase “nature vs nurture”, while being a proponent of ‘nature over nurture.’ Do genes or environment influence/cause human behavior? The obvious answer to the question is both do—and they are intertwined: they interact.
Griffiths (2002) notes that:
Genetic determinism is the idea that many significant human characteristics are rendered inevitable by the presence of certain genes; that it is futile to attempt to modify criminal behavior or obesity or alcoholism by any means other than genetic manipulation.
Griffiths then argues that genes are very unlikely to be deterministic causes of behavior. Genes are thought to carry a kind of “information” which then determines how the organism will develop—this is what the “blueprint metaphor” for genes attempts to capture. The implicit assumption here is that genes are context-independent—that the (environmental) context the organism is in does not matter. But genes are context-dependent—“the very concept of a gene requires the environment” (Schneider, 2007). There is no such “information”—genes are not like blueprints or recipes. So genetic determinism is false.
The claim that genes determine phenotypes can be shown in the following figure from Kampourakis (2017: 187):
Figure 9.6 (a) The common representation of gene function: a single gene determines a single phenotype. It should be clear by what has been presented in the book so far that this is not accurate. (b) A more accurate representation of gene function that takes development and environment into account. In this case, a phenotype is produced in a particular environment by developmental processes in which genes are implicated. In a different environment the same genes might contribute to the development of a different phenotype. Note the “black box” of development.
Richardson (2017: 133) notes that “There is no direct command line between environments and genes or between genes and phenotypes.” The fact of the matter is, genes do not determine an organism’s characters; they are merely implicated in the development of the character, being passive, not active, templates (Noble, 2011).
Moore (2014: 199) tells us how genetic determinism fails since genes do not work in a vacuum:
There is just one problem with the neo-Darwinian assumption that “hard” inheritance is the only good explanation for the transgenerational transmission of phenotypes: It is hopelessly simplistic. Genetic determinism is a faulty idea, because genes do not operate in a vacuum; phenotypes develop when genes interact with nongenetic factors in their local environments, factors that are affected by the broader environment.
On the other hand, epigenetic determinism, the belief that epigenetic mechanisms determine the behavior of the organism, is false but in the other direction. Epigenetic determinists decry genetic determinism, but I don’t think they realize that they are just as deterministic as those they criticize.
Dupras et al (2018) note how “overly deterministic readings of epigenetic marks could promote discriminatory attitudes, discourses and practices based on the predictive nature of epigenetic information.” While epigenetics—specifically behavioral epigenetics—refutes notions of genetic determinism, we can then fall into a similar trap: determinism all the same. And since genes do not determine traits, neither does epigenetics: we cannot epigenetically manipulate traits pre- or perinatally, since what we would attempt to manipulate—‘intelligence’, contentment, happiness—develops over the lifespan. Moore (2014: 248) continues:
Even in situations where we know that certain perinatal experiences can have very long-term effects, determinism is still an inappropriate framework for thinking about human development. For example, no one doubts that drinking alcohol during pregnancy is bad for the fetus, but in the hundreds of years before scientists established this relationship, innumerable fetuses exposed to some alcohol nonetheless grew up to be healthy, normal adults. This does not mean that pregnant women should drink alcohol freely, of course, but it does mean that developmental outcomes are not as easy to predict as we sometimes think. Therefore, it is probably always a bad idea to apply a deterministic worldview to a human being. Like DNA segments, epigenetic marks should not be considered destiny. How a given child will develop after trauma, for example, depends on a lot more than simply the experience of the trauma itself.
In an interview with The Psych Report, Moore tells us that people do not yet know enough about epigenetics for there to be epigenetic determinists (though many journalists and some scientists talk as if they are):
I don’t think people know enough about epigenetics yet to be epigenetic determinists, but I foresee that as a problem. As soon as people start hearing about these kinds of data that suggest that your early experiences can have long-term effects, there’s a natural assumption we all make that those experiences are determinative. That is, we tend to assume that if you have this experience in poverty, you are going to be permanently scarred by it.
The data seem to suggest that it may work that way, but it also seems to be the case that the experiences we have later in life also have epigenetic effects. And there’s every reason to think that those later experiences can ameliorate some of the effects that happened early on. So, I don’t think we need to be overly concerned that the things that happen to us early in life necessarily fate us to certain kinds of outcomes.
While epigenetics refutes genetic determinism, we can run into the problem of epigenetic determinism, which Moore predicts. Journalists already write as if genes can be turned on or off by the environment, thereby dictating disease states, for example. Though, biological determinism—of any kind, epigenetic or genetic—is nonsensical as “the development of phenotypes depends on the contexts in which epigenetic marks (and other developmentally relevant factors, of course) are embedded” (Moore, 2014: 246).
What really happens?
What really happens regarding development if genetic and epigenetic determinism are false? It’s simple: causal parity (Oyama, 1985; Noble, 2012): the thesis that genes/DNA play an important role in development, but so do other variables, so there is no reason to privilege genes/DNA above other developmental variables. Genes are not special developmental resources, nor are they more important than other developmental resources; the thesis is that genes and other developmental resources are developmentally ‘on par’. ALL traits develop through an interaction between genes and environment—nature and nurture. Contra ignorant pontifications (e.g., Plomin), neither has “won out”—they need each other to produce phenotypes.
So, genetic and epigenetic determinism are incoherent concepts: nature and nurture interact to produce the phenotypes we see around us today. Developmental systems theory, which integrates all factors of development, including epigenetics, is the superior framework to work with, but we should not, of course, be deterministic about organismal development.
A not uncommon reaction to DST is, ‘‘That’s completely crazy, and besides, I already knew it.” — Oyama, 2000, 195, Evolution’s Eye