
Reflections Blog

Perspective

1/26/2023

Neuroscience, Life Advice, Personal Perspective
It's all a matter of perspective.

Human beings are remarkably adaptable creatures. In fact, our ability to adapt to different climates and environmental circumstances has allowed Homo sapiens to colonize virtually all of planet Earth. Essentially, adaptability is our evolutionary advantage.
Habituation and Unconscious Behaviors
Adaptability is a double-edged sword, however. We often become so accustomed to a particular state that we forget what a different state can feel like. Biologists might resonate with explaining this in terms of homeostasis, where the body seeks to maintain a steady state of internal conditions (think temperature, pH, etc...). Our brains are no different. A neuroscientist might explain the "homeostasis" of our minds as habituation. In its most classic form, habituation involves our minds becoming accustomed to a constant stimulus to the point that it is not perceived after a period of time. A good example is the texture or feeling of our clothes on our skin. There is certainly a stimulus being applied but it becomes essentially imperceptible as we habituate to its constant presence. In essence, our conscious minds filter out this stimulus as it is not new, novel, or salient enough to devote attention to.  

Both our perception of external stimuli and our behavior can become habitual. Our ability to interpret and react to the world consistently produces habits. Stimulus produces response almost reflexively once a habit is formed, and conscious thought about why a particular action was taken is often absent. Habits are often useful as they free up cognitive resources and allow "routine" actions to proceed automatically. There is no need to think about how to walk once you have learned the movement, and, at a higher cognitive level, bicycling or driving to work every day ultimately proceeds on autopilot after you have been using the same route for a month. Because of this amazing capability of our minds, we can think about other issues and goals during our commute as the "automatic" processes of our brains take over to get us from home to work.

The unconscious nature of habits means that we are often unaware of why we make choices or take actions that have become habitual. We may not even be aware of, or able to resist, engaging in actions that are objectively "bad" or harmful. A classic example is drug addiction. One hallmark of being addicted to a drug of abuse is that use of the drug becomes habitual (automatic) and that addicted individuals continue their drug use despite negative consequences. This occurs because drug use has become habitual in a biological sense, often triggered by stimuli in the environment that prompt craving and use in a powerfully unconscious way. There is strong evidence that habit and "wanting" drive drug use more than "liking" once a drug has become addictive.
Drug addiction may be one of the starkest demonstrations of how corrosive and destructive habits, and the unconscious processes between stimulus and response, can be in our lives. It is far from the only problematic behavior fueled by the environment acting on core neurobiological processes. Our modern world has fostered a variety of problematic habits, many of which are driven by the ability to obtain entertainment and content in an instant. Our attention is also sapped by a plethora of digital signals coming from our screens and by appeals to our basal instincts of pleasure seeking and pain avoidance. The effects of technological proliferation on our brains and behavior are being studied, and a particular focus on how it is shaping adolescents' minds during development is critical.

Personally, I feel patience and taking the long view are in short supply these days. The current climate leads many to think feedback or "results" should be instantaneous in all aspects of their lives. We expect a response to rapidly follow action in the 21st century, but most of life does not deliver feedback as quickly as clicking "buy now" on a smartphone. Overcoming these modern temptations is a challenge because of how easily they tap into habitual behaviors and our core needs of resource acquisition, human acknowledgement, belonging, and more. Fortunately, however, we have the ability to consciously frame our experience of the world in positive, constructive ways and take steps to behave accordingly.
Individual Differences in How We Interact with and See the World
Humans are exceptionally good at allowing their perspective to construct their version of the world.
In our modern information age, one can easily be captured by negative headlines. While negative information is certainly more attention-grabbing (i.e., salient), that does not mean there are no positive narratives to speak of.

In addition, many events or outcomes we experience are not objectively ALL negative or positive. Rather, there is a perspective that can often be taken that sees the positive in mostly negative events or the negative in mostly positive ones. 

I believe some human beings are wired to be more drawn to the positive or negative aspects of an experience...seeing the flaws in nearly all things or viewing the world through rose-colored glasses. Indeed, data show individual differences in the experience of stimuli as positive or negative, which may have a biological basis (see also). Through conscious decisions and processes, however, we can regulate our innate biological tendencies to focus on the negative or positive.
Our perspective and view of the world ultimately shapes how we interact with it. If you feel the world is a hostile place and that everyone around you is motivated by their own self-interest, you will begin to take the same perspective. On the other hand, if you believe most human beings are altruistic and get fulfillment from helping others, you will perceive your interactions differently.
This can perhaps best be illustrated by thinking about the many instances we encounter in a day where we are trying to discern a person's intent or motivation. This can be especially difficult when the communication comes in a form where tone and other cues are absent - email.

When you receive an email with a comment or request you project onto it your own belief about what the person intended to communicate. It is critical, then, to try to "read" the message from multiple perspectives and not assume that it was written with either ill intent or effusive praise. 
When we are faced with fear and uncertainty, I think it is even more important to keep our perspective and not spiral into a negative state. Indeed, anxiety and stress heighten our negativity bias. A tendency to engage in cognitive reappraisal, or changing the way one thinks about potentially emotion-eliciting events, can mitigate these effects, however.
Another concept that comes to mind when thinking about perspective is the impact a growth versus fixed mindset can have on our willingness to learn and develop. Stanford psychologist Carol Dweck coined these terms, and she and her colleagues have researched how growth and fixed mindsets affect us. Those with a growth mindset believe that, with effort, perseverance, and drive, they can develop their natural qualities and "improve". In contrast, those with a fixed mindset believe talent and abilities are fixed/innate and are thus less likely to expend effort trying to enhance their skillsets.
A similar concept is that of locus of control. Locus of control describes the degree to which individuals perceive that outcomes result from their own behaviors (internal locus of control), or from forces that are external to themselves (external locus of control).  

​We could all do better by developing a growth mindset and internal locus of control as we navigate a complex world. 
Shifting Perspectives
In an increasingly polarized and atomized United States and world, considering others' perspectives becomes a critical skill in short supply. It takes more cognitive resources and effort to consider other perspectives and ideas. This contemplation requires us to slow down and not rush to judgement. The process also requires decoupling our perception of a person's intentions from that individual's actual intent. As we've discussed, it is easy to fall into negative assumptions or construct narratives of ill intent or maliciousness. While those assumptions could be true, starting from a negative space is rarely productive or effective.

I choose to carefully reframe my perceptions of interactions before responding, taking a measured approach to understand the other party's position and viewpoint. While this takes time and effort, changing our default perceptions and habits can lead us to a more productive relationship with others and the world.
Related Items from the Blog:
  • Wanting, Liking, and Dopamine's Role in Addiction
  • To Be Rather Than to Seem
​
Further Reading:
  • Brain health consequences of digital technology use
  • ​The impact of the digital revolution on human brain and behavior: Where do we stand?
  • Where do desires come from? Positivity offset and negativity bias predict implicit attitude toward temptations
  • Negativity bias, negativity dominance, and contagion (PDF)
  • The psychological and neurobiological bases of dispositional negativity (PDF)
  • Propensity to reappraise promotes resilience to stress-induced negativity bias (PDF)
 
  • The Impulse Society: America in the Age of Instant Gratification (book)
  • Positive Emotions and Psychophysiology Lab at UNC Chapel Hill (led by Barbara Fredrickson, who developed the Broaden-and-Build Theory of Positive Emotions)

Dopamine, Drug Addiction, & Personalized Medicine

12/2/2021

Neuroscience, Personalized Medicine
What is dopamine?
Dopamine is a neurotransmitter, a chemical that shapes how the brain processes information. It does this by binding to different categories of dopamine receptors, which then leads to changes in the intracellular processes of neurons, the cells responsible for transmitting information within and outside the brain. The D1 family of dopamine receptors (D1 & D5) increases intracellular levels of a chemical second messenger, cyclic AMP, which can then affect how a neuron processes other signals it receives. The D2 family of dopamine receptors (D2, D3, & D4) decreases cyclic AMP, which can also shape neural responses. How dopamine signals interact with other signals in the brain can be quite complex and is beyond the scope of this piece. For more see this review article.
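To keep the two receptor families straight, here is a toy Python lookup summarizing the paragraph above (my own illustration, not any library's API):

```python
# Toy summary of the two dopamine receptor families described above.
# D1-like receptors raise cyclic AMP (cAMP); D2-like receptors lower it.
RECEPTOR_FAMILIES = {
    "D1-like": {"members": ("D1", "D5"), "camp_effect": "increases cAMP"},
    "D2-like": {"members": ("D2", "D3", "D4"), "camp_effect": "decreases cAMP"},
}

def camp_effect(receptor: str) -> str:
    """Return the direction of the cAMP change for a receptor subtype."""
    for family, info in RECEPTOR_FAMILIES.items():
        if receptor in info["members"]:
            return f"{receptor} ({family}): {info['camp_effect']}"
    raise ValueError(f"Unknown dopamine receptor: {receptor}")

print(camp_effect("D3"))  # D3 (D2-like): decreases cAMP
```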

Dopamine signaling plays a role in a variety of critical cognitive processes including motor control, learning, and decision making. It has also been implicated in the addictive nature of drugs of abuse, which I studied in some detail during my Ph.D. and postdoctoral research. 
Positron Emission Tomography and measuring dopamine signaling in the human brain
Positron Emission Tomography (PET) allows scientists to measure dopamine signaling in the living brain. PET has been around since the 1960s and involves imaging the location and amount of a radiotracer (radioactively-tagged compound) in the body. Most PET radiotracers contain C-11, F-18, or O-15 radioactive isotopes. These isotopes release positrons (the antiparticle of the electron) which, when they interact with nearby electrons in the body, produce an annihilation event in which two gamma ray photons are emitted at 180 degrees from each other. The PET scanner "counts" these gamma ray events and ultimately reconstructs the image that produced them by projecting the gamma ray counts back into the body part being imaged. These PET images give quantifiable data regarding the amount of tracer that accumulates in a particular area over time.
Schematic of how a PET scanner measures gamma rays to quantify the level of a radiotracer in particular anatomical areas of the brain. Image by Jens Maus (http://jens-maus.de/); Public Domain, https://commons.wikimedia.org/w/index.php?curid=401252
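To give a flavor of the "projecting counts back" step, here is a minimal Python sketch of unfiltered back-projection (my own toy illustration, not actual scanner software; real scanners use filtered back-projection or iterative reconstruction):

```python
import numpy as np

def back_project(sinogram: np.ndarray, angles_deg: np.ndarray) -> np.ndarray:
    """Unfiltered back-projection of gamma-ray counts.

    sinogram: (n_angles, n_detectors) counts recorded at each view angle.
    Each view's counts are smeared back across the image along the
    detector lines they could have come from.
    """
    n_det = sinogram.shape[1]
    image = np.zeros((n_det, n_det))
    centre = (n_det - 1) / 2.0
    ys, xs = np.mgrid[0:n_det, 0:n_det]
    for counts, theta in zip(sinogram, np.deg2rad(angles_deg)):
        # Detector bin that each image pixel projects onto at this angle.
        t = (xs - centre) * np.cos(theta) + (ys - centre) * np.sin(theta) + centre
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
        image += counts[idx]
    return image / len(angles_deg)

# Demo: a point source at the centre yields counts in the central detector
# bin at every angle; back-projection recovers a blurred spot there.
angles = np.arange(0, 180, 10)
sino = np.zeros((len(angles), 65))
sino[:, 32] = 1.0
img = back_project(sino, angles)
print(np.unravel_index(img.argmax(), img.shape))  # (32, 32), the centre
```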
​Brain PET is a particularly powerful technique in that we can use radiotracers that allow us to investigate brain metabolism, neurotransmitter receptors (dopamine or opioid, among others), neurotransmitter synthesis, and the presence of beta-amyloid plaques (often present in Alzheimer's disease). With these compounds we gain a better understanding of individual differences that may be useful as markers of disease state or risk for developing a particular disease. Common radiotracers for imaging the dopamine system include FDOPA, C-11-Raclopride, F-18-Fallypride, FMT, and others. Several groups have used some of these compounds to better understand the dopamine system's role in drug abuse. 
Do dopamine signaling differences reflect risk for drug addiction?
All drugs of abuse release dopamine in the brain. Dopamine, among other things, links pleasure/wanting with the stimuli its release is paired with. Thus, differences in dopamine signaling in response to drugs of abuse may relate to a greater propensity to re-use drugs found to be rewarding and potentially lead to increased risk for drug addiction.

PET imaging has shown that lower dopamine D2/3 receptors are present in a variety of drug-addicted individuals (alcohol, cocaine, methamphetamine, heroin) when compared to healthy controls. Whether low D2 receptors are a cause or consequence of problematic drug use has been difficult to determine in human studies, however.
Animal work has suggested that behavioral impulsivity is associated with lower D2 receptor levels in rodents. These researchers also found that highly impulsive rats would later go on to self-administer more cocaine than less impulsive rats (Dalley et al., 2007). Thus, low D2 receptor levels may confer a greater propensity to engage in behaviors that are associated with drug addiction risk in humans (impulsivity, novelty seeking). Furthermore, work in non-human primates has shown that low D2 receptor levels predict escalation in cocaine self-administration, which in turn lowers D2 receptor levels further (Nader et al., 2006). This work suggests that low D2 receptor levels may predispose individuals to escalate drug use and that chronic drug use further changes these receptor levels.
Human PET studies have focused on individuals with a family history of addiction to try to corroborate the animal work linking dopamine D2 receptors with addiction risk. Volkow et al. (2006) showed that individuals with a family history (FH) of alcoholism have heightened D2 receptor levels in the striatum (a region deep in the brain responsible for reward processing, learning, and action initiation) compared to subjects without a family history. They argue these high D2 levels may serve as a protective factor that prevented these individuals from becoming alcohol abusers themselves. This finding highlights the complexity of working with human subjects, as the animal literature might have suggested the opposite finding (lower D2 in FH individuals). Human motives to use drugs are many, and the environment often greatly shapes behavior. It could be argued that FH positive individuals with lower D2 (not observed in Volkow et al.) had behavioral profiles (see Dalley et al., 2007, above) that resulted in them already transitioning to alcohol/drug abuse and thus being excluded from the Volkow study. Undoubtedly, there are more variables associated with risk for drug use than low D2 levels, and future work may be able to identify what other factors (genetic, environmental, social) interact with D2 levels to predict drug abuse risk.
Genetic factors affecting dopamine signaling
There has also been interest in understanding whether genetic differences may lead to different levels of D2 receptor availability, potentially placing some individuals at greater risk for addictive disorders. I investigated the effect of some common D2 receptor single nucleotide polymorphisms (SNPs) on D2 receptor availability using F-18-Fallypride as part of my postdoctoral research. Many of these SNPs had previously been associated with dopamine receptor differences in relatively small PET studies or with potentially increased risk for drug addiction.
  • Taq1A - A1 allele associated with lower striatal D2 receptor availability (replicated in separate study but not in a third)
  • C957T - C allele associated with lower striatal D2 receptor availability in study of 45 individuals 
  • -141C Ins/Del - inconsistent findings on whether it affects D2 receptor availability
For more see: Genetic variation and dopamine D2 receptor availability: a systematic review and meta-analysis of human in vivo molecular imaging studies
Because the Taq1A SNP was the first discovered to associate with differences in dopamine signaling, researchers have used it as a proxy for D2 receptor status (or, more loosely, as an index of general dopamine functioning). However, given that the Taq1A polymorphism does not occur within the DRD2 gene itself, researchers have speculated that Taq1A may be co-inherited with other SNPs in the DRD2 gene that are the real drivers of expression of the receptor in vivo.

The C957T and -141C Ins/Del polymorphisms are in strong linkage disequilibrium with Taq1A and have themselves been associated with striatal D2/3 receptor availability. Despite the data suggesting that these SNPs are strongly linked, few studies have systematically investigated the effect of C957T, -141C Ins/Del, and Taq1A in isolation and combination on D2/3 receptor availability. Beyond the potential link to drug addiction risk, characterizing the functional effect of these SNPs on D2/3 receptor availability has implications for better understanding the mechanisms through which they exert their demonstrated influence on motivated behaviors including learning and decision making, impulsivity, and reward responsivity. 

In our work, we used F-18-Fallypride, which is a D2/3 receptor tracer with favorable affinity to measure both striatal and extrastriatal dopamine receptors, and assessed the impact of C957T, Taq1A and -141C Ins/Del SNPs on D2/3 receptor availability in a sample of 84 healthy subjects.
The C allele of the C957T SNP was associated with lower D2/3 receptor availability in the ventral striatum and putamen. No other SNP investigated demonstrated an effect on D2/3 receptor availability. BPnd=binding potential, a measure of D2/3 receptor availability; VS=ventral striatum
We found that the C957T SNP was associated with variation in dopamine D2/3 receptor availability in areas of the striatum often implicated in reward processing. The fact that the C allele was associated with lower dopamine receptor availability suggests it could be a useful genetic measure for at least one biological factor (lower D2 receptor availability) linked with drug addiction. While more work needs to be done to confirm these results, certainly further study of the C957T SNP in the DRD2 gene is warranted. 
Individual differences in dopamine release
Another area of focus regarding dopamine’s role in addiction is understanding differences in dopamine release to potential drugs of abuse. This measure is more closely tied to the biological processes engaged by actual drug use, but it is collected in a more controlled, laboratory setting. PET psychostimulant challenge studies allow researchers to examine dopamine release in the brains of human subjects. Methylphenidate and d-amphetamine (dAMPH) are often used in these PET studies as both release dopamine in the brain by blocking and/or reversing the dopamine transporter. If PET radiotracers that are displaceable by endogenous dopamine are used, researchers can perform a PET scan after placebo or psychostimulant administration and measure the change in radiotracer signal. For a displaceable tracer, the PET signal will go down after a psychostimulant because the increased endogenous dopamine released by the drug reduces the binding sites available to the tracer in the brain. This change in binding potential of the radiotracer can be used as a measure of dopamine release and has become a useful tool in research into addiction-related processes.
Areas of significant change in D2/3 receptor availability as measured by F-18-Fallypride PET after dAMPH administration when compared to PET data collected on placebo. This change in receptor availability on dAMPH is interpreted as a measure of the level of dopamine release to the dAMPH. Data from 34 healthy young adults. dAMPH=d-amphetamine
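The quantification conventions differ across studies, but the basic arithmetic behind "change in binding potential" can be sketched in a few lines of Python (illustrative numbers only, not data from any study):

```python
def percent_bp_change(bp_placebo: float, bp_drug: float) -> float:
    """Percent reduction in radiotracer binding potential (BPnd) on the
    drug scan relative to the placebo scan. A larger positive value is
    read as greater endogenous dopamine release displacing the tracer."""
    return 100.0 * (bp_placebo - bp_drug) / bp_placebo

# e.g., BPnd falling from 20.0 on placebo to 17.0 on dAMPH implies
# ~15% displacement, interpreted as dopamine release to the drug.
print(percent_bp_change(20.0, 17.0))  # 15.0
```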
Using this PET technique, Casey et al. (2014) found that young adults with a multigenerational FH of substance use disorders showed less dAMPH-induced dopamine release than either healthy controls or subjects who personally used drugs at levels similar to the FH group but lacked a FH of substance use disorders. This study was particularly informative as the effects of current drug use were investigated and measured separately from family history. Furthermore, our group and others have demonstrated that dAMPH-induced dopamine release correlates with subjective ratings of the drug, particularly wanting more, in drug-naïve individuals. These data confirm animal work linking changes in dopamine signaling after drug use to wanting processes (which has been labeled incentive salience).

Read more about wanting, liking, and drug abuse in a previous blog post.
​
The concept of blunted dopamine signaling (lower D2 receptor levels and less dopamine release) as a biomarker of addiction has also been recently reviewed (Trifilieff et al., 2017; Leyton, 2017). While more work needs to be done, understanding factors that influence these PET-based biomarkers of dopamine signaling in human subjects has the potential to identify at-risk individuals. This risk identification may allow intervention to be attempted earlier in the addiction process or perhaps prevent addiction before it even occurs.
Individual differences in dopamine signaling and the future of personalized medicine
The term “personalized medicine” has gained popularity in recent years. While it may seem like a buzzy term, its potential for improving treatment of a variety of medical conditions is vast. Personalized medicine involves tailoring treatments to individuals based on some aspect of their biology that might affect how they respond to a treatment. For example, you might give one patient with a particular genetic variant a different pharmacological treatment than another if that variant affects how they process (metabolize) or respond to that particular drug. This particular approach of using genetic information to understand response to pharmaceuticals is termed pharmacogenomics (see also).
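As a purely hypothetical sketch of that logic (metabolizer categories are real pharmacogenomic concepts, but the dose factors here are invented for illustration and are not clinical guidance):

```python
# Hypothetical genotype-guided dosing table (illustrative only, NOT
# clinical guidance): scale a starting dose by the patient's
# metabolizer status for the enzyme that clears the drug.
DOSE_FACTOR = {
    "poor_metabolizer": 0.5,        # clears the drug slowly -> lower dose
    "normal_metabolizer": 1.0,      # reference dose
    "ultrarapid_metabolizer": 1.5,  # clears the drug quickly -> higher dose
}

def starting_dose(standard_dose_mg: float, metabolizer_status: str) -> float:
    """Scale a standard dose by a (made-up) genotype-derived factor."""
    return standard_dose_mg * DOSE_FACTOR[metabolizer_status]

print(starting_dose(10.0, "poor_metabolizer"))  # 5.0 mg
```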
The rapid reduction in the cost to sequence the human genome (the complete set of an individual’s DNA), as well as the proliferation of genotyping services such as 23andMe (which genotype common genetic polymorphisms, or areas in human DNA most likely to vary across individuals), means that genetic data can be readily obtained by anyone who wants it. This technological advance will give physicians greater insight into a patient’s underlying biology and will eventually be merged with growing insights into the effects of genetic variation on drug metabolism, brain signaling, and behavior to make personalized medicine commonplace. In fact, the FDA has added pharmacogenomic information to the labeling of several drugs.

My own work, referenced above, suggests that genetic variation in a gene encoding the dopamine D2 receptor (DRD2) can affect the relative availability of this receptor in the brain as measured with PET (Smith et al., 2017 Translational Psychiatry). Individuals with a particular genetic variant in DRD2 that is associated with less availability of the receptor (C957T CC individuals) may need either a higher dose of a D2 drug or a higher affinity D2 drug to receive a therapeutic benefit.

The implications for this finding go beyond potential treatments or interventions for drug addiction. D2 agonists are commonly used in Parkinson’s Disease patients to preserve motor function and D2 antagonist-like drugs are used in the treatment of Schizophrenia. Understanding the genotype of individuals affected with these conditions, then, could enhance the effectiveness of their D2 drug treatments (by suggesting a physician might want to start with a higher or lower dose of the drug). While studies such as ours linking genetic variation with differences in biology are encouraging, DNA can also be modified by the environment. Researchers have begun studying these epigenetic effects on behavior, with most work occurring in rodents. As we integrate this knowledge, we will begin to better understand the impact gene by environment interactions have on biology and behavior.
Non-genetic factors also influence dopamine signaling
Genetics are not the only variables that could be worth attending to in future treatments. Dopamine signaling is known to decline with age (see also a previous blog post on this topic), so doses of dopaminergic drugs that work well in young adults might need to be titrated in older adults. Furthermore, we and others have shown that estradiol levels in naturally cycling women can affect dopaminergic brain functions (assessed with fMRI imaging and a genetic variant, COMT, known to affect dopamine levels in the higher-order, prefrontal areas of the brain). Thus, a dopaminergic medication might be more effective at treating a female patient’s symptoms at certain points of her menstrual cycle but not others. We are only beginning to understand the role of female sex hormones in a variety of biological systems, as basic research has historically focused on male model organisms.
Dopamine signaling complexity and developing future treatments
The role of dopamine in drug addiction is quite complex. In addition, implementing personalized medicine when treating psychiatric or behavioral disorders is challenging, as most of these disorders do not have a single, identifiable biological cause. The brain is complex enough on its own, and the fact that genetics, sex hormones, age, and environment can all affect one neurotransmitter (dopamine) among the many others involved in brain function speaks to the vast challenge that lies ahead for researchers.

​Our quest to better understand individual differences, however, has the potential to lead to more targeted treatments and therapies for a variety of dopamine-associated disorders including ADHD, Schizophrenia, Parkinson’s Disease, and drug addiction. The development of these personalized treatments will undoubtedly improve healthcare in the 21st Century and beyond but will require further research focused on measuring and categorizing individual differences. 
​

Explore more neuroscience-related posts on the blog:
  • ​Declining Dopamine: How aging affects a key modulator of reward processing and decision making
  • Stress & the Brain: How genetics affects whether you are more likely to wilt under pressure
  • Wanting, Liking, & Dopamine's Role in Addiction
  • Now vs Later - How immediate reward selection bias may be a risk factor for addiction 

More scholarly articles on dopamine and its effects:
  • What does dopamine mean?
  • Fifty years of dopamine research
  • Dopamine, behavior, and addiction
  • Dopamine and effort-based decision making

Now vs Later - How Immediate Reward Selection Bias May be a Risk Factor for Addiction

10/28/2021

Neuroscience
It has been over 7 years since I defended my Ph.D. dissertation in March 2014 at the University of North Carolina at Chapel Hill. Here, I wanted to share some of the rationale and implications of my graduate research on immediate reward selection bias in humans. While this research encompassed 5+ years of my life and resulted in a 112-page dissertation, I will focus on the key points and findings and why they are important. I have moved on from doing this research in my current role, but it will forever be a part of my identity. In addition, I hope my scientific contributions have added a bit more to our understanding of substance abuse risk factors and how we might either intervene proactively to support those at risk for addiction or treat some of the behavioral patterns in addicted individuals that can continue the cycle of problematic drug use despite its negative consequences.
What is an intermediate phenotype? 
Many psychiatric disorders, including schizophrenia and depression, are complex and heterogeneous (i.e., they have diverse and varied symptoms and potential causes). The highly heritable nature of these disorders, estimated from twin studies to be anywhere from 40 to 80% (Sullivan et al., 2000; Sullivan et al., 2003), suggests that some biological processes mediated by genetics must confer risk for developing the disorders. It has been proposed that the inability to isolate strong biological bases for how genetic variation leads to complex, highly heritable diseases lies in the fact that various intermediate behaviors or traits are more closely tied to the genetics associated with the disease (Rasetti and Weinberger, 2011).

Given that substance use disorders (SUDs) are also complex disorders (people consume and continue to use drugs of abuse due to a variety of factors) with heritability estimates ranging from 40 to 60% (Heath et al., 2001; Verweij et al., 2010; Bierut, 2011; Agrawal et al., 2012), the identification of intermediate phenotypes associated with risk for these disorders is a growing focus of research (Karoly et al., 2013). Behavioral candidates for SUD intermediate phenotypes include reduced response inhibition (Acheson et al., 2011; Norman et al., 2011), increased risk taking behavior (Cservenka and Nagel, 2012; Schneider et al., 2012), aberrant reward responsivity (Wrase et al., 2007; Andrews et al., 2011), and increased discounting of delayed monetary rewards (Mitchell et al., 2005; Boettiger et al., 2007; Claus et al., 2011; MacKillop et al., 2011; MacKillop, 2013).
Criteria for categorizing a behavior as an intermediate phenotype
For an intermediate phenotype to be useful it must be a quantitative, continuously variable feature or behavior that can be consistently measured. Furthermore, as these intermediate phenotypes are thought to convey genetic risk for a disorder, they should be elevated in those affected with the disorder as well as in those individuals’ close relatives who share genetic similarity with them. Importantly, the level of these phenotypes in affected individuals and their close relatives should be shifted away from a distribution of those otherwise unaffected with no familial risk (Gottesman and Gould, 2003). For example, Egan et al. (2001) found unaffected siblings of those with schizophrenia to display executive function deficits that fell between unaffected nonrelatives and individuals with schizophrenia.

A variety of criteria have come to define an intermediate phenotype in psychiatry (Almasy and Blangero, 2001; Gottesman and Gould, 2003; Waldman, 2005; Meyer-Lindenberg and Weinberger, 2006):
  1. The phenotype should be sufficiently heritable with genetics explaining variance in the behavior.
  2. The phenotype should have good psychometric properties as it must be reliably measurable to be a useful diagnostic.
  3. The phenotype needs to be related to the disorder and its symptoms in the general population.
  4. The phenotype should be stable over time in that it can be measured consistently with repeated testing, potentially to assess treatment effects.
  5. The behavior should show increased expression in unaffected relatives of those with the disorder as highlighted by Egan et al. (2001), above.
  6. The phenotype should co-segregate with the disorder in families in that a family member with the disorder should show the behavior or trait to a greater degree than an unaffected sibling and that this unaffected sibling should display the trait to a greater degree than a distant unaffected relative.
  7. The phenotype should have common genetic influences with the disorder.

To illustrate the intermediate phenotype concept and associated criteria, we can look at research in schizophrenia. Schizophrenia is associated with poor performance (and hyperactivity in an area of the brain known as the dorsolateral prefrontal cortex, dlPFC) on executive function tasks. As mentioned above, Egan et al. (2001) found unaffected siblings of those with schizophrenia to display executive function deficits that fell between unaffected nonrelatives and individuals with schizophrenia. Furthermore, genes affecting dlPFC activity and executive functions such as the catechol-O-methyltransferase (COMT) gene explain variation in schizophrenia risk (see Egan et al., 2001). Thus, by investigating a specific behavior (executive function) and its neural correlate (dlPFC activity) in schizophrenic patients and those at increased genetic risk for the disorder, a genetic factor (COMT) was isolated. Schizophrenia is caused by more than one genetic variation but this example illustrates the value of identifying a link between a behavior associated with schizophrenia (an intermediate phenotype) and a potential biological and genetic basis for said behavior.  ​

What is immediate reward selection (Now) bias?
Delay discounting (DD) behavior reflects the tendency for animals (including humans) to discount the value of delayed rewards in comparison to those available immediately. DD has also been referred to as immediate reward selection (“Now”) bias, as the value of rewards available immediately supersedes waiting for a larger, delayed reward in the future (Rachlin and Green, 1972). Other terms for this behavior include temporal discounting or hyperbolic discounting, as plots of the value of time-delayed rewards relative to present value often take a hyperbolic shape, with the present value of a reward decreasing steeply and non-linearly as the delay grows. In other words, $100 in 1 month may be worth $50 in present value while $100 in 3 months is worth only $25: the value falls by $50 over the first month but by only another $25 over the following two.
Example of temporal discounting behavior. The present value of a reward decreases with the time one must wait to receive it. Individuals differ in the degree to which they discount rewards over time. The individual whose choice behavior is plotted with open circles and fit with the dashed line is a steeper discounter of time than that individual plotted with the filled circles and solid line of best fit.
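In equation form, hyperbolic discounting is usually written V = A / (1 + kD), where A is the reward amount, D the delay, and k the individual's discount rate. A short Python sketch (with k chosen simply to reproduce the $100 example above):

```python
def present_value(amount: float, delay_months: float, k: float) -> float:
    """Hyperbolic discounting: V = A / (1 + k * D). Value falls steeply
    at short delays and then flattens; a steeper discounter has a larger k."""
    return amount / (1.0 + k * delay_months)

# k = 1.0 per month reproduces the worked example in the text:
print(present_value(100, 1, k=1.0))  # 50.0 -> $100 in 1 month ~ $50 now
print(present_value(100, 3, k=1.0))  # 25.0 -> $100 in 3 months ~ $25 now
```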
Now bias as an intermediate phenotype for alcohol use disorders
As delay discounting (DD) behavior has been shown to be highly heritable (Anokhin et al., 2011; Anokhin et al., 2015; Mitchell, 2011), suggesting a strong genetic component, and is elevated in a variety of addictive behaviors (MacKillop et al., 2011), we focused our exploration of intermediate phenotypes for addiction on this behavior. Prior work has suggested DD displays many of the necessary criteria of an intermediate phenotype for a variety of neurobehavioral disorders including substance use disorders (SUDs) (Becker and Murphy, 1988; Reynolds, 2006; Perry and Carroll, 2008; Rogers et al., 2010), attention deficit hyperactivity disorder (Barkley et al., 2001; Sonuga-Barke et al., 2008; Paloyelis et al., 2010), and pathological gambling (Alessi and Petry, 2003; Leeman and Potenza, 2012). As these behaviors often co-occur, they may share similar biological and genetic components (Wilens, 2007; Leeman and Potenza, 2012).
An overview of various intermediate phenotype criteria for SUDs met by DD (Now bias) has been recently outlined (MacKillop, 2013). Particularly relevant to the current work, individuals with alcohol use disorders (AUDs) consistently display greater Now bias behavior versus those without AUDs (Petry, 2001; Bjork et al., 2004; Mitchell et al., 2005; Boettiger et al., 2007; Mitchell et al., 2007; MacKillop et al., 2011). Thus, Now bias is elevated in those individuals with an AUD (intermediate phenotype criterion 3).

Conceptually, Now bias can be thought to have some relation to AUDs, as every relapse or excess drink represents a decision favoring immediate over delayed benefits. Furthermore, Now bias behavior has been shown to be heritable and associates with substance use, suggesting common genetic influences with SUDs (Anokhin et al., 2011). Importantly, Now bias as assessed through delay discounting (DD) tasks has good psychometric properties (responses are highly reliable (Matusiewicz et al., 2013; Weafer et al., 2013)), suggesting it is a trait that can be measured consistently (intermediate phenotype criterion 2). This is further supported by the fact that DD behavior is stable over time (Kirby, 2009). Thus, prior work has demonstrated Now bias satisfies many of the criteria for an intermediate phenotype for AUDs. However, not all criteria have yet been examined.
Under-investigated criteria for Now bias as an intermediate phenotype for AUDs
As Now bias is elevated in those with AUDs, we might expect to see this behavior heightened in those on a trajectory toward an AUD as well. Demonstrating a link between elevated Now bias and AUD risk would add greatly to the utility of Now bias as an intermediate phenotype. As problematic alcohol use during emerging adulthood (late teens to early twenties) may predict development of an AUD later in life (O'Neill et al., 2001; Merline et al., 2008; Dick et al., 2011), though many individuals mature out of problematic use (Bartholow et al., 2003; Costanzo et al., 2007; Lee et al., 2013), one might expect Now bias to be enriched in problematic-drinking emerging adults. Only one relatively small behavioral study has looked at such a relationship, observing heightened Now bias among heavy versus lighter social-drinking college students (Vuchinich and Simpson, 1998). This finding requires replication in a larger, more diverse sample.

In addition to being elevated in problematic drinking emerging adults, to satisfy another intermediate phenotype criterion for AUDs, Now bias behavior should also be elevated in unaffected first-degree relatives (parents, siblings) of those suffering from AUDs (intermediate phenotype criterion 5). Elevated Now bias in first-degree relatives of those with AUDs has yet to be adequately demonstrated, however.

Most of the intermediate phenotype literature considers the expression of the behavior or trait in first-degree relatives as critical in demonstrating that behavior is an intermediate phenotype. In the field of AUDs, however, positive family history of an AUD is often defined as having at least one parent with an AUD (Acheson et al., 2011) or a father with an AUD (Crean et al., 2002; Petry et al., 2002), or some combination of parental history or sufficient density of AUD history in second-degree relatives (Herting et al., 2010). In these previous studies, the effects of family history on Now bias were either only observed in females (Petry et al., 2002), not found at all (Crean et al., 2002; Herting et al., 2010), or not present when controlling for group differences in IQ and antisocial behavior (Acheson et al., 2011).

Measuring Now bias behavior in individuals with any first-degree relatives with AUDs expands the classic family history positive AUD definition to include siblings, who display greater genetic concordance with a particular individual than their parents. To our knowledge, though, this definition of first-degree family member positive or negative for AUDs has not been applied to the study of Now bias. Thus, while Now bias possesses many properties that suggest it could be a good intermediate phenotype for AUDs, further investigation of this possibility is warranted, particularly work focusing on examining whether Now bias is elevated in unaffected individuals with first degree relatives with AUDs.

Given our review of the literature and past work in this area (Mitchell et al., 2005), we focused on better demonstrating the utility of Now bias as an intermediate phenotype for AUDs in a large group of individuals who ranged in age, level of alcohol use, and family history of AUDs. This work was published in Frontiers in Human Neuroscience in 2015, but I share the key takeaways from the study below.
New evidence supporting Now bias as an intermediate phenotype for AUDs
​As mentioned earlier, adults with addictive disorders, including alcohol use disorders (AUDs), tend to choose smaller, sooner over larger, delayed rewards in the context of delay-discounting (DD) tasks more frequently than do adults with no addiction history (Petry, 2001; Mitchell et al., 2005; MacKillop et al., 2011). This immediate reward selection (or “Now”) bias persists even after years of abstinence and does not correlate with abstinence duration (Mitchell et al., 2005), suggesting irreversible consequences of chronic alcohol abuse and/or a pre-existing risk trait, or intermediate phenotype (Meyer-Lindenberg and Weinberger, 2006; MacKillop, 2013). If Now bias was a pre-existing risk trait for AUDs, we would predict heightened Now bias among young people who engage in at-risk drinking but who do not meet clinical criteria for alcohol dependence, relative to age-matched light or moderate drinkers. In addition, we would also predict heightened Now bias among light or moderate drinkers with problem-drinking first degree relatives if this behavior was an intermediate phenotype for AUDs. 
As the Alcohol Use Disorders Identification Test (AUDIT) is an effective means of measuring problem drinking behavior (Fiellin et al., 2000; Babor and Higgins-Biddle, 2001; Kokotailo et al., 2004), we recruited high and low AUDIT individuals across a group of 18-40 year old social drinkers not reporting any AUD. We hypothesized that Now bias would be elevated in high but not low AUDIT emerging adults (defined as ages 18-21 or 18-24). Furthermore, we sought to test whether Now bias was elevated in otherwise unaffected individuals (light/moderate social drinkers; low AUDIT) with a first-degree relative with an AUD. We used the intermediate phenotype criterion of first-degree biological relative status (father, mother, or sibling with an AUD), excluding those with mothers with an AUD to rule out potential fetal alcohol effects. We hypothesized that Now bias would be elevated in low AUDIT individuals with a first-degree relative with an AUD but not in those with no first-degree AUD relative.
Considering the effect of age on Now bias 
As this study was underway, we began to wonder how age might impact Now bias independent of problematic alcohol use. Our lab had previously found marked Now bias among emerging adults (18-25 yrs), regardless of drinking behavior, suggesting elevated DD generally among individuals transitioning from adolescence to adulthood. The observation that adult controls (average age of 26-28) with no AUD diagnosis display reduced Now bias compared to abstinent alcoholic adults (Mitchell et al., 2005; Boettiger et al., 2007) suggests that this bias should decline between emerging adulthood and adulthood, at least among moderate, non-problem drinkers. While emerging adults are widely regarded as impulsive (Chambers and Potenza, 2003; de Wit, 2009), and DD normally decreases from childhood to the early 30s (Green, 1994; Scheres et al., 2006; Olson et al., 2007; Eppinger et al., 2012), little is known about specific changes in DD from late adolescence to adulthood. Some data show trait impulsivity declining linearly with age from early adolescence to age 30 (Steinberg et al., 2008). Thus, given positive correlations between DD and trait impulsivity (Mitchell et al., 2005; de Wit et al., 2007), we hypothesized DD should decline with age from adolescence into the 30s, but, to our knowledge, no prior studies have explicitly investigated age effects on DD in detail from ages 18 to 40. Moreover, we do not know whether heavy alcohol use moderates any such age-related changes in DD. Thus, a secondary aim of our work was to investigate age-related differences in Now bias in our population as a whole and separately in those reporting heavy, problematic versus light/moderate drinking.
Confirming and extending prior work, we found that emerging adults (defined as either aged 18-21 or aged 18-24), regardless of their drinking status (light/moderate vs heavy drinkers), showed equally high Now bias behavior, which did not support our first hypothesis that this behavior would be elevated in heavy drinkers. Follow-up analyses showed that Now bias generally declined with age in our light/moderate drinker population (r=-0.28, p=0.022) but not in the heavy drinkers (r=-0.03, p=0.39). We measured Now bias in our study as an impulsive choice ratio (ICR), which can range from 0 (no Now bias) to 1 (complete Now bias). The age-related decline in Now bias began to asymptote around age 25. Thus, we organized our data into emerging adult (aged 18-24) and adult (aged 26-40) groups for further analyses.
Our measure of Now bias, ICR, was found to decline with age in light/moderate drinkers but not in heavy drinkers.
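For concreteness, here is how an impulsive choice ratio could be computed from a session of Now-vs-Later choices (a hypothetical session, not our actual task code):

```python
def impulsive_choice_ratio(chose_now: list) -> float:
    """ICR: the fraction of trials on which the immediate (smaller-sooner)
    option was chosen over the delayed (larger-later) one.
    0 = never chose the immediate option; 1 = always chose it."""
    return sum(chose_now) / len(chose_now)

# Hypothetical session of 10 trials (True = chose the immediate reward):
session = [True, False, True, True, False, False, True, False, False, True]
print(impulsive_choice_ratio(session))  # 0.5
```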
Now bias as intermediate phenotype for AUDs in adults
We did confirm our second hypothesis that Now bias (measured via ICR) was elevated in light/moderate drinking adults (aged 26-40) with first-degree relatives with an alcohol use disorder (AUD). Plotting our adult population by first-degree relative status and comparing it to heavy drinking adults and abstinent alcoholic adults studied previously (Mitchell et al., 2005), we found strong evidence supporting Now bias as an intermediate phenotype for AUDs.
  1. Now bias (ICR) is high in individuals who drink heavily and problematically but without an AUD (orange bar in graph, below)
  2. Heavy drinker ICR is nearly equivalent to that seen in abstinent alcoholics (red bar in graph, below) despite these individuals not meeting the criteria for an AUD
  3. Now bias is elevated in light/moderate drinking adults with first-degree relatives with an AUD (FH+) relative to those without a first-degree relative with an AUD (FH-; blue bars in graph, below)
Among adults, Now bias as measured by ICR is elevated in individuals at risk for AUDs. The dark blue line and red bar represent prior data measuring ICR in abstinent alcoholics and healthy controls. New data from heavy or light/moderate drinking adults with (FH+) and without (FH-) a family history of AUDs are plotted in the orange and blue bars, respectively.
Implications of our findings - Could reducing Now bias lower one's risk of an AUD?
Our work has added additional support to Now bias being an intermediate phenotype for alcohol use disorders (AUDs). The fact that Now bias was elevated in heavy-drinking adults without an AUD suggests that this behavior may precede an AUD diagnosis. More work is needed to follow up on this finding, however. Specifically, longitudinal studies need to be conducted to measure Now bias in individuals in their early teens (prior to exposure to drinking) and continue to measure this behavior over the lifespan, especially as these individuals enter their late teens and early twenties when problematic drinking behavior often emerges. Only through careful study of the trajectory of Now bias during adult development in both non-problematic and problematic drinkers can we begin to truly determine the utility of this measure as an intermediate phenotype for alcohol use disorders or substance use disorders in general.

Ongoing work taking place as part of the Adolescent Brain Cognitive Development (ABCD) Study seeks to understand adolescent brain and cognitive development generally and the various behavioral (including Now bias) and neural risk factors emerging in adolescence that can lead to mental or psychiatric disorders in adulthood.

Learn more about this ambitious study here and view current publications emerging from the dataset here. 
Since our study on Now bias as a potential intermediate phenotype for AUDs was published in Frontiers in Human Neuroscience, other work has shown:
  • Large individual differences in intertemporal choice behavior (Review; Keidel et al., 2021)
  • Genomic basis of delayed reward discounting (Gray et al., 2019)
  • Steep Discounting of Future Rewards as an Impulsivity Phenotype: A Concise Review (Levitt et al., 2020) 
  • ​Individuals with two parents with addiction have significantly higher rates of discounting compared to those with no or only one parent with addiction (Athamneh et al., 2017)
  • The density of familial alcoholism interacted with binge-drinking status to predict impulsive choice (Jones et al., 2017)
  • A review of age & impulsive behavior in drug addiction (Argyriou et al., 2017)
With increased confidence in Now bias as an intermediate phenotype for alcohol use disorders, our next step is better understanding the neural and biological bases of this behavior. This information may then offer a means to potentially reduce Now bias in individuals at risk for alcohol use disorders. Making at-risk individuals more future-focused could assist them in considering the long-term consequences of problematic alcohol use and reduce the temptation to drink heavily in the moment. Targeting the dopaminergic system is one potential approach to modulating Now bias as some of my and others' work has shown. Delving into that topic will have to wait for a future post, though. Stay tuned. 

Explore more of my work on Now bias:
  • Age modulates the effect of COMT genotype on delay discounting behavior
  • Ovarian cycle effects on immediate reward selection bias in humans: a role for estradiol 
  • Modulation of impulsivity and reward sensitivity in intertemporal choice by striatal and midbrain dopamine synthesis in healthy adults
  • Neural Systems Underlying Individual Differences in Intertemporal Decision-making  

And additional neuroscience topics on the blog:
  • Declining Dopamine: How aging affects a key modulator of reward processing and decision making
  • Stress and the Brain: How genetics affects whether you are more likely to wilt under pressure
  • Wanting, Liking, & Dopamine's Role in Addiction 

Wanting, Liking, & Dopamine’s Role in Addiction

11/17/2020

Neuroscience

This piece originally appeared on my Life Apps Brain & Behavior Blog on October 5, 2020. It has been expanded upon here.
Drug addiction is a huge problem across the world, leading to large societal costs in terms of lost productivity and healthcare expenses. In the United States (US), specifically, the National Institute on Drug Abuse (NIDA) has compiled a variety of statistics illustrating the scope of the addiction problem. Economically, the estimated annual cost of all drugs of abuse (including alcohol and tobacco) is $740 billion. Drug use itself is widespread, with over 11% of those over age 12 in the US reporting use of illicit drugs in the past month. Alcohol and tobacco abuse are also prevalent: nearly 6% of US adults are estimated to have an alcohol use disorder (where alcohol negatively interferes with their life), and 14% of US adults report currently smoking cigarettes.
Drugs of abuse are powerfully addictive because they “hijack” biological processes put in place to ensure we continue to pursue behaviors that promote our survival. The primary biological process all drugs of abuse have in common is that their initial use results in an increase in the signaling chemical dopamine in the brain (albeit via different mechanisms based on the drug used). 
Given these data, it is not surprising that in popular culture dopamine is thought of as the “reward” signal in the brain. But what exactly does that mean?
Does dopamine=reward? 
What is a reward signal in the context of biology and the brain anyway?

Reward is a complex construct (for more see this excellent overview) but one reasonable definition is that reward refers to the fact that certain environmental stimuli have the ability to elicit approach responses, especially under biologically-based “need” conditions. Put another way: Stimuli that are desirable as a result of a biological need are “rewarding”.
​Food is rewarding when we are hungry, water when we are thirsty. 
​

Our brains are primed to learn about reward, specifically to learn what stimuli and actions in our environment will lead to obtaining outcomes that are necessary for survival - a process known as reinforcement learning. We learn certain behaviors are rewarding as they lead to us obtaining things we need to continue living.
Reinforcement learning in its most basic form involves associating a stimulus with a response that then leads to a reward. This type of learning is done by virtually all animals. For example, a lab rat can learn that when a light comes on and it presses a particular lever in a specific environment it receives a food pellet. 

It learns light + press = reward via the following constituent parts: 

Light = stimulus
Press specific lever after light = response
Food pellet = reward
And once this learned stimulus-response association is made, the stimulus itself can be perceived as “rewarding” in a process known as incentive salience (more on this later). 
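To see this kind of learning in miniature, here is a toy Rescorla-Wagner-style simulation of the rat example (my own simplification; the learning rate is an assumed value):

```python
# Associative strength V of the light -> lever-press -> pellet link
# grows toward the full reward value with each reinforced trial.
alpha = 0.2  # learning rate / salience of the light cue (assumed value)
lam = 1.0    # maximum associative strength the food pellet supports
V = 0.0      # associative strength before any training

for trial in range(1, 21):
    V += alpha * (lam - V)  # surprise-driven update: big early, small late
    if trial in (1, 5, 20):
        print(f"trial {trial:2d}: V = {V:.2f}")
# trial  1: V = 0.20
# trial  5: V = 0.67
# trial 20: V = 0.99  -> the light now reliably triggers the press
```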
Reinforcement Learning, Reward, & Addiction
Stimulus-response learning drives most of the behaviors we think of as “drug addiction” in people. When one is addicted to a drug of abuse, stimuli associated with use of the drug (think of one’s neighborhood bar, or a friend you routinely smoke with) can themselves drive drug-use behavior. This is true even when actual use of the drug is no longer “pleasurable”; in fact, most drug-addicted individuals no longer find the use of the drugs they are addicted to “pleasurable” at all.

This is because addiction is known to progress from a binge/intoxication stage of use to a withdrawal/negative affect stage and finally to a preoccupation/anticipation stage, which can then reactivate drug use. Thus, drugs are initially used because they are pleasurable, but over time this shifts and individuals use drugs of abuse to relieve negative withdrawal effects rather than for pleasure. And, as mentioned above, stimuli that were associated with drug use can trigger preoccupation with using the drug even in individuals trying to stop or limit their use.
Why does this happen? How can a drug that starts out as pleasurable lead to negative feelings of withdrawal when not used? Well, the brain is very adaptive and quickly modifies the biological environment such that there is less disturbance in dopamine (and other chemical) signaling after drug use. So, while drugs of abuse initially result in a large release of dopamine, this effect moderates with continued use. This change in responsivity to drugs of abuse is known as tolerance and explains why those addicted to drugs of abuse need to take larger and larger quantities to achieve the same effect. In fact, the continued use of addictive drugs results in notable changes in the brain's dopamine system (see figure below), which promotes a strong biological dependence on them.
Continued drug use physically changes the brain's dopamine system, which can affect drug abusers' mood & behavior.
While these findings help us understand how the dopamine system acts in response to addictive drugs, we have yet to examine how the initial dopamine release to drugs of abuse maps onto “reward” and may promote continued early use that ultimately leads to addiction. ​
Does dopamine signal “reward”?
Much research has shown that dopamine does not signal “reward” (or to be more technical, pleasure) per se but rather is used in learning the various predictors of reinforcement in the environment - reinforcement learning. 

This role of dopamine in reinforcement learning was most famously demonstrated by the work of Wolfram Schultz, a professor at the University of Cambridge in the UK, who recorded the firing of dopamine-producing neurons (cells) in the brains of primates receiving reinforcing juice rewards.
​

Initially, these neurons fire to unexpected reward (juice) delivery. If a cue (tone or light) perfectly predicts the juice delivery (say, 5 seconds before delivery), Schultz found that over repeated trials the dopamine neurons came to fire at the cue (or, in psychological terms, the conditioned stimulus) and not at the reward. And when a stimulus previously paired with a reward is not followed by that reward, there is an observable dopamine “dip” locked to the time when the reward was expected to occur. See the figure below, from Schultz et al., 1997, illustrating reward prediction signaling in dopamine neurons.
Reward prediction error responses at the time of reward (right) and reward-predicting visual stimuli (left in bottom two graphs). The dopamine neuron is activated by the unpredicted reward eliciting a positive reward prediction error (blue, + error, top), shows no response to the fully predicted reward eliciting no prediction error (0 error, middle), and is depressed by the omission of predicted reward eliciting a negative prediction error (- error, bottom). From Figure 2 in Schultz, 2016 and reproduced from Schultz et al., 1997. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4826767/
This work can be distilled to the following conclusion: 
Dopamine signals predictors of rewards and not rewards themselves. 
Follow-up studies have also shown the amazing ability of dopamine-producing neurons to encode reward prediction in a scaled manner (stronger dopamine responses to higher-probability predictors of reward) and have resulted in perhaps the most well-accepted computational model of a biological process: the temporal difference model of reinforcement learning.
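To make the temporal difference (TD) idea concrete, here is a minimal sketch in Python. Every value in it (time steps, learning rate, trial count) is an arbitrary illustrative choice, not a number from Schultz's experiments; the point is only that the TD prediction error migrates from the reward to the cue as learning proceeds, and dips when an expected reward is omitted:

```python
import numpy as np

# A minimal TD(0) sketch of a Schultz-style trial (illustrative values only).
# States are time steps within a trial: a cue appears at t = 2 and "juice"
# arrives at t = 7. Pre-cue values are never updated, on the standard
# simplifying assumption that the cue itself arrives unpredictably.
n_steps, cue_t, reward_t = 10, 2, 7
alpha, gamma = 0.1, 1.0           # learning rate, discount factor
V = np.zeros(n_steps + 1)         # learned value of each time step

def run_trial(reward_delivered=True):
    """Run one trial; return the prediction error (delta) at each step."""
    deltas = np.zeros(n_steps)
    for t in range(cue_t - 1, n_steps):
        r = 1.0 if (t == reward_t and reward_delivered) else 0.0
        delta = r + gamma * V[t + 1] - V[t]   # dopamine-like TD error
        if t >= cue_t:                        # cue onset is unpredictable,
            V[t] += alpha * delta             # so pre-cue values stay at 0
        deltas[t] = delta
    return deltas

early = run_trial()               # naive: delta peaks at the reward (t = 7)
for _ in range(500):              # repeated cue-reward pairings
    run_trial()
late = run_trial()                # trained: delta peaks at cue onset
omitted = run_trial(reward_delivered=False)  # trained + omission: dip at t = 7
print(np.round(early, 2), np.round(late, 2), np.round(omitted, 2), sep="\n")
```

Schultz's actual recordings come from neurons, of course, not code; the sketch simply shows why one and the same teaching signal appears first at the reward and, after learning, at the cue.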
Wanting vs Liking and Dopamine’s Role in Perpetuating Addiction
This concept of dopamine as a reward predictor has been extended into a hypothesis about the role of incentive salience in dopamine release, and how this process can lead to craving or wanting behaviors in those addicted to drugs of abuse.

An amazing amount of work on this topic from Kent Berridge at the University of Michigan has demonstrated that dopamine release is not associated with "liking" in the sense of a hedonic, pleasurable response; rather, dopamine motivates behavior and affects how hard animals are willing to work for rewards ("wanting").
The processes of reinforcement learning described above occur naturally, but the extra boost of dopamine release associated with taking an addictive drug further strengthens stimulus-response associations. Cues or stimuli that predict drug use can then themselves become "rewarding" and trigger wanting/craving responses in the brain as it anticipates drug use. And via other dopamine-related processes, drug use behaviors can become habitual, guided by stimuli and the environment more than by one's active choice to use drugs.
While many may conflate liking with wanting and the role of "reward" in all of this, the distinction matters greatly for understanding the role dopamine plays in these processes, especially if one is working to develop treatments to combat drug addiction. Compulsive drug use despite negative consequences, NOT the pleasure drug use provides, is what results in addictive drugs interfering destructively in someone's life. So, a better understanding of the processes that mediate wanting and craving for drugs of abuse is essential as we seek to combat drug addiction.
Dopamine release in three regions of the brain - the vmPFC, VS, and insula - was found to correlate with wanting more d-amphetamine. From Smith et al., 2016 https://pubmed.ncbi.nlm.nih.gov/27174408/
My own work has sought to understand how the release of dopamine after oral d-amphetamine administration in healthy human subjects affects the brain. We found that dopamine release correlates with participants "wanting more" (NOT "liking") d-amphetamine in three core brain regions often associated with reward and drug-related effects: the ventral striatum (VS), ventromedial prefrontal cortex (vmPFC), and insula (see figure above).
The VS is a core brain hub of reward valuation, along with the vmPFC. Others have also found a relationship between VS dopamine release and "wanting". The insula is a region of the brain often associated with drug craving/wanting and, in fact, damage to this part of the brain results in a loss of craving for cigarettes in smokers. Future efforts to modulate these craving-related systems and their associated dopamine signals through interventions such as transcranial magnetic stimulation may ultimately help drug-addicted individuals effectively stop their problematic drug use.
We are just beginning to understand the neurobiological bases of addictive processes, but continued research into them promises the development of better treatments in the future.
Concluding Thoughts
Hopefully this piece has illustrated the complex role dopamine plays in signaling reward. Research emerging over the last few decades, using sophisticated techniques to measure brain signaling in animals and humans, has implicated dopamine in reinforcement learning processes and, by extension, in the incentive salience of cues associated with rewards. The role of dopamine in signaling which stimuli predict reward is hijacked and pushed into overdrive by drugs of abuse, which themselves release dopamine. Thus, after repeated pairings of stimuli and drug rewards, the brain adapts to respond powerfully to drug-related stimuli and cues, prompting craving in addicted individuals.

Not everyone is equally susceptible to these dopamine-mediated learning processes, though. How individual differences in biology ultimately map onto risk for drug addiction is a matter of intense interest in the field of neuroscience but is beyond the scope of this post. For the time being, I encourage you to explore the references below for more on the complex and nuanced role dopamine plays in reward and learning processes.
References:
  • A Neural Substrate of Prediction and Reward
  • Neurobiology of addiction: A neurocircuitry analysis
  • Liking, Wanting and the Incentive-Sensitization Theory of Addiction
  • Pleasure Systems in the Brain
  • Learning, Reward, and Decision Making
  • Neural mechanisms underlying the vulnerability to develop compulsive drug-seeking habits and addiction
  • Dopaminergic Mechanisms in Actions and Habits
  • Imaging genetics and the neurobiological basis of individual differences in vulnerability to addiction
0 Comments

Stress & the Brain: How genetics affects whether you are more likely to wilt under pressure

6/18/2020

1 Comment

 
Neuroscience
Adapted from a post appearing as part of my new Brain & Behavior blog on LifeApps.
Don’t stress. Stop stressing out. Relieve your stress.
Is stress always a bad thing?
Is there any truth to the idea that some people thrive under stress? Actually, yes.
Stress & the Brain
First, when we are under stress, a variety of biological processes take place. Here, I am going to focus on a particular role stress plays in the brain. Under psychological stress, the amygdala activates stress pathways in the brain, and the hypothalamus and brainstem release large amounts of norepinephrine and dopamine. While these chemicals have a range of effects on the brain, I want to focus this discussion on how they impact the prefrontal cortex (PFC).

The PFC is the most frontal part of the brain, sitting above our eyes. This part of the brain has seen rapid evolutionary development and is much larger in primates, including humans, than in other mammals. The PFC plays a major role in planning, higher-order thinking, problem solving, and simulating/anticipating the results of actions we may or may not undertake.
The prefrontal cortex (PFC) is the most recently evolved portion of the brain, sitting above the eyes.
The PFC is also important in cognitive control, which can be thought of as "staying on task" despite the presence of potential distractors. When you need to concentrate really hard on a complex math problem on a test, your PFC is highly engaged. A key function this brain region must perform, then, is maintaining relevant, important information over a long enough period for you to act on it...think holding on to the intermediate product of a mathematical formula while the next piece of information is added (12 times 2 is 24, plus 6 is 30, and divided by 3 equals 10).

The neurotransmitters dopamine and norepinephrine play key roles in maintaining this critical information (a concept known as working memory) in the PFC. 
Molecular structure of the neurotransmitter dopamine.
Molecular structure of the neurotransmitter norepinephrine.
Dopamine, Norepinephrine, & the Prefrontal Cortex
To summarize an incredible amount of work by the likes of Amy Arnsten at Yale University and others (see also), dopamine is thought to be important in repressing the “noise” (distractions) in PFC brain circuits while norepinephrine is believed to enhance the “signal”. 

The effects of these two neurotransmitters on the PFC, and ultimately on behavior, are nonlinear. In fact, there is much evidence that the effects of these chemicals on the brain follow an inverted-U relationship, where too little or too much dopamine or norepinephrine in the PFC leads to sub- or supra-optimal functioning of this area of the brain. Thus, an intermediate level of these chemicals is best for optimal PFC function.
Stress increases the levels of dopamine and norepinephrine in the brain.
The inverted-U model for dopamine and norepinephrine effects on the PFC is important given that stress increases the levels of both of these chemicals in this area of the brain.

So, the optimal balance of dopamine and norepinephrine may be thrown off under stress.

But this isn't the complete story.
Biological Differences in PFC Dopamine Signaling
What we also know from a great deal of research is that there are biological differences in neurotransmitter signaling in the brain. More is known about how dopamine signaling varies across individuals based on their genetic variation (and see; and also), which I will focus on for the remainder of this piece.
A single nucleotide polymorphism (SNP) in the COMT gene results in a change in the amino acid structure (valine to methionine, Val to Met) of the COMT enzyme. This change results in a COMT enzyme with lower activity (act) that then breaks down dopamine (DA) and other catecholamines less efficiently. The result is higher levels of DA in the PFC.
Genetic Variation in COMT Affects PFC Dopamine Levels 
I will focus specifically on an interesting genetic variant in the COMT gene. What is COMT? It is an enzyme that helps break down dopamine, as well as other chemicals with similar molecular structures, and regulates its levels in the body and brain. A single nucleotide polymorphism (SNP) in the COMT gene (a G->A DNA substitution resulting in a Val->Met amino acid variation in the protein, denoted the Val158Met COMT variant) alters the stability of the COMT enzyme, allowing it to break down dopamine more or less efficiently. And while COMT breaks down a variety of catecholamines, including norepinephrine, epinephrine, and dopamine, it plays the dominant role in regulating dopamine levels in the PFC. Those with the COMT Val polymorphism have higher enzymatic activity and therefore lower levels of tonic PFC dopamine. In contrast, the COMT Met polymorphism results in lower COMT activity and therefore higher levels of tonic PFC dopamine.
Schematic illustrating how variation in COMT ultimately affects dopamine (DA) signaling in the PFC.
Interestingly, this particular COMT SNP evolved recently in humans, and both alleles (Val & Met) are quite common in the human population. This suggests both versions of the COMT SNP served a useful purpose, evolutionarily. One popular hypothesis for the common occurrence of these COMT variants is the Worrier vs Warrior explanation.

Several studies (see comprehensive review here) have demonstrated that individuals with the Met version of the COMT enzyme perform well in cognitively demanding tasks but have an enhanced vulnerability to stress (worrier); they wilt under pressure. 
In contrast, individuals with the Val version of COMT have better stress resiliency (warrior). 
Stress elevates the level of catecholamines like dopamine (DA) in the PFC, resulting in a shift in where individuals with the ValVal or MetMet COMT polymorphism find themselves on the inverted-U function relating PFC DA to task performance, where intermediate DA associates with optimal performance. While an individual with the MetMet polymorphism may be in the optimal range of PFC DA and task performance under normal conditions, stress pushes their DA levels into a supra-optimal range, degrading task performance. In contrast, stress-induced increases in DA may help shift those with the ValVal polymorphism closer to optimal DA levels and thus improve task performance.
This is thought to be because Val individuals have lower tonic dopamine, so the dopamine boost that occurs under stress moves them toward a more optimal level of dopamine for performing cognitively demanding tasks. To further support these points, a large study in Taiwan has shown that ValVal individuals (remember, humans have two copies of each gene, one from Mom & one from Dad) perform better on a stressful standardized test administered to 10th graders across the country each year.
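To make the inverted-U logic concrete, here is a toy sketch in Python. The Gaussian performance curve and every dopamine value in it are invented for illustration (nothing here is a measured quantity); the point is simply that one and the same stress-induced dopamine boost can help a low-dopamine genotype and hurt a high-dopamine one:

```python
import math

# Toy inverted-U model: performance as a Gaussian function of PFC dopamine,
# peaking at an arbitrary "optimal" level of 1.0. Purely illustrative.
def performance(dopamine, optimum=1.0, width=0.4):
    return math.exp(-((dopamine - optimum) ** 2) / (2 * width ** 2))

# Hypothetical tonic dopamine levels: Met carriers sit higher than Val
# carriers because the Met version of COMT clears dopamine less efficiently.
baseline = {"ValVal": 0.6, "MetMet": 1.0}
stress_boost = 0.5   # stress raises PFC dopamine in everyone (toy value)

for genotype, da in baseline.items():
    print(genotype,
          f"baseline={performance(da):.2f}",
          f"stressed={performance(da + stress_boost):.2f}")
# ValVal: 0.61 baseline -> 0.97 stressed (pushed toward the optimum)
# MetMet: 1.00 baseline -> 0.46 stressed (pushed past the optimum)
```

Nothing about the Gaussian shape is special; any single-peaked curve makes the same qualitative point.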

And the effect of the Val158Met COMT polymorphism is not limited to dopamine. Research has shown that ValVal individuals show lower physiological stress reactivity than MetMet individuals.

Taken together, these data suggest that individuals with two copies of the Met allele will generally perform more poorly under stressful conditions than those with two copies of the Val allele, while those with one copy of each allele will fall somewhere in between.
Is your genotype destiny? 
Should those individuals with the Met polymorphism in their COMT gene resign themselves to doing poorly on big standardized tests, to wilting under pressure?

The short answer? No. The long answer? We are more than our genetics.

First, our genetics interact with other aspects of our biology to ultimately produce behavior. My own research has shown that COMT genotype interacts with age to affect Now vs Later decision making. We interpreted this in the context of the inverted-U relationship between dopamine and PFC function, as dopamine levels are known to decline with age. So, a particular genetic setup that leads to supra-optimal dopamine levels when one is young may result in more nearly optimal levels as one ages and dopamine "falls" down the curve.
Dopamine levels naturally decline with age. Thus, where one's COMT genetics positions them in terms of optimal PFC function will shift over the lifespan with ValVal individuals in the above example falling out of the optimal dopamine range while those with the MetMet polymorphism may fall down into a more optimal intermediate level of PFC dopamine.
Note that what level of PFC dopamine is "optimal" for various cognitive tasks will differ based on a variety of environmental factors. Thus, the COMT ValVal polymorphism may be more optimal in some situations and MetMet more optimal in others.
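The same toy model from above (with freshly invented numbers matching the aging scenario in the figure) captures the age effect as a subtraction from tonic dopamine rather than an addition:

```python
import math

# Continuing the toy inverted-U sketch from above (same assumptions).
def performance(dopamine, optimum=1.0, width=0.4):
    return math.exp(-((dopamine - optimum) ** 2) / (2 * width ** 2))

# Hypothetical tonic dopamine when young: here MetMet starts supra-optimal,
# matching the scenario in the figure above. All values are arbitrary.
young = {"ValVal": 0.9, "MetMet": 1.3}
age_decline = 0.4    # aging lowers tonic PFC dopamine (illustrative size)

for genotype, da in young.items():
    print(genotype,
          f"young={performance(da):.2f}",
          f"older={performance(da - age_decline):.2f}")
# ValVal: 0.97 young -> 0.46 older (falls below the optimal range)
# MetMet: 0.75 young -> 0.97 older (falls into the optimal range)
```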

Biology is just one part of the equation. 
The Importance of Mindset
Mindset, or how an individual reacts to the biological changes that accompany stress, is also critical.    

It has been shown that taking a stress-is-enhancing mindset leads to better affective and cognitive outcomes than a stress-is-debilitating mindset.  
COMT genetic variation has been shown to moderate the effect of a stress-is-enhancing mindset manipulation on affect and cognition, such that those with two copies of the Met allele were more responsive to the manipulation than those with two copies of the Val allele. In other words, MetMet individuals can more easily develop a stress-is-enhancing mindset.

And while your COMT genetics may affect how well mindset manipulations work, anyone can take steps to reframe their stressful experiences so as to see stress as more of a benefit than a detriment. For example, treat your stress as something you can learn from rather than dwelling on its negative aspects.
Final Thoughts
In closing, genetic variation in dopamine signaling plays a role in how we perform on cognitively demanding tasks under pressure. Evolutionarily speaking, it made sense for some people to perform well under pressure (Val warriors) while others performed better under baseline, unstressed conditions (Met worriers). We should embrace the genetic diversity inherent in this and other behaviors but also realize that biology is only one determinant of behavior. Our mindset and how we frame the effects of stress on us are also critical and, in fact, have biological effects on our stress response.

The data presented here reflect a theme common to the brain and human behavior: behavior is modulated by both our biology and our environment. Behavior is complicated, and to understand it we need to look beyond merely our genes, proteins, and cells. Especially when it comes to human behavior, our environment and experiences shape our biology and, in turn, our behavior.

None of these relationships are simple, which is what makes studying them so interesting.
Further Reading:
  • Catechol-O-Methyltransferase moderates effect of stress mindset on affect and cognition
  • Changing Stress Mindset Through Stressjam: A Virtual Reality Game Using Biofeedback
  • Quantitative role of COMT in dopamine clearance in the prefrontal cortex of freely moving mice
  • The influence of Val158Met COMT on physiological stress responsivity
  • COMT genetic variation affects fear processing: psychophysiological evidence
  • The efficacy of stress reappraisal interventions on stress responsivity: A meta-analysis and systematic review of existing evidence
  • Rethinking stress: the role of mindsets in determining the stress response
  • The catechol-O-methyltransferase Val(158)Met polymorphism modulates fronto-cortical dopamine turnover in early Parkinson's disease: a PET study
  • Site-Specific Role of Catechol-O-Methyltransferase in Dopamine Overflow within Prefrontal Cortex and Dorsal Striatum
  • Older age may offset genetic influence on affect: The COMT polymorphism and affective well-being across the life span
1 Comment