Christopher T Smith.com

Reflections Blog

Past, Present, Future: Reflections On Time

12/21/2023

Life Advice, Neuroscience, Personal Perspective
As we approach the end of the calendar year and find ourselves in the midst of the holiday season, time is often top of mind. Reminiscing on the year that was, or on holidays past, is common in December. Memories made with those we love can bring comfort if we are apart or if those individuals are no longer with us. We can mentally time travel back to the past and, in the process, pull the positive feelings from that time into the present, if only for a bit. 
In addition, as the calendar transitions to January, many of us begin to think about the new year and our goals for the future. We project ourselves into the future and ask: what can we do now, and in the months immediately ahead, to get us where we aspire to be years from now? Getting more "in shape" today will make us healthier long-term, saving a bit more for retirement now will pay off when we are older, and learning a new skill over the next few months can make us more marketable for a future career. These are all worthy goals that involve simulating how our actions in the present will potentially impact our future.
In both instances mentioned above, we are constructing other moments in time in our heads and using them to affect our current state or guide near-term actions. Thinking about both the past and future also reminds us that the passage of time is constant and persistent and that the present is quite fleeting. The clock ticks on around us, pushing the present into "the past" and pulling "the future" into the present. And while we are certainly aware that time is a real, objective thing that can be measured in seconds, hours, days, and years, it is also, in a way, a subjective experience that can be hard to fully describe.
Measuring Time as a Mortal
Time is an intriguing concept, especially to human beings. We have all experienced time seeming to slow down or speed up in our daily lives. In some ways it can seem arbitrary (who decided a minute is 60 seconds?), but in other ways it is very tangible and real, relentlessly moving forward (regardless of how you measure it) and in the process reminding us that it is personally limited. Time is our most precious resource because it is finite: you cannot create more of it. Objectively, time is constant and immutable in the sense that one cannot escape it. 
"To be human is to be aware of the passage of time; no concept lies closer to the core of our consciousness."
                    - Dan Falk from In Search of Time: The History, Physics, and Philosophy of Time

Our fascination with and fixation on time is most certainly related to our mortality in that we all will die in time. All living things have an expiration date and one could reasonably argue that the act of living is the delaying of death through information and action. 
Time as our most precious commodity can perhaps best be illustrated by the fact that some of the world's most wealthy and powerful people have been working hard to extend the time they have on Earth, with investments in Altos Labs, Unity Biotechnology, Juvenescence, and Calico Labs and organizations like the Methuselah Foundation. It has become popular these days for many of us to try to hack our longevity (see the work of Peter Attia, including his podcast and new book Outlive: The Science & Art of Longevity). And while having information and resources could increase one's longevity and potentially extend one's time on Earth, time as a daily commodity is the same for everyone.
We all have 24 hours in our days, which we fill with commitments (work, raising a family), biological functions (eating, sleeping), and exercising our own preferences and desires. And while those with means and privilege may be able to spend fewer hours on "basic needs", they can't extract more than 24 hours out of a day. Those 24 hours are limited and precious for everyone. So, what will you do with the 24 hours you have each day? Should you seek to optimize them to the hilt (to squeeze all utility out of them), or is there value in "wasting time" or simply enjoying your time? And how should you balance the allocation of these activities?
Structuring Time
The quote "How we spend our days is how we spend our lives" is attributed to the author Annie Dillard in her book ​The Writing Life. She goes on to speak to the value of daily schedules defending us from "chaos and whim" and that "a schedule is a mock-up of reason and order—willed, faked, and so brought into being; it is a peace and a haven set into the wreck of time; it is a lifeboat on which you find yourself, decades later, still living." 
This quote speaks to both the unpredictability of the future and our inability to completely "master time" (chaos and whim; the wreck of time) while also conveying the value of having structure in our lives to ensure we make time for what is important to us. This could include key work priorities or making time for companionship and fun, all of which are important to human flourishing. But in our limited time, how do we prioritize among the many things we could be doing while also ensuring we make progress on what we have to do to survive? There are no easy answers. Rather, we must try, to the best of our human ability, to plan for the future while not letting the present pass us by.
The Burden of the Present: A never-ending to-do list 
Given that most of us perceive we lack sufficient time in our daily lives, many search (often unsuccessfully) for approaches to boost their productivity and "maximize their time." The rise of smartphones and communication technology like Slack and Zoom makes it possible for us to work from virtually anywhere at any time. It is exceedingly difficult to disconnect in such an environment. I think many of us believed, especially with the rise of "remote work" in response to the global COVID-19 pandemic in Spring 2020, that we would be liberated from the bounds of an office, strict working schedules, and the dreadful daily commute. And in a way we were, but when the lines between work and home blur, separating one's "work time" from "personal time" becomes a challenge. Even the most exciting-sounding possibility of remote work technology - work from anywhere, including while on vacation - seems to miss the point that one takes a vacation to, at least conceptually, escape work. 
There are plenty of resources and writings that try to address time management in the digital age or how to successfully set and maintain work/life boundaries. Various tools and apps promise to help. And while there may be some value in these approaches, I think we have all found a "solution" to our time management problems elusive. So did author Oliver Burkeman who, in his book Four Thousand Weeks: Time Management for Mortals, comes to a shocking but perhaps accurate conclusion: we have more responsibilities and work to be done than can ever be accomplished in our lifetime (the average length of which is about 4,000 weeks). He says we should accept this reality and, in the process, liberate ourselves from the notion that there is some hack or approach that will help us accomplish our many tasks. When I read this book, its central thesis shocked me at first. I expected to get practical tips or insights on time management and was left with the hard truth that I will never be able to "accomplish my work". This is something I am still coming to terms with, even though at my core I realize it is, in fact, true. 
In the book, Burkeman makes the point that we often want to rationalize that the hard work we do today will pay off in the future. And there is some truth to this. But, he also mentions we wrongly believe that through our present efforts and sacrifices - the checking off of our "to-do list" - we will ultimately reach some idealized future where we will magically "have more time" and then be able to focus on what really matters.  
However, there are always new things we will (or feel we will) need to do...the to-do list can essentially never be fully checked off. The challenge we all encounter is that the idealized future we are "working toward" is often a future we never pull into the present...it never reaches us. 
So, why be so worried about all these responsibilities and to-dos if they prevent us from living in the now? Why not just live fully in the present - YOLO and all of that? Well, living too much in the present, the NOW, also has its issues. 
The Neuroscience of Time: NOW vs LATER Rewards, Depression, & Anxiety
A quick detour into how our brains perceive time. First, there is very interesting work from Thomas Suddendorf and Michael Corballis suggesting that "mental time travel" (ie, the ability to transport our consciousness to the past - episodic memory - or future - prospection) may make humans unique or at least is a process more developed in humans. These researchers also speculate that human language may have evolved to facilitate group mental time travel through stories from the past and/or coordinating/planning future, collective goals. 
So, our brains are built for time traveling. They can also get fixated on different points in time, to our peril - ruminating too much on the past and/or becoming too anxious about the "unpredictable" future. We also struggle to calculate the time value of money and are rightly biased to be more concerned about having a resource like money to spend today versus investing it for a future purpose.
Living wholly in the present has its appeal and deep biological origins in our animalistic need for survival. Humans and other animals indeed will "discount the future," favoring rewards available NOW (or soon) over those delivered in the future, even if the future rewards are large. Academics call this tendency to favor NOW over LATER delay discounting behavior. 
[Figures: left, an illustration of choosing between a NOW and a LATER reward; right, two delay discounting curves.]
The present value of a reward is discounted in proportion to the time you must wait to claim it. The figure on the left shows how impulsive choice is a preference for the NOW option over the LATER option at higher rates than average or, to put it another way, a tendency to discount the future more steeply. In the figure on the right we see two "discounting curves": the white circles reflect an individual with a steeper discounting rate (ie, they value the NOW over the LATER more, discounting the future more steeply) than the individual plotted with black circles. 
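The behavior in these curves is typically modeled with a hyperbolic function, V = A / (1 + kD): the subjective value V of a reward of amount A falls with the delay D to receiving it, at a rate set by the individual's discounting parameter k. Here is a minimal sketch of that idea (the hyperbolic form is standard in the delay discounting literature, but the function name, k values, and dollar amounts below are my own illustrative assumptions, not figures from this post):

```python
# Hyperbolic delay discounting: V = A / (1 + k * D).
# Higher k = steeper discounting = stronger preference for NOW.
# The k values and dollar amounts here are illustrative only.

def discounted_value(amount: float, delay_days: float, k: float) -> float:
    """Subjective present value of `amount` delivered after `delay_days`."""
    return amount / (1 + k * delay_days)

# Two hypothetical individuals choose between $50 now and $100 in 30 days.
for label, k in [("steep discounter", 0.10), ("shallow discounter", 0.01)]:
    later_value = discounted_value(100, 30, k)
    choice = "NOW ($50)" if later_value < 50 else "LATER ($100 in 30 days)"
    print(f"{label}: delayed $100 feels worth ${later_value:.2f} -> {choice}")
```

With the steeper k, the delayed $100 is subjectively worth less than the immediate $50, reproducing the impulsive NOW choice; the shallower discounter is willing to wait.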
In short, we are biologically biased to favor the NOW over the LATER, though some of us more than others. My Ph.D. dissertation work at the University of North Carolina focused on investigating these individual differences in delay discounting behavior in humans. As mentioned, we know from years of research on a variety of species that rewards have less value if they are delayed and that the degree to which individuals discount this delay varies. In humans, these variations in delay discounting are based partially on biology and our environment but can also be susceptible to temporary states such as acute stress. Understanding this phenomenon is important, as delay discounting has been shown to be associated with drug addiction and may indeed be a risk factor in developing addictive behaviors. 
My dissertation work found that individuals at risk for alcohol use disorders (heavy drinkers, those with a family history of alcohol use disorders) show a higher preference for NOW choices than those at low risk (no family history, light drinking behavior).
While we can see how thinking only about the NOW and our impulsive urges may be problematic, so too is an over-emphasis on the past or future. It has been demonstrated that depression is more associated with past events, while anxiety is more often associated with future events. This indicates that an unusual fixation on the past or the future can also be problematic for our mental health and well-being. 
Thus, balancing our focus across past, present, and future is probably the best way to live an effective life. One can see how learning from the past enhances future survival, and similarly how planning for the future (and for the uncertainty of future events) is important. And living in the present is also essential, as it is the only actual living we are doing.
The cliché phrase (attributed to Eleanor Roosevelt, among others):
Yesterday is history. Tomorrow is a mystery. Today is a gift, that is why they call it the Present.   
may be a bit trite, but that doesn't mean there is no wisdom in it. 
It speaks to the idea that the present is within our control as it is happening now. 
Focusing too much on the past or future prevents us from living in the moment and taking actual action in the here and now. 
Living in the Present
In some ways we think living in the present moment (right now) should be easy. However, to be really present in a moment requires you to resist distraction. Sitting with your thoughts or being engaged in the presence of others (ie, having meaningful conversations and connections) can be surprisingly uncomfortable if you haven't done it frequently. 
In an age of unlimited work and entertainment distractions, one of the most difficult things for us is to be truly present in a moment. Part of the challenge is that we know at a deep level that our time is finite and limited, and so we often attempt to master it, to maximize it. Sitting in the moment and enjoying it can seem somehow wasteful and superfluous - what do we get from being "in the moment"? Well, being present in the moment is effectively at the center of the practice of mindfulness, which has been shown to have a range of positive effects on human health and well-being. Despite knowing that being in the moment is valuable, we almost can't keep our thoughts from drifting to our "to-do list" or the need to use our time "better". 
At the center of our focus on optimizing time is the realization that we are mortal and finite...that we will die. Time will run out for all of us. The irony of this fact, though, is that when we accept this, really accept this, we can stop trying to convince ourselves that by maximizing our schedules and time we will somehow create more of it. Rather, realizing our mortality and relishing the present can help us focus on what matters most to us and stop waiting for "one day" to come. 
An excellent example of the "waiting for one day" phenomenon is the fixation many Americans have on earning and saving more money today, often at great personal sacrifice, for that glorious retirement in the future. 
Preparing for the Future
In the personal finance realm, a focus on saving for the future is seen as an unequivocally good thing. Indeed, saving for the future is critical to retiring in a financially stable position (especially if your government does not provide a strong social safety net for retirees). However, it is very easy for many individuals, especially those obsessed with the Financial Independence, Retire Early (FIRE) movement, to focus on future retirement at the expense of present satisfaction. Many have cut so much out of their current lives to expand their savings rate that they live lives of immense deprivation (NOTE: there is a difference between frugality and deprivation). 
But one should reflect during one's working life and ask: What am I saving this money for? What is the purpose of having retirement savings? Upon deeper self-reflection you could come to the conclusion that there is such a thing as delaying one's consumption too far into the future. 
Also, as none of us really knows how long we will live, we could be saving for a future that may never come. The influential book Die With Zero may have an extreme title, but its thesis seems correct to me: one should think critically about when it makes sense to spend money NOW on experiences that should not be delayed for a future retirement. 
In the book, the author speaks to the need for us to consider when we should have certain experiences and questions whether waiting "until retirement" is the right answer. Backpacking through Europe is probably best done in your twenties, for instance. Even something like hiking a beautiful trail may be more feasible when you are forty or fifty, not seventy. And the thought experiment goes even further when it brings up the point that you can only do certain activities with others during select windows of time. You can't constantly put off playing baseball with your 10-year-old because they won't be 10 for more than 365 days. You can't keep delaying that big family trip with your parents until "your work slows down" because their time on Earth is probably more limited than yours (and the work will probably never "slow down"). When you bring other people and their mortality into the equation or, less grimly, the fact that they are also aging and moving into different seasons of life themselves, you reach the conclusion that you can't afford to wait to do certain things with them. This doesn't have to be a morbid or depressing exercise (ie, focusing on a loved one's aging and mortality); hopefully it is a motivating force to take action NOW to have memorable experiences with the people important to you. 
Memory Dividends
These memorable experiences produce what the author of Die With Zero calls "memory dividends". Essentially, these experiences with others live on in our memories for years, and recalling them can bring joy and fulfillment well into the future. So, by investing time in key experiences and moments NOW, we reap the benefits in the future via our memories and the recollection of "that time when" stories. This is not unlike investing for retirement and reaping dividends on our capital...we put money in now for a payoff later. But in the case of memory dividends we often sacrifice money now for experiences we will treasure forever. Money can't buy happiness, but it can provide us with the means to have special experiences with others that we will treasure. 
And while a lot of this memory dividend talk brings to mind spending lavishly on a multi-thousand-dollar African safari, Mediterranean cruise, or week-long trip to Disney World, one doesn't have to spend a lot of money to make memories with family and friends. In a way, being present with those we care about can be enough. Making the time for them and prioritizing that over other commitments can be everything. Ironically, we often remember some of the smallest, seemingly mundane experiences with those we love as strongly as the big, audacious trips and outings. And these memories often become more powerful with the passing of time and, ultimately, sadly, when those we made them with are no longer with us. Then, the memory is what is left in the end. A reminder of that individual and their impact on you and yours on them. 
Nostalgia is defined as a sentimental longing or wistful affection for the past. For example, we often look back fondly on our childhoods, when we perceive life was simpler and less stressful. And data show that older adults are more likely to be nostalgic as they reminisce on their youth or days gone by, and that, in general, these nostalgic experiences lead to higher subjective well-being in the elderly. So, in a way, nostalgia and positively remembering the past allow us to cope with the inevitability of death as we age. It seems a bit morbid at first, but our brains and consciousness are probably doing us a big favor by increasing our subjective well-being as we age. It often helps us feel we had a good life. 
Immortality
At the beginning of this piece I mentioned the fixation many of the world's billionaires have on living longer (or perhaps forever). And while advances in technology may one day allow those with means to upload their consciousness to the cloud or a robot body (saying nothing about the ethics of whether they should ​do it), what if I told you there is something you can do today to ensure your immortality? 
We have already covered this secret. Your existence, your impact, your story can live on in others. When we interact with those around us we are laying the groundwork for a sort of oral history of ourselves. How we treat them will often be recounted. The person we were in their eyes can persist for years to come. So, what kind of person do you want to be remembered as?
This brings to mind the concept of "resume virtues" and "eulogy virtues" from David Brooks. Brooks argues that what we all should be focused on are the actions that could end up in a funeral eulogy (he was a kind, caring husband and father who gave much of his time to his favorite charity after retirement) and not points on a resume (he rose to the rank of senior vice president). In the moment, however, we often are more fixated on the resume items. It is up to us to reflect often on our actions and ask ourselves if the things we are obsessed by in the now will be things we and others care about in the future. ​
We build our eulogy virtues through caring for others. ​
And, lest we forget, if you are a creative, a writer, or a scientist, we also build our legacy through our work. Doing good work is important in the sense that it can produce a piece of knowledge or a creative work that others can consume, be inspired by, and build on, and through this process it has the potential to live on after us. Sometimes what we put out there in the world can have an effect or impact we can't even imagine (or experience) during our life. For example, in the sciences there are many who question the value of "basic research", but this type of work can inform big breakthroughs, often years later (see, for example, Mendelian genetics, the decades-long development of the technology behind mRNA vaccines, and an overview of other innovations from fundamental research published by the National Academies of Sciences, Engineering, and Medicine). I say all this to emphasize that anything you put out in the world, from a product of your creative or cognitive labor to the way you treat others, can have long-lasting and unexpected effects. The key is to do these things without always questioning (or obsessing over) their "value" in the now. I know that is easier said than done, but history shows underappreciated work can have a big impact later. 
And while "history" may not chronicle the impact good deeds and kindness have on others at an individual level we all know at our core that how others treat us affects us. Think back to an elementary school teacher in your life...they almost certainly positively impacted you in ways they may never know. And they do the work because they care and aren't necessarily focused on "measuring" or "tracking" the product of their labor. In the immediate term there are crude and ineffective ways to "measure" ​a teacher's impact (ie, test scores) but in the long term their impact is immeasurable (igniting curiosity in young people, encouraging a student to pursue a passion, or modeling how to be a caring citizen). Measuring one's impact on others is hard unless they tell you explicitly about it but that doesn't mean an impact wasn't made. 
In the end, all of you reading this (at least with the current state of technology in late 2023) will eventually move on to whatever comes after life. As such, your life is precious and limited. So, how will you spend it? 
In my mind, our job as living beings is to try our best to bring joy, happiness, and meaning to ourselves and others while also adding value to the world. Sometimes the value you add is obvious, sometimes less so. In a way, it is not for you to judge or measure your value (though I know we all do it...hey, we are human beings after all). But if you deeply examine how you can take actions or deploy your skills to serve others today and in the future, whether they be family, friends, or your fellow man, I promise you that you will have made good use of your limited time on Earth. ​

More from the blog:
  • The power of human connection
  • Why you should get involved in things outside the lab/work
  • Cultivate serendipity by giving back and getting involved 
  • Find your passion? Finding meaning and purpose in your work and life
For Further Reading:
  • The Science of Mental Time Travel and Why Our Ability to Imagine the Future Is Essential to Our Humanity
  • In Search of Time: The History, Physics, and Philosophy of Time (Book)
  • Your Brain is a Time Machine (Book)
  • Die With Zero (Book)

Watch:
  • How to stop fighting against time - Tedx Talk by Oliver Burkeman 
  • Four Thousand Weeks: Time Management for Mortals video summary
  • Die With Zero book video summary 
  • A longer watch: Die With Zero - Net Fulfillment Over Net Worth, an interview with the author

Mind Over Matter

2/23/2023


 
Neuroscience, Career Development, Life Advice, Personal Perspective
In last month's blog post I discussed how our perspective matters in how we interact with and see the world. As I was exploring research to cite in that piece, I came across some very interesting work related to how a person's mindset can affect them physically. 
Much of this work comes from Alia Crum, an Associate Professor of Psychology at Stanford University. The Stanford Mind & Body Lab she directs studies how subjective mindsets (e.g., thoughts, beliefs, and expectations) can alter objective reality through behavioral, psychological, and physiological mechanisms. Her first publication, Mind-set Matters: Exercise and the Placebo Effect, found that female hotel room attendants who were informed that their work cleaning rooms was good exercise, satisfying the Centers for Disease Control and Prevention's recommendations for an active lifestyle, perceived themselves as getting significantly more exercise 4 weeks later than a control group, despite no overt change in their actual physical activity. Informing the attendants that their work was good exercise also affected their physiology measured at the 4-week time point. In fact, the subjects in the informed group lost an average of 2 pounds, lowered their systolic blood pressure by 10 points, and were significantly healthier as measured by body-fat percentage and body mass index.

This study is a remarkable demonstration of how perception and belief can affect not only how one perceives one's actions but also one's body and health. Crum has gone on to examine other interesting effects of mindset and beliefs on human physiology, including how a milkshake perceived as more caloric and decadent increased participants' feeling of satiety ("fullness") and reduced ghrelin levels (ghrelin is a hormone that signals hunger, so lower levels accompany satiety) more than a milkshake labeled as healthy/diet, despite the fact that the milkshakes were identical in their make-up. The simple belief that one shake was more decadent and rich (despite it not actually being so) led to a physiological signal of more "satisfaction". Beliefs are powerful things.
Watch Dr. Crum's excellent Ted Talk discussing her research & the impact our mindsets make. 
And while certainly these findings are interesting and potentially impactful in how we think about food and exercise, Crum and others have also demonstrated the power of mindset on our mental state and ability to function productively in the world.

For example, stress can both enhance and hinder human performance, and work by Crum and colleagues shows that one's stress mindset can impact both physiology and behavior. Based on responses to a scale developed by these researchers (the Stress Mindset Measure), individuals fall into either a "stress-is-enhancing" or a "stress-is-debilitating" mindset by default. Importantly, though, individuals presented with information emphasizing the enhancing nature of stress show improvements in self-reported health and work performance. Additionally, the authors found that individuals with a stress-is-enhancing mindset have a stronger desire to receive feedback on their performance and show more adaptive cortisol (stress hormone) profiles under acute stress.
Crum's stress work indicates the importance of mindset on how we respond to challenges in the world. One of her most recent publications, though, takes her lab work out into the real world. Specifically, they investigated differences in how individuals viewed the COVID-19 pandemic at its outset in Spring 2020 and the impact these varied viewpoints had on a variety of measures collected from them 6 weeks and 6 months later. Over 20,000 American adults participated in this study at intake (which took place on the very day the World Health Organization officially declared COVID-19 a global pandemic: March 11, 2020) with analyses investigating subgroups that completed the follow-up assessments at 6 weeks (May 2020; n=9,643) and 6 months (October 2020; n=7,287) post initial assessment. A total of 5,365 COVID-negative participants completed all three surveys and were included in the subsequent longitudinal analyses by the team.
Study participants' mindsets about the pandemic (assessed using a modified version of the Illness Mindset Inventory) were categorized along three dimensions:
  • Catastrophe Mindset: The COVID-19 pandemic is a global catastrophe that is wreaking havoc on our society.
  • Manageable Mindset: The COVID-19 pandemic can be managed so that people in our society can live life as normal. 
  • Opportunity Mindset: The COVID-19 pandemic can be an opportunity for our society to make positive changes.  
Importantly, mindsets differed between individuals and within individuals over time (some individuals' mindsets shifted across the timepoints assessed).
Here is an excerpt from the discussion section of their paper explaining their findings:
"Those who endorsed the catastrophe mindset more than others took the situation more seriously; they stayed home, washed their hands, and (when it was recommended) started wearing a mask. Interestingly, this appeared to be at the expense of other aspects of their wellbeing.

This contrasts with the effects of the manageable mindset. Despite maintaining high levels of wellbeing during the pandemic, people who adopted the manageable mindset to a greater extent than others were much less likely to prioritize these CDC recommendations. As such, endorsement of this mindset may reflect an attempt to deny the reality of the global pandemic and a refusal to engage with it in a socially responsible way. Over time, as people adjusted to the changes necessitated by the pandemic, it may have become more adaptive.
​
The opportunity mindset seemed to provide the best of both worldviews; those who adopted this mindset to a greater degree compared to others staved off major declines in wellbeing without subverting the behaviors necessary to engage with the pandemic in a socially responsible way."
Opportunity, Optimism, and Your Job Search
Indeed, framing stressful and challenging situations as an opportunity is crucial to aid us in persisting in activities despite the perceived and real barriers we face. And viewing the stress associated with life as enhancing can help us channel our stress to productive efforts.

For many seeking to enter the world of work, the modern job search is one of those stressful experiences that can benefit from a mindset shift. 
Your mindset affects your career. 
​Data show that students with a lifelong learning mindset (ie, a growth mindset) receive higher supervisor ratings of their performance in a co-operative education program and report higher levels of job satisfaction, work engagement, and job-related self-efficacy in their careers after graduation. In addition, they receive more promotions in their careers.

​A study of Duke University MBA students mirrored these findings: those with an optimistic attitude about life (assessed at the beginning of their graduate program) received more internship offers, had better employment prospects at graduation, and were more likely to be promoted 2 years after graduation.  
Your mindset, uncertainty, and the future. 
​
We must acknowledge that while optimism and a growth mindset can help you navigate the world and your career more effectively, we are living through a time of rapid technological progress and change. The rise of artificial intelligence, machine learning, large language models, and more has heightened anxiety among knowledge workers (a topic I will discuss in March's blog post). We must remember, though, that the future is by its very nature uncertain and unpredictable. Responding to that uncertainty and change by abandoning your agency, however, is not a winning strategy. 
Regardless of what is happening in the ever-changing external world, we must believe that we have, at minimum, control over our mindset and, as a result, believe that things can get better for us despite the stress and uncertainty we face. Cultivate optimism and a growth mindset. 
​
​Indeed, optimistic individuals tend to have better health prospects and live longer, and cultivating a growth mindset is associated with greater subjective well-being, health, and relationship and job satisfaction. ​
Optimism & Your Career
I spend much of my working days thinking about how to help individuals with Ph.D.s navigate their careers. It is both a reflection of human nature and a sign of the times that some of the most educated individuals in society are stressed, anxious, and pessimistic about their job prospects.

​Some of this is surely rooted in how academia has constructed graduate and postdoctoral training (i.e., an apprenticeship model), as well as actual barriers to work that exist for international students and scholars needing work visas to be employed in the United States, for example. 

A great deal of job search anxiety comes from the fact that humans are often wired to focus on what they don't have versus the attributes they do possess (see last month's post and a discussion of the negativity bias). We all have valuable skills and perspectives to share but we have to believe this is the case before we can convince others of these facts.

In addition, we need to channel our stress and unease about a job search into productive effort (i.e., view stress as enhancing rather than debilitating). Instead of allowing feelings of inadequacy to push us toward inaction or resignation, remember that growth and development are part of life. Just because you aren't good at something yet doesn't mean you can't develop that skill or competency.

​Take a growth mindset to developing your growth mindset. Construct a plan to enable you to assess your skills, determine where you need to develop, and chart your future, ideally before you enter a job search.  

To return to the fundamentals of your mindset: a critical first step to making progress in your career, job search, and life is believing you have something to offer and contribute. Focusing on your strengths and unique experiences can help, and as some of the data shared in this piece suggest, simply reframing your beliefs about your job search in an affirming light ("I have something to offer and contribute") can make all the difference in your experience and even, perhaps, your outcomes. 
More from the Blog
  • Career Exploration 101
  • Post Ph.D. Career Plans: Consider the Possibilities 
  • Perspective (Blog post from last month)
  • Conveying Your Value Prior to and During a Job Search
  • Compounded Returns: Growing Your Network and Personal Brand

Additional Reading
  • The Importance of Being an Optimist: Evidence from Labor Markets
  • Dispositional Optimism​
  • A Matter of Mindset: ​The Benefit of a Growth Mindset After a Career Shock

Perspective

1/26/2023

Neuroscience, Life Advice, Personal Perspective
It's all a matter of perspective.

Human beings are remarkably adaptable creatures. In fact, our ability to adapt to different climates and environmental circumstances has allowed Homo sapiens to colonize virtually all of planet Earth. Adaptability is, in essence, our evolutionary advantage. 
Habituation and Unconscious Behaviors
Adaptability is a double-edged sword, however. We often become so accustomed to a particular state that we forget what a different state feels like. Biologists might describe this in terms of homeostasis, the body's drive to maintain a steady internal state (think temperature, pH, etc.). Our brains are no different. A neuroscientist might explain the "homeostasis" of our minds as habituation. In its classic form, habituation involves our minds becoming so accustomed to a constant stimulus that, after a time, it is no longer perceived. A good example is the texture or feeling of our clothes on our skin: a stimulus is certainly being applied, but it becomes essentially imperceptible as we habituate to its constant presence. In essence, our conscious minds filter out the stimulus because it is not new, novel, or salient enough to warrant attention.  
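For readers who like a quantitative picture, habituation to a constant stimulus is often approximated as an exponential decline in response with repeated exposure. Here is a toy sketch of that idea; the decay constant is arbitrary, not an empirical value:

```python
import math

def habituated_response(initial_response: float, decay_rate: float, n_exposures: int) -> float:
    """Toy habituation model: perceived intensity of a constant stimulus
    decays exponentially with repeated or continued exposure."""
    return initial_response * math.exp(-decay_rate * n_exposures)

# A stimulus that starts at full intensity fades toward imperceptibility
responses = [habituated_response(1.0, 0.5, n) for n in range(10)]
```

By the tenth exposure the modeled response is near zero, mirroring how the feeling of clothes on our skin drops out of awareness.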

Both our perception of external stimuli and our behavior can become habitual. Consistently interpreting and reacting to the world in the same way produces a habit. Once a habit is formed, stimulus produces response almost reflexively, and conscious thought about why a particular action was taken is often absent. Habits are often useful: they free up cognitive resources and allow "routine" actions to proceed automatically. There is no need to think about how to walk once the action is developed, and, at a higher cognitive level, bicycling or driving to work every day ultimately proceeds on autopilot after a month on the same route. Because of this amazing capability of our minds, we can think about other issues and goals during our commute as the "automatic" processes of our brains take over to get us from home to work. 

​The unconscious nature of habits means we are often unaware of why we make choices or take actions that have become habitual. We may not even be aware of, or able to resist, engaging in actions that are objectively "bad" or harmful. A classic example is drug addiction. A hallmark of addiction is that drug use becomes habitual (automatic) and continues despite negative consequences. This occurs because drug use has become habitual in a biological sense, often triggered by stimuli in the environment that prompt craving and use in a powerfully unconscious way. There is strong evidence that, once drug use has become addictive, habit and "wanting" drive use more than "liking" does. 
Drug addiction may be one of the starkest demonstrations of how corrosive and destructive habits, and the unconscious processes linking stimulus to response, can be in our lives. It is far from the only problematic behavior fueled by the environment acting on core neurobiological processes. Our modern world has produced a variety of problematic habits, many driven by the ability to obtain entertainment and content in an instant. Our attention is also sapped by a plethora of digital signals coming from our screens, which appeal to our basal instincts of pleasure seeking and pain avoidance. The effects of technological proliferation on our brains and behavior are being studied, and a particular focus on how it shapes adolescents' minds during development is critical. 

Personally, I feel patience and taking the long view are in short supply these days. The current climate leads many to think feedback or "results" should be instantaneous in all aspects of their lives. We expect responses to rapidly follow actions in the 21st century, but most of life is not as quick to give us the feedback we want as clicking "buy now" on a smartphone. Overcoming these modern temptations is a challenge because of how easily they tap into habitual behaviors and our core needs for resource acquisition, human acknowledgement, belonging, and more. Fortunately, however, we have the ability to consciously frame our experience of the world in positive, constructive ways and to behave accordingly.  
Individual Differences in How We Interact with and See the World
Humans are exceptionally good at allowing their perspective to construct their version of the world.
In our modern information age, one can easily be captured by negative headlines. And while negative information is certainly more attention-grabbing (i.e., salient), that does not mean there are no positive narratives to speak of. 

In addition, many events or outcomes we experience are not objectively ALL negative or positive. Rather, there is a perspective that can often be taken that sees the positive in mostly negative events or the negative in mostly positive ones. 

I believe some human beings are wired to be more drawn to the positive or negative aspects of an experience, seeing the flaws in nearly all things or viewing the world through rose-colored glasses. Indeed, data show individual differences in whether stimuli are experienced as positive or negative, differences which may have a biological basis (see also). Through conscious decisions and processes, however, we can regulate our innate biological tendencies to focus on the negative or positive. 
Our perspective and view of the world ultimately shapes how we interact with it. If you feel the world is a hostile place and that everyone around you is motivated by their own self-interest, you will begin to take the same perspective. On the other hand, if you believe most human beings are altruistic and get fulfillment from helping others, you will perceive your interactions differently.
This is perhaps best illustrated by the many instances we encounter in a day where we are trying to discern a person's intent or motivation. Doing so can be especially difficult in a form of communication where tone and other cues are absent: email.  

When you receive an email with a comment or request you project onto it your own belief about what the person intended to communicate. It is critical, then, to try to "read" the message from multiple perspectives and not assume that it was written with either ill intent or effusive praise. 
When we are faced with fear and uncertainty, I think it is even more important to keep our perspective and not spiral into a negative state. Indeed, anxiety and stress heighten our negativity bias. A tendency to engage in cognitive reappraisal (changing the way one thinks about potentially emotion-eliciting events) can mitigate these effects, however. 
Another concept that comes to mind when thinking about perspective is the impact a growth versus fixed mindset can have on our willingness to learn and develop. Stanford psychologist Carol Dweck coined these terms, and she and her colleagues have researched how growth and fixed mindsets affect us. Those with a growth mindset believe that, with effort, perseverance, and drive, they can develop their natural qualities and "improve." In contrast, those with a fixed mindset believe talent and abilities are fixed/innate and are thus less likely to expend effort to enhance their skillsets. 
A similar concept is that of locus of control. Locus of control describes the degree to which individuals perceive that outcomes result from their own behaviors (internal locus of control), or from forces that are external to themselves (external locus of control).  

​We could all do better by developing a growth mindset and internal locus of control as we navigate a complex world. 
Shifting Perspectives
In an increasingly polarized and atomized United States and world, considering others' perspectives is a critical skill in short supply. It takes more cognitive resources and effort to consider other perspectives and ideas. This contemplation requires us to slow down and not rush to judgement. It also requires decoupling our perception of a person's intentions from that individual's actual intent. As we've discussed, it is easy to fall into negative assumptions or construct narratives of ill intent or maliciousness. While those assumptions could be true, starting from a negative space is rarely productive or effective. 

I choose to carefully reframe my perceptions of interactions before responding, taking a measured approach to understand the other party's position and viewpoint. While this takes time and effort, changing our default perceptions and habits can lead to a more productive relationship with others and the world. 
Related Blog Posts:
  • Wanting, Liking, and Dopamine's Role in Addiction
  • Stress and the Brain: How Genetics Affects Whether You are More Likely to Wilt Under Pressure​
  • To Be Rather Than to Seem
​
Further Reading:
  • Brain health consequences of digital technology use
  • ​The impact of the digital revolution on human brain and behavior: Where do we stand?
  • Where do desires come from? Positivity offset and negativity bias predict implicit attitude toward temptations
  • Your powerful, changeable mindset
  • Negativity bias, negativity dominance, and contagion (PDF)
  • The psychological and neurobiological bases of dispositional negativity (PDF)
  • Propensity to reappraise promotes resilience to stress-induced negativity bias (PDF)
 
  • Mindset: The New Psychology of Success (book)
  • The Impulse Society: America in the Age of Instant Gratification (book)
  • Positive Emotions and Psychophysiology Lab at UNC Chapel Hill (led by Barbara Fredrickson, who developed the Broaden-and-Build Theory of Positive Emotions)
  • Stanford Mind & Body Lab, which has led interesting studies on how mindset affects one's biology including: 
    • Making sense of a pandemic: Mindsets influence emotions, behaviors, health, and wellbeing during the COVID-19 pandemic
    • Mind over milkshakes: Mindsets, not just nutrients, determine ghrelin response
    • Mind-Set Matters: Exercise and the Placebo Effect

Dopamine, Drug Addiction, & Personalized Medicine

12/2/2021

​Neuroscience, Personalized Medicine
What is dopamine?
Dopamine is a neurotransmitter, a chemical that shapes how the brain processes information. It does this by binding to different categories of dopamine receptors, which then leads to changes in the intracellular processes of neurons, the cells responsible for transmitting information within and beyond the brain. The D1 family of dopamine receptors (D1 & D5) increases intracellular levels of a chemical second messenger, cyclic AMP, which can then affect how a neuron processes other signals it receives. The D2 family of dopamine receptors (D2, D3, & D4) decreases cyclic AMP, which can also shape neural responses. How dopamine signals interact with other signals in the brain can be quite complex and is beyond the scope of this piece. For more, see this review article.
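The D1/D2 family split described above can be summarized in a small lookup table. This is a deliberate simplification of real intracellular signaling, which involves many additional pathways:

```python
# Textbook split: D1-family receptors stimulate cyclic AMP production,
# D2-family receptors inhibit it. Real neurons integrate many more signals.
D1_FAMILY = {"D1", "D5"}
D2_FAMILY = {"D2", "D3", "D4"}

def camp_effect(receptor: str) -> str:
    """Return the canonical effect of a dopamine receptor subtype on
    intracellular cyclic AMP levels."""
    if receptor in D1_FAMILY:
        return "increases cAMP"
    if receptor in D2_FAMILY:
        return "decreases cAMP"
    raise ValueError(f"Unknown dopamine receptor subtype: {receptor!r}")
```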

Dopamine signaling plays a role in a variety of critical cognitive processes including motor control, learning, and decision making. It has also been implicated in the addictive nature of drugs of abuse, which I studied in some detail during my Ph.D. and postdoctoral research. 
Positron Emission Tomography and measuring dopamine signaling in the human brain
Positron Emission Tomography (PET) allows scientists to measure dopamine signaling in the living brain. PET has been around since the 1960s and involves imaging the location and amount of a radiotracer (a radioactively tagged compound) in the body. Most PET radiotracers contain C-11, F-18, or O-15 radioactive isotopes. These isotopes release positrons (the antiparticle of the electron), which, when they interact with nearby electrons in the body, produce an annihilation event that emits two gamma-ray photons traveling roughly 180 degrees apart. The PET scanner "counts" these gamma-ray events and reconstructs an image by projecting the counts back into the body part being imaged. The resulting images give quantifiable data on how much tracer accumulates in a particular area over time.
Schematic of how a PET scanner measures gamma rays to quantify the level of a radiotracer in particular anatomical areas of the brain. Image by Jens Maus (http://jens-maus.de/); Public Domain, https://commons.wikimedia.org/w/index.php?curid=401252
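One practical consequence of the isotopes listed above is that tracer radioactivity decays quickly, which constrains scan timing and requires C-11 tracers to be synthesized close to the scanner. A minimal sketch of the standard exponential-decay law (half-lives are approximate published values):

```python
# Approximate half-lives, in minutes, of common PET isotopes
HALF_LIFE_MIN = {"C-11": 20.4, "F-18": 109.8, "O-15": 2.04}

def fraction_remaining(isotope: str, minutes: float) -> float:
    """Fraction of the original radioactivity left after `minutes`,
    from the decay law A(t) = A0 * 2^(-t / t_half)."""
    return 0.5 ** (minutes / HALF_LIFE_MIN[isotope])

# After 60 minutes, F-18 retains roughly 68% of its activity,
# while O-15 (about a 2-minute half-life) is essentially gone.
f18_left = fraction_remaining("F-18", 60)
o15_left = fraction_remaining("O-15", 60)
```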
​Brain PET is a particularly powerful technique in that we can use radiotracers that allow us to investigate brain metabolism, neurotransmitter receptors (dopamine or opioid, among others), neurotransmitter synthesis, and the presence of beta-amyloid plaques (often present in Alzheimer's disease). With these compounds we gain a better understanding of individual differences that may be useful as markers of disease state or risk for developing a particular disease. Common radiotracers for imaging the dopamine system include FDOPA, C-11-Raclopride, F-18-Fallypride, FMT, and others. Several groups have used some of these compounds to better understand the dopamine system's role in drug abuse. 
Do dopamine signaling differences reflect risk for drug addiction?
All drugs of abuse release dopamine in the brain. Dopamine, among other things, links pleasure/wanting with the stimuli its release is paired with. Thus, differences in dopamine signaling in response to drugs of abuse may relate to a greater propensity to re-use drugs found to be rewarding and potentially lead to increased risk for drug addiction.

PET imaging has shown that lower dopamine D2/3 receptor availability is present in a variety of drug-addicted populations (alcohol, cocaine, methamphetamine, heroin) compared with healthy controls. Whether low D2 receptor levels are a cause or a consequence of problematic drug use has been difficult to determine in human studies, however.
Animal work has suggested that behavioral impulsivity is associated with lower D2 receptor levels in rodents, and that high-impulsive rats later self-administer more cocaine than low-impulsive rats (Dalley et al., 2007). Thus, low D2 receptor levels may confer a greater propensity to engage in behaviors associated with drug addiction risk in humans (impulsivity, novelty seeking). Furthermore, work in non-human primates has shown that low D2 receptor levels predict escalation in cocaine self-administration, which in turn lowers D2 receptor levels further (Nader et al., 2006). This work suggests that low D2 receptor levels may predispose individuals to escalate drug use and that chronic drug use changes these receptor levels further.
Human PET studies have focused on individuals with a family history of addiction to try to corroborate the animal work linking dopamine D2 receptors with addiction risk. Volkow et al. (2006) showed that individuals with a family history (FH) of alcoholism have heightened D2 receptor levels in the striatum (a region deep in the brain responsible for reward processing, learning, and action initiation) compared to subjects without a family history. They argue these high D2 levels may serve as a protective factor that prevented these individuals from becoming alcohol abusers themselves. This finding highlights the complexity of working with human subjects, as the animal literature might have suggested the opposite finding (lower D2 in FH individuals). Human motives to use drugs are many, and the environment greatly shapes behavior. It could be argued that FH-positive individuals with lower D2 (not observed in Volkow et al.) had behavioral profiles (see Dalley et al., 2007, above) that led them to transition to alcohol/drug abuse already and thus be excluded from the Volkow study. Undoubtedly, there are more variables associated with risk for drug use than low D2 levels, and future work may identify what other factors (genetic, environmental, social) interact with D2 levels to predict drug abuse risk.
Genetic factors affecting dopamine signaling
There has also been interest in understanding whether genetic differences may lead to different levels of D2 receptor availability, potentially placing some individuals at greater risk for addictive disorders. I investigated the effect of some common D2 receptor single nucleotide polymorphisms (SNPs) on D2 receptor availability using F-18-Fallypride as part of my postdoctoral research. Many of these SNPs had been previously associated with dopamine receptor differences in relatively small PET studies or been associated with potential increased risk for drug addiction. 
  • Taq1A - A1 allele associated with lower striatal D2 receptor availability (replicated in separate study but not in a third)
  • C957T - C allele associated with lower striatal D2 receptor availability in study of 45 individuals 
  • -141C Ins/Del - inconsistent findings on whether it affects D2 receptor availability
For more see: Genetic variation and dopamine D2 receptor availability: a systematic review and meta-analysis of human in vivo molecular imaging studies
Because the Taq1A SNP was the first discovered to associate with differences in dopamine signaling, researchers have used it as a proxy for D2 receptor status (or, more loosely, as an index of general dopamine functioning). However, given that the Taq1A polymorphism does not occur within the DRD2 gene itself, researchers have speculated that Taq1A may be linked to other SNPs in the DRD2 gene that are the real drivers of receptor expression in vivo.

The C957T and -141C Ins/Del polymorphisms are in strong linkage disequilibrium with Taq1A and have themselves been associated with striatal D2/3 receptor availability. Despite the data suggesting that these SNPs are strongly linked, few studies have systematically investigated the effect of C957T, -141C Ins/Del, and Taq1A in isolation and combination on D2/3 receptor availability. Beyond the potential link to drug addiction risk, characterizing the functional effect of these SNPs on D2/3 receptor availability has implications for better understanding the mechanisms through which they exert their demonstrated influence on motivated behaviors including learning and decision making, impulsivity, and reward responsivity. 
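Linkage disequilibrium itself is quantified with standard statistics such as D and r². As a sketch, using made-up haplotype and allele frequencies (not real Taq1A/C957T values):

```python
def ld_stats(p_ab: float, p_a: float, p_b: float) -> tuple[float, float]:
    """Compute the linkage-disequilibrium coefficient D and the r^2
    statistic from the AB haplotype frequency (p_ab) and the allele
    frequencies p_a and p_b:
        D   = p_ab - p_a * p_b
        r^2 = D^2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
    """
    d = p_ab - p_a * p_b
    r2 = d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, r2

# Illustrative frequencies only
d, r2 = ld_stats(p_ab=0.15, p_a=0.2, p_b=0.25)
```

Two SNPs in strong LD (high r²) carry largely redundant information, which is why Taq1A status can stand in for variants inside DRD2 itself.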

In our work, we used F-18-Fallypride, which is a D2/3 receptor tracer with favorable affinity to measure both striatal and extrastriatal dopamine receptors, and assessed the impact of C957T, Taq1A and -141C Ins/Del SNPs on D2/3 receptor availability in a sample of 84 healthy subjects.
The C allele of the C957T SNP was associated with lower D2/3 receptor availability in the ventral striatum and putamen. No other SNP investigated demonstrated an effect on D2/3 receptor availability. BPnd=binding potential, a measure of D2/3 receptor availability; VS=ventral striatum
We found that the C957T SNP was associated with variation in dopamine D2/3 receptor availability in areas of the striatum often implicated in reward processing. The fact that the C allele was associated with lower dopamine receptor availability suggests it could be a useful genetic measure for at least one biological factor (lower D2 receptor availability) linked with drug addiction. While more work needs to be done to confirm these results, certainly further study of the C957T SNP in the DRD2 gene is warranted. 
Individual differences in dopamine release
Another area of focus regarding dopamine's role in addiction is understanding differences in dopamine release in response to potential drugs of abuse. This measure is more closely tied to the biological processes engaged by actual drug use, but it is collected in a more controlled, laboratory setting. PET psychostimulant challenge studies allow researchers to examine dopamine release in the brains of human subjects. Methylphenidate and d-amphetamine (dAMPH) are often used in these studies, as both release dopamine in the brain by blocking and/or reversing the dopamine transporter. If radiotracers that are displaceable by endogenous dopamine are used, researchers can perform a PET scan after placebo or psychostimulant administration and measure the change in radiotracer signal. For a displaceable tracer, the PET signal goes down after a psychostimulant because the increased endogenous dopamine released by the drug leaves fewer binding sites available to the tracer. This change in the radiotracer's binding potential can be used as a measure of dopamine release and has become a useful tool in research on addiction-related processes.
Areas of significant change in D2/3 receptor availability as measured by F-18-Fallypride PET after dAMPH administration when compared to PET data collected on placebo. This change in receptor availability on dAMPH is interpreted as a measure of the level of dopamine release to the dAMPH. Data from 34 healthy young adults. dAMPH=d-amphetamine
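The change in binding potential is commonly expressed as a percent reduction relative to the placebo scan; here is a minimal sketch of that calculation (conventions vary slightly across studies, and the numbers below are illustrative only):

```python
def dopamine_release_index(bp_placebo: float, bp_drug: float) -> float:
    """Percent reduction in radiotracer binding potential (BPnd) after a
    psychostimulant challenge, relative to the placebo scan. A larger
    reduction is interpreted as more endogenous dopamine competing with
    the tracer for binding sites."""
    return 100.0 * (bp_placebo - bp_drug) / bp_placebo

# Illustrative values: BPnd of 20.0 on placebo falling to 17.0 on dAMPH
release_pct = dopamine_release_index(20.0, 17.0)  # 15.0 (% reduction)
```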
Using this PET technique, Casey et al. (2014) found that young adults with a multigenerational FH of substance use disorders showed reduced dAMPH-induced dopamine release compared with either healthy controls or subjects who personally used drugs at levels similar to the FH group but had no FH of substance use disorders. This study was particularly informative because the effects of current drug use were investigated and measured separately from family history. Furthermore, our group and others have demonstrated that dAMPH-induced dopamine release correlates with subjective ratings of the drug, particularly wanting more, in drug-naïve individuals. These data confirm animal work linking changes in dopamine signaling after drug use to wanting processes (which has been labeled incentive salience).

Read more about wanting, liking, and drug abuse in a previous blog post.
​
The concept of blunted dopamine signaling (lower D2 receptor levels and less dopamine release) as biomarkers of addiction has also been recently reviewed (Trifilieff et al 2017; Leyton, 2017). While more work needs to be done, understanding factors that influence these PET-based biomarkers of dopamine signaling in human subjects has the potential to identify at risk individuals. This risk identification may allow intervention to be attempted earlier in the addiction process or perhaps prevent addiction before it even occurs.
Individual differences in dopamine signaling and the future of personalized medicine
The term “personalized medicine” has gained popularity in recent years. While it may seem like a buzzy term, its potential for improving treatment of a variety of medical conditions is vast. Personalized medicine involves tailoring treatments to individuals based on some aspect of their biology that might affect how they respond to a treatment. For example, you might give one patient with a particular genetic variant a different pharmacological treatment than another if that variant affects how they process (metabolize) or respond to that particular drug. This particular approach of using genetic information to understand response to pharmaceuticals is termed pharmacogenomics (see also).
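In code terms, the pharmacogenomic idea is simply a genotype-conditioned dosing rule. The mapping below is purely hypothetical, invented to illustrate the concept, and is not clinical guidance:

```python
# Hypothetical metabolizer-status -> dose-multiplier table. The category
# names follow common pharmacogenomic usage, but the multipliers here are
# invented for illustration only.
DOSE_FACTOR = {
    "poor_metabolizer": 0.5,        # drug clears slowly -> lower dose
    "normal_metabolizer": 1.0,
    "ultrarapid_metabolizer": 1.5,  # drug clears quickly -> higher dose
}

def adjusted_dose(standard_dose_mg: float, metabolizer_status: str) -> float:
    """Scale a standard dose by a (hypothetical) genotype-derived factor."""
    return standard_dose_mg * DOSE_FACTOR[metabolizer_status]
```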
The rapid reduction in the cost to sequence the human genome (the complete set of an individual’s DNA), as well as the proliferation of genotyping services such as 23andMe (which genotype common genetic polymorphisms, the areas of human DNA most likely to vary across individuals), means that genetic data can be readily obtained by anyone who wants it. This technological advance will give physicians greater insight into a patient’s underlying biology and will eventually be merged with growing knowledge of the effects of genetic variation on drug metabolism, brain signaling, and behavior to make personalized medicine commonplace. In fact, pharmacogenomic data have been added to the labeling of several drugs by the FDA.

My own work, referenced above, suggests that genetic variation in a gene encoding the dopamine D2 receptor (DRD2) can affect the relative availability of this receptor in the brain as measured with PET (Smith et al., 2017 Translational Psychiatry). Individuals with a particular genetic variant in DRD2 that is associated with less availability of the receptor (C957T CC individuals) may need either a higher dose of a D2 drug or a higher affinity D2 drug to receive a therapeutic benefit.

The implications for this finding go beyond potential treatments or interventions for drug addiction. D2 agonists are commonly used in Parkinson’s Disease patients to preserve motor function and D2 antagonist-like drugs are used in the treatment of Schizophrenia. Understanding the genotype of individuals affected with these conditions, then, could enhance the effectiveness of their D2 drug treatments (by suggesting a physician might want to start with a higher or lower dose of the drug). While studies such as ours linking genetic variation with differences in biology are encouraging, DNA can also be modified by the environment. Researchers have begun studying these epigenetic effects on behavior, with most work occurring in rodents. As we integrate this knowledge, we will begin to better understand the impact gene by environment interactions have on biology and behavior.
Non-genetic factors also influence dopamine signaling
Genetics are not the only variables worth attending to in future treatments. Dopamine signaling is also known to decline with age (see also a previous blog post on this topic), so doses of dopaminergic drugs that work well in young adults might need to be titrated in older adults. Furthermore, we and others have shown that estradiol levels in naturally cycling women can affect dopaminergic brain function (assessed with fMRI imaging and a genetic variant, COMT, known to affect dopamine levels in the higher-order, prefrontal areas of the brain). Thus, a dopaminergic medication might treat a female patient’s symptoms more effectively at certain points of her menstrual cycle than at others. We are only beginning to understand the role of female sex hormones in a variety of biological systems, as basic research has historically focused on male model organisms.​
Dopamine signaling complexity and developing future treatments
The role of dopamine in drug addiction is quite complex. In addition, implementing personalized medicine when treating psychiatric or behavioral disorders is challenging as most of these disorders do not have a single, identifiable biological cause. The brain is complex enough and the fact that genetics, sex hormones, age, and environment can all affect one neurotransmitter (dopamine) among the many others involved in brain function speaks to the vast challenge that lies ahead for researchers.

​Our quest to better understand individual differences, however, has the potential to lead to more targeted treatments and therapies for a variety of dopamine-associated disorders including ADHD, Schizophrenia, Parkinson’s Disease, and drug addiction. The development of these personalized treatments will undoubtedly improve healthcare in the 21st Century and beyond but will require further research focused on measuring and categorizing individual differences. 
​

Explore more neuroscience-related posts on the blog:
  • ​Declining Dopamine: How aging affects a key modulator of reward processing and decision making
  • Stress & the Brain: How genetics affects whether you are more likely to wilt under pressure
  • Wanting, Liking, & Dopamine's Role in Addiction
  • Now vs Later - How immediate reward selection bias may be a risk factor for addiction 

More scholarly articles on dopamine and its effects:
  • What does dopamine mean?
  • Fifty years of dopamine research
  • Dopamine, behavior, and addiction
  • Dopamine and effort-based decision making

Now vs Later - How Immediate Reward Selection Bias May be a Risk Factor for Addiction

10/28/2021

0 Comments

 
Neuroscience
Picture
It has been over 7 years since I defended my Ph.D. dissertation in March 2014 at the University of North Carolina at Chapel Hill. Here, I wanted to share some of the rationale and implications of my graduate research on immediate reward selection bias in humans. While this research encompassed 5+ years of my life and resulted in a 112-page dissertation, I will focus on the key points and findings and why they are important. I have moved on from doing this research in my current role, but it will forever be a part of my identity. In addition, I hope my scientific contributions have added a bit more to our understanding of substance abuse risk factors and how we might either intervene proactively to support those at risk for addiction or treat some of the behavioral patterns in addicted individuals that can continue the cycle of problematic drug use despite its negative consequences. 
What is an intermediate phenotype? 
Many psychiatric disorders including schizophrenia and depression are complex and heterogeneous (i.e., they have diverse and varied symptoms and potential causes). The highly heritable nature of these disorders, estimated from twin studies to be anywhere from 40 to 80% (Sullivan et al., 2000; Sullivan et al., 2003), suggests that some biological processes mediated by genetics must confer risk for developing the disorders. It has been proposed that the inability to isolate strong biological bases for how genetic variation leads to complex, highly heritable diseases lies in the fact that various intermediate behaviors or traits are more closely tied to the genetics associated with the disease (Rasetti and Weinberger, 2011).

Given that substance use disorders (SUDs) are also complex disorders (people consume and continue to use drugs of abuse due to a variety of factors) with heritability estimates ranging from 40 to 60% (Heath et al., 2001; Verweij et al., 2010; Bierut, 2011; Agrawal et al., 2012), the identification of intermediate phenotypes associated with risk for these disorders is a growing focus of research (Karoly et al., 2013). Behavioral candidates for SUD intermediate phenotypes include reduced response inhibition (Acheson et al., 2011; Norman et al., 2011), increased risk taking behavior (Cservenka and Nagel, 2012; Schneider et al., 2012), aberrant reward responsivity (Wrase et al., 2007; Andrews et al., 2011), and increased discounting of delayed monetary rewards (Mitchell et al., 2005; Boettiger et al., 2007; Claus et al., 2011; MacKillop et al., 2011; MacKillop, 2013).
Criteria for categorizing a behavior as an intermediate phenotype
For an intermediate phenotype to be useful it must be a quantitative, continuously variable feature or behavior that can be consistently measured. Furthermore, as these intermediate phenotypes are thought to convey genetic risk for a disorder, they should be elevated in those affected with the disorder as well as in those individuals’ close relatives who share genetic similarity with them. Importantly, the level of these phenotypes in affected individuals and their close relatives should be shifted away from a distribution of those otherwise unaffected with no familial risk (Gottesman and Gould, 2003). For example, Egan et al. (2001) found unaffected siblings of those with schizophrenia to display executive function deficits that fell between unaffected nonrelatives and individuals with schizophrenia.

A variety of criteria have come to define an intermediate phenotype in psychiatry (Almasy and Blangero, 2001; Gottesman and Gould, 2003; Waldman, 2005; Meyer-Lindenberg and Weinberger, 2006):
  1. The phenotype should be sufficiently heritable with genetics explaining variance in the behavior.
  2. The phenotype should have good psychometric properties as it must be reliably measurable to be a useful diagnostic.
  3. The phenotype needs to be related to the disorder and its symptoms in the general population.
  4. The phenotype should be stable over time in that it can be measured consistently with repeated testing, potentially to assess treatment effects.
  5. The behavior should show increased expression in unaffected relatives of those with the disorder as highlighted by Egan et al. (2001), above.
  6. The phenotype should co-segregate with the disorder in families: a family member with the disorder should show the behavior or trait to a greater degree than an unaffected sibling, and that unaffected sibling should display the trait to a greater degree than a distant unaffected relative.
  7. The phenotype should have common genetic influences with the disorder.

To illustrate the intermediate phenotype concept and associated criteria, we can look at research in schizophrenia. Schizophrenia is associated with poor performance (and hyperactivity in an area of the brain known as the dorsolateral prefrontal cortex, dlPFC) on executive function tasks. As mentioned above, Egan et al. (2001) found unaffected siblings of those with schizophrenia to display executive function deficits that fell between unaffected nonrelatives and individuals with schizophrenia. Furthermore, genes affecting dlPFC activity and executive functions such as the catechol-O-methyltransferase (COMT) gene explain variation in schizophrenia risk (see Egan et al., 2001). Thus, by investigating a specific behavior (executive function) and its neural correlate (dlPFC activity) in schizophrenic patients and those at increased genetic risk for the disorder, a genetic factor (COMT) was isolated. Schizophrenia is caused by more than one genetic variation but this example illustrates the value of identifying a link between a behavior associated with schizophrenia (an intermediate phenotype) and a potential biological and genetic basis for said behavior.  ​

What is immediate reward selection (Now) bias?
Delay discounting (DD) behavior reflects the tendency for animals (including humans) to discount the value of delayed rewards in comparison to those available immediately. DD has also been referred to as immediate reward selection (“Now”) bias, as the value of rewards available immediately supersedes waiting for a larger, delayed reward in the future (Rachlin and Green, 1972). Other terms for this behavior include temporal discounting or hyperbolic discounting, as plots of the present value of time-delayed rewards often take a hyperbolic shape, with value decreasing steeply at short delays and more gradually at longer ones. In other words, $100 in 1 month may be worth $50 in present value, while $100 in 3 months is worth only $25: the total discount grows with delay, but the per-month rate of decline is steepest over the shortest delays. 
Picture
Example of temporal discounting behavior. The present value of a reward decreases with the time one must wait to receive it. Individuals differ in the degree to which they discount rewards over time. The individual whose choice behavior is plotted with open circles and fit with the dashed line is a steeper discounter of time than that individual plotted with the filled circles and solid line of best fit.
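The example above is consistent with the standard hyperbolic model, in which a reward of amount A delayed by D time units has present value V = A / (1 + kD), with k indexing how steeply an individual discounts. A minimal sketch (the function name and the k value are illustrative, not taken from the study):

```python
def present_value(amount, delay, k):
    """Hyperbolic discounting: subjective present value of a reward
    of size `amount` available after `delay` time units, where k is
    the discount rate (larger k = steeper discounter)."""
    return amount / (1 + k * delay)

# With k = 1.0 per month, $100 loses half its subjective value after
# a 1-month delay and three-quarters after a 3-month delay, matching
# the example in the text.
print(present_value(100, 1, 1.0))  # 50.0
print(present_value(100, 3, 1.0))  # 25.0
```

Fitting k to an individual's choices is one common way to quantify how steep a discounter that person is, as in the figure above.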
Now bias as an intermediate phenotype for alcohol use disorders
As delay discounting (DD) behavior has been shown to be highly heritable (Anokhin et al., 2011; Anokhin et al., 2015; Mitchell, 2011), suggesting a strong genetic component, and is elevated in a variety of addictive behaviors (MacKillop et al., 2011), we focused our exploration of intermediate phenotypes for addiction on this behavior. Prior work has suggested DD displays many of the necessary criteria of an intermediate phenotype for a variety of neurobehavioral disorders including substance use disorders (SUDs) (Becker and Murphy, 1988; Reynolds, 2006; Perry and Carroll, 2008; Rogers et al., 2010), attention deficit hyperactivity disorder (Barkley et al., 2001; Sonuga-Barke et al., 2008; Paloyelis et al., 2010), and pathological gambling (Alessi and Petry, 2003; Leeman and Potenza, 2012). As these behaviors often co-occur, they may share similar biological and genetic components (Wilens, 2007; Leeman and Potenza, 2012).
An overview of various intermediate phenotype criteria for SUDs met by DD (Now bias) has been recently outlined (MacKillop, 2013). Particularly relevant to the current work, individuals with alcohol use disorders (AUDs) consistently display greater Now bias behavior versus those without AUDs (Petry, 2001; Bjork et al., 2004; Mitchell et al., 2005; Boettiger et al., 2007; Mitchell et al., 2007; MacKillop et al., 2011). Thus, Now bias is elevated in those individuals with an AUD (intermediate phenotype criterion 3).

Conceptually, Now bias can be thought to have some relation to AUDs, as every relapse or excess drink represents a decision favoring immediate over delayed benefits. Furthermore, Now bias behavior has been shown to be heritable and associates with substance use, suggesting common genetic influences with SUDs (Anokhin et al., 2011). Importantly, Now bias, as assessed through delay discounting (DD) tasks, has good psychometric properties (responses are highly reliable (Matusiewicz et al., 2013; Weafer et al., 2013)), suggesting it is a trait that is robust to repeated measurement (intermediate phenotype criterion 2). This is further supported by the fact that DD behavior is stable over time (Kirby, 2009). Thus, prior work has demonstrated Now bias satisfies many of the criteria for an intermediate phenotype for AUDs. However, not all criteria have yet been examined.  
Picture
Under-investigated criteria for Now bias as an intermediate phenotype for AUDs
As Now bias is elevated in those with AUDs, we might expect to see this behavior heightened in those on a trajectory toward an AUD as well. Demonstrating a link between elevated Now bias and AUD risk would add greatly to the utility of Now bias as an intermediate phenotype. As problematic alcohol use during emerging adulthood (late teens to early twenties) may predict development of an AUD later in life (O'Neill et al., 2001; Merline et al., 2008; Dick et al., 2011), though many individuals mature out of problematic use (Bartholow et al., 2003; Costanzo et al., 2007; Lee et al., 2013), one might expect Now bias to be enriched in problematic-drinking emerging adults. Only one relatively small behavioral study has looked at such a relationship, with Now bias observed to be heightened among heavy versus lighter social-drinking college students (Vuchinich and Simpson, 1998). This finding requires replication in a larger, more diverse sample.

In addition to being elevated in problematic drinking emerging adults, to satisfy another intermediate phenotype criterion for AUDs, Now bias behavior should also be elevated in unaffected first-degree relatives (parents, siblings) of those suffering from AUDs (intermediate phenotype criterion 5). Elevated Now bias in first-degree relatives of those with AUDs has yet to be adequately demonstrated, however.

​Most of the intermediate phenotype literature considers the expression of the behavior or trait in first-degree relatives as critical in demonstrating that behavior as an intermediate phenotype. In the field of AUDs, however, positive family history of an AUD is often defined as having at least one parent with an AUD (Acheson et al., 2011) or a father with an AUD (Crean et al., 2002; Petry et al., 2002), or some combination of parental history or sufficient density of AUD history in second-degree relatives (Herting et al., 2010). In these previous studies, the effect of family history on Now bias was either only observed in females (Petry et al., 2002), not found at all (Crean et al., 2002; Herting et al., 2010), or not present when controlling for group differences in IQ and antisocial behavior (Acheson et al., 2011).

Measuring Now bias behavior in individuals with any first-degree relatives with AUDs expands the classic family history positive AUD definition to include siblings, who display greater genetic concordance with a particular individual than their parents. To our knowledge, though, this definition of first-degree family member positive or negative for AUDs has not been applied to the study of Now bias. Thus, while Now bias possesses many properties that suggest it could be a good intermediate phenotype for AUDs, further investigation of this possibility is warranted, particularly work focusing on examining whether Now bias is elevated in unaffected individuals with first degree relatives with AUDs.

Given our review of the literature and past work in this area (Mitchell et al., 2005), we focused on better demonstrating the utility of Now bias as an intermediate phenotype for AUDs in a large group of individuals who ranged in age, level of alcohol use, and family history of AUDs. This work was published in Frontiers in Human Neuroscience in 2015, but I share the key takeaways from the study below. ​
New evidence supporting Now bias as an intermediate phenotype for AUDs
As mentioned earlier, adults with addictive disorders, including alcohol use disorders (AUDs), tend to choose smaller, sooner over larger, delayed rewards in the context of delay-discounting (DD) tasks more frequently than do adults with no addiction history (Petry, 2001; Mitchell et al., 2005; MacKillop et al., 2011). This immediate reward selection (or “Now”) bias persists even after years of abstinence and does not correlate with abstinence duration (Mitchell et al., 2005), suggesting irreversible consequences of chronic alcohol abuse and/or a pre-existing risk trait, or intermediate phenotype (Meyer-Lindenberg and Weinberger, 2006; MacKillop, 2013). If Now bias were a pre-existing risk trait for AUDs, we would predict heightened Now bias among young people who engage in at-risk drinking but who do not meet clinical criteria for alcohol dependence, relative to age-matched light or moderate drinkers. In addition, if this behavior were an intermediate phenotype for AUDs, we would also predict heightened Now bias among light or moderate drinkers with problem-drinking first-degree relatives. 
As the Alcohol Use Disorders Identification Test (AUDIT) is an effective means of measuring problem drinking behavior (Fiellin et al., 2000; Babor and Higgins-Biddle, 2001; Kokotailo et al., 2004), we recruited high- and low-AUDIT individuals across a group of 18- to 40-year-old social drinkers not reporting any AUD. We hypothesized that Now bias would be elevated in high- but not low-AUDIT emerging adults (defined as ages 18-21 or 18-24). Furthermore, we sought to test whether Now bias was elevated in otherwise unaffected individuals (light/moderate social drinkers; low AUDIT) with a first-degree relative with an AUD. We used the intermediate phenotype criterion of first-degree biological relative status (father, mother, or sibling with an AUD), excluding those with mothers with an AUD to rule out potential fetal alcohol effects. We hypothesized that Now bias would be elevated in low-AUDIT individuals with a first-degree relative with an AUD but not in those with no first-degree AUD relative. ​
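For context, the AUDIT consists of 10 items, each scored 0-4 (total 0-40), and the WHO manual suggests a total score of 8 or more as indicating hazardous or harmful drinking. A minimal sketch of scoring and grouping (the high/low split here follows that standard cutoff; the study's own thresholds may have differed):

```python
def classify_audit(item_scores):
    """Sum the 10 AUDIT item scores (each 0-4, total 0-40) and group
    the respondent. The >= 8 cutoff for hazardous/harmful drinking
    follows the WHO AUDIT manual; it is used here for illustration."""
    if len(item_scores) != 10 or any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("AUDIT has 10 items, each scored 0-4")
    total = sum(item_scores)
    return total, ("high" if total >= 8 else "low")

# A light social drinker endorsing a few low-frequency items:
print(classify_audit([1, 0, 1, 0, 0, 0, 0, 1, 0, 0]))  # (3, 'low')
```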
Considering the effect of age on Now bias 
As this study was underway, we began to wonder how age might impact Now bias independent of problematic alcohol use. Our lab had previously found marked Now bias among emerging adults (18-25 years), regardless of drinking behavior. This suggests elevated DD generally among individuals transitioning from adolescence to adulthood. The observation that adult controls (average age of 26-28) with no AUD diagnosis display reduced Now bias compared to abstinent alcoholic adults (Mitchell et al., 2005; Boettiger et al., 2007) suggests that this bias should decline between emerging adulthood and adulthood, at least among moderate, non-problem drinkers. While emerging adults are widely regarded as impulsive (Chambers and Potenza, 2003; de Wit, 2009), and DD normally decreases from childhood to the early 30s (Green, 1994; Scheres et al., 2006; Olson et al., 2007; Eppinger et al., 2012), little is known about specific changes in DD from late adolescence to adulthood. Some data show trait impulsivity declining linearly with age from early adolescence to age 30 (Steinberg et al., 2008). Thus, given positive correlations between DD and trait impulsivity (Mitchell et al., 2005; de Wit et al., 2007), we hypothesized DD should decline with age from adolescence into the 30s, but, to our knowledge, no prior studies have explicitly investigated age effects on DD in detail from ages 18 to 40. Moreover, we do not know whether heavy alcohol use moderates any such age-related changes in DD. A secondary aim of our work, then, was to investigate age-related differences in Now bias in our population as a whole and separately in those reporting heavy, problematic versus light/moderate drinking. 
Confirming and extending prior work, we found that emerging adults (defined as either aged 18-21 or aged 18-24), regardless of their drinking status (light/moderate vs. heavy drinkers), showed equally high Now bias behavior, which did not support our first hypothesis that this behavior would be elevated in heavy drinkers. We measured Now bias as an impulsive choice ratio (ICR), which can range from 0 (no Now bias) to 1 (complete Now bias). Follow-up analyses showed that Now bias generally declined with age in our light/moderate drinker population (r=-0.28, p=0.022) but not in the heavy drinkers (r=-0.03, p=0.39). The age-related decline in Now bias began to asymptote around age 25. Thus, we organized our data into emerging adult (aged 18-24) and adult (aged 26-40) groups for further analyses. 
Picture
Our measure of Now bias, ICR, was found to decline with age in light/moderate drinkers but not in heavy drinkers.
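Since ICR is simply the proportion of trials on which the immediate option is chosen, it can be computed directly from trial-level choices. A minimal sketch (the 'now'/'later' encoding is a hypothetical illustration, not the study's actual data format):

```python
def impulsive_choice_ratio(choices):
    """Fraction of trials on which the smaller, immediate reward was
    chosen over the larger, delayed one. 0 = no Now bias,
    1 = complete Now bias."""
    if not choices:
        raise ValueError("no trials recorded")
    return sum(c == "now" for c in choices) / len(choices)

# A participant choosing the immediate option on 30 of 40 trials:
print(impulsive_choice_ratio(["now"] * 30 + ["later"] * 10))  # 0.75
```

Correlating each participant's ICR with age is what yields the age-related decline plotted above for light/moderate drinkers.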
Now bias as intermediate phenotype for AUDs in adults
We did confirm our second hypothesis that Now bias (measured via ICR) was elevated in light/moderate drinking adults (aged 26-40) with first-degree relatives with an alcohol use disorder (AUD). Plotting our adult population by first-degree relative status and comparing it to heavy drinking adults and abstinent alcoholic adults studied previously (Mitchell et al., 2005), we found strong evidence supporting Now bias as an intermediate phenotype for AUDs.
  1. Now bias (ICR) is high in individuals who drink heavily and problematically but without an AUD (orange bar in graph, below)
  2. Heavy drinker ICR is nearly equivalent to that seen in abstinent alcoholics (red bar in graph, below) despite these individuals not meeting criteria for an AUD
  3. Now bias is elevated in light/moderate drinking adults with first-degree relatives with an AUD (FH+) relative to those without a first-degree relative with an AUD (FH-; blue bars in graph, below)
Picture
Among adults, Now bias as measured by ICR is elevated in individuals at risk for AUDs. The dark blue line and red bar represent prior data measuring ICR in abstinent alcoholics and healthy controls. New data from heavy or light/moderate drinking adults with (FH+) and without (FH-) a family history of AUDs are plotted in the orange and blue bars, respectively.
Implications of our findings - Could reducing Now bias lower one's risk of an AUD?
Our work has added additional support to Now bias being an intermediate phenotype for alcohol use disorders (AUDs). The fact that Now bias was elevated in heavy drinking adults without an AUD suggests that this behavior may precede an AUD diagnosis. More work is needed to follow up on this finding, however. Specifically, longitudinal studies need to be conducted to measure Now bias in individuals in their early teens (prior to exposure to drinking) and continue to measure this behavior over the lifespan, especially as these individuals enter their late teens and early twenties, when problematic drinking behavior often emerges. Only through careful study of the trajectory of Now bias during adult development in both non-problematic and problematic drinkers can we begin to truly determine the utility of this measure as an intermediate phenotype for alcohol use disorders or substance use disorders in general.

Ongoing work taking place as part of the Adolescent Brain Cognitive Development (ABCD) Study seeks to understand adolescent brain and cognitive development generally and the various behavioral (including Now bias) and neural risk factors that can emerge in adolescence and lead to mental or psychiatric disorders in adulthood.

Learn more about this ambitious study here and view current publications emerging from the dataset here. 
Since our study on Now bias as a potential intermediate phenotype for AUDs was published in Frontiers in Human Neuroscience, other work has shown:
  • Large individual differences in intertemporal choice behavior (Review; Keidel et al., 2021)
  • Genomic basis of delayed reward discounting (Gray et al., 2019)
  • Steep Discounting of Future Rewards as an Impulsivity Phenotype: A Concise Review (Levitt et al., 2020) 
  • ​Individuals with two parents with addiction have significantly higher rates of discounting compared to those with no or only one parent with addiction (Athamneh et al., 2017)
  • The density of familial alcoholism interacted with binge-drinking status to predict impulsive choice (Jones et al., 2017)
  • A review of age & impulsive behavior in drug addiction (Argyriou et al., 2017)
With increased confidence in Now bias as an intermediate phenotype for alcohol use disorders, our next step is better understanding the neural and biological bases of this behavior. This information may then offer a means to potentially reduce Now bias in individuals at risk for alcohol use disorders. Making at-risk individuals more future-focused could assist them in considering the long-term consequences of problematic alcohol use and reduce the temptation to drink heavily in the moment. Targeting the dopaminergic system is one potential approach to modulating Now bias as some of my and others' work has shown. Delving into that topic will have to wait for a future post, though. Stay tuned. 

Explore more of my work on Now bias:
  • Age modulates the effect of COMT genotype on delay discounting behavior
  • Ovarian cycle effects on immediate reward selection bias in humans: a role for estradiol 
  • Modulation of impulsivity and reward sensitivity in intertemporal choice by striatal and midbrain dopamine synthesis in healthy adults
  • Neural Systems Underlying Individual Differences in Intertemporal Decision-making  

And additional neuroscience topics on the blog:
  • Declining Dopamine: How aging affects a key modulator of reward processing and decision making
  • Stress and the Brain: How genetics affects whether you are more likely to wilt under pressure
  • Wanting, Liking, & Dopamine's Role in Addiction 