
Gray and white matter morphology has been investigated in detail in the PNC

A recent follow-up of this cohort reported that PS at age 11 were associated not only with a diagnosis of schizophrenia at age 38 (odds ratio 7.24) but also with diagnoses of post-traumatic stress disorder, substance dependence, depression, and anxiety. Higher rates of PS at age 11 further predicted suicide/suicide attempts at age 38, even when controlling for other psychiatric disorders at age 11 [the 15-year follow-up study of the Dunedin cohort at age 26 reported very similar findings]. It is important to note, however, that PS were assessed with the Diagnostic Interview Schedule for Children, an instrument that includes only 5 questions on positive PS. The Avon Longitudinal Study of Parents and Children (ALSPAC), with over 13,000 study participants, includes a total of 68 assessment points between birth and age 18. Niarchou et al. reported that, similar to results from the Dunedin cohort, PS at age 12 were predictive of a psychotic disorder at age 18 (odds ratio 12.7). Interestingly, nonspecific symptoms such as depersonalization and sub-psychotic unusual experiences were predictive of a psychotic disorder and depression at age 18. Even though ALSPAC also assessed only positive PS, the semi-structured PLIKSi instrument covers the three major domains of positive PS, i.e., hallucinations, delusions, and bizarre thinking, and therefore reflects a broader spectrum of PS. Overall, in the general population it appears that PS during childhood and adolescence increase the risk of later development of a broad range of psychiatric illnesses. PS in help-seeking individuals fulfilling CHR criteria may be more specific in terms of predicting psychosis onset, even though rates of co-occurring nonpsychotic disorders are also higher in these cohorts relative to the general population. Although their etiology is not well understood, PS throughout life are often preceded and accompanied by emotional and behavioral problems, which in turn are often associated with life adversities.

Findings from the ALSPAC sample further confirmed previously described risk factors. In particular, early neurodevelopmental problems such as autism spectrum symptoms, asphyxia during birth, lower IQ, and delayed early motor development were specifically associated with PS in adolescence. Bolhuis et al. highlighted emotional and behavioral problems at ages 3 and 6 as the earliest significant predictors of PS at age 10. These encompassed depressive symptoms, aggressive behavior, anxiety, sleep difficulties, attention problems, and somatic complaints. Interestingly, emotional and behavioral problems also partially explained the association between previously described risk factors, such as autistic traits and childhood adversities, and PS, rendering it likely that emotional problems are a core risk factor or precursor for later PS. Further, the authors hypothesize that PS can manifest differently across the lifespan, ranging from emotional problems in early childhood to sub-clinical PS in late childhood and adolescence, and severe mental illness in adulthood. However, difficulties in validly assessing PS in younger children could distort the true association between childhood emotional problems and PS. A twin study further supports an association between childhood emotional and behavioral problems and adolescent PS by showing a modest genetic overlap across these phenotypes. Further, a lack of certain personal resources, reflected in low optimism, low self-esteem, and high avoidance, in addition to emotional problems, has been reported as a significant predictor of PS during adolescence. Early life stress and childhood adversities are associated with emotional and behavioral problems not only in childhood and adolescence but across the lifespan.
In the largest population-based study to date, the World Health Organization Mental Health Survey, McGrath and colleagues confirmed that childhood adversities are associated with an at least two-fold increased risk of developing PS, in a dose-response relationship. Childhood adversities characterizing ‘maladaptive family functioning’ showed a somewhat stronger association with later onset of PS than ‘other childhood adversities’.

Interestingly, when adjusting for other mental illnesses with onset prior to PS, the association between childhood adversities and PS onset during adolescence became non-significant. This finding suggests that childhood adversities are a risk factor not only for adolescent-onset PS but also for other psychopathological symptoms with onset prior to adolescence, which in turn may lead to PS. Finally, an often-discussed risk factor for the subsequent development of PS is cannabis use; longitudinal results from the Netherlands Mental Health and Incidence Study reported that baseline cannabis use predicted PS at follow-up. Recent publications conclude that the evidence for this association is sufficient for policy makers to take this risk into consideration when further discussing legalizing cannabis. Recently, genetic studies have made great progress in elucidating the genetic architecture of severe mental illnesses. In the majority of cases, risk for severe mental illnesses appears to be attributable to the cumulative impact of multiple genes, where each gene individually explains only a small amount of variance, but the sum of risk alleles across all identified variants accounts for up to 18% of variance in schizophrenia diagnosis. As such, investigation of polygenic risk scores (PRS), based on effect sizes of common variants associated with schizophrenia and other disorders, has become increasingly common in population-based studies. Studies applying PRS to developmental cohorts have recently emerged. For example, in the ALSPAC cohort schizophrenia PRS was significantly associated with negative symptoms and anxiety during adolescence, but not with positive symptoms, again suggesting that the genetic basis of PS may present differently across development. In line with behavioral studies, Riglin et al. highlighted associations between schizophrenia PRS and diverse problems of childhood development at ages 7 to 9, such as lower IQ and poor social and language skills.
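The weighted-sum construction behind a PRS can be made concrete with a short sketch. This is an illustrative toy, not any cohort's actual scoring pipeline; the variant counts and effect sizes below are invented, and real pipelines add steps such as clumping, p-value thresholding, and ancestry adjustment.

```python
# Toy sketch of a polygenic risk score: the sum of an individual's
# risk-allele counts, each weighted by the effect size (log odds ratio)
# estimated for that variant in a discovery GWAS.

def polygenic_risk_score(allele_counts, effect_sizes):
    """Compute a PRS from per-variant risk-allele counts (0, 1, or 2)
    and the corresponding GWAS effect sizes (log odds ratios)."""
    if len(allele_counts) != len(effect_sizes):
        raise ValueError("one effect size is needed per variant")
    return sum(count * beta for count, beta in zip(allele_counts, effect_sizes))

# Hypothetical individual: three variants with small individual effects,
# consistent with the text's point that each gene explains little variance.
counts = [2, 0, 1]          # risk-allele dosage at each variant
betas = [0.05, 0.12, 0.08]  # invented log odds ratios
score = polygenic_risk_score(counts, betas)  # 2*0.05 + 0 + 1*0.08 = 0.18
```

In practice such raw scores are standardized within the analysis sample before being related to phenotypes.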
A recent study combined three major population-based cohorts [ALSPAC, TEDS, and CATSS], identifying significant associations between schizophrenia PRS and different symptom domains: hallucinations and paranoia, anhedonia, cognitive disorganization, and parent-rated negative symptoms. Interestingly, bipolar disorder PRS was also significantly associated with hallucinations and paranoia, even when including individuals who scored zero on this scale.

PRS for major depression was further associated with anhedonia and parent-reported negative symptoms. In a follow-up study taking a multivariate factor-analytic approach, Jones et al. found schizophrenia PRS was significantly associated with multiple psychopathology factors. However, these specific effects vanished when including a general psychopathology factor, suggesting that psychopathology during adolescence may be explained by one broad factor. PS during adolescence are rather non-specific and pose risk for a variety of severe mental illnesses. Loohuis and colleagues therefore utilized a novel multi-trait approach including PRS for a broad range of psychiatric disorders, including neurodevelopmental disorders as well as brain and cognitive traits, to assess the association between these genetic risk factors and PS in youth. Interestingly, the ADHD PRS was the only significant predictor of PS in youth of European-American ancestry in the PNC, even after removing individuals endorsing any ADHD symptoms to avoid confounds related to phenotypic overlap. This finding was replicated in a sample of help-seeking CHR individuals. Further, the association between PS and ADHD PRS was age-dependent, such that the association was strongest in younger children. It is noteworthy that for individuals under 12 years only collateral information on psychopathology was available, which could affect the results. In addition to polygenic risk, recent exome sequencing studies have also found that rare and ultra-rare variants contribute to the genetic risk of schizophrenia. Overall, findings from these studies highlight the complex association between genetic risk and PS during adolescence. While such symptoms may be non-specific and presage later severe mental illnesses, polygenic risk may be indexing global psychopathology as well as risk for specific diagnostic entities.
Importantly, because PRS are currently derived almost entirely from European cohorts, their application to non-European ethnic groups is problematic; collection of ethnically diverse samples is a research imperative. Further, while PRS are far from clinical utility in the general population, as ever-increasing GWAS sizes improve the strength of these associations, these risk scores may approach clinical utility in enriched populations in the near future. Examples of publicly available population-based datasets in youth that include multimodal imaging and neurocognitive assessments are the PNC and the Adolescent Brain Cognitive Development (ABCD) study. These samples offer unprecedented opportunities for the neuroscience community to study complex brain-behavior interactions during development. In particular, longitudinal data will allow for unique investigations of developmental trajectories. Given the young age of ABCD participants at study baseline, the study has the potential to capture the earliest signs of emotional and behavioral problems associated with subsequent severe mental illnesses. Table 2 summarizes large-scale epidemiological cohorts with multi-modal imaging. The PNC has led to a wealth of new findings regarding structural and functional brain alterations in youth experiencing PS; 1,445 youth aged 8 to 21 years were recruited from the greater Philadelphia area and underwent genotyping, multi-modal imaging, and neuropsychological testing. This sample was not ascertained for specific neuropsychiatric problems and includes multi-ethnic youth from various socio-economic backgrounds.

Exclusion criteria were limited, and included significant medical problems, intellectual disability, neurological and/or endocrine conditions, and general MRI contraindications. Importantly, all studies on PS in the PNC applied the same diagnostic criteria, offering comparability across studies. Furthermore, neuroimaging data were acquired with a single MRI scanner, reducing artifacts and heterogeneity due to scanner and study site variability. Reductions in local gray matter volume in youth experiencing PS relative to typically developing youth were observed in bilateral medial temporal lobes, and were also associated with PS severity. Further, a significant age-by-group interaction suggested that these local reductions in gray matter volume only became apparent in mid-adolescence in youth experiencing PS. This pattern of volume reductions in medial temporal regions mirrors a wealth of such findings not only in individuals with chronic schizophrenia, but also in individuals with first-episode psychosis as well as in individuals at clinical high risk for developing psychosis. Given that the medial temporal lobe in this study included both the amygdala and the parahippocampal cortex, this finding was followed up with a more detailed parcellation of the temporal lobe: whereas decreased volume of the left amygdala was associated with positive PS, decreased volume of the left entorhinal cortex was correlated with impaired cognition as well as more severe negative and disorganized symptoms, suggesting that variation in these brain structures may contribute to distinct symptom domains. Jalbrzikowski et al. subsequently investigated whole-brain morphology differences in cortical thickness, surface area, and sub-cortical volume in PS youth in this cohort, relative to both youth with bipolar mood symptoms and typically developing youth. This study found thalamic volume reductions that were specific to PS.
Again, these findings parallel those observed in individuals with overt psychosis and those at CHR, highlighting the role of the thalamus in neural system disruptions in psychosis. In terms of white matter microstructure, youth with PS also exhibited reduced fractional anisotropy in the retrolenticular internal capsule and the superior longitudinal fasciculus (SLF), possibly reflecting altered axonal diameter and/or myelination. Development of the SLF was associated with cognitive maturation in typically developing youth, an effect that was absent in youth experiencing PS. Overall, alterations of brain morphology observed in these non-clinically ascertained cohorts of youth experiencing subthreshold PS can be interpreted as further evidence for a psychosis continuum, given qualitatively similar alterations observed in individuals with overt illness and those at CHR for psychosis. In terms of functional MRI, task-based brain function and resting-state functional connectivity have both been investigated in population-based studies of PS. In the PNC, two fMRI paradigms have been acquired: an n-back task probing different working memory loads and an emotion identification task. Working memory is a higher cognitive/executive function consistently shown to be impaired in schizophrenia. Similarly, a wealth of evidence exists for impaired emotional processing in schizophrenia. Wolf et al. found reduced activation in the executive control network in response to increasing working memory demands, concomitant with worse performance, in PS youth relative to typically developing peers. Amygdala activation in response to threatening facial expressions was increased in PS youth compared to unaffected youth and was also positively correlated with positive symptom severity.

Future studies may benefit from interviewing an independent reporter of prenatal maternal alcohol use

Animal studies suggest that prenatal alcohol exposure affects DNA methylation through antagonistic effects on methyl donors, such as folate, and via long-lasting changes in gene expression. Preliminary evidence from studies of children with fetal alcohol spectrum disorder shows genome-wide differences in DNA methylation. Further research is required to examine epigenetic markers and their role in adverse outcomes among exposed youths; DNA methylation or other epigenetic markers could potentially provide objective indicators of prenatal alcohol exposure. Limitations of our study include potential maternal underreporting of alcohol use during pregnancy, imprecise retrospective data on the timing, amount, and frequency of alcohol exposure, and absence of data on trimester-specific alcohol exposure. Underreporting by mothers who indicated alcohol use during pregnancy may have inflated the observed associations, while underreporting by mothers who indicated no alcohol use when they did in fact consume alcohol would have attenuated the associations toward the null. Furthermore, data were not available on mothers who regularly consumed less than a full unit of alcohol. Therefore, youths exposed to this pattern of drinking would have been included in the unexposed group, potentially diluting outcome effects. Despite the large sample size, there were relatively few cases of youths exposed to stable light drinking throughout pregnancy, and too few cases of stable heavier drinking or increased consumption throughout pregnancy, to examine the impact on offspring. There is a larger body of existing evidence on the consequences of heavier alcohol exposure. The small sample size of youths exposed to light, stable drinking throughout pregnancy resulted in wider variance in outcome measures and may underestimate the true impact.
Other notable early-life explanatory variables that may influence the observed associations between prenatal alcohol exposure and neurobehavioral outcomes include childhood adversity and quality of parental care. These variables may mediate effects on neurodevelopment, possibly via epigenetic modifications.

The baseline ABCD Study protocol did not capture these variables, although future waves will. Longitudinal analyses of this cohort should consider these variables as possible confounding factors. In addition, we did not examine the effect of preconception paternal alcohol exposure on preadolescent brain structure, and this should be explored in future studies. In conclusion, relatively light levels of prenatal alcohol exposure were associated with small but significant increases in psychological and behavioral problems, including internalizing and externalizing psychopathology, attention deficits, and impulsiveness. These outcomes were linked to differences in cerebral and regional brain volume and regional surface area among exposed youths ages 9 to 10 years. Examination of dose-dependent relationships and light alcohol exposure patterns during pregnancy shows that children with even the lowest levels of exposure demonstrate poorer psychological and behavioral outcomes as they enter adolescence. Associations preceded offspring alcohol use and were robust to the inclusion of potential confounding factors and to stringent demographic matching procedures, increasing the plausibility of the findings. Women should continue to be advised to abstain from alcohol consumption from conception throughout pregnancy.

The opioid crisis in the United States continues to worsen, with the number of opioid-related deaths continuing to rise. Increases in deaths have come in multiple waves. The first was overdoses related primarily to prescription opioid pills; the second was driven by heroin-related overdoses; and the third has been driven by overdoses due to use of illicitly manufactured fentanyl and its analogs. Deaths related to synthetic opioids other than methadone, primarily fentanyl and its analogs, have recently increased from 9,580 in 2015 to 36,359 in 2019, and provisional counts from 2020 suggest that the number has continued to increase.
Given that the opioid crisis has continued to shift, it is important for research to continue to examine trends related to fentanyl use and overdose in order to most effectively inform prevention and harm reduction efforts. Centers for Disease Control and Prevention National Vital Statistics System (NVSS) mortality data have been the official source of information on opioid-related deaths in the US; however, results are typically lagged by about nine months. These data also tend to lack extensive information on characteristics or circumstances of overdoses, and information regarding nonfatal overdoses is not collected. In light of these limitations, alternate sources of national data could help further inform researchers and the public regarding characteristics of the ever-shifting opioid crisis, which is currently driven by fentanyl use.

Even though reports of fatal fentanyl exposures are far higher in NVSS mortality data, as these reports are believed to count all or almost all related deaths in the US, we believe Poison Control data can help complement this information. Specifically, Poison Control data are uploaded in almost real time and therefore, depending on data availability, can be used as an informative source for surveillance. National Poison Control also collects more extensive data on circumstances of exposures, and the majority of National Poison Control data are cases involving nonfatal overdoses – events for which national-level data are currently lacking. Therefore, National Poison Control data can elucidate risk factors for severity of fentanyl exposure outcomes, and determine how severity of outcomes and other circumstances of use shift over time. In this analysis, we examine trends in characteristics of fentanyl exposures in the US using National Poison Control data, and we also delineate correlates of cases experiencing major adverse effects or death. We intend for these analyses not only to complement national mortality data but also to inform prevention and harm reduction efforts as the opioid crisis continues to shift.

This study is based on a collaboration through the National Institute on Drug Abuse National Drug Early Warning System with the Researched Abuse, Diversion and Addiction-Related Surveillance (RADARS) System. Poison Control data were obtained via the RADARS System Poison Center Program. Participating Poison Control Centers (PCCs) provided all cases involving pre-identified Micromedex codes to RADARS System staff, who then reviewed select cases for accuracy. Poison Control provides treatment advice to the public and to healthcare staff treating people with suspected poisonings involving drugs, chemicals, and plants.
Information about the patient and poisoning circumstances is recorded by individual PCCs per standards set by the American Association of Poison Control Centers (AAPCC) and stored in an electronic database overseen by the National Poison Data System. The RADARS System obtained data on poisonings reported to involve fentanyl between January 2015 and December 2021.

Data were available from PCCs in all US states other than Utah prior to 2017 and North Carolina. With respect to patient characteristics, age and sex of the patient were obtained by PCC staff from the caller to the poison center, who may be the patient, a health care provider, or another contact. With regard to characteristics of reported exposures, PCCs obtained information from the caller on the reason or intention for exposure, whether other drugs were co-used, the route of administration, the management site, and the severity of the outcome. Reasons for use included substance “abuse,” substance “misuse,” suspected suicide attempt, therapeutic error, and various other categories of intentional and unintentional exposure. Unintentional exposure does not involve intentional use of another drug and typically refers to exposure among children. “Abuse” was defined by the PCC as exposure resulting from intentional improper or incorrect use of a drug in which the patient was attempting to acquire a high, a euphoric effect, or some other psychotropic effect. “Misuse” was defined by the PCC as intentional improper, incorrect, or otherwise non-medical use, but for reasons other than acquiring a psychotropic effect. Information on reason was collected by specialists in poison information (SPIs) from PCC contacts and reviewed by RADARS System staff. SPIs are instructed to determine whether the results were due to a purposeful action or not. Based on coding guidelines provided by the AAPCC, they select the most appropriate reason for use within these categories. SPIs are instructed to record the rationale for this selection in case notes, which are reviewed by RADARS System staff. In instances in which the patient is not conscious, this may impact the ability to obtain this information or create a bias due to reliance on other persons reporting. Routes of administration included ingestion, dermal administration, injection, inhalation, and other methods.
It cannot be determined, however, whether inhalation referred to insufflation or smoking. Patients were able to report multiple routes. Co-drug use was also queried, and we focused on reported co-use of alcohol, cannabis, cocaine, methamphetamine, gabapentin, benzodiazepines, and prescription opioids. Drug use was based on self-report, and toxicology test results were considered when available. All information recorded by PCC staff was reviewed for accuracy by trained RADARS System staff. Management site was the site from which the call to the PCC about the exposure was made, coded as a hospital center, on site, or patient referral. Finally, medical outcome was defined by PCC staff as none, mild, moderate, major, or death. Mild effects were defined as minimally bothersome effects, moderate effects as more pronounced or prolonged effects, and major effects as life-threatening or permanently disabling effects.

Deaths indicate that the patient was believed to have died in relation to use of the drug. Specifically, exposure-related death was either directly determined by PCC staff involved with case management or identified from death reports obtained from a medical examiner or other source. In the latter case, an AAPCC faculty review team then judged whether the reported exposure was in fact likely responsible for, or at least contributory to, the death. First, we examined trends in characteristics of fentanyl exposures. We described the prevalence of characteristics of fentanyl exposures within each separate year and then calculated absolute and relative changes in prevalence between 2015 and 2021. We also estimated whether the proportion of each category of each covariate changed over time by testing for linear, quadratic, or cubic trends using logistic regression. Next, we examined correlates of exposures resulting in a major effect or death. Covariates were fit into a multivariable generalized linear model using a Poisson distribution and log link to estimate adjusted prevalence ratios for each covariate. We imputed missing data for independent variables in the multivariable model. Multiple imputation was implemented via chained equations to handle missingness; predictors included all variables in the case-complete model. We imputed 10 datasets for the multivariable model and combined results using Rubin’s Rules. We also conducted sensitivity tests in which we repeated all analyses excluding cases reported from Connecticut. This was done because, beginning in 2018, Connecticut became the only state to mandate that emergency medical service providers suspecting fentanyl overdoses report such cases to local poison control. All statistics were conducted using Stata SE 17. This secondary analysis was exempt from review by New York University Langone Medical Center’s institutional review board.
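The descriptive trend measures and the log-link interpretation used in this kind of analysis can be illustrated with a minimal sketch. The prevalences and the coefficient below are invented for illustration, not values from the study.

```python
import math

def prevalence_changes(p_start, p_end):
    """Absolute change (in proportion units) and relative change (in %)
    of a case characteristic between the first and last year."""
    absolute = p_end - p_start
    relative = (p_end - p_start) / p_start * 100.0
    return absolute, relative

def prevalence_ratio(beta):
    """In a Poisson GLM with a log link, exponentiating a fitted
    coefficient yields the adjusted prevalence ratio for that covariate."""
    return math.exp(beta)

# Hypothetical numbers: a characteristic present in 4% of 2015 cases and
# 10% of 2021 cases, and a fitted coefficient of 0.47 for some covariate.
abs_chg, rel_chg = prevalence_changes(0.04, 0.10)  # +0.06 absolute, +150% relative
apr = prevalence_ratio(0.47)                       # adjusted prevalence ratio ~1.60
```

In the full analysis, such a model would be fit on each imputed dataset and the coefficients pooled via Rubin's Rules before exponentiation.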
In this analysis of fatal and nonfatal fentanyl-related exposures reported to PCCs in the US, we detected significant shifts in characteristics of cases between 2015 and 2021. We also determined correlates of exposures resulting in major adverse effects or death. There were many shifts in the proportions of various age groups exposed across time. Exposures among individuals in all age groups between ages 13 and 39 increased in proportion over time, especially adolescents. We also determined that children, adolescents, and adults ages 50–69 were at higher risk for experiencing a major effect or death, suggesting these age groups are at particularly high risk for morbidity after exposure. The finding of increased exposures among children and adolescents is relatively novel. Although previous studies of mortality related to synthetic opioids suggest that there were increases among all age groups from 2011 through 2016, with a 93.9% increase among those aged 15–24, increases were larger among those aged 35–44 and 25–34. As such, our findings regarding children and adolescents being at increased risk for more severe outcomes may require more focus. Also, with respect to patient characteristics, the proportion of exposures increased among males. These results corroborate mortality data, which suggest that since 2013, deaths related to use of synthetic opioids have increased at a faster rate among males than among females. In fact, in 2018, the rate of males who died from synthetic opioids was 14.2 per 100,000, compared to 5.5 per 100,000 among females.

Treatments delivered via digital platforms can be widely accessible at a population level

The application of digital technologies to better assess, understand, and treat substance use disorders (SUDs) is a particularly promising and vibrant area of scientific research. Among the many applications of digital technologies, digital tools may be useful in the screening and assessment of SUDs. Indeed, research evaluating the use of electronic screeners has demonstrated that individuals more accurately report risk behavior, including substance use and sexual risk behavior, when responding to questions posed by an electronic screener instead of by another individual. Embedding standardized, validated clinical assessments of SUDs into electronic health records may also facilitate the assessment and treatment of SUDs as part of the routine clinical workflow in a wide variety of clinical settings. Digital health interventions are interactive, self-directed software tools that can overcome some of the striking disparities in treatment access and treatment quality evident in healthcare settings across the globe. For example, digital therapeutics can teach people effective, scientifically validated skills to recognize and change unhealthy thoughts and behavior and provide tools to help people apply these skills to their everyday lives. Digital therapeutics can be available 24/7 and thus allow for “on-demand” access to therapeutic support, thereby creating unprecedented models of intervention delivery and reducing barriers to accessing care. Telehealth, the use of telecommunication technologies to deliver long-distance clinical care, may also allow SUD expert clinicians to deliver care in communities where SUD treatment needs are high but SUD workforce capacity is limited. Telehealth can be used in concert with digital therapeutics to provide real-time distance communication with SUD clinicians via video technology, complemented by digital therapeutic software that does not rely on synchronous communication with another individual but rather can be available at all times.
Digital therapeutics and telehealth models of care may be transformative in the treatment of SUDs in many ways .

As most persons with SUDs spend the majority of their time outside of a treatment facility, digital technologies can extend the reach and impact of treatment by offering anytime/anywhere SUD care. Digital tools can function like a therapist “in your pocket” and can be accessible at times when individuals struggling with SUDs may be in greatest need of therapeutic support. Additionally, a large part of the care offered in SUD treatment settings does not reflect the state of the science of SUD care. Digital therapeutics can ensure the delivery of SUD care with fidelity to the most evidence-based practices. Further, the behavioral health clinician workforce cannot meet the large population-level needs for SUD care or offer anytime/anywhere care. Digital therapeutics provide science-based, scalable solutions to meet SUD needs at a population level. This may be particularly relevant in tackling the current U.S. opioid crisis, in which the number of Americans with an opioid use disorder has surged, especially in rural communities, while the trained SUD workforce has not grown at a comparable rate. Digital technologies also afford new opportunities to examine clinical trajectories and identify novel digital biomarkers within individuals through intensive collection of individual-level data using mobile devices, wearable sensors, and mapping of digital footprints. Indeed, digital tools may capture information about individuals’ physiology “in vivo” as they live their daily lives. Specifically, mobile technologies enable ecological momentary assessment (EMA), a method that prompts individuals to respond to queries on mobile devices and enables near real-time monitoring of individuals’ behavior while they engage in daily activities. Because EMA allows for intensive longitudinal assessment in naturalistic contexts, these data offer promise to enhance our understanding of mechanisms of health behavior, including drug-taking behavior.
Digital technologies also enable passive sensing and inference from smartphones or sensing devices worn on the body, which is transforming how we understand human behavior. Mobile sensing allows for the continuous measurement of physiological and behavioral data in the real world.

This sensor data can be streamed to a smartphone and processed immediately to infer information about a person’s health behavior, physiology, and context. These data from sensors can be combined with data from self-report EMA assessments to enhance an understanding of the individual’s behavior in context. This information can then be used to trigger the delivery of interventions in real time. Further, the use of social media sites has exploded in recent years. Social media enables multi-directional communication anywhere and anytime. Social media may be leveraged to recruit individuals into research, often allowing for rapid, cost-effective recruitment of national and hard-to-reach populations. Social media data have also been used to predict many phenomena, ranging from purchasing patterns to disease epidemics, and a growing body of literature shows how social media data may enable a rich understanding of the topology and functioning of social networks and their relationships to health/risk behavior. For example, social media has been shown to contain signals of depression among individuals, such as decreased social activity, increased negative affect, highly clustered egocentric networks, and heightened concerns about relationships and medications. Also, data derived from social media have been shown to predict a range of sensitive personal attributes including sexual orientation, political views, personality traits, and use of addictive substances. Digitally derived data offer great potential to refine and advance our understanding of health behavior, including SUDs. These granular data captured in daily life allow for the development of dynamic models of SUDs to understand behavior in real time and in response to changing environmental, social, physiological, and intrapersonal factors.
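The idea of combining passively sensed and self-reported data to trigger an intervention in real time can be sketched as a simple decision rule. The thresholds, field names, and the "two converging signals" rule below are hypothetical illustrations; deployed just-in-time adaptive interventions derive their decision rules from validated predictive models, not hand-set cutoffs.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only.
STRESS_THRESHOLD = 0.7   # normalized 0-1 score inferred from wearable data
CRAVING_THRESHOLD = 6    # 0-10 EMA self-report scale

@dataclass
class MomentarySnapshot:
    stress: float           # passively sensed (e.g., physiology-derived)
    craving: int            # latest EMA self-report
    at_risk_location: bool  # geofence flag from phone location data

def should_trigger_support(s: MomentarySnapshot) -> bool:
    """Combine passive sensing and self-report to decide whether to push
    a momentary intervention (a deliberately simple rule-based sketch)."""
    risk_signals = [
        s.stress >= STRESS_THRESHOLD,
        s.craving >= CRAVING_THRESHOLD,
        s.at_risk_location,
    ]
    return sum(risk_signals) >= 2  # require converging evidence before interrupting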
These data can also help us understand when individuals may be most receptive to interventions, with a goal of providing the right type/amount of therapeutic support at the right time by adapting to an individual’s changing internal and contextual state. Collectively, these digital technologies enable an entirely new set of tools for collecting rich data about individuals’ behavior, health, and environment; for providing personalized interventions and resources based on individuals’ needs and preferences; and for building dynamic computational models to predict and characterize individuals’ changing needs and health trajectories over time.

The National Drug Abuse Treatment Clinical Trials Network (CTN), launched in 1999 by the U.S. National Institute on Drug Abuse (NIDA), has supported a growing line of research that leverages digital technologies to glean new insights into SUDs and provide science-based therapeutic tools to a diverse array of persons with SUDs. The CTN is a unique research infrastructure for conducting practical, rigorous, and highly impactful trials focused on improving the treatment of SUDs and promoting widespread implementation and sustainability of effective and accessible SUD care in community systems across the nation. Among its many contributions, the CTN has supported a broad array of innovative and impactful research projects that have leveraged digital health. This manuscript provides an overview of the digital health portfolio of the CTN and outlines a vision for the many future opportunities to leverage the unique national CTN research network to scale up the science on digital health and to examine optimal strategies for increasing the reach of, and reducing barriers in access to, science-based SUD service delivery models both within and outside of healthcare. We recognize that additional, rigorous research supported by NIDA focused on the application of digital health to SUDs has been conducted outside of the CTN; however, consistent with the focus of this journal Special Issue, this article focuses on the breadth of work within the CTN. Collectively, this research within the CTN is focused on addressing important scientific questions such as how to use digital health to routinize and standardize validated SUD screening in health care settings; how to use digital health to detect substance use in real time and in “the wild”; how to implement and scale up effective SUD treatment; and how to leverage social media for recruitment and intervention.
The majority of digital health studies in the CTN have focused on the use of electronic health records (EHRs) for SUD screening and/or assessment. One of the earliest projects in this area was the development and validation of a brief screening and assessment instrument, the Tobacco, Alcohol, Prescription Medication, and Other Substance Use (TAPS) Tool, for use in primary care patients. The TAPS tool comprises a 4-item screening survey, followed by a more detailed, substance-specific assessment of risk for any substances for which an individual has a positive initial screen. An early multi-site CTN trial with 2000 adult patients in 5 primary care clinics compared an interviewer-administered version of the TAPS tool to a version of the tool that was self-administered on a tablet computer. Results demonstrated that the interviewer- and self-administered versions of the TAPS tool had comparable diagnostic characteristics, but the self-administered version yielded higher rates of reporting of past-year alcohol, illicit drug, and prescription medication misuse. The most notable discrepancy was for reports of prescription medication misuse, such that disclosure rates were 50% higher on the self-administered version. In addition, the tool showed promising sensitivity and specificity for detecting several types of substance use disorders, including tobacco and alcohol. It also identified adult primary care patients with high risk scores on the World Health Organization’s Alcohol, Smoking and Substance Involvement Screening Test (ASSIST) as well as moderate risk scores for tobacco, alcohol, and marijuana. Overall, the TAPS tool showed a more modest ability to identify some illicit and prescription medication SUDs in comparison to the ASSIST.
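The two-stage branching structure just described (a brief screen, with substance-specific follow-up only for positive screens) can be sketched in a few lines. The substance-class labels, response options, and the "anything other than 'never' screens positive" rule are simplified assumptions for illustration; the validated item content and scoring are defined in the published TAPS instrument.

```python
def taps_stage_one(responses):
    """Stage 1: four past-12-month frequency items, one per substance class;
    in this simplified sketch, any answer other than 'never' screens positive."""
    return {substance: answer != "never" for substance, answer in responses.items()}

def substances_needing_stage_two(responses):
    """Stage 2: substance-specific risk items are administered only for the
    classes that screened positive in stage 1 (the branching logic)."""
    screen = taps_stage_one(responses)
    return [substance for substance, positive in screen.items() if positive]
```

In a tablet-based self-administered flow, `substances_needing_stage_two` would determine which follow-up question blocks the patient actually sees.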

Despite this, the TAPS tool is much briefer than the ASSIST and provides primary care providers with information about current substance use, thus underscoring its strong appeal for use in primary care. Given that visits to primary care represent an important window of opportunity to systematically screen and identify SUDs among a broad population , the TAPS tool is an example of a validated, brief and practical resource that can be routinely delivered, including in a digital format, in general medical settings. The CTN’s work has extended beyond development and validation of the TAPS SUD screening tool to evaluate the feasibility of embedding SUD screeners into EHRs in primary care and integrating screening into the primary care workflow. One trial evaluated how implementation of drug screening in primary care impacts rates of SUD assessment and subsequent care and demonstrated that screening led to an increase in SUD diagnoses, particularly cannabis use disorder diagnoses . Another multi-site study being conducted in both urban primary care and rural Federally Qualified Health Centers has identified barriers and facilitators of embedding screening into these settings and underscored the importance of clearly communicating with patients about the goals of screening to counteract stigma, addressing staff concerns regarding time and workflow, and providing SUD education and treatment resources to primary care clinicians . Several ongoing CTN projects have further extended this work to evaluate the feasibility, usability, acceptability and impact of OUD clinical decision support tools embedded in EHRs to help guide primary care providers in evidence-based treatment of OUD. Of considerable promise, and influenced by the research conducted within the CTN, the U.S. 
Preventive Services Task Force released a draft recommendation in 2019 to screen for drug use among adults in general medical settings. The CTN has had a marked impact in the field of digital therapeutics for SUDs, i.e., interactive software used to treat SUDs. The most impactful clinical trial with a digital therapeutic conducted within the CTN evaluated the clinical effectiveness of the web-based Therapeutic Education System (TES). TES is a web-based, self-directed version of the strongly evidence-based Community Reinforcement Approach (CRA) to behavior therapy developed by Azrin. This intensive behavioral treatment is designed to teach individuals with SUDs how to better understand and disrupt harmful behaviors and cognitions related to their drug-taking behavior and to develop new skills to restructure their lives. In this pivotal CTN trial, the CRA-based behavioral treatment was offered along with incentives targeting drug abstinence and treatment participation. Conducted in partnership with 10 SUD treatment sites, the trial randomly assigned individuals in outpatient SUD treatment to receive either 12 weeks of standard outpatient SUD treatment or a treatment model in which TES partially replaced 2 hours of patient-clinician therapy time or psychoeducation. This study found that participants who received TES as part of their care model had a markedly lower rate of treatment dropout and a higher rate of drug abstinence, an effect that was most evident among patients who had a drug-positive urine and/or alcohol-positive breath screen at the time of entering the study.

Independent samples t-tests probed significant group × time interactions within these 5 clusters

One summary measure from each test was chosen a priori as the best estimate of the function of that test. We factor analyzed the test battery to reduce the number of variables. Supplementary text and Table S1 provide extensive detail on the battery. We examined missing data prior to implementing multiple imputation (MI). From a sample of 1043, 953 received baseline neurocognitive testing. Of the CHR sample that transitioned to psychosis during the two-year follow-up, 89 received testing. Overall data completeness for the tested sample was 96.6% for 19 test variables. After MI, we conducted a factor analysis of the 19 neurocognitive variables. All analyses were done with SPSS, version 23. Groups were healthy controls (HCs), CHR converters (CHR+C), and non-converters (CHR-NC). T-tests, Kolmogorov-Smirnov Z, and chi-square tests were used to assess demographic comparability. Due to differences in age and maternal education, we controlled for both using MANCOVA and also controlled for site as a random-effects factor with a linear mixed model. We covaried for estimated and premorbid IQ to test the role of general intellectual ability in cognitive dysfunctions. We compared medicated vs. non-medicated groups, CHR+C vs. HC, and CHR+C vs. CHR-NC by conducting MANOVA with planned comparisons using residualized factor scores generated from the linear mixed models. To examine group cognitive profiles we residualized out age and maternal education from all neurocognitive indices. Area under the curve was calculated by the ROC program in SPSS. Prediction of conversion to psychosis and time to conversion was assessed by logistic and Cox regression. Covariates were selected based on similar prediction analyses conducted in NAPLS-1 and NAPLS-2 and entered into the model if they were associated with survival time and predicted conversion status in logistic regression. Survival time was time to the last SIPS interview or conversion, whichever occurred first.
Candidate covariates were added to the model as a block then subjected to backward selection with a criterion p value of 0.10. Candidates that survived at p ≤ .05 within domain were entered into an omnibus model.
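The covariate-selection procedure just described (enter candidates as a block, repeatedly drop the least significant term until all remaining terms meet the p ≤ 0.10 criterion, then carry forward survivors at p ≤ .05) can be sketched generically. Because the original analyses used SPSS logistic and Cox models, the model refit is abstracted here into a caller-supplied `pvalue_fn`; the function and parameter names are illustrative, not from the paper.

```python
def backward_select(candidates, pvalue_fn, drop_threshold=0.10, keep_threshold=0.05):
    """Backward elimination: `pvalue_fn(terms)` refits the model on `terms`
    and returns {term: p-value}. Terms are dropped one at a time (worst first)
    until every remaining term satisfies p <= drop_threshold; terms surviving
    at p <= keep_threshold qualify for the omnibus model."""
    terms = list(candidates)
    while terms:
        pvals = pvalue_fn(terms)
        worst = max(terms, key=lambda t: pvals[t])
        if pvals[worst] <= drop_threshold:
            break  # all remaining terms meet the retention criterion
        terms.remove(worst)
    pvals = pvalue_fn(terms) if terms else {}
    return [t for t in terms if pvals[t] <= keep_threshold]
```

In a real analysis the p-values change at every refit; wiring `pvalue_fn` to a fitted Cox or logistic model (e.g., via a statistics package) reproduces the stepwise behavior described in the text.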

Effect sizes (ESs) were calculated with Cohen’s d. Bonferroni-corrected significance for mean comparisons was set for individual tests at p < .00263 and for factors at p < .0125. In the largest and most detailed study of CHR prodromal cases, using a multi-site, case-control design and standardized assessments, we demonstrated that individuals at CHR were impaired in virtually all neurocognitive dimensions compared to controls, and this could not be accounted for by premorbid or current general cognitive ability, current depression, medications, or alcohol or cannabis abuse. ESs in comparison to HCs for Declarative Memory and Attention/WM were large for CHR+C participants. Compared to CHR-NC, CHR+C participants were significantly impaired in Attention/WM and Declarative Memory, the latter significantly predicting conversion to psychosis and time to event in concert with positive symptoms. Comparable impairments were observed in never-medicated and currently unmedicated CHR-NCs and CHR+Cs. These data demonstrate the sensitivity of neurocognitive function as a component risk marker for psychosis. Our findings support theoretical models hypothesizing Attention/WM impairments, and even more strongly impaired Declarative Memory, as central to the CHR stage. The results are consistent with NAPLS-1, in which Declarative Memory had the largest ES decrement, of roughly the same magnitude in CHR+C. The distinct profile of performance across domains, especially in CHR+C, suggests that at the incipient psychotic phase, specific forms of neurocognition are affected and are predictive of later psychosis. Within CHR participants, there was considerable variability in neurocognitive performance. CHR-NCs’ impairments were on the order of those seen in other psychiatric disorders in young people, such as attention-deficit/hyperactivity disorder. CHR+Cs’ impairments were approximately 57% larger, although smaller than those observed in first-episode schizophrenia.
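The effect-size metric and the corrected thresholds quoted above follow from two short formulas. A minimal sketch, assuming the family sizes implied by the text (19 individual test variables, and four factors consistent with the .0125 threshold):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: mean difference scaled by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def bonferroni_alpha(family_alpha, n_tests):
    """Per-test significance threshold under Bonferroni correction."""
    return family_alpha / n_tests

# 0.05 / 19 individual tests ~= .00263; 0.05 / 4 factors = .0125
```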
Analyses of individual variability and longitudinal analyses are needed to identify how profile and severity differ according to comorbid disorders, final diagnoses, and pre- vs. post-conversion status. A key question was how neurocognitive deficits are associated with medication status. Psychotropic-naive and unmedicated subgroups had significant impairments comparable to the overall CHR subgroups. Treated groups, including those on antipsychotic medications, were largely comparable to those without treatment, except that they had somewhat greater Attention/WM impairment.

These observations emphasize the essential nature of neurocognitive impairment in the CHR stage and de-emphasize the role of medications as confounders in our results. Our design precludes conclusions about causality, and future work should study the effect of medications on neurocognition in CHR populations in a prospective design. There were a number of other potentially important observations. The unexpectedly higher Verbal score that was retained in logistic and Cox regressions in concert with impaired Declarative Memory was not a significant predictor in univariate comparisons. This pattern of high verbal premorbid ability and impaired memory, coupled with the P1/P2 composite, appears to be a pernicious combination predicting conversion and needs replication. Importantly, the BVMT-R showed comparably large impairments as the two verbal memory tasks, highlighting that Declarative Memory deficits in CHR are not solely verbal and that Declarative Memory impairments are key neurocognitive risk markers. Neurocognitive tests used in concert with other clinical and psychobiological measures may enhance prediction of psychosis or functional outcome. For example, in analyses limited to two tests selected from literature review prior to these neuropsychological analyses, NAPLS-2 investigators found that the HVLT-R and BACS Symbol Coding added modest but significant independent predictive power above the clinical measures in a risk calculator algorithm for psychosis conversion, and this was replicated in an independent non-NAPLS sample. Similar results have been observed in other studies. In this study, we showed that other tests, including the BVMT-R, PAM, and ACPT QA Vigil, added significant independent variance beyond P1-P2 symptoms, underscoring the importance of neurocognitive markers.
NAPLS-2, because of its large sample from diverse geographical areas, extensive neurocognitive coverage, remarkably complete neurocognitive dataset, and large never medicated sample, allowed for a strong confirmation of neurocognitive hypotheses. The NAPLS-2 study built upon and improved the NAPLS-1 assessment, confirming and expanding prior results . This broad range of measures expanded the scope of what is known about CHR neurocognition. Limitations include the fact that most of these tests and factors are complex. Thus, while Declarative Memory is clearly affected, the tasks tapping this domain cannot parse the specific mechanisms underlying the deficits.

Further research with more molecular measures of cognition, such as those developed by CNTRACS, may allow specification of the cognitive processes underlying the deficits. We did not randomize or counterbalance the order of tests, so we cannot rule out order effects. However, the most impaired tasks were spread out across the battery from the sixth to the last tests, so there is no obvious fatigue effect.

Adolescence is a developmental period between childhood and adulthood characterized by marked physiological, psychological, and behavioral changes. Adolescents experience rapid physical growth, sexual maturation, and advances in cognitive and emotional processing. These changes coincide with increases in substance use, with alcohol being the most widely used illegal substance among adolescents. National survey data indicate that 33% of 8th grade students have tried alcohol, and this percentage increases to 70% among 12th graders. Of greater concern is the increase in heavy episodic drinking, where prevalence rates increase from 6% to 22% for 8th and 12th grades, respectively, as heavy episodic drinking during adolescence is associated with numerous negative effects on adolescent health and well-being, including risky sexual behaviors, hazardous driving, and alterations in adolescent brain development. During adolescence, the brain undergoes significant changes, and a recent longitudinal neuroimaging study suggests that heavy episodic drinking during this developmental period alters brain functioning. Squeglia and colleagues examined the effects of heavy episodic drinking on brain function during a visual working memory task, scanning adolescents at baseline and again at follow-up to compare brain activity in those who transitioned into heavy drinking during adolescence to that of demographically matched adolescents who remained nondrinkers.
Adolescents who initiated heavy drinking exhibited increasing brain activity in frontal and parietal brain regions during a visual working memory task compared to adolescents who remained nondrinkers through follow-up, who showed decreasing frontal activation, consistent with studies in typical development . Thus, adolescent heavy episodic drinking may alter brain functioning involved in working memory; however, additional longitudinal studies are needed to explore the effects of alcohol on neural correlates of other vital cognitive processes, such as response inhibition.

Response inhibition refers to the ability to withhold a prepotent response in order to select a more appropriate, goal-directed response. The neural circuitry underlying response inhibition develops during adolescence, and as such, brain response during inhibition changes across this period. Briefly, cross-sectional research indicates that brain activation during response inhibition transitions from diffuse prefrontal and parietal activation to localized prefrontal activation. Longitudinal studies report that atypical brain responses during response inhibition, despite comparable performance, are predictive of later alcohol use, substance use and dependence symptoms, and alcohol-related consequences. Together, these findings indicate that neural substrates associated with response inhibition change over time and that abnormalities in development may contribute to later substance use. To this end, the current longitudinal fMRI study examined the effects of initiating heavy drinking during adolescence on brain activity during response inhibition. We examined blood oxygen level dependent (BOLD) response during a go/no-go response inhibition task prior to alcohol initiation, then again on the same scanner approximately 3 years later, after some adolescents had transitioned into heavy drinking. Based on our previous findings, we hypothesized that adolescents who transition into heavy drinking would show reduced BOLD response during response inhibition prior to initiating heavy drinking, followed by increased activation after the onset of heavy episodic drinking, as compared to adolescents who remained non-users. By identifying potential neurobiological antecedents and consequences of heavy episodic drinking, this study will extend previous research on the effects of alcohol on brain function and point to risk factors for heavy episodic drinking during adolescence. Table 1 provides baseline and follow-up descriptive information for Heavy Drinkers and Controls.
Of note, the ability to inhibit prepotent responses improved with age, with no group differences in this improvement. The no-go versus go contrast at baseline revealed activations consistent with meta-analyses of response inhibition, showing significant clusters of activation in inferior, superior, and medial frontal gyri, and in parietal, temporal, cerebellar, and subcortical areas. Because Heavy Drinkers reported significantly more substance use than Controls at follow-up, a lifetime substance use composite and biological sex were included as covariates. A repeated-measures ANCOVA revealed significant group × time interactions in 5 regions: the bilateral middle frontal gyri, right inferior parietal lobule, left putamen, and left cerebellar tonsil. At baseline, Heavy Drinkers showed significantly less no-go BOLD contrast than Controls in all 5 clusters. Across adolescence, Heavy Drinkers exhibited increasing response inhibition BOLD contrast, and Controls showed attenuated response in these clusters. At follow-up, Heavy Drinkers showed significantly greater response inhibition activity than Controls in 4 brain regions: bilateral middle frontal gyri, right inferior parietal lobule, and left cerebellar tonsil. Exploratory post-hoc analyses examined whether change in BOLD response contrast over time correlated with subsequent alcohol involvement in Heavy Drinkers. Change over time in BOLD response contrast during no-go relative to go trials in the right middle frontal gyrus positively correlated with lifetime number of drinks at follow-up. Follow-up hierarchical linear regressions revealed that BOLD response contrast at baseline did not predict follow-up alcohol consumption after controlling for baseline alcohol, biological sex, and follow-up age at our conservative, corrected threshold. The present longitudinal neuroimaging study examined the effects of initiating heavy drinking during adolescence on brain responses during response inhibition.
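The exploratory brain-behavior test described above is a Pearson correlation between change in BOLD contrast and subsequent drinking; a minimal sketch of the statistic itself (not the neuroimaging pipeline, and with made-up variable names):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences,
    e.g., per-subject BOLD contrast change vs. lifetime number of drinks."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```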
We hypothesized, based on previous findings, that adolescents who transition into heavy drinking would show reduced BOLD response during response inhibition prior to initiating heavy drinking, followed by increased activation after the onset of heavy episodic drinking, as compared to adolescents who remained non-drinkers. Examining a longitudinal neuroimaging sample of youth both pre- and post-alcohol use initiation allowed us to address the etiology of neural pattern differences. Although group × time effect sizes were small, our findings suggest that differential neural activity patterns predate alcohol initiation and also arise as a consequence of heavy drinking.

Recent studies have shown that GlyRs are an important target for cannabinoids in the central nervous system

These data thus suggest that AEA augments ventricular myocardial IKATP through a CB2 receptor-dependent pathway, which may underlie the antiarrhythmic and cardioprotective action of AEA; by contrast, AEA exerts no effect on another inward rectifier current, IK1, in ventricular myocytes. Nevertheless, whether AEA causes an inhibitory effect on myocardial KATP channels in excised membrane patches remains to be determined. Endocannabinoid production can be induced when the cardiovascular system is functioning under deleterious conditions such as circulatory shock or hypertension; endocannabinoids are also involved in preconditioning by nitric oxide. Activation by endogenously released AEA under pathophysiological conditions may contribute to the cardioprotection afforded by sarcolemmal KATP channels. The difference in KATP channel responses to endocannabinoids between different cell types may be partly attributed to the distinct, tissue-specific molecular compositions of KATP channels or the cellular background in which the channels are expressed. Across the studies discussed above, the time course for endocannabinoid lipids and analogues to induce the potassium channel-modulating effects is generally slow, with a maximal response achieved after several minutes of continuous drug exposure. Moreover, there is no significant washout of the endocannabinoid effect upon perfusion with drug-free solution, unless the wash solution contains lipid-free BSA. In almost all of these studies, the experimentation was performed at room temperature, except the study by Gantz and Bean, where the experimentation was conducted at 37°C instead. Cannabinoids are lipid-soluble compounds, and dimethyl sulfoxide or 100% ethanol was chosen as a solvent to prepare aliquots of endocannabinoids at millimolar concentrations in these studies, with the final concentration of solvent during experiments consistently ≤ 0.1–0.15%.
Gantz and Bean showed that the maximal inhibitory effect of 2-AG on the fast inactivating A-type K+ current IA could be measured within 1 min of drug exposure.

The more prompt response to endocannabinoids observed by Gantz and Bean may be attributed, in part, to the higher temperature at which their study was carried out. AEA can modulate the functions of ion channels other than potassium channels, such as TRP vanilloid type 1 (TRPV1) channels, 5-HT3 receptors, nicotinic acetylcholine receptors, glycine receptors, and CaV and voltage-gated Na+ channels, in a manner independent of known cannabinoid receptors. Several studies are reviewed below to exemplify that, besides potassium channels, multiple ion channel types belonging to other ion channel families can also serve as molecular targets of endocannabinoids, which collectively manifests the relevance of direct modulation of various ion channels in mediating the biological functions of endocannabinoids. The TRP channel superfamily of non-selective, ligand-gated cation channels is involved in numerous physiological functions such as thermo- and osmosensation, smell, taste, vision, hearing, pressure, and pain perception. The endocannabinoid AEA is structurally related to capsaicin, the agonist for TRPV1 channels. It has been demonstrated by Zygmunt et al. that AEA induces vasodilation in isolated arteries in a capsaicin-sensitive manner and that the AEA effect is accompanied by release of calcitonin gene-related peptide (CGRP), a vasodilator peptide. This vasodilatory action of AEA is abolished by a CGRP receptor antagonist but not by the CB1 receptor antagonist SR141716A; moreover, CB1 and CB2 receptor agonists do not reproduce the vasodilation caused by AEA. Additionally, AEA concentration-dependently elicits capsazepine-sensitive currents in cells overexpressing cloned TRPV1 channels in both whole-cell and excised patch modes. These findings thus suggest that AEA induces peripheral vasodilation by activating TRPV1 channels on perivascular sensory nerves independently of CB1 receptors and consequently causing the release of CGRP.
AEA and other structurally related lipids may act as endogenous TRPV channel agonists or modulators to regulate various functions of primary sensory neurons such as nociception, vasodilation, and neurogenic inflammation. Low voltage-activated, or T-type, calcium channels, encoded by the CaV3 gene family, regulate the excitability of many cells, including neurons involved in nociceptive processing, sleep regulation, and the pathogenesis of epilepsy; they also contribute to pacemaker activities.

The whole-cell currents of both cloned and native T-type calcium channels are blocked by sub-micromolar concentrations of AEA; this effect is prevented by inhibition of AEA membrane transport with AM404, suggesting that AEA acts intracellularly. AEA concentration-dependently accelerates the inactivation kinetics of T-type calcium currents, which accounts for the reduction in channel activity. The inhibitory action of AEA on these CaV channels is independent of CB1/CB2 receptors and G proteins, and the inhibition is preserved in the excised inside-out patch configuration, implying a direct effect; furthermore, AEA has little effect on membrane capacitance, reflecting that its effects are unlikely attributable to simple membrane-disrupting mechanisms. Accordingly, it is postulated that AEA may directly target and modulate T-type calcium channels to elicit some of its pharmacological and physiological effects. High voltage-activated, dihydropyridine-sensitive L-type calcium channels are involved in excitation-contraction coupling in skeletal, smooth, and cardiac myocytes as well as the release of neurotransmitters and hormones from neurons and endocrine cells. It has been demonstrated via biochemical assays that AEA is able to displace specific binding of L-type calcium channel antagonists to rabbit skeletal muscle membranes in a concentration-dependent manner, with an IC50 of around 4–30 μM, supporting a direct interaction between AEA and L-type calcium channels. Furthermore, AEA suppresses the whole-cell currents of both native NaV and L-type calcium channels in rat ventricular myocytes in a voltage- and pertussis toxin-independent manner, indicating that the inhibitory effect of AEA does not require activation of Gi/o protein-coupled receptors such as CB1 and CB2 receptors.
Direct inhibition of NaV and L-type CaV channel function may account for some of the negative inotropic and antiarrhythmic effects of AEA in ventricular myocytes. GlyRs belong to the Cys-loop, ligand-gated ion channel superfamily that comprises both cationic receptors such as nAChRs and 5-HT3Rs and anionic receptors such as γ-aminobutyric acid type A receptors (GABAARs) and GlyRs. GlyRs are distributed in brain regions involved in pain transmission and reward, and they are thought to play a role in analgesia and drug addiction. AEA, at pharmacologically relevant concentrations, directly potentiates the function of recombinant GlyRs expressed in oocytes and native GlyRs present in acutely isolated rat ventral tegmental area (VTA) neurons through an allosteric, CB1 receptor-independent mechanism.

The stimulatory effect of AEA on GlyRs is selective, as neither the GABA-activated current in VTA neurons nor the recombinant α2β3γ2 GABAAR current in oocytes is affected by AEA treatment. The homomeric α7 receptor is one of the most abundant nAChRs in the nervous system, and it is involved in pain transmission, neurodegenerative diseases, and drug abuse. The endocannabinoid AEA has been shown to inhibit nicotine-induced currents in Xenopus oocytes expressing cloned α7 nAChRs; the inhibition is concentration-dependent, with an IC50 of 229.7 nM, and noncompetitive. In addition, pharmacological approaches using specific inhibitors uncovered that the inhibitory effect of AEA on α7 nAChRs does not require CB receptor activation, G protein signaling, AEA metabolism, or AEA membrane transport, suggesting that AEA inhibits the function of neuronal α7 nAChRs expressed in Xenopus oocytes via direct interactions with the channel. AEA is structurally similar to other fatty acids such as arachidonic acid and prostaglandins; it is possible that AEA and other fatty acids capable of modulating nAChRs share some common mechanisms of action to control channel function. It is well established that potassium channels are important players in controlling the duration, frequency, and shape of action potentials, thereby controlling cell excitability. As described above, the endocannabinoid AEA is capable of exerting CB1/CB2 receptor-independent functional modulation of a variety of potassium channels, including native BK, Ito, delayed rectifier, KATP, and TASK-1 channels, as well as cloned TASK-1, Kv4.3, Kv3.1, Kv1.2, and Kv1.5 channels. Moreover, native neuronal IA, pancreatic β-cell delayed rectifier and KATP, and atrial myocardial delayed rectifier potassium channels are subject to modulation by another endocannabinoid, 2-AG, also in a CB receptor-independent manner.
Likewise, for TRPV1 channels; for ligand-gated ion channels such as cloned and native GlyRs, cloned α7 nAChRs, and native 5-HT3Rs; and for voltage-gated ion channels such as native NaV, native L-type CaV, and native and cloned T-type CaV channels, the functional modulation elicited by AEA does not require activation of CB receptors. Interestingly, in the majority of studies reviewed in this article, the CB receptor-independent modulatory effects of AEA are induced only when endocannabinoids are introduced extracellularly to their ion channel targets, which include heterologously expressed cloned Kv1.2, hKv1.5, Kv3.1, and hKv4.3 channels and native delayed rectifier Kv channels in aortic vascular smooth muscle cells and cortical astrocytes, whereas in several reports endocannabinoids alter ion channel function only when administered at the cytoplasmic side of the membrane. These observations imply the presence of distinct interaction sites or mechanisms of action, which may be attributable to differences in the types of ion channels or endocannabinoids investigated, the cell models/cellular environments to which the channels are exposed, or the experimental protocols adopted. On the other hand, although the membrane environment seems to be critical for the regulation of signal transduction pathways triggered by G protein-coupled receptors such as CB1 receptors, current evidence does not support an involvement of changes in membrane fluidity or lipid bilayer properties in mediating the CB receptor-independent actions of AEA on ion channels.

It is also worth noting that, unlike 2-AG, which is entirely localized in lipid rafts in dorsal root ganglion cells, most AEA is found in non-lipid raft fractions of the membrane. It is therefore less likely that changes in membrane fluidity serve as a primary mechanism of action responsible for AEA's CB receptor-independent effects. Lipid signals such as endocannabinoids and structurally related fatty acids may modify the gating of voltage-gated ion channels through a direct action on the channel via a membrane lipid interaction. A model of direct interactions between ion channel proteins and endocannabinoids is further supported by the identification of specific residues in several channel proteins that are crucial for the CB receptor-independent modulatory actions exerted by endocannabinoids. For example, AEA may directly interact with, and in turn be stabilized by, a ring of hydrophobic residues formed by valine 505 and isoleucine 508 in the S6 domain around the ion conduction path of the hKv1.5 channel, thereby plugging the intracellular channel vestibule as a high-potency open-channel blocker and suppressing channel function. Molecular dynamics simulations have also helped reveal novel interactions between AEA and the TRPV1 channel at the molecular level, suggesting that AEA enters and interacts with TRPV1 in a location between the S1-S4 domains of the channel via the lipid bilayer.

Brief interventions have empirical support for acutely reducing alcohol use among non-treatment-seeking heavy drinkers. For example, randomized clinical trials of brief interventions have found favorable results among heavy drinkers reached through primary care, trauma centers, and emergency departments. Brief interventions have also shown effectiveness in reducing alcohol use in non-medical settings among a young adult college population. Given this sizable evidence base, there is considerable interest in understanding the underlying mechanisms toward optimizing this approach.
Neuroimaging techniques allow for the examination of the neurobiological effects underlying behavioral interventions, probing brain systems putatively involved in clinical response to treatment. To date, one study has examined the effect of a motivational interviewing-based intervention on the neural substrates of alcohol reward. In this study, neural response to alcohol cues was evaluated while individuals were exposed to change talk and counter change talk, which are thought to underlie motivation changes during psychosocial intervention. The authors reported activation in reward processing areas following counter change talk, which was not present following exposure to change talk. Feldstein Ewing and colleagues have also probed the origin of change talk in order to better understand the neural underpinnings of change language. In this study, binge drinkers were presented with self-generated and experimenter-selected change and sustain talk. Self-generated change talk and sustain talk resulted in greater activation in regions associated with introspection, including the inferior frontal gyrus and insula, compared to experimenter-elicited client language. These studies employed an active ingredient of MI within the structure of the fMRI task, thus allowing for a more proximal test of treatment effects. Neuroimaging has also been used to explore the effect of psychological interventions specifically focused on alcohol motivation on changes in brain activation. For example, cue-exposure extinction training, a treatment designed to prevent return to use by decreasing conditioned responses to alcohol cue stimuli through repeated exposure to cues without paired reward, has also been evaluated using neuroimaging.

Examining the variability in recognition over time within this study is still meaningful

In summary, considering the HIV literature, the middle-aging literature, and the finding that episodic memory was associated with prefrontal structures rather than medial temporal lobe structures, episodic memory in middle-aged PWH is more likely related to frontally mediated etiologies. This could indicate that memory in middle-aged PWH is associated with HIV disease. Notably, this association was observed in PWH on ART without a detectable viral load, indicating that it holds even in PWH who are virally suppressed. However, it is of course difficult to differentiate between the effect of HIV itself, the effect of comorbid conditions, many of which may be increased in PWH due to the downstream effects of HIV and ART, or a combination of the two. The medial temporal lobe was not associated with episodic memory, which overall may indicate that, at this age range, preclinical AD is not likely a contributor to memory functioning. However, the middle-aging literature does not provide a good estimate of when, on average, to expect to start detecting differences, even small ones, in memory and medial temporal structures in those on an AD trajectory; therefore, it is possible that this group is too young for any preclinical AD effect to be detectable. This is further complicated because the middle-aging literature is demographically different from the CHARTER sample, thus highlighting the need for more diverse aging studies. Additionally, this study did not specifically examine differences in the associations between memory and brain structures by AD risk; thus, future research should examine memory associations by AD risk, particularly given that APOE status was associated with delayed recall. Relatedly, these findings show that, on average, this group does not exhibit associations between memory and the medial temporal lobe or early signs of preclinical AD, but this does not mean that no participants are on an AD trajectory.
In fact, given base rates, some of this group will eventually develop AD. First, the multi-level models examining the cross-level interactions between time and medial temporal structures, with dichotomous recognition as the outcome, did not converge. This analysis would have examined whether baseline medial temporal lobe structures are associated with a greater likelihood of impaired recognition over time. The failure to converge indicates that the models were overparameterized and not supported by the data.

This was possibly affected by the modest sample size, with a particularly small group of participants with impaired recognition at baseline. For example, of the 12 participants who were impaired at baseline, only two remained impaired. Moreover, of those who were not impaired at baseline but were impaired at some point in time, most reverted to unimpaired at subsequent visits. Only four participants remained impaired in recognition over time, although with limited follow-up. There are no data on why these participants lack additional follow-up, and thus it is hard to draw any definitive conclusion as to whether consistently impaired recognition is a risk factor for negative outcomes. However, it would certainly be warranted to examine whether consistent recognition impairment is associated with negative outcomes in a larger group of middle-aged and older PWH. For example, this small group of participants who were consistently impaired in recognition memory could represent those who are progressively declining and are on more of an AD trajectory. Moreover, a better understanding of how those who are consistently impaired differ from those who revert to unimpaired recognition would be beneficial. There are multiple reasons that may explain why recognition impairment status was variable over time. First, HIV-associated neurocognitive impairments are known to fluctuate over time. For example, in the CHARTER study, 17% of the sample improved over time. Therefore, this could simply reflect the heterogeneous and fluctuating course of HAND over time. Second, recognition is sometimes used as an embedded performance validity measure. While all participants were administered a standalone performance validity test at the beginning of the neuropsychological evaluation to verify credible test performance, effort can fluctuate throughout testing.
That said, none of the participants at baseline were below the proposed cut-off of ≤5 for HVLT-R recognition, making this explanation less likely. Lastly, this variability over time may be due in part to the psychometric properties of the HVLT-R and the BVMT-R.

Recognition scores for both the BVMT-R and the HVLT-R are skewed, with known ceiling effects, meaning that there is limited variability in this variable. Therefore, a one- or two-point difference can result in large differences in the normative score. Moreover, there are known modest interform differences in HVLT-R recognition. Additionally, while HVLT-R and BVMT-R recognition show adequate test-retest stability coefficients, recognition is less reliable than other test measures such as total learning or delayed recall. Next, longitudinal delayed recall was examined. Most notably, there was little decline in delayed recall over time; the delayed recall T-score decreased by 0.041 per year. Additionally, there was little variability in this slope, given that its standard deviation was 0.678. None of the cross-level interactions between medial temporal lobe structures and years since baseline were significant, indicating that medial temporal lobe structures at baseline were not associated with a change in delayed recall. However, given that there was little variability in delayed recall over time, this was not surprising. As discussed in the introduction, worse baseline medial temporal lobe structures, particularly the hippocampus and entorhinal cortex, have been associated with an increased risk of future AD, MCI, and cognitive decline in older adults without HIV. This relationship is less understood in middle age. One study by Gorbach et al. found that hippocampal atrophy was associated with a decline in episodic memory in adults over the age of 65 but not in middle-aged adults between the ages of 55 and 60. As highlighted above, it is possible that the cohort from the current study is too young to expect to see associations between medial temporal lobe structures and longitudinal memory.
Importantly, the current study only examined cross-sectional structural MRI; therefore, we cannot assume that smaller or thinner medial temporal lobe structures are indicative of atrophy. Additionally, this study does not have an HIV-negative comparison group and did not use normatively adjusted morphometric values, so it is unclear whether participants in this cohort deviate from average, although accelerated brain atrophy has been demonstrated in PWH previously.
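The longitudinal analysis described above, a per-participant slope of delayed recall over years since baseline with cross-level interactions testing whether baseline medial temporal lobe measures predict that slope, can be sketched as a mixed-effects model. This is a hypothetical illustration on simulated data: the variable names (`hippo`, `recall`), sample sizes, and coefficients are invented, and the study's actual software and covariates are not specified here.

```python
# Hypothetical sketch of a multilevel model of delayed recall T-scores over
# years since baseline, with random intercepts and slopes per participant and
# a cross-level interaction with a (made-up, z-scored) baseline MTL measure.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 60, 4
hippo_subj = rng.normal(size=n_subj)                 # baseline MTL measure (z-scored)
intercepts = 50 + rng.normal(scale=5, size=n_subj)   # person-specific starting level
slopes = -0.04 + 0.3 * hippo_subj + rng.normal(scale=0.7, size=n_subj)

rows = []
for i in range(n_subj):
    for t in range(n_visits):
        rows.append({
            "id": i,
            "years": float(t),
            "hippo": hippo_subj[i],
            "recall": intercepts[i] + slopes[i] * t + rng.normal(scale=3),
        })
df = pd.DataFrame(rows)

# Random intercept and random slope for years; the years:hippo fixed effect is
# the cross-level interaction of interest (does baseline MTL predict the slope?).
model = smf.mixedlm("recall ~ years * hippo", df, groups=df["id"], re_formula="~years")
result = model.fit()
print(result.params["years:hippo"])
```

A non-significant `years:hippo` coefficient corresponds to the null cross-level interactions reported above; with a dichotomous outcome such as impaired recognition, the analogous logistic multilevel model is harder to fit and, as the text notes, failed to converge in the study.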

Therefore, research examining changes in the medial temporal lobe, and how that change relates to episodic memory, particularly recognition memory, in persons with and without HIV over the age of 65 is needed. This research may help to better determine whether medial temporal lobe structures are associated with the risk of an AD trajectory and whether these associations differ by HIV serostatus. While there may be some individuals in this group who are experiencing objective decline, on average, in this group of middle-aged PWH we did not observe a decline in delayed recall T-scores over time. These T-scores are age-corrected, so the raw scores on the tests may be declining, but they are not declining at a rate greater than what would be expected for age. Additionally, these T-scores account for practice effects, which, if unaccounted for, can mask decline, although the best method of practice-effect correction is still debated. Similar results showing stable cognition over time were found in a study by Saloner et al. in a larger sample of CHARTER participants aged 50 and over. That study employed growth mixture modeling, and none of the three latent classes demonstrated a decline in global T-score over time. However, other studies of PWH over the age of 50 have observed a greater than expected effect of aging on episodic memory, and a recent systematic review found accelerated neurocognitive aging in 75% of longitudinal studies in PWH. Some researchers have questioned whether accelerated aging could be due to a neurodegenerative cause such as AD, given the high prevalence of risk factors for AD in PWH, such as chronic inflammation, increased cardiometabolic comorbidities, and lower brain reserve. While emerging studies have demonstrated some possible ways to disentangle HAND and aMCI, it remains unclear whether PWH are at increased risk of AD or whether a neurodegenerative etiology could, at least in part, account for some of the observed accelerated aging.
For example, Milanini et al. showed a low frequency of amyloid positivity, measured via PET imaging, among virally suppressed PWH over the age of 60, and the rates of amyloid positivity were similar to published rates in an age-matched seronegative sample. However, a recent study among Medicare enrollees did find a higher prevalence of AD and related disorders among PWH. In summary, this aim showed that recognition was variable over time. While amnestic decline could not be specifically tested given that the recognition models did not converge, these analyses indicated that, within this group, medial temporal lobe integrity was not associated with a decline in delayed recall over time. Additionally, delayed recall declined only marginally over time, thus adding to the mixed literature examining episodic memory in middle-aged and older PWH.

Overall, this study did not detect clear signs of preclinical AD in this group, as delayed recall did not change over time and baseline measures of medial temporal lobe integrity were not associated with memory over time, as has been seen in HIV-negative older adults. However, it is not clear whether these associations would be expected in a middle-aged cohort of PWH, due to a lack of literature on this topic in middle-aged adults. Therefore, it would be beneficial to re-examine this analysis in an older cohort of PWH.

The last aim of this study was to examine whether the medial temporal lobe mediates a relationship between peripheral inflammation and memory. It was hypothesized that medial temporal lobe structures would mediate a relationship between peripheral inflammation and episodic memory. Five peripheral biomarkers of inflammation were examined; these biomarkers were chosen because they have been associated with cognition in AD and HIV. In this mediation model, the association between peripheral biomarkers of inflammation and medial temporal lobe structures was also explored, and the relationship between medial temporal lobe structures and memory was also reported, although the latter relationship was already explored in aim 1. First, the mediation models examining recognition indicated poor model fit. Therefore, the relationship between the five plasma biomarkers of inflammation and recognition was examined instead. Greater levels of plasma CRP were associated with lower odds of having impaired recognition. None of the other plasma biomarkers of inflammation were associated with recognition impairment. These findings are generally not in line with the HAND, middle-aging, or older adult literature. Aging and HIV studies have found that greater concentrations of these plasma biomarkers of inflammation are associated with greater risk of HAND, worse memory, and an increased risk of future development of MCI or AD.
However, many of these studies find only weak associations, and they do not examine recognition memory. The current study had a very small sample of PWH with impaired recognition; thus, it is possible that the CRP finding is spurious, and it should not be over-interpreted. These analyses should therefore be re-examined in a larger, more generalizable sample. Next, a single-mediator model was used to examine whether medial temporal lobe structures mediate the relationship between plasma biomarkers of inflammation and delayed recall. In the entire sample, none of the plasma biomarkers of inflammation were significantly associated with any of the medial temporal lobe structures, there were no significant direct effects between the plasma biomarkers of inflammation and delayed recall, and no mediated effect was established. As stated above, the lack of association between inflammation and delayed recall is somewhat surprising given that an association between inflammation and worse cognition has been demonstrated in the HAND and aging literatures, although the effect sizes are often small and the middle-aging literature is limited. Additionally, some of the peripheral inflammatory markers examined in this study have been associated with medial temporal lobe integrity and function in older adults, but the association between inflammation and the medial temporal lobe is much less studied in mid-life and in PWH.
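The single-mediator model referred to above can be illustrated with the classic product-of-coefficients decomposition: path a (exposure to mediator), path b (mediator to outcome, adjusting for the exposure), and the indirect effect a*b. The code below is a simulated sketch, not the study's actual analysis; the variable names (`crp`, `hippo`, `recall`) and all coefficients are hypothetical, and in practice a bootstrap confidence interval would be computed for the indirect effect.

```python
# Hypothetical single-mediator sketch on simulated data:
# inflammation marker (crp) -> MTL structure (hippo) -> delayed recall.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
crp = rng.normal(size=n)                            # inflammation marker (z-scored)
hippo = 0.5 * crp + rng.normal(scale=0.8, size=n)   # mediator: MTL measure (z-scored)
recall = 50 + 0.6 * hippo + 0.1 * crp + rng.normal(size=n)
df = pd.DataFrame({"crp": crp, "hippo": hippo, "recall": recall})

# Path a: exposure -> mediator.
a = smf.ols("hippo ~ crp", df).fit().params["crp"]
# Path b and the direct effect c': mediator -> outcome, adjusting for exposure.
fit_b = smf.ols("recall ~ hippo + crp", df).fit()
b, direct = fit_b.params["hippo"], fit_b.params["crp"]

indirect = a * b  # mediated (indirect) effect
print(round(indirect, 2), round(direct, 2))
```

In the study's data the a-paths were null, which by itself rules out a mediated effect under this decomposition regardless of the b-paths.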

These inflammatory markers were not significantly associated with change in any other cognitive domain

To clarify, in the HIV literature, “older” PWH usually refers to PWH aged 50 and over; however, in the aging literature, “older” usually refers to people aged 65 and older, and “middle-age” refers to people aged 45 to 64. To rectify this discrepancy in terminology, the aging literature terminology will be used when discussing both the HIV literature and the aging literature.

HAND remains prevalent in the ART era. While the pathogenesis of HAND is not entirely clear, HAND is thought to be the result of the neurotoxic cascade initiated by HIV. The majority of neurocognitive deficits associated with HAND are in the mild range and do not significantly impact everyday functioning, and executive functioning, learning, and memory deficits are most common. Importantly, longitudinal studies have shown that HAND is usually non-progressive. AD is a neurodegenerative disease associated with progressive cognitive and functional impairment. AD is the most common cause of dementia, and it affects 10% of persons without HIV over the age of 65 and 17% of those between the ages of 75 and 84. AD is characterized by the accumulation of amyloid plaques and tau tangles in the brain, which start in the medial temporal lobe and result in initial atrophy of the medial temporal lobe and, later, more widespread atrophy. These brain changes start years to decades before clinical symptoms appear. On neuropsychological testing, AD typically presents initially with impairment in memory, which progresses to global impairment and loss of independent functioning. Mild cognitive impairment is defined as the transitional stage between cognitively normal and major neurocognitive impairment, in which persons have observable cognitive deficits that do not yet significantly impact everyday functioning. MCI can be further divided into amnestic and non-amnestic sub-types, with aMCI being more associated with AD.
While participants are often dichotomized as “MCI” or “cognitively unimpaired,” the cognitive decline associated with AD is insidious; therefore, even milder deficits in memory in participants classified as cognitively unimpaired are associated with underlying AD pathology such as amyloid accumulation or medial temporal lobe atrophy. Due to the overlap in cognitive presentation, middle-aged and older PWH are at risk of being erroneously classified as having HAND, due to their HIV diagnosis, when they may instead be on an AD trajectory.

Given that aMCI is associated with progressive cognitive and functional impairment, as opposed to HAND, which is more stable, it is imperative that the etiology of the cognitive impairment be correctly identified. While there is currently no cure for AD, a misdiagnosis of HAND when a person with HIV has aMCI limits the opportunity for early intervention, when interventions may be most beneficial. For example, early identification of AD allows more time for life planning and the acquisition of compensation strategies, which may prolong independent functioning and, by extension, sustain quality of life. Furthermore, accurate diagnosis is important to allay concerns in PWH without indication of an AD trajectory. It is hypothesized that PWH may be at increased risk of AD due to the compounding effects of HIV and aging on the brain, chronic inflammation despite viral suppression, increased prevalence of vascular and metabolic risk factors, and potentially common pathophysiological pathways. While little work has been done in this space, several recent case reports on AD in PWH have highlighted the risk of delayed diagnosis, detailed complications in determining the etiology of cognitive impairment, and underscored the clinical need for tools to differentiate HAND and aMCI. Additionally, there is some evidence from the HIV and aging literature to suggest that memory may be particularly affected in older PWH; however, most of these studies do not consider other etiologies that may be contributing to the observed findings. For example, Goodkin et al. found a greater than expected effect of aging on episodic memory in PWH aged 50 and over, and Seider et al. found that verbal memory declines more rapidly with age in PWH than in HIV-negative comparison participants.
Moreover, in a recent study using latent class analysis to examine a group of PWH aged 50 and over, we found that three classes emerged: a multidomain impaired group, a learning and memory impaired group, and a cognitively unimpaired group. Due to the medial temporal lobe involvement in aMCI, the cognitive profile is described as “amnestic,” with encoding, storage, and rapid-forgetting deficits observed as poor learning, recall, and recognition on memory tests. Conversely, HIV particularly impacts fronto-striatal systems, and the frontostriatal involvement associated with HAND accounts for a “subcortical” cognitive presentation. Thus, memory deficits in HAND are characterized by relatively normal memory storage and retention but impaired encoding and retrieval, resulting in poor learning and delayed recall but intact recognition.

This “subcortical” presentation in HAND has been observed even as PWH age. Therefore, recognition may be more indicative of aMCI than HAND and a useful tool for differential diagnosis. However, because recognition has historically been spared in HAND, and only recently have PWH been reaching the ages at which they may develop aMCI/AD, there is little research examining recognition deficits in the context of HIV. Of note, deficits in other domains are unlikely to aid in differential diagnosis without further research. For example, while aMCI is characterized by memory deficits, other deficits, such as executive dysfunction, are also quite common in aMCI and AD. Therefore, the presence of executive functioning deficits, which are common in HAND, could be indicative of HAND, aMCI, or a mixed HAND and aMCI profile. Moreover, biomarkers may aid in differential diagnosis in the future; however, elevated amyloid beta is observed in HIV, so more research is needed before biomarkers can be beneficial in the differential diagnosis of HAND and aMCI. Our research group at the HIV Neurobehavioral Research Program has begun to examine neuropsychological methods to identify aMCI among PWH using adapted Jak/Bondi MCI criteria. The Jak/Bondi MCI criteria are empirically based criteria that have been shown to have stronger associations with AD biomarkers and to identify more participants who progress to dementia than traditional MCI diagnostic approaches. Our group adapted the Jak/Bondi criteria to capitalize on the neuropsychological differences between HAND and aMCI. Thus, aMCI was defined as impairment on at least two memory tests, with the adaptation that at least one impaired test be a test of recognition. In a sample of 80 PWH from the National NeuroAIDS Tissue Consortium with neuropathologically characterized Aβ42 and neuropsychological testing within a year of death, 40 participants met the adapted criteria for aMCI.
Twenty-nine of the participants with aMCI were also classified as having HAND. The aMCI group was 3.5 times more likely to have Aβ42 plaques present. Conversely, when the same sample was split into HAND and no-HAND groups, the presence of Aβ42 plaques was not significantly associated with the HAND group. In sum, these findings provide preliminary data to further support that aMCI may go undetected in a large proportion of PWH with HAND, and these PWH may be misclassified or have a mixed HAND and aMCI profile. Second, these preliminary analyses also suggest that recognition deficits in older PWH are sensitive to AD pathology.

Magnetic resonance imaging has shed light on brain changes associated with aMCI and AD and is increasingly used in the clinical assessment of suspected AD. Medial temporal lobe atrophy is a core feature of aMCI/AD and has been shown to correlate with disease progression and to predict progression from cognitively normal to aMCI. However, AD is also associated with more widespread cortical and subcortical atrophy and white matter abnormalities, particularly as the disease progresses. While neuroimaging has been used extensively to study aging and AD, most of these neuroimaging studies exclude PWH. Consequently, it is unclear whether aging/AD research is generalizable to older PWH. HIV has historically been associated with early changes to fronto-striatal circuits, although recent neuroimaging studies also report cortical atrophy. Similarly, HAND has been associated with fronto-striatal circuits and, in more recent years, with more cortical structures. Neuroimaging studies have examined neuroanatomical correlates of delayed recall as well as the effect of age on the brain within the context of HIV. Studies comparing PWH with HAND and HIV-negative participants with MCI or AD have shown that hippocampal volumes were able to discern HAND from MCI/AD. Additionally, within the context of HIV, decline in memory has been associated with hippocampal atrophy. However, there are notable limitations to the current literature. For example, most studies have been couched in the context of HAND, are not aimed at examining aMCI within the context of HIV, and do not consider other etiologies. Additionally, several neuroimaging studies examining the effect of aging in PWH have samples with mean ages in the late 30s or early 40s, which is likely before the initiation of AD pathology. Moreover, memory recognition, which could improve differentiation of HAND and aMCI, was not examined in these studies.
Both HIV and aMCI are associated with chronic, low-grade inflammation. As such, inflammation may be one biological mechanism that puts PWH at greater risk of aMCI. Peripheral inflammatory markers can cross the blood-brain barrier, and there is mounting evidence to support the hypothesis that chronic inflammation exacerbates both Aβ42 and p-tau pathology and plays a role in the pathogenesis of AD. There is ample evidence linking increased inflammation to brain atrophy, cognition, and cognitive decline in late life, with emerging evidence that this link is present even in midlife. Chronic inflammation is also present in PWH despite viral suppression and is hypothesized to contribute to and exacerbate HAND. Due to this overlap, inflammation may be one factor that also puts PWH at greater risk of aMCI/AD. While the literature has highlighted the need to investigate this association, little research currently exists. Determining how inflammation impacts brain integrity and cognition in middle-aged PWH could have great implications for our overall understanding of the role of inflammation in AD and for the development of early intervention strategies to lower the risk of AD in PWH. I have begun to examine the relationship between inflammation and change in memory. These preliminary analyses included 57 PWH aged 50 and older with peripheral inflammatory markers and neuropsychological testing at baseline and at 1-year follow-up. Overall, I found that baseline concentrations of inflammatory biomarkers were not associated with baseline memory performance. However, using multivariable linear regressions, IL-6 and TNF-α were associated with decline in delayed recall, and greater baseline concentrations of CCL2 were associated with decline in recognition. Overall, these findings support the hypothesis that inflammatory markers may be related to cognitive changes associated with abnormal memory decline.
As AD drug trials targeting amyloid continue to fail, there is increased focus on repositioning current drugs, such as anti-inflammatory drugs, to reduce the risk of AD. Epidemiological studies have shown that persons taking anti-inflammatory drugs for diseases such as rheumatoid arthritis had a reduced risk of developing AD. Moreover, small randomized controlled trials examining anti-inflammatory drugs such as TNF-α inhibitors, though preliminary, have yielded encouraging results. If larger studies show that anti-inflammatory drugs can lower the risk of AD, PWH may particularly benefit. Brain changes associated with future cognitive decline are evident in midlife, several years before cognitive impairment in aMCI and AD. Additionally, longitudinal research studies have shown that more subtle differences in episodic memory in midlife are associated with a decline in memory years later. Moreover, Jak et al. found that midlife memory performance is associated with hippocampal atrophy. As a result, there has been a shift in the aging field toward characterizing and identifying middle-aged adults in the preclinical phase of AD, rather than primarily focusing on elderly cohorts in which symptoms and pathology are already present. Furthermore, there is a growing literature suggesting that midlife risk factors are associated with future cognitive decline, suggesting that midlife may be a critical time point when some interventions may be efficacious in augmenting cognitive trajectories. In the HIV literature, most aging research has focused on PWH in midlife.

Elevated levels of BI may confer risk for both anxiety disorders and substance use disorders

These high levels of sensitivity to uncertain threat in individuals with alcohol use disorder are also positively associated with self-reported coping motives for use. To further explore BI’s potential role as a risk or protective factor for substance use, studies have examined the effects of BIS and BAS levels on substance use outcomes. These studies have focused on undergraduate populations and yielded mixed results, with some studies showing no association between BIS levels and substance use, others showing a positive association between BIS levels and substance use problems, and still others showing a positive association between BIS levels and substance use only at high BAS levels. Given these conflicting results, Morris et al. used a cross-sectional design to examine whether BIS and BAS were indirectly associated with alcohol problems through coping and conformity motives among undergraduate students. Results indicated that those high in BIS levels were more likely to experience alcohol problems due to greater coping and conformity motives for use. Importantly, this finding was independent of levels of BAS, and high BAS levels only further strengthened these relationships. Taken together, these results highlight BI’s nuanced pathways toward high or low risk for substance use and demonstrate the need to investigate additional factors contributing to the relationships between BI and substance use. Ethnicity may be one such important moderator of the relationships between BI, anxiety, and substance use. Hispanic/Latinx (H/L) youth have consistently displayed increased rates of anxiety symptoms, anxiety disorders, and initial rates of substance use when compared to their non-H/L peers. The greater frequency and intensity with which H/L youth experience threats, including increased exposure to crime, community violence, chronic stress, and racial discrimination, may heighten levels of BI in H/L youth.
In fact, H/L adults have displayed increased attentional biases to threat as compared to non-H/L adults. Cultural values may also further impact BI’s association with anxiety in H/L youth. Schneider and Gudiño showed a positive relationship between BI and anxiety symptoms in H/L adolescents, and that this relationship was strongest for those H/L adolescents reporting high levels of Latino cultural values.

More specifically, H/L youth may also experience increased anxiety due to heightened social stigma of mental illness in the H/L community and factors related to collectivist cultural values, immigration, and acculturation that, especially in combination, put H/L youth at increased risk when compared to other racial/ethnic groups. It is possible that the combination of increased exposure to stressors and traumatic experiences, as well as the context of heightened social stigma and collectivist cultural values, may dissuade H/L youth from utilizing social support as a form of coping with their anxiety. Since high levels of BI may lead to alcohol problems through coping and conformity motives, and H/L youth may experience greater exposure to substance use given their high initial rates of use, H/L youth high in BI may be uniquely at risk for substance use. Thus, H/L ethnicity may moderate the relationship between BI and substance use. However, it is presently unclear whether the strengths of the relationships between BI, anxiety, and substance use differ by H/L ethnicity. Therefore, the present study prospectively investigated the relationships between BIS scale scores, anxiety, and substance use, and whether H/L ethnicity moderates these relationships, in youth from the Adolescent Brain Cognitive Development (ABCD) Study at baseline, 1-year follow-up, and 2-year follow-up. Logistic regressions were conducted using the “glm” function in R to evaluate the impact of baseline BIS scores, and the interaction between baseline BIS scores and ethnicity, on past-year substance use at 1-year follow-up and at 2-year follow-up. Linear regressions were conducted using the “lm” function in R to evaluate the impact of baseline BIS scores, and the interaction between baseline BIS scores and ethnicity, on 1-year and 2-year follow-up CBCL DSM-5 anxiety problems T-scores.
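The moderation analyses described above hinge on an interaction term. As a minimal sketch (in Python rather than the R used in the study, and without the covariates the actual models included), the design matrix contains BIS, an H/L indicator, and their product, so the BIS slope is allowed to differ by ethnicity:

```python
# Sketch of how a moderation (interaction) model is set up: each row of the
# design matrix includes BIS, an H/L dummy, and their product. Values are
# hypothetical; the study fit these models in R with glm()/lm() plus covariates.

def design_row(bis, hispanic):
    """Intercept, BIS score, H/L dummy (1 = H/L), and BIS x H/L interaction."""
    return [1.0, bis, float(hispanic), bis * float(hispanic)]

rows = [design_row(2.5, True), design_row(2.5, False)]
print(rows[0])  # [1.0, 2.5, 1.0, 2.5]
print(rows[1])  # [1.0, 2.5, 0.0, 0.0]
```

Under this coding, the fitted BIS slope for non-H/L youth is the BIS coefficient alone, while for H/L youth it is the BIS coefficient plus the interaction coefficient, which is exactly what a significant interaction for H/L youth reflects.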
All analyses were conducted in version 4.2.3 of R. While the majority of youth did not report substance use at baseline, 0.50% of H/L youth and 0.42% of non-H/L youth did endorse use at baseline. At 1-year follow-up, 0.22% of the sample endorsed any substance use, and 0.74% did so at 2-year follow-up.

Baseline past-year use days was dichotomized into no past-year substance use and any past-year substance use and included as a covariate. Past-year substance use at baseline was included in both models in which past-year substance use at follow-up was an outcome. Past-year substance use at 1-year follow-up was also included as a dichotomous covariate in the model predicting past-year substance use at 2-year follow-up. To control for their effects on anxiety and substance use, all models included the following covariates: mean baseline BAS scores, race, sex, age, highest parental income, and highest parental education.

BI has been shown to concurrently and prospectively predict anxiety, while findings on the association between BI and substance use have been mixed. It is possible that the relationship between BI and substance use varies by social and contextual factors; H/L youth in particular may show stronger relationships between BI, anxiety, and substance use. The present study evaluated the prospective relationships between BIS scores, anxiety, and substance use in youth across the 1- and 2-year follow-ups of the ABCD study and whether these relationships differed by H/L ethnicity. Results indicated that baseline BIS scores prospectively and positively predicted anxiety symptoms at both 1- and 2-year follow-ups. The relationship between baseline BIS levels and follow-up levels of anxiety did not differ by ethnicity. Baseline BIS levels also prospectively predicted increased likelihood of substance use at 2-year follow-up, but only for H/L youth and not for non-H/L youth. No main effects of, or interactions between, ethnicity and BIS scores were found on substance use at the 1-year follow-up. The results showing that baseline BIS scores prospectively and positively predicted anxiety symptoms across the follow-ups are consistent with prior literature on the relationship between BI and anxiety.
While prior studies have shown that H/L youth report higher levels of anxiety than non-H/L youth, the present study did not find any ethnic differences in the strength of the relationship between BI and anxiety. It is possible that ethnic differences in anxiety depend on the measure of anxiety used. H/L youth are more likely to experience and report physiological symptoms of anxiety, and the CBCL DSM-5 anxiety problems scale may therefore not best represent H/L youth’s experience of anxiety.

Additionally, other risk factors for anxiety may play a more important role in H/L youth’s experience of anxiety and better explain the ethnic differences in anxiety in youth. For example, individual differences in sensitivity to uncertain threat may be a stronger predictor of anxiety, particularly for H/L youth. Results related to the relationship between BIS scores and substance use varied across the follow-up years. The lack of association between BIS scores and likelihood of substance use at 1-year follow-up may be due to the fact that substance use at 1-year follow-up was infrequent and did not greatly increase from baseline. While overall substance use increased at 2-year follow-up in the sample, BIS scores predicted increased likelihood of substance use only in H/L youth. Similar to the results of Morris et al., these results were independent of levels of BAS scores. This finding is also consistent with results from Chen and Jacobson showing that H/L youth have the highest initial rates of substance use. H/L youth’s increased exposure to crime, community violence, chronic stress, and racial discrimination may also increase coping and conformity motives, which in turn may increase the likelihood of substance use. It is possible that high BI, in addition to, or in conjunction with, additional risk factors such as increased access to substances, reduced parental monitoring, and association with deviant peers, may uniquely contribute to risk for early substance use in H/L youth. Further research is needed to understand whether and how such risk may change as rates of substance use change across development. The present study had several limitations and future directions.
While the longitudinal nature of the ABCD study allowed for the investigation of prospective and not just concurrent relationships between BIS scores, anxiety, and substance use, it is possible that the age of the sample at baseline and through the follow-ups is still too young to best capture these relationships. As BI is often first assessed in infancy or early childhood, the strength of the relationships between BI, substance use, and anxiety may vary across development and the lifespan. Relatedly, assessing BI via behavioral observation in infancy or early childhood may yield different results than the self-reported BIS scale scores utilized in the present investigation. Additionally, as rates of substance use increase across adolescence and early adulthood and use trajectories vary between ethnicities, the relationships between BIS scores, ethnicity, and substance use may vary based on the time point at which substance use is measured. These relationships may also vary across H/L youth and could differ based on factors such as time living in the US, social stigma, acculturation, language, nativity, and socioeconomic status. Lastly, the ABCD study sample is not a clinical or treatment-seeking sample, and utilizing clinical samples may impact the strength of the relationships explored in the present study. Additional prospective studies are needed to understand how BIS scores and ethnicity relate to substance use as use increases in future follow-up years of the ABCD Study.

Additional research is also needed to understand how factors such as trauma exposure, stress, cultural values, discrimination, coping motives, and conformity motives may mediate the relationship between BI and substance use in H/L youth. In conclusion, high levels of BIS prospectively predict increased rates of anxiety symptoms in both H/L and non-H/L youth. However, BIS scores uniquely predict increased likelihood of substance use for H/L youth. Future studies are needed to further understand the mechanisms underlying the relationship between BI and substance use in H/L youth, which will provide a scientific basis to better inform prevention and intervention programs for the H/L community.

Alcohol consumption accounts for 5.9% of deaths globally each year, or roughly 3.3 million deaths. Although alcohol use alone represents a serious public health concern, high comorbidity rates have been observed at an epidemiological level between alcohol and nicotine use, such that 6.2 million adults in the United States endorsed both an alcohol use disorder and dependence on nicotine. Moreover, an individual is three times more likely to be a smoker if he or she is dependent on alcohol, and those who are dependent on nicotine are four times more likely to be dependent on alcohol. Given these statistics, it is evident that heavy-drinking smokers comprise a distinct sub-population of substance users that warrants unique investigation. Magnetic resonance imaging studies that have focused specifically on the effects that alcohol use may have on brain morphometry have investigated the relationship between drinking variables, such as lifetime duration of alcohol use or lifetime alcohol intake, and brain structure in current alcohol users. For example, Fein et al. found that lifetime duration of alcohol use was negatively associated with total cortical gray matter volume in alcohol-dependent males, but not in light drinkers. Moreover, findings from Taki et al.
suggest a significant negative association between lifetime alcohol intake and gray matter volume in the bilateral middle frontal gyri among non-alcohol-dependent Japanese men. A recent study, however, found no significant relationship between lifetime alcohol consumption and gray matter volumes in a sample of 367 non-alcohol-dependent individuals. Given these contrasting findings, it is uncertain whether quantity variables, such as lifetime alcohol intake or duration of alcohol use, account for many of the gray matter volume reductions observed with continued alcohol use. Various studies have implicated several different regions of gray matter atrophy in alcohol-dependent individuals, such as the thalamus, middle frontal gyrus, insula, cerebellum, anterior cingulate cortex (ACC), and several prefrontal cortical areas. Due to these heterogeneous results, a meta-analysis was conducted, which concluded that there were significant gray matter decreases in the ACC, left dorsal striatum/insula, right dorsal striatum/insula, and posterior cingulate cortex in alcohol-dependent users relative to healthy controls.

Rates of new infections due to sexual transmission among non-injection drug users are increasing

The results show that the majority of patients with opioid use disorder developed this disorder following the presence of chronic pain. A plausible explanation for some of these cases, although not directly demonstrated by the data collected, is iatrogenic causation via use of opioid medication prescriptions for pain. As hypothesized, this group, as well as the OUD First group and Same Time group, had greater rates of co-occurring psychiatric and medical conditions compared to the No Pain group. Patients with mental health and multiple pain problems often present with more physical and psychological distress, resulting in greater frequency of opioid prescribing in primary care practices. Some unexpected, though not totally surprising, differences emerged between the OUD First and Pain First groups. The OUD First group generally had higher rates of other substance use disorders, commensurate with rates in the No Pain group. This was not unanticipated, as both groups were early-identified addiction patients and may have more genetic and environmental predisposition to developing substance use disorders than did the Pain First group. The Pain First group had generally higher rates of co-occurring medical problems than did the OUD First group. Part of the explanation for this phenomenon may be related to the age of the Pain First group; that group was older and thus more prone to medical illness. It may also be possible that the Pain First group had a longer duration of pain, which contributed to declining health status. Several limitations should be considered when interpreting the results of this study. This was a study using medical record data. As in any research that uses data from medical records, variation in physician documentation and health insurance requirements may introduce bias in the data that are captured.
The clinical data were initially recorded for clinical rather than research purposes, so their accuracy may be lower than that of data collected for research. Further, as in other records-based research, we do not have information about patient diagnoses made outside the system under study and therefore cannot ascertain whether a new OUD diagnosis was truly the first, only that it was the first OUD diagnosis recorded in the healthcare system under study.

Participants were predominantly white patients living in the Los Angeles area of the United States, potentially limiting generalizability to patients in other regions. Our findings are dependent on the extent, accuracy, and validity of the data available in the EHR dataset. For example, because OUD diagnosis information was obtained from the EHR, we were not able to distinguish whether prescription or nonprescription opioids were used or the route of administration. Both mislabeling of people who do not actually have OUD and under-recognition of true OUD diagnoses could affect the true prevalence of OUD in the sample. Since addiction can be under-recognized in the EHR, it is possible that a subset of patients may not have been identified as having an OUD; thus, some patients in the Pain First group may actually belong in the OUD First group. Despite these limitations, the study revealed some important findings. As would be expected, the majority of patients in this general healthcare or medical setting were white and had private insurance or the resources to pay for their healthcare, as opposed to being black or Hispanic and without health insurance, groups that are more often treated in the public treatment system in Los Angeles. Nevertheless, comorbidities are common among patients in both settings. Somewhat surprising is that the rates of co-occurring chronic pain conditions and mental disorders appear even higher than most rates reported in the literature in connection with OUD, often heroin use disorder, treated in public settings. However, medical conditions among OUD patients treated in publicly funded programs are mostly based on self-report, whereas the present study allowed the delineation of the specific rates of several major co-morbid physical health and other disease diagnoses among OUD patients in a general medical setting.
This study demonstrated that, regardless of demographic differences, OUD is associated with similarly high morbidity among patients in the private sector as in the public sector, putting them at high risk for mortality. The Pain First group demonstrated the highest rates of physical and mental health problems.

As discussed earlier, opioid prescriptions for pain in some of these individuals could have increased the risk for OUD and related problems. On the other hand, because screening for drug use is not mandated in primary care and some other medical settings, OUD may not be recognized and treated until very late in the addiction course, exacerbating the negative consequences of the disorder. Regardless of the potential causes, expanding training for medical professionals to improve screening, early intervention, support, and monitoring could prevent some of the excess morbidity associated with OUD. Furthermore, implementation of recent CDC guidelines addressing opioid prescribing for chronic non-cancer pain may provide additional risk mitigation in patients with chronic pain prior to their development of OUD. Comorbid OUD and chronic pain complicates treatment decision-making, predicts poor outcomes, and increases healthcare costs. Similarly, studies of healthcare claims data reveal that the most challenging and costliest OUD patients had high rates of preexisting and concurrent medical comorbidities and mental health disorders. The present study reveals the type and extent of comorbidities among OUD patients, results that support improving clinical practice by addressing the complex treatment needs of this population. Finally, studies utilizing the EHR data of patient populations with substance use disorders are important in identifying the scope of the problem and the extent of medical, mental health, and substance use comorbidities that necessitate better models of assessment and coordinated care plans.

The human immunodeficiency virus (HIV) epidemic is shifting away from people who inject drugs (PWID), as most new cases of HIV in the U.S. are attributed to unsafe sexual practices. In 2014, sexual contact comprised 94% of new HIV infections in the U.S.
Among PWID, sexual risk behaviors are independently associated with HIV transmission and may be a larger factor in transmission than injection behavior. Sexual risk behaviors that can transmit HIV and substance use are intertwined.

Stimulant use, in particular, is associated with greater sexual risk behaviors, including having unprotected sex. Prescription medications, including sedatives and painkillers, are also associated with sexual risk behaviors. Moreover, moderate drinking and having an alcohol dependence diagnosis have been associated with an increased likelihood of having multiple sex partners. Having sex under the influence of drugs and/or alcohol enhances sexual risk behaviors and is more strongly associated with new HIV infections than is unprotected receptive anal intercourse with a partner of unknown HIV status. Substance use can negatively impact judgment and decision making, leading to sexual risk behaviors such as trading sex for drugs or money, unprotected sexual intercourse, and unprotected sex with multiple partners. Alcohol users are likely to seek immediate rewards without considering the long-term consequences while under the influence. It is important to consider the trajectories of substance use and sexual risk behaviors concurrently in order to decrease the transmission of HIV. Substance use disorder (SUD) treatment, including methadone maintenance programs and outpatient drug-free settings, may be an important venue for prevention of sexual transmission. While enrollment in drug treatment reduces drug-related HIV risk behaviors, such as injection drug use, many substance users in treatment continue to engage in sex risk behaviors. As substance use is linked to sexual risk behaviors that can transmit HIV, it is possible that decreases in substance use may coincide with decreases in risk behaviors. Little is known about the temporal relationship between drug and alcohol use severity and high-risk sexual behaviors among individuals in substance use treatment.
The current study extends past research by examining whether reductions in alcohol and drug use severity predicted reductions in sexual risk behaviors among men in SUD treatment who were followed for a six-month period. We hypothesized that decreases in drug and alcohol use at follow-up would coincide with decreases in sex risk behaviors. Participants were enrolled in a multi-site clinical trial of the National Institute on Drug Abuse Clinical Trials Network (CTN) designed to test an experimental risk-reduction intervention, Real Men Are Safe, a five-session intervention that included motivation enhancement exercises and skills training, against a standard one-session HIV education intervention that taught HIV prevention skills. The intervention was delivered by counselors in SUD treatment programs and approved by the local Institutional Review Boards. Details about this study have been published elsewhere. In the parent study, participation was restricted to men in SUD treatment who were at least 18 years of age, reported engaging in unprotected vaginal or anal intercourse during the prior six months, were willing to be randomly assigned to one of two interventions and complete study assessments, and were able to speak and understand English. HIV status was not assessed as part of this study. Exclusion criteria included gross mental status impairment, defined as severe distractibility, incoherence, or retardation as measured by the Mini-Mental State Exam or clinician assessment, or having a primary sexual partner who was intending to become pregnant over the course of the trial.

All participants enrolled from methadone maintenance needed to be stabilized in treatment for at least 30 days to ensure the greatest likelihood that they had achieved a stable dose of methadone before starting the intervention groups. Participants were assessed prior to receiving the clinical intervention and six months following the intervention. All participants provided informed consent prior to participating.

Participants were recruited from seven methadone maintenance and seven outpatient drug-free treatment programs in the U.S. that are affiliated with the CTN to participate in a research study on HIV risk reduction interventions. These modalities were chosen because the programs’ counselors were trained to deliver the intervention. The treatment programs represented different geographic regions, population densities, and HIV prevalence rates. Programs were located in U.S. states including California, Connecticut, Kentucky, New Mexico, New York, North Carolina, Ohio, Pennsylvania, South Carolina, Washington, and West Virginia; they treated patients in urban, suburban, and rural areas. Recruitment was accomplished through posters and fliers posted in clinic waiting rooms, announcements about the study to clinic patients at group therapy meetings, directly through a participant’s individual counselor, and at clinic “open houses” designed to introduce the study to clinic patients. Most participants from the drug-free outpatient clinics were recruited close to treatment entry to reduce the possibility of early dropout. Assessments were conducted at baseline, prior to randomization, and six months later. Alcohol and drug use severity were assessed with the Addiction Severity Index-Lite (ASI-Lite), a standardized clinical interview that provides problem severity profiles in seven domains of functioning by providing an overview of problems related to substance use, in addition to days of use.
This instrument has been used in many studies of drug- and alcohol-abusing populations, and its reliability and validity are well established. Composite scores for each problem domain range from zero to one, with higher scores representing greater need for treatment. For the purposes of this study, only the composite scores for the alcohol and drug domains were analyzed. These composite scores are calculated based on the number of days of recent drug and alcohol use, problems arising from this use, and the desire for seeking treatment. We also provided days of recent use of alcohol to intoxication, cannabis, heroin, cocaine, sedatives/hypnotics/tranquilizers, and other opiates.

In bivariate analysis, we compared sex risk behaviors, recent substance use, and ASI drug and alcohol composite scores at baseline and follow-up to monitor changes over time. As the ASI drug and alcohol composite scores did not meet the conditions of normality, we used Mann-Whitney U tests and Spearman correlations. Next, we compared sex risk behaviors and ASI composite scores at baseline and at six-month follow-up. Wilcoxon signed-rank tests were used for continuous data and categorical variables with more than two levels, and McNemar’s tests were used for dichotomous categorical data. Multinomial multivariable logistic regression analysis was used to test the hypothesis that reductions in ASI alcohol and drug use severity composite scores would predict reductions in sexual risk behaviors.
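Because the ASI composite scores are non-normal, the analyses above rely on rank-based statistics. As a minimal illustration of one such statistic, the sketch below computes a Spearman rank correlation from scratch on hypothetical ASI drug composites and sex-risk counts; the study itself would have used standard statistical software.

```python
# Stdlib-only Spearman rank correlation, the nonparametric measure used
# because the ASI composite scores (0-1) are non-normal. Data are hypothetical.

def ranks(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

asi_drug = [0.05, 0.12, 0.30, 0.08, 0.22, 0.40]   # hypothetical ASI composites
sex_risk = [1, 2, 6, 1, 4, 7]                      # hypothetical risk-act counts
print(round(spearman(asi_drug, sex_risk), 2))
```

Because the correlation is computed on ranks rather than raw values, it is unaffected by the skewed distribution of the composite scores, which is exactly why the rank-based tests were chosen.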

A short description of the activity in question was included to help faculty decide on the point values assigned

During the first year, the requirements included only conference and module participation. The residency assessment requirement was subsequently enacted in the following year. Table 1 lists the final baseline education expectations required of faculty members. Before employing these education requirements, all faculty members were notified of the consequences of not fulfilling expectations, which included ineligibility for any academic incentive and an inability to participate in the voluntary ARVU system.

In May 2018, stage two began, which involved the creation of an ARVU system to encompass all other academic activities. It was decided that the ARVU system would be voluntary, but that to participate, the baseline education expectations outlined in stage one had to be fulfilled. For the first step of this stage, the vice chair for education created a list of preliminary activities to be included in the ARVU system, such as teaching, lecturing, publications, grants, committee memberships, and leadership positions. These additional activities were ones in which faculty were already participating that aligned with the academic mission of the department but had not been captured within the baseline education expectations, did not earn a clinical hours reduction from the department or institution, or were not an implicit part of a faculty member’s role based on his or her leadership position. The rationale was that activities that earned a clinical reduction in hours were already being financially rewarded, and this system was designed to recognize activities not yet distinguished. An example is fellowship activities, which were not included because fellowship directors have a reduction in clinical hours to support their leadership role.
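The gating rule described above, voluntary ARVU participation conditioned on meeting the baseline education expectations, can be sketched as a simple check. The thresholds below are hypothetical placeholders, not the department's actual requirements:

```python
# Sketch of the eligibility rule: ARVU participation is voluntary, but
# requires the baseline education expectations to be met first.
# Threshold values are hypothetical illustrations only.

def arvu_eligible(conferences_attended, modules_done,
                  conferences_required=12, modules_required=4):
    """True if the faculty member met the baseline education expectations."""
    return (conferences_attended >= conferences_required
            and modules_done >= modules_required)

print(arvu_eligible(15, 5))  # True
print(arvu_eligible(8, 5))   # False: conference attendance falls short
```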
After the initial list was assembled, it was shared with a select group of 11 leaders within the department, including residency leadership, undergraduate medical education leadership, fellowship directors, the research division, and the pediatric emergency medicine division. The participants were selected due to their various leadership roles in the department, their dedication to scholarly achievement in their own careers, and the high priority they placed on these activities within their respective divisions.

These qualifications placed these faculty members in a prime position to help generate a comprehensive list of activities relevant to each division. After multiple discussions and written communications using a modified Delphi method, the group reached consensus on the activities to be included. The unique part of this project was the third step, which included a survey that was created and analyzed using Qualtrics software and distributed to a group of 60 faculty members across the department. These faculty members were chosen out of a total of 123 because they were identified as department members who regularly participated in the activities on the list created by the leadership group. Because these faculty members were the most active in these activities, they were in the best position to review the list and fully evaluate each activity. Furthermore, because it was decided that the ARVU system would be voluntary, they were deemed the faculty most likely to be invested in and use this new system. Finally, one of the goals of this mission was to achieve faculty buy-in, as they were the most important stakeholders in this endeavor; this was accomplished by giving them a voice and a sense of empowerment in the final steps of the project. The survey included all agreed-upon activities and asked faculty to rate each on a scale from one to four. The 11 faculty members who contributed to the final list of activities created these descriptions. Effort was defined by the time needed to commit to or prepare for a particular activity, the ongoing effort needed to sustain the activity if it involved a longer commitment than just one session, and whether the activity required a passive presence or more active participation. For example, activities that required a sustained effort included grant involvement, committee membership, or a leadership position.
As expected, some subjectivity was involved in the voting for various reasons, such as the activity being one in which the responding faculty member personally participated, or differing opinions regarding how much preparation time might be needed for such things as a lecture.

To help reduce this bias, the survey was sent to many faculty members with different roles and responsibilities to obtain a consensus and to dilute idiosyncratic points of view. Furthermore, the chosen faculty members’ knowledge of and dedication to each activity, along with the descriptions provided, helped to further reduce bias in the points system. The survey also included free-text fields where faculty could input additional activities that they felt should be added to the list. Of the 60 faculty members surveyed, 49 responded and completed the survey in its entirety. The activities, ranked from highest to lowest based on the mean score, including standard deviations, are presented in Table 2. The standard deviation was less than one for all activities included in the survey. The mean of each activity was translated into final points to be awarded in the ARVU system. Activities with higher means earned more points. Any activities that were similar in description and mean score were assigned the same number of final points. We introduced the final list and point system at a faculty meeting prior to implementation, and after this final feedback round, we launched the system in December 2018. The free-text responses were also reviewed; these activities were added to the list and voted on by the faculty group to create the final list with points. The next steps for the project included creating a database where faculty could log their completed activities. We created a Google form listing all activities in the ARVU system, where faculty members could select the activity in which they participated. Each activity had an associated drop-down menu that asked for additional information, such as title, date, location, description, and proof of activity, with the ability to upload documents. We then created a dashboard in the analytics platform Tableau containing all activities.
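The mean-to-points translation described above can be sketched as follows. The activities, ratings, and the simple rounding rule are hypothetical illustrations; the actual point assignments were set by consensus, with similar-scoring activities given the same points.

```python
# Sketch of the mean-to-points translation: each activity's mean survey
# rating (1-4 scale) is converted to ARVU points, and higher means earn
# more points. Activities and ratings are hypothetical.

from statistics import mean, stdev

ratings = {
    "Grant involvement":    [4, 4, 3, 4, 4],
    "Committee membership": [3, 3, 4, 3, 3],
    "Journal club":         [2, 2, 3, 2, 2],
}

for activity, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    m, sd = mean(scores), stdev(scores)
    points = round(m)  # simple rounding rule; the real mapping was consensus-based
    print(f"{activity}: mean={m:.2f} sd={sd:.2f} -> {points} points")
```

Note that, as in the survey results reported above, the standard deviation for each hypothetical activity is below one, indicating reasonable rater agreement.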
Statistics for the baseline educational expectations automatically loaded into the dashboard and could not be edited by faculty members. The ARVU activities logged into the Google form also fed directly into the dashboard for display. The full dashboard displayed each faculty member's baseline education expectations, whether they had met requirements, the activities that they had entered into the ARVU point system, and total points earned to date. Final points were earned after academic leadership reviewed, approved, and signed off on each submitted activity.

Each month, the system automatically e-mailed a link to each individual's dashboard, notifying faculty of how many points they had earned to date and of any participation deficiencies. The medical school requires a teaching portfolio for faculty seeking promotion on the scholar track. This portfolio requires faculty to document their achievements in the following categories: teaching effort, mentoring and advising, administration and leadership, committees, and teaching awards. All ARVU activities were reviewed and categorized based on the elements of the teaching portfolio. These activities not only appear as itemized entries with points, but are also grouped into the appropriate portfolio category and displayed on each individual faculty member's dashboard. This allowed each faculty member to see how much scholarship they had completed within each of the teaching portfolio categories and which areas needed more attention. This provided faculty with a readily accessible repository of activities that could be transferred directly into the correct category of their teaching portfolio, making it easy to track the activities one needed to focus on for promotion.

A total of 123 faculty members were expected to participate in the baseline education expectations. At the end of the academic year in June 2018, 107 faculty had met requirements. Failure was defined as not attending the required number of conferences per year or not participating in the module system. Of the 16 who did not meet expectations, 94% signed up for conference modules to participate in specific activities, but none of them met the overall required conference attendance. Of the deficient faculty, five were full time working 28 or fewer clinical hours per week, 10 were full time working more than 28 hours, and one was part time.
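The monthly notification described above might be composed along these lines. This is a hedged sketch only: the message wording, the conference-day requirement of 10, and the function name are illustrative assumptions, not the system's actual implementation.

```python
def monthly_summary(name, points_to_date, conferences_attended,
                    conferences_required=10):
    """Build the body of a monthly status e-mail: points earned to date
    plus any deficiency against the annual conference requirement."""
    lines = [
        f"Dear {name},",
        f"You have earned {points_to_date} ARVU points to date.",
    ]
    shortfall = conferences_required - conferences_attended
    if shortfall > 0:
        lines.append(f"You are {shortfall} conference day(s) short of the "
                     f"annual requirement of {conferences_required}.")
    else:
        lines.append("You have met the baseline conference requirement.")
    return "\n".join(lines)

# Illustrative calls (names and numbers are hypothetical)
deficient_msg = monthly_summary("Dr. A", 42, 7)
compliant_msg = monthly_summary("Dr. B", 80, 12)
```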
Those who did not meet education expectations were notified and had their year-end AY 2017-18 financial incentive reduced to reflect this deficiency. We compared individual faculty members' conference attendance in AY 2016-17 and AY 2017-18 to determine any changes after implementing the new expectations. Overall, faculty attended 21% more conference days after expectations were implemented than in the prior year. Preliminary data for the following year, AY 2018-19, reveal that conference attendance increased by 15%. The number of resident assessments completed in AY 2017-18 among all faculty was 2837, compared with a preliminary AY 2018-19 count of 4049, a 30% increase since expectations went into effect. To date, faculty across the department have logged a total of 1240 academic activities in the database. The distribution of points across categories is highlighted in Table 3, with most points earned through teaching activities at the medical school or through other scholarly work that does not fit neatly into the other categories of the teaching portfolio. Leadership will review each faculty member's individual records to determine whether they have met baseline education expectations.

The faculty who meet expectations will receive the set baseline incentive and can earn additional financial incentive based on the number of points they have earned in the ARVU system. Once all the data are analyzed, the points will be converted into financial bonus amounts based on the number of eligible faculty and the amount of funds available. This project has had preliminary positive effects on both education and documentation of scholarly work within our department. The first stage produced an overall increase in conference attendance and participation even before the ARVU system was implemented. It is possible that these positive findings resulted from the academic incentive being dependent on meeting education expectations. However, offline discussions with multiple faculty members suggest that a shame factor also contributed to improved attendance: multiple faculty expressed relief that low attendance and participation were finally being called out and that faculty who had historically carried much of the teaching responsibility were now being recognized. In the same vein, resident assessments increased considerably in the second year without any other changes to the system, and this increase was therefore likely a result of the new expectations. An increase in assessments does not necessarily mean better quality, and this will need to be evaluated going forward to determine the full impact. The improved participation in educational activities in response to financial incentives or other measures is consistent with reports from other institutions and the existing literature. The increase in faculty documentation of scholarly output is clearly attributable to the ARVU system, as no system existed previously that allowed tracking of these activities.
The increase in activities and documentation will need to be followed from year to year before drawing conclusions about overall scholarly activity among individual faculty members and across the department. Unlike previous literature describing ARVU systems, our project emphasizes housing activities in one place so that they can be transferred into a faculty member's teaching portfolio, further incentivizing use of the system beyond financial rewards. We will continue to track baseline education expectations and the ARVU system across the department, seek ongoing feedback from faculty, and refine the process over time based on that feedback and on departmental and institutional priorities. The majority of faculty who did not qualify for the academic bonus last year worked more than 28 clinical hours per week, so time constraints may have affected compliance. To probe this finding and facilitate educational commitments, we will solicit additional feedback from this group of faculty to explore participation barriers that might be addressed in the future. We hope to follow the department's scholarly output over time using the ARVU system as an estimate of faculty productivity.
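The planned conversion of ARVU points into bonus amounts could be sketched as a proportional split of the available incentive pool among eligible faculty. The proportional-split rule itself is an assumption (the text says only that the conversion will depend on the number of eligible faculty and the funds available), and the pool size and point totals below are illustrative.

```python
def allocate_bonuses(points_by_faculty, pool):
    """Divide `pool` dollars among eligible faculty in proportion to
    their ARVU points (assumed rule: eligibility = at least one point)."""
    eligible = {f: p for f, p in points_by_faculty.items() if p > 0}
    total = sum(eligible.values())
    if total == 0:
        return {}
    return {f: round(pool * p / total, 2) for f, p in eligible.items()}

# Hypothetical faculty point totals and pool size
bonuses = allocate_bonuses({"A": 30, "B": 20, "C": 0}, pool=5000)
```

A proportional split keeps the payout self-scaling: it spends exactly the available pool regardless of how many faculty qualify or how many points they accumulate.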