The Computational Psychiatry and Affective Cognitive Neuroscience (CPACN) laboratory is headed by Professor Sonia Bishop, an adjunct associate professor within the Helen Wills Neuroscience Institute at UC Berkeley. Please visit our people pages to learn more about us.

Further Reading

SF Chronicle: In coronavirus pandemic's limbo, the only thing certain is more uncertainty

COVID-19: Mental health and well-being for ourselves and our children

Lab Research

Current research in the lab brings together strong experimental design, advanced computational modeling, and cutting-edge functional magnetic resonance imaging (fMRI) methods to examine the computational and neural substrates of decision-making, representation, learning, and attentional selection in health and disease.

Much of the lab’s recent work falls under the umbrella of ‘computational psychiatry’. Here we take a three-step approach: (i) identifying the clinical symptomatology or clinical presentation of interest; (ii) computationally characterizing the associated cognitive processes and how they vary as a function of psychiatrically pertinent traits or dimensions of symptomatology; and (iii) relating computational models of cognition to changes in BOLD activity, obtained by asking participants to perform tasks tapping the cognitive processes in question while fMRI data are acquired. We are particularly interested in identifying alterations in computational processes that are common to both anxiety and depression versus unique to one or the other (see Bishop & Gagne, 2018). In ongoing work, we address this using a combination of bifactor modeling of psychiatric symptoms and hierarchical Bayesian modeling of decision-making behavior and associated changes in brain activity. See Projects: Computational Psychiatry for further information on this line of work in our lab.
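To give a flavor of the hierarchical Bayesian approach mentioned above, here is a minimal, self-contained sketch: subject-level learning rates are drawn from a group-level distribution, choices on a two-armed bandit task are simulated with a simple Q-learning model, and each subject's learning rate is then recovered as a MAP estimate shrunk toward the group mean by a Gaussian prior. All task parameters (reward probabilities, group mean and spread, softmax inverse temperature) are hypothetical illustration values, not the lab's actual models.

```python
import math
import random

def simulate_subject(alpha, beta, n_trials, rng):
    """Simulate a Q-learning agent on a two-armed bandit.
    alpha: learning rate; beta: softmax inverse temperature.
    Arm 0 pays reward with p=0.7, arm 1 with p=0.3 (hypothetical task)."""
    q = [0.0, 0.0]
    choices, rewards = [], []
    for _ in range(n_trials):
        p0 = 1.0 / (1.0 + math.exp(-beta * (q[0] - q[1])))  # softmax choice rule
        c = 0 if rng.random() < p0 else 1
        r = 1.0 if rng.random() < (0.7 if c == 0 else 0.3) else 0.0
        q[c] += alpha * (r - q[c])  # prediction-error update
        choices.append(c)
        rewards.append(r)
    return choices, rewards

def log_lik(alpha, beta, choices, rewards):
    """Log-likelihood of observed choices under the Q-learning model."""
    q = [0.0, 0.0]
    ll = 0.0
    for c, r in zip(choices, rewards):
        p0 = 1.0 / (1.0 + math.exp(-beta * (q[0] - q[1])))
        ll += math.log(p0 if c == 0 else 1.0 - p0)
        q[c] += alpha * (r - q[c])
    return ll

def map_alpha(choices, rewards, beta, mu, sigma):
    """MAP estimate of a subject's learning rate on a grid; the
    group-level Gaussian prior N(mu, sigma) supplies shrinkage."""
    grid = [i / 100.0 for i in range(1, 100)]
    def log_post(a):
        log_prior = -((a - mu) ** 2) / (2 * sigma ** 2)
        return log_lik(a, beta, choices, rewards) + log_prior
    return max(grid, key=log_post)

rng = random.Random(0)
mu, sigma, beta = 0.3, 0.1, 5.0  # assumed group-level parameters
estimates = []
for _ in range(5):
    # subject-level learning rate drawn from the group distribution
    alpha = min(0.99, max(0.01, rng.gauss(mu, sigma)))
    choices, rewards = simulate_subject(alpha, beta, 200, rng)
    estimates.append(map_alpha(choices, rewards, beta, mu, sigma))
print(estimates)
```

In a full hierarchical Bayesian treatment the group-level parameters would themselves be inferred jointly with the subject-level ones (e.g. with MCMC), rather than fixed as here; the grid-based MAP step is only meant to convey the idea of pooling information across subjects.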

Another major line of research in the lab uses multi-feature encoding models to explore the cortical representation of natural emotional visual images (both faces and scenes). Here we seek to understand the role occipital-temporal cortical regions play in the integration of semantic and affective features and whether this provides templates that can be used to guide behavior. We are also exploring the structure of representation of facial identity and emotion using a large dataset of natural faces with genuine (captured ‘in the wild’) emotional expressions. This work is conducted in collaboration with Alex Huth and Jack Gallant. Clinical extensions of this work are being conducted in collaboration with Bhisma Chakrabarti (Autism) and Brad Duchaine (Developmental Prosopagnosia).

Other work in the lab focuses on frontal mechanisms of attentional control and associative learning (e.g. fear conditioning) and their dysregulation in anxiety.

Please use the projects tab on the right-hand side to learn about ongoing work in the lab.