Overview
Research Assistant in Analysis and Modelling of Neural Dynamics – Strand, London, WC2R 2LS
About us
The School of Neuroscience is the UK’s second-largest neuroscience school, with over 500 researchers and 200 PhD students. It is one of three schools at the Institute of Psychiatry, Psychology & Neuroscience.
At the Centre for Developmental Neurobiology (CDN), we investigate the mechanisms governing the formation of the brain during embryonic development and early postnatal life. This work is grounded in the understanding that early experience shapes how the brain is constructed: while the brain’s “ground plan” is genetically determined, it is also shaped by environmental experience, and we are still far from a complete understanding of how these processes work.
CDN is one of four departments in the School of Neuroscience at the Institute of Psychiatry, Psychology & Neuroscience and is located at Guy’s campus. Researchers have access to support facilities in genotyping, molecular biology and Drosophila work. CDN has a close partnership with the Medical Research Council (MRC), and CDN researchers, together with clinical researchers from King’s, make up the MRC Centre for Neurodevelopmental Disorders.
About the role
We are seeking a motivated research assistant to join our team working on an exciting Wellcome Trust-funded project. The research focuses on decoding neural representations across dynamic brain states through advanced computational analysis of large-scale neural recordings.
What you would be doing:
- Process and analyse large-scale calcium imaging datasets from multisensory experiments, including neural responses from visual and auditory cortices recorded over multiple days
- Apply and adapt advanced machine learning frameworks (SPARKS and CEBRA) for supervised and unsupervised analysis of high-dimensional neural data to decode multisensory information
- Investigate how neural representations change across different brain states (awake, asleep, engaged) and track representational drift over extended time periods
- Analyse recorded animal behaviours throughout experimental trials and correlate behavioural variability with neural responses to naturalistic audio-visual stimuli
- Examine cross-modal interactions between visual and auditory cortices using techniques such as cross-modal decoding, unit reliability analysis, and shared variance component analysis (SVCA)
- Create comprehensive data visualisations and perform statistical analyses to assess stability and plasticity of multisensory representations
- Collaborate with experimental partners and team members to interpret findings and develop brain-machine interface applications
- Present findings at conferences and contribute to publications in peer-reviewed journals
This is a full-time post on a fixed-term contract until 30 September 2026.
About you
To be successful in this role, we are looking for candidates to have the following skills and experience:
Essential criteria
- MSc in Neuroscience, Physics, Computer Science, or a related field
- Strong background in computational neuroscience and data analysis
- Proficiency in programming (e.g. Python, MATLAB, or similar languages)
- Experience with large-scale neural network simulations
- Experience with analysing large-scale neural recordings
- Familiarity with neuroanatomy and neurophysiology
- Knowledge of dynamical systems theory
- Excellent analytical and problem-solving skills
Desirable criteria
- Advanced programming and data analysis skills
- Computational neuroscience background
- Behavioral data analysis skills
- Strong written and verbal communication skills
- Experience of working in multidisciplinary teams (ideally across theoretical and experimental neuroscience)
Downloading a copy of our Job Description
Full details of the role and the skills, knowledge and experience required can be found in the Job Description document, provided at the bottom of the next page after you click “Apply Now”. This document sets out which criteria will be assessed at each stage of the recruitment process.