After a marathon review process, our collaborative paper “Interpreting wide-band neural activity using convolutional neural networks” is now published in eLife! Lead author Markus Frey wrote an excellent Twitter thread summarizing the study. The paper, Markus’ code and all datasets – including my neuronal recordings from the auditory cortex of a freely moving mouse – are available open access. If you record any type of brain data and would like to make sure you’re not missing anything important in the wideband signal, give this new tool a try!
In the middle of the rollercoaster winter of 2020–21, Arthur joined the mouse love song adventure via EPFL’s Life Science Engineering MSc programme. He courageously embarked on a project developing an AI-based pipeline for the analysis of mouse behavioural modes. Many video meetings, much code wrestling and one in-person visit later, Arthur closed the loop from raw data to a report and successfully completed his “remote lab immersion” project. It was a joy to work with him as he ironed out the kinks of an exciting new type of behavioural analysis, and to introduce him to the wonders and headaches of neuroscience research!
Yesterday, Camille gave a fantastic talk about her work and successfully defended her MSc thesis research in a really engaging remote viva – or at least I hope she enjoyed it as much as we examiners did! Camille completed her nice work, and wrote a wonderful thesis, despite unexpectedly having to deal with ongoing historic disruptions: lab closure meant cutting her final experiments short, and travel restrictions meant moving out of London at very short notice. Yet she took it all in stride with impressive resilience and flexibility, and has now completed her MSc degree perfectly on schedule. I am so proud of how courageously and successfully she rose to the many challenges of these eventful six months. It was a pleasure to do all this science together, and I wish her the best of luck, success and fulfilment in her next step!
Tania and I last worked together back in 2006, when she patiently taught me how to patch-clamp several neurons simultaneously at EPFL. In 2020 she and her fantastic team again welcomed me with open arms, and they are now teaching me some of their acute high-channel-count neuronal recording skills.
My experiments started out beautifully, before being cut short by the COVID-19 lockdown in Switzerland. I can’t wait to go back to the lab once restrictions lift. In the meantime I am enjoying discovering a new research environment, being part of this happy and productive auditory lab, and all the excellent (video!) discussions on our respective work.
Some of my first neuronal recordings from the auditory cortex of an awake, behaving mouse feature in a preprint unveiled in December 2019!
In this joint venture between the labs of Caswell Barry and Christian Doeller, lead author Markus Frey developed DeepInsight, a decoding framework for discovering and characterising the neural correlates of behaviour and stimuli in unprocessed biological data. This deep-learning tool is able to decode sensory and behavioural variables from both electrophysiological and calcium imaging data, across different behavioural situations, brain regions and species. Importantly, it runs on raw data, and as such makes minimal assumptions about which frequency component(s) of the broadband recorded signal carry information. This means that once trained, the network can be interrogated to guide the discovery of novel neural representations!
This was such an exciting project to contribute some of my neuronal data to. And, as always, scientific discussions with researchers from a slightly different area of the wide neuroscience landscape were a great learning opportunity and eye-opener.
Over the last four years I sometimes despaired that this day would never come, but here we are! I’m proud and delighted to share my latest and favourite piece of work, now online on the preprint server bioRxiv: Courtship behaviour reveals temporal regularity is a critical social cue in mouse communication.
Read on for an animated summary of what this study is about!
Did you know that male mice sing elaborate, ultrasonic “love” songs to attract females – all at frequencies so high-pitched that they are inaudible to humans?
Maybe you did, actually. But what no one could tell you is which of the many acoustic features of these vocal sequences female listeners are actually using to make social decisions.
In this study we took advantage of the natural behaviour that female mice show in response to male courtship songs to ask: what acoustic cues are females using from vocal sequences during goal-directed social behaviour?
It turns out females are VERY sensitive to disruptions of the songs’ rhythmic regularity: they dislike irregular, artificially “stuttering” versions of male courtship songs!
Disrupting other acoustic features of the songs, such as syllable sequence, or spectrotemporal structure, did not really matter to the females.
The take-away? Temporal regularity is a key acoustic cue extracted by mammalian listeners from complex vocal sequences during goal-directed social behaviour.
Please check out the paper for all the details!
I am happily growing my scientific library this year! Hot off the press, the second (and last) chapter that Chris Petkov and I co-authored has been published in the book Multisensory Processes: The Auditory Perspective.
In this new volume of the book series dedicated to all things auditory, we unify recent insights from single-neuron, oscillatory activity and functional connectivity studies to understand how visual face information is combined with auditory voice information in the primate temporal lobe.
The book as a whole provides a compelling picture of how different aspects of auditory perception, cognition and behaviour are shaped by inputs from other senses. Thanks to the editors for their great job curating this volume!
Waiting for me on my desk, ready to start 2019 with a smile, was a hardcopy of the book that Chris Petkov, my fantastic PhD advisor, and I contributed to:
In our chapter, Chris and I take a comparative perspective on voice perception and review how recent insights from primate work advance our understanding of the neurobiology of voice perception. The book itself also highlights what makes voices special acoustically, how we use the emotions and social information contained in voices to communicate with each other, and how certain disorders change how our brain responds to voices. It is a very nice read that gives a comprehensive introduction to the exciting field of voice perception – check it out!
Last week I took part in the very first Tuebingen Neuroscience Alumni meeting as a programme speaker, along with some of my former classmates from Tuebingen University’s Graduate Training Centre of Neuroscience. It was an honour to be part of this reunion, and a great pleasure to spend a few autumn days in my favourite university town, find out about the great science planned in Tuebingen, and enjoy discussions with friends old and new from academia and industry.
In swift succession over the last few weeks, Shanice handed in and viva-ed her MSc thesis, thus successfully concluding her UCL Neuroscience MSc degree. It was wonderful to have her on board this scientific adventure, and she truly made this research avenue take off. I am so proud of her many achievements over the last year, and I know there are many more to come.