Category Archives: NeuroKit

Modern Approaches to Behavioral Analysis

This is a Cajal NeuroKit course that combines online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.

Researchers from anywhere in the world can participate because the course material is shipped to them in a kit box.

We will run a pre-course in May to train Teaching Assistants who would like to help us teach this course locally or online. We will open applications soon.

Course overview

The goal of neuroscience is to understand how the nervous system controls behaviour [1], not only in the simplified environments of the lab, but also in the natural environments for which nervous systems evolved.

In pursuing this goal, neuroscience research is supported by an ever-larger toolbox, ranging from optogenetics to connectomics. However, these tools are often coupled with reductionist approaches to linking nervous systems and behaviour. The arrival of deep-learning tools for animal tracking has changed the scale at which behavioural data is acquired, but the scope of questions these tools can address expands only when they are combined with a more nuanced, context-driven approach to the study of behaviour. This course will introduce advanced techniques for measuring and analysing behaviour, as well as three fundamental principles necessary for understanding biological behaviour: (1) morphology and environment; (2) action-perception closed loops and purpose; and (3) individuality and historical contingencies [2].

[1] Cowan, W. M. (1978). Preface. Annual Review of Neuroscience, 1. https://doi.org/10.1146/annurev.ne.1.072606.100001

[2] Gomez-Marin, A., & Ghazanfar, A. A. (2019). The life of behavior. Neuron, 104(1), 25–36.

Course sponsor

What will you learn?

This course will emphasize the philosophical and observational skills required to understand behaviour, while also providing training in image-capture technologies and computer vision methods that can assist in the collection and analysis of video-recorded behaviour datasets.

Focusing on the tool DeepLabCut, students will analyse an original video dataset and have the opportunity to practice tracking, pose estimation, action segmentation, kinematic analysis and modeling of behaviour.
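
For orientation, here is a minimal sketch of the kind of DeepLabCut workflow practiced in the course; it is written in Python, the project name and video paths are placeholders, and exact function arguments can differ between DeepLabCut versions.

```python
# Minimal DeepLabCut workflow sketch (project name and video paths are placeholders;
# arguments may differ slightly between DeepLabCut versions).
import deeplabcut

# Create a project and label a subset of frames
config = deeplabcut.create_new_project("puppet-behaviour", "student",
                                       ["videos/puppet_trial1.mp4"])
deeplabcut.extract_frames(config)
deeplabcut.label_frames(config)            # opens the labelling GUI

# Train a pose-estimation network and analyse a new video
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)
deeplabcut.analyze_videos(config, ["videos/puppet_trial2.mp4"])
```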

By the end of the course, you will:

  • be familiar with modern and historical frameworks for studying the behaviour of living biological systems
  • practice methods for carefully and precisely observing and defining behaviours
  • understand the limits and capabilities of computer vision
  • develop an intuition for how to build experimental setups that can take advantage of tools such as DeepLabCut

This course shares and promotes open source software, and we encourage students to try new ideas, share insights, and connect with the open-source community.

Interested in teaching? We are hiring TAs!

We are looking for teaching assistants to help develop and deliver this course. TAs will be required to attend the pre-course at EPFL (September 19-23). They will then be paid an honorarium to teach the following courses. If you are interested, please fill out this form by [DEADLINE].

Faculty

Alexander Mathis

Course Director, EPFL, Switzerland

Danbee Kim

Course Co-Director, NeuroGEARS, UK

Keynote speakers

Nicola Clayton (Univ Cambridge, UK)

Ole Kiehn (Univ of Copenhagen, Denmark)

Guest Lecturers

Johanna T. Schultz (USC, Australia)

Nacho Sanguinetti (Harvard Univ. USA)

Programme

Day 1 – What is animal behaviour?

  • Historical and current theoretical frameworks for the study of behaviour in living biological systems

  • Practical exercises for training skills in observing and defining behaviours

Day 2 – Tools for modern-day ethology

  • Fundamentals of video recording, computer vision, and deep learning

  • Introduction to DeepLabCut

  • Create an original video-recorded dataset of behaviour (students video-record their animal puppet engaged in some kind of behaviour)

Day 3 – Training computers to see as we see

  • Multi-animal tracking

  • Live tracking

  • Prepare original video dataset of behaviour for analysis (students trade original video datasets and train their DLC networks)

Day 4 – Analysis by eye and by computer

  • Movement kinematics in living biological systems

  • Action segmentation – when does a behaviour start and end?

  • Analyse original video dataset of behaviour (students try to figure out what behaviour is being performed by their fellow students)

Day 5 – From individuals to populations

  • How do behaviours of living biological systems generalize?

  • Advanced DLC topics and typical pitfalls

  • Students pool all animal puppet videos and try to train DLC to categorize different behaviours

The course will be held from 14:00 to 18:00 GMT.

Registration

Registration fee: 500€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 500€ for one person and one course kit + 150€ per additional person (without the course kit). For this course, groups are limited to a maximum of 3 people sharing a single kit.

Applications for Teaching Assistants will open in May 2022

To receive more information about this NeuroKit, email info@cajal-training.org

Visual Reactive Programming – Bonsai 0121

Visual Reactive Programming – Bonsai is a Cajal NeuroKit. NeuroKits are hybrid courses that combine online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.
Researchers from all over the world can participate because the course material is sent by post in a kit box containing all the tools needed to follow the online course.

Course overview

Modern neuroscience relies on the combination of multiple technologies to record precise measurements of neural activity and behaviour. Commercially available software for sampling and controlling data acquisition is often too expensive, closed to modification and incompatible with this growing complexity, requiring experimenters to constantly patch together diverse pieces of software.

This course will introduce the basics of the Bonsai programming language, a high-performance, easy-to-use, and flexible visual environment for designing closed-loop neuroscience experiments that combine physiology and behaviour data.

This language has allowed scientists with no previous programming experience to quickly develop and scale up experimental rigs, and can be used to integrate new open-source hardware and software.

Course Teaser

What will you learn?

By the end of the course you will be able to use Bonsai to:

– create data acquisition and processing pipelines for video and visual stimulation.
– control behavioral task states and run your closed-loop experiments.
– collect data from cameras, microphones, Arduino boards, electrophysiology devices, etc.
– achieve precise synchronization of independent data streams.

The online material will soon be available here.

Faculty

Gonçalo Lopes

Course Director

NeuroGEARS, London, UK​

Instructors

João Frazão – Champalimaud Research, Lisbon, PT

Niccolò Bonacchi – International Brain Laboratory, Lisbon, PT

Nicholas Guilbeault – University of Toronto, CA

André Almeida – NeuroGEARS, London, UK

Bruno Cruz – Champalimaud Research, Lisbon, PT

Course sponsors

Programme

Day 1 – Introduction to Bonsai

  • Introduction to Bonsai. What is visual reactive programming?

  • How to measure almost anything with Bonsai (from quantities to bytes).

  • How to control almost anything with Bonsai (from bytes to effects).

  • How to measure/control multiple things at the same time with one computer.

  • Demos and applications: a whirlwind tour of Bonsai.

Day 2 – Cameras, tracking, controllers

  • Measuring behavior using a video.

  • Recording real-time video from multiple cameras.

  • Real-time tracking of colored objects, moving objects and contrasting objects.

  • Measuring behavior using voltages and Arduino.

  • Data synchronization. What frame did the light turn on?

Day 3 – Real-time closed-loop assays

  • What can we learn from closed-loop experiments?

  • Conditional effects. Triggering a stimulus based on video activity.

  • Continuous feedback. Modulate stimulus intensity with speed or distance.

  • Feedback stabilization. Record video centered around a moving object.

  • Measuring closed-loop latency.

Day 4 – Operant behavior tasks

  • Modeling trial sequences: states, events, and side-effects.

  • Driving state transitions with external inputs.

  • Choice, timeouts and conditional logic: the basic building blocks of reaction time, Go/No-Go and 2AFC tasks.

  • Combining real-time and non real-time logic for good measure.

  • Student project brainstorming

Day 5 – Visual stimulation and beyond

  • Interactive visual environments using BonVision.

  • Machine learning for markerless pose estimation using DeepLabCut.

  • Multi-animal tracking and body part feature extraction with BonZeb.

  • Student project presentation.

  • Where to next.

Registration

Fee: 300€ (includes lectures and kit).

Application closed on 20 December 2020.

You are welcome to express your interest in the next Cajal Bonsai NeuroKit using the button in the top banner.

To receive more information about this NeuroKit, email info@cajal-training.org

Extracellular Electrophysiology Acquisition 0321

To apply to the second edition of this course please visit this webpage

Extracellular Electrophysiology Acquisition is a Cajal NeuroKit. NeuroKits are hybrid courses that combine online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.
Researchers from all over the world can participate because the course material is sent by post in a kit box containing all the tools needed to follow the online course.

Course overview

Any data we collect has been shaped by the system we used to record it. Understanding the tools involved in data acquisition gives you the confidence to make informed experimental design choices, and the freedom to combine and try new approaches while building your dream setup.

In this course, we will develop your understanding of electrophysiology data acquisition. In terms of hardware, you will learn how acquisition systems can amplify tiny signals and filter out noise. You’ll test this understanding by building your own system to measure muscle and heart signals. In software, you will encounter synchronisation considerations, as we add incoming datastreams and build an increasingly complex experimental design.

Don’t be discouraged if you secretly panic at the mention of capacitance: this course starts from the very basics. Advanced students can make the final project as challenging as they like.

Designed by Open Ephys and Open Ephys Production Site, this course will have an open-source flavour and encourage you to try new ideas, share your insights, and connect with the open-source community.

Course sponsors

What will you learn?

By the end of the course, you will:

  • be familiar with the electronic building blocks of acquisition systems

  • be able to model and build circuits to amplify and filter incoming signals

  • be able to use the Bonsai programming language to stream data and run closed-loop experiments with multiple datastreams

Faculty

Alexandra Leighton

Course Director

Open Ephys Production Site, PT

Jakob Voigts

Course Director

MIT and Open Ephys, USA

Filipe Carvalho

Course co-director

Open Ephys Production Site, PT

Instructors

Aarón Cuevas López – Universitat Politècnica de València, ES

Joana Neto – FCT NOVA, PT

Jonathan P. Newman – MIT and Open Ephys, USA

Josh Siegle – Allen Institute, USA

Programme

Day 1 – Introduction

  • What are we trying to measure? Electrical signals in the brain and ways to record them.

  • How can we collect these signals without changing them? Considerations when building an acquisition system.

  • Using a simulator to visualise electrical circuits online and make predictions about real-world circuits.

  • Using the breadboard and components in your kit to test your understanding of electronics concepts.

Day 2 – Impedance

  • Using microcontrollers to acquire physiological data.

  • What is impedance? Understanding how we protect our signals while measuring them.

  • Understanding the function and limitations of operational amplifiers.

Day 3 – Data Acquisition

  • Understanding Instrumentation Amplifiers.

  • Simulating, building and testing low & high-pass filters (see the short sketch after this list).

  • Visualise your own EMG/ECG data using the Bonsai programming language.
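
The filter exercises themselves are done on the breadboard and visualised in Bonsai; purely as an illustration of the same idea, the short Python sketch below computes the cutoff of a first-order RC low-pass and applies a digital band-pass to a synthetic signal. The component values and band edges are example choices, not the ones used in the kit.

```python
# Illustration only (the course itself uses breadboard circuits and Bonsai):
# first-order RC low-pass cutoff, plus a digital band-pass on a synthetic signal.
import numpy as np
from scipy.signal import butter, filtfilt

R, C = 10e3, 1e-6                       # example values: 10 kOhm, 1 uF
f_c = 1.0 / (2 * np.pi * R * C)         # cutoff, roughly 15.9 Hz
print(f"RC low-pass cutoff: {f_c:.1f} Hz")

fs = 2000                               # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
raw = np.sin(2 * np.pi * 100 * t) + 0.5 * np.random.randn(t.size)
b, a = butter(4, [20, 450], btype="bandpass", fs=fs)   # 20-450 Hz band
clean = filtfilt(b, a, raw)
```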

Day 4 – Synchronizing Datastreams

  • Expanding on Bonsai – controlling cameras, receiving other datastreams.

  • Understanding closed-loop experiments, timestamp considerations, and synchronising datastreams.

  • Designing student projects, with group feedback on the plans.

Day 5 – Project and Open-Source Neuroscience

  • Open Ephys – open-source hardware & software development.

  • An overview of open-source community projects.

  • Student project presentation.

The course will be held from 14:00 to 18:00 GMT.

Registration

Fee: 400€ (includes lectures and kit).

Application closed on 22 February 2021.

To apply to the second edition of this course please visit this webpage

To receive more information about this NeuroKit, email info@cajal-training.org

The Last Black Box

Interested in the Last Black Box course? The course was reshaped into a new NeuroKit called Experimental Neuroscience Bootcamp. Visit the page to learn more and apply now.

The Last Black Box is a Cajal NeuroKit. NeuroKits are hybrid courses that combine online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.
Researchers from all over the world can participate because the course material is sent by post in a kit box containing all the tools needed to follow the online course.

Course overview

This course provides a foundation for new experimental neuroscientists. It is targeted at Master’s students, PhD students, and researchers entering the field from another discipline. It should be considered a “prerequisite” for more advanced training courses on a specialized topic.

The course introduces the essentials of data acquisition/control, data analysis, and machine learning by guiding the students through the hands-on construction of an increasingly capable robot. In parallel, related concepts in neuroscience are introduced as nature’s solution to the challenges students encounter while designing and building their own intelligent system.

Course Teaser

What will you learn?

You will be building a robot without using any black boxes. The robot’s physical layout mimics the basic anatomy of a (vertebrate) brain, and as you gradually open this course’s 21 “boxes”, your robot will evolve into an increasingly sophisticated machine. We thus call this robot the No-Black-Box-Bot, or NB3.

The course is divided into three sections (following the anatomy of the brain): hindbrain (reflexes), midbrain (behaviour), and forebrain (intelligence?).

The online material can be found here.

Faculty

Adam Kampff

Course Director
Voight Kampff, London, UK

Elena Dreosti

Co-Director
University College London, UK

Instructors

Spencer Wilson – Sainsbury Wellcome Centre, London, UK

Hande Tunbak – University College London, UK

Virginia Rutten – Sainsbury Wellcome Centre, London, UK

Thomas Ryan – University College London, UK

Course sponsors

Voight-Kampff

Programme

Week 1: Measuring and Moving

Aims:

Students build a basic sensory-motor system (a Braitenberg vehicle) that seeks or avoids light, and learn about the fundamentals of electronics, sensors and actuators, and amplification.
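
As a preview of the Week 1 goal, here is a tiny Python sketch of the cross-coupled sensor-motor rule behind a light-seeking Braitenberg vehicle. The sensor readings and motor speeds are abstract numbers here; on the NB3 they come from, and go to, real hardware.

```python
# Light-seeking ("aggressive") Braitenberg vehicle: each motor is driven by the
# light sensor on the opposite side, so the robot turns towards the brighter side.
def braitenberg_step(left_light: float, right_light: float, gain: float = 1.0):
    """Map two light readings in [0, 1] to (left_motor, right_motor) speeds."""
    left_motor = gain * right_light      # right sensor drives left motor
    right_motor = gain * left_light      # left sensor drives right motor
    return left_motor, right_motor

# More light on the right -> left motor spins faster -> the robot turns right
print(braitenberg_step(left_light=0.2, right_light=0.8))   # (0.8, 0.2)
```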

Schedule:

Day 1: “The White Box” Toolkit, Electrons

Day 2: Magnets + Light, Sensors + Motors

Day 3: Semiconductors

Day 4: Amplifiers

Day 5: Reflexes, NB3 robot demos

Week 2: Computers and Programming

Aims:

Students extend their robot to make decisions based on sensory input and perform basic computations. With the addition of a microcontroller, the students will learn the fundamentals of computers and programming, and the robots will develop more complex behaviours.

Schedule:

Day 1: Decisions, Logic

Day 2: Data, Memory

Day 3: Computers

Day 4: Control

Day 5: Behaviour, NB3 demos

Week 3: Data Analysis and Machine learning

Aims:

Students add a computer and camera to their robot. They then learn how to use neural networks to create an “intelligent” visual system that can identify obstacles, rewards and much more…

Schedule:

Day 1: Hearing + Speech, Vision

Day 2: Learning, Intelligence?

Day 3: NB3 work, NB3 work

Day 4: NB3 work, NB3 work

Day 5: NB3 work, NB3 demos

Registration

Course Fee : 900 €
The registration fee includes the black box kit to build the NB3 robot, the white box containing additional tools, shipment of the boxes, and faculty and instructor tutoring for 3 full weeks.

The CAJAL programme offers 2 stipends for the Last Black Box NeuroKit (waived registration fee). Please apply through the course online application form. In order to identify candidates in real need of a stipend, any grant applicant is encouraged to first request funds from their lab, institution or government.

To receive more information about this NeuroKit, email info@cajal-training.org

What past participants of the Last Black Box NeuroKit say

“This course opened a lot of black boxes for me. Everything was new and challenging, and I learned so much! It’s super rewarding to see how our robot evolves during these three weeks and how it turns out by the end of the course.”
Marta Maciel (Cellular and Molecular Biology, University of Coimbra)

“Everything about STEM that I missed from an undergrad degree in maths”
Kevin Huang (BA in Mathematics and MMath, University of Cambridge, UK)

“The Bootcamp is an incredible way to put technology together, from the fundamentals of physics to modern computers. Not only by explaining it to the students, but also making the students discover and build each component by themselves.”
Rodrigo Carrasco Davis (Electrical engineering, Universidad de Chile, Chile)

“A completely novel approach to education where to understand is not to recite or regurgitate but instead to build, test, and develop a physical manifestation of all that you learn in this wonderful course.”
Christopher Hall (Cell & Molecular Biology, UC Berkeley, US)

“This course distilled a lot of the practical skills I learned from my 5-year electrical engineering education into an intense but exciting 3-week, hands-on adventure. I think this course should be mandatory for neuroscientists, but also available to anyone! Young children, high school students, professionals who rely on computers for their livelihood without understanding what makes them work, and many others can benefit greatly from a version of this course.”
Ali Haydaroglu (Engineering Science with a major in Electrical and Computer Engineering, University of Toronto, Canada)

Experimental Neuroscience Bootcamp

NeuroKits are hybrid courses that combine online lectures on advanced neuroscience topics with hands-on experiments; a kit containing the course material is shipped to you wherever you are.

Course overview

This course provides a foundation in the modern techniques of experimental neuroscience. It introduces the essentials of sensors, motor control, microcontrollers, programming, data analysis, and machine learning by guiding students through the hands-on construction of an increasingly capable robot.

In parallel, related concepts in neuroscience are introduced as nature’s solution to the challenges students encounter while designing and building their own intelligent system.

Course Partners

Voight-Kampff

What will you learn?

The techniques of experimental neuroscience advance at an incredible pace. They incorporate developments from many different fields, requiring new researchers to acquire a broad range of skills and expertise (from building electronic hardware to designing optical systems to training deep neural networks). This overwhelming task encourages students to move quickly, but often by skipping over some essential underlying knowledge.

This course was designed to fill in these knowledge gaps.

By building a robot, you will learn both how the individual technologies work and how to combine them into a complete system. It is this broad-but-integrated understanding of modern technology that will help students of this course design novel, state-of-the-art neuroscience experiments.

Course directors

Adam Kampff

Course Director
Voight Kampff, London, UK

Andreas Kist

Course Director
Department for Artificial Intelligence in Biomedical Engineering (AIBE), Erlangen, Germany

Elena Dreosti

Co-Director
University College London, UK

Programme

The course will be held from 14:00 to 18:00 CEST

Day 1 – Sensors and Motors

What will you learn?

You will learn the basics of analog and digital electronics by building circuits for sensing the environment and controlling movement. These circuits will form the foundation of your course robot: a Braitenberg Vehicle that uses simple “algorithms” to generate surprisingly complex behaviour.

Topics and Tasks:

  • Electronics (voltage, resistors, Ohm’s law): Build a voltage divider (see the sketch after this list)

  • Sensing (light-dependent resistors, thermistors): Build a light/temperature sensor

  • Movement (electromagnetism, DC motors, gears): Mount and spin your motors

  • Amplifying (transistors, op-amps): Build a light-controlled motor

  • Basic Behaviour: Build a Braitenberg Vehicle
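
As a taste of the Day 1 electronics, here are a few lines of Python applying Ohm’s law to the voltage-divider exercise mentioned in the first item above; the component values are examples, not the ones in the kit.

```python
# Voltage divider: Vout = Vin * R2 / (R1 + R2)   (example values, not the kit's)
def divider_vout(vin: float, r1: float, r2: float) -> float:
    return vin * r2 / (r1 + r2)

# 5 V across a 10 kOhm / 10 kOhm divider gives 2.5 V at the midpoint
print(divider_vout(5.0, 10e3, 10e3))    # 2.5

# With a light-dependent resistor as R1, more light lowers R1 and raises Vout,
# which is the basis of the Day 1 light-sensor circuit.
```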

Day 2: Microcontrollers and Programming

What will you learn?

You will learn how simple digital circuits (logic gates, memory registers, etc.) can be assembled into a (programmable) computer. You will then attach a microcontroller to your course robot, connect it to sensors and motors, and begin to write programs that extend your robot’s behavioural ability.

Topics and Tasks:

  • Logic and Memory: Build a logic circuit and a flip-flop

  • Processors: Set up a microcontroller and attach inputs and outputs

  • Programming: Program a microcontroller (control flow, timers, digital IO, analog IO)

  • Intermediate behaviour: Design a state machine to control your course robot (the pattern is sketched below)
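
The state-machine exercise itself runs on the kit’s microcontroller; the dictionary-based Python sketch below only illustrates the pattern, and the state and event names are invented for the example.

```python
# State-machine pattern illustration (state and event names are invented;
# the course version runs on the kit's microcontroller).
TRANSITIONS = {
    ("searching", "light_detected"): "approaching",
    ("approaching", "light_lost"):   "searching",
    ("approaching", "obstacle"):     "reversing",
    ("reversing", "clear"):          "searching",
}

def next_state(state: str, event: str) -> str:
    """Return the new state, or stay in the current one if the event is unhandled."""
    return TRANSITIONS.get((state, event), state)

state = "searching"
for event in ["light_detected", "obstacle", "clear"]:
    state = next_state(state, event)
    print(f"{event} -> {state}")
```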

Day 3: Computers and Programming

What will you learn?

You will learn how a modern computer’s “operating system” (Linux) coordinates the execution of internal and external tasks, and how to communicate over a network (using WiFi). You will then use Python to write a “remote-control” system for your course robot by developing your own communication protocol between your robot’s Linux computer and its microcontroller.
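
Below is a minimal sketch of the kind of Pi-to-microcontroller link you will build, using the pyserial library; the port name and the one-byte command scheme are assumptions for illustration, not the protocol you will design in the course.

```python
# Minimal remote-control sketch using pyserial. The port name and the one-byte
# command scheme are assumptions, not the course's protocol.
import serial

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as port:
    port.write(b"f")                  # e.g. 'f' = drive forward
    reply = port.readline()           # wait for the microcontroller's acknowledgement
    print(reply.decode(errors="replace"))
```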

Topics and Tasks:

  • Operating Systems: Set up a Linux computer (Raspberry Pi)

  • Networking: Remotely access a computer (SSH via WiFi)

  • Programming: Program a Linux computer (Python)

  • Advanced behaviour: Build a remote control robot

Day 4: (Machine) Vision

What will you learn?

You will learn how grayscale and color images are formed and how to work with them in a Python environment. By mounting a camera on your robot, you can live-stream the images to your computer. You will then use background subtraction and thresholding to program an image-based motion detector. You will also use image moments to detect and follow a moving light source, and learn about “classical” face detection.
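
Here is a compact OpenCV sketch of the background-subtraction-and-threshold motion detector described above; the camera index, threshold, and pixel-count criterion are placeholder values.

```python
# Background subtraction + thresholding motion detector (camera index, threshold
# and pixel-count criterion are placeholder values).
import cv2

cap = cv2.VideoCapture(0)
ok, background = cap.read()
if not ok:
    raise RuntimeError("no camera frame available")
background = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)

for _ in range(300):                                  # check ~300 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, background)              # change relative to background
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:                  # enough changed pixels?
        print("motion detected")

cap.release()
```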

Topics and Tasks:

  • Images: Open, modify, and save images

  • Camera: Attach and stream a camera image

  • Image processing: Determine differences in images

  • Pattern recognition: Extract features from images

Day 5: (Machine) Learning

What will you learn?

You will learn about modern deep neural networks and how they are applied in image processing. You will extend your robot’s intelligence by adding a neural accelerator to it. We will deploy a deep neural network for face detection and compare it to the “classical” face detector. Ultimately, you will create and train your own deep neural network that will allow your robot to identify its creator: you.
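
Below is a skeleton of the kind of small TF/Keras classifier trained on this day; the folder layout, image size, and architecture are illustrative assumptions (the course model is additionally converted to run on the neural accelerator).

```python
# Skeleton of a small "is this my face?" classifier in TF/Keras. Folder layout,
# image size and architecture are illustrative assumptions.
import tensorflow as tf

train = tf.keras.utils.image_dataset_from_directory(
    "faces/",                          # e.g. faces/me/ and faces/not_me/
    image_size=(96, 96),
    batch_size=16,
)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # me / not me
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train, epochs=5)
```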

Topics and Tasks:

  • Inference: Implement a neural accelerator (Google Coral USB EdgeTPU)

  • Deployment: Deploy and run a deep neural network

  • Object detection: Finding faces using a deep neural network (Single Shot Detector)

  • Object classification: Train a deep neural network to identify one’s own face (TF/Keras)

Registration

Registration fee: 450€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 450€ for one person and one course kit + 150€ per additional person (without the course kit).

Application closed on 26 July 2021

To receive more information about this NeuroKit, email info@cajal-training.org

Extracellular Electrophysiology Acquisition 0522

This is a Cajal NeuroKit course that combines online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.

Researchers from anywhere in the world can participate because the course material is shipped to them in a kit box.

This course is now in its third edition.

Course overview

Any data we collect has been shaped by the system we used to record it. Understanding the tools involved in data acquisition gives you the confidence to make informed experimental design choices, and the freedom to combine and try new approaches while building your dream setup.

In this course, we will develop your understanding of electrophysiology data acquisition. In terms of hardware, you will learn how acquisition systems can amplify tiny signals and filter out noise. You’ll test this understanding by building your own system to measure muscle and heart signals. In software, you will encounter synchronisation considerations, as we add incoming datastreams and build an increasingly complex experimental design.

Don’t be discouraged if you secretly panic at the mention of capacitance: this course starts from the very basics. Advanced students can make the final project as challenging as they like.

Designed by Open Ephys and Open Ephys Production Site, this course will have an open-source flavour and encourage you to try new ideas, share your insights, and connect with the open-source community.

Course sponsors

What will you learn?

By the end of the course, you will:

  • be familiar with the electronic building blocks of acquisition systems

  • be able to model and build circuits to amplify and filter incoming signals

  • be able to use the Bonsai programming language to stream data and run closed-loop experiments with multiple datastreams

Faculty

Alexandra Leighton

Course Director

Open Ephys Production Site, PT

Jakob Voigts

Course Director

MIT and Open Ephys, USA

Filipe Carvalho

Course co-director

Open Ephys Production Site, PT

Instructors

Aarón Cuevas López – Universitat Politècnica de València, ES

Joana Neto – FCT NOVA, PT

Jonathan P. Newman – MIT and Open Ephys, USA

Josh Siegle – Allen Institute, USA

Programme

Day 1 – Introduction

  • What are we trying to measure? Electrical signals in the brain and ways to record them.

  • How can we collect these signals without changing them? Considerations when building an acquisition system.

  • Using a simulator to visualise electrical circuits online and make predictions about real-world circuits.

  • Using the breadboard and components in your kit to test your understanding of electronics concepts.

Day 2 – Impedance

  • Using microcontrollers to acquire physiological data.

  • What is impedance? Understanding how we protect our signals while measuring them (see the worked example after this list).

  • Understanding the function and limitations of operational amplifiers.
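
To make the impedance point in the list above concrete: the signal source and the amplifier input form a voltage divider, so the fraction of the signal you actually record is Zin / (Zs + Zin). The few lines of Python below use example impedance values, not the kit’s.

```python
# The source and the amplifier input form a voltage divider:
# Vmeasured = Vsource * Zin / (Zs + Zin)   (example values, not the kit's)
def measured_fraction(z_source: float, z_input: float) -> float:
    return z_input / (z_source + z_input)

# A 1 MOhm electrode into a 10 MOhm input loses roughly 9% of the signal...
print(measured_fraction(1e6, 10e6))    # ~0.91
# ...but into a 1 GOhm input it keeps essentially all of it.
print(measured_fraction(1e6, 1e9))     # ~0.999
```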

Day 3 – Data Acquisition

  • Understanding Instrumentation Amplifiers.

  • Simulating, building and testing low & high-pass filters.

  • Visualise your own EMG/ECG data using the Bonsai programming language.

Day 4 – Synchronizing Datastreams

  • Expanding on Bonsai – controlling cameras, receiving other datastreams.

  • Understanding closed-loop experiments, timestamp considerations, and synchronising datastreams.

  • Designing student projects, with group feedback on the plans.

Day 5 – Project and Open-Source Neuroscience

  • Open Ephys – open-source hardware & software development.

  • An overview of open-source community projects.

  • Student project presentation.

The course will be held from 14:00 to 18:00 GMT.

Registration

Registration fee: 500€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 500€ for one person and one course kit + 150€ per additional person (without the course kit). For this course, groups are limited to a maximum of 3 people sharing a single kit.

Applications are closed, but the course will be held again. You can express your interest in the course and we will contact you once the next application call opens.

To receive more information about this NeuroKit, email info@cajal-training.org

Visual Reactive Programming – Bonsai 0922

This is a Cajal NeuroKit course that combines online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.

Researchers from anywhere in the world can participate because the course material is shipped to them in a kit box.

This course is now in its second edition.

Course overview

Modern neuroscience relies on the combination of multiple technologies to record precise measurements of neural activity and behaviour. Commercially available software for sampling and controlling data acquisition is often too expensive, closed to modification and incompatible with this growing complexity, requiring experimenters to constantly patch together diverse pieces of software.

This course will introduce the basics of the Bonsai programming language, a high-performance, easy-to-use, and flexible visual environment for designing closed-loop neuroscience experiments that combine physiology and behaviour data.

This language has allowed scientists with no previous programming experience to quickly develop and scale up experimental rigs, and can be used to integrate new open-source hardware and software.

Course Teaser

What will you learn?

By the end of the course you will be able to use Bonsai to:

– create data acquisition and processing pipelines for video and visual stimulation.
– control behavioral task states and run your closed-loop experiments.
– collect data from cameras, microphones, Arduino boards, electrophysiology devices, etc.
– achieve precise synchronization of independent data streams.

The online material will soon be available here.

Faculty

Gonçalo Lopes

Course Director

NeuroGEARS, London, UK​

Instructors

João Frazão – Champalimaud Research, Lisbon, PT

Niccolò Bonacchi – International Brain Laboratory, Lisbon, PT

Nicholas Guilbeault – University of Toronto, CA

André Almeida – NeuroGEARS, London, UK

Bruno Cruz – Champalimaud Research, Lisbon, PT

Course sponsors

Programme

Day 1 – Introduction to Bonsai

  • Introduction to Bonsai. What is visual reactive programming?

  • How to measure almost anything with Bonsai (from quantities to bytes).

  • How to control almost anything with Bonsai (from bytes to effects).

  • How to measure/control multiple things at the same time with one computer.

  • Demos and applications: a whirlwind tour of Bonsai.

Day 2 – Cameras, tracking, controllers

  • Measuring behavior using a video.

  • Recording real-time video from multiple cameras.

  • Real-time tracking of colored objects, moving objects and contrasting objects.

  • Measuring behavior using voltages and Arduino.

  • Data synchronization. What frame did the light turn on?

Day 3 – Real-time closed-loop assays

  • What can we learn from closed-loop experiments?

  • Conditional effects. Triggering a stimulus based on video activity.

  • Continuous feedback. Modulate stimulus intensity with speed or distance.

  • Feedback stabilization. Record video centered around a moving object.

  • Measuring closed-loop latency.

Day 4 – Operant behavior tasks

  • Modeling trial sequences: states, events, and side-effects.

  • Driving state transitions with external inputs.

  • Choice, timeouts and conditional logic: the basic building blocks of reaction time, Go/No-Go and 2AFC tasks.

  • Combining real-time and non real-time logic for good measure.

  • Student project brainstorming

Day 5 – Visual stimulation and beyond

  • Interactive visual environments using BonVision.

  • Machine learning for markerless pose estimation using DeepLabCut.

  • Multi-animal tracking and body part feature extraction with BonZeb.

  • Student project presentation.

  • Where to next.

The course will be held from 14:00 to 18:00 GMT.

Registration

Registration fee: 500€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 500€ for one person and one course kit + 150€ for any additional person (without the course kit)

Applications are closed. However, you can express your interest in this NeuroKit course* and we will contact you once the application call for the next edition is open.

You can also register to the Cajal newsletter at the bottom of this page.

*Please note that this is not considered a valid application.

To receive more information about this NeuroKit, email info@cajal-training.org

Experimental Neuroscience Bootcamp 1122

This is a Cajal NeuroKit course that combines online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.
Researchers from all over the world can participate because the course material is sent home in a kit box containing all the tools needed to follow the online course.

This course is now in its second edition.

Course overview

This course provides a foundation in the modern techniques of experimental neuroscience. It introduces the essentials of sensors, motor control, microcontrollers, programming, data analysis, and machine learning by guiding students through the hands-on construction of an increasingly capable robot.

In parallel, related concepts in neuroscience are introduced as nature’s solution to the challenges students encounter while designing and building their own intelligent system.

Course Partners

What will you learn?

The techniques of experimental neuroscience advance at an incredible pace. They incorporate developments from many different fields, requiring new researchers to acquire a broad range of skills and expertise (from building electronic hardware to designing optical systems to training deep neural networks). This overwhelming task encourages students to move quickly, but often by skipping over some essential underlying knowledge.

This course was designed to fill in these knowledge gaps.

By building a robot, you will learn both how the individual technologies work and how to combine them into a complete system. It is this broad-but-integrated understanding of modern technology that will help students of this course design novel, state-of-the-art neuroscience experiments.

Course directors

Adam Kampff

Course Director
Voight Kampff, London, UK

Andreas Kist

Course Director
Department for Artificial Intelligence in Biomedical Engineering (AIBE), Erlangen, Germany

Elena Dreosti

Co-Director
University College London, UK

Programme

Day 1 – Sensors and Motors

What will you learn?

You will learn the basics of analog and digital electronics by building circuits for sensing the environment and controlling movement. These circuits will form the foundation of your course robot: a Braitenberg Vehicle that uses simple “algorithms” to generate surprisingly complex behaviour.

Topics and Tasks:

  • Electronics (voltage, resistors, Ohm’s law): Build a voltage divider

  • Sensing (light-dependent resistors, thermistors): Build a light/temperature sensor

  • Movement (electromagnetism, DC motors, gears): Mount and spin your motors

  • Amplifying (transistors, op-amps): Build a light-controlled motor

  • Basic Behaviour: Build a Braitenberg Vehicle

Day 2: Microcontrollers and Programming

What will you learn?

You will learn how simple digital circuits (logic gates, memory registers, etc.) can be assembled into a (programmable) computer. You will then attach a microcontroller to your course robot, connect it to sensors and motors, and begin to write programs that extend your robot’s behavioural ability.

Topics and Tasks:

  • Logic and Memory: Build a logic circuit and a flip-flop

  • Processors: Set up a microcontroller and attach inputs and outputs

  • Programming: Program a microcontroller (control flow, timers, digital IO, analog IO)

  • Intermediate behaviour: Design a state machine to control your course robot

Day 3: Computers and Programming

What will you learn?

You will learn how a modern computer’s “operating system” (Linux) coordinates the execution of internal and external tasks, and how to communicate over a network (using WiFi). You will then use Python to write a “remote-control” system for your course robot by developing your own communication protocol between your robot’s Linux computer and its microcontroller.

Topics and Tasks:

  • Operating Systems: Set up a Linux computer (Raspberry Pi)

  • Networking: Remotely access a computer (SSH via WiFi)

  • Programming: Program a Linux computer (Python)

  • Advanced behaviour: Build a remote control robot

Day 4: (Machine) Vision

What will you learn?

You will learn how grayscale and color images are formed and how to work with them in a Python environment. By mounting a camera on your robot, you can live-stream the images to your computer. You will then use background subtraction and thresholding to program an image-based motion detector. You will also use image moments to detect and follow a moving light source, and learn about “classical” face detection.
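
Here is a short OpenCV sketch of the image-moments step: threshold a grayscale frame and compute the centroid of the bright region, which the robot can then steer towards. The file name and threshold value are placeholders.

```python
# Centroid of the bright region via image moments (file name and threshold are
# placeholders for a frame grabbed from the robot's camera).
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
m = cv2.moments(mask)
if m["m00"] > 0:
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    print(f"light source at ({cx:.0f}, {cy:.0f})")   # steer the robot towards it
else:
    print("no bright region found")
```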

Topics and Tasks:

  • Images: Open, modify, and save images

  • Camera: Attach and stream a camera image

  • Image processing: Determine differences in images

  • Pattern recognition: Extract features from images

Day 5: (Machine) Learning

What will you learn?

You will learn about modern deep neural networks and how they are applied in image processing. You will extend your robot’s intelligence by adding a neural accelerator to it. We will deploy a deep neural network for face detection and compare it to the “classical” face detector. Ultimately, you will create and train your own deep neural network that will allow your robot to identify its creator: you.
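
The course deploys its detector on the Coral accelerator; as a plain-CPU illustration, the TensorFlow Lite interpreter below runs any converted .tflite model. The model path is a placeholder and the zero-filled input stands in for a camera frame.

```python
# Plain-CPU illustration of running a converted .tflite model (the course uses
# the Coral EdgeTPU runtime; the model path and dummy input are placeholders).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="face_detector.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.zeros(inp["shape"], dtype=inp["dtype"])   # stand-in for a camera frame
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))          # raw detector output
```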

Topics and Tasks:

  • Inference: Implement a neural accelerator (Google Coral USB EdgeTPU)

  • Deployment: Deploy and run a deep neural network

  • Object detection: Finding faces using a deep neural network (Single Shot Detector)

  • Object classification: Train a deep neural network to identify one’s own face (TF/Keras)

The course will be held from 14:00 to 18:00 CET

Registration

Registration fee: 450€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 450€ for one person and one course kit + 150€ per additional person (without the course kit).

Application closed on 26 July 2021

To receive more information about this NeuroKit, email info@cajal-training.org

Extracellular Electrophysiology Acquisition 1221

This is a Cajal NeuroKit course that combines online lectures on fundamental and advanced neuroscience topics with hands-on physical experiments.

Researchers from anywhere in the world can participate because the course material is shipped to them in a kit box.

This course is now in its second edition.

Course overview

Any data we collect has been shaped by the system we used to record it. Understanding the tools involved in data acquisition gives you the confidence to make informed experimental design choices, and the freedom to combine and try new approaches while building your dream setup.

In this course, we will develop your understanding of electrophysiology data acquisition. In terms of hardware, you will learn how acquisition systems can amplify tiny signals and filter out noise. You’ll test this understanding by building your own system to measure muscle and heart signals. In software, you will encounter synchronisation considerations, as we add incoming datastreams and build an increasingly complex experimental design.

Don’t be discouraged if you secretly panic at the mention of capacitance: this course starts from the very basics. Advanced students can make the final project as challenging as they like.

Designed by Open Ephys and Open Ephys Production Site, this course will have an open-source flavour and encourage you to try new ideas, share your insights, and connect with the open-source community.

Course sponsors

What will you learn?

By the end of the course, you will:

  • be familiar with the electronic building blocks of acquisition systems

  • be able to model and build circuits to amplify and filter incoming signals

  • be able to use the Bonsai programming language to stream data and run closed-loop experiments with multiple datastreams

Faculty

Alexandra Leighton

Course Director

Open Ephys Production Site, PT

Jakob Voigts

Course Director

MIT and Open Ephys, USA

Filipe Carvalho

Course co-director

Open Ephys Production Site, PT

Instructors

Aarón Cuevas López – Universitat Politècnica de València, ES

Joana Neto – FCT NOVA, PT

Jonathan P. Newman – MIT and Open Ephys, USA

Josh Siegle – Allen Institute, USA

Programme

Day 1 – Introduction

  • What are we trying to measure? Electrical signals in the brain and ways to record them.

  • How can we collect these signals without changing them? Considerations when building an acquisition system.

  • Using a simulator to visualise electrical circuits online and make predictions about real-world circuits.

  • Using the breadboard and components in your kit to test your understanding of electronics concepts.

Day 2 – Impedance

  • Using microcontrollers to acquire physiological data.

  • What is impedance? Understanding how we protect our signals while measuring them.

  • Understanding the function and limitations of operational amplifiers.

Day 3 – Data Acquisition

  • Understanding Instrumentation Amplifiers.

  • Simulating, building and testing low & high-pass filters.

  • Visualise your own EMG/ECG data using the Bonsai programming language.

Day 4 – Synchronizing Datastreams

  • Expanding on Bonsai – controlling cameras, receiving other datastreams.

  • Understanding closed-loop experiments, timestamp considerations, and synchronising datastreams.

  • Designing student projects, with group feedback on the plans.

Day 5 – Project and Open-Source Neuroscience

  • Open Ephys – open-source hardware & software development.

  • An overview of open-source community projects.

  • Student project presentation.

The course will be held from 14:00 to 18:00 GMT.

Registration

Registration fee: 450€ per person (includes shipping of the course kit, pre-recorded and live lectures before and during the course, full attendance to the course, and course certificate).

Registration fee for a group: 450€ for one person and one course kit + 150€ per additional person (without the course kit)

The applications closed on 18 October 2021.

However, you can express your interest in this NeuroKit course* and we will contact you once the application call for the next edition is open.

You can also register to the Cajal newsletter at the bottom of this page.

*Please note that this is not considered a valid application.

To receive more information about this NeuroKit, email info@cajal-training.org