Development of Electronic Hardware and Control Software for a Miniaturized Microscope
We (the Grewe and Liu groups) are looking for a student (Master level) who is interested in carrying out a collaborative technological development project at the Institute of Neuroinformatics. One of the primary goals of the Grewe lab is to extract fundamental principles of information processing from biological neural networks and to reverse-engineer this functionality with the goal of improving machine-learning applications. To study biological neuronal networks, we use state-of-the-art miniaturized microscopes that allow us to record large-scale neuronal network activity in freely moving mice. Calcium imaging of neurons during unrestricted movement of the animal makes it possible to study the dynamic adaptation of the neural network at the single-cell level during learning. The Liu group develops technology based on these fundamental principles, for example the Dynamic Vision Sensor and audio sensors, as well as prosthetic technology such as source-separation algorithms and platforms that benefit hearing aid devices.
In the framework of an interdisciplinary engineering/neuroscience project, we are looking for a student to develop new imaging electronics (imaging sensor & readout electronics) to improve miniaturized microscopes (see figure below). We aim to achieve this by integrating a new high-speed, highly sensitive CMOS sensor and the necessary control electronics into our current microscope setup. The project involves selecting the optimal CMOS sensor, designing a simple PCB circuit, and programming an FPGA board (incl. a GUI for recording calcium imaging data) to control and read out the sensor.
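To give a flavor of the control logic involved, the GUI and FPGA firmware must agree on how user-facing settings such as exposure time map onto clock-cycle counts in sensor registers. The sketch below is illustrative only: the 100 MHz pixel clock, the 32-bit register width, and the rolling-shutter assumption are hypothetical and do not refer to any specific sensor.

```python
# Sketch: convert user-facing imaging settings into register values for a
# hypothetical CMOS sensor driven by an FPGA. The 100 MHz pixel clock and
# 32-bit register width are illustrative assumptions, not datasheet values.

PIXEL_CLOCK_HZ = 100_000_000  # assumed sensor/FPGA pixel clock
REGISTER_BITS = 32            # assumed exposure-register width

def exposure_register(exposure_ms: float) -> int:
    """Exposure time in milliseconds -> clock-cycle count for the sensor."""
    cycles = round(exposure_ms * 1e-3 * PIXEL_CLOCK_HZ)
    if not (0 <= cycles < 2 ** REGISTER_BITS):
        raise ValueError("exposure out of range for register width")
    return cycles

def max_frame_rate(exposure_ms: float, readout_ms: float) -> float:
    """Upper bound on frame rate (Hz) given exposure and frame-readout time.
    With a rolling-shutter readout, exposure and readout overlap, so the
    frame period is limited by the longer of the two."""
    return 1000.0 / max(exposure_ms, readout_ms)
```

For example, a 10 ms exposure at an assumed 100 MHz clock corresponds to 1,000,000 cycles, and with a 5 ms readout the frame rate is exposure-limited to 100 Hz.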
Your tasks:
- Research the optimal CMOS sensor for our application.
- Find a suitable FPGA board for simple interfacing with the imaging sensor (e.g. Opal Kelly).
- Interface the FPGA board with the CMOS sensor for control and readout.
- Implement LED control, with the option to extend it to driving two different LEDs simultaneously.
- Control the imaging sensor through a GUI (frame rate, exposure time, externally triggered recording, etc.).
- Program a stable data acquisition GUI that can easily be extended for live processing.
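The acquisition side of such a GUI backend can be sketched as a producer-consumer loop: a worker thread reads frames from the sensor interface into a queue, and the consumer drains it, which is where live-processing hooks could later be added. The sketch below is a minimal, hardware-free illustration; `read_frame` is a hypothetical stub standing in for the FPGA readout (e.g. a pipe-out transfer), and the frame size is arbitrary.

```python
# Sketch of a threaded acquisition loop that decouples sensor readout from
# display/processing, so live processing can be added on the consumer side.
# read_frame() is a stub; a real implementation would read from the FPGA.
import queue
import threading

FRAME_BYTES = 64  # assumed frame size for this stub

def read_frame(index: int) -> bytes:
    """Stub for the FPGA readout; returns a dummy frame payload."""
    return bytes([index % 256]) * FRAME_BYTES

def acquire(n_frames: int, out_queue: queue.Queue) -> None:
    """Producer: read frames from the sensor and hand them to the consumer."""
    for i in range(n_frames):
        out_queue.put(read_frame(i))
    out_queue.put(None)  # sentinel: acquisition finished

def run_acquisition(n_frames: int) -> list:
    """Consumer: drain the queue; live processing would hook in here."""
    frames_q: queue.Queue = queue.Queue(maxsize=16)
    worker = threading.Thread(target=acquire, args=(n_frames, frames_q))
    worker.start()
    frames = []
    while (frame := frames_q.get()) is not None:
        frames.append(frame)  # e.g. display, save, or process the frame
    worker.join()
    return frames
```

The bounded queue provides back-pressure: if processing falls behind, the producer blocks rather than exhausting memory, which matters for long continuous recordings.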
What we offer:
- An interdisciplinary and collaborative environment at the intersection of engineering, neuroscience, and machine learning
- A highly motivated research team and a cutting-edge research project
- The potential for continuing work at the INI
How to apply:
Interested students with an EE background and experience in FPGA/GUI programming are especially encouraged to contact us. Please send a CV and a short description of your motivation and background (<0.5 page). If you have any questions about the project, do not hesitate to contact us (shih (at) ini.uzh.ch & bgrewe (at) ini.ethz.ch).