VIPS: ultra-low power VIsual Perception System

Open PhD and Postdoc positions


The Sensors Group at the Institute of Neuroinformatics, UZH-ETH Zurich, has open PhD and Postdoc positions for the project VIPS, funded for 4 years.

VIPS is a Swiss BRIDGE project partnering the Sensors Group (Prof. Tobi Delbruck) and the Swiss Center for Electronics and Microtechnology (CSEM) (Dr. Pierre-Francois Ruedi). BRIDGE offers new funding opportunities at the intersection of basic research and science-based innovation.


VIPS will develop an ultra-low power (ULP) visual perception system for battery- or self-powered applications. To achieve this goal, we will develop a deep neural network architecture optimized for ULP processing. This architecture will be implemented as a dedicated integrated circuit that interfaces to ULP vision sensors. The final demonstrators will perform face detection, people tracking, and people counting, combining the neural network classifier chip with a ULP image sensor dedicated to visual scene analysis.

The illustration shows a possible approach for realizing VIPS in the context of an always-on, always-quick fall detector that detects whether a person has fallen and cannot summon help.
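To make the always-on, always-quick idea concrete, here is a minimal sketch (not the actual VIPS design) of a two-stage pipeline in which a cheap per-frame activity gate decides when to wake the expensive classifier stage, so the classifier, and most of the power budget, stays idle between events. All function names and thresholds here are illustrative assumptions.

```python
def frame_activity(prev, curr, threshold=10):
    """Cheap always-on gate: mean absolute pixel change vs. the previous frame."""
    diff = sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)
    return diff > threshold

def classify_fall(frame):
    """Hypothetical stand-in for the expensive CNN stage.

    A real system would run quantized CNN inference here; this placeholder
    just flags frames containing bright pixels.
    """
    return "fall" if max(frame) > 200 else "no_fall"

def run_pipeline(frames):
    """Run the cheap gate on every frame; run the classifier only on active ones."""
    events, prev = [], frames[0]
    for curr in frames[1:]:
        if frame_activity(prev, curr):          # low-power stage, always on
            events.append(classify_fall(curr))  # high-power stage, rarely on
        prev = curr
    return events
```

The design choice illustrated is duty cycling: energy is dominated by how often the classifier runs, not by the gate, which is why the gate must be both cheap and always on.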


Recent progress in machine learning for computer vision has led to applications ranging from mobile phone object recognition to driverless cars, to name just two well-known examples. But existing systems burn far too much power for long-term battery- or solar-powered operation. Bringing vision-based artificial intelligence (AI) to such systems would open a new world of applications for people’s homes and public spaces, improving comfort, security, and environmental protection. A few examples are easily installable cordless fall detectors that make homes safer for elderly people, intelligent canes for blind people that can recognize objects, and smart wilderness cameras that monitor wildlife species. VIPS aims to bring such ultra-low power smart visual perception systems to Swiss industry.

Scientific and social context of the research project

VIPS has an overt technological goal of bringing ULP hardware for AI to the Swiss electronics industry. It also seeks to understand fundamental approaches used by animals to solve two life-critical but opposing goals: saving precious energy while remaining constantly alert to react quickly to the world. In today’s world, where we are increasingly surrounded by intelligent networked sensors, VIPS will enable ultra-low power intelligent visual perception in battery-powered devices. Extending battery lifetime to months or even years will enable many useful applications that are currently impossible.


  • Position 1 (PhD or Postdoc): Digital IP cores for 3D CNN acceleration: You will lead the design of digital architectures and circuits to accelerate CNN inference in the context of always-on, always-quick IoT vision. You will work with CSEM to integrate this IP into an ASIC. You will assist in prototyping this IP on FPGA with the ULP cameras developed by the Sensors Group (DVS) and CSEM (Ergo).
    • Requirements: Digital computer architecture, FPGA experience, computer vision, deep learning. Analog mixed signal design desirable for integration of passive infrared (PIR) presence sensor interface.
    • You will also participate in the application studies.
  • Position 2 (PhD or Postdoc): FPGA implementation and demonstrator applications: You will lead the practical application studies of the prototype VIPS processor developed by the Sensors Group.
    • Requirements: Embedded system design, logic design, computer vision, deep learning.


To indicate interest in a position, send email to Prof. Tobi Delbruck with a subject line containing “VIPS position”. You may wish to include the following material:

  1. Your email should briefly summarize your background, accomplishments, and particular interest in VIPS.
  2. Your CV, including TOEFL, GRE, project/publication accomplishments, and 1 or 2 possible references.
  3. Your grade transcripts from undergraduate and master’s programs.
© 2020 Institut für Neuroinformatik