How do animals process sensory information to control their motion, and how should one design sensor-based robot control systems? Answering these questions involves integrating biomechanics and dynamics with biological and engineering computation.
We study sensorimotor control of animal movement from a control-theoretic perspective: we use mathematical models of biomechanics, together with principles of control theory, to design perturbation experiments. The responses to these perturbations furnish a quantitative description of how the nervous system processes sensory information for control.
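As a rough illustration of this approach, one can estimate a system's gain and phase at a single perturbation frequency by projecting the stimulus and response time series onto a complex exponential at that frequency. This is a generic frequency-response sketch, not the laboratory's actual analysis pipeline; the function name and test signal are invented for the example.

```python
import numpy as np

def frequency_response(stimulus, response, freq, fs):
    """Estimate gain and phase of a system at one perturbation
    frequency from stimulus and response time series sampled at fs Hz.
    (Illustrative sketch, not the lab's actual pipeline.)"""
    t = np.arange(len(stimulus)) / fs
    # Single-frequency Fourier coefficient of each signal: project
    # onto the complex exponential at the perturbation frequency.
    basis = np.exp(-2j * np.pi * freq * t)
    S = np.dot(stimulus, basis)
    R = np.dot(response, basis)
    H = R / S                      # complex frequency response
    return np.abs(H), np.angle(H)  # gain, phase (radians)

# Example: a hypothetical gain-and-delay "plant" probed at 1 Hz.
fs, f = 200.0, 1.0
t = np.arange(0, 10, 1 / fs)
stim = np.sin(2 * np.pi * f * t)
resp = 0.5 * np.sin(2 * np.pi * f * (t - 0.05))  # gain 0.5, 50 ms lag
gain, phase = frequency_response(stim, resp, f, fs)
```

Repeating this at many perturbation frequencies yields a frequency-response (Bode-style) description of the sensorimotor transform.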
The LIMBS laboratory, in close collaboration with Prof. Eric Fortune, is investigating sensorimotor integration and control in weakly electric knifefish. With support from the National Science Foundation and the Office of Naval Research, we are studying a range of behaviors, from locomotion control to how these animals modulate their electric output during complex social interactions.
People: Prof. Noah Cowan, Prof. Eric Fortune, Sarah Stamper, Manu Madhav, Shahin Sefati
Alumni: Eatai Roth, Sean Carver, Yoni Silverman, Terrence Jao, Katie Zhuang
Myriad creatures rely on compliant tactile arrays for locomotion control, mapping, obstacle avoidance, and object recognition. Our laboratory is “reverse engineering” the neural controller for cockroach wall following to better understand sensorimotor integration in nature. In addition, we are building tactile sensors inspired by their biological analogs.
The LIMBS laboratory is investigating the mechanisms by which the human nervous system controls rhythmic dynamic behaviors, and locomotion in particular. We address rhythmic motor control in humans through a combination of experiments, modeling, and computational analyses. Our analyses are based on two experimental paradigms: juggling and human locomotion.
People: Prof. Noah Cowan, M. Mert Ankarali, Robert Nickl
Alumni: Jusuk Lee, Sean Carver
Vision-based control, also known as visual servoing, is a field of robotics in which a computer controls a robot’s motion under visual guidance, much like people do in everyday life when reaching for objects or walking through a cluttered room.
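The core idea can be sketched with a toy example: a proportional control law drives image-plane feature error to zero via the pseudoinverse of the interaction (image Jacobian) matrix. The scenario below, in which a camera translates in two axes to center a fixed point at known depth, is a deliberately simplified illustration; real visual servoing uses the full six-degree-of-freedom interaction matrix and measured image features. All names and numbers here are invented for the sketch.

```python
import numpy as np

def servo_step(cam_xy, point_xyz, lam=0.5, dt=0.1):
    """One step of a simplified 2-DOF image-based visual servo:
    the camera translates in x and y to center a fixed 3-D point.
    (Illustrative sketch, not a full 6-DOF implementation.)"""
    X, Y, Z = point_xyz
    # Normalized image coordinates of the point (pinhole model).
    s = np.array([(X - cam_xy[0]) / Z, (Y - cam_xy[1]) / Z])
    e = s - np.zeros(2)               # desired feature: image center
    L = -np.eye(2) / Z                # interaction matrix, pure translation
    v = -lam * np.linalg.pinv(L) @ e  # proportional visual-servo law
    return cam_xy + v * dt, np.linalg.norm(e)

cam = np.array([0.0, 0.0])
point = (0.4, -0.3, 2.0)              # target 2 m in front of the camera
for _ in range(100):
    cam, err = servo_step(cam, point)
```

Because the control law is proportional to the feature error, the error decays geometrically and the camera converges to the position that centers the point in the image.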
People: Prof. Noah Cowan, Prof. Greg Hager
Alumni: John Swensen, Vinutha Kallem, Maneesh Dewan