How do animals process sensory information to control their motion, and how should one design sensor-based robot control systems? Answering these questions involves integrating biomechanics and dynamics with biological and engineering computation.

We study the sensorimotor control of animal movement from a “control theoretic” perspective: we use mathematical models of biomechanics, together with principles of control theory, to design perturbation experiments. The responses to these perturbations furnish a quantitative description of how the nervous system processes sensory information for control.
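As a concrete illustration of this approach, the sketch below simulates a sum-of-sines perturbation experiment on a hypothetical first-order tracking behavior and recovers the empirical frequency response at the probe frequencies. The plant model, probe frequencies, and noise level are illustrative assumptions, not a description of any particular experiment in the lab.

```python
# Minimal sketch of perturbation-based system identification, assuming a
# hypothetical first-order tracking model; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Sum-of-sines perturbation: probe frequencies chosen to land on FFT bins
# of a 60 s trial, so each response component can be read off directly.
fs = 100.0                      # sample rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s trial
probe_freqs = np.array([0.1, 0.25, 0.55, 0.95, 2.05])  # Hz
phases = rng.uniform(0, 2 * np.pi, probe_freqs.size)
r = sum(np.sin(2 * np.pi * f * t + ph) for f, ph in zip(probe_freqs, phases))

# Hypothetical plant: first-order low-pass tracking dynamics plus noise.
tau = 0.3
y = np.zeros_like(r)
for k in range(1, t.size):
    y[k] = y[k - 1] + (r[k - 1] - y[k - 1]) * (1 / fs) / tau
y += 0.01 * rng.standard_normal(y.size)

# Empirical frequency response: ratio of output to input Fourier
# coefficients, evaluated only at the probe frequencies.
R, Y = np.fft.rfft(r), np.fft.rfft(y)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
idx = [np.argmin(np.abs(freqs - f)) for f in probe_freqs]
H = Y[idx] / R[idx]

for f, h in zip(probe_freqs, H):
    print(f"{f:5.2f} Hz  gain={np.abs(h):.2f}  "
          f"phase={np.degrees(np.angle(h)):7.1f} deg")
```

Fitting a parametric model to such gain and phase estimates is one common way to summarize how sensory feedback shapes the behavior.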

Sensorimotor Integration in Weakly Electric Fish


The LIMBS laboratory, in close collaboration with Prof. Eric Fortune, is investigating sensorimotor integration and control in weakly electric knifefish. With the support of the National Science Foundation and the Office of Naval Research, we are studying behaviors ranging from locomotor control to how these animals modulate their electric output during complex social interactions.

People: Prof. Noah Cowan, Prof. Eric Fortune, Erin Sutton, Manu Madhav
Alumni: Sarah Stamper, Shahin Sefati, Eatai Roth, Sean Carver, Yoni Silverman, Terrence Jao, Katie Zhuang

Antenna-Based Tactile Sensing for High-Speed Wall Following


Myriad creatures rely on compliant tactile arrays for locomotion control, mapping, obstacle avoidance, and object recognition. Our laboratory is “reverse engineering” the neural controller for cockroach wall following to better understand sensorimotor integration in nature. In addition, we are building tactile sensors inspired by their biological analogs.
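To make the control problem concrete, here is a minimal sketch of wall following under an assumed planar unicycle model, with proportional-derivative steering on a noisy antenna-like distance measurement. The model, gains, and noise level are illustrative assumptions, not the identified cockroach controller.

```python
# A hedged sketch: unicycle wall following with PD steering on the wall
# distance reported by an antenna-like sensor. Parameters are illustrative.
import numpy as np

dt, v = 0.01, 0.25            # time step (s), constant forward speed (m/s)
d_ref = 0.05                  # desired wall-following distance (m)
kp, kd = 40.0, 8.0            # PD gains on the distance error

x, y, theta = 0.0, 0.08, 0.0  # start 8 cm from a wall lying along y = 0
prev_err, derr = y - d_ref, 0.0
for _ in range(2000):
    d = y + 0.001 * np.random.randn()                 # noisy distance estimate
    err = d - d_ref
    derr = 0.9 * derr + 0.1 * (err - prev_err) / dt   # filtered derivative
    prev_err = err
    omega = -(kp * err + kd * derr)                   # steer back toward d_ref
    theta += omega * dt
    x += v * np.cos(theta) * dt                       # unicycle kinematics
    y += v * np.sin(theta) * dt

print(f"final wall distance: {y:.3f} m (target {d_ref:.3f} m)")
```

The derivative (velocity-like) term matters here: with proportional feedback alone, the heading dynamics would be poorly damped and the body would weave along the wall.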

People: Prof. Noah Cowan, Alican Demir, Ned Samson, Brent Dolan
Alumni: Brett Kutscher, Kelly Canfield, Jusuk Lee, Andrew Lamperski, Owen Loh, Nick Keller

Human Rhythmic Movement: Control, Timekeeping, and Statistical Modeling


How do humans control rhythmic dynamic behaviors such as walking and juggling? We address this central question through a combination of virtual reality experiments, systems theoretic modeling, and computational analyses.
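A classic systems-theoretic model in this area is a ball bouncing on a sinusoidally moving paddle. The sketch below simulates that hybrid system under assumed parameters (not fit to human data); when the paddle is decelerating at impact, period-one bouncing is passively stable, and the inter-impact interval should settle near the paddle period.

```python
# A minimal sketch of a rhythmic hybrid system: a ball bouncing on a
# sinusoidally moving paddle. All parameters are illustrative assumptions.
import numpy as np

g, alpha = 9.81, 0.8             # gravity (m/s^2), coefficient of restitution
A, w = 0.09, 2 * np.pi           # paddle amplitude (m), frequency (rad/s): 1 Hz

def paddle(t):
    """Paddle position and velocity at time t."""
    return A * np.sin(w * t), A * w * np.cos(w * t)

# Event-stepped simulation: ballistic flight plus an impact law at contact.
t, z, v = 0.0, 1.2, 0.0          # time, ball height, ball velocity
dt = 1e-4
impacts = []
while len(impacts) < 20 and t < 60:
    t += dt
    v -= g * dt                  # free flight under gravity
    z += v * dt
    p, pv = paddle(t)
    if z <= p and v < pv:        # ball meets paddle, moving down relative to it
        v = pv - alpha * (v - pv)  # restitution law on the relative velocity
        z = p
        impacts.append(t)

# Later inter-impact intervals should approach the 1 s paddle period.
print(np.diff(impacts)[-5:])
```

Perturbing such a model and comparing its recovery to human data is one way to separate passive (mechanical) stabilization from active, sensory-driven control.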

People: Prof. Noah Cowan, Robert Nickl, Nicole Ortega
Alumni: M. Mert Ankarali, Ned Samson, Sean Carver, Andy Lamperski, Rob Grande, Avik De, Jusuk Lee

Vision-Based Control


Vision-based control, also known as visual servoing, is a field of robotics in which a computer controls a robot’s motion under visual guidance, much like people do in everyday life when reaching for objects or walking through a cluttered room.
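To illustrate the basic idea, here is a minimal sketch of classic image-based visual servoing on four point features, assuming a pinhole camera with known feature depths. The target geometry and gain are illustrative, and this generic pseudo-inverse controller is a textbook baseline, not the kernel-based or navigation-function methods of the publications below.

```python
# A hedged sketch of image-based visual servoing (IBVS): drive image-plane
# feature errors to zero with a camera twist computed from the standard
# point-feature interaction matrix. Geometry and gain are illustrative.
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix of a point feature (x, y) at depth Z, mapping
    camera twist (vx, vy, vz, wx, wy, wz) to image feature velocity."""
    return np.array([
        [-1 / Z,      0, x / Z,        x * y, -(1 + x**2),  y],
        [     0, -1 / Z, y / Z, 1 + y**2,          -x * y, -x],
    ])

# Square target expressed in the camera frame (meters); the goal view is
# fronto-parallel at 1 m depth, i.e., features at (+-0.1, +-0.1).
P = np.array([[ 0.1,  0.1, 1.4],
              [-0.1,  0.1, 1.5],
              [-0.1, -0.1, 1.6],
              [ 0.1, -0.1, 1.5]])
s_star = np.array([0.1, 0.1, -0.1, 0.1, -0.1, -0.1, 0.1, -0.1])

lam, dt = 0.5, 0.02
for _ in range(600):
    s = (P[:, :2] / P[:, 2:3]).ravel()           # pinhole projection x = X/Z
    e = s - s_star
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(s.reshape(-1, 2), P[:, 2])])
    twist = -lam * np.linalg.pinv(L) @ e         # classic IBVS control law
    v, w = twist[:3], twist[3:]
    # Moving the camera by (v, w) moves scene points by -v - w x P.
    P += (-v - np.cross(np.tile(w, (4, 1)), P)) * dt

print("final feature error:",
      np.linalg.norm((P[:, :2] / P[:, 2:3]).ravel() - s_star))
```

Much of the research interest lies in what this baseline leaves out, such as guaranteeing convergence from large initial errors and keeping features in the field of view.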

People: Prof. Noah Cowan, Prof. Greg Hager
Alumni: John Swensen, Vinutha Kallem, Maneesh Dewan
Representative Publications: 

V. Kallem, M. Dewan, J. P. Swensen, G. D. Hager, and N. J. Cowan. “Kernel-based visual servoing.” Proc IEEE/RSJ Int Conf Intell Robots Syst (IROS), San Diego, CA, USA, Oct 29 – Nov 2, 2007.

J. P. Swensen and N. J. Cowan. “An almost global estimator on SO(3) with measurement on S^2.” Proc Amer Control Conf, Montreal, Canada, 2012.

N. J. Cowan, J. Weingarten, and D. E. Koditschek. “Visual servoing via navigation functions.” IEEE Trans Robot Automat, 18(4):521-533, 2002.

N. J. Cowan and D. E. Chang. “Geometric visual servoing.” IEEE Trans Robot, 21(6):1128-1138, 2005.