Auckland Bioengineering Institute

Laboratory for Animate Technologies

Imagine a machine that can laugh and cry, learn and dream, and express its inner responses to how it perceives you to feel. It expresses itself in a natural manner and also lets you visualise the mental imagery emerging in its mind.

The Laboratory for Animate Technologies is creating ‘live’ computational models of the face and brain by combining Bioengineering, Computational and Theoretical Neuroscience, Artificial Intelligence and Interactive Computer Graphics Research.

We are developing multidisciplinary technologies to create interactive, autonomously animated systems that will define the next generation of human-computer interaction and facial animation.


“If I had my time again I’d want to spend it in this lab” - Alvy Ray Smith, Co-founder of Pixar (on his visit to the Laboratory for Animate Technologies).


Our Research



Applications of the technology span both pure research and commercial applied research, including:

∎    Exploration and embodiment of theories of brain function and behaviour through computational modelling.

∎    New, more natural ways of interacting with technology through expressive human-machine communication.


Neurobehavioural Animation

We believe the best way to simulate biological behaviour is through biological models.

We model the brain processes that give rise to behaviour and social learning, and use them to animate lifelike models of the face that can interact with you.
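As a rough illustration of this idea only (not the Laboratory's actual architecture), the sketch below shows a toy affective state model whose leaky valence/arousal dynamics drive blendshape weights on a hypothetical face rig; all names, mappings and parameter values are illustrative assumptions.

```python
# A minimal illustrative sketch: a simple internal state model drives
# blendshape weights on a hypothetical face rig.
import numpy as np

class AffectiveState:
    """Two-dimensional valence/arousal state with leaky decay toward baseline."""
    def __init__(self, decay=0.9):
        self.state = np.zeros(2)   # [valence, arousal]
        self.decay = decay

    def step(self, stimulus):
        # Leaky integration: the old state decays while new input accumulates.
        self.state = self.decay * self.state + (1.0 - self.decay) * np.asarray(stimulus)
        return self.state

def state_to_blendshapes(state):
    """Map the internal state to facial blendshape weights (hypothetical rig)."""
    valence, arousal = state
    return {
        "smile":      max(0.0,  valence),
        "frown":      max(0.0, -valence),
        "brow_raise": max(0.0,  arousal),
    }

# Example: a sustained positive social stimulus gradually produces a smile.
model = AffectiveState()
for _ in range(10):
    weights = state_to_blendshapes(model.step([1.0, 0.3]))
print(weights)
```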

BabyX Evolution

BabyX is an interactive, animated virtual infant prototype: a computer-generated psychobiological simulation under development in the Laboratory for Animate Technologies. It is an experimental vehicle incorporating computational models of basic neural systems involved in interactive behaviour and learning.

These models are embodied through advanced 3D computer graphics models of the face and upper body of an infant. The system can analyse video and audio inputs in real time to react to the caregiver’s or peer’s behaviour using behavioural models.

BabyX embodies many of the technologies we work on in the Laboratory and is under continuous development in its neural models, its sensing systems and the realism of its real-time computer graphics.

Video: BabyX v3.0


Interactive Modelling and Simulation of Biologically Based Neural Networks

We create interactive models of neural systems and neuroanatomy, enabling visualisation of the internal processes, generated by computational simulation, that give rise to behaviour.

The Auckland Face Simulator

The Auckland Face Simulator is being developed to cost-effectively create extremely realistic and precisely controllable models of the human face and its expressive dynamics for psychology research.
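As a hedged illustration of what precisely controllable expressive dynamics can mean for stimulus design (an assumed parameterisation for this sketch, not the simulator's interface), the snippet below builds an expression intensity curve with explicit onset, apex and offset durations that could drive a blendshape weight over time.

```python
# Illustrative sketch: a controllable expression-intensity profile for a stimulus.
import numpy as np

def expression_profile(onset, apex, offset, fps=60, peak=1.0):
    """Piecewise intensity curve: smooth rise, hold, smooth fall (durations in seconds)."""
    rise = peak * 0.5 * (1 - np.cos(np.linspace(0, np.pi, int(onset * fps))))
    hold = np.full(int(apex * fps), peak)
    fall = peak * 0.5 * (1 + np.cos(np.linspace(0, np.pi, int(offset * fps))))
    return np.concatenate([rise, hold, fall])

# Example: a smile that builds over 0.5 s, holds for 1 s and fades over 0.8 s.
smile_weight = expression_profile(onset=0.5, apex=1.0, offset=0.8)
print(len(smile_weight), smile_weight.max())
```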

Face Simulation and Biomechanics


We are developing the technology to simulate faces both inside and out: how they move, how they look, and even their underlying anatomical structure.

Brain Language

We are developing Brain Language (BL), a visual modelling methodology for the construction, visualisation and animation of neural systems, and a novel simulation environment for neural models.

This allows users to create animations and real-time visualisations from biologically based neural network models, so that simulation effects can be viewed in an interactive context. Such a visual environment is not only suitable for visualising a simulation; it is also ideal for model development.
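As a rough indication of the kind of workflow such an environment supports (this is not BL itself, and none of its interface is shown), the sketch below simulates a small leaky integrate-and-fire network and refreshes a simple view of its membrane potentials while the simulation runs; all parameter values are illustrative assumptions.

```python
# Illustrative sketch: a toy biologically based neural network simulated
# and visualised interactively.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n, dt, tau, v_thresh, v_reset = 50, 1.0, 20.0, 1.0, 0.0
weights = rng.normal(0.0, 0.05, size=(n, n))   # random recurrent connectivity
v = np.zeros(n)                                # membrane potentials
spikes = np.zeros(n)

plt.ion()
fig, ax = plt.subplots()
img = ax.imshow(v.reshape(5, 10), vmin=0, vmax=1, cmap="viridis")
ax.set_title("Membrane potentials")

for t in range(500):
    drive = rng.uniform(0.0, 0.08, size=n)                 # external input
    v += dt / tau * (-v) + drive + weights @ spikes        # leaky integration
    spikes = (v >= v_thresh).astype(float)                 # threshold crossing
    v[spikes > 0] = v_reset                                # reset after a spike
    if t % 10 == 0:                                        # refresh the view
        img.set_data(v.reshape(5, 10))
        plt.pause(0.01)
```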

Facial Tracking and Analysis

We are developing computer-vision-based systems to track and analyse facial expressions, and state-of-the-art algorithms to solve for individual facial muscle activations.

Applications range from real-time expression recognition to microdynamic interaction analysis for psychology research.
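One common formulation of the activation-solving step, shown here only as a hedged sketch and not necessarily the Laboratory's algorithm, treats tracked landmark displacements as a non-negative combination of per-muscle displacement basis shapes and recovers the activations by non-negative least squares; the basis and observation data below are random placeholders.

```python
# Illustrative sketch: recover non-negative muscle/expression activations
# from tracked landmark displacements via non-negative least squares.
import numpy as np
from scipy.optimize import nnls

n_landmarks, n_muscles = 68, 12
rng = np.random.default_rng(1)

# Basis matrix: column j holds the 2D landmark displacement produced by
# fully activating shape j. Random placeholder data here; in practice it
# would come from a face model.
basis = rng.normal(size=(2 * n_landmarks, n_muscles))

# Simulated "tracked" frame: a mixture of two activations plus noise.
true_activation = np.zeros(n_muscles)
true_activation[[2, 7]] = [0.8, 0.3]
observed = basis @ true_activation + rng.normal(scale=0.01, size=2 * n_landmarks)

# Solve min ||basis @ a - observed|| subject to a >= 0.
activation, residual = nnls(basis, observed)
print(np.round(activation, 2))
```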

News and media coverage



BabyX Version 3


Dr. Mark Sagar


Hello World, Episode 1: New Zealand
Article on Bloomberg by Ashlee Vance, 23 March 2016.

BabyX and the Auckland Face Simulator both selected for SIGGRAPH 2015 Real-Time Live!
Article on ABI website, 23 July 2015. BabyX at SIGGRAPH 2015, Auckland Face Simulator at SIGGRAPH 2015.

Auckland Face Simulator
Article on fxguide by Mike Seymour, 27 May 2015

Baby X, The Intelligent Toddler Simulation, Is Getting Smarter Every Day
Blog on The Creators Project, by Beckett Mufson, 22 August 2014

How are we advancing artificial intelligence?
Research Works Wonders (Video)

Prince Charles admires Kiwi innovation, economic success
Article in the National Business Review, 24 March 2014

Look who's Thinking
Article in the New Zealand Herald, 10 August 2013

BabyX at TEDx Auckland 2013

BBC World News Impact: interview with Dr Mark Sagar about BabyX

Dr Mark Sagar - Screen Life
Article on Morgo, 2012

Selected publications



Sagar, M., Seymour, M., Henderson, A. M. E., Corballis, P. C., Bullivant, D., Robertson, P., Efimov, O., Jawed, K., Kalarot, R., Ollewagen, W., & Wu, T. (in press). Creating connection with autonomous facial animation. Communications of the ACM.

Sagar, M., & Broadbent, E. (2016). Participatory medicine: Model-based tools for engaging and empowering the individual. Royal Society Interface Focus, 6, 20150092.

Sagar, M., Robertson, P., Bullivant, D., Efimov, O., Jawed, K., Kalarot, R., & Wu, T. (2015). BL: A Visual Computing Framework for Interactive Neural System Models of Embodied Cognition and Face to Face Social Learning. In Unconventional Computation and Natural Computation (pp. 71-88). Springer International Publishing.

Sagar, M., Bullivant, D., Robertson, P., Efimov, O., Jawed, K., Kalarot, R., & Wu, T. (2014, December). A neurobehavioural framework for autonomous animation of virtual human faces. In SIGGRAPH Asia 2014 Autonomous Virtual Humans and Social Robots for Telepresence. ACM.

Sagar, M., Bullivant, D., Robertson, P., Efimov, O., Jawed, K., Kalarot, R., & Wu, T. (2014). Embodying models of expressive behaviour and learning with a biomimetic virtual infant. In Proceedings of the 4th International Conference on Development and Learning and on Epigenetic Robotics (IEEE ICDL-EPIROB 2014). IEEE.



Principal supervisor: Associate Professor Mark Sagar

We are interested in hearing from talented, motivated and capable graduates who want to do groundbreaking work.

More information is available on the ABI Postgraduate Research Opportunities page.

LAT Members



Oleg Efimov
Computer Graphic Artist



Dr Khurram Jawed
Computer Vision Research Engineer



Dr Ratheesh Kalarot
Research and Development Software Engineer



Werner Ollewagen
Computer Graphic Artist



Dr Tim Wu
Research and Development Software Engineer