Headroid1 – a face tracking robot head

The video below introduces Headroid1. This face-tracking robot will grow into a larger system that can follow people’s faces, detect emotions and react to engage with visitors.

The above system uses OpenCV’s face detection (via the Python bindings and facedetect.py) to work out whether the face is in the centre of the screen. If the camera needs to move, it talks via pySerial to BotBuilder‘s ServoBoard to pan or tilt the camera until the face is back in the centre of the screen.
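For the curious, here’s a minimal sketch of that loop. It’s an illustration rather than Headroid1’s actual code: it uses the newer cv2 API instead of the old cv bindings and facedetect.py, the serial port name, baud rate, command terminator and step sizes are my guesses, and the ‘90a’-style text commands are the ones described further down.

    # Illustrative detect-and-steer loop (not the original Headroid1 code).
    import cv2
    import serial

    cam = cv2.VideoCapture(0)                   # the webcam
    port = serial.Serial('/dev/ttyUSB0', 9600)  # ServoBoard on USB serial (port name assumed)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

    pan, tilt = 90, 90                          # start both servos centred

    while True:
        ok, frame = cam.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces):
            x, y, w, h = faces[0]
            err_x = (x + w / 2) - frame.shape[1] / 2   # pixels off-centre horizontally
            err_y = (y + h / 2) - frame.shape[0] / 2   # and vertically
            # Bigger error -> bigger step; signs depend on how the camera is mounted.
            pan = max(0, min(180, pan - int(err_x / 40)))
            tilt = max(0, min(180, tilt + int(err_y / 40)))
            # '90a'-style text commands; any terminator the board expects is omitted here.
            port.write(('%da%db' % (pan, tilt)).encode())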

Update – see Building A Face Tracking Robot In An Afternoon for full details to build your own Headroid1.

Headroid is pretty good at tracking faces as long as there’s no glare; he can see people from 1 foot up to about 8 feet from the camera. He moves at different speeds depending on how far your face is from the centre of the screen and stops, holding a stable picture, when you’re back at the centre of his attention. The smile/frown detector, which will follow, will add another layer of behaviour.
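That variable-speed, hold-still-when-centred behaviour can come from a dead zone around the centre plus a step size that grows with the error. The sketch below shows the idea; the thresholds and step sizes are illustrative guesses rather than Headroid’s actual tuning.

    def servo_step(err_pixels, dead_zone=20):
        """Degrees to move a servo for a given pixel error from screen centre."""
        if abs(err_pixels) < dead_zone:
            return 0                               # close enough: hold still, stable picture
        step = 1 if abs(err_pixels) < 100 else 4   # further off-centre: move faster
        return step if err_pixels > 0 else -step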

Heather (founder of Silicon Beach Training) used Headroid1 (called Robocam in her video) at Likemind coffee this morning; she’s written up the event:

Andy White (@doctorpod) also did a quick 2 minute MP3 interview with me via audioboo.

Later, over coffee, Danny Hope and I discussed (with Headroid looking on) some ideas for tracking people, watching for attention, monitoring for frustration and concentration, and generally playing with ways people might interact with this little chap:

The above was built in collaboration with BuildBrighton; there’s some discussion about it in this thread. The camera is a Philips SPC900NC, which works using macam on my Mac (and runs on Linux and Windows too). The ServoBoard has a super-simple interface – you send it commands like ’90a’ (turn servo A to 90 degrees) as text and ‘it just works’ – which makes interactive testing a doddle.
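Given that interface, poking the board by hand from a Python prompt can be as simple as this (the port name, baud rate and which servo is pan versus tilt are my assumptions for the illustration):

    import serial

    port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # port name and baud assumed
    port.write(b'90a')   # turn servo A (pan) to 90 degrees
    port.write(b'45b')   # turn servo B (assumed to be tilt) to 45 degrees
    port.close()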

Update – the blog for the A.I. Cookbook is now active, more A.I. and robot updates will occur there.

Reference material:

The following should help you move forwards:


Ian is a Chief Interim Data Scientist via his Mor Consulting. Sign up for Data Science tutorials in London and to hear about his data science thoughts and jobs. He lives in London, is walked by his high-energy Springer Spaniel and is a consumer of fine coffees.

9 Comments

  • Wow - that works great! Great stuff Ian.
  • Awww, thanks, I did have a silly grin when I got it working - it only took a few hours of hacking on the Python side. The smile detector will be a bit more work but looks eminently do-able.
  • Looks like there's a longer video over here: http://www.youtube.com/watch?v=IJgV_P1YFjQ
  • Much obliged John, post updated.
  • dave
    You made that thing amazingly fast Ian. What most readers probably won't realise is that 2 weeks ago this was an idea, and Ian has a day job.
  • It took 2 hours for the first software (after compiling OpenCV with Python bindings) and another 2 hours of tuning (for variable speeds and 2-axis control). I'll post the recipe to http://blog.aicookbook.com/ shortly (be warned that the A.I. Cookbook site is 24 hours old and empty at present!).
  • Ian, that's really great. Also slightly sinister!
  • Luka
    Hi Ian, can this be turned into a stand-alone device, without the PC? Please contact me, Luka
  • Hello. I have an Arduino with servos on PWM pins 9 and 10. Can anyone help with the Arduino code to read the serial data and move the servos? Pseudocode that works with the http://aicookbook.com/wiki/Headroid1 code would be fine. Thanks.