In Body Tracking, Interfaces matter

I

Earlier today I received a press release. It advertised the Quentiq, a Quantified Self / body tracking suite consisting of sensors, a web service, an iPhone app, and some kind of tracking gadget.

[Video: QUENTIQ Health Score – English, from QUENTIQ on Vimeo]

Dramatic music aside, this is where this otherwise ambitious project is bound to fail:

[Image: QUENTIQ Health Score device]

Quentiq asks you to carry around this hideous little box with a lousy display and thumb through long, list-based text menus.

When tracking day-to-day behavior, subtle & implicit data gathering is key. If you require your users to punch in a bunch of data points manually, you’ll never get there. No, I don’t think I’m over-simplifying.

Take the RunKeeper app for Android, for example. You hit the start button when you head out for a run, and the stop button when you get back home. That’s it. RunKeeper handles the rest and gives you this as output:

[Image: Running Activity 4.44 km | RunKeeper]

Or Massive Health’s first app, The Eatery:

[Image: Screenshot of Massive Health’s The Eatery]

The ideal version would have you snap a photo of your meal and get an analysis back. In practice, that’s not how it works: you still have to enter a whole bunch of data manually, which frustrates at least the users I know.

Anyway, that’s all just by way of reminding everybody building body tracking apps right now not to rely too heavily on active, explicit user input. You’ll make your users’ lives, and your own, much easier.

Quantified Self Europe & More Thoughts

Two more announcements.

First, this coming weekend I’ll be at Quantified Self Conference Europe. Ping me if you’re around.

Second, over at the Third Wave blog we just kicked off a series of blog posts about body tracking and the Quantified Self. The articles will go online one by one over the next couple of weeks; follow this link to find them all.
