SolidCon notes (part 1)

It’s been great to catch up with all the folks in the field in one place, and to check in on what’s going on in San Francisco.

Below, for future reference and largely unsorted & unedited, some notes & links. This is the quick & dirty version, I’m paraphrasing roughly and hope I’m not misrepresenting. Proceed at your own risk.

Enter: Bio

Interesting to see bio entering the picture in such a big way, as part of an emerging hardware-software-bio ecosystem and a resource to be programmed, one that comes with its own potential and challenges. Super exciting, can’t wait to learn more about it.

Joi Ito’s opening

Computational design + additive manufacturing + materials engineering + synthetic biology = a new type of design (HW+SW+BIO)

The Atlantic

Fun fact: Impeccably timed, this article, The Internet of Things You Don’t Really Need, is critical of most wearables and overall gadget-ism.

Building a fab for synthetic biology

Joe Jacobson (MIT Media Lab) gave a solid primer on synthetic biology. Very accessible, focused on the opportunities that cheaper tools, like gene printers, give us now. Just deep enough for me to realize how much I need to learn before understanding any of it for real.

Some related reading: http://search.oreilly.com/?q=synthetic+biology&x=0&y=0

Robot swarms for automated construction

Justin Werfel (Harvard University, Wyss Institute for Biologically Inspired Engineering)

Session description: “Social insects like ants and termites build huge, complex structures through the collective actions of millions of independent agents. How could we engineer systems that harness that kind of power? I’ll describe how we built and programmed a decentralized system of construction robots that takes a blueprint of a desired final structure as the only input, and provably builds that structure.”

You don’t know the number of bots in the swarm, so central control isn’t desirable here; every individual bot has to sense and decide autonomously. The bots are very limited, but you can get reliable, predictable emergent outcomes by programming them to follow a specific set of relatively simple rules.

For swarms, there are 2 kinds of rules robots will follow: traffic rules and safety rules.

Think emergent construction that functions loosely like an anthill.
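Not from the talk, but a toy Python sketch to make the local-rules idea concrete (this is not the actual TERMES rule set, and the blueprint, seed, and robot count are made up): each simulated robot looks only at its immediate neighborhood and places a block on an unbuilt blueprint site that already touches the built structure, with the blueprint as the only shared input.

```python
"""Toy sketch of decentralized, rule-based construction (not the actual
TERMES algorithm): each simulated robot sees only its local neighborhood
and places a block on a blueprint site that touches the already-built
structure. The blueprint is the only shared input."""
import random

BLUEPRINT = {            # desired structure: set of (x, y) cells to fill
    (0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2),
}
SEED = (0, 0)            # starting block every robot can find

def neighbors(cell):
    x, y = cell
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def robot_step(built):
    """One robot's decision, made from purely local information:
    pick a random unbuilt blueprint site adjacent to the built structure."""
    frontier = [c for c in BLUEPRINT - built
                if any(n in built for n in neighbors(c))]
    if frontier:
        built.add(random.choice(frontier))

def simulate(num_robots=3):
    built = {SEED}
    rounds = 0
    while built != BLUEPRINT:
        for _ in range(num_robots):   # each robot acts independently
            robot_step(built)
        rounds += 1
    return rounds

if __name__ == "__main__":
    print(f"Blueprint completed in {simulate()} rounds")
```

The real system adds ordering and traversal constraints so the robots can always physically reach a site and never build themselves into a corner, but the flavor is the same: global structure emerges from purely local decisions.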

Conversational IoT

Nick O’Leary of IBM Emerging Technology Services gave a playful deep dive into conversational IoT, and how human-readable, structured language can be a key to unlocking it.

See: Tom Coates’ blog post from last year on interacting with connected objects

Check out this paper on Conversational Sensing (2014)
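To make the “human-readable, structured language” idea a bit more concrete, here’s a hypothetical sketch (not Nick O’Leary’s actual implementation; the device name and fields are invented): a device announces its state as a short sentence for people, bundled with the structured fields a rules engine could act on.

```python
"""Hypothetical sketch of a 'conversational' device message: a short,
human-readable sentence alongside a machine-parsable payload, so the
same message works for people and for automation."""
import json
from datetime import datetime, timezone

def device_says(device, state, reason):
    """Compose a status message: readable text plus structured fields."""
    text = f"{device} is now {state} because {reason}."
    payload = {
        "device": device,
        "state": state,
        "reason": reason,
        "at": datetime.now(timezone.utc).isoformat(),
        "text": text,               # the human-readable rendering
    }
    return json.dumps(payload, indent=2)

if __name__ == "__main__":
    print(device_says("hallway-light", "off", "nobody has moved for 10 minutes"))
```

The appeal is that the same message reads naturally in a chat log or activity feed and still parses cleanly for automation.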

Building a beautiful future: Consumer biotechnology and the power of “Wow!”

Keira Havens and Nikolai Braun (Revolution Bioengineering)

Session description: “The conversation around bioengineering alternates between visions of a truly sustainable society in the far future, and unknown dystopian horrors just around the corner. How do we transcend these stereotypes? We make biotechnology beautiful and personal. Beauty captures the imagination in a way that practical and necessary advances can’t – we’re leading the way with color-changing flowers.”

We can now get a flower to change color uni-directionally because we understand very well how that works.

But what’s the overall vision for applied biology (in the widest sense)?

  • Health care?
  • Agriculture, like more nutritious, more productive plants?
  • Sustainability, like oceans/fishing?
  • Architecture and building, like reducing the amount of cement needed to build things? (Cement has a very large carbon footprint.)

There’s still so much that’s not yet understood: E. coli, after being studied for 50 years or so, still has around 1,000 genes whose function nobody knows.

Analytics for large-scale analysis of gene data, etc., are needed and could unlock huge potential.

But bioengineering has a bad reputation because of black sheep in the industry, which is a constant threat: it can lead to roadblocks that get in the way of the field’s positive effects. How can the image be changed for the better? Rev Bio wants to try to create beautiful things.

Interacting with a world of connected objects

Tom Coates of Thington

  • Dramatic growth in computing power, the number of computers in use, and the ubiquity of connected devices over the last few decades, told through hilarious pics of young Tom.
  • Connectivity sneaks into just about everything (CES is full of connected everything; Samsung announcing that all their products will be part of the IoT…)
  • Just like electricity was added to every appliance in the household, so is connectivity: “Anywhere computers can take hold, they will take root and stay.”
  • Matt Rolandson, partner at SF design firm Ammunition: “You should use the network to amplify a tool’s core purpose, not to be another web browser or Twitter client.”
  • Current state of affairs is to remote control connected things through smartphones. This works, kinda, but isn’t terribly exciting. We can do better.
  • For other relevant perspectives on embedded interaction, tangible & embodied computing, etc., check out: David Rose, Paul Dourish, Durrell Bishop, Natalie Jeremijenko, Hiroshi Ishii
  • Is the idea of merging the service layer and the object itself desirable? It might miss the point, namely the potential power of connected products.
  • Two opposing models of thought: Bring connectedness/interaction into the object vs extending the object into the IoT (I’m very roughly paraphrasing)
  • Relearning a different UI per object is tough and doesn’t scale. Can we learn from general-purpose computing? Can one UI rule them all? (Again, very roughly paraphrasing.)
  • The true innovation and value might happen (or be hiding) in the service layer.
  • Example: Zipcar. Everything interesting there happens on the service layer, the hardware is basically boring.
  • Interaction migrates from the device to a more appropriate context: illustrations of historic (and funny, awkward) light switches based on pulleys and other complex contraptions.
  • Current smart home appliances aren’t terribly smart when it comes to authorization, context, use by various people. (Example: Can former guests still control the smart light?)
  • How do we get connected objects to do the things we want them to do? How can devices communicate their assumptions and “learned behaviors” to the user so it’s not confusing or annoying?
  • The experience of House of Coates was interesting (emotionally and rationally) and not just to Tom & Matt.
  • Beautiful seams.
  • “Carve nature at its joints” (Plato) v Minimum Cut Algorithms
  • Which parts/features/buttons/interactions should stay on the object? Which could/should be somewhere else? The best way isn’t yet clear, but now is when the discussion happens across the industry. Important, powerful moment!
  • The world’s one big computer. Transformative moment, esp. for UX.

Misc startups/sites/projects/links to check out

  • Solidcon startup showcase
  • Ammunition, SF-based design/product company
  • June oven, a smart oven that sees, senses & measures the food inside, integrates with a cooking app, learns, and looks pretty good
  • Proteus (healthcare: an ingestible sensor with a battery powered by stomach acid that wirelessly transmits data to a band-aid-like patch)
  • Autodesk Tinkerplay (design app for kids on the iPad)
  • alike.io social wristbands
  • Transformair, a connected air purifier that destroys indoor air pollutants

(Notes for day 2 to follow.)
