Tag: ethics

How to build a responsible Internet of Things

Over the last few years, we have seen an explosion of new products and services that bridge the gap between the internet and the physical world: The Internet of Things (IoT for short). IoT increasingly has touch points with all aspects of our lives whether we are aware of it or not.

In the words of security researcher Bruce Schneier: “The internet is no longer a web that we connect to. Instead, it’s a computerized, networked, and interconnected world that we live in. This is the future, and what we’re calling the Internet of Things.”1

But IoT consists of computers, and computers are often insecure, and so our world becomes more insecure—or at the very least, more complex. And thus users of connected devices today have a lot to worry about (because smart speakers and their built-in personal digital assistants are particularly popular at the moment, we’ll use those as an example):

Could their smart speaker be hacked by criminals? Can governments listen in on their conversations? Is the device always listening, and if so, what does it do with the data? Which organizations get to access the data these assistants gather from and about them? What are the manufacturers and potential third parties going to do with that data? Which rights do users retain, and which do they give up? What happens if the company that sold the assistant goes bankrupt, or decides not to support the service any longer?

Or phrased a little more abstractly2: Does this device do what I expect (does it function)? Does it do anything I wouldn’t normally expect (is it a Trojan horse)? Is the organization that runs the service trustworthy? Does that organization have trustworthy, reliable processes in place to protect me and my data? These are just some of the questions consumers face today, and they face them constantly.

Trust and expectations in IoT. Image: Peter Bihr/The Waving Cat

Earning (back) that user trust is essential. Not just for any organization that develops and sells connected products, but for the whole ecosystem.

Honor the spirit of the social contract

User trust needs to be earned. Too many times have users clicked “agree” on some obscure, lengthy terms of service (ToS) or end user license agreement (EULA) without understanding the underlying contract. Too many times have they waived their rights, giving empty consent. This has led to a general distrust—if not in the companies themselves then certainly in the system. No user today feels empowered to negotiate a contractual relationship with a tech company as an equal—because they can’t.

Whenever some scandal blows up and creates damaging PR, the companies slowly backtrack, but in too many cases they were, legally speaking, within their rights: Nobody understood the contract, yet the abstract product language suggests a spirit of mutual goodwill between the product company and its users that the letter of the contract does not honor.

So short and sweet: Honor the spirit of the social contract that ties companies and their users together. Make the letter of the contract match that spirit, not the other way round. Earning back users’ trust will not just make the ecosystem healthier and more robust, it will also pay huge dividends over time in brand building, retention, and, well, user trust.

Respect the user

Users aren’t just an anonymous, homogeneous mass. They are people, individuals with diverse backgrounds and interests. Building technical systems at scale means having to balance individual interests with automation and standardization.

Good product teams put in the effort to do user research and understand their users better: What are their interests, what are they trying to get out of a product and why, how might they use it? Are they trying to use it as intended or in interesting new ways? Do they understand the tradeoffs involved in using a product? These are all questions that basic but solid user research would easily cover, and then some. This understanding is a first step towards respecting the user.

There’s more to it, of course: Offering good customer service, being transparent about user choices, allowing users to control their own data. This isn’t an exhaustive list, and even the most extensive checklist wouldn’t do much good here: Respect isn’t a list of actions, it’s a mindset to apply to a relationship.

Offer strong privacy & data protection

Privacy and data protection is a tricky area, and one where screwing up is easy (and particularly damaging for all parties involved).

Protecting user data is essential. But what that means is not always obvious. Here are some things that user data might need to be protected from:

  • Criminal hacking
  • Software bugs that leak data
  • Unwarranted government surveillance
  • Commercial third parties
  • The monetization team
  • Certain business models

Some of these fall squarely within the responsibility of the security team. Others depend on the legal arrangements around how the organization is allowed (read: allows itself) to use user data: the terms of service. Others yet require business incentives to be aligned with users’ interests.
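
One practical way to back this up in engineering is data minimization: even data the organization legitimately collects can be stripped down and pseudonymized before it reaches analytics or any third party. The Python sketch below is purely illustrative; the field names, the allow-list, and the key handling are hypothetical assumptions rather than anything from a specific product. It shows the general idea of dropping unneeded fields and replacing a stable user ID with a keyed hash:

```python
import hashlib
import hmac

# Hypothetical secret held server-side by the security team; never shipped to devices.
PSEUDONYMIZATION_KEY = b"rotate-me-regularly"

# Hypothetical allow-list: the only fields the analytics team actually needs.
ALLOWED_FIELDS = {"event_type", "timestamp", "firmware_version"}


def pseudonymize_user_id(user_id: str) -> str:
    """Replace a stable user ID with a keyed hash, so analytics can count
    unique users without being able to reverse the mapping."""
    return hmac.new(PSEUDONYMIZATION_KEY, user_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()


def minimize_event(raw_event: dict) -> dict:
    """Drop everything outside the allow-list and swap the raw user ID
    for a pseudonym before the event is stored or shared."""
    event = {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}
    event["user_pseudonym"] = pseudonymize_user_id(raw_event["user_id"])
    return event


if __name__ == "__main__":
    raw = {
        "user_id": "alice@example.com",
        "event_type": "wake_word_detected",
        "timestamp": "2017-06-28T10:15:00Z",
        "firmware_version": "1.4.2",
        "raw_audio": b"...",  # dropped: not on the allow-list
    }
    print(minimize_event(raw))
```

The point of such a design is that the raw identifier never travels past the ingestion step, so a leak of the analytics store exposes pseudonyms rather than, say, email addresses.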

The issues at stake aren’t easy to solve. There are no silver bullets. There are grey areas that are fuzzy, complex and complicated.

In some cases, like privacy, there are even cultural and regional differences. For example, to paint with a broad brush, privacy protection has fundamentally different meanings in the US than it does in Europe. While in the United States, privacy tends to mean that consumers are protected from government surveillance, in Europe the focus is on protecting user data from commercial exploitation.

Whichever it may be—and I’d argue it needs to be both—any organization that handles sensitive user data should commit to the strongest level of privacy and data protection. And it should clearly communicate that commitment and its limits to users up front.

Make it safe and secure

It should go without saying (but alas, doesn’t) that any device that connects to the internet and collects personal data needs to be reliably safe and secure. This covers everything from the design process (privacy by design, security by design) to manufacturing to the safe storage and processing of data to strong policies that protect data and users against harm and exploitation. But it doesn’t end there: The end-of-life stage of connected products is just as important. If an organization stops maintaining the service and ceases to ship security patches, or if the contract with the user doesn’t protect against data spills when the company is acquired or liquidated, then data that was safe for years can suddenly pose new risks.
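
One small, concrete example of “security by design” that also pays off at end-of-life is accepting only vendor-signed firmware updates, so that a stale or hijacked update channel cannot push arbitrary code onto devices. The sketch below is a minimal illustration, assuming the widely used Python cryptography package and an Ed25519 public key baked into the device at manufacture; key rotation, rollback protection, and secure boot are deliberately left out:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Hypothetical: the vendor's Ed25519 public key, baked into the device at manufacture.
VENDOR_PUBLIC_KEY = Ed25519PublicKey.from_public_bytes(bytes.fromhex(
    "d75a980182b10ab7d54bfed3c964073a0ee172f3daa62325af021a68f707511a"
))


def verify_firmware(image: bytes, signature: bytes) -> bool:
    """Return True only if the firmware image carries a valid vendor signature.
    The device should refuse to flash anything that fails this check."""
    try:
        VENDOR_PUBLIC_KEY.verify(signature, image)
        return True
    except InvalidSignature:
        return False
```

Even if the vendor later winds down the service, a device built this way will at least refuse updates that don’t carry a valid signature.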

IT security is hard enough as it is, but security of data-driven systems that interconnect and interact is so much harder. After all, the whole system is only as strong as its weakest component.

Alas, there is neither fame nor glory in building secure systems: At best, there is no scandal over breaches. At worst, there are significant costs without any glamorous announcements. As in healthcare, prevention is less attractive than quick surgery to repair the damage, but it is also more effective and cheaper in the long run. So hang in there, and users might just vote with their feet and dollars to support the safest, most secure, most trustworthy products and organizations.

Choose the right business model

A business model can make or break a company. Obviously, without a business model, a company won’t last long. But with the wrong business model, it won’t thrive together with its customers; it will thrive at their expense.

We see so much damage done because wrong business models—and hence, wrong incentives—drive and promote horrible decision making.

If a business is based on user data—as is often the case in IoT—then finding the right business model is essential. Business models, and the behaviors they incentivize, matter. More to the point, aligning the organization’s financial incentives with the users’ interests matters.

As a rule of thumb, data mining isn’t everything. Ads, and the surveillance marketing they increasingly require, have reached the point of being poisonous. If, however, an organization finds a business model that is based on protecting its users’ data, then that organization and its customers can thrive together.

To build sustainable businesses—businesses that will sustain themselves and not poison their ecosystem—it’s absolutely essential to pick and align business models and incentives wisely.


  1. Bruce Schneier: Click Here to Kill Everyone. Available at http://nymag.com/selectall/2017/01/the-internet-of-things-dangerous-future-bruce-schneier.html 
  2. See Peter Bihr: Trust and expectations in IoT. Available at https://thewavingcat.com/2017/06/28/trust-and-expectation-in-iot/ 

Toolkits for designers & developers around ethics, privacy & security

At SimplySecure’s excellent Underexposed conference we discussed the importance of making it easier for those involved in making connected products and services to build safe, secure, and privacy-conscious products. After all, they might be experts in their own fields, but not necessarily security experts. So, toolkit time!

I asked participants in the room as well as publicly on Twitter which toolkits and resources are worth knowing. Here’s what that looked like in the room:

“Which toolkits should we all know? Ethics, privacy, security”

Here’s the tweet that went with it:

So what are the toolkit recommendations? Given the privacy-sensitive nature of the event, I’m linking to the source only where people sent their recommendations via public Twitter. Also, please note that I’m including them without much background, and unchecked. So here goes:

This list can by no means claim to be complete, but hopefully it will still be useful to some of you.

Opportunities at the intersection of emerging tech, strategy, and good ethics

We strongly believe that good ethics mean good business. This isn’t just an empty phrase, either: We know from our own experience that it often pays great dividends to go the extra step and take into account the implications of business decisions.

This is especially true in areas that employ new technologies, simply because there are more unknowns in emerging tech. And more unknowns = higher risks.

Our field of operation is at the intersection of emerging tech, strategy, and good business ethics.

Take, for example, the VP at a global tech company who adapted community-driven guidelines for data ownership in IoT: He knew that this particular pioneer community had a deeper understanding of the issues at stake than most. Even though these data ownership guidelines meant possibly giving up some short-term revenue, he trusted in their long-term positive effects. Now, and unexpectedly at the time, his organization is in a better position than most to comply with the new EU data protection regulation (GDPR). Even before that, these guidelines likely inspired user trust and confidence.

Other companies lose their best talent over sketchy business tactics—to competitors who are honest and trustworthy, and have a credible and powerful mission.

If you pay attention you’ll find these examples everywhere: Good ethics aren’t a buzzword, nor are they rocket science. They’re 100% compatible with good business. They might just be a prerequisite.

“View Source: Shenzhen” is now out

View Source: Shenzhen cover

Executive Summary: We went to Shenzhen to explore opportunities for collaboration between European Internet of Things practitioners and the Shenzhen hardware ecosystem—and how to promote the creation of a responsible Internet of Things. We documented our experience and insights in View Source: Shenzhen.

Download View Source: Shenzhen as a PDF (16MB) or…
read it on Medium.

View Source is the initiative of an alliance of organizations that promote the creation of a responsible Internet of Things:

  • The Incredible Machine is a Rotterdam-based design consultancy for products and services in a connected world.
  • The Waving Cat is a Berlin-based boutique strategy, research & foresight company around the impact and opportunities of emerging technologies.
  • ThingsCon is a global community of practitioners with the mission to foster the creation of a responsible & human-centric IoT.
  • Mozilla Foundation’s Open IoT Studio aims to embed internet stewardship in the making of meaningful, connected things.

Along for part of the ride were two other value-aligned organizations:

  • Just Things Foundation aims to increase the awareness about ethical dilemmas in the development of internet connected products and services.
  • ThingsCon Amsterdam organizes the largest ThingsCon event globally, and also organized a guided delegation of European independent IoT practitioners to Shenzhen which coincided with our second Shenzhen trip.

What unites us in our efforts is great optimism about the Internet of Things (IoT), but also a deep concern about the implications of this technology being embedded in anything ranging from our household appliances to our cities.

About this document

This document was written as part of a larger research effort that included, among other things, two trips to Shenzhen, a video documentary, and lots of workshops, meetings, and events over a period of about a year. It’s part of the documentation of these efforts. Links to the other parts are interspersed throughout this document.

This research was a collaborative effort undertaken with the Dutch design consultancy The Incredible Machine, and our delegations to China included many Dutch designers, developers, entrepreneurs and innovators: One of the over-arching goals of this collaboration was to build bridges between Shenzhen and the Netherlands specifically—and Europe more generally—in order to learn from one another and identify business opportunities and future collaborations.

Creative Industries Fund NL
We thank the Creative Industries Fund NL for their support.

*Please note: While I happen to be the one writing this text as my contribution to documenting our group’s experiences, I cannot speak for the group, and don’t want to put words in anyone’s mouth. In fact, I use the “we” loosely; depending on context, it refers to one of the two delegations, to our loose alliance for responsible IoT, or to a collective “we”. I hope it’s clear from the context. Needless to say, all factual errors in this text are mine, and mine alone. If you discover any errors, please let me know.

ThingsCon Amsterdam 2016 Keynote: A responsible IoT

ThingsCon: Peter Bihr (The Waving Cat) from ThingsConAMS on Vimeo.

This week I’m in the Netherlands for ThingsCon Amsterdam, the largest ThingsCon event this year (and one of an ever-growing number, see the event list <3).

The local team around Monique, Marcel & Iskander kindly asked me to give the keynote. I was honored and psyched, obviously.

Here are my slides:

The super short executive summary:

  • We need to build IoT in a responsible & human-centric way, and we founded ThingsCon to promote this goal.
  • It’s hard to get right because HARDWARE IS HARD, NETWORKED SYSTEMS INTRODUCE DYNAMICS OF POWER & CONTROL, and WE DON’T HAVE GOOD LANGUAGE TO DISCUSS IoT.
  • The ThingsCon community tries to tackle this, and we think it’s both a duty and a privilege to do so. In fact, this is our chance to have a massive positive impact.

A proper write-up will (hopefully) follow later!

Understanding the Connected Home, 2nd edition

Cover: Understanding the Connected Home

The second edition of our book Understanding the Connected Home is out. Michelle Thorne and I fully revised, rewrote, and updated this edition. It’s both broader and deeper than the first, and it reflects our thinking around connected homes and smart homes, IoT and ethics, and some other related fields.

You can read it online at theconnectedhome.org, where you will also find various other formats to download. For even easier reading, there is a specially formatted edition of Understanding the Connected Home on the Kindle Store (this is also a way to support this and future books).

What we can learn from VW’s emission scandal for IoT

As the digging into Volkswagen’s emissions/cheating scandal continues, it’s very interesting to watch the kinds of conflicts and issues emerging from the whole thing. Interesting not because it’s fun to ridicule corporations (it’s not, especially where emissions are concerned), but because this particular case gives us a good idea of the kinds of scandals, issues, and questions we’ll increasingly see over the next few years around #iot and sensor-data-based decision making.
