Author: Peter Bihr

What’s long-term success? Outsized positive impact.


For us, success is outsized positive impact—which is why I’m happy to see our work becoming part of Brazil’s National IoT Plan.

Recently, I was asked what long-term success looked like for me. Here’s the reply I gave:

To have outsized positive impact on society by getting large organizations (companies, governments) to ask the right questions early on in their decision-making processes.

As you know, my company consists of only one person: myself. That’s both the boon and the bane of my work. On one hand, it means I can contribute expertise surgically in larger contexts; on the other, it means limited impact when working by myself.

So I tend (and actively aim) to work in collaborations—they allow me to build alliances for greater impact. One of those turned into ThingsCon, the global community of IoT practitioners fighting for a more responsible IoT. Another, between my company, ThingsCon and Mozilla, led to research into the potential of a consumer trustmark for the Internet of Things (IoT).

I’m very, very happy (and, to be honest, a little bit proud, too) that this report just got referenced fairly extensively in Brazil’s National IoT Plan, concretely in Action Plan / Document 8B (PDF). (Here’s the post on thingscon.com.)

To see your work and research (and hence, to a degree, agenda) inform national policy is always exciting.

This is exactly the kind of impact I’m constantly looking for.

Monthnotes for January 2018


January isn’t quite over, but since I’ll be traveling starting this weekend, I wanted to drop these #monthnotes now. A lot of time this month went into prepping an upcoming project which is likely to take up the majority of my time in 2018. More on that soon.

×

Capacity planning: This year my work capacity is slightly reduced since I want to make sure to give our new family member the face time he deserves. That said, this year’s capacity is largely accounted for, which is extra nice given it’s just January, and it’s for a thing I’m genuinely excited about. Still, I think it’s important to work on a few things in parallel because of the potential that unfolds from cross-pollination; so I’m up for a small number of not-huge projects in addition to what’s already going on, particularly in the first half of the year. Get in touch.

×

On Sunday, I’m off to San Francisco for a work week with the good folks at Mozilla (because reasons) and a number of meetings in the Bay Area. (Full disclosure: my partner works at Mozilla.) Last year I did some work with Mozilla and ThingsCon exploring the idea of a trustmark for IoT (our findings).

Image: commons (SDASM Archives)

Should you be in SF next week, ping me and we can see if we can manage a coffee.

×

IoT, trust & voice: More and more, I’m coming around to the idea that voice is the most important—or at least most imminent—manifestation of IoT where user data is concerned. Voice, and how it relates to trust, is what a lot of my work in 2018 will focus on.

×

User profiling in smart homes: Given my focus on voice & trust in IoT this year, I was very happy that Berlin tech & policy think tank Stiftung Neue Verantwortung invited me to a workshop on user profiling in smart homes. It was all Chatham House rules and I don’t want to dive into specifics at this point, but smart homes and voice assistants are worth a deep dive when it comes to trust—and trustworthiness—in IoT.

Connected homes and smart cities

Not least because (as I’ve been hammering home for a long time) the connected home and the smart city are the two areas that most clearly manifest the underlying tensions and issues around IoT at scale: Connected homes, because the home was traditionally considered a private space (at least if you look at the last 100 years in the West), and embedded microphones mean it isn’t anymore. And smart cities, because in public space there is no opt-out: Whatever data is collected, processed, and acted on in public space affects all citizens, whether they want it or not. These are fundamental changes with far-reaching consequences for policy, governance, and democracy.

×

Worth your time: A few pointers to articles and presentations I found worthwhile:

  • Kate Crawford’s talk on bias in AI training data is ace: The Trouble with Bias [YouTube].
  • TechCrunch has a bit of a top-level explainer of GDPR, Europe’s General Data Protection Regulation that goes into effect in May this year. It’s been widely lauded in Europe (except by the usual suspects, like ad-land) and, unsurprisingly, criticized in Silicon Valley as disruptive regulation. (See what I did there?) So it came as a pleasant surprise to me that TechCrunch of all places finds GDPR to be a net positive. Worth 10 minutes of your time! [TechCrunch: WTF is GDPR?]
  • noyb.eu—My Privacy is none of your Business: Max Schrems, who became well-known in European privacy circles after winning privacy-related legal battles, including one against Facebook and one that brought down the US/EU Safe Harbor Agreement, is launching a non-profit: It aims to enforce European privacy protection through collective enforcement, which is now an option because of GDPR. They’re fundraising for the org. The website looks very basic, but I’d say it’s a legit endeavor and certainly an interesting one.

×

Writing & thinking:

  • In How to build a responsible Internet of Things I lay out a few basic, top-level principles distilled from years of analyzing the IoT space—again with an eye on consumer trust.
  • On Business Models & Incentives: Some thoughts on how picking the wrong business model—and hence creating harmful incentives for an organization to potentially act against its own customers—is dangerous and can be avoided.
  • I’ve been really enjoying putting together my weekly newsletter. It’s a little more personal and interest-driven than this blog, but tackles similar issues of the interplay between tech & society. It’s called Connection Problem. You can sign up here.

I was also very happy that Kai Brach, founder of the excellent Offscreen magazine, kindly invited me to contribute to the next issue (out in April). The current one is also highly recommended!

×

Again, if you’d like to work with me in the upcoming months, please get in touch quickly so we can figure out how best to work together.

×

That’s it for January. See you in Feb!

How to build a responsible Internet of Things


Over the last few years, we have seen an explosion of new products and services that bridge the gap between the internet and the physical world: The Internet of Things (IoT for short). IoT increasingly has touch points with all aspects of our lives whether we are aware of it or not.

In the words of security researcher Bruce Schneier: “The internet is no longer a web that we connect to. Instead, it’s a computerized, networked, and interconnected world that we live in. This is the future, and what we’re calling the Internet of Things.”1

But IoT consists of computers, and computers are often insecure, and so our world becomes more insecure—or at the very least, more complex. And thus users of connected devices today have a lot to worry about (because smart speakers and their built-in personal digital assistants are particularly popular at the moment, we’ll use those as an example):

Could their smart speaker be hacked by criminals? Can governments listen in on their conversations? Is the device always listening, and if so, what does it do with the data? Which organizations get to access the data these assistants gather from and about them? What are the manufacturers and potential third parties going to do with that data? Which rights do users retain, and which do they give up? What happens if the company that sold the assistant goes bankrupt, or decides not to support the service any longer?

Or, phrased a little more abstractly2: Does this device do what I expect (does it function)? Does it do anything I wouldn’t normally expect (is it a Trojan horse)? Is the organization that runs the service trustworthy? Does that organization have trustworthy, reliable processes in place to protect me and my data? These are just some of the questions consumers face today, and they face them constantly.

Trust and expectations in IoT. Image: Peter Bihr/The Waving Cat

Earning (back) that user trust is essential. Not just for any organization that develops and sells connected products, but for the whole ecosystem.

Honor the spirit of the social contract

User trust needs to be earned. Too many times have users clicked “agree” on some obscure, lengthy terms of service (ToS) or end user license agreement (EULA) without understanding the underlying contract. Too many times have they waived their rights, giving empty consent. This has led to a general distrust—if not in the companies themselves then certainly in the system. No user today feels empowered to negotiate a contractual relationship with a tech company as an equal—because they can’t.

Whenever some scandal blows up and creates damaging PR, the companies slowly backtrack. But in too many cases they were, legally speaking, within their rights: Nobody understood the contract, while the abstract product language suggested a spirit of mutual goodwill between the product company and its users that the letter of the contract does not honor.

So short and sweet: Honor the spirit of the social contract that ties companies and their users together. Make the letters of the contract match that spirit, not the other way round. Earning back the users’ trust will not just make the ecosystem more healthy and robust, it will also pay huge dividends over time in brand building, retention, and, well, user trust.

Respect the user

Users aren’t just an anonymous, homogeneous mass. They are people, individuals with diverse backgrounds and interests. Building technical systems at scale means having to balance individual interests with automation and standardization.

Good product teams put in the effort to do user research and understand their users better: What are their interests, what are they trying to get out of a product and why, how might they use it? Are they trying to use it as intended or in interesting new ways? Do they understand the tradeoffs involved in using a product? These are all questions that basic, but solid user research would easily cover, and then some. This understanding is a first step towards respecting the user.

There’s more to it, of course: Offering good customer service, being transparent about user choices, allowing users to control their own data. This isn’t a conclusive list, and even the most extensive checklist wouldn’t do any good in this case: Respect isn’t a list of actions, it’s a mindset to apply to a relationship.

Offer strong privacy & data protection

Privacy and data protection is a tricky area, and one where screwing up is easy (and particularly damaging for all parties involved).

Protecting user data is essential. But what that means is not always obvious. Here are some things that user data might need to be protected from:

  • Criminal hacking
  • Software bugs that leak data
  • Unwarranted government surveillance
  • Commercial third parties
  • The monetization team
  • Certain business models

Some of these fall squarely into the responsibility of the security team. Others are based on the legal arrangements around how the organization is allowed (read: allows itself) to use user data: the terms of service. Others yet require business incentives to be aligned with users’ interests.

The issues at stake aren’t easy to solve. There are no silver bullets. There are grey areas that are fuzzy, complex and complicated.

In some cases, like privacy, there are even cultural and regional differences. For example, to paint with a broad brush, privacy protection has fundamentally different meanings in the US than it does in Europe. While in the United States, privacy tends to mean that consumers are protected from government surveillance, in Europe the focus is on protecting user data from commercial exploitation.

Whichever it may be—and I’d argue it needs to be both—any organization that handles sensitive user data should commit to the strongest level of privacy and data protection. And it should clearly communicate that commitment and its limits to users up front.

Make it safe and secure

It should go without saying (but alas, doesn’t) that any device that connects to the internet and collects personal data needs to be reliably safe and secure. This includes aspects ranging from the design process (privacy by design, security by design) to manufacturing to safe storage and processing of data to strong policies that protect data and users against harm and exploitation. But it doesn’t end there: The end-of-life stage of connected products is especially important, too. If an organization stops maintaining the service and ceases to ship security patches, or if the contract with the user doesn’t protect against data spills when the company is acquired or liquidated, then data that was safe for years can all of a sudden pose new risks.

IT security is hard enough as it is, but security of data-driven systems that interconnect and interact is so much harder. After all, the whole system is only as strong as its weakest component.

Alas, there is neither fame nor glory in building secure systems: At best, there is no scandal over breaches. At worst, there are significant costs without any glamorous announcements. Like prevention in healthcare, it is less attractive than quick surgery to repair the damage, but more effective and cheaper in the long run. So hang in there, and the users might just vote with their feet and dollars to support the safest, most secure, most trustworthy products and organizations.

Choose the right business model

A business model can make or break a company. Obviously, without a business model, a company won’t last long. But without the right business model, it’ll thrive not together with its customers but at their expense.

We see so much damage done because wrong business models—and hence, wrong incentives—drive and promote horrible decision making.

If a business is based on user data—as is often the case in IoT—then finding the right business model is essential. Business models, and the behaviors they incentivize, matter. More to the point, aligning the organization’s financial incentives with the users’ interests matters.

As a rule of thumb, data mining isn’t everything. Ads, and the surveillance marketing they increasingly require, have reached a point of being poisonous. If, however, an organization finds a business model that is based on protecting its users’ data, then that organization and its customers are going to have a blast of a time.

To build sustainable businesses—businesses that will sustain themselves and not poison their ecosystem—it’s absolutely essential to pick and align business models and incentives wisely.


  1. Bruce Schneier: Click Here to Kill Everyone. Available at http://nymag.com/selectall/2017/01/the-internet-of-things-dangerous-future-bruce-schneier.html 
  2. See Peter Bihr: Trust and expectations in IoT. Available at https://thewavingcat.com/2017/06/28/trust-and-expectation-in-iot/ 

On Business Models & Incentives


We’ve been discussing ethics & responsibility in IoT specifically, and business more generally, a lot lately. This seems more relevant than ever today, simply because we see so much damage done because wrong business models—and hence, wrong incentives—drive and promote horrible decision making.

One blatantly obvious example is Facebook and its focus on user engagement. I’d like to make clear I pick Facebook because it is simply the best known example of an industry-wide trend.

Advertisers have been sold on “engagement” as a metric ever since the web made it possible to measure user behavior (i.e., what used to be called “Web 2.0”, now “social media”). Before that (early web), it was various flavors of page impressions as a proxy for reach. Before that (print, TV), it was calculated or assumed reach based on sampling and the size of print runs.

It’s important to keep in mind that these metrics have changed over time, and can change and be changed any time. They aren’t a divine hand-down, nor a constant in the world. They are what we, as an industry and society, make them.

Now, for a few years advertisers have been sold on, and have been overall quite happy with, measuring their ads’ efficiency and effectiveness through engagement. This term means how many people didn’t just see their ads, but interacted (“engaged”) with them in one way or another. Typically, this means clicking on them, sharing them on social media or via email, and the like. It’s a strong proxy for attention, which is what advertisers are really after: They want potential customers to notice their messages. It’s hard to argue with that; it’s their job to make sure people notice their ads.

That said, the focus on engagement was driven forcefully by the platforms that profit from selling online ads as a means to differentiate themselves from print and TV media, as well as the online offerings of traditionally print/TV based media. “Look here, we can give you much more concrete numbers to measure how well your ads work”, they said. And they were, by and large, not wrong.

But.

The business model based on engagement turned out to be horrible. Damaging. Destructive.

This focus on engagement means that all the incentives of the business are to get people to pay more attention to advertisements, at the expense of everything else. Incentivizing engagement means that the more you can learn about a user, by any means, the better positioned you are to get them to pay attention to your ads.

This is how we ended up with a Web that spies on us, no matter where we go. How we ended up with websites that read us more than we read them. With clickbait, “super cookies”, and fake news. Every one of these techniques is a means to drive up engagement. But at what cost?

I truly believe you can’t discuss fake news, the erosion of democracy, online harassment, and populism without discussing online surveillance (aka “ad-tech”, or “surveillance capitalism”) first.

Business models, and the behaviors they incentivize, matter. Facebook and many other online advertisement platforms picked horrible incentives, and we all have been paying the price for it. It’s killing the Web. It’s eroding our privacy, the exchange of ideas, and democracy. Because where our communications channels spy on us, and the worst and most troll-ish (“most engaging”) content floats to the top because of ill-advised and badly checked algorithmic decision-making, we can’t have discussions anymore in public, or even in the spaces and channels that appear to be private.

It doesn’t have to be that way. We can choose our own business models, and hence incentives.

For example, over at ThingsCon we were always wary of relying too much on sponsorship, because it adds another stakeholder (or client) you need to accommodate beyond participants and speakers. We mostly finance ThingsCon events through ticket sales (even if “financing” is a big word; almost everything is done through our own volunteer work). Our research is either done entirely in-house out of interest or occasionally as a kind of “researcher-for-hire” commission. We subsidize ThingsCon a lot through our other work. Does that mean we lose some quick cash? Absolutely. Do we regret it? Not in the least. It allows a certain clarity of mission that wouldn’t otherwise be possible. But I admit it’s a trade-off.

(A note for the event organizers out there: Most of the sponsors we ended up taking on were more than happy to go with food sponsoring, a ticket package, or subsidizing tickets for underrepresented groups—all entirely compatible with participants’ needs.)

If we want to build sustainable businesses—businesses that will sustain themselves and not poison their ecosystem—we need to pick our business models and incentives wisely.

Monthnotes for December 2017


December was a slow month in terms of work: We had a baby and I took a few weeks off, using what little time was left to tie up some loose ends and to make sure the lights were still on. In January, I’ll be back full time, digging into some nice, big, juicy research questions.

My capacity planning for 2018 is in full swing. If you’d like to work with me in the upcoming months, please get in touch quickly.

Media

SPIEGEL called to chat about insecure IoT devices. We chatted about the state of the IoT ecosystem, externalized costs, and consumer trust. In the end, a short quote about cheaply made, and hence insecure, connected gadgets made the cut. The whole article is available online (in German) here: SPIEGEL: Internet der Dinge: Ist der Ruf erst ruiniert, vernetzt es sich ganz ungeniert.

Thinking & writing

Just two quick links to blog posts I wrote:

  • The key challenge for the industry in the next 5 years is consumer trust is something I originally had written for our client newsletter, but figured it might be relevant to a larger audience, too. In it, I explore some of the key questions that I’ve recently been pondering and that have been coming up constantly in peer group conversations. Namely, 1) What’s the relationship between (digital) technology and ethics/sustainability? 2) The Internet of Things (IoT) has one key challenge in the coming years: Consumer trust. 3) Artificial Intelligence (AI): What’s the killer application? Maybe more importantly, which niche applications are most interesting? 4) What funding models can we build the web on, now that surveillance tech (aka “ad tech”) has officially crossed over to the dark side and is increasingly perceived as no-go by early adopters?
  • Focus areas over time explains how the focus of my work has shifted across a number of emerging technologies and related practices over the years.

Newsletter

I’ve pulled my good old newsletter out of storage, blown off the dust, and been taking it for a spin once more. I’m experimenting with a weekly format of things I found worth discussing, some signals on my radar, and some pointers to stuff we’ve been working on. To follow along as I try and shape my thinking on some incoming signals, sign up here for Season 3.

What’s next?

Capacity planning for 2018 is in full swing, and it’s shaping up to be a busy year. There’s a couple of big projects that (barring some major hiccups) will kick off within the next few weeks. Just like I’ve done a lot of self-directed research in 2017 which has been tremendously useful, I’ll continue this kind of work in 2018. I’ll try to also write a lot to help spread what I learn that way. Between all of that, the year is filling up nicely. It looks like 2018 won’t be boring, that’s for sure.

If you’d like to work with me in the upcoming months, please get in touch quickly so we can figure out how best to work together.

Thanks and Happy Holidays: That was 2017


This is end-of-year post #10 (all prior ones here). That’s right, I’ve been writing this post every year for ten years in a row!

So what happened in 2017? Let’s have a look back: Part work, part personal. Enjoy.

Globally speaking I’d file 2017 under shitty year. So much so that I’ll try not to go into anything global or all too political here. But in terms of work it’s been quite interesting and impactful, and personally it’s been a pretty damn great year.

So, right to it!

The theme for 2017

Last year I wrote (and I’m paraphrasing to keep it short):

“(…) even in hindsight 2016 didn’t have one theme as such, but rather a few in parallel: 1) Growth & stabilization, in the business generally speaking, but also and specifically in all things related to ThingsCon. 2) Lots and lots of collaborations with close friends, which I’m grateful for. 3) Also, 2016 was a year for a bit of overload, I may have spread myself a little thin at times.”

Again, lots of collaboration with old and new friends. But this year I was a lot more focused, with lots of research that allowed me to go deep. I’d say in 2017, the theme was first and foremost impact. Impact through large partners, through policy work, through investments into research.

My work was with some large partners with big picture themes, like our work with Mozilla on trustmarks for the internet of things.

I hope to continue this high-impact work in one way or another.

Friends and family

Overall a bit of a mixed bag.

The bad: Some family members had health issues. Some friends received some nasty diagnoses.

The good: Some of the health issues were solved, we got to spend lots of time with close friends and family. Also, lots of babies were born among our friends, including one of our own. Welcome, little K! To be honest, this alone would make me love 2017. So yay, personal 2017!

Travel

For years I had been trying to cut down a little on travel to a somewhat more sustainable level. It kinda-sorta worked in 2017, at least a little bit. Still ways to go, but it’s a start.

Looking at my Tripit, this is what comes up. (Tripit stats are a little fuzzy. Did I mention I still miss Dopplr?) As far as I can reconstruct on the quick, including vacation time I traveled to 7 countries on just 9 trips, and spent about 89 days traveling. (As opposed to 21 trips to 12 countries for a total of 152 days the year before.) So that’s great, even if it sounds like I might have missed a couple of short trips.

Image: Pyrenees

Work

There was a lot going on in 2017, so I had to consult my monthnotes to refresh my memory. The focus is still, and ever more so, at the intersection of strategy, emerging technologies, and ethics/governance.

Lots of work around trustmarks and, generally speaking, consumer trust in the Internet of Things. Increasingly, artificial intelligence has also solidly established itself as part of the emerging tech canon I’ve been watching closely.

I wrote a lot. I mean, a lot. And I’ve enjoyed it tremendously. Outside my blog, some project-related newsletters, and Twitter, I did some long- and short-form writing:

If the writing is part of my overall communications landscape, then so is my website. So I relaunched that completely and restructured it for much more clarity.

I also got to work more with foundations, which is always fun. From workshops with Boell Foundation to research for Mozilla Foundation, the non-commercial, impact-driven sector is certainly an area I’d like to spend more time in.

Very Fun Side Projects

Then there are two “side projects” that have been especially fun this year: ThingsCon and Zephyr.

ThingsCon, our global community on a mission to foster the creation of a responsible & human-centric IoT, has been growing steadily. Milestones in 2017 include:

  • Another research trip to Shenzhen, the Silicon Valley of hardware.
  • We had a bunch of ThingsCon-labeled publications, including about Shenzhen and IoT trustmarks: View Source: Shenzhen, The State of Responsible IoT, A Trustmark for IoT.
  • We launched the ThingsCon Fellowship Program to recognize achievements and commitment that advance the ThingsCon mission of fostering the creation of a responsible and human-centric IoT generally, and support for the ThingsCon community specifically. Shout-out to our most excellent six initial fellows, Alexandra Deschamps-Sonsino, Ame Elliott, Dries de Roeck, Iohanna Nicenboim, Michelle Thorne and Ricardo Brito. I hope we’ll get the fellowship program into full swing in 2018!
  • New cities with salons or conferences around the world. Let me use stats from November: At that point, ThingsCon events had happened in 20 cities across 12 countries, from Berlin to Brussels to Amsterdam and Milan to London and Shanghai to Austin and Copenhagen and Nairobi.

I can’t possibly tell you how awesome this is for me to watch and experience. Learn more at thingscon.com.

Zephyr Berlin, the trousers/pants project M and I launched on Kickstarter just over a year ago, continues to be a lot of fun. Just a few weeks back we produced another small batch of men’s trousers, this time with super deep pockets to make things like cycling with large phones super easy. So there’s a new batch of men’s, and a very small number of women’s available. Check out zephyrberlin.com to learn more.

Conferences

A lot less conference work this year. What I did in terms of conferences was mostly for ThingsCon. I always enjoyed conferences (both the curation and the planning, but the curation much more than the planning), but not having a conference to plan isn’t too bad either, to be honest. A lot of my other work, especially the writing, would not have been possible if I had committed to another conference.

As a directly related note, without the fantastic, lovable, smart and endlessly committed ThingsCon Amsterdam crew and their annual ThingsCon event (it just happened for the fourth time!), ThingsCon also wouldn’t be what it is today. My eternal thanks go to Iskander, Marcel & Monique and their team.

Speaking

As part of my cutting down on conference travel, I gave just a few talks in 2017. Most of them focused on IoT and consumer trust.

There were a few at ThingsCon events like in Berlin and Shenzhen, others were at Underexposed, TU Dresden, Netzpolitik conference, DevOpsCon, and Transatlantic Digital Debates. There were also a few (paid) in-house talks.

Media

It was a pretty good year for media and writing. Among others, my thoughts or projects were mentioned/quoted/referenced/etc on CNN, SPIEGEL, and WIRED. I had some interviews—the lovely conversation for Markus Andrezak’s Stories Connected Dots stands out to me.

Things started and discontinued

Started:

  • Writing more, if you’ll forgive me going so meta.

Continued:

  • Zephyr Berlin, producing pants that travel extremely well.
  • ThingsCon as an event platform, and growing it beyond that into other areas of engagement.

Discontinued:

  • My Facebook account, just now as I’m writing this. Bye bye Facebook. You feel like Old Social Media by now, not worth having around.

Books read

Read an okay, but not great amount. I think it was pretty much these: WTF, Tim O’Reilly. Control Shift, David Meyer. The Rings of Saturn, W. G. Sebald. Wiener Straße, Sven Regener. Go: Die Mitte des Himmels, Michael H. Koulen. Babyjahre, Remo H. Largo. American Gods, Neil Gaiman. Death’s End, The Dark Forest, The Three-Body Problem, Cixin Liu. Goldene Unternehmerregeln, Bihr & Jahrmarkt. Schadenfreude, Rebecca Schuman. Rapt, Winifred Gallagher. Shoe Dog, Phil Knight. The Story of My Teeth, Valeria Luiselli. Snuff, Terry Pratchett. Deep Work, Cal Newport. Bonk, Mary Roach. The Power and the Glory, Graham Greene. The Industries of the Future, Alec Ross.

Firsts & some things I learned along the way

Firsts: Wrote a ton of long form and launched it properly. Cut an umbilical cord. Diapered a newborn. Merged photo libraries.

Learned: How to communicate my work (focus, offering, structure) better (as the website will demonstrate). To make time for writing, thinking, processing input. Some Python. Some more about tech policy. These are all qualitative upgrades in my book.

So what’s next?

It looks like 2018 might bring a fantastic opportunity to continue some of my work from this year and before in a big-impact context; if this happens, I’ll be extremely happy. (If not, I’ll continue chipping away at the same issues with all the means available to me.) I hope to continue doing lots of research and writing. I’ll take some parental leave at some point, and otherwise spend as much time as I can with the baby. (They grow up so fast, as I’m learning even now, after not even a month.) Some travel, and hopefully once more a month or two spent working from a new place.

I’m always up for discussing interesting new projects. If you’re pondering one, get in touch!

But for now, I hope you get to relax and enjoy the holidays!

I’m leaving Facebook


I’m leaving Facebook.

I’m not leaving in a huff, nor to make a strong statement. I simply haven’t been getting anything out of Facebook in a long time and like to do a good house cleaning from time to time.

To be honest, I’m a bit surprised to find myself leaving out of disinterest rather than conviction. (I do feel a little ashamed of that fact, but there you go. We all contain multitudes.) I never particularly liked FB, but used to use it a lot. And for someone who for a long time worked professionally in/with/about social media, there simply wasn’t a way around it. That was ok, and I would say “I’m not a fan, but it works for x or y, and there’s no way around it anyway.” In 2017, this feels patently untrue.

I’d like to stress that I’m not judging anyone for using or not using FB or any other platform. People like what they like, and that’s ok!

Personally, to me Facebook feels like an outdated model of social media. It feels a bit like reading the news on paper rather than my phone: It might be ok, but it’s just not for me anymore. Social conversations still happen of course, but the semi-public model, and more importantly the model that’s financed through driving up “engagement” (read: anything goes that gets you to click “like” or “share”) is one that feels kind of dirty by now.

For me, the conversations happen across a number of platforms. Slack and Whatsapp are a constant presence in my communications landscape, I still enjoy a good private Instagram, and of course I never left Twitter: It’s still the platform I use most, every single day, and I still get a lot of interesting and helpful interactions there every day. (I’m old school that way.)

Again, this isn’t a political statement. I’m five years too late for that; many of my early adopter friends left back then. It simply feels like the party is over. That said, I’ll be happy to vote with my feet and take a tiny, minuscule fraction of the “monthly active users” stats away with me. Facebook aligned its service a little too perfectly with its financial incentives, and picked dangerous incentives for my taste.

I’m of course a little worried about losing some contact details. I’m afraid there’s only so much I can do about that. The best I can do, at this point, is to share my contact details and hope everybody who needs them notes them down. They’re also easy to find online.

I might also keep a shadow profile to occasionally have a look at some pages I (notionally) manage. But given that we haven’t done a great job maintaining those anyway (as you can tell by the lack of conversations there), we might just delete them altogether. The conversations for ThingsCon and my other collaborative projects are happening on Twitter and Slack anyway. Maybe it’s better that way.

Sincerely, P.