Now, I can’t guarantee that this will be a regular thing, but since I seem to be struggling to write smut at the moment, I might as well do something useful with all my writer’s block frustration, and writing about the topic of digital privacy is something I don’t need my creative mojo for, because I’m an unrepentant nerd.
I’m going to get the red pen out and start critiquing smart sex-toy companies’ privacy policies. Data about your sexual activity is about as sensitive and confidential as it gets (‘confidential’ doesn’t mean ‘must be kept secret’ here, but rather ‘you should have exclusive control over who gets to know it’). Unfortunately, we still live in a world which has a negative and harmful attitude to sex, which means that data about sex is more often weaponised against people than it is used to empower and nurture them. So, in my opinion, sex-toy manufacturers whose products generate or collect digital information about product users should be extra-careful about safeguarding digital privacy and security when it comes to their products.
A tweet from an information security researcher to WeVibe caught my eye this week – saying that the tweeter had been trying to get in touch for a while about a security problem with their app, but hadn’t had any response. Now, you may recall that WeVibe has been caught out being bad at data security and privacy before, so you’d think they’d be on top of this sort of thing. I hope they sort it out quickly, and I’ll be uninstalling their app (after this review) until I hear that they have.
Security red flags are often an indicator of privacy red flags, because where one has been overlooked, you can bet the other was as well. Based on this theory, I decided to ~~torture myself~~ have a look at WeVibe’s app privacy information to see if that were the case.
I’m looking at this through the lens of European culture and privacy law, because that’s what I know. This isn’t a compliance audit, though – it’s my personal opinion on whether WeVibe meets MY minimum standards for privacy (which are based on my knowledge of the data economy, the GDPR and my personal code of ethics; YMMV).
My bare minimum standards are as follows:
- I want to know what data is being generated/collected, used for what purposes and on whose behalf.
- I want to know which legal rights I can exercise in relation to which datasets and uses of data.
- I want contact information for the Data Protection Officer, if the GDPR applies.
- I want any data collected or generated about my use of the toy to be excluded from personal profiling, predictive analytics and microtargeting for advertising of any kind – by default and forever, unless I take active steps to opt in (which I’m not going to do, because I know too much to trust y’all).
- I want the toy to have basic functions without requiring me to install any apps, create accounts or register anything.
Disclaimer: This isn’t a technical security review, legal advice or a professional service. I’m doing it because I’m a nerd; make your own informed, risk-based decisions as you please.
Let’s see how WeVibe’s We-Connect app does…
(The app allows WeVibe toys to be controlled from a mobile device, and for partners to share control of a toy even if one isn’t physically present)
What’s good:
- At least I can access the app privacy info without having to connect a toy or register an account.
- It’s clear at the start of the policy that WeVibe toys can be used without the app, so you’re not being forced to agree to your sex life being spied on as a condition of using the toy.
- The purposes for which data from the toy will be used by WeVibe are set out, along with some clues about what data that actually is, and what will happen to it.
- It tells me that people in the EU have rights over their data and provides contact information for exercising them. If my country were still in the EU, this would be much more of a comfort to me than it is right now.
- The advertising analytics option is OFF by default.
What’s not so good:
- The language is dense with legal jargon in places, so there’s a high barrier to comprehension on the part of the reader – undermining their ability to make an informed decision
- I want more detail about what data is going where, and why. Listing one example is a good start, but makes me suspicious about what’s been tactically omitted.
- The standard ‘we don’t share or sell your data unless one of the vaguely-described loopholes lurking in this document applies’ statement… which then goes on to say, essentially, ‘we, and companies we do business with, will probably pass/trade on your data if we think we can get away with it’. I want to know who, what, why and where!
- LOL, Privacy Shield. If you don’t already know what that is and why it’s suddenly become a massive headache, then count your blessings.
- ‘Anonymous’ App Data does not mean what you think it means. The individual pieces of data may not point directly to you, but when they are put together (phone specs + IP address + geolocation + connections, etc) then it becomes possible – even easy – to single people out by their unique patterns. And the app uses Google Analytics, which means that if you enable it, Google gets copied in on every bit of data you generate when using the toy. WeVibe can’t see or use that data at an individual level, but Google certainly can – and does.
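To make the ‘anonymous data isn’t anonymous’ point concrete, here’s a toy sketch (my own illustration – not We-Connect’s actual data model, and the attribute names are made up): each attribute on its own points at thousands of people, but hashed together they form a fingerprint that’s stable across sessions and near-unique to you.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine quasi-identifiers into one stable, near-unique ID.

    None of these attributes names you directly, but the *combination*
    is the same every session -- which is all a tracker needs.
    """
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical example attributes -- each one individually "anonymous".
user = {
    "phone_model": "Pixel 7",
    "os_version": "Android 14",
    "ip_prefix": "203.0.113",   # even a partial IP narrows the pool
    "geo": "51.50,-0.12",       # coarse geolocation
    "timezone": "Europe/London",
}

print(fingerprint(user))  # same attributes -> same ID, every time
```

Change any one attribute and the fingerprint changes; keep them the same and you’re trivially re-identifiable across every ‘anonymous’ session.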
- The retention and deletion policy basically says ‘we keep the data we believe we need, for as long as we think we need it’ which doesn’t tell me anything useful at all, and makes me suspect this is a boilerplate phrase that’s been copied from somewhere as an alternative to actually coming up with a real data management plan. Not confidence-inspiring.
What WeVibe could do better:
- Describe the dataflows, rather than just hinting at them. Better yet, provide pictures.
- Be more specific about which data is retained, for how long and why
- Sort out their app security response so that security researchers who have identified problems can raise them without having to take it to Twitter after months of being ignored
*Warning! These things are not the same! Privacy is about upholding human rights and a degree of personal autonomy, whereas security is about protecting stuff. There’s an overlap, but it’s smaller than you might think.