If you’re not one of the 37 million people whose data was hacked in the Ashley Madison breach, you can breathe a sigh of relief.
The Ashley Madison story may be great for a few news cycles of schadenfreude, but it also illustrates the realities we face in the age of data ubiquity: as people, consumers, businesspeople, patients and citizens.
1. Intimate data about us is everywhere. Our purchases, location, sexuality, religion, health history, political party, whose house we went to last night, the stiletto heels or sleek watch or expensive bourbon we clicked on a website: it's all out there, somewhere. In most cases this data is protected by layers of security, encryption, policy and regulation, but, as we've seen from Anthem to Target to Ashley Madison, that protection isn't always effective. Beyond data security, however, lies the question of how this data is actually used by the businesses that collect it. Is it used to deliver better services, products and ads? Is it being sold to a third party?
2. Profiling is not just for the FBI. Marketers love profiling. Why? Because good marketers realize that it's good business to sell you something you are likely to want, rather than wasting your attention (and their money) trying to sell you something you don't. So, naturally, they want to know more about you: who you are, what you covet, where you shop, where you live, how old you are and how much money you have, so they can target ads, products and services more effectively. Whoever you are, you're profiled somewhere: thrifty boomer, young married, millennial hipster; sounds like a Hollywood casting call, doesn't it? Like any tool, profiling can be extremely effective when properly used, and dangerous if not.
3. You leave digital footsteps everywhere you go, and they just may live forever. Everywhere you go, you leave digital traces. Even if you were “just browsing” in a store, you may have left a digital trace if you used a retail app, and/or the store used beacons or shelf weights. Add to that your web, mobile and social activity, and any apps you’ve used. Now imagine a ten-year timeline of that data being used to try to predict your next purchase. Or next spouse.
4. Chances are, you haven't the slightest idea what data is being collected about you at any given time. If you want to run a simple test, install Ghostery on your web browser for a while. It'll show you which trackers are collecting data on each website you visit. Did you know this data was being collected? Do you know how it's used? I bet not.
5. Your data may be cheating on you. When you clicked “Accept” on any one of a number of apps you used, or bought a book, or downloaded a movie, you may have digitally consented to share this data with third parties. But did you really know what you were consenting to? Sometimes this is a non-issue (some companies will never share your data with others). Sometimes it can have uncomfortable implications, as when Borders declared bankruptcy, and decided to sell one of its greatest assets–its customer purchase history. (The FTC stepped in and required Borders to provide an “opt out” option).
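The segmentation described in point 2 can be sketched as a simple rule-based classifier. This is a toy illustration only: the segment names echo the ones above, but the attributes and thresholds are invented for the example, not any real marketer's model.

```python
# Toy sketch of rule-based customer profiling. All thresholds and
# purchase signals here are invented for illustration.

def profile(age, income, recent_purchases):
    """Assign a shopper to a hypothetical marketing segment."""
    if age >= 55 and income < 60_000:
        return "thrifty boomer"
    if age < 35 and "vinyl records" in recent_purchases:
        return "millennial hipster"
    if age < 40 and "baby monitor" in recent_purchases:
        return "young married"
    return "general audience"

print(profile(62, 45_000, ["garden tools"]))   # thrifty boomer
print(profile(28, 70_000, ["vinyl records"]))  # millennial hipster
```

Real profiling systems use far richer signals (browsing history, location, purchase timelines) and statistical models rather than hand-written rules, but the principle is the same: observed data in, a label out, and ads targeted accordingly.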
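At a high level, tools like the Ghostery extension mentioned in point 4 work by matching the domains a web page contacts against a curated list of known trackers. Here is a minimal sketch of that idea; the tracker domains below are invented placeholders, not Ghostery's actual database, and real tools ship lists with thousands of entries.

```python
from urllib.parse import urlparse

# Hypothetical blocklist; real tracker-blocking tools maintain
# curated databases of thousands of tracker domains.
KNOWN_TRACKERS = {"tracker.example", "ads.example", "metrics.example"}

def flag_trackers(request_urls):
    """Return the requested URLs whose host, or any parent domain,
    appears on the tracker list."""
    flagged = []
    for url in request_urls:
        host = urlparse(url).hostname or ""
        parts = host.split(".")
        # Check the host and each parent domain against the list.
        for i in range(len(parts) - 1):
            if ".".join(parts[i:]) in KNOWN_TRACKERS:
                flagged.append(url)
                break
    return flagged

requests = [
    "https://shop.example.com/product/42",
    "https://pixel.tracker.example/collect?id=abc",
]
print(flag_trackers(requests))  # only the tracker URL is flagged
```

The point for the reader is simply that every one of those flagged requests represents data leaving your browser for a company you likely never heard of.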
To be clear, I’m not saying any of this is inherently bad, or suggesting we can roll back the clock; it’s just reality these days. But as data becomes more intrinsic to our lives and our business, I believe in finding “teachable moments” anywhere we can:
- As individuals, there will never be a better time to educate ourselves about what tradeoffs we are making, consciously or unconsciously, with our data.
- As business people, we need to decide what kind of data stewards we will be, especially as data becomes more ingrained in business strategy.
- As an industry, we need to start putting clear and practical norms in place to clarify these issues so that we can have a fair and productive conversation about them and, frankly, set a good example.
I’ve outlined a lot of these issues and recommendations in The Trust Imperative: A Framework for Ethical Data Use. If you’re not lying on a beach somewhere, I’d love your thoughts and feedback.