Today we're celebrating International Data Privacy Day with a special edition of our podcast, and a conversation with industry-leading experts: Dr. Maritza Johnson, Rowenna Fielding, and Dr. Nathan Good. We've spoken with both Dr. Johnson and Ms. Fielding on the Zero Hour podcast, and we're excited to have Dr. Nathan Good join us here, too. We talk GDPR, CCPA, and IoT. That's a lot of letters, but our conversation is, I'm happy to report, relatively jargon-free. And, contrary to what you might think about privacy experts, these folks are insightful and hilarious.

 

Without further ado, here are our experts and our conversation:

 


Dr. Maritza Johnson

Security & Privacy Researcher

Rowenna Fielding

Head of Individual Rights & Ethics

Protecture

Dr. Nathan Good

CEO, Appcensus

Principal, Good Research

 

 

It strikes me that the principal challenge in advocating for data privacy is that we are swimming against a behavior that has become entrenched since the internet went mainstream in the '90s: trading privacy for convenience. How would you address this topic with customers who have become accustomed to display ads reminding them what they left in their carts, or who are now inured to "personalized" recommendations?

Maritza Johnson: I'm hopeful that we'll reach a point where the sustainability of a technical deployment is as important as innovation, where we truly stop to ask the important question of should we, instead of stopping at can we. I'm not sure the pressure will come from consumers; I'd be very surprised if it did. It is so tough for any one person to understand the details of how data is collected, used, shared, and protected that I'm not sure anyone could assess the broad implications. There's a basic level of transparency that's simply missing. And depending on the service or app, it's even harder. We already know from things like social networking that opting out isn't always possible.
 
Rowenna Fielding: I don’t believe it’s an active, consensual trade - more that people have resigned themselves to a learned helplessness in the face of mystifying technological gubbins. In my experience, most of the people who would willingly trade their privacy for convenience are those who would be affected least by the shadier uses of their data; people who aren’t routinely discriminated against, judged unfairly or expected to bear the weight of society’s biases. These people don’t see the damaging effects of surveillance and profiling, so they don’t feel the need to employ defensive tactics. For those people who do understand the risks, the amount of effort that is required to protect themselves is overwhelming. I’d address the people who work in digital marketing, web design and social media instead, as they are the people benefiting from these technologies most. If I can get even a few organizations to consider privacy-friendly alternatives, then that’s a few thousand fewer data points in the hands of the organizations who profit from manipulating our lives through our data. That’s a win.
 
Nathan Good: Given the pervasiveness of retargeting and personalization, one could easily assume that people are comfortable with the technology behind it. I’m always surprised that this is not the case. When consumers tell me what privacy means to them, they almost always point to a specific instance where retargeting or personalization did something creepy and unexpected: ads for shoes they were looking at on a blog showing up in their social media, or a recommendation for a hotel at a destination they had only mentioned to a friend. The biggest surprises come when these ads appear and it isn’t obvious to the person how the technology could have known, and they get creeped out connecting the dots. In some cases the systems have gotten so good that they seem to know what the person is thinking about in real time, and are so accurate that people let their imaginations run wild. It’s hard for experts, and even harder for consumers, to understand how much of the digital information they leave behind can be combined to create fairly accurate inferences. From my perspective, the challenge in privacy has been less about people getting so used to the technology that they don’t notice it, and more about giving businesses, consumers, and even governments the tools to explain, understand, and manage customers’ choices when they do notice it. Context is an important part of privacy, and providing tools for managing context would help reduce surprises and the kind of polarized behavior we see from customers today.

 

Ever since Cambridge Analytica, we have seen a torrent of headlines about data breaches, with the number of compromised records seeming to increase by orders of magnitude each time. If you read a bit deeper, these breaches, while associated with particular brands, actually tend to be the result of third-party vendors misconfiguring their databases or simply not securing them at all. What steps can companies take to reduce this risk among suppliers and partners? And what are the legal implications, whether through GDPR, CCPA, or other legislative measures?
 
Johnson: We need better tools for audits and compliance. Sometimes it feels like third-party vendors are bound by a pinky promise and a vow to take privacy seriously. It's not enough, and we have to do better. It'd be great to see advances in technical tools for enforcing limitations on data use.
 
Fielding: Spend more than zero time and effort on pre-contract assurances (the time and effort should be proportionate to the risks, not just to the organization but also to the data subjects) - preferably, ask for audit reports and policies to review. Don’t choose suppliers based solely on price or buzzword-bingo score. Remember that contracts are not an effective protective control on their own - anyone can make promises, but making sure those promises are kept is what matters. While a contract might give you standing to go to court to recover your losses if the third party screws up, that’s no use to you if your business has been damaged too badly to afford legal representation, or if you can’t show that you were diligent in your own responsibilities. The legal aspects only kick in after the operational ones, so manage the operational risks first.
 
Good: As you pointed out, so many data breaches are the result of companies or third parties not following basic security practices that we have known about for many, many years: patching systems, changing default passwords, assigning and maintaining roles, updating and maintaining firewalls; the list goes on. When working with third parties and suppliers, there is an added risk that they could flat-out lie to you or, more likely, not even know what is in place. There are several ways to mitigate this, but the most important is asking the right questions and being familiar with standard security practices. Don’t work with anyone who can’t tell a reasonable security story around their product that addresses at least the most common issues. For more risk-critical applications, demand and expect more proof that they have good security and privacy controls and can back up their story. Legal frameworks like GDPR and CCPA speak to the obligations companies have to protect data, and it would take a lot to go into them in depth. The spirit of much of the regulation is the same, though: take responsibility for knowing what data you have, and take steps to inform users and protect it. If you can put the customer first when thinking about data protection and privacy, that goes a long way toward both delighting customers and complying with regulations.
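As a rough illustration of the "asking the right questions" step Good describes, here is a minimal sketch of how a team might record a vendor's answers to a baseline security checklist and flag the gaps before signing. The questions, names, and structure are hypothetical choices made for this example, not a recognized assessment framework.

```python
# Hypothetical baseline questions; a real assessment would be proportionate
# to the risk of the data being shared, as Fielding notes above.
BASELINE_QUESTIONS = [
    "Are systems patched on a defined schedule?",
    "Are default passwords changed before deployment?",
    "Are access roles assigned, reviewed, and revoked?",
    "Are firewall rules maintained and reviewed?",
    "Can the vendor provide recent audit reports or policies?",
]

def review_vendor(answers: dict[str, bool]) -> list[str]:
    """Return the baseline questions the vendor could not answer 'yes' to."""
    return [q for q in BASELINE_QUESTIONS if not answers.get(q, False)]

# Example usage with a hypothetical vendor's partial answers
gaps = review_vendor({
    "Are systems patched on a defined schedule?": True,
    "Are default passwords changed before deployment?": False,
})
print(f"Unresolved items: {len(gaps)}")
for gap in gaps:
    print(" -", gap)
```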
 
Today, we're seeing a host of products come to market that are designed for privacy: new players like the Brave browser, which are actively taking on data-sharing ad models and the organizations behind them, and products from legacy companies, like Verizon's new privacy-focused OneSearch search engine. What is your take on this shift in the market?

Johnson: I'm thrilled to see privacy emerge as marketing fodder. But to be honest, I always get nervous about whether the claims match the implementation; I don't want people to get burned out on false promises.
 
Fielding: It’s really encouraging to see organizations breaking away from the stranglehold that surveillance capitalism has on the internet. I hope it continues and we see more organizations like Brave, Disconnect and DuckDuckGo emerging in the consumer space. However, it’s still an arms race, where the big guns are in the hands of companies whose business model depends on undermining people’s autonomy and personal space, which is why we also need regulation and enforcement to keep things level.
 
Good: I’ve found this very encouraging, as it is much too difficult for customers to try to manage privacy on their own. Without tools and technology in this space, customers will not be able to exercise the control they need, nor will we be able to move the privacy conversation toward things that work. I also find it encouraging that companies have recognized the need for some type of privacy story when they talk with customers; privacy conversations are easier to have with companies nowadays.

 

We have GDPR; we have CCPA here in the US, rapidly being followed by the proposed CPRA revision; and there are new laws in New York. What does this variegated legal landscape mean for businesses? How should they incorporate the spirit of these regulations into the design of their processes and products?

Johnson: Smart businesses treat legislation as a minimum bar. Read the legislation coming out and aim to do better. If your goal is merely to be compliant, you're not doing enough, and you're probably not doing it in a way that matters.
 
Fielding: Decisions about risk in relation to values need to be made right at the top of organizations, not left to operational circumstances. Right now there is a huge disconnect between what organizations purport themselves to be and how they actually behave. Part of this is a tendency to characterize ‘compliance’ as a burden or a barrier to be wriggled around, especially when it interferes with revenue. The spirit of the GDPR is "don’t be a git (with people’s data)" - if processes, systems and products were designed with this as a first principle, then "compliance" becomes a side effect of doing a good job rather than an undesirable optional extra. There’s a conflicting imperative between "maximize growth and minimize cost" and "don’t harm people", which is why we have regulation in the first place. Managing this conflict is something that executive boards should be doing but aren’t - instead they reap the rewards of the former while throwing their minions onto the bonfire for getting caught making the "wrong" decision. An organization that places human rights over profit is less likely to annoy its customers or employees into making the sort of complaint that triggers regulatory scrutiny, so my advice to organizations seeking to navigate the maze of international law is: pick the highest standard for human rights and corporate social responsibility, and you’ll be on the safest ground.
 
Good: I think it is important for companies to remind themselves of the motivation and intent behind these different pieces of legislation, and to do their best to align themselves and their operations with that intent. GDPR firmly established privacy as a human right, and CCPA was created in reaction to rampant and unchecked data collection and use by companies. An undercurrent of all of this legislation is the public’s desire to re-establish a basis of respect between consumers and businesses going forward. If companies strive to respect their users and those users’ sense of agency, a lot of good design ideas and implementations naturally fall out of that process.
 

Even before 5G's arrival, we've already seen breaches of IoT devices. What would you tell product designers to keep in mind vis-à-vis data privacy? For example, should there be a universal protocol for how devices communicate with one another and share data?


Johnson: I think product designers are the real leverage point for making changes to data privacy and security. Don't settle for the status quo on data collection and use. Think hard about the context in which people will use your product: what are they expecting, based on the value your product offers? People don't want to be surprised by unrelated bonus features. When they install an app that does a certain thing, they expect to get that feature and only that feature. Don't be lazy and do something a certain way just because it's easiest or because that's how it's always been done. Minimize the amount of data collected, minimize how long the data is retained, and reduce the fidelity at which it's stored.
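Johnson's three closing points, minimizing collection, minimizing retention, and reducing stored fidelity, translate fairly directly into code. The sketch below is one illustrative way a product team might apply them before anything is persisted; the field names, the two-decimal coordinate rounding, and the 30-day retention window are assumptions made for the example, not recommendations from the interview.

```python
from datetime import datetime, timedelta, timezone

# Illustrative assumption: a 30-day retention window.
RETENTION = timedelta(days=30)

def minimize_event(event: dict) -> dict:
    """Keep only the fields the feature needs, at reduced fidelity."""
    return {
        # Round coordinates to roughly 1 km instead of storing exact GPS fixes.
        "lat": round(event["lat"], 2),
        "lon": round(event["lon"], 2),
        # Keep the hour of use, not the precise timestamp.
        "hour": event["timestamp"].replace(minute=0, second=0, microsecond=0),
    }

def purge_expired(events: list[dict], now: datetime) -> list[dict]:
    """Drop anything older than the retention window."""
    return [e for e in events if now - e["hour"] <= RETENTION]

# Example usage: the device identifier is never persisted at all.
raw = {"lat": 37.774929, "lon": -122.419416,
       "timestamp": datetime.now(timezone.utc), "device_id": "abc123"}
stored = minimize_event(raw)
history = purge_expired([stored], now=datetime.now(timezone.utc))
print(stored, len(history))
```

The point is less the specific thresholds than the shape of the pipeline: drop what you don't need before it is ever written down.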
 
Fielding: Know your data protection rights and demand that they are upheld. Don’t take any corporate claim to virtue at face value. Say no to surveillance. And don’t mistake security for privacy - a company that’s got your data locked up right may still be abusing the hell out of it within their own walls.
 
Good: Much of what people refer to as IoT technology, and the kinds of public breaches we have seen, uses standard internet protocols and has the same kinds of security issues we saw with early routers and other early connected devices: open ports, weak and unchanged default passwords, outdated OSes or vulnerable packages, and so on. As with other security issues like data breaches, the basics would really come in handy here. IoT devices do face challenges such as power limitations, but many of the devices that have contributed to massive attacks had the capability to thwart those attacks using simple measures we already know about. In terms of privacy and security, IoT isn’t uniquely behind other industries. Security, privacy, and correctness are goals IoT shares with every other software-driven industry, and there’s much room to share techniques to benefit everyone.
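To make the "basics" Good mentions concrete, here is a minimal sketch of the kind of hygiene check he is describing: probing a device on your own network for commonly exposed services. The port list and the example address are illustrative assumptions, and a real assessment would also cover default credentials, patch levels, and firmware updates.

```python
import socket

# Illustrative, not exhaustive: ports frequently left open on consumer IoT
# devices, mapped to the service that usually listens there.
COMMON_PORTS = {21: "ftp", 23: "telnet", 80: "http", 554: "rtsp", 8080: "http-alt"}

def open_ports(host: str, timeout: float = 1.0) -> dict[int, str]:
    """Return the common ports on `host` that accept a TCP connection."""
    exposed = {}
    for port, service in COMMON_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                exposed[port] = service
    return exposed

if __name__ == "__main__":
    device = "192.168.1.50"  # hypothetical camera or hub on the local network
    for port, service in open_ports(device).items():
        print(f"{device}:{port} ({service}) is reachable -- check whether it "
              "is needed, patched, and not protected only by default credentials")
```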
Written by Kevin Walter
Last updated July 6, 2020
