S3 #12 Integrate Privacy To Unleash Business Value

Guest:
Anthony Prestia

Anthony has more than a decade of privacy experience as an attorney and developer of software to assess privacy compliance. Before joining TerraTrue, Anthony served as Senior Privacy Counsel at Snap Inc. and spearheaded the evolution of Snap’s world-class privacy program to comply with the GDPR. Previously, Anthony was an associate at Perkins Coie, working as outside privacy counsel to startups and Fortune 50 companies. Before that, he drafted the first self-regulatory privacy code for the mobile advertising industry and developed software to monitor participating companies’ compliance for the NAI (Network Advertising Initiative).

We're living in a very complex privacy world, compounded by new data privacy laws, a slew of digital technologies such as IoT, mobile, and immersive tech, and increasing vulnerabilities. We're all wondering how to balance the risks and opportunities for our organizations, which is why Anthony Prestia of TerraTrue joined this episode to explain the best approach to compliance while maximizing rewards.

Keywords:
privacy, consumers, data, risk
Season:
2
Episode number:
12
Duration:
33:58
Date Published:
July 16, 2022

[00:00:00] KRISTINA: We're living in a very complex privacy world compounded by new data privacy laws and a slew of digital technologies, such as IoT, mobile, immersive tech, and increasing vulnerabilities, as we all know. We're all wondering how you balance the risks and opportunities for your organization, which is why we're here today. So, stay tuned.

[00:00:19] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.

[00:00:34] KRISTINA: Businesses everywhere are challenged by the increasing need to weigh the risks and opportunities that arise from massive amounts of available consumer data. With us today to discuss the nuances is Anthony Prestia. Anthony has more than a decade of privacy experience as an attorney, believe it or not, and also as a developer of software to assess privacy compliance. Before joining TerraTrue, Anthony served as Senior Privacy Counsel at Snap Inc. If you don't know, they are the makers of Snapchat. He has also advised startups and Fortune 50 companies. And one of the things that I found out recently about him is that he used to live in DC, but we're not going to hold that against him. So, if anyone knows how to best balance the opportunities and risks of digital and data, I think it's going to be Anthony. Happy to have you here today. Thanks for making time to share your knowledge and insights with us.

[00:01:25] ANTHONY: Really excited to join. This should be fun.

[00:01:28] KRISTINA: Oh, I'm so excited about this because it's very rare. I think it's only been on two occasions now that I've been able to talk with lawyers who are also fun, because they understand both the legal side and the operational side. So, you're in that sweet middle lane.

[00:01:40] ANTHONY: I also try to avoid talking to lawyers whenever possible, despite being one. So that makes sense to me.

[00:01:45] KRISTINA: Awesome. Awesome. All right, I found the right person to hang with today then. So, you've spearheaded many successful data compliance programs over the years. I want to just dive in and find out: what is your formula for helping organizations adopt new technologies and data analytics in sync with emerging global privacy regulations, which are kind of like whack-a-mole, a new one every day?

[00:02:05] ANTHONY: Yeah, that's a really great question. I actually started in the privacy space before smartphones were even a common thing. And back then, we spent a lot of time thinking about cookies and banner ads, and a lot of these privacy concerns really only mattered when you were inside, sitting at your computer, maybe having asked a family member to get off the phone so you could dial in. Then jump to today, and our relationship with technology, I think, has fundamentally changed. Probably every person listening to this has a smartphone nearby that's always sending data from a dozen sensors to someone on the internet; we're not exactly sure who. So the amount of data being collected, and the potential for both good and harm, is just so much greater than it was, say, a decade and a half ago. Despite that, though, I do think the core principles and practices to build a good privacy program haven't really changed all that much. We could talk for an hour about ways to identify and prioritize compliance requirements and things like that. I'm going to skip it because, number one, I think it's table stakes for any privacy program, and there are a lot of resources on it out there in the wild. I also think it's kind of dreadfully boring. So, I'd like to talk about the two core things that I think transform a program from a box-checking compliance program into a truly successful privacy program. And those are trust and education.

So, what is trust? I think it's really about your privacy function building a relationship with the rest of the business. Meeting your marketers, PMs, engineers, C-suite, whoever it might be, and making it clear that you want to be a trusted partner of the business and not just this obstacle to overcome on your way to accomplishing something. I think the first step to do that is just making yourself available, making it easy for anyone to reach out to you in whatever way they like. I've seen it all: Jira tickets, emails, Slack messages, people showing up at your desk in the olden days. Just make yourself available. The second is to really get to know the people on the ground designing products, deploying marketing programs, and developing software, and to spend most of your time listening, trying to figure out what their goals are. As a privacy professional, it's not your responsibility to tell people what they can and can't do; it's really to help them achieve their goals in a privacy-safe way, and you can't do that unless you know what those goals are.

And that butts up against the second point, which is all about education. I fundamentally believe that most people want to do the right thing when it comes to privacy and just don't know how. A good example, I think, is back when I was at Snap: we were trying to come up with a set of privacy principles, alongside our designers and program managers, that we could really look to when we were trying to address issues. One that really stuck with me was this idea that we only wanted to develop products that handled information with the same care we'd use for ourselves, our friends, and our family. And I think that's probably true of most people at most organizations.

So, it's really about creating this framework for folks to make good decisions and spot potential issues, and making sure that every interaction between the privacy team and the business is a learning opportunity. Never just saying, yes, you can do this, or no, you can't do that, but building that framework. Your goal shouldn't be to make everybody in your org fluent in the nuances of every single privacy law, but really to give them a good feel for a core set of privacy principles that have been around since the seventies, and to develop empathy for their end users. And ultimately just saying, hey, reach out and ask if you're not sure; there's never going to be a penalty there.

[00:05:55] KRISTINA: So, do you think we're actually going to reach a point anytime soon, let's say in the next decade even, where we're all really practicing the "right" (I'm going to put right in quotes here) data privacy practices? Because I oftentimes wonder if we're not on this journey almost like accessibility. Web accessibility. Digital accessibility has been around since '95, and we're only at a point now where governments are legally required to do it. Some countries do require private sector companies to also be accessible. But even though we know it's good for us and we know it's good business (forget about just being nice and trustworthy; it's just good dollars and cents for most businesses), most organizations don't do it because it's a little bit hard and people have to learn. Are we there with privacy as well? Or do you think it's going to go a little bit faster?

[00:06:42] ANTHONY: I think it's going to go faster. And I think, realistically, we need to demystify the space a good amount. We talk about the complexity, and certainly in the US we've had a patchwork of sectoral laws and things targeting specific types of data processing. But realistically, like I mentioned before, a lot of this boils down to some core principles that were developed in the seventies around what folks worry about, or ways to treat people's data responsibly. So, things like giving folks access to data, being transparent about how you're going to use it, those sorts of things, and just having better literacy among business folks so they understand these concepts. Because realistically, a lot of them don't require technical implementations or anything like that; they just need to be in the minds of your business folks and your engineers when they're developing a product or coming up with a marketing campaign. So, I think as we get better literacy across the board, as we develop tools that make reaching out for help faster and easier, or that help you understand how your org is already using data, we'll get there. Certainly, it's going to be a long journey. It's been a decades-long journey at this point, and we're always going to be outpaced by the technology itself. And I think that's right; you don't want regulation or practices to be ahead of the technology necessarily, but you do want them to respond in a timely manner. And I think we're on the right path for sure.

[00:08:08] KRISTINA: I was thinking the other day, mostly about email, just because I get these crazy emails constantly. So, my complaint isn't that companies are collecting my personal information. I'm just like, okay, look, if you're going to collect it, can you please at least use it? The other day I got two emails from the same company, two minutes apart, for completely different things, with different percentages off. And it made me think: if I wait and don't buy anything for another hour, will you up the game and offer me not 10%, not 20%, but 30%? Maybe I should just be a holdout. And I'm thinking to myself, what crazy marketer out there is doing this? I think what has happened is I give people way too much credit. I think they're collecting all this data and not actually doing a good job with it. So how do you deal with that inside the organization? Because it's not just about laws and compliance. Data is an opportunity; if you know how to use it, it becomes an opportunity. So how do you flip that inside the organization, so it's not a conversation constantly about risk and not doing something? Because consumers like me, arguably, aren't afraid to give up some of their data if they can get something really good in exchange.

[00:09:17] ANTHONY: Yeah, look, compliance, again, I think is kind of table stakes and hopefully something you don't spend most of your time talking about when you're trying to do privacy by design. Place that burden on folks like me who are going to have to keep your records of processing up to date and deal with all of this nonsense that no one particularly enjoys, and instead tackle the really interesting product questions about how we use people's data. So, when I talk about getting to know the developers and folks like that, it's figuring out what their goals are. What are they trying to do with this email campaign? You're trying to sell something. You're trying to offer the right discount to the right person at the right time. It's helping them scope the data collection and use to achieve that goal in a way where you're using the minimum amount of information necessary to do it, but you're also using everything that you collect. Having these conversations may even help you realize: oh, we collect all this stuff; well, we either need to use it and deliver an effective product that achieves our goals, or we just shouldn't be collecting it in the first place, so let's get rid of it. So, I do think these conversations, when you're doing a privacy-by-design review, can lead to a better product, because you really start to wrap your head around the data you have and how you're using it, and you can potentially improve what you're doing. I hear you; there's a fine line between "hey, why isn't this better?" and "hey, please stop collecting information about me," and hopefully we can help resolve that through these processes.
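
The scoping exercise Anthony describes here, collecting the minimum a campaign needs and actually using everything collected, can be pictured as a simple diff. This is not from the episode; it's a minimal Python sketch, and the field names and campaign spec are hypothetical illustrations.

```python
# Hypothetical illustration of "collect the minimum, and use everything you collect."
CAMPAIGN_NEEDS = {"email", "last_purchase_category"}  # fields this campaign actually uses
COLLECTED_FIELDS = {"email", "last_purchase_category", "birthday", "browsing_history"}

unused = COLLECTED_FIELDS - CAMPAIGN_NEEDS
if unused:
    # Either put these fields to a real, disclosed use, or stop collecting them.
    print("Collected but unused; candidates for deletion:", sorted(unused))
```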

[00:10:46] KRISTINA: So that also makes me think, because it always comes down to the human. That's really what you're pointing me towards: it's about the human. It's about that marketer who thought it was a great idea to collect the information, but then nobody did anything with it. And so, you're possibly putting your organization at risk without necessarily getting any benefit; it's only a risk to you. How do we best address the human factor? Because it really is about the people, I think, and that's why it seems to be taking us so long to reach, hopefully, a Nirvana state around privacy. So, how do you improve that privacy compliance chain when it's about the people?

[00:11:17] ANTHONY: Look, you hit it right on the head. If there is a privacy issue, it's always the human factor. Privacy and privacy risk are very human concerns. If you're out, say, jogging in the forest alone, you're surrounded by life, but you're not worried about what the birds think of you. What we really get worried about are these technologies developed by people and how people end up using our data. Sometimes you'll hear folks say, oh, it's the technology. But technologies are designed by people. Even if it's ML and AI, somebody decided what corpus of data we're going to train this model on, how the model is going to look at it, what its output is going to be, and what the reach and ultimate use of this data will be. So I think at every step, there are humans making decisions, right? Whether it's your organization or a marketer internally deciding to use data in a certain way, or maybe they're working with a vendor who's going to send out email marketing and that vendor has had a breach or doesn't have the best privacy practices. There's a whole chain there. There was your internal decision about how to use the data. There was your decision to work with a certain vendor; did you vet their privacy practices before you started working with them? Then there's the vendor itself; when they were designing their products and processes, were they thinking about privacy issues? So, I think by the time we end up with a privacy concern, we've had to go through multiple tiers of folks making decisions that were not privacy-protective. The way we address that is, again, through better education across the board; there are so many human checkpoints that if just one of them is in a position to take a critical eye at how the data's being used and what the impact could be on individuals, we can snuff out some of these problems. So again, back to my core point: the two things you really need for a successful program, one of them is education, and there it is right there, making sure we're educating folks across the business to identify issues and giving them the right framework to find these things.

[00:13:18] KRISTINA: So, there's always that consumer-facing component, and educating individuals so that they're thinking about the consumer and the data we're collecting, and whether we're doing the right thing or not. What about the people working inside an organization? I ask because the other day I read that Accenture has decided to onboard 150,000 employees through the metaverse. And there's a whole bunch of biological data and information that can be collected, anything from your gaze and whether you're paying attention to how alert you are. It could impact anything from your performance on the job all the way to whether we offer you healthcare. So, it's not just about the consumer; it's also about the employees. What advice do you have there? Is it the same thing, and we're just not paying as much attention internally?

[00:14:05] ANTHONY: I think you have to look at the type of organization you want to be and think of your employees and consumers and everyone else, as I mentioned before, like you would your family and your friends, and ask what the right thing to do for these folks is. Obviously, you consider efficiency and business goals and things like that, but it's always going to be about what the potential harm to the individual is and how much of a dystopia you potentially want to live in. So it's really about designing your products, processes, and internal programs to make sure you're respecting individuals in the type of data you collect about them and how you use it. I try not to think of employees any differently than I do consumers. We talk about consumer data a lot because it's the stuff that typically gets the headlines, and it's something that's ubiquitous and impacts all of us, whether we want it to or not; there's no relationship there like the one between employee and employer. It's more like I'm out in the world and there are microphones and cameras everywhere. But I think it's equally important to address those problems for employees, for sure.

[00:15:07] KRISTINA: And so how do you balance that out? It seems to me a little bit creepy that an employer's going to have all this data about me and potentially use it in a way that I don't want it to be used. But then I thought, what if somebody's about to have a stroke or a heart attack at home, and an employer could tell that and summon 911 emergency folks, and all of a sudden, that person's life is changed? I think it's about perspective, and about when it's okay to use the data and when it's not. But do you feel like we're at a point where people are really thinking 360 about this kind of stuff? Because it doesn't feel like it, at least not to me.

[00:15:43] ANTHONY: Look, some are, and some aren't. And I think over time, again, we're going to see more literacy in privacy, and folks will think about these things. Yes, there are wonderful uses for technologies like that, especially ones that monitor your heart rate or these biological signals and could do something like say, oh, we think a stroke is happening here. But you do have to ask the question: who's the right person to be monitoring that and making that decision? It feels odd to me that the employer would be the one doing that in many cases. What is the real motivation for collecting this data? How do you need to use it? What's reasonable here? Realistically, what is the reasonable expectation between the person collecting and processing that data and the individual whose data it truly is? So, I think that's the important thing to consider, and to have those purpose limitations in place, which is a core concept in privacy. And we do have a lot of protections around situations where decisions are being made that can have a significant legal impact on an individual. So, when you're talking about insurance decisions, things like that: can you rent a property? Can you get a loan? These sorts of things that impact your life could pull in data from any number of signals. That's one of the core things I think many privacy laws do try to address, and I think it's valuable, because, look, that's a real impact on your day-to-day that you may not expect: that your browsing history impacts your ability to get credit down the line or something like that.

[00:17:07] KRISTINA: Yeah, that's a little bit crazy. And to me, it was crazy meeting the other day with our school board representative, where I did not realize the extent to which they're collecting data about my child. Every single search term from the moment he stepped foot in the school and was handed his Chromebook has been collected and stored. And I was like, wow, interesting; for how long do you store that? What's the record retention? I'm cool with you monitoring him, making sure that he is not cheating, but I'm thinking, if he's going to get a job in the real world, is it okay to have that history somewhere on file? It could come back to bite him.

[00:17:42] ANTHONY: Yeah. And look, as the parent, you should have been the person in the position to understand what sort of data was being collected and, honestly, to give your thumbs up or thumbs down on how it's being used and how long it's being stored. So, part of it is just making sure the right information is getting to the right people, and it sounds like that maybe didn't happen in this case. Ultimately, there are good aims, like preventing cheating, especially in a remote environment where that becomes more challenging. But I do think we also need to question, when developing software, whether there's some sort of overreach. Think about what the norms are in a day-to-day conversation or interaction, and try to stick to that; try to uphold some of those norms rather than take it a step further. And this goes back to the Snap experience again. One of the core things that app was all about was having ephemeral messages to make online communication more like a conversation. I was going to say like the one you and I are having, but we're being recorded, so it's a terrible example. A conversation you may have with a friend or a family member, where there's no permanent record of it, and you're more willing to be yourself and genuine, which kind of disappears when everything is being monitored and cataloged. So I think when we use technology to collect this much information, we really need to think about the individual and societal impacts that come out of that. Does it change human interaction, just knowing that you're being observed and recorded? And is that the outcome we ultimately want? These are the sorts of questions I think you should be asking when you're working on products and features.

[00:19:26] KRISTINA: Well, is it the case that a consumer can ever assume that they're not being recorded? Because I'm thinking that even for legal purposes, we keep a record of certain things. You're right, this is a bad example, because not only are we being recorded, but we're purposely recording, and we're going to blast it around the globe. But if I were right now using Snapchat, for example, to chat with you, I would anticipate that whatever I sent to you is going to be deleted. That doesn't really happen, right? There's a record somewhere on some server, I presume, if nothing else, for some type of record retention or legal compliance purpose. And so, should we just always assume, sort of what your parents told you, that if you don't want it out there, don't put it out there, because it's always going to be somewhere; there's always going to be even forensic traceability that something happened?

[00:20:12] ANTHONY: So, look, as a general practice, it's good to assume that things you create digitally will be out there. But again, that goes back to my original point: because we can do this, it's shaping how we think and interact with one another online. You will behave differently if you assume everything you do online is recorded than you would in person, where you feel a little more genuine or frivolous or whatever it is that you don't want associated with your business persona online. Generally, if you're concerned about these things, it's a good practice to have in mind, but I don't think that technology has to develop that way. Certainly, when we talk about communication methods that are end-to-end encrypted, realistically, those shouldn't be accessible by anyone in between; it's just the node on either end. Sure, the person receiving it could record it and be a bad actor or whatever. But for the most part, we can design systems that do not give intermediaries access to that information. And we should consider when that's appropriate in various cases.
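
To make the "only the node on either end" point concrete, here is a minimal sketch of end-to-end encryption using PyNaCl, the Python bindings for libsodium. It's an illustration of the general pattern, not a description of how Snapchat or any particular messenger works, and the participant names are hypothetical.

```python
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relay server only ever sees `ciphertext`; it holds no private key,
# so it cannot read the message in transit.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```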

[00:21:15] KRISTINA: How do you do that in a multinational organization, especially when there's a complex environment? What is a practical approach, especially for somebody who maybe is in charge of a marketing team and is trying to figure out, okay, how many laws do I need to think about? Or how many laws do I want Anthony to be thinking about? How do I actually deal with this really complex environment? Or should we just take the most stringent of all laws and comply with that, which seems like a killjoy and doesn't allow for a lot of creativity and innovation, because we're going to be so incredibly nervous that we're probably just going to shut down all digital, since that's going to be the safest?

[00:21:51] ANTHONY: Yeah, now we're going to get into some fun compliance talk, but I'll try to keep it to the point and relevant for this audience. So, look, realistically, when you're in that sort of scenario, you need to make some initial high-level decisions. As you suggested: do we want to align ourselves across the board to the most stringent standard and do the same thing everywhere, or do we want to pick what our floor essentially is and then ratchet up and down in different regions depending on what's permissible there? A few things need to be thought about when you're doing that. One is, again, going back to the business goals and talking to people. What is the value of doing this marketing campaign in this region, in this way, versus, maybe, the stricter thing we're doing in Germany or wherever it might be? So, it's understanding the business goals and needs and deciding: do we need to change things in specific regions or for specific products? The other is that when we talk about privacy, as I mentioned before, it's very much a human consideration, and it's about expectations, and those may vary region by region. So even if you're just trying to look out for reputational harm and not pure compliance, you do need to think about what privacy means to someone in Saudi Arabia versus someone sitting where I am in Los Angeles; it could be a very different thing. The potential harms are very different as well, depending on a person's age, their mental health, or the way the data is collected. So, take those into account, and you may be able to do that just by the nature of the product you're trying to market. A health device may raise very different concerns than a pair of shoes. It's about taking all of those factors into account and having that open dialogue between the privacy folks and the marketing folks, and again, using it as an opportunity to educate and talk through these problems rather than just saying yes or no. One thing worth bringing up since we're on this topic: when I talk about a successful privacy program, probably the best thing you can do as a business stakeholder is just reach out to your privacy team early and start asking these questions, even if you're not sure there's an issue. This is especially true in the multinational context, because as a privacy person, if you come to me at the very end, after you've designed a marketing campaign, onboarded a vendor, and put together all of the materials you want to put out there, and then say, hey, is this okay? My options at that point are yes or no. You want to go live tomorrow? I can either say yes or no, and that's really not what I want to do. And it's probably not what you want either, because the answer's probably going to be: we've invested too much, so the answer is yes, but wow, we wish we had done some things differently. If you come to your privacy person early, your toolbox is much greater. You can start thinking about what other options you have. Can we tweak the list of people we're marketing to? Can we target different interest buckets? How do we develop the opt-out for this communication? Whatever it might be, you have so many more options that you can collaborate with the business on, versus just a hard yes/no kind of thing.
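
One way to picture the "pick a floor, then ratchet by region" approach Anthony describes is as a layered policy lookup. A minimal sketch follows; the regions, rules, and retention numbers are hypothetical placeholders, not legal guidance or anything stated in the episode.

```python
# Hypothetical global baseline applied everywhere.
GLOBAL_FLOOR = {
    "honor_opt_out": True,        # always honor opt-outs
    "max_retention_days": 365,    # keep marketing data at most a year
    "use_sensitive_data": False,  # no sensitive categories by default
}

# Hypothetical stricter regional rules layered on top of the floor.
REGIONAL_OVERRIDES = {
    "EU": {"require_opt_in": True, "max_retention_days": 180},
    "DE": {"require_opt_in": True, "max_retention_days": 90},
}

def policy_for(region: str) -> dict:
    """Start from the global floor, then ratchet up with any regional rules."""
    policy = dict(GLOBAL_FLOOR)
    policy.update(REGIONAL_OVERRIDES.get(region, {}))
    return policy

print(policy_for("DE"))  # stricter: opt-in required, 90-day retention
print(policy_for("US"))  # no overrides: falls back to the global floor
```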

[00:24:59] KRISTINA: I love that you brought that up because that's what I hear from every privacy professional, every lawyer these days: come early, because we're here to support and enable you. But I think it's always scary, or it's an afterthought. You have more options the sooner you reach out. So, I love that piece of advice. What's interesting to me is we keep coming back continuously to the human aspect, and undeniably, humans are at the core of privacy. But you're now at TerraTrue, where you're spearheading privacy at, I guess, the first privacy platform designed to "seamlessly" (you can't see me, this being audio, but I'm putting air quotes around seamlessly) work with product development, not against it. And that sounds great. I was trying to really envision what that means. What exactly is the product, and how does it help a business achieve compliance? Because obviously, you're trying to do something with software, with a product, that helps enable and maybe even make up for the human aspect.

[00:25:54] ANTHONY: Yeah, look, it goes back to the things I was talking about for making a successful privacy program: what tools can we build to help make sure those things happen? So, the core product itself, if I'm a privacy or security professional, is where I can live and do all of my reviews and automate away a lot of the boring compliance stuff: keeping up certain types of records, not needing to manage my own templates for different regional laws, and all that sort of thing. So that I, as the privacy person, can spend less time on the compliance bit and more time on the interesting, impactful work of collaborating with the business to talk through potential solutions and options. It helps with the education component in that way, just by freeing up time for your privacy folks. The part I think is compelling is getting to that bit about coming to the privacy team early and making it as easy as possible to do that. Fundamentally, I do not care if a PM or an engineer or anyone else ever logs into our product, as long as their roadmap, their questions, and things like that can get into TerraTrue so the privacy and security teams can see them. So we focused on building out a lot of integrations with tools like Jira. If your product roadmap is in Jira, you can have triggers that automatically say, hey, here's a new product or feature, or a change to a product or feature, that may impact how we're processing data. That gets on your privacy team's radar right out of the gate, and you can start a conversation back and forth. Similarly, we do it for Slack and Ironcloud and a bunch of other tools. So if your marketing team never wants to see Jira, and I don't blame them, they can ask that question via Slack, and it shows up in TerraTrue; the back-and-forth communication can happen within Slack, but it also syncs back to TerraTrue. So different parts of the business can use the tools they like to request help. We also built the platform to learn from every privacy request that comes in. So we can say, oh, hey, look, in the past this was approved: for online marketing campaigns, you could use these pieces of data from these types of individuals to advertise these sorts of products. TerraTrue will learn that. Then the next time the marketing team wants to, say, buy a new set of email addresses or target a new segment of potential customers, the tool will say, oh, hey, look, here's the only thing that changed; this is the only thing you really need to review from a privacy perspective, and it can provide real-time feedback to the business as well. So we can say, hey, look, you've actually considered this to be higher risk in the past, and maybe you want to go down this avenue instead, that sort of thing. But the goal really is to build out tools that help us shift privacy left. This is a concept that has been in the security space for a long time, and privacy is only now getting up to speed with it, which is to say: start this privacy discussion and feedback very, very early in the product or marketing development process, and provide that feedback then. So, you've got this whole toolkit available to you rather than a simple yes/no.
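
The intake pattern Anthony describes, where a roadmap change in Jira automatically lands on the privacy team's radar, can be approximated with a simple webhook listener. This sketch is an illustration only, not TerraTrue's implementation; the endpoint path, the payload fields (a simplified approximation of a Jira webhook body), and the keyword heuristic are all hypothetical assumptions.

```python
from flask import Flask, request

app = Flask(__name__)
review_queue = []  # stand-in for the privacy team's real review queue

# Crude heuristic: flag tickets whose text suggests personal-data processing.
DATA_KEYWORDS = {"email", "location", "biometric", "tracking", "analytics"}

@app.post("/webhooks/jira")
def jira_webhook():
    event = request.get_json(force=True)
    fields = event.get("issue", {}).get("fields", {})
    text = f'{fields.get("summary", "")} {fields.get("description") or ""}'.lower()
    if any(keyword in text for keyword in DATA_KEYWORDS):
        # Queue the ticket so privacy can start a conversation early.
        review_queue.append(event.get("issue", {}).get("key"))
    return {"queued": len(review_queue)}
```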

[00:28:57] KRISTINA: So is there a way for organizations to understand what they don't know? Because I think the idea of coming to the privacy team is so great, and people should come early. In fact, maybe just take your privacy professional to lunch once a week and get them up to speed on what you're working on.

[00:29:12] ANTHONY: They should take you to lunch, but yes.

[00:29:14] KRISTINA: Okay. Well, so there should be lunch involved somewhere. Definitely, let's agree on that. There should be lunch involved somewhere, and I think constant communication, so you can have enablement. But what about all of these organizations dealing with the fact that they don't even know what they don't know? My HVAC company probably doesn't even know that they haven't asked me for an updated privacy policy or disclosed what information they're collecting. And I only say that because the installer who installed the HVAC system gave me one of these handy-dandy wall units, and when I complained that I didn't know how to update the software, they just swapped it out and brought a new one. But somebody somewhere probably should be getting my permission, or just not collecting my data. Yet they are collecting my data. So, how do they resolve that?

[00:29:58] ANTHONY: Yeah, look, I think when an org is first trying to stand up a privacy program, one of the things you need to do is take a look back and assess what you've already developed, what you've already released, what you already have. There are quite a few strategies to do this; certainly at TerraTrue we've built out tools to help with this, but I think a mix of things is effective. Some folks look at database scanning and figuring out what has already been collected, and that's fine and useful, but there always needs to be a human element on top of it as well, which is to say: okay, great, we've collected all these IP addresses, but why? What are they being used for? You need that functional, useful layer of intelligence on top of the raw "what have we collected." I like to recommend that companies, when first looking at this, consider a few things. One is: what are the most important components of your business, such that if a regulator came knocking on your door tomorrow and said, you need to shut this down because we're worried about X, Y, and Z privacy concerns, it would kill your business? Focus on those and wrap your head around everything happening with those product lines first. Also, try to identify anything that's potentially sensitive data. So, if you have a medical subsidiary, or you're doing something with biometric authentication, or you keep histories of location data, look at those as well. So, look at the things that are very risky from a business perspective and then the things risky from a pure data privacy perspective. Do that deep dive with the business, understand what's going on there and what's being built, and identify gaps. Beyond that, take your time, fill out the rest, and do it at the rate that's right for you; every privacy team and every company is short-staffed, so it could take some time. And maybe you really hit a lot of those as changes are made. So suddenly your HVAC is collecting some new information about how often you're running your AC; fine, that's an opportunity to reassess how that entire product works and wrap your head around what's going on there. So, I think that's a good way to start. And part of that assessment will be: what disclosures do we make to individuals? Presumably, there's some piece of software on your HVAC that lets you manage it; well, that's probably that company's best avenue to talk to you if they don't have, say, an email address tied to your account. So, consider: what information are you providing to the user there? How do we need to update it? What have we done in the past? And if you're not sure, put it all out there now, so that you know going forward. There's no one-size-fits-all, but that's a general framework for how I like to think about it.
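
Anthony's look-back triage, reviewing business-critical lines first, then sensitive data, then everything else at a sustainable pace, can be expressed as a simple prioritization over a data inventory. The entries and scoring below are hypothetical illustrations of the idea, not a compliance tool or anything from the episode.

```python
SENSITIVE_CATEGORIES = {"health", "biometric", "precise_location", "children"}

# Hypothetical inventory of data an org discovers it already collects.
inventory = [
    {"field": "ip_address", "purpose": "fraud detection", "category": "network", "business_critical": True},
    {"field": "thermostat_schedule", "purpose": None, "category": "behavioral", "business_critical": False},
    {"field": "heart_rate", "purpose": "wellness feature", "category": "health", "business_critical": False},
]

def review_priority(item: dict) -> int:
    score = 0
    if item["business_critical"]:
        score += 2  # a regulator shutting this down would hurt the business most
    if item["category"] in SENSITIVE_CATEGORIES:
        score += 2  # sensitive data carries the highest privacy risk
    if item["purpose"] is None:
        score += 1  # collected with no stated use: minimize or delete
    return score

# Review the riskiest items first; fill out the rest at a sustainable pace.
for item in sorted(inventory, key=review_priority, reverse=True):
    print(f'{item["field"]}: priority {review_priority(item)}')
```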

[00:32:34] KRISTINA: No, that's great advice. And so, I think that what I said at the beginning of this conversation, Anthony, is true, which is that you are an amazing resource to help us balance the opportunities and risks of digital and data and start to understand privacy. I think this is not just insightful but practical, and that's what all of us need, because otherwise it's so overwhelming that I think we would hide under a rock. Some great advice. And I hope that you'll come back and talk to us some more about what we can and can't do in the privacy space as we continue to evolve, because we need to take this one step at a time and, like you said, consider the human at the center and make it practical and actionable.

[00:33:11] ANTHONY: Absolutely. This was a lot of fun.

[00:33:13] KRISTINA: Thanks for coming and hanging out. I appreciate it. Have a good rest of your day, and enjoy Los Angeles!

[00:33:18] ANTHONY: I shall!

[00:33:20] OUTRO: Thank you for joining The Power of Digital Policy. To sign up for our newsletter and get access to policy checklists, detailed information on policies, and other helpful resources, head over to powerofdigitalpolicy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.
