Cat Coode is a data privacy consultant and founder of Binary Tattoo, with a mission to help protect your data. Backed by two decades of experience in mobile development and software architecture (BlackBerry), as well as a certification in data privacy law, Cat helps individuals and corporations better understand cybersecurity and data privacy.
She specializes in global privacy regulation compliance and delivering privacy education seminars. She is a member of the Canadian Standards Council for GDPR. She has worked with tech start-ups through to large organizations in a breadth of industries, including healthcare, education, and social media. Cat leverages her aptitude for teaching and her engineering background in both her speaking and consulting engagements to break down technical concepts in ways everyone can understand.
By making a few simple changes to your devices and accounts, you can maintain security against organizations' unwanted attempts to access your data. You will protect your privacy from those with whom you don't consent to share your information. And if you are part of a business, take note and start to treat consumer data privacy with the respect it deserves. Cat Coode talks about individual and organizational data privacy considerations, cybersecurity, and how to get a grasp on an increasingly complex digital world.
KRISTINA PODNAR, HOST: Thanks again for joining me here at the Power of Digital Policy Podcast. Today, I've invited a friend north of the border to come and speak with us about data privacy and security. It's a show that I'm happy you didn't miss because we will get into both privacy and cybersecurity from an individual and corporate perspective.
Without further delay, let me introduce you to Cat Coode. Cat is a data privacy consultant and the founder of Binary Tattoo, a consultancy whose mission is to help protect your data, whether you're an individual or an organization. She is backed by two decades of experience in mobile development and software architecture, as well as a certification in data privacy law. Cat not only preaches this stuff, she's had hands-on experience and has been through the trenches. Cat is one of my favorite go-to people on cybersecurity and data privacy. She probably doesn't know that I've been stalking her online as much as I have, but I'm delighted that she's here today. So without further ado, welcome, Cat!
CAT COODE, GUEST: Thank you so much. I love this stalking. I'm always interested in what people can find privacy-wise; you shouldn't be able to find anything.
KRISTINA: Oh, I found a lot. It was quite interesting. It's funny, because not only do I stalk you online and think about you often, but I invoke your name. I was just talking to a colleague this morning, and over our virtual morning coffee we were debating whether consumers are getting the short stick on data privacy. And I thought, oh, what a great first question to pose to Cat, thinking about the recent Facebook, LinkedIn, Stanford data leaks, scrapes, hacks, whatever you want to call them. So that's my first question for you today: are we as consumers getting the short stick?
CAT: Oh, we are totally getting the short stick. I think there's a sense of, no, I know there's a massive sense of entitlement with companies: because the data exists, they feel they should be able to have it. It's there, so why can't they have it and use it? And for us as consumers, it's not even buyer beware anymore, because we don't know. We don't even have the information to know what people are taking, and companies are leveraging that gap in order to take the data. A very scary example of this: in April 2021, Apple is introducing new privacy features so that if you are being tracked by apps, you can indicate that you do not want to be tracked by said apps. And already, Procter and Gamble was caught trying to make a deal with another software company so that they could circumvent that policy and track anyway, without individuals knowing it.
KRISTINA: Scary stuff. Scary stuff. So that leads to a good question. From an everyday person's perspective, when I talk to friends, colleagues, parents of my teenage son's friends, one of the questions they always ask is: what should we be doing? What are the guardrails that we should put in place for ourselves? They don't even know where to start. Where do you tell people to start when they're like, wow, Cat, this is some serious stuff, but I'm overwhelmed?
CAT: It is overwhelming. I think knowing that it's overwhelming is the first step, because a lot of people say, I'm not tech-savvy, I'm too old to understand this. No one understands this. No one has the full picture. Everything is new, everything is large, and it's a lot to understand. So know that you are not alone. You are not behind the times. It's just a lot. The first place I tell people to start is to open up the privacy settings on their phones.
It will take you time, but go through every single one of those settings, because you will be shocked at the number of applications that are connected to your microphone, your camera, your location, your contacts. I always tell people this, and the instant feedback is, "I had no idea." Because if something is attached to your microphone, maybe it is listening to you. If something is attached to your contact book, it is using that information; it is taking your contacts' email addresses, birthdays, addresses. So it's really key that you are aware, first and foremost, of what you have put on your phone and what it's taking from your personal store of data.
KRISTINA: So that's sort of interesting. You've done a lot to educate all of us about social media platforms and our privacy. I tried to count the number of articles you've done around these topics: something like 36 on Facebook alone, I think, and 44 or more on social media. That's a lot of advice. What are the top things you want users to know? And conversely, what should organizations take away from that knowledge?
CAT: Oh, wow. Okay. So when you're downloading social media, it's always a trade-off. We're so used to paying for things in money, and that puts a value on it. We have to retrain ourselves to think in terms of benefits and data exchange. A great example: in Canada, we have a grocery store that does points, and those points have an actual dollar value on them. In exchange for being able to accumulate these points, I am trading all of the grocery items that I purchase from that grocery store, so they have everything I've ever bought. I have no grocery shame, so I don't really care if they're tracking my groceries. But say, for instance, I bought all of my healthy food from a farmer's market and all of my junk food from the grocery store. The same grocery store chain here, it's called President's Choice, also runs insurance, including health insurance. So if a company I was working for decided to use President's Choice health insurance, that health insurance company might take a look at my purchase history and say, well, this woman only buys junk food, so we're going to hike her premiums up because she's clearly not living a healthy lifestyle. Here's an example where that data could be used against you. Or if you have a Fitbit, for instance, and you wear it for a long time, and then you just stop wearing it because you have a rash on your wrist. Again, health data: somebody looks at it and says, well, this person stopped exercising for six months.
So we're going to hike up their premiums because they're not taking care of themselves. There's all this kind of decision-making that can be made based on your data. That's why your data has value. It's not "Oh, I have nothing to hide"; it's "I have something to control." So look at it that way and ask: if I download this app, this piece of social media, if I share this piece of information and someone uses and abuses it, is it worth it for me? Is it worth it to connect with friends and community members on Facebook, knowing that Facebook is giving my data away to advertisers? If the answer is yes, and that is the cost of using the app, and it's worth it to you, then it's worth using. But don't just download an app because it's there; download it because its value is worth the cost you're paying.
KRISTINA: So what are the applications or services that you will not install on your phone? Give us a scoop. Who's the worst offender?
CAT: It's definitely TikTok. And actually, right now in the UK, they are trying to sue TikTok because they have been able to prove that it took a lot of information from minors without knowledge. The rules for kids under 18 are vastly different than they are for adults. They couldn't really catch them on the adult stuff, because again, TikTok is taking a tremendous amount of your information, but what they could do is say: okay, you didn't disclose you were taking contact information, biometrics, and emails from children. So they're going after them from that aspect. But you have to be careful, because privacy laws are different in different countries, and some countries don't have the same privacy laws. An American-based app has certain laws to protect different demographics; Canadian-based apps have higher ones; EU ones are higher still. But there are some applications coming from other countries that don't have those privacy standards, and so you can't rely on that. So that's one of them. The other one that makes me nervous is Clubhouse, which I know is super popular right now, but it takes all of your contact information again. And there was a quote-unquote breach of Clubhouse data: who you're connected to, who you invited, when you started on the service, a bunch of your handles. And Clubhouse said, whoa, wait, that's not a breach, that's publicly available data. To me, that shouldn't be publicly available data; that's not a good argument. And in some states in the US, publicly available data doesn't matter; it still counts as a breach.
KRISTINA: There are so many complex use cases out there. For example, a colleague of mine is moving to the EU, I think in the summertime. We were talking about the fact that she's had her own business for many, many years. She has a US-based phone number, and she said, you know what, I'm moving to the EU, but I'm going to keep my US-based number. I'm a T-Mobile customer, I'm going to take my phone and my phone number with me, and I'm probably going to have it for a while, just because it makes business sense and it's a good way for the family to stay in touch. Right? So now she's going to be residing in the EU, but she's going to have her US-based phone, her US-based phone number, and presumably all the apps she downloaded from the US on that iPhone. How does an organization understand not just who you are, because that's probably the much easier question, but where you are and what laws apply? It's becoming a patchwork for most organizations out there. What type of advice do you have for them?
CAT: Yeah, that's a really hard one. I've seen this handled in many different ways. My advice is always to go with the highest possible privacy standard, because the EU's GDPR is currently the highest standard, but we see all this new legislation coming out in the States, and we are getting new legislation in Canada. Going for that high standard is not just a stamp of approval; it really is protecting the individuals whose data you collect. GDPR, to me, isn't just an annoying checklist. It's actually validating that you've put privacy into your product and service, and it's protecting your users, or patients, or whoever you have there. So my number one piece of advice would be just to meet GDPR standards, because it simplifies everything. If you have a product that requires some kind of tracking, and you don't think GDPR applies to it, that's where location-based checks could come in: you could use location to verify where someone is coming from. But there's no single answer to this. I've seen other companies say, hey, if your email has a .fr or .de extension, you're European, and if it doesn't, then you're not. There's no simple way to identify it, which again is why it's easiest just to assume the highest standard of privacy practice.
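The email-extension shortcut Cat mentions can be sketched in a few lines, which also makes clear why it's unreliable. This is a hedged illustration only: the domain list and addresses are made up for the example, not anyone's real rule set.

```python
# A naive "is this user in the EU?" heuristic based on the email's
# top-level domain, as described in the conversation. Sketch only.

EU_TLDS = {"fr", "de", "it", "es", "nl", "ie"}  # illustrative subset, not exhaustive

def looks_like_eu_user(email: str) -> bool:
    """Guess EU residency from the email's TLD. An unreliable heuristic."""
    tld = email.rsplit(".", 1)[-1].lower()
    return tld in EU_TLDS

print(looks_like_eu_user("claire@example.fr"))   # True: caught by the heuristic
print(looks_like_eu_user("claire@gmail.com"))    # False: same person, missed entirely
```

The second call is the failure mode: an EU resident with a .com mailbox is invisible to this check, which is why defaulting everyone to the highest privacy standard is the safer design.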
KRISTINA: But one of the things I hear a lot from my clients is pushback on that, right? They say, well, I'd love to be able to do that. I would love to just give the highest level of privacy to everybody, but then I can't compete with the Amazons or the Facebooks of the world. So is there an unfair advantage? These big tech companies are getting away with using a lot of data and probably not giving everybody the same level of rights, definitely not up to GDPR standards across the board. How do other, more established businesses compete with that? Should they be able to, or is that a regulatory question we need to address?
CAT: Yeah, I think it is a regulatory question. There is an element of: they were here first, and they just snuck under the line. There was obviously some scrupulous and some not-so-scrupulous behavior from the big players. I mean, Google probably knows more about all of us than anything on the planet. Google knows what I search, what I write; Google looks through my Gmail. Google knows everything. I don't think you can compete with that level of data, and I don't think you can compete with those companies. But we see things like Illinois, which has a biometric law restricting how you can use biometrics, and because Facebook was tagging individuals in photos without their permission, they are now being sued under that legislation. So I feel like as this stuff continues to roll out, yes, those are multi-billion-dollar companies, but if they're being sued for hundreds of millions of dollars again and again and again, they are going to have to rectify their practices. We can't change the amount of data they've already collected, but going forward, it's not going to be a slap on the wrist. I think there's going to be a lot more penalty for these big players, and then it will give the smaller players a chance to get in there.
KRISTINA: When do you think we're going to reach, not necessarily the nirvana state, but when do you think we're going to start to feel more comfortable as consumers that companies are being more transparent, or that we have more control?
CAT: So a lot of the US-based new regulations are due in January of 2023, which is when I suspect Canada will also be updated. And then, I mean, Brazil has the fourth-largest internet user base in the world, and they released the LGPD, which people are still trying to catch up with. So I feel like if all these legislations are out by 2023, then by 2025, everything will be fixed.
KRISTINA: I'm just writing that down. I'm going to call you in 2025 if that's not the case!
CAT: Call me in four years and say, "you said!" But we see rapid change. I mean, look, GDPR was released in 2018, and coming up on May 2021, three years later, there has been massive change. Massive. And Europe is also working on an AI regulation, which again is huge and overdue. In three years, we have seen a massive change in the way companies collect, handle, process, and retain data. Imagine what another three years could do.
KRISTINA: One of the things I hear from clients I work with: they're very frustrated. They say, oh, in 2018 we ended up having GDPR; we're still not clear on what GDPR expects us to do, and we already feel like GDPR is outdated. It's not getting updated quickly enough for what we're doing or where digital technology is going. All of these other legislative frameworks are going to be the same. Is there a nirvana state where organizations just turn around and say, look, I'm going to do the right thing because it's the right thing, and then I don't have to worry about all of these individual regulations? Because that's what I believe. What if you just go and do the right thing and adopt this high standard, as you mentioned? It does take the onus off of you in some ways; you don't have to react to every single new piece of legislation that comes out the door. Is that true, though?
CAT: I am a hundred percent with you! We should be protecting privacy because it's the right thing to do. I've had companies go, well, what's the fine I'm going to get for this? In the cybersecurity world, the way it's often looked at is in risk and cost, very much black and white. If your company has, let's say, a denial-of-service attack and you go down for two days, it's going to cost you a million dollars, and the preventative measure to avoid that denial of service costs a hundred thousand dollars. People look at that, ask what the likelihood is, and say, yeah, I'd rather pay 10% of that cost. You can't put a number like that on privacy, because it is damaging to individuals. If you leak a list of people from an oncology clinic, what is the damage? What number are you putting on that? And so that is a fundamental problem with privacy, where you say to a company, you should really protect these people's privacy better. The Ashley Madison leak ruined lives, right? And I'm trying to come at this from a privacy perspective and not a judgmental perspective; we're not going to say, hey, these people signed up for this. They signed up for a service that they thought would protect their privacy, that privacy was leaked, and there was no dollar value to put on that. Everything has to be a cost-benefit in a company, and like you're saying, people should be doing it because it's the right thing to do. We're having difficulty articulating up to the C-suite why privacy should be baked in. So if these fines continue to happen, to me that is the leverage: to come back to companies and say, you are now subject to 17 different regulations, and the fines could be upwards of $5 million, so just build your privacy in and you will avoid all of these fines. If that's the selling point, so be it, but I'm totally with you. People should be doing it because it's the right thing to do.
And we're protecting the people that we are supposed to be serving.
KRISTINA: Do you see cultural change happening within organizations, and people grasping that concept? Or are we still at a point where boards of directors and executives need to be educated on these facts?
CAT: I think, for the most part, we still have to educate people. There are definitely two kinds of clients I see. There's the "I have to" camp: I'm apparently supposed to do a privacy impact assessment or a data protection impact assessment; how much does that cost, and how do I get it done quickly? What's the minimum I can do? And then there's the "I want to do privacy right because it's the right thing to do" camp: what's it going to take to get there? I think we're hopefully moving towards what you alluded to earlier: customers want trust, and we are going to get to a point where consumers are aware of it. I want to bank here and not there because they have better security, because they protect my information better. I'm not going to download this app because they don't protect my data, data that belongs to me and that I should have control over.
KRISTINA: You just made me think back to an organization I recently encountered. They're doing something very interesting. They're training their employees, including their executives; it goes all the way up to the C-level. Once they train them on privacy and cybersecurity, they run tests, spear-phishing, for example, and other types of tests. The first time an employee fails this sort of fake outside-penetration test, they're asked to retake the training. The second time they take the bait and fail, they're fired. What's interesting is it's a financial organization with a very, very high profile here in the US, and they have had no incidents. I'm not sure if that's a direct result of the training or not; we can assume maybe they just haven't been targeted enough. But is this the right approach, do you think, for people to take? Is this extreme?
CAT: No, it's interesting. I don't know about firing people, but yeah, for sure, the weakest link is the human link. We've found that over and over again. You definitely need administrative safeguards, making sure there are policies and processes around things, because a lot of the recent large breaches we have seen happened because too many people had access to sensitive data. So cybersecurity awareness training, like you're talking about, is key, but we also have to make sure we are using processes and policies that limit access to the data to only the people who need it, and ensuring that that access is authenticated. The Equifax breach happened because they were using "admin" as the username and "admin" as the password; so many people had access to that computer that the argument was it was inconvenient to continually change the password. But that's the mindset where I feel we need to put focus and say: there is a trade-off between convenience and security. We are acknowledging that trade-off. We are acknowledging it's a pain, but that pain is something we have to accept in order to protect the information. We live in a world of convenience, and I think people are so used to taking shortcuts that they don't want to be inconvenienced by something else.
KRISTINA: So what role should employees play in preventing a data breach or protecting private information? Who, at the end of the day, is accountable?
CAT: That's a good question. I feel like everyone should feel ownership over it. Every employee should have accountability and ownership, because then they feel like they have a role. It's not just someone telling them there's something they have to do; no, they are actually responsible for it. But in addition to that, you have to create a culture where, if an employee comes to you and tells you that they have clicked on something by accident, or lost a USB key, they are rewarded for the information and not penalized for it, so that they are rewarded for contributing to the culture of safety and not penalized for making a mistake, because humans are going to make mistakes. That is how you stop things quickly. I've heard a number of stories involving breaches where someone clicked on something, and the minute they realized it had been some kind of phishing campaign, they alerted IT, IT was able to shut it down quickly, and it prevented a lot of damage. So empower your employees with that accountability, but also with the message: hey, you can be a part of the solution, and you will not be punished for letting me know this has happened. That, I think, is really key. So it's interesting, like you're saying, that these people are fired if they fail the test. That's one method. I prefer to go the other way and say, let's all work together on this.
KRISTINA: So how do you flip the switch inside an organization to that? Because I'm thinking about the number of times an organization asks me to complete security training, and the training is next, next, next, click, answer three questions, next, next, next, click. I have to tell you, at this point I don't even bother to listen. I probably shouldn't say that publicly, but I don't bother to listen because I can already answer all of the questions. I'm thinking to myself, that's probably not the right approach. It should be more engaging training, more engaging information, where we learn something, because we can always learn. Even if you're in this field, you can always learn something new. But it's this click, click, click, next kind of training that we get instead. How do you flip that switch so that people really start understanding that it needs to be more engaging, more real-life, and more relevant to the type of work that we're doing every day?
CAT: Computer-based training is great; there's nothing wrong with that. But additionally, you need live speakers, and now, with everything online, you can do it and tape it and have people rewatch it, but you need live speakers. What I have found in my own experience, because I do a lot of cyber awareness and cyber hygiene training, is that I don't make it about the company. I make it about the individuals, because at the end of the day, you care more about yourself than you do about your company. So I always make it about: here's how to protect yourself, your passwords, your products, your accounts, because in doing that, you are protecting the company. Teaching someone how to avoid social engineering is protecting the company. An easy example: most people's laptop names are their first name and last name. Mine was "Cat Coode's laptop." If I were sitting in an airport lounge on Wi-Fi, and somebody did an easy search on the public Wi-Fi to see who was connected and saw "Cat Coode's laptop," it would take about five minutes to find me online, because I'm the only one in Canada, and one of two, I think, in North America. It would be pretty easy to identify me, pull information about me, know what flight I was getting on, and then call back to my company and say, oh, hey, Cat just told me to send this email, and I know she's on a flight to France right now, but if you could just press this button quickly, she'll take care of it when she lands, that would be great. That's way too much insider information for someone who doesn't know me to have. That's the kind of social engineering that could happen quite quickly, from somebody just pulling things off there. So by telling individuals, hey, you should change the name on your laptop so it's not your first name and last name, that protects them, and then again, it protects the company.
KRISTINA: That's a great example of how we protect ourselves. I know you've done so much writing and public speaking on this topic. I looked at your website, BinaryTattoo.com, and found, for example, cybersecurity when working from home, which is something we've all been doing in this pandemic. Interestingly, many organizations hadn't been training employees on working from home. We've had telework for a while; this isn't new. Maybe the number of people working from home is new, but teleworking has been around for decades now. What do you say to organizations like that, or even individuals? How can they get their program to a more mature level, where they are thinking 360 degrees about their employees and the organization's benefit, like you said, training everybody up? And then what should they be thinking about if they've already adopted this advice of being safe working from home?
CAT: So working from home is really tricky because of endpoint management. When we say endpoint management, that is any access into your company: any laptop or device, anything that can reach into your systems. And it's been very difficult. There are big companies that have deployed, let's say, Microsoft 365, which is a great example; it has all sorts of tools and tricks you can push out to individuals' phones. So if you want to access corporate email from your phone, it will analyze your phone and make sure you're using it safely, and it can check log-ins from your computer. But for smaller companies, it's a real challenge to make sure that someone isn't using their personal laptop at home, where maybe their kids are using it for homeschool or for gaming, or maybe there's a roommate who can look over their shoulder. Once we've come out of the office, we often forget a lot of things. I would never take a photo in an office meeting and share it, and yet as soon as we went into pandemic mode, everyone was doing screencaps of their Zoom calls and then sharing them on LinkedIn. And I'm like, well, that's a social engineering feast, because now I have the images and the names of everyone who works for your company shared publicly. So have a really good InfoSec policy, an information security policy, that covers both and essentially tells you: what are your password requirements? What are your acceptable-use requirements? How are people allowed to use the devices you've provided for them, and how are they allowed to use their own devices to access your stuff? And then there are other little nuances, like smart devices. If you have Alexa or Siri, those devices are listening at all times. So if you are in a job where you have a highly confidential meeting, you should not be doing that meeting anywhere in front of a smart device.
Even your phone, honestly. I tell companies not even to bring phones into meeting rooms when they're having confidential meetings, because most people have microphones turned on. So there are little nuances at home that you probably wouldn't have in a work office. But definitely create a policy, and you can get easy templates for this, that ensures people are aware of what the privacy practices are both in the office and working from home.
KRISTINA: That's a challenge. I also think for a lot of organizations, at work they can dictate to people: don't bring your phone into a meeting. But what happens when everything is taking place in your home? Your employer is a little bit of a guest in your home, in a way, because you are working in your own environment. What are the accountabilities there, and how do people actually get that policy right, in your opinion?
CAT: That is really hard, because you're right, it's very difficult to stipulate. Look, if you lived in a one-bedroom apartment and you had a smart TV, where are you supposed to go? Sometimes it's just a matter of awareness. There is a really massive issue happening right now with virtual health, and I think it's here to stay, because why drive to a doctor's office sick if you can take some kind of telehealth call over the internet? That is infinitely easier for sick people, and it'll probably reduce the spread of illness. So I'm pretty sure that's here to stay. But the tool itself could be the safest tool, and if you're using it from a space that's not private, that's still a problem. If you are a doctor who lives with other people, and you're making that call in a space where other people can hear your conversation, that is a breach of confidentiality. It's so subjective, and unfortunately there isn't a one-size-fits-all InfoSec policy. You really have to look at what you do, from a health perspective and a sensitive-data perspective. There's actually a lot of really great guidance happening right now; if you do a search for telehealth or virtual health guidance, there are a lot of recommendations. I know I put together a quick infographic on what people should consider when they're doing health from home. There's a lot of good guidance and tips out there; it's just a matter of getting it out in a way that your staff or your employees understand the importance of it.
KRISTINA: It seems like there's a lot of information floating out there, but often I feel like there are not enough people listening. I had two examples this last week just in my personal life. One was we sold a house, and the title company sent me all of this documentation. They wanted the routing number and bank account number so they could wire the money. They wanted the social security number so they could withhold some taxes and do the reporting. And what was fascinating to me is they emailed the documents to me, so I didn't have to go into an office, and said, "Just print this out, sign it, have it notarized, and send it back to us." And I thought, wow, okay, that's great. So I did what they asked. I had everything notarized, scanned it to PDF, and encrypted it. Then I asked them to let me know how they would like to get the password so they could decrypt the PDF. And what I realized is that scenario was very unusual for them. They're not used to it; they just weren't thinking about it. I'm thinking, wow, here you are collecting all of this PII, and you're asking me to email it back and forth. And I see that a lot with doctors as well. I saw my doctor this week. They wanted me to do a release form, which required my name, birthdate, current date, and address, and they just said, "Just email us that information back." And these aren't small local doctor's offices; these are really big conglomerates. Even though we have a lot of this information and awareness floating around, it still seems like we can't get it instilled when it comes to that last mile. What do you think is the challenge here? Why can't we get that last mile?
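The encrypt-then-share-the-password-out-of-band pattern Kristina describes can be sketched at the command line. This is a minimal illustration, assuming the standard OpenSSL command-line tool is available; the filenames are placeholders, and a dedicated secure file transfer service is still the better option when one exists.

```shell
# Stand-in file for the notarized document; substitute your real PDF
printf 'sample signed document' > signed.pdf

# Generate a strong random passphrase; share it over a separate
# channel (a phone call or text), never in the same email as the file
PASS="$(openssl rand -base64 18)"

# Encrypt the file with AES-256 before attaching it to an email
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in signed.pdf -out signed.pdf.enc -pass "pass:$PASS"

# The recipient decrypts using the passphrase received out of band
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in signed.pdf.enc -out decrypted.pdf -pass "pass:$PASS"
```

The point is that the file and its password travel over two different channels, so intercepting the email alone yields nothing readable.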
CAT: I think that's kind of the million-dollar question. Transferring personal information is a massive problem, and again, the pandemic has highlighted it. People have always done this. Accountants typically use your social security or social insurance number as the password to unlock these files, and I'm like, that's not secure, because that number is unfortunately not a secret anymore; every employer you've ever worked for has it. People don't appreciate the risk of transferring data via email, and educating the world on secure file transfer is a feat. I love Google. I use Google. But Google is not a place for sensitive information; they even have a disclaimer that says, do not store your health information on our servers. More awareness around that would be good. I don't know how to tell every small company sending sensitive data not to send it via email, except that people like you and I will keep saying, "I'm not sending you this via email; I will send it to you this way." That will make them think, and then they'll change their practices. And then someone else will use them and say, "Oh, I didn't know that was a thing," and at the next practice they use, they'll ask, "Hey, by the way, should I be sending this securely?" You would have thought this pandemic would have seen the rise of the secure file transfer companies and all those apps. It hasn't happened yet, but it will. People now understand VPNs; a few years ago no one knew what they were. Password keepers used to be only for privacy professionals; now other people are using them too. So I think we'll get there. It just isn't happening fast enough.
KRISTINA: Fair enough. Thinking about the things you wish would happen a little faster: what are your big tips for folks out there listening? The "do these three things to keep yourself safe and protect your privacy." What are they?
CAT: Okay, number one, again, is to review your privacy settings. Always, always, always review every privacy setting you have. Number two is to reduce your digital footprint. When you are sharing, minimize the amount of information you share to only what is required. Anytime information is optional, like gender or birthday, do not put it in there. Use only the information required to run the applications you need. And the third thing is to really understand that your privacy is your responsibility. We cannot count on applications and services, or even tools provided by our own businesses, to protect us. So have at least a basic understanding: stop and ask, what am I downloading? Why am I downloading this? What am I giving it? What can it connect to? But again, there's this convenience-over-privacy trade. I get the smart TV; I understand why people like them. I will not put one in my house because I will not have it listening to me 24/7. So ask yourself: you're trading your privacy for that convenience. Is that something you really want to do? Is it really worth it to sit on the couch and yell at your television? If it is, good for you; go ahead and do it. For me, I'm still going to pick up the remote. So, yeah: understand what works for you and own that privacy yourself.
KRISTINA: What about from the corporate perspective? What are the three things you want organizations to be doing differently today going forward?
KRISTINA: I love all of these tips. This is so great, and I feel like we could continue talking for hours and still not work through all of the wealth of knowledge and insights you have to share, Cat. If listeners want to read more, you have phenomenal resources. One of the things I love about your website is that you classify different types of apps by how safe or unsafe they are for individuals to use. I love that you've done a global data privacy regulation compliance overview; it speaks to my heart. I did something similar where I took the 99 articles of GDPR and mapped them to LGPD, to CCPA, to the emerging regulations, and I see that you've done the same. So lots and lots of information that you spent hours pulling together for us, making our lives easier. If listeners want to read more, or if they want to reach out to you directly with questions, how do they find you?
CAT: As you said, my company website is BinaryTattoo.com. You can also just find me at catcoode.com, and that's Coode, C-O-O-D-E, fancy. Or you can certainly look me up on LinkedIn.
KRISTINA: Thanks, Cat, for taking the time to catch up today. I hope you'll come back and continue this conversation, because not only is it highly relevant, but you make the concepts very easy to understand.
CAT: Oh, thank you so much. It's such a privilege to talk to you today.