S6 #1 Trust, Transparency, and the Future of Digital Privacy

Guest:
Cindy Warner

Cindy Warner began her career as a deep technologist and evolved into a Global 1000 technology strategist. She has spent four decades in public companies, including FedEx, Ernst & Young, PwC, IBM, and Salesforce. For her third career, after witnessing the effects of GDPR on corporations in Europe in 2016, she designed a data privacy workflow platform called 360ofme. The platform was designed to bridge consumers' need for transparency and control over who uses their private and sensitive data with enterprises' need to verify identity, gain consent to use that data, and minimize the consumer data they use. It was first released in 2017, then re-platformed and re-released in 2024. Today Cindy serves as CEO and Founder of 360ofme as she and the company move toward market adoption and client onboarding.

Cindy's other passion, beyond her love of technology and its transformative potential, is getting more women into the field of technology. She has served as President of the Michigan Council of Women in Technology and is currently the Executive Chair for Taste of Hope, the American Cancer Society's southeastern Michigan initiative. She balances her love of her career with the fulfillment she gets from giving back to those less fortunate.

In this episode of The Power of Digital Policy, Kristina hosts a thought-provoking conversation with Cindy Warner, a technology veteran and founder of 360ofme, a platform that empowers individuals with greater control over their personal data. Cindy reflects on her transition from deep technologist to strategic leader at companies like FedEx, EY, PwC, IBM, and Salesforce. She explores the pivotal role of trust in digital relationships, emphasizing that ethical data use is the foundation for responsible AI and innovative digital services. They discuss how consent, data minimization, and transparency are critical for maintaining customer loyalty and trust. The conversation also touches on the risks of passive data sharing, the importance of identity verification, and the potential impact of stronger privacy regulations in the U.S. Cindy concludes with advice for the next generation of technologists, encouraging them to find their voice, articulate their value, and lead with a clear vision for the future.

Keywords:
data privacy, trust, digital policy, ethical data use, identity verification, consent management, ai, data governance, digital transformation, customer loyalty, privacy regulations, data security, technology leadership
Season:
6
Episode number:
1
Duration:
32:07
Date Published:
May 12, 2025

[00:00:00] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.

[00:00:20] KRISTINA: Welcome back to the Power of Digital Policy, where we're exploring the intersection of technology, innovation, and governance with leaders who are shaping the future. Today's guest is a powerhouse in technology and purpose-driven innovation. Cindy Warner began her career as a deep technologist and has evolved into a strategic leader across iconic companies like FedEx, Ernst & Young, PwC, IBM, and Salesforce. What this means is you're getting somebody who has battle scars and can tell you all the good stuff as a result of them. In 2016, she turned her focus to a mission-driven venture, founding 360ofme, a platform designed to give individuals transparency and control over their data while helping enterprises uphold privacy, consent, and identity verification in the GDPR era. Cindy, I'm so honored to have you. Thanks, and welcome to the show.

[00:01:10] CINDY: Oh, thank you so much. I'm so honored to be here.

[00:01:13] KRISTINA: What is it in your journey with privacy and data, and what hit you in that moment in 2016, that made you say, you know what? There has to be a better way to do data privacy than what we've been doing so far. And what have you learned that we just need to keep remembering, not reinventing?

[00:01:30] CINDY: Yeah, great question. I would say it's the T word, as I affectionately call it, which is trust. And you know, we talk about trusted relationships, whether it's with our partner, our spouse, or whatever it is. But in business, I think we have to some extent overlooked or underestimated the power of trust in relationships.

And so, you know, on or around 2014-ish, as I was going through my journey of systems implementations, and they were notoriously CRM implementations, I said: if people trusted us, whoever "us" was, whatever enterprise I was working with, with their data, they would tell us what they want, they would tell us what they need. We wouldn't have this hunt-and-hide-and-go-seek, Where's Waldo experience of trying to find people and trying to sell them something, or trying to sell them something that they didn't want in the first doggone place. So, you know, it's really about a trusted relationship that can bridge the gap across all of the mess that's part of marketing and part of sales, of trying to find people and trying to give them what they need.

[00:02:35] KRISTINA: So what does the platform actually do, right? Because you said, look, nobody's doing this right now or not doing it well, obviously, and so you went out and said, I'm just gonna do it the right way. What is it that you're doing differently? What should we be paying attention to?

[00:02:51] CINDY: Yeah. What I would tell you is the platform has four components, and those four components comprise what we call a data exchange workflow. So it's really a workflow. And when I think of workflow, and I use my hands 'cause I'm Italian, I think of it kind of left to right: somebody kicks something off, it has steps, and it has a beginning and an end. And in some cases it has a virtuous cycle, which is why we call this 360ofme, because there is a virtuous cycle between the enterprise and the consumer.

So think of this: let's take a connected car. A consumer says, I want you to tell me where the closest McDonald's is; let's just take something simple. In order to do that, the service provider has to know your GPS location, they have to know where the closest McDonald's locations are, and then they have to map you. Okay? But you've requested a service, so that kicks off this interaction between the service provider and the consumer.

And so what we do is four things. One, the request kicks off the interaction. Two, we gain consent for use of the personal data that's necessary to fulfill that service. Three, we minimize; we minimize the data that needs to be used. When you think of a connected car, there's this huge amount of telemetry coming in, but for that service, the one I just intimated, all you need is the GPS: field number 22 out of a field of 70. You just need the GPS. So we minimize the data down to only the fields necessary for the service, and then, with that consent, we provide the data back to the enterprise to use for that service fulfillment.

And then we have transparency on both ends where the enterprise sees that they fulfilled the service and the consumer sees that they requested the service, gave consent for the data, and they can see that the enterprise is using their data for that service fulfillment.
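To make the workflow concrete, here is a minimal sketch of the steps Cindy describes: a consumer request kicks off the interaction, consent is captured, the telemetry is minimized to only the fields the service needs, and both sides get a transparent record. The names, types, and structure below are illustrative assumptions for this episode page, not 360ofme's actual API.

```python
# A hedged sketch of a consent + data-minimization exchange, assuming
# hypothetical names; not a description of any vendor's real interface.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    consumer_id: str
    service: str
    fields_used: list[str]
    granted_at: datetime


@dataclass
class DataExchange:
    audit_log: list[dict] = field(default_factory=list)

    def fulfill_request(self, consumer_id: str, service: str,
                        telemetry: dict, required_fields: list[str],
                        consent_given: bool) -> dict | None:
        # 1. The consumer's request kicks off the interaction.
        if not consent_given:
            return None  # no consent, no data leaves the consumer

        # 2. Record consent for exactly this service and these fields.
        consent = ConsentRecord(consumer_id, service, required_fields,
                                datetime.now(timezone.utc))

        # 3. Minimize: keep only the fields the service needs
        #    (e.g. GPS out of ~70 telemetry fields on a connected car).
        minimized = {k: v for k, v in telemetry.items() if k in required_fields}

        # 4. Transparency: both enterprise and consumer can see what happened.
        self.audit_log.append({"consent": consent, "data_shared": list(minimized)})
        return minimized


# Example: a "find the closest McDonald's" request only needs the GPS field.
exchange = DataExchange()
car_telemetry = {"gps": (42.33, -83.05), "speed": 61, "cabin_temp": 21}
shared = exchange.fulfill_request("driver-123", "nearest-restaurant",
                                  car_telemetry, ["gps"], consent_given=True)
print(shared)  # {'gps': (42.33, -83.05)}
```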

[00:04:56] KRISTINA: So that sounds really simple if you say it fast. Why are we struggling with this?

[00:05:03] CINDY: Yeah, it's not that simple. There's one thing that I somewhat left out, which is validation, and today it's actually becoming almost the most prolific portion of the workflow: validating that the person giving consent for the data is who they say they are. With deepfakes and all the wonder we have today of nefarious actors, validating that the person giving consent for the GPS actually owns that car is really becoming one of the most important parts of our workflow: a validated person giving the consent.

[00:05:43] KRISTINA: I had this experience last year. I was actually in Florida and rented a Tesla from Hertz; you know, at the time they had them. And it was interesting because my teenager pointed out to me that there were videos being recorded, and I don't remember actually signing or giving permission. And he said, well, you know, even if you didn't tell them my name, I had my high school shirt on. So I wonder if they could actually figure out his rough age, his high school, the fact that it was a soccer shirt, and triangulate his name off of this data and piece of...

[00:06:18] CINDY: That is a piece of cake.

[00:06:21] KRISTINA: Right. And so I'm wondering, to what extent are we passively giving up all of this data, and is there a point at which we need to also educate the consumer? And if so, how do we even onboard the consumer? Because I think a lot of times people don't know, or maybe they're not thinking, like, hey, somebody just recorded a video with my shirt on.

[00:06:40] CINDY: Yeah. In fact, just this morning I was going back and forth with a group of people on this whole new Real ID issue, and it is an issue, because if we're using biometrics on your face and then applying all kinds of analysis to that face, the things that can come out of that by and large can be very nefarious and very damaging. As an example, we pontificate about somebody's age, and so we discriminate against them because of their age. We look at their face and we say, oh, we think they're of this nationality, and because we think they're of this nationality, they must be a criminal, or they must be a higher risk as they go through TSA. We think that person probably has a higher risk profile for no apparent reason other than we just saw their face, right? Your question about the high school t-shirt, or even just a face, can lead to horrible outcomes, unintended consequences, when you put it through some AI engine and it says A, B, C, D, E, F, and G. Oh, maybe with her eyes, she's not trustworthy; look at those eyes, those eyes tell me she's not trustworthy. The shape of her mouth tells me she's not intelligent, or whatever it happens to be. So there are so many AI engines that can use that data and apply logic and reasoning that may not be for the best, and may not be accurate. And so it is a bit damaging, for sure.

[00:08:06] KRISTINA: Are organizations aware of that, you think?

[00:08:08] CINDY: So I think the first step on the way out of this is to verify identity. And for us as consumers, it's going to get frustrating before it gets better, because we have so many bad actors making out that they are you. When an enterprise wants to interact with you, they have to know it's you first, right? But even more so, you would really want that. You would really want several layers, whether it's two-factor authentication or whether it's five sources of authentication. We're going to end up having a validation step in order to do business and interact with companies that's pretty egregious. I mean, it's going to make us a little bit frustrated, I'm sure. So that's step number one. Step number two is enterprises have to believe in the value and the differentiation of a trusted relationship. They have to get to the point that they believe there's business value in a trusted relationship, and then they have to do everything in their power to make sure they don't breach that trust.
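A hedged sketch of the layered verification idea Cindy describes: a consent grant is only accepted once enough independent identity factors have passed, whether that is two or five. The factor names and the threshold below are hypothetical, not a description of any particular product.

```python
# Illustrative only: require multiple independent verification factors
# before treating a consent grant as coming from the real data subject.
VERIFICATION_FACTORS = {"password", "otp_code", "device_fingerprint",
                        "government_id", "liveness_check"}


def is_verified(passed_factors: set[str], minimum: int = 2) -> bool:
    """Accept identity only if enough recognized factors passed."""
    return len(passed_factors & VERIFICATION_FACTORS) >= minimum


def accept_consent(consumer_id: str, passed_factors: set[str]) -> bool:
    if not is_verified(passed_factors):
        # Deepfakes and impersonation make unverified consent worthless.
        print(f"Rejecting consent from {consumer_id}: identity not verified")
        return False
    print(f"Consent from {consumer_id} accepted")
    return True


accept_consent("driver-123", {"password"})              # rejected, one factor
accept_consent("driver-123", {"password", "otp_code"})  # accepted, two factors
```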

[00:09:12] KRISTINA: So how does that actually work in an era where we hear oftentimes, well, you know what? AI is coming, we have to move much faster, we have to lean into innovation.

[00:09:20] CINDY: What I would say to those folks is a couple of things. First of all, with a neural network and a large language model, it is not the type of situation where, if you have bad data, and to me bad data is non-trusted and non-consented data, you can just get it out. It's not a flat file; we don't delete row 32. And so the disruption from that can be very pervasive. It can be very difficult for the organization, right? And it can lead even the user that got the outcomes to believe none of the data's good; if one piece of it's not good, none of it's good, right? So then you start to have a lack of confidence in the outcomes. When I think of AI, we always say responsible AI starts with ethical data. And in my mind, that's really the hallmark: ethical data use should be the foundation for responsible AI. And you should ask yourself, when you're building those models and engaging with AI, is this ethical data use? Would the data I'm putting into this model meet the standard of ethical data use? Would it be considered ethically used? If the answer is yes, great, off you go. If there's any doubt in your mind, you need to meet that standard.

[00:10:42] KRISTINA: I'm wondering how many people would then have to check their data.

[00:10:48] CINDY: Yeah. Well, that's the wild, wild west, right? That's the wild, wild west today, because it's about getting your hands on data to build those large models, to get confidence levels that you can then sell. 'Cause at the end of the day, that's really what the challenge is here. I remember when I was at IBM, when we released IBM Watson, and we thought this was going to be the panacea for healthcare, and our confidence level, the true confidence coming out of some of the therapeutics and some of the outcomes, was very low. When I say very low, in the sixties, percentage-wise, and when you're dealing with somebody's life and treatment options, 60 percent is not good. Now, some would say, well, that's better than a doctor that's been working for 18 hours and is fatigued and tells you this situation, right? But at the end of the day, you're trying to get to confidence levels that are high enough that you would believe them, and you would think then that you could trust that outcome. And so to me, if somebody doesn't trust the input, why would they trust the output?

[00:11:48] KRISTINA: We've talked about things like data provenance recently, and it just seems to me that's where the crux is: where did this thing come from, and do I know if I can trust it or not?

[00:11:56] CINDY: I would say, you know, we always say remediate and delete, and the reason we say that is this: if you remediate and you come to the conclusion that you cannot get consent from the data subject whose data it rightfully is to ever use that data, because you don't have an email address and you can't go back to them and say, hey, could we use your data for this purpose, yada, yada, yada, then delete the data. Because at the end of the day, you know you don't meet the criteria for ethical data use, right? You didn't get any consent to use it, and the purpose for which they provided that data has already passed. You provided the service and the ship's sailed. So if people need to remediate and delete, that's great. We do have clients, and we've seen a lot of clients, that can remediate, get consent, and turn that into ethical data use, and that's great. And if you ask somebody, and they can trust you, and it's for good purposes, for research and development, that type of thing, a lot of people will give consent, because they believe in ethical data use and they want to be helpful. If you look at 23andMe, those people paid for a service and then gave up their data for research and development and didn't get a dime. So they wanted to be helpful; in their hearts, they wanted to be helpful, right? They wanted to help research and development, but they also wanted to be consulted. They wanted to be consented. And they wanted it to be used ethically. They didn't even require being paid for it. But, you know, in my mind, if somebody's got a mass of data and it's non-consented data, and you can't get consent, I don't know why on God's earth you'd be storing it. I'd get rid of it, 'cause then you gotta protect it.
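As a rough illustration of the remediate-and-delete rule Cindy outlines: keep data that is already consented, remediate (go back and ask) where you can still reach the data subject, and delete what can never be consented. The record fields below are made up for the example.

```python
# Illustrative triage of the "remediate and delete" rule; field names are
# hypothetical, not any specific product's data model.
def triage_record(record: dict) -> str:
    if record.get("consented"):
        return "keep"         # already meets the ethical-use standard
    if record.get("contact_email"):
        return "remediate"    # go back to the data subject and ask for consent
    return "delete"           # no consent and no way to get it: don't store it


records = [
    {"id": 1, "consented": True},
    {"id": 2, "consented": False, "contact_email": "a@example.com"},
    {"id": 3, "consented": False, "contact_email": None},
]
for r in records:
    print(r["id"], triage_record(r))  # 1 keep, 2 remediate, 3 delete
```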

[00:13:33] KRISTINA: But it's interesting; a lot of CMOs that I talk with are probably cringing somewhere at this very moment, because they're like, oh my gosh, I don't want to delete that data.

[00:13:40] CINDY: There is the balancing act now, because the cost of a data breach, of somebody getting their hands on personal and sensitive data, and then having to provide all of the aftermath, the credit monitoring and all that stuff, is pretty high, notwithstanding the brand damage, the lack of trust, and on the story goes. So the question becomes: okay, I've got cybersecurity on this side, and I've got data that maybe I can use, or maybe I can't. If I hold it, now I've got to protect it, right? And so I'm spending a lot of money to hold onto something when I don't know the value I'm going to drive out of it, or I can't articulate the value I'm going to drive out of it. So do you do that? We always say, when on the balance sheet the data is a liability and it can't be turned into an asset, then you really do need to think about letting it go.

[00:14:29] KRISTINA: If we could understand the value of our data, we could then understand: is it a risk, is it a liability, or is it an asset, like you just said. Why is this so hard for people to figure out?

[00:14:39] CINDY: Well, interestingly enough, I had a client that we were working with, and on their balance sheet they had a $400 million line item for the risk associated with the data in their organization. And that $400 million was: if it gets breached, if it gets lost, if something bad and nefarious happens to that data set that we hold, there are going to be unintended consequences that we believe put us at risk to the tune of about $400 million of liability. And they had it on their balance sheet. So when you start looking, I mean, it's a real math problem. It's a real equation, right? It's a real asset-versus-liability type of thing. And when it gets to the point where it's such a large liability and you really don't have the offsetting value, I would say to a CMO: help me understand when, how, and for what value you'll use this data. Let's sit down and have real conversations about that. And if we can't articulate that, should we be keeping it and protecting it? So it's a real pragmatic discussion around the future of that data and whether you can ever turn it into an asset.

[00:15:48] KRISTINA: That makes sense. If only it could be that simple, though. I was hoping that we would make it simpler with frameworks like GDPR, CCPA, and others. Looking at those frameworks, do you think they go far enough?

[00:15:59] CINDY: No, I don't think they go far enough. We don't come at data privacy necessarily from the avenue of compliance, because there's the carrot and the stick, right? And compliance is the stick. We like to say to people: wouldn't you like your customers to trust you? Wouldn't that customer trust lead to greater loyalty? Wouldn't that greater loyalty lead to greater share of wallet, all that stuff, right? Sounds like CRM speak, but at the end of the day we try to say, if you can build trust, then you can build retention and loyalty and all that other stuff. And we do believe in that, and we have lots of studies that show that is the case: if somebody trusts you, they will stick with you longer and they will buy additional products from you. But I don't think the frameworks go far enough, because at the end of the day, and it's the situation in the United States right now, unfortunately, the penalties are not significant enough. They are a rounding error for a company, so small they barely have to be disclosed; they don't show up on the financials, and there are no separate disclosures required. Then there's a shoulder shrug, there's a, well, we'll wait till that happens, or an, oh well, right? And right now, my personal opinion is that in the United States the penalties are so tiny that you almost can take the risk: well, if the stuff gets breached, we'll pay the penalties and we'll go on with life. So I think it will take penalties that are sufficient. In Europe, they have really made it punitive, right? Look at the penalties for Meta; they're in the billions, and that's real money. That's real money, but we're not anywhere close to that in the United States.

[00:17:40] KRISTINA: Do you think we'll get there at some point? Because it seems like that's mostly driven, in the US at least, by consumer sentiment.

[00:17:46] CINDY: Yeah, I think we will get there, and I don't believe it'll be because of consumer sentiment. I believe it will be because of personal harm. And I hate to say that, but I think personal harm in the United States is one of the key drivers, if not the key driver, of data privacy, unfortunately. So, can somebody stalk you? Can somebody surveil you? Are our children doing horrible things like committing suicide because of interactions with nefarious actors, things of that nature? I think the greatest issue our lawmakers see today is personal harm: not just that your data got breached and your identity got breached, but that literally somebody stalked you and was able to find you, or a past, you know, love interest or whatever was able to do horrible things to you. They find that egregious.

[00:18:40] KRISTINA: That's interesting, because it makes me a little bit scared when I think about the troves of data that, for example, school districts have, using things like Google Chromebooks and G Suite. A lot of that data is being collected and has been for years, right? It's nothing new. And I keep getting told, well, it's anonymized, it's okay.

[00:18:58] CINDY: Well, one, it isn't. But second of all, there's a lady who's a lawyer, a data privacy czar, and schools are absolutely the rotten apple of her eye, because of the way they manage data. I mean, school systems are not in the business of IT, and they're not in the business of protecting children's data, and yet they distribute that data far and wide. They give it to the federal government, they give it to other organizations to look at grants. They distribute children's data very broadly, and a lot of times the data they have on those kids also gives insight into IQ, developmental issues, maybe mental health issues, et cetera, et cetera. And you think about the long tail for these children. When you were 14 years old and you had developmental issues or mental health issues or behavioral issues, and that is in a quote-unquote public forum, do you want your first employer to get their hands on that? Probably not. And so, I mean, there's just so much harm that can happen there: insurance rates going up because of mental health issues or developmental issues or behavioral issues in school, things that should not be disclosed, right? So she's on a complete tear about protecting children's data, and for good reason.

[00:20:28] KRISTINA: Do you know her name by any chance?

[00:20:29] CINDY: Yeah, Heidi Saas; ironically, her last name is Saas, S-A-A-S. And I always say to her, you did that because it makes you relevant in technology. And she's like, I did not, I swear that's my real name. So, I don't know.

[00:20:41] KRISTINA: I like it. She was born under a certain star.

[00:20:44] CINDY: She was. And she is a star.

She suffers no fools and she is fearless. So I love her. I love that.

[00:20:52] KRISTINA: We'll definitely look her up. Just thinking about it, you know, we were talking a lot about privacy, obviously, and there's lots of stuff going on in the AI space. Where do you see the most promising opportunities for privacy tech and data governance to evolve? We've been doing this for decades; where do we go?

[00:21:07] CINDY: I do believe the automakers, and I'll tell you why in a second, are inspired, intrigued, and duly embarrassed by the press and the media, such that they are now really looking at data privacy and how the car does or does not create trust. And I'll tell you why that's so important. Two things. Number one, monetizing the car. It's like the baggage fees with the airlines: monetizing the car typically comes from extra services in the head unit, right? So they can pump the head unit with all kinds of entertainment and convenient services, and they want you to use those services, but in order to use them, you have to give up personal and sensitive data. If you don't trust them, you're not going to, so there's truly a monetization model there that will fall off a cliff. However, here's the real, real problem with automotive if people don't trust the automakers: cars today are what is affectionately described as software-defined vehicles.

They are completely software-driven; it's like a Tesla. You mentioned the Tesla. I mean, that whole car is run on software, right? There's very little that's mechanical, and it's all run by software, and even what is mechanical still has sensors that are run by software. Now, say somebody disconnects a software-defined vehicle, something with 70 sensors on it, and says, I am not keeping this car connected 'cause I do not trust you.

How in the world, when there is found to be a braking issue, say inadvertent braking under certain conditions, which could cause a real problem, right, let's just use that for the sake of argument, and I need to update the software, but I've disconnected my car 'cause I don't trust you, how do I update that software?

[00:22:59] KRISTINA: Whether you even have the choice to disconnect it in the first place is what I would ask.

[00:23:02] CINDY: You do, you do! So number one, you do have the choice to disconnect it. And if all of these things need to be updated, what do you do? If software needs to be updated on 70 modules, you are then forced to go to the dealer. Think about that experience and how cranky you're going to get. That's what happens, right? Yeah. I'm not going to the dealer every two days to update software.

[00:23:23] KRISTINA: But do you think that we're going to get forced into this notion of accepting those updates no matter what? And I'm thinking just about the connectedness of everything.

[00:23:32] CINDY: And that is part of what we see as the future of data privacy laws in the United States: that opt-out is the standard condition. That is, opted-out is the default, and opting in requires an active choice by you. And it's a great question, because I've been on the bandwagon on this for, oh my goodness, probably a decade. It used to be, and I don't know if it still is, that when I bought a Fitbit, in order to instantiate that Fitbit and for it to work, I had to agree that all the data off my Fitbit was going to Fitbit. And I'm like, wait, I don't want you to know where I am, how much I weigh, how many steps I take, the whole nine yards. Same with your thermostat. In order to put in a Nest thermostat, opting in was mandatory for the thermostat to work, so the device was dependent on it: the use of the device that you bought, to work and do a service for you, was dependent on you giving up your data. Now, there are many states, including California, that have said: no, no, no, no, no. Nice idea, nice try. If somebody wants to give you their data, they can, but they don't have to in order to use the device that they paid you for. And that's the way I think it should be. I think it should be that way with a car: you don't have to start out with this thing connected, and by the way, we will tell you the implications of that. But if you would like it to be connected, here are all the ways that we will protect your privacy. So it's going to force them to earn your trust in order to have that data coming from you.

[00:25:08] KRISTINA: Do you fundamentally believe that companies can get to the point where we do trust them? Can they do the right thing and get to the point where we say, yeah, actually, I would love to do that? Are there folks already doing it the right way today?

[00:25:20] CINDY: I think the answer is yes, and I don't think that's going to happen just because they want to, or necessarily because they think it's the right thing to do. They're going to do it because their services working depends on it. It's dependent upon their car being connected and them pushing software updates into the car. So while I'm optimistic that there are companies that do it because it's the right thing to do, by and large I will say, with how technologically driven all products seem to be, in order for those products to stay connected, they're going to have to prove to you that they're maintaining your privacy. And so I think they're going to get forced into it, and we will be the beneficiaries, which is great.

[00:26:05] KRISTINA: That's an exciting future to look forward to.

[00:26:08] CINDY: I think so too. And we already see the trappings of that. We already see companies saying, I don't want to be on the wrong side of this. I don't want somebody to find out that they can't trust me after they bought my product, 'cause then I'm doomed.

[00:26:20] KRISTINA: Tell me one more thing that excites you about the future of privacy tech, since we're so optimistic at this moment.

[00:26:26] CINDY: I would tell you, I am very bullish. I even see in this administration, who seems to be fast and loose on nearly everything, that the data side of it seems to catch their attention, right? There are a lot of conversations about: what are you doing with that data? What are you doing with that data? What are you doing with that data? That's personal data. So there's a conscious awareness of what the heck is happening with that data, be it Social Security data, be it immigrant data, whatever it happens to be, and the rights and responsibilities around that data from the standpoint of its rightful owner and its data subjects. We've never heard that before. And I can tell you we never heard it in the first administration; what they would've said to you is, data? What's your problem? Now there's a very keen, conscious awareness, and that gives me reason to think that they do want to protect citizens. They do want to protect citizens' data, and they don't want nefarious actors. And look, if nothing else, they're scared to death about the Chinese getting their hands on all that data. So even if they're protecting us from what they feel to be the big, bad Chinese, okay, who cares, that's fine. And that's the car thing, by the way. They do not want the Chinese distributing cars in the United States, because they believe they're bringing cheap cars here to get their hands on data. And guess what? They're not wrong.

[00:27:44] KRISTINA: Cindy, you have been such a mover and shaker in this space, and we so value you. But you've also done so much, I think, for so many others, as I think about the push you've given a lot of women in the industry as well. What advice would you give to the next generation of women technologists and digital policy innovators listening to you right this moment?

[00:28:05] CINDY: I've given this a lot of thought over the last couple of months in particular, because the acronym DEI is somewhat under siege, if you may say, and I've spent my whole career, call it 35-plus years, ensuring that women had equal opportunity. Not that, and there are no women that I know, myself included, that want to be at the table because they're a woman, and I don't want to be at the table 'cause I'm a woman, to be very candid. I don't ever want to be hired 'cause I'm a woman. I don't want anybody to say, well, we need a woman for that, okay? Because I'm not that person, right? I want equal opportunity to be at the table because I'm the right woman or the right person at the table. I've got the brains to do it, I've got the brawn to do it, I've got the attitude, whatever it is. But what I would say is, let's say that DEI is no longer in vogue. Okay, so what do you say to women? To me, I kind of say three things. Number one, don't lose your voice, because when women lose their voice, we lose it for a long, long time, and we have a hard time finding that voice again. So, God knows, don't lose your voice. Don't get quiet. You start getting quiet in this environment and you're going to become the next church mouse.

So retain your voice. Second, identify your value, and really be clear on your value: what do you bring to the table? What do you bring to the party? If we've done nothing else, we need a tattoo on the inside of our arm that says, here are all the values that we bring to the party, and be unapologetic about those. And then thirdly, have a vision. I call it V3, really crafty, right? But instead of DEI, I call it V3: don't lose your voice, be able to clearly identify and articulate your value, and have a vision for whatever it is. Come in with a vision: here's where I see this going. We have too many people talking about problems today. We have too many people saying everything's broken. Oh, you know, it's the Eeyore syndrome: woe is me, everything's broken. What's the vision? Where do we go from here? And I think when people start talking in vision, here's where I see this going, here's where this could go, what do you think about taking this there, that turns the conversation around and it gets us optimistic again. It gives us hope, and it gives us joy and reason to believe that there is a future.

[00:30:21] KRISTINA: That's great advice. And I asked it in the context of women, but frankly, it applies to every person out there.

[00:30:25] CINDY: Indeed, indeed. It's really not gender-specific. And I know a lot of people, I'm sure you do as well, that are really struggling with finding joy on a day-over-day basis. But you know what? There's so much to be hopeful about. There really is a ton to have great vision about, because there's a lot to fix. And, you know, I love fixing things. I love building things and fixing things. So to me, I'm like, look at all the stuff that needs to be fixed. I'm very optimistic about all the opportunity to fix stuff. So, I don't know, maybe I just get up on the other side of the bed.

[00:30:56] KRISTINA: Thank you so much for sharing your story, your insights, and most of all your vision for a more transparent, human-centered digital future. And also just for reminding all of us to get out of bed every morning, put one foot in front of the other, and make it all work.

[00:31:08] CINDY: Thank you.

[00:31:09] KRISTINA: To everybody listening: if you're interested in learning more about Cindy's work and everything she's up to, I'll drop some links, including one to 360ofme, into the resources, so you can take a look there. But as always, thank you for joining us on The Power of Digital Policy. Don't forget to subscribe and share this episode if it sparked a new idea or inspires you to think differently about the role of policy and technology. Hopefully you have been inspired, and until next time, stay curious, stay balanced, and make sure that you're hitting that space between the risk and opportunity that the digital world provides.

[00:31:39] OUTRO: Thank you for joining The Power of Digital Policy. To get access to policy checklists, detailed information on policies, and other helpful resources, head over to the power of digital policy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.
