We Live to Build
    37:00 · 2023-01-24

    I Got Ads for Cancer Cures After My Wife Died

    It's the dark side of the internet most of us never see until it's too late. This is the story of how I Got Ads for Cancer Cures After My Wife Died. In this deeply personal interview, Dan Frechtling, CEO of the security firm Boltive, shares how his family's health crisis exposed him to the predatory underbelly of targeted advertising and how it drove his mission to protect data privacy.

    Data Privacy · Digital Security · Ethical Technology

    Guest

    Dan Frechtling

    CEO, Boltive

    Chapters

    00:00-Introduction
    03:33-The 250 Billion Daily Auctions Trading Your Personal Data
    07:07-How to Comply With New US Privacy Laws
    10:23-How We Scan the Internet for Malware and Data Leaks
    13:53-My Personal Story: Ads for Cancer Cures After My Wife Died
    17:15-How Your Search History Can Be Used to Prosecute You
    20:38-Data Privacy is Now a National Security Threat
    24:25-The Real Reason Governments Are Banning TikTok
    27:49-The History of Cookies (And How They Became a Monster)
    31:02-The 2018 "Great Collision" That Changed The Internet Forever
    34:28-The Future: When Neuralink Can Read Your Thoughts

    Full Transcript

    Sean Weisbrot: Dan Frechtling is the CEO of Boltive, which helps brands protect their end users from malicious, offensive, and surveillance media. It's kind of a mouthful, and hopefully you can explain it in a bit more depth. So why don't you tell everyone more about that, and a little more about yourself. Do you want to share about skiing?

    Dan Frechtling: Well, I actually discovered something you haven't done with all of your world travels and experiences. I was quite surprised: you haven't gone skiing. Yes, we got good snowfall up here in the Cascades, outside of Seattle, Washington, which is where I live.

    Dan Frechtling: I took my boys on New Year's Eve, and it was almost empty, which surprised me because we had snowfall the entire time. We had great skiing and short lift lines, so about all you can ask for. But my name is Dan Frechtling.

    Dan Frechtling: Thanks, Sean, very much for allowing me to be on your podcast. I'm CEO of Boltive, and I've had a bit of my own journey about how I ended up here, which we can talk about in a moment. But as for the areas Boltive covers: malware in advertising and other media is a problem.

    Dan Frechtling: Offensive and annoying ads and interruptions are part of that, but the big driver right now, especially this year, and especially as we record this, is a major movement in data privacy that's going on, not just in the US but around the world. So our software also protects against privacy violations, and through this protection brands can keep their consumers safe.

    Sean Weisbrot: So what kinds of privacy issues do end users and brands face?

    Dan Frechtling: There are really two key areas. One is around advertising, and the other is around what are called on-page digital objects. Let me explain why each of those is important. Advertising, especially what's called programmatic targeted advertising, really funds the open web we've grown used to. That's why content is free: 80 to 90% of news is paid for by ads, even though we see paywalls from time to time. Digital ads are over 70% of all ad spending in marketing, about half a trillion dollars globally. So that's one side of it.

    Dan Frechtling: Now, digital objects, which may be a little harder to conceive of, are things like tags on pages, pixels, and beacons; they enable commerce on the web. 95% of marketing websites use Google Analytics, which is a very well-known example of this, but there are other tags out there too. Facebook has pixels that have gained some notoriety, and there are many more.

    Dan Frechtling: These allow measurement of conversions: if somebody saw an ad and took an action later, are those two connected? Those are really the two key parts of it. But the problem is that both online advertising and on-page digital objects may sell and share data, which violates newly enacted consumer rights.

    Dan Frechtling: I say newly enacted in the US because, as we record this on January 2nd, two new laws took effect just yesterday. One is California's CPRA, and the other is Virginia's VCDPA. That's quite a use of acronyms, but they're basically two privacy laws that protect users from their data being sold or shared.

    Dan Frechtling: And it's a key issue because the scope and scale of the trading of user data is beyond what most people realize. There are 250 billion real-time auctions per day, compared to 15 billion stock trades per day. When those auctions happen, user data is being sold to the highest bidder.

    Dan Frechtling: We can talk about what kind of user data that is, but it's you, it's me. It's where we've been. It can be sensitive data like health conditions, political beliefs, ethnicity, immigration status, sexual orientation. All of that is actually quite valuable in the world of advertising. So I'll stop there for a second, just to paint the picture of online ads and digital objects as the two enablers of the web, but also two key culprits.
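    To make those auction mechanics concrete, here is a minimal sketch in Python of how a real-time bid might price a user profile. The bidder names, values, and profile fields are invented for illustration; this is not any exchange's actual logic, though the second-price convention it uses is common in programmatic advertising.

```python
# Illustrative sketch of a programmatic ad auction (second-price style).
# All bidders, profiles, and prices are hypothetical.

def run_auction(user_profile, bidders):
    """Each bidder values the impression based on profile signals;
    the winner pays the second-highest bid (a common RTB convention)."""
    bids = sorted(
        ((bidder["name"], bidder["value"](user_profile)) for bidder in bidders),
        key=lambda b: b[1],
        reverse=True,
    )
    winner, top = bids[0]
    price = bids[1][1] if len(bids) > 1 else top
    return winner, price

# Sensitive signals (e.g. recent health searches) raise the impression's value.
profile = {"recent_searches": ["lung cancer treatment"], "location": "Seattle"}

bidders = [
    {"name": "generic_retailer", "value": lambda p: 0.10},
    {"name": "health_advertiser",
     "value": lambda p: 2.50 if any("cancer" in s for s in p["recent_searches"])
              else 0.20},
]

winner, price = run_auction(profile, bidders)
print(winner, price)  # the health advertiser wins, paying the runner-up's bid
```

    The point of the sketch is the incentive it exposes: the more sensitive the inferred signal, the more an impression can be worth, which is exactly why health searches attract the kind of ads discussed later in this conversation.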

    Sean Weisbrot: I'd like you to go into these two new laws that were just passed. You mentioned their names, but what do they actually do, how are they different from each other, and are they basically just clones of GDPR or not?

    Dan Frechtling: GDPR started it all, so that is true, and many laws around the world have sought to emulate GDPR, including what's going on in the US. But California is really the leader in the US. We'll talk about California and Virginia, but there are actually three other states passing laws that take effect later this year as well.

    Dan Frechtling: But California and Virginia are the real trendsetters here, and California in particular. California passed the CCPA in 2018 through the legislative process, and it took effect in 2020. Then, because it wasn't strong enough, they passed an amendment called the CPRA through the referendum process, which meant it was initiated by voters or interest groups and voted in by the voters, not the legislature.

    Dan Frectling: and so the, the, the, the protections in California have just become greater and greater to protect their citizens. Virginia is a little bit more business friendly. It has some elements that are not quite as strong, as what's stated in California, for example, in California when, when we talk about the definition of a sale of user data, it's for anything of value, not just monetary value.

    Dan Frechtling: In Virginia, the definition is really around monetary value. So that's one area where California is stronger. On the other hand, Virginia is quite strong too, because it has this concept of sensitive data, and under Virginia law you must opt in to sharing sensitive data: you must affirmatively say, yes, I will share my data.

    Dan Frechtling: Whereas California is an opt-out regime, meaning if you don't say anything, the assumption is you're allowing your data to be shared, and you need to opt out. The opt-in for sensitive data in Virginia is much more like GDPR, and Virginia is one of three states that has this opt-in regime.

    Dan Frechtling: So as strong as California is, and as pervasive and trendsetting as it is, it's not quite as strong as Virginia in that regard. But both laws are taking effect, and later this year Connecticut's, Utah's, and Colorado's laws go into effect as well. Every single one of these laws has certain protections involving targeted advertising, since that has been the standard-bearer of consumer awareness and outrage around privacy violations.

    Sean Weisbrot: How can companies become compliant with these new laws? Would it be sufficient to just say, we're GDPR compliant and that's good enough, or are there differences in actual compliance for the US?

    Dan Frechtling: There is some merit to that. If you plan and orient around the most stringent laws in the world, say if you're a global business, then you're not wrong. It's much better, in my opinion, to do that than to try to geo-fence your users and do the Virginia thing for Virginia residents, the Colorado thing for Colorado residents, and the California thing for California residents,

    Dan Frectling: 'cause, I, IP identification is not infallible. So yes, if you're a global business, many already have. Managed the stringent elements of GDPR and then set that as their global standard. there, but there's other things I would say companies should do beyond that because it really starts with assigning a leader for your data privacy program.

    Dan Frectling: And it doesn't need to be as formalized as the data protection officers that are required under GDPR. you can, if you're just operating in the US for example, just, just assign someone, who's, whose point on that. The second thing though is really understanding where your data is going and mapping your data across the organization, because many businesses just don't know.

    Dan Frectling: They don't know where their own data is kept. They don't have all their databases in, in alignment, and they don't know what their vendors are doing with the data either. So knowing where your pockets of data are and inventorying that, and there's, there's good frameworks around like the nist, NIST, privacy framework.

    Dan Frectling: But this, this idea of knowing where your data is, means an inventory of, of attention to where the data's stored, especially if it's sensitive data and then the data flow. How is it transmitting? How is it, what's the full life cycle of when it starts and when it's used and when you purge it. There was a, not too long ago, pen only database where data was never thrown away.

    Dan Frectling: Well that's a real risk. Because businesses have gotten caught with data that's five or 10 years old that violates laws, and that's data that they weren't really using anyway, which is kind of a tragedy to get, and suffer fines for data that you weren't really using. But anyway, tho those are, I would say, assigning someone, to be a point person, mapping out where your data's going, and then there's more sophisticated things that Tive can do.

    Dan Frectling: To make sure that you are not, inadvertently, making mistakes in, in, in how your consumer data is, is then being, really is being leaked.

    Sean Weisbrot: So how can companies figure out how to map their data, then?

    Dan Frechtling: Pick one of these frameworks (NIST is a good one) and pay attention to what's called PII, personally identifiable information, or sensitive data. Again, sensitive data would be health, race, religion, politics, union membership, immigration status. And recognize that data sprawl is likely even if you're a small business, so track which databases you're using and what APIs, data flows, or data transfers to external partners you may have set up.

    Dan Frectling: And why this is so important is because there's the right. In California, for access and delete, meaning contact a business and require that business to tell that, to tell you all the data that they have on you, and then delete it if you request and, and you and correct it. Also, this is a very hard thing to do to access, delete and correct if you don't know where the data is.

    Dan Frectling: So that, I think is really, really key around data flow, where it's moving between systems. And this full lifecycle concept, what are the rules you wanna set for the people who own that data and operate that data? And what third parties are, do you want to, do you, do you wanna honor as well? So I think that's a good place to start. There are consultants out there that can help you get started with that. And there's software like bolts that helps automate the process. But that's where I would say the next steps would be.

    Sean Weisbrot: How are you able to discover the malware and these other issues that these brands are facing and their end users are dealing with?

    Dan Frechtling: We have two technologies: scanning and blocking. The scanning we do, which is protected by several patents, is to undertake user journeys. We visit websites and simulate real users, and we do this in partnership with those websites. So if we're working with a travel company, we might be business travelers.

    Dan Frectling: If we're working with an athletic company, we might be college athletes. If we're working with a consumer goods company, we might emulate young parents, and you can do this by emulating the patterns, the browsing history, and the purchase history of these different personas. And we scan, meaning we go through the customer journey.

    Dan Frectling: We visit websites. We click on things. We browse things, we consume ads, and we record everything that's going on to see if anything contains malware, contains Trojans, contains redirects, malicious browser extensions, if, if those are resident in the ads. And then that's, that's the scanning part.

    Dan Frectling: But then we supplement that with blocking, which is, everything I've described to you so far requires no integration. But the, the blocking portion is, is a line of code that then sits on websites, and then when one of these signatures comes up, one of these problems comes up, we can block it and replace that with a good ad and make sure that the user experience and the revenue, and the marketing reach aren't interrupted.

    Sean Weisbrot: What made you wanna get into this?

    Dan Frechtling: The way I got into this was actually accidental. I was my own victim, I guess, of predatory advertising. One day my wife had back pain, and while she was experiencing it, we went on a lot of health sites. We searched for different causes; we searched for physical therapy we could do.

    Dan Frectling: And then after no improvement after a while of this, we went in for an MRI and we got a very surprising diagnosis, which was possible cancer. I. More tests. We got the worst news, which was, what's called non-small cell lung cancer, which is a non-smokers form of lung cancer. And it was stage four, so it had spread throughout her body.

    Dan Frectling: So then I went into overload really, overdrive. And I was looking all over the web for information on this. What are the different treatments? What's the prognosis? What, what can you do? And I didn't realize that I was being profiled based on my browsing history and the sites that I was going to. So I started to get these ads for shady cancer treatments.

    Dan Frectling: I. And, some of them were very suspicious. but they were all personal. And this is what, where I discovered data leakage and surveillance advertising, that you can observe where you're going and you can be targeted based on that. And after a career as a marketer for most of my career, prior to that point, I realized how intrusive and predatory this marketing can be.

    Dan Frectling: And these ads are so durable that even after my wife passed away, I was still seeing the ads about cancer treatments. So that's a real problem. At the same time, I was finishing up, I was integrating a business that I had just sold to Verisk, my prior business, which was in cybersecurity also. And, I was ready to try something different and I came across tive.

    Dan Frectling: Through an introduction, I realized that they had the technology to stop this kind of intrusive act, media, and by taking it from the malware world, which I described a moment ago, and applying it to the privacy world. So there's this sort of kinship between security and privacy, and we're seeing that overlap growing.

    Dan Frectling: We were able to do something that no one's done before, which is actually create a smoke test for internet pipes. To see where the data leaks are, who exactly is causing them, which of your vendors, how to stop them, and then to correct, and then to verify that the corrections have been made. So from kind of a, a personal situation, I found myself taking a left turn career-wise and, joining boltive as the CEO, the pervasiveness of getting ads you don't want.

    Sean Weisbrot: Is crazy and it's possible that you were still getting ads because even after she died, you were the one who had been doing the searches. So maybe it was assuming you were the one who had the cancer and as long as you were searching, you were still alive. Therefore you might still have cancer and might still want these things.

    Dan Frechtling: I might have also gotten bucketed in as a medical care professional, because the amount of research I was doing was suggestive of someone who might be a doctor, and ads aimed at doctors are premium-priced, so there's an incentive to serve those ads to medical professionals. There are all kinds of reasons, all this machinery underpinning the advertising ecosystem, that led these ads to be, as I said, so durable and pervasive that I couldn't get rid of them.

    Sean Weisbrot: I've heard of examples like the one where a guy got something in the mail from Walmart, a baby crib or something pregnancy-related, addressed to his daughter. And the daughter wasn't even sure she was pregnant yet, but she had been doing some research about pregnancy, and the dad was like, what's going on?

    Sean Weisbrot: Well, it turned out she was pregnant, and obviously they were pissed, because Walmart was breaching her privacy, and I think she was a minor. But I've also encountered this myself: I tell someone on WhatsApp I want to go to Italy, and 20 minutes later I'm on Google getting served ads for flights to Italy.

    Dan Frechtling: It's uncanny how that happens. The story you're describing, I believe it was Target, but I understand the details of what happened: she hadn't told her parents. Her searches made Target's algorithm think she was pregnant, and then she started getting coupons and baby-related offers, and that outed her, in a sense. And that was just the beginning. There's a bit of a stir still going on now: after Roe v. Wade was overturned in the US, there was the Dobbs decision, which allowed states to basically set their own rules on abortion.

    Dan Frectling: But, one of the areas that makes the internet kind of our friend and our foe is that geolocation allows businesses to see the advertisers, really internet media to see if someone is near or inside, Planned Parenthood, or family planning centers or places where abortions are offered.

    Dan Frectling: And if you overlay that with the law enforcement element, which is that it is, it is legal for law enforcement to purchase private data that's collected this way through geolocation, through search history, and law enforcement can purchase that information and use it in prosecution without running afoul of the fourth Amendment on reasonable search and seizure.

    Dan Frectling: There's a real concern that the combination of knowing someone is searching for family planning help that they have been inside or near, an abortion clinic, could lead to prosecution, could lead to evidence that could get people arrested. so that's where the kind of elements of privacy and safety and, and law enforcement intersect. And it goes far beyond the target case where it's fairly innocuous kept within the family. I. Now you've got the potential for crimes to be, accused people to be accused of crimes based on internet information that otherwise wouldn't have been available through law enforcement surveillance practices.

    Sean Weisbrot: I mean, that's what Minority Report was warning against: precognitive crime, pre-committed crimes, and having people arrested for thinking about committing a crime or planning to commit a crime,

    Dan Frechtling: showing all the pre-planning, all the premeditation, but not committing the act.

    Sean Weisbrot: And I'm sure the things that you contend with are in that realm: keeping the government from becoming too heavy-handed with that kind of data.

    Dan Frechtling: That is true. And beyond that, we've seen in the past year that privacy has also become a national security issue. In mid-2021, a group of senators sent a letter to Google and the ad tech ecosystem saying that data shared in digital ads would be a gold mine for foreign intelligence. And a year later, Google was found to be sharing data with RuTarget, a sanctioned Russian company, a unit of Sberbank that was subject to sanctions after the invasion of Ukraine.

    Dan Frectling: And, that was exactly what the senders had warned against. Then in September of 2022, a few months ago, Biden signed an executive order, assessing foreign firm's use of data for surveillance, tracing, and tracking. So that was Google. Now a little bit. Closer to the conversation is TikTok. And TikTok was investigated mid last year because they misled lawmakers about Chinese employees' access to US data.

    Dan Frectling: Also, mid last year there was a $92 million class action against TikTok that a judge approved. And as more and more, evidence kind of came around this in December, just a few days ago, just last month. There is a ban for federal government employees on government device devices, who can no longer use TikTok that was following a ban in the house, 13 states college campuses.

    Dan Frectling: And why? Why is everybody going after TikTok? What's the problem? Are we just, are we being China phobic? Well, China's national intelligence law requires citizens to assist in state intelligence work, and every company has a chapter of the Chinese Communist Party. You can't really, as a private industry, get away from a request from the Chinese authorities.

    Dan Frectling: So TikTok has become quite a magnet, but it's not just TikTok. As I mentioned, it's Google. And then the most recent, I think. We talk about the dangers of unfriendly countries. A Russian software vendor called Push Wash was found to be installed in military and US government agencies because they had, they had disguised themselves.

    Dan Frectling: I think they, they had purported to have American employees based in Washington, DC and this SDK, which was installed to profile online activity of app users. Was, was actually run out of a company outta Siberia and similarly to the Chinese law. Russian authorities have compelled local companies to hand over user data to.

    Dan Frectling: So we're seeing this overlap and we're seeing the privacy issues become criminal, related as we talked about a moment ago with Dobbs decision and national security related with the actual involvement of technology companies with, with Google and TikTok and push, push being three. Commercial examples of that.

    Sean Weisbrot: Yeah, TikTok is a really bad one. It's been made clear that they have two algorithms, one for inside China and one for outside China. The one inside China promotes social harmony and positive things, because the Chinese government wants people to see positive imagery and things that encourage being good citizens. But what they promote outside of China are things that aren't positive, things that get people angry, the things they know work in the Facebook algorithm, things that don't create social harmony. They want to create stress, anxiety, panic, and fear in citizens around the world, as long as they're not Chinese citizens.

    Dan Frechtling: Right. The algorithm is good for eyeballs. It started with sound bites, then clickbait, and now scrolling. You can't get away; it's much harder to turn away from the TikTok app if you're engaged in a mood of anxiety.

    Sean Weisbrot: I deleted TikTok a while ago. I'm not a social media person. I've got an Instagram, but I post pictures of my dog once every few months when I'm with him. I've got a Twitter, but I just share podcast quotes. I don't really use social media, so I'm able to see what's happening without being exposed to it directly. Although, I guess there are still invasive ads and all that. But I don't suspect a social media company is going to come knocking on your door saying, hey, help us protect our users, because they probably don't care.

    Dan Frechtling: First of all, what you just described is best practice for privacy protection: don't overshare information. We don't think about this. Your dog is safe, but people will share pictures, photos, and activities of their children, and that creates a permanent record. Is harm going to come from that?

    Dan Frectling: Hard to really know, but that child's gonna grow up to be an adult and that permanent record will still be out there. So the oversharing problem, I think in social media, is encouragingand that does run counter to privacy. but that, so what, what you're describing there is a good practice. I, yeah. And I, I don't think social media networks really care about privacy.

    Dan Frectling: 'cause it, it, it, it also, cuts across their desire to monetize their users. Because their services are free after all. But there, there's a bunch of things. User, there's a bunch of things individuals can do to protect themselves and, not oversharing is one of them. managing your privacy settings.

    Dan Frectling: we are about to, as, as we record this in early January, we're, we're not too far away from the, national Data Privacy Week that runs January 24th to January 28th. And that's a good time to remember to rethink your digital footprint and to, to rethink your, your digital footsteps.

    Dan Frectling: There's a resource from the National Cybersecurity Center, which is, which created, data Privacy Week, and this is stay safe online.org. And we can maybe put this in the show notes, but if you go to stay safe. Stay safe online.org. There is, a bunch of links so that you can manage your privacy settings.

    Dan Frectling: You can go to major apps like Amazon, websites like Amazon and apps like Venmo, zoom, Spotify, and through this utility you can opt out and you can update your privacy settings to be safe. So that's one thing to consider. Other things like you, you deleted TikTok. Good move. Like a lot of us have apps that are sitting on our mobile devices that may be tracking us.

    Dan Frectling: iOS is a little bit better about that, but deletes unused apps. Don't keep 'em around. When you use a browser, be aware, you may be, you wanna use a privacy safe browser, like Brave or Ghostery, Dawn or Firefox, and like apps you might wanna delete unused browser extensions. When you do your searching, do you want to use Google where you're gonna be tracked and recorded?

    Dan Frectling: Or do you wanna use something like DuckDuckGo where again, you don't leave a history of what everything you're doing? So there's lots of things that we can do as we think about our privacy and, and it's really a continu right? We like, we love convenience and we love privacy, but we, they're, they're trade-offs to be made.

    Dan Frectling: And some of us that don't care about privacy and care more about convenience will do certain things and some that swing in the other direction on that gradient, will, will opt more for privacy.

    Sean Weisbrot: The problem I have with those people is the attitude of, oh, I've got nothing to worry about, I'm not doing anything wrong. And my counterargument is, it doesn't matter if you're doing nothing wrong. The fact remains: what you do should be private. No one should have the ability to see that. And that's that.

    Dan Frechtling: That's right. And am I doing anything wrong? No. People, for the large part, don't do things wrong, but there are things that are private about them that they may not realize, like their health conditions. With our family it was cancer, but there are all kinds of other health conditions that could be embarrassing, that you don't necessarily want dozens and dozens of ecosystem partners in ad tech to know, and that can be shared and accessed.

    Sean Weisbrot: So is there anything we haven't talked about that you'd like to add?

    Dan Frechtling: If we think about where marketing is going and where the world is going, I think there's a nice history lesson here, because privacy was a non-event until about 15 years ago. No one really talked about privacy, and the origin of tracking on the web came about because of changes in media: the rise of the internet and, more importantly, the rise of search engines.

    Dan Frectling: You got this starting point for web surfers where people weren't using the web. Like they used to use magazines like the dawn of the internet. You were going to MSN, you were going to an OL, you were going to Yahoo, and that's the destination. That's where all your content was. So if you're buying a car, you're gonna go to Yahoo, you're gonna go to MSN, you're gonna go OL, and you're gonna research.

    Dan Frectling: But then when search engines came around, the audience fragmented and you've got all these different places where you can visit sites and learn about cars and. That's where cookies came along, right? We didn't, we haven't spent a lot of time talking about cookies here, but the, the first unique cookie was, was invented in 94 by Netscape, by a young engineer who was simply trying to allow a browser to remember, private privately and anonymously.

    Dan Frectling: Remember a user that had signed in or had something in their shopping card or preferred a certain language and. Then the cookie became a monster because a year later, double click said, Hey, this this utility that was created by Netscape. You know what? We can use that to follow people around the internet because double click was in so many different places and seeing users in multiple sites, and that has led to all these targeting innovations over 20 years with what's called interest.

    Dan Frectling: interest based advertising that tracks users across sites, the multi-device era in 2007 with. with the iPhone that created cross app advertising. And then seven years later you had these device graphs, which Facebook is really, really good at, and you can connect someone between their mobile device, desktop and later connected tv.

    Dan Frectling: So that was the kind of snowballing effect of all the targeting and retargeting, using data collected in one place to serve an ad to somebody in another place. And at the same time when you had. data breach is growing at, at really high rates, which I think the data breach grew from 419 cases in the US.

    Dan Frectling: And, about 10 years ago to, almost 1900. the most recent year you had Edward Snowden, you had Equifax, you had Cambridge Analytica, you had this great collision in 2018 that actually led us to GDPR, and that collision was between hyper targeting and data protection. Well, that's where we find ourselves today.

    Dan Frectling: Even though we're less than five years later, we're kind of in the aftermath of how, of this collision and why, privacy is moving in the direction it is. So I think that's something to think about as we observe data privacy week, which is again, the 24th of the, the 28th of January. But also remember that data privacy isn't just a week.

    Dan Frectling: there's data being shared. Probably in ways we don't realize 51 other weeks of the year besides, data privacy week. So, to kind of create our own awareness of that and to make our own choices about how much we wanna share because, the world is changing and the laws are there to protect consumers from, from some of the dangers.

    Sean Weisbrot: One of the things I mentioned in a previous episode, I think it was 102, when I was talking with a cybersecurity expert from Portugal, and which I'll say again here, is that you may think your encrypted conversations today will forever remain encrypted, but quantum computing will destroy those encryption methods. That means anything you say, at any time, to anyone, on any application, could at any point in the future become public. So it doesn't matter if you're just looking for privacy in the browser; if you're not thinking about privacy when it comes to what you say on a device, you need to be doing that as well.

    Dan Frechtling: That's a great point. I didn't know that. If quantum computing can break encryption, then nothing is safe. I do remember that episode; I think it was with Dr. Eduardo Rocha. That was a really good one. I enjoyed listening.

    Sean Weisbrot: Data privacy is extremely complicated, and these conversations are very complicated. Basically, the best way to live your life is: if you don't want anyone to know what you're thinking or saying, then don't think it or say it in a way that other people can access.

    Dan Frechtling: Fortunately, our thoughts will still be our own, protected by our skulls, but anything else that leaves our bodies is now, as you suggest, subject to inspection,

    Sean Weisbrot: not after Neuralink comes along.

    Dan Frechtling: Oh, there's another episode topic: Neuralink.

    Sean Weisbrot: Once you get a Neuralink, your thoughts are no longer yours, and the next step after that is piping ads right into your brain.

    Dan Frechtling: All right, well, then my New Year's resolution is to not Neuralink with anybody, anywhere, knowingly.

    Sean Weisbrot: That's a good one. It's like Bluetooth for your brain.

    Dan Frechtling: Yeah. It sounds good. It sounds good until you think about it.
