460. AI, Internet Scams, and the Balance of Freedom | Chris Olson

Primary Topic

This episode explores the intricate world of internet scams and their impact on different demographic groups, along with the broader implications for individual freedom in the digital age.

Episode Summary

In this enlightening episode, Dr. Jordan B. Peterson discusses with Chris Olson, CEO of the Media Trust Company, the pervasive nature of internet scams and the potential solutions to safeguard users. They delve into the specific vulnerabilities of different demographic groups, such as seniors targeted by romance scams and tech support fraud, and younger individuals targeted by drug dealers or human traffickers via the internet. The conversation extends into the role of AI in facilitating these scams, raising ethical concerns about technology's reach and the balance between freedom and security in the hyper-connected digital world. They emphasize the need for a combined effort from governments, tech companies, and individuals to combat these growing threats effectively.

Main Takeaways

  1. Internet scams disproportionately target vulnerable groups, including the elderly and the young, exploiting their specific vulnerabilities.
  2. The role of AI and third-party code in perpetuating these scams is significant, making them difficult to trace and tackle.
  3. Effective solutions require collaboration across sectors, including tech companies, government, and law enforcement.
  4. There's a critical need for enhanced cybersecurity measures that go beyond protecting corporate assets to safeguard individual consumers.
  5. The discussion also touches on the ethical implications of AI in society, particularly concerning its use in scams and digital crime.

Episode Chapters

1: Introduction to Online Scams

Overview of the vast landscape of internet scams and their impact on society. Key topics include the exploitation of the elderly and the sick, and the use of AI in scams. Jordan Peterson: "In general, a substantial proportion of online interaction is criminal."

2: The Role of AI and Third-Party Code

Discussion of how AI and third-party code are used by scammers to target consumers, and the challenges in regulating this space. Chris Olson: "The Internet is made up of roughly 80% third-party code."

3: Strategies to Combat Digital Crime

Exploration of potential strategies to reduce digital crime, including the role of government and the need for proactive digital policing. Jordan Peterson: "It's like armed guards at a safe in a bank compared to police on the street."

Actionable Advice

  1. Verify the source of any online request for personal information or money.
  2. Regularly update software to protect against the latest scams.
  3. Educate vulnerable groups, like seniors and teenagers, about the risks of online interactions.
  4. Support and advocate for stronger digital crime laws and enforcement.
  5. Use reputable cybersecurity services to protect personal digital information.

About This Episode

Dr. Jordan Peterson sits down with cybercrime expert and CEO of The Media Trust, Chris Olson. They discuss the key targets for cybercrime, dating and other online scams, what legislative measures for internet safety might look like, and the necessary sacrifice major companies need to make for a better digital ecosystem.
Chris Olson is the CEO of The Media Trust, a company founded with the goal of transforming the internet experience by helping technology and digital media companies create a safer internet for people. Under his leadership, the company invented the world's first digital data compliance, Children's Online Privacy Protection Act (COPPA) compliance, and website/mobile-app malware scanning technologies. Through infrastructure in 120 countries, The Media Trust protects billions of people every month from digital attacks. Companies ranging from the Fortune 10 to hundreds of small and medium-sized tech and digital media firms leverage The Media Trust to protect their customers from digital harm and unwanted data collection.

People

Chris Olson, Jordan B. Peterson

Companies

Media Trust Company

Books

None

Guest Name(s):

Chris Olson

Content Warnings:

None

Transcript

Jordan Peterson
Hello everybody. Today I have the opportunity to speak with Chris Olson, who's CEO of the Media Trust Company. His company occupies the forefront of attempts to make the online world a safer place. He mostly works with corporations to do that, mostly to protect their digital assets. But I was interested in a more broad-ranging conversation discussing the dangers of online criminality.

In general, a substantial proportion of online interaction is criminal. That's particularly true if you include pornography within that purview, because porn itself constitutes about 20% to 25% of Internet traffic. But there's all sorts of criminal activity as well. And so Chris and I talked about, for example, the people who are most vulnerable to criminal activity. That includes elderly people, who are particularly susceptible to romance scams initiated on dating websites but then undertaken off those sites, and also to phishing scams on their devices indicating, for example, that something's gone wrong with the device and that it needs to be repaired, in a manner that also places them in the hands of criminals. The sick and infirm are often targeted with false medical offers.

17-year-old men are targeted with offers for illicit drug purchase, and juvenile girls of 13 or 14 who are interested in modeling careers, for example, are frequently targeted by human traffickers. This is a major problem. The vast majority of elderly people are targeted by criminals on a regular basis. They're very well identified demographically. The criminals know their ages, they know where they live, they know a lot about their online usage habits, and they have personal details of the sort that can be gleaned as a consequence of continual interaction with the online world.

And so I talked to Chris about all of that, and about how we might conceptualize this as a society when we're deciding to bring order to what is really the ultimate borderless Wild West community: the hyper-connected and possibly increasingly pathological online world. Join us for that. Well, hello, Mister Olson. Thank you for agreeing to do this. We met at the presidential prayer breakfast not so long ago, and we had an engaging conversation about the online world and its perils.

And I thought it would be extremely interesting for me and hopefully for everyone else to engage in a serious conversation about, well, the spread of general criminality and misbehavior online. And so do you want to maybe start by telling people what you do, and then we'll delve more deeply into the general problem? Great. Yes, and thank you, Jordan. Thanks for having me.

Chris Olson
I'm the CEO and founder of The Media Trust Company. Not intended to be an oxymoron. Our primary job is to help big tech and digital media companies not cause harm when they monetize audiences and when they target digital content. So let's delve into the domains of possible harm. So you're working with large companies.

Jordan Peterson
Can you give us, what sort of companies do you work with? And then maybe you could delineate for us the potential domains of harm? Yeah, so I work with companies that own digital assets that people visit. And I think maybe to set a quick premise: cybersecurity is a mature industry designed to monetize the CISO, the chief information security officer, generally protecting machines.

Chris Olson
So there's a mindset geared to making sure that the digital asset is not harming servers, the company, or government data. Our difference is that we're helping companies that are in digital. So think big media companies. We're helping them protect from harming consumers, which is the difference between digital crime, which is going to target people, and cybersecurity, which is generally targeting corporates and governments and machines. So now does your work involve protection of the companies themselves also against online criminal activity?

Jordan Peterson
Or is it mostly aimed at stopping the companies themselves from, what would you say, mostly, I suppose, inadvertently harming their consumers in pursuit of their enterprise and their monetization? Yeah. So that's a great question, and I think that's where the heart of the matter is. So our primary job is to watch the makeup of what targets digital citizens' devices.

Chris Olson
The Internet is made up of roughly 80% third-party code. And what that means is when a consumer is visiting a news website, when they're checking sports scores, when they're visiting social media, the predominance of activity that's running on their machine is coming from companies that are not the owner of the website or the mobile app that they're visiting. That third-party code is where this mystery begins. So who actually controls the impact on the consumer when they're visiting an asset that is mostly made up of source code and content coming from other companies? So our job is to look at that third-party content, discern what is good and bad based on company policies, based on what might be harming the consumer, and then inform those companies what is violating and how they can go about stopping that.
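[Editor's aside.] The "80% third-party code" figure is about where a page's executable content actually comes from. As a rough illustration of how one might measure that on a single page, the sketch below counts how many of a page's script tags load from a domain other than the page's own. The example page, the domain names, and the simple same-hostname rule are assumptions for illustration, not The Media Trust's methodology.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptCollector(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def third_party_share(page_url, html):
    """Fraction of external scripts served from a domain other than the page's own."""
    site = urlparse(page_url).hostname
    parser = ScriptCollector()
    parser.feed(html)
    if not parser.sources:
        return 0.0
    # Relative URLs (hostname None) are treated as first-party.
    third = [s for s in parser.sources
             if urlparse(s).hostname not in (None, site)]
    return len(third) / len(parser.sources)

# Hypothetical page: one first-party script, two third-party scripts.
html = """
<script src="https://news.example.com/app.js"></script>
<script src="https://ads.tracker.net/pixel.js"></script>
<script src="https://cdn.adexchange.io/bid.js"></script>
"""
share = third_party_share("https://news.example.com/story", html)
```

On this toy page the third-party share is two out of three scripts; a real audit would also have to follow scripts that dynamically load further scripts, which is where the supply chain Olson describes gets opaque.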

Jordan Peterson
What sort of third-party code concerns might they face or have they faced? What are the specifics that you're looking for? Maybe you could also provide us with some of the more egregious examples of the kinds of things that you're ferreting out, identifying, and attempting to stop. Yeah. So I think putting any digital company into the conversation is critical.

Chris Olson
So we're talking about tech support scams and romance scams targeting seniors. That is an epidemic. If you're a senior and you're on the Internet on a regular basis, you're being attacked, if not daily, certainly every week. That is now a cultural phenomenon. There are movies being produced about the phenomenon of seniors being targeted and attacked online.

It's teens, too. A 17-year-old male is being bombarded with information on how to buy opioids or other drugs and have them shipped to his house. If you're a 14-year-old female and you're interested in modeling, you're being approached by human traffickers. The sick and infirm are frantically searching the Internet for cures. While that's happening, they're having their life savings stolen.

So our job is to watch that third-party content and code, which is often advertising. It's basically a real estate play on what keeps the consumer active on the digital asset; we find that problem and then give it back to the company. I can jump in quickly on how we go about doing that. We become a synthetic persona. We've been doing this for not quite two decades, but getting on 19 years.

We have physical machines in more than 120 countries. We know how to look like a senior citizen, a teenager, someone with an illness. And then we're rendering digital assets as those personas, acting more or less as a honeypot to attract the problem that's coming through the digital supply chain, which runs on our devices. And I think that's going to be a key part of this conversation as we go. Most of that action is happening with us.
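[Editor's aside.] The persona-as-honeypot idea Olson describes can be caricatured in a few lines: represent each synthetic visitor as a profile, and record which content a targeting function chooses to serve it. Everything below is invented for illustration; the `Persona` class, the ad labels, and the toy targeting rules are hypothetical stand-ins, not The Media Trust's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A synthetic visitor profile used as bait for targeted content."""
    age: int
    interests: list
    served: list = field(default_factory=list)

    def render(self, ad_stream):
        # "Visit" a simulated page and record every ad the stream targets at us.
        for ad in ad_stream:
            if ad["targets"](self):
                self.served.append(ad["label"])
        return self.served

# Invented rules standing in for a real ad exchange's targeting logic.
ads = [
    {"label": "tech-support-scam", "targets": lambda p: p.age >= 70},
    {"label": "sneaker-promo",     "targets": lambda p: p.age < 30},
    {"label": "modeling-offer",    "targets": lambda p: "modeling" in p.interests},
]

grandmother = Persona(age=82, interests=["gardening"])
teen = Persona(age=14, interests=["modeling"])
grandmother.render(ads)   # records ["tech-support-scam"]
teen.render(ads)          # records ["sneaker-promo", "modeling-offer"]
```

The point of the sketch: the scam content only "fires" for the profile it was aimed at, which is why rendering pages as the vulnerable persona is how you see attacks the site owner never would.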

And so it's difficult for tech companies and media companies to understand fully what's happening to us. That's the point of their monetization, right, that moment in time. So our job is to detect these problems and then help them make that go away.

Jordan Peterson
You set yourself up as a replica of the potential target of the scams, and then you can deliver the information that you gather about how someone in that vulnerable position might be interacting with the company's services in question to keep the criminals at bay. Let's go through these different categories of vulnerability to crime that you described. I suspect there's stories of great interest there. So you started with scams directed at seniors. I've had people in my own family targeted by online scammers who were, in fact, quite successful at making off with a good proportion of their life savings in one case.

And I know that seniors in particular grew up in an environment of high trust, especially with regards to corporate entities. They're not particularly technologically savvy. They're trusting. And then you have the additional complication, of course, in the case of particularly elderly seniors, that their cognitive faculties aren't necessarily all that they once were. And they're often lonely and isolated, too.

And so that makes them very straightforward targets, especially for people who worm into their confidence. You talked about, was it romance scams on the senior side? It is romance scams on the senior side. Okay, so lay all that out, tell us some stories, and describe to everybody exactly what they would see and how this operates. Okay, so a senior is joining a dating website, just as a teenager or someone in middle age would do.

Chris Olson
They're looking for romance. There are people on the other side of that who are collecting data on that senior, potentially interacting with them. Once they get enough information on that particular senior, they're going to start to find them in other ways. Send me emails and information; let's move off the dating site.

They're going to start calling them on the phone. As that starts to evolve, it's that information collection, getting them to do certain things online, that sucks them deeper and deeper in. From that moment forward, they become very much wed and emotionally oriented towards the person that they're involved with. And the theft goes from there. Right, so you go on a dating website as, say, someone in your seventies. You're lonely and looking for companionship.

Jordan Peterson
There are scam artists on those dating websites as well, who must have, I suspect, keywords and profiling information that enable them to zero in on people who are likely targets. Do you know how sophisticated that is? Like, the criminals who are engaged in this activity, how good is their ability to profile? Do you think they can identify such things as early signs of cognitive degeneration? I think this is organized crime, and they have their own algorithms and processes to identify people.

Chris Olson
Also, to your earlier point, people believe what they see on computers. They're following what's being provided to them, which makes them relatively easy marks. So once that process starts, they're reeling them in. If they lose a fish, that's no problem, because they're going after so many in any given day. They also have infrastructure in local markets to go deal with people personally.

This is a very large criminal organization that has a lot of horsepower to identify and then attack. Right. Okay, so do you have any sense? See, I hadn't thought about the full implications of that. So obviously, if you were a psychopathic scam artist, posing as a false participant on a dating website would be potentially extremely fertile ground, not only for seniors who could be scammed out of their savings, but, as you also mentioned, for younger people on the website who might be useful in terms of human trafficking operations.

Jordan Peterson
So do you have any sense, for example, of the proportion of participants on a given dating platform who are actually criminals or psychopaths in disguise? Because here, let me give you an example. You undoubtedly know about this, but there was a website, can't remember the name of it, unfortunately, I believe it was Canadian, that was set up some years ago to facilitate illicit affairs. And they enrolled thousands of people, all of whose data was eventually leaked, much of that to great scandal.

The notion was to secretly match people who were married with other people who were married to have illicit affairs. They got an awful lot of men on the website and almost no women. And so they created, if I remember correctly, tens of thousands of fake profiles of women to continue to entice the men to maintain what I believe was a monthly fee for the service. Ashley Madison, it was called. Right?

And so, obviously.

Jordan Peterson
A dating website would be wonderful hunting grounds for any kind of predator. And so do you have any sense of what proportion of the people who are participating on online dating sites are actually predators, criminals? I don't know what percentage of the participants on the sites are predators, but where we come in, and our expertise, is that everyone who is visiting is giving information into the digital ecosystem. The issue from there is that they're then able to be targeted wherever they go online. So there's information being collected from the site that they're visiting that is then moving out into the ecosystem, so that wherever they go, they're being pulled back and targeted.

Chris Olson
In a case like Ashley Madison, a criminal may be able to get the digital information about the people whose data was stolen, come back to them six months later, coming from another website via email or SMS text, and then press the attack. At that stage, for us, in becoming a digital persona, our job is to look like someone based on the information that sites have collected about them. So we look like an 85-year-old grandmother living in a senior community. When you become that type of profile, no matter who else is engaging with you online, the algorithm and the content that is going to be served to you is coming from criminals, regardless of their activity on that particular site that you're visiting. It's simply based on who you are.

So artificial intelligence has been around, in use in digital media and targeting people, since 2010 or 2011. The initial use case was collecting data on us. That was the key first step for AI utilization. The second step was then turning that around and targeting people better. AI was first used to collect information, to make things interesting behind the scenes for people.

Second, creating better audience segments, which enable that targeting. The third phase is happening today; you see ChatGPT and the LLMs in regular use. The third big stage is writing content on our devices on the fly. So regardless of where the criminal actor is, regardless of how they're moving into the ecosystem and at what initial buying point, they're able to find that person and write content on the fly that's particularly tailored to what the digital ecosystem knows about them, to create the situation where they then respond and the criminal activity can occur. Right.

Jordan Peterson
And so what that implies as well, then, I suppose, is that we're going to see very sophisticated LLM criminals, right? They'll be able to engage; that's the logical conclusion of what you're laying out. So I just saw a video, it's gone viral, was released about three weeks ago, that portrayed the newest version of ChatGPT.

And it's a version that can see you through the video camera on your phone and can interact with you very much like a person. So they had this ChatGPT device interacting with a kind of unkempt, nerdy sort of engineer character who was preparing for a job interview, and the ChatGPT system was coaching him on his appearance and his presentation, and I think they used Scarlett Johansson's voice for the ChatGPT bot. It was very, very flirtatious, very intelligent, extremely perceptive, and was paying attention to this engineer who was preparing his interview like, what would you say, the girlfriend of his dreams would, if he had someone who was paying more attention to him than was ever paid to him in his life.

And so I could imagine a system like that set up to be an optimal criminal, especially if it was also fed all sorts of information about that person's wants and likes. So let's delve into that a little bit. How much of a digital footprint do you suppose, like, how well are each of us now replicated online as a consequence of the criminal or corporate aggregation of our online behavior? For the typical senior, for example, how much information would be commonly available to criminal types? About the typical senior, the typical person, the typical 14-year-old, for that matter.

Chris Olson
Right. The majority of the prior activity that they've engaged in online. Corporate digital data companies know a great deal. Their job is to know as much about us as possible, and then to target us with information to maximize profit. Right?

That's the core goal. Criminals have access to that data, and they're leveraging it just like a big brand advertiser would. So they know it's a grandmother, and they're going to put in something that only runs on the grandmother's device, which makes it very, very difficult for big tech and digital media companies to see the problem before it occurs. I think another thing that's really important to understand is this is our most open border, right? So we've got an idea of national sovereignty.

There's lots of discussion on whether or not our southern border is as secure as it should be. Our actual devices, our cell phones, our televisions, our personal computers, are open to source code and information coming from any country, any person, at any time, and typically resolved to the highest bidder. Right.

Jordan Peterson
Right. So the digital world, the virtual world, is it a lawless frontier? I mean, I guess one of the problems is, like, if I'm targeted by a criminal gang in Nigeria, what the hell can I do about that? I mean, the case I mentioned to you of my relative who was scammed out of a good proportion of their life savings, that gang was operating in Eastern Europe. We could more or less identify who they were, but there was really nothing that could be done about it. I mean, these are people who are operating well out of any physical proximity, but also even out of, hypothetically, the jurisdiction of, well, say, lawmakers in Canada, police services in Canada.

And so how lawless is it? How should we be conceptualizing the status of law in the online and virtual world? Yeah, and I think this is where the major rub is. So I'm going to walk back and talk about cybersecurity as an industry first. So cybersecurity is relatively mature. It is now geared to monetizing the chief information security officer. What that means is it's providing products and services designed to protect what they are paid to hold dear, which is the corporate asset.

Chris Olson
So the machines and the data for the corporation. If you're part of the government, which is where we're going to go in the conversation, then your job as a CIO or a CISO is to protect government machines. Governments will tell you that they're protecting you, right? They're protecting you from digital harm. What that means today is they're protecting your data on the DMV website.

That's basically the beginning and the end of cybersecurity and digital protection. There's legislation occurring, coming from attorneys general, from state legislatures, and from the federal government in the US to a degree, seeking to protect people from data collection; other countries seem to be further ahead. That's your GDPR in Europe. Many states in the United States are putting some rules in place around what corporations can collect and what they can do with the data.

The predominant use case is to provide a consumer with an opt-out mechanism. Most consumers say, okay, I want to read the content. They're not doing a whole lot with the opt-out compliance. So that's not been a big help to your typical consumer. But it's really the mindset that's the problem; the mindset of corporations and governments is what's at issue.

And so governments need to tactically engage on a 24/7 basis with digital crime in the same way that they're policing the street. So the metaphor would look like this: if grandmothers were walking down the street and being mugged or attacked at the rate that they're getting hit online, you would have the National Guard policing every street in America. Governments, and I mean governments at every level, need to take a step forward and do a better job at policing for people tactically.

And that does not mean that they're going after big tech or digital media companies. It means that they're protecting people, with the mindset that they're going to cooperate with the digital ecosystem to do a better job to reduce overall crime. Right. So your point appears to be that we have mechanisms in place, like the ones that are offered by your company, that protect the corporations against the liability they would be laden with if the data on their servers was compromised. But that is by no means the same thing as having a police force that's accessible to individual people, who are actually the victims of criminal activity.

Jordan Peterson
Those aren't the same things at all. It's like armed guards at a safe in a bank compared to police on the street who are there to protect ordinary people, or who can be called. Have I got that about right? Yes.

Chris Olson
And digital crime is crime. So when you're stealing grandmother's money, that is theft. We don't need a lot of new laws. What we need to do is actively engage with the digital ecosystem to try to get in front of the problem, to reduce the overall number of attacks, which reduces the number of victims.

And to date, when we think about digital safety, it's predominantly education and then increasing support for victims. Victims are post-attack; they've already had their money stolen. Getting in front of that is the key. We've got to start to reduce digital harm.

I've been doing this for a good number of years, and the end of that conversation does reside with local and state governments. And ultimately, the federal government in the United States is going to have to find resources to actively protect, beyond having discussions about legislating data control or social media as a problem. Okay, so I'm trying to wrestle with how this is possible, even in principle. Now, you said that what your company does, and we'll get back into that, is produce virtual victims, in a sense, false virtual victims, so that you can attract the criminals and see what they're doing. So I presume that you can report on what you find to the companies so that they can decrease the susceptibility they have to exploitation by these bad actors.

Jordan Peterson
But that's not the same thing as actually tracking down the criminals and holding them responsible for their predatory activity. And I'm curious about what you think about how that's possible, even in principle. First of all, these criminals tend to be, or can easily be, acting at a great distance, in jurisdictions where they're not likely to be held accountable in any case, even by the authorities; or maybe they even are the authorities themselves. But also, as you pointed out, more and more it's possible for the criminal activity to be occurring on the local machine. And so that makes it even more undetectable. So I can't easily understand, and you're obviously in a much better position to comment on this, how even in principle there can be such a thing as, let's say, an effective digital police force.

Like, even if you find the activity that someone's engaged in, and you can bring that to a halt by changing the way the data is handled, that doesn't mean you've identified the criminals or held them accountable. So I can't understand how that could proceed, even in principle. So the digital ecosystem is made up of a supply chain, just like every other industry. There are various steps that a piece of content is going to go through before it winds up on your phone. So it's running through a number of different companies, different cloud solutions, different servers.

They're intermediaries. And so a relationship between those digital police, the governments, and those entities on a tactical basis is really the first step: seeing crime and then reporting that back up the chain so that it can be stopped higher and higher up, towards ultimately the initiation point where that content is delivered. So it seems fantastic, but it is possible. Well, the criminals need to use intermediary processes in order to get access to the local devices.

And so you're saying that those intermediary agencies could be enticed, forced, compelled, invited to make it much more difficult for the criminals to utilize their services. And that might actually be effective.

But does that aid in the identification of the actual criminals themselves? Because, I mean, that's the advantage of the justice system, right? You actually get your hands on the criminal at some point. Yes, and I think ultimately it does. So you have to start, and you have to start to build the information about where it's coming from.

Chris Olson
You then have to cooperate with the private entities. Our digital streets are managed and made up of private companies. It's not a government-run Internet. All of the information that's fed to us, at least in Western society, is coming from these private companies. And so I think rather than having an antagonistic relationship between governments and private companies, where governments are trying to legislate them into a position, legislation may be appropriate for certain rules and regulations. It may be appropriate, say, to raise the age of accessing social media from 13 to 16 or 18.

And that is a proper place for the government to be legislating. On the other hand, an eye towards reducing crime is critical. And the ethical and moral mindset among all of the parties, governments through to our corporations, has to be solely on protecting people. And I think that's something that is significantly missing. It's missing in the legislation, it's missing in cybersecurity. It's not something that we've engaged in as a society.

So there are a few countries, and I think even a few states in the US, that are looking at a broader whole-of-society approach. That whole-of-society approach is a mimicking of how the Internet and the digital ecosystem work, which is certainly a whole-of-society activity. It is the thing that influences and affects all of us every single moment of every single day. Engaging in that, looking across the impact on society and doing better via cooperation, is a critical next step. How often do you think the typical elderly person in the United States, say, is being successfully.

Jordan Peterson
No. Is being first communicated with by criminal agents, and then how often successfully communicated with? What's the scope of the problem? The scope is: if you're a senior citizen, in particular if you're a female senior citizen roughly 78 to about 85 years old, we see that two and a half to 3% of every single page impression or app view is attempting to target you with some form of crime, or influence that's going to move you towards crime.

Chris Olson
So it is highly, highly significant. In some ways, looking at this is shooting fish in a barrel to make a dent. So you're concerned that the legal system isn't going to be able to find the criminals? There is so much to detect and stop, and so much room to turn them off quickly, that we can gain a significant reduction in digital crime by working together and considering society as a whole, instead of the different pockets and how can we legislate, or how can we try to move a private company to do better on their own.
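To make the scale of that 2.5 to 3% figure concrete, here is a back-of-envelope sketch. The daily page-view count is an assumed, illustrative number, not a figure from the episode.

```python
# Rough arithmetic on the exposure rate quoted above: if 2.5-3% of page
# impressions target a senior with some form of crime, even modest daily
# browsing yields several malicious impressions per day. The page-view
# count below is an assumption for illustration only.

def expected_criminal_impressions(page_views_per_day: float, rate: float = 0.03) -> float:
    """Expected malicious impressions per day at a given targeting rate."""
    return page_views_per_day * rate

# Assume a senior who sees roughly 100 page or app views in a day.
print(expected_criminal_impressions(100))         # 3.0 at the 3% rate
print(expected_criminal_impressions(100, 0.025))  # 2.5 at the 2.5% rate
```

At those assumed browsing levels, a targeted senior encounters a criminal impression multiple times every single day, which is why Olson calls the detection problem "shooting fish in a barrel."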

Jordan Peterson
Okay, so we talked a little bit about the danger that's posed by one form of con game in relationship to potential criminal victims, and that was senior romance scams. What are the other primary dangers that are posed to seniors? And then let's go through your list. You talked about 17 year olds who are being sold online access to drugs. That includes now, by the way, a burgeoning market in under the table hormonal treatments for kids who've had induced gender dysphoria.

So you talked about seniors, 17-year-olds who are being marketed illicit drugs, 14-year-olds who are being enticed into, let's say, modeling, and people who are sick and infirm. So those are four major categories. Let's start with the seniors. Again, apart from romance scams, what are the most common forms of criminal incursion that you see? The most common form is the tech support or upgrade scam.

Chris Olson
And essentially, the Internet knows that you are a senior. When you're going to a website that you and I would visit, instead of having a nice relationship with that site and reading the content and then moving on to something else, you're getting a pop-up or some form of information that's telling you there's something wrong with your computer. You either need to call a phone number or you need to click a button, which then moves you down to something else that is more significant. This is happening millions and millions and millions of times per day, and it is something that we can all do something about. Attempting to educate seniors to try to not listen to the computer when it's telling you to do something is not working.

No. Well, no wonder. It's so sophisticated to manage that because, you know, once you've worked with computers for 20 years, especially if you grew up with them, you know when your computer or your phone is telling you something that's actually valid, and when it isn't. A lot of these criminal notifications don't even look right. They look kind of amateurish.

Jordan Peterson
They don't have the same aesthetic that you'd expect if it were a genuine communication from your phone. But, man, you have to know the ecosystem to be able to distinguish that kind of message from the typical thing your phone or any website might ask you to do. And educating seniors, it's not just a matter of describing to them that this might happen. They would have to be tech-savvy cell phone users, and it's hard enough to do that if you're young, much less if you're outside that whole technological revolution. So I can't see the educational approach working.

The criminals are just going to outrun that as fast as it happens. So 3%, eh? That's a lot. That's about what you'd expect. Yeah, it is highly significant.

Chris Olson
And I think getting in front of this problem requires cooperation with states, moving that tactically, to have the idea of a police force looking at digital crime. And I think one of the things that both sides, whether it's private companies or states, need to wrap their heads around is that there's gonna be a cooperative motion to do better with people in mind. Yeah. All right, so let's move to the other categories of likely victim.

Jordan Peterson
So you talked about romance scams and also computer upgrade, repair, and error scams for seniors. Are there any other domains where seniors are particularly susceptible? Also, I think what I'd put into context is a lot of the data collection that results in people getting phone calls with a voice copy of their grandchild. Right. Which then ultimately is going to result in a scam.

Chris Olson
It is that digital connection that is the leading point that drives the ability to commit those types of crimes: the ability to marry their grandson or granddaughter's voice with their digital persona, and then finding a phone number that they can use to call them. So there's a lot of action happening just in our daily interactions that's ultimately being moved out into the ecosystem, that we have to take a look at, and that is not easy to fix. Right.

Jordan Peterson
Well, and then you're going to have, right, you're going to have that deepfake problem, too, where those systems that use your grandchild's voice will actually be able to have a conversation with you, targeted to you, in that voice, in real time. And we're probably no more than... well, the technical capacity for that's already there.

I imagine we're no more than about a year away from widespread adoption of exactly that tactic. So I've been talking to some lawmakers in Washington about such things, about protection of digital identity. And one of the notions I've been toying with, maybe you can tell me what you think about this, is that the production of a deepfake, the theft of someone's digital identity to be used to impersonate them, should be a crime that's equivalent in severity to kidnapping. That's what it looks like to me, you know, because if I can use my daughter's. If I can use your daughter's voice in a real time conversation to scam you out of your life savings.

It's really not much different than me holding her at gunpoint and forcing her to do the same thing. And so I don't know, like, if you've given some consideration to severity of crime or even classification, but theft of digital identity looks to me something very much like kidnapping. What do you... like, any thoughts about that? Yeah, for me, I would simplify it a little bit.

Chris Olson
Using Section 230 or the First Amendment to try to claim that the use of our personal identity to do something online isn't a crime, when it is a crime, doesn't make sense. So we want to simplify this first: we don't need a broad-based rule on identity, necessarily, before we simply state that if someone's using this for a crime, it's a crime, and that it is going to be prosecuted if you're caught and detected. Which then goes back to actually catching and detecting. The way that that works, it uses the pre-existent legal framework and doesn't require much of a move. But I'm concerned that the criminals will just be able to circumvent that as the technology develops.

Jordan Peterson
And that was why I was thinking about something that might be a deeper and more universal approach. I know it's harder to implement legislatively, but that was the thinking behind it anyway. So, yeah, for us, there is a path that leverages that content to bring it to the device. And I think understanding that mechanism and how it's brought forward, versus looking at the content, is key. And I'll give you an example of what's happening in political advertising as we speak.

Chris Olson
Understanding the pathway for how that content is delivered is ultimately how we get back to the criminal or the entity that's using that to perpetrate the crime. The actual creation of the content is incredibly difficult to stop. It's when it moves out to our devices that it becomes something that we need to be really paying attention to. So in political advertising, up to October of this past year, our customers asked us to flag the presence of AI source code. The idea there was they didn't want to be caught holding the bag as the server of AI-generated political content, right, because that just looks bad in the news.

Someone's letting someone use AI; it's going to wind up being disinformation or some form of deepfake. By October, we essentially stopped using that policy, because greater than 50% of the content that we were scanning had some form of AI. It may have been to make the sun a little more yellow, the ocean a little bit more blue, but using that as a flag, right, to understand what's being delivered out: once you get over 50%, you're looking at more than you're not looking at. That's not a good automated method to execute on digital safety.
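The triage problem Olson describes, a flag that fires on more than half of all scanned content and therefore stops narrowing anything down, can be sketched in a few lines. The function names and the toy corpus below are invented for illustration; this is not Media Trust code.

```python
# Minimal sketch of why a binary "contains AI" flag stops working as a
# review filter once most content trips it. Names and data are illustrative.

def flag_rate(items, has_ai_marker):
    """Fraction of scanned items that trip the AI flag."""
    return sum(1 for item in items if has_ai_marker(item)) / len(items)

def useful_as_filter(rate, threshold=0.5):
    """A flag only narrows the review queue if it fires on a minority of items."""
    return rate < threshold

# Toy corpus: 60% of items touched by AI tooling, however trivially
# (e.g. making the sun a little more yellow).
corpus = [{"id": i, "ai": i % 10 < 6} for i in range(100)]
rate = flag_rate(corpus, lambda item: item["ai"])
print(rate, useful_as_filter(rate))  # 0.6 False: flagging the majority is useless triage
```

Once the flag fires on a majority, "you're looking at more than you're not looking at," so the policy was retired in favor of more discriminating signals like dedicated deepfake detection.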

So as we move forward, we have a reasonably sophisticated model to detect deepfakes, very much still in a test mode, but it's starting to pay some dividends. And unquestionably, what we see is that using the idea of deepfakes to create fear is significantly greater than the actual use of deepfakes. Now, that's limited to a political advertising conversation. We're not seeing a lot of deepfakes serving in information, or certainly not on the paid content side. But the idea of fearing what's being delivered to the consumer is very much becoming part of a mainstream conversation.

Jordan Peterson
Yeah. Well, wasn't there some insistence from the White House itself in the last couple of weeks that some of the claims that the Republicans were making with regards to Biden were a consequence of deepfake audio? Not video, I don't think, but audio, if I got that right. Does that story ring a bell?

And I think where we are at this stage in technology, it's very likely there is plenty of deepfake audio happening around the candidates. So whether you're Donald Trump or Joe Biden, or even local political campaigns, it's really that straightforward. I think on the video side, there are going to be people working on it left and right.

Chris Olson
I think it's the idea of using that as a weapon to sow some form of confusion among the populace, some doubt. Some doubt is going to be dramatically more valuable than the actual utilization of deepfakes to move society. You do, eh? So you do think that even if the technology develops to the point where it's easy to use, it'll be the weaponization of the doubt that's sown by the fact that such things exist.

And we've been watching this for a very, very long time. And our perspective is coming at this from digital crime and safety in content. Safety in content typically means don't run adult content in front of children; don't serve weapons ads in New York State, they're not going to like that; don't have a couple walking down the beach in Saudi Arabia.

Right. Their Ministry of Media is going to be very unhappy with the digital company that's bringing that kind of content in. Safe content is in the eye of the beholder: drugs and alcohol, targeting the wrong kinds of people. So we look at this from a lens of how do you find and remove things from the ecosystem. If we continue down the path that we're on today, most people won't trust what they see.

We were discussing education. People are going to self-evolve to a point where so much of the information that's being fed to them is just going to be disbelieved, because it's going to be safer not to go down that path. I'm wondering if live events, for example, are going to become once again extremely compelling and popular, because they'll be the only events that you'll actually be able to trust. Frankly, I think it's also critical that we find a way to get a handle on kind of the anti-news and get behind the entities promoting trust in journalism.

That is a very meaningful conversation and it is something that we need to try to get back to. It's much less expensive to have automation or create something that's going to create some kind of situation where people continue to click. That's a terrible relationship with the digital ecosystem. It's not good for people to have that in their hand. And, you know, with the place where digital crime is today, if you're a senior citizen, your relationship is often net negative with the Internet.

Right. You may want to stick to calling your kids on voice-over-IP, where you can see their face; lots of different ways to do that in video calling. But doing other things on the Internet, including things as simple as email, it may be more dangerous to engage than any benefit that you're going to get back. And I think as we move closer to that moment in time, this is where we all need to be picking up and focusing on digital safety, focusing on the consumer.

I think corporates are going to have to engage on that. Okay. So let me ask you a question about that, because one of the things I've been thinking about is that a big part of this problem is that way too much of what you can do on the net is free.

Jordan Peterson
Now, the problem with free is this: let's take Twitter, for example. Well, if it's free, then it's 20% psychopaths and 30% bots, because there's no barrier to entry. And so maybe there's a rule like this: wherever the discourse is free, the psychopaths and the exploiters will eventually come to dominate, and maybe quite rapidly, because there's no barrier to entry and there's no consequence for misbehavior. So we're putting together a social media platform at the moment that's part of an online university. And our subscription price will be something between 30 and $50 a month, which is not inexpensive, although compared to going to university it's virtually free.

You know, and we've been concerned about that to some degree, because it's comparatively expensive for, like, a social media network. But possibly the advantage is that it would keep the criminal players at a minimum, right? Because it seems to me that as you increase the cost of accessing people, you decrease people's ability to do, well, low-cost, you know, multi-person monitoring of the sort that casts a wide net and that costs no money. So what are your thoughts about the fact that so much of this online pathology is proliferating?

Because when we have free access to a service, so to speak, the criminals also have free access to us. Am I barking up the wrong tree, or does that seem... does it mean that the Internet is going to become more siloed and more private because of that? I think it's going to go in two ways. One, you will find safety in how much money you spend, and that's already true.

Chris Olson
So when there are paywalls within even large news sites, the deeper you go into the paywall, the higher the cost to reach the consumer, not just coming from the consumer, but even through advertising and other content producers, and the lower the activity of the criminal, because it's more expensive for them to do business. That is true, right? That's been true throughout. I think the other requirement, because we're very acclimated to having free content, is that the entire supply chain is going to have to engage.

So when you think through who is responsible for the last mile of content that's able to reach our devices inside of our home, right, is it the big telcos? Is it the companies that are giving us Wi-Fi and bringing data into our houses? Right now, they're putting their hands up: it's not our job to understand what happens to you on your device. If anything, there's a data requirement that says we're not allowed to know, or we're not allowed to keep track of, where you go and what comes onto your device. There's a big difference between monitoring where we go online and what is delivered onto our device.

And this is missing from the conversation. Privacy is critically important, and privacy is about how we engage in our activities on the Internet. The other side of that is what happens after the data about us is collected. And that piece is not something that is necessarily private. It should not be broadcast what is delivered to us, but someone needs to understand and have some control over what is actually brought in based on the data that is collected.

And that is a whole of society, meaning all of the companies, all of the entities that are part of this ultimate transaction to put that piece of content on our phone and our laptop and our TV, need to get involved in better protecting people. One of the primary issues is there are so many events, trillions of events per day on all of our devices, that even when you have paywalls, the problem is so huge that you can always find access to people's machines, until we get together and do something better about it. Okay.

Jordan Peterson
So paywalls, in some ways, are a partial solution. Okay, so that's useful to know. Now, I want to ask you a specific question, then we'll go back to classes of people who are being targeted by criminals.

I want to continue walking through your list. Do you have specific legislative suggestions that you believe would be helpful at the federal or state level? And are you in contact, as much as you'd like to be, with legislators who are interested in contemplating such legislative moves? The reason I'm asking is because I went to Washington probably the same time I met you, and I was talking to a variety of lawmakers on the Hill there who are interested in digital identity security.

But it isn't obvious that they know what to do, because it's... well, it's complicated, you might say. It's extremely complicated. And I think the big tech companies are in some ways in the same boat. So do you already have access to people sufficiently, as far as you're concerned, who are drawing on your expertise to determine how to make the digital realm a less criminally rife place?

Chris Olson
I would always like more access. What I find is that the state governments are really where the action is, and they're closer to people. Right. So the federal government is quite far away from a grandmother or someone in high school.

The state governments know the people who run the hospitals. They know people at senior communities. They understand what's happening on the ground. They're also much closer, if not managing overall police forces. Right.

So that may be down at the county level or other types of districts, but they understand a daily police force. So I think what we're seeking is to influence states to take tactical action. And if that requires legislation, what that would be is putting funds forward to protect people from digital crime the same way that they're policing, or helping to police, crime against people in their homes, walking down the street, on our highways, in our banks. Right. The typical type of crime. We're 20 years in from data collection, data targeting, third-party code, kind of dominating our relationship with our devices.

It is the one piece that governments really haven't started to work on a whole lot. The United Kingdom, on the other hand, has three different agencies that have been given the authority to tactically and actively engage with the digital ecosystem. So those are the companies that make up the cloud, that serve advertising and serve content, that build websites and e-commerce systems. They're finding problems and then they're engaging tactically with that digital supply chain to turn off attacks.

It's the beginning. Are they doing that in a manner that's analogous to the approach that you're taking, the creation of these virtual victims and the analysis of them? I think mostly it's receiving feedback from people that are being targeted and getting enough information about those to then move it upstream.

Legislation that would say that a synthetic persona in a particular local geography counts as crime, that would be a big leap for governments to take. That would be very, very useful in the ability to go out and actually prosecute, but I think that's going to be a very, very difficult solution. I think the problem must be addressed in cooperation with big tech and digital media, and that, as a police force in a local market, content is targeted locally; it's geo-fenced. Something is going to be served into the state of Tennessee differently than it is served into New York State.

As that information is gathered, it should be given to those who can turn off attacks quickly, that is crime reduction, and then ultimately be working together where if there is certainty that there is a crime and the companies that are part of the supply chain have information on the actual criminal, that they're sharing that in a way that, one, they're not getting in trouble for sharing the information, but two, they're collectively moving upstream to that demand source that's bringing the content to our device. I think that becomes a natural flow at some point in the future. The faster we get there, the better. And I want to make sure that I'm making this clear. That's not about protecting a machine.

That's about protecting the person at the other end of the machine. And keeping that mindset is critical. Right. Right. Okay, so let's go back to your list of victims.

Jordan Peterson
So you mentioned 17-year-old males who are being offered the opportunity to buy drugs online. So tell us about that market; I don't know much about that at all. And do you have some sense of how widespread it is? First of all, if the 17-year-old is being targeted to purchase illicit drugs, are they being put in touch with people who actually do supply the illicit drugs?

Has the drug marketing enterprise been well established online? And can you flesh that out? What does that look like? Yeah. So this is a place where the biggest tech and digital media companies have done a very good job removing that from.

Chris Olson
From digital advertising and targeted content on their pipes. But that is still something that's happening every single day, and actually growing, predominantly through social media channels, or interactions between the person who's going to end up selling the drugs and the buyer. The person could be in any country; this is coming through the mail, or it's leading to the streets and making a purchase. But what I can give you, if I'm going to get these numbers right: roughly 2,000 deaths from fentanyl or similar drugs in the Commonwealth of Virginia in 2023. And the belief is that greater than 50% of those drug transactions began online.

So it is a predominant location for the targeting of people to buy, informing people that the drugs are available, and then ultimately making the sale. Okay. And they break down the demographics in the same way, making the presumption that males, in all likelihood of a particular age, are the most likely targets. I wonder what other demographic information would be relevant if you were trying to target the typical drug seeker.

Jordan Peterson
I don't know. Okay, so then you talked about 14 year old girls who are being targeted by human traffickers. And so. And you mentioned something about modeling. Yeah.

Chris Olson
So if there's going to be a core profile, you know, a low-hanging-fruit profile for human traffickers, young females that are interested in things like modeling, fashion, presenting themselves out on social media are going to be a high, high target. Often that is going to be people finding their way to become friends with them and using that information psychologically to have a relationship. But there are also the algorithms in place that enable entities to put code on device, which allows them to continue to track them and find them off platform, off social media as well. Right. And what's the scope of that problem?

Jordan Peterson
Like you said, it's something in the neighborhood of 3% of the interactions that the typical elderly woman, between, you said, I think, 78 and 85, something like that, 3% of the interactions they're having online are facilitated by criminals. What's the typical situation for a 14-year-old girl who's been doing a fair bit of fashion shopping online? That is incredibly difficult. We don't have that data.

Okay. Incredibly difficult to find, but it is something that's happening on a daily, routine basis. Okay. So it's something for people to be aware of if they have adolescent girls who are interacting online?

Yeah. Yes. I imagine they're targeted in all sorts of ways. And then you mentioned people who are looking for medical information online, who are sick and infirm, and then obviously in a position to be targeted by scammers. In consequence of that.

Chris Olson
If you want to put it into context of frequency, senior citizens are the highest targeted, and then the next highest targeted segment are going to be those searching for some form of medical solution to a problem. Oh, yeah. So they're number two. Okay. And that is a heavy desperation moment.

They're also often traveling back and forth to health facilities. Whether they're in a hospital directly or they're moving back and forth between doctors' offices, that information is made available. You can buy data on people who visit health facilities on a routine basis, and then they're very easy targets. They're desperate, and so they're very easy to suck into a problem. It really ranges from stealing money.

So, having access to bank accounts; to phishing attacks, where you're suggesting that they become part of a program and you're gathering more and more information on them to then do future attacks; to selling scam products. So one of the great phenomena in digital media during COVID, especially in the first nine or ten months, was targeting seniors, and then people with any form of illness, explaining to them how COVID is going to do something very, very bad to you: buy this product now. And those were scams. So the product rarely showed up.

It certainly wasn't very, very useful. Those may be kind of low, a low-level problem on a per-crime basis. But when you look at it across society, the impact is spectacularly huge. Okay, why is the impact spectacularly huge if you look at it in that manner?

Well, the numbers add up. They're spending more and more and more money, which is a big, big issue. But it also feeds the mindset that the computer is going to tell me something, going to create some sort of concern within me. If they weren't looking at the computer, it never would have occurred to them to look for the problem in the first place.

And so in addition to stealing our money, it's stealing our time, and it's creating a great sense of fear that people are then living with and kind of walking around all day wondering. My computer told me this thing. I'm very concerned about it. It's continuing to feed more information. The more you click, the more afraid you become, which becomes a very, very big impact on society.

Jordan Peterson
So I don't know if you know this, but it's an interesting fact. It's an extremely interesting fact in my estimation. Do you know that sex itself evolved to deal with parasites? I did not. Okay, so here's the idea.

I mean, there are very few truths that are more fundamental than this one. So parasites are typically simpler than their hosts, so they can breed faster. And what that means is that in an arms race between host and parasites, the parasites can win because they breed faster, so they can evolve faster than the host. So sex evolved to confuse the parasites. Imagine that the best way for your genes to replicate themselves would be for you to breed parthenogenetically. You just clone yourself.

There's no reason for a sexual partner. When you have a sexual partner, half your genes are left behind. That's a big cost to pay if the goal is gene propagation. But the parasite problem is so immense that sexually reproducing creatures, and that's the bulk of creatures that there are, are willing to sacrifice half their genes to mix up their physiology, so that parasites can't be transmitted perfectly from generation to generation. So the parasite problem is so immense that it caused the evolution of sex, and creatures will sacrifice half their genes to prevent it.

So what that implies... like, we have this whole new digital ecosystem, which is a biological revolution for all intents and purposes; it's a whole new level of reality. And the parasite problem is very likely to be overwhelming. I mean, we have police forces, we have laws, we have prisons to deal with parasites in their human form, but now we have a whole new ecosystem that is amenable to the invasion of the parasites. And they are coming like mad, I mean, in all sorts of forms. I mean, we don't even know how extensive the problem is to some degree, because there's not just the criminals that you talk about, and they're bad enough, but we also have the online troll types who use social media to spread derision and to play sadistic tricks and games and to manipulate for attention.

And we know that they're sadistic, psychopathic, Machiavellian, and narcissistic, because the psychological data is already in. They fall into the parasite category. And we also have all that quasi-criminal activity like pornography. And so it's certainly possible that if the Internet in some sense is a new ecosystem full of new life forms, it could be swamped by the parasites and taken out.

That's what you'd predict from a biological perspective looking at the history of life. And so this is an unbelievably deep and profound problem. See, I kind of think this is one of the main dangers of this untrammeled online criminality: societies themselves tend to undergo revolutionary collapse when the parasites get the upper hand. And it's definitely the case that by allowing the unregulated flourishing of parasitical criminals online, we risk, really risk, destabilizing our whole society, because when those sorts of people become successful, that's very bad news for everyone else. It doesn't take that many of them to really cause trouble.

So anyways, that's a bit of a segue into... Well, it's pretty fascinating. A couple quick points here. So, one: the primary concern for the entities in digital is on their content versus the consumer. So there's content adjacency.

Chris Olson
The largest flag we have for content that's brought by third parties, that's going to run on someone else's content, is the Israel-Hamas conflict. The reason for that is less about having a person get upset than it is about having a large brand like Coca-Cola or Procter & Gamble have other content that's going to run near that Israel-Hamas context. Right. Just in vicinity, yeah. And that is worrying about pixels, or perhaps the name of a corporation, more than the impact on the grandmother.

Right. Who's going to be hit in the next impression with the crime. And so we're still in a nascent spot within the tech infrastructure, where those who provide the capital to give us all of those free services are dominating the conversation. That's part of why a government needs to step in and say we're going to focus on crime. What that also does gets back to a parasitic evolution.

And what's the sacrifice that big tech, digital media, and the corporates, the brands, are going to make in order to protect grandmothers? Right now, the bigger concern is about what might be fake, because it's wasting a penny or a fraction of a penny when a pixel is delivered to an end device. The spend is about monetizing each individual nanosecond and pixel that's going to run in front of us, versus the consumer. And I think this is an incredibly myopic viewpoint. Digital safety for the brand is about making sure the picture of their product is in a happy location while grandmothers are losing bank accounts.

And I think that evolution is going to require a sacrifice. The companies that engage in digital safety, and many big tech and digital media companies go way out of their way to do a good job protecting people, are ultimately going to win, because their relationship with us is going to be so much better protected and trusted that they're going to wind up interfacing with us better than those who are only trying to protect their own.

Jordan Peterson
Well, that makes sense to me. That's an optimistic view, because fundamentally, what makes companies reliably wealthy over the long run is the bond of trust they have with their customers. Right? That's really what a brand is worth, in the final analysis. I mean, Disney was worth a fortune as a brand because everybody trusted both their products and the intent behind them.

And so that's a very hard thing to build up, but it is the basis of wealth. Trust is the basis of wealth. And so it's interesting to contemplate the fact that it might be in the best interests of the large online companies to ensure the safety of the people using their services, rather than merely the safety of their products. That's interesting to think about. Okay, so maybe we can close at least this part of the discussion with a bit of a further investigation into these virtual personas that you're creating to work as false targets of criminal activity.

Tell me about them, and tell me approximately how many of them there are, if you can. I don't want to interfere with any trade secrets, but what kind of volume of false personas are you producing to attract criminal activity? And is that something that can be increasingly AI-mediated?

Chris Olson
Yes, it is. We use manual processes, but we also use AI and continuous scanning of digital assets to keep those profiles active. Our job isn't so much to become a grandmother to the world; it's to have certain components that enable big tech, ad serving, or content delivery to perceive us to be that. So we're really gaming the system back: finding those objects, those persona classifications, on device, whether that's actual phones or televisions or computers, and then running the content against as many of them as possible. We're running millions of combinations of potential consumers. Some devices carry many, many profiles at the same time, because the targeting isn't going to discern between different activities as long as you have something that leans toward what they're looking for.

But then what gets very interesting is that a predominance of the content runs through an auction model. So you have to fit within the price points of what the criminals are trying to attract as well, which is not always people with a lot of money; it's everyone in the ecosystem. And so we're becoming a very nuanced set of personas, millions of them. A very critical component is geography.

Right? They're going to target a specific town differently. I don't know if you've had any offers from your government to buy you solar panels. There aren't actually a lot of government programs that will pay for your solar panels; those are typically some form of scam, and they're directed at a local market.

What's very interesting right now is that rather than using AI to design content that pulls you in better, we're seeing more and more similar content designed so that it's harder to pull down once it's flagged as bad. They'll make 30 copies where there used to be one or two, so you can pull down 15 or 20 and there are still going to be 10 or 15 left. Right.

Jordan Peterson
So what a strange world, where we have the proliferation of AI-enabled victim decoys to deter AI-enhanced online criminals from preying on people. Yes, AI is used for safety, to defend us from AI. We have hit that moment.

Yeah. Well, so then I was just trying to contemplate briefly what sort of evolutionary arms race that produces. Right. Hyper victims and super criminals. Something like that.

Jesus. Weird.

Chris Olson
Well, there is some worry that ultimately it's a horsepower game, right? When it's AI versus AI, the more computing horsepower you have, the more likely it is that your AI is going to win. At The Media Trust, our job is to make the digital ecosystem safer for people. We're not all that concerned about one AI beating another AI, unless that's in the context of keeping a grandmother from losing her bank account. That is the core of how we look at it, which is different from an enamored relationship with technology, seeking technology solutions for a technical problem. This is a human issue, and with that, the personas are human reflections back into the delivery of content. It's not about the machine.

Jordan Peterson
How are you feeling about your chances of control, or our chances, for that matter, of control over online criminality? And how successful do you believe you are in your attempts to stay on top of and ahead of the criminal activity that you're trying to fight?

Chris Olson
For our customers that prioritize digital safety, the vast majority of what might run through to attack someone is being detected and removed. They need to have the appropriate mindset. They need to be willing to go up to the demand source to remove the bad activity that's going to be coming down. You don't just want to play whack-a-mole; you have to engage in that next step. Those that do are very successful and create safe environments. It is not possible to make this go away entirely: the pipes, the way the Internet works, the way data targeting works, it's just not something you can eliminate.

But there are companies that are in front of this, that will withhold millions of dollars in revenue at any given moment to prevent the possibility of targeting something and having something bad happen. And there are a lot of companies that are not willing to go that far. I think right now, in some of the bigger companies, we see a lot of risk around this: who's going to win ChatGPT, who's going to win the LLM race? There is so much at stake in that from a competitive and revenue perspective that the companies that can monetize it best are going to start to leap forward.

When you're looking at the world from "how does my technology win" rather than "how do I safely get my technology to do the things I want," that's when you start to run a lot of risk. We're in a risk-on phase in digital right now.

Jordan Peterson
Right. But your earlier claim, I think, which is worth returning to, was that over any reasonable period of time, and there's the rub, the companies that do what's necessary to ensure trust, to ensure that their users can trust their interactions with them, are going to be the ones arguably best positioned to maintain their economic advantage in the years to come. And I think, yes, those that are willing to engage with governments to do a better job, to ultimately find the bad actors and take them down, are going to be a big part of making the ecosystem better, rather than insulating themselves and hiding behind a risk-averse legal regime that doesn't want to bring data forward to clean up the ecosystem. Okay.

Well, for everybody watching and listening, I'm going to continue my discussion with Chris Olson on the Daily Wire side of the interview, where I'm going to find out more about how he built his company, how his interest in understanding and preventing online crime developed, and what his plans for the future are. So if those of you who are watching and listening are inclined to join us on the Daily Wire side, that would be much appreciated. Thank you to everybody who is watching and listening for your time and attention. And thank you very much, Mister Olson, for fleshing out our understanding of the perils and possibilities that await us as the Internet rolls forward at an ever-increasing rate, and also for alerting everybody who's watching and listening to the particular points of access that online criminals have at the moments when we're in our most vulnerable states: sick, young, seeking, old, all of those things. We all know people who are in those categories, and we're looking for ways to protect them against the people that you're also trying to protect us from. Thank you very much for that. Thank you. Thanks for having me.

You bet. And again, thanks to everybody who's watching and listening, and to the film crew down here in Santiago, Chile today. Thank you very much for your help today, guys, and to the Daily Wire people for making this conversation possible. That's much appreciated.

Thanks very much, Mister Olson. Good to talk to you.

Chris Olson
Thank you.