#1558 - Tristan Harris

The Joe Rogan Experience

Joe Rogan, Tristan Harris · 90 Clips · Oct 30, 2020
Episode Transcript
0:00
Hello friends, welcome to the show. This episode of the podcast is brought to you by Liquid IV. Liquid IV is a fantastic electrolyte supplement, and they have a new formula called Liquid IV Hydration Multiplier Plus Immune Support, to maintain and strengthen your immune system. Vitamin C is well known to protect your body and support good health, vitamin D facilitates immune system function and improves your daily
0:30
defense, and zinc, which is the second most abundant trace mineral in your body, supports your immune cell health and function. Hydration Multiplier Plus Immune Support is a cutting-edge blend of these nutrients, vitamin C, vitamin D, zinc, and Wellmune, in a convenient single-serve pack. Wellmune is a naturally sourced beta glucan that's proven to help strengthen your immune system. Each packet is bursting with a fresh, natural tangerine
0:59
flavor. It's delicious, and this blend is powered by their Cellular Transport Technology, which is designed to enhance the rapid absorption of water and other nutrients. And Liquid IV is an awesome company, because with each purchase Liquid IV donates a serving of Liquid IV to someone in need. Liquid IV is donating 3.7 million servings in response to COVID-19. The products are being donated to hospitals, first responders, food banks, veterans, and active military,
1:29
and the company has donated over 6.7 million servings globally. Liquid IV's new Hydration Multiplier Plus Immune Support is available at Walmart, or order online and get 25% off when you go to liquidiv.com and use the code JOEROGAN at checkout. That's JOEROGAN, all one word, no space. That's 25% off anything you order when you use the promo code JOEROGAN at liquidiv.com. Get better hydration today at
2:00
liquidiv.com, promo code JOEROGAN. We are also brought to you by Joovv. Joovv is a new cutting-edge technology that I've been using for the last few months. It blasts healing light all over my body. I use it nearly every day after workouts and I love it. I like it because, first of all, it makes me feel good, and there's a lot of science behind it. Initially I was kind of skeptical about red light therapy, but once I dug into the research
2:29
and saw there are hundreds of peer-reviewed research studies reporting all kinds of benefits, like improved performance, enhanced recovery, better skin health, sleep optimization, reduced inflammation, improved blood circulation, and improved mental acuity, all from red light therapy from Joovv, it's pretty amazing how red light stimulates the cells in your body to produce more energy, which gives your body the fuel that it needs to function properly. The science is
3:00
rock solid, and the reason I chose Joovv is they make the best devices on the market. All their devices are third-party tested for safety, quality, and performance. They also make several different sizes, from full-body to handheld, and I personally use the full-body Joovv Elite because I want to blast as much healthy light at my body in a single session as possible. It's easy to use. I just step in front of it, I put on my AirPods, and I listen to music.
3:29
And it feels great. The new Joovv Go is also a great option because it's super affordable, it's battery powered, and you can take it anywhere you go. It fits in the palm of your hand and it's just as powerful as the larger devices, making spot treatments very easy and effective. The Joovv products are so badass that they've been used by the San Francisco 49ers. They just announced that Joovv is their official light therapy provider, and they use the devices in their clubhouse. So if you're interested, for a limited time Joovv
4:00
wants to hook you up with an exclusive discount on your first order. Just go to joovv.com/joe and apply the code JOE to your qualifying order. That's J-O-O-V-V, joovv.com/joe. Exclusions apply, limited time only. We're also brought to you by Vuori. Vuori is a new perspective on performance apparel, perfect if you're sick and tired of traditional old workout gear.
4:29
Everything is designed to work out in, but it doesn't look or feel like it. It's very comfortable, so comfortable you'll want to wear it all the time. The products are incredibly versatile. They can be used for just about any activity, like running, training, and yoga, but they're also great for just lounging around or weekend errands. Vuori means mountain, but to us it represents the view from the summit, the expansive clarity it can provide, and the awe-inspiring experience that it brings. Vuori's thriving active community in Encinitas, California
4:59
served as the inspiration for the brand, and the idea is to build activewear that doesn't look like activewear. It's very comfortable stuff. They sent me a bunch of it and I love it. It's designed to look great in everyday life outside the gym, and it's perfect for any workout. Very easy to put on, very easy to wear. They have a bunch of different products. They have men's Core Shorts, which are the most comfortable lined athletic shorts you're ever going to wear. One short, every sport. Men's Ponto Shorts, the perfect lounge
5:29
or work-from-home shorts. They have women's Performance Joggers, the softest joggers you're ever going to own, and women's Daily Leggings. No mistake in the name, you will wear them daily, featuring a high-waist drawstring tie and an upgraded no-slip fit. Vuori is an investment in your happiness, and for our listeners they're offering 20% off your first purchase. Get yourself some of the most comfortable and versatile clothing on the planet at vuori.com/rogan. That's
5:59
V-U-O-R-I.com/rogan. Not only will you receive 20% off your first purchase, but you'll also enjoy free shipping on any US orders over 75 bucks, and free returns. Go to vuori.com/rogan and discover the versatility of Vuori clothing. We're also brought to you by my all-time favorite toothbrush, Quip. You've probably heard me talk about Quip a million times before. It's a
6:29
beautifully designed, sleek toothbrush, but this is something brand new that rewards you and your mouth. When was the last time you got rewarded for brushing your teeth? Well, with Quip's new smart electric toothbrush, good habits can actually earn you great perks, like free products, gift cards, and more. The Quip smart brushes for adults and kids connect to the Quip app with Bluetooth, and you can track when and how well you brush. You can get tips and
6:59
coaching to improve your habits, and you can earn points for daily brushing and bonus points for completing challenges like streaks, redeemable for rewards like free products, gift cards, and discounts from Quip and partners. If you already have a Quip, upgrade it with a smart motor and keep the features that you know and love: sensitive sonic vibrations with a two-minute timer and 30-second pulses for a guided clean that makes sure you get an even clean on all
7:29
sides of your mouth. It's slim, sleek, and lightweight, with no wires and no bulky chargers to weigh you down, and you have a multi-use travel cover that doubles as a mirror mount for less clutter. Beyond the brush, Quip has everything you need to build a complete routine: mint or watermelon toothpaste with anti-cavity ingredients for strong, healthy teeth, floss that expands to clean and comes in a refillable dispenser to reduce waste, and an eco-friendly
7:59
solar battery charger to power your Quip with sunshine. Plus, you can get brush head, toothpaste, and floss refills delivered from five dollars, and shipping is free. How smart is that? Join over 5 million mouths who use Quip, and save hundreds compared to other Bluetooth brushes when you get a Quip smart brush for just $45. And right now you can start getting rewards for brushing your teeth today. Go to getquip.com/rogan,
8:29
that's getquip.com/rogan, spelled G-E-T-Q-U-I-P.com/rogan. Go there right now to get your first refill for free. Quip: better oral health made simple and rewarding. All right. My guest today is an American computer scientist and businessperson, and he's the president and co-founder of the Center for Humane Technology. You may have seen him
8:59
in the recent Netflix documentary The Social Dilemma, which I cannot recommend enough, and I really enjoyed talking to him. Please welcome Tristan Harris.
9:12
The Joe Rogan
9:13
Experience. Train by day, Joe Rogan podcast by night.
9:20
How are you? Good, good to be here. Good to have you here, man. You were just telling me before we went on air the numbers for The Social Dilemma, and they're bonkers. What did you say they were?
9:31
Yep, The Social Dilemma was seen by 38 million households in the first 28 days on Netflix, which I think has broken records. And if you assume, you know, a lot of people are seeing it with their family, because parents are seeing it with their kids given the issues around teen mental health, so if you assume
9:49
one out of ten families watched with a few family members, we're in the 40 to 50 million people range, which has just broken records, I think, for Netflix. I think it was the second most popular documentary, and I think a top film, throughout the month of September.
10:01
It's a really well done documentary, but I think it's one of those documentaries that affirmed a lot of people's worst suspicions about the dangers of social media, and then on top of that it sort of alerted them to what they were already experiencing
10:19
in their own personal life, and highlighted it.
10:21
Yeah, I think that's right. I mean, most people were aware, because I think everyone's been feeling it: the feeling you have when you use social media isn't that this thing is just a tool or that it's on my side. It is an environment based on manipulation, as we say in the film, and that's really what's changed. You know, I've been working on these issues for something like eight years now.
10:45
Please tell people who didn't see the documentary, what is your background and
10:49
how did you get into it? Yes, so, you know, the film goes back to a set of technology insiders. My background was as a design ethicist at Google. So I first had a startup company that we sold to Google, and I landed there through a talent acquisition, and then, about a year into being at Google, I made a presentation that was about how essentially technology was holding the human collective
11:19
psyche in its hands, that we were really controlling the world's psychology, because every single time people look at their phone, they are basically experiencing thoughts and scrolling through feeds and believing things about the world. This is becoming the primary meaning-making machine for the world, and we as Google had a moral responsibility to, you know, hold the collective psyche in a thoughtful, ethical way, and not create this sort of race to the bottom of the brain stem attention economy that we now have. So my background: as
11:49
a kid I was a magician. Then I studied at a lab at Stanford, or studied in a class, called the Stanford Persuasive Technology class, that taught a lot of the engineers in Silicon Valley kind of how the mind works, and the co-founders of Instagram were there. And then later I studied behavioral economics and how the mind is sort of influenced, went into cults and started studying how cults work, and then arrived at Google through this lens of, you know, technology isn't really just this thing that's in our hands. It's more
12:19
like this manipulative environment that is tapping into our weaknesses, everything from the slot machine rewards to the way you get tagged in a photo and it sort of manipulates your social validation and approval, these kinds of things.
12:31
When you were at Google, did they still have the "don't be evil" sign
12:35
up? I don't know if there was actually a physical sign.
12:38
There was never a physical sign? I thought there was something that they actually had. I think
12:42
it was... there's this guy, was it Buchheit? Paul Buchheit was his last name. He was the inventor of Gmail, and they had a meeting and they came
12:49
up with this mantra because they realized the power that they had, and they realized that there was going to be a conflict of interest between advertising on the search results and regular search results. And so they knew that they could have used that power, and they came up with this mantra, I think in that meeting in the early days: don't be
13:05
evil. There was a time when they took that mantra down, and I remember reading about it online, and
13:11
they took it off their page, I think that's what it was, yeah.
13:14
And when I read that, I was like, that should be big news.
13:19
Like, there's no reason to take that down. Why would you take that down? Yeah, why would you say, well, maybe we'll be a little evil? It looks crazy.
13:28
It's a good question. I mean, I wonder what logic would have you remove a statement like
13:32
that. That seems like a standard statement, like, it's a great statement. Okay, here it is: "Google Removes 'Don't Be Evil' Clause From Its Code of Conduct,"
13:40
in 2018. Yeah.
13:42
I wonder why. Did they have an explanation? Is there anything underneath there?
13:49
"Don't be evil" has been a part of the company's corporate code of conduct since 2000. When Google was reorganized under a new parent company, Alphabet, in 2015, Alphabet assumed a slightly adjusted version of the motto: "Do the right thing." Oh, that's a Spike Lee movie, bitch. However, Google retained its original "don't be evil" language until the past several weeks. The phrase has been deeply incorporated into Google's company culture, so much so that a version of the phrase has served as the
14:19
Wi-Fi password on the shuttles that Google uses to ferry its employees to its Mountain View headquarters. I think I
14:25
remember that. You'd be on the bus and you'd type in "don't be evil." I
14:29
wonder why they decided
14:32
Well, I mean, they did change it to "do the right thing." I mean, we always used to say that, just to friends, not within Google, but just, you know, instead of saying "don't be evil," to say, let's do some good here,
14:41
right? That's nice. "Let's do some good here." Yeah, think positive, think doing good instead of don't do bad.
14:49
Yeah, but the
14:50
problem is, when you say do good, the question is, whose good? Because you live in a morally plural society, and there's this question of who are you to say what's good for people. It's much easier to say let's reduce harms than it is to say let's actually do good.
15:01
Like this. It says the updated version of Google's code of conduct still retains one reference to the company's unofficial motto. The final line of the document is still, "And remember... don't be evil, and if you see something that you think isn't right, speak up."
15:18
Mmm, okay. Well, they still have "don't be evil," so maybe it's much ado about nothing. But having that kind of power... Just before the podcast, we were watching Jack Dorsey speak to members of the Senate in regards to Twitter censoring the Hunter Biden story, and censorship of conservatives while allowing dictators to spread propaganda, dictators from other countries, and why, and what this is all about. One of the things that
15:48
Jack Dorsey has been pretty adamant about is that they really never saw this coming when they started Twitter. Yeah, and they didn't think that they were ever going to be in this position, where they were going to be really the arbiters of free speech for the world, right? Which is essentially, in some ways, what they
16:05
are. I think it's important to roll back the clock for people, because it's easy to think, you know, that we just sort of landed here, and that they would know that they were going to be influencing the global psychology. But I think we should really
16:18
reverse-engineer for the audience how these products came to work the way that they do. So let's go back to the beginning of Twitter. I think Jack's first tweet was something like "checking out the buffaloes in Golden Gate Park" in San Francisco. You know, Jack was fascinated by the taxicab dispatch system, that you could send a message and then all the taxis get it, and the idea was, could we create a dispatch system so that I post a tweet and then suddenly all these other people can see it? And the real genius of these things
16:48
was that they weren't just offering this thing you could do; they found ways of keeping people engaged. I think this is important for people to get: they're not competing for your data or for money, they're competing to keep people using the product. And so when Twitter, for example, invented this persuasive feature of the number of followers that you have, remember, that was a new thing at that time, right? You log in and you see your profile, here's the people who you can follow, and here's the number of followers you have. That
17:17
created a reason for you to come back every day, to see how many followers do I have. So that was part of this race to keep people engaged. As we talk about in the film, these things are competing for your attention: if you're not paying for the product, you are the product. But the thing that is the product is your predictable behavior. You're using the product in predictable ways. And I remember a conversation I had with someone at Facebook, who's a friend of mine, who said in a coffee shop one day, people think that
17:48
we, Facebook, are competing with something like Twitter, that one social network is competing with another social network. But really, he said, our biggest competitor is YouTube, because we're not competing for social networks, we're competing for attention, and YouTube is the biggest competitor in the digital space for attention. And that was a real light bulb moment for me, because you realize that as they're designing these products, they're finding new, clever ways to get your attention. That's the real thing that I think is different in the film The Social Dilemma. Rather than talking
18:17
about, you know, censorship and data and privacy and these themes, it's really: what is the core influence or impact that the shape of these products has on how we're making meaning of the world, when they're steering our psychology?
18:30
Do you think it was inevitable that someone manipulates the way people use these things to gather more attention? And do you think that any of this could have been avoided if there were laws against that? If, instead of having these algorithms that specifically
18:48
target things that you're interested in, or things that you click on, or things that are going to make you engage more, they just allowed these things to... if someone said, listen, you can have these things, you can allow people to communicate with each other, but you can't manipulate their attention
19:04
span. Yeah, I think... so we've always had an attention economy, right? And you're competing for it right now, and politicians compete for it. Can you vote for someone you've never paid attention to, never heard about, never heard them say something, you know, outrageous? No.
19:18
So there's always been an attention economy, and so it's hard to say we should regulate who gets attention or how.
19:24
But it's organic in some ways, right? Like, this podcast is organic. I mean, if we're in competition, it's organic. I just put it out there, and either you watch it or you don't. You know, I don't have any say over it, and I'm not manipulating it in any
19:39
way. Sort of. So, I mean, let's imagine that the podcast apps were different, and while you're watching they actually had, like, the hearts
19:48
and the stars and the kind of voting up and numbers, and you could send messages back and forth, and Apple Podcasts worked in a way that didn't just reward, you know, the things that you clicked follow on; it actually sort of promoted the stuff where someone said the most outrageous thing. Then you as a podcast creator have an incentive to say the most outrageous thing, and then you arrive at the top of the Apple Podcasts or Spotify charts. And that's the thing: we actually are competing for attention. It felt like it was neutral, and it was relatively neutral. And to
20:17
progress that story back in time, with, you know, Twitter competing for attention, let's look at some other things that they did. So they also added the retweet, this instant resharing feature, right? And that made it more addictive, because suddenly we're all playing the fame lottery, right? Like, I could retweet your stuff and then you get a bunch of hits, and then you could go viral and you could get a lot of attention. So then, instead of the companies competing for attention, now each of us is suddenly winning the fame lottery over and over and over again, and we're getting attention. And then another example... I was going to think about it, I forgot it.
20:48
What was it?
20:50
You can have it if you want. So, Apple has an interesting way of handling the algorithm for their podcast app. Is it secret? It's kind of... it's weird. But one of the things it favors is new shows, and it favors engagement and new subscribers. So comments, engagement, and new shows. There you go.
21:16
Like, that's the same as
21:18
competing for attention, because engagement must mean people like it. And there's going to be a fallacy as we go down that road, but go on.
21:24
Well, it's interesting, because you could say, if you have a podcast and your podcast gets, like, 100,000 downloads, a new podcast can come along and it can get 10,000 downloads, and it will be ahead of you in the rankings. And so you could be number three and it could be number two, and you're like, well, how is that number two when it's got ten times less? But they don't do it that way, and their
21:48
logic is they don't want the podcast world to be dominated by, you know, the New York Times, the big ones, yeah, and whatever is number one and number two and number three
21:56
forever. We actually just experienced this. We have a podcast called Your Undivided Attention, and since the film came out, in that first month we went from being, you know, in the lower hundreds or something like that to... we shot to the top five. I think we were the number one tech podcast for a while. And so we just experienced this, through the fact not that we had the most listeners, but that because the trend was upward, we sort of jumped
22:18
to the top. I
22:18
think it's wise that they do that, because eventually it evens out over time. You know, you see some people rocket to the top, like, oh my God, we're number three, and you're like, hang on there, fella, just give it a couple of weeks. And then three weeks later, four weeks later, now they're number 48, right? They get depressed, right? But that was really where you should have been. But the thing that Apple does that I really like in that is it gives an opportunity for these new shows to be seen, where they might have
22:48
otherwise gotten stuck, because the rankings and the ratings for a lot of these shows... the shows are so consistent and they have such a following already, yeah, it's very difficult for these new shows to gather attention. Right. And the problem was that there were some people that gamed the system, and there were companies that could literally move... like Earl Skakel. Remember, Earl became the number one podcast and, like, no one was listening to it. Earl has money and he hired
23:18
some people to game the system, and he was kind of, like, open about it and laughing about it. Now, isn't he banned from iTunes or something? I think he got banned because of that, because it was so obvious he gamed the system, and, like, a thousand downloads and he's number one. I mean, the thing is that
23:36
Apple Podcasts you can think of as, like, the Federal Reserve or the government of the attention economy, because they're setting the rules by which you win, right? They could have set the rules, as you said, to be, you know, who has the
23:47
most listeners, and then you just keep rewarding the kings that already exist, versus who is the most trending. There's actually a story a friend of mine told me, I don't know if it's true, although it's a fairly credible source, who said he was in a meeting with Steve Jobs when they were making the first podcast app, and they had made a demo of something where you could see all the things your friends were listening to, so just like making a newsfeed, like we do with Facebook and Twitter, right? And then he said, well, why would we do that? If something is important
24:18
enough, your friend will actually just send you a link and say, you should listen to this. Hmm. Like, why would we automatically just promote random things that your friends are listening to? And again, this is kind of how you get back to social media. How is social media so successful? Because it's much more addictive to see what your friends are doing in a feed. But it doesn't reward what's true or what's meaningful, and this is the thing that people need to get about social media: it's really just rewarding the things that tend to keep people coming back addictively. The business model is addiction, in
24:47
a race to the bottom of the brain stem for
24:49
attention. Well, it seems like, hindsight is 20/20, what should have been done, or what could have been done had we known where this would play out, is that they could have said, you can't do that. You can't manipulate these algorithms to make sure that people pay more attention, and manipulate them to ensure that people become deeply addicted to these platforms. What you can do is just let them compete openly,
25:18
right, but it has to be organic.
25:20
And then the problem is... so here, this is the thing I was going to say about Twitter. When one company does, call it the engagement feed, meaning showing you the things that the most people are clicking on and retweeting, trending, things like that... let's imagine there's two feeds. So there's the feed that's called the reverse chronological feed, meaning showing things in time order: you know, Joe Rogan posted this two hours ago, and after that you have the thing that people posted an hour and a half ago, all the way up
25:47
to ten seconds ago. That's the reverse chronological feed. They have a mode like that on Twitter: if you click the sparkle icon, if you know about this, it'll show you, just in time, here's what people said, you know, sorted by recency. But then they have this other feed: what the people you follow click on, retweet, etc. the most, and it sorts by what it thinks you'll click on and want the most. Which one of those is more successful at getting your attention: the recency one, what they posted recently, versus what they know people are clicking on and retweeting the most?
26:17
Certainly
26:18
what they know people are clicking on and retweeting the most.
26:20
Correct. And so once Twitter does that, let's say Facebook was sitting there with the recency feed, just showing you, here are the people who posted in this time-order sequence. They have to also switch to the most relevant stuff, right? The most clicked, the most retweeted. So this is part of this race for attention: once one actor does something like that, and they algorithmically, you know, figure out what's most popular, the other companies have to follow, because otherwise they won't get
26:47
the attention. So it's the same thing. If, you know, Netflix adds the autoplay five-four-three-two-one countdown to get people to watch the next episode, and if that works at, say, increasing Netflix's watch time by 5%, YouTube sits there and says, we just shrunk how much time people were watching YouTube, because now they're watching Netflix. So we're going to add a five-four-three-two-one autoplay countdown. And it becomes, again, this game-theoretic race of who's going to do more. Now, if you open up TikTok... TikTok doesn't even wait. I don't know if your kids use TikTok, but when you open up the
27:18
app, it doesn't even wait for you to click on something. It actually plays the first video the second you open it, which none of the other apps do, right? And the point of that is that it causes you to enter into this engagement stream even faster. So this, again, this race for attention produces things that are not good for society. And even if you took the whack-a-mole stick, you took the antitrust case and you whacked Facebook and you got rid of Facebook, or you whacked Google or YouTube, you're just going to have more actors flooding in, doing the same thing. And one other example of this is
27:49
the time it takes to reach, let's say, 10 million followers. So if you remember, back in the day it was Ashton Kutcher who raced for the first million followers, a race with CNN, right? So now, if you think of it, the companies are competing for our attention, and if they find out that each of us becoming a celebrity, having a million people we get to reach, if that's the currency, the thing that gets us to come back, to get more attention, then they're competing at who can give us that bigger fame lottery hit faster.
28:17
Let's say 2009 or 2010, when Ashton Kutcher did that, it took him... I don't know how long it took. Months,
28:23
for him to get to a million? I don't
28:25
remember. It was a little bit, though, right? And then TikTok comes along and says, hey, we want to give kids the ability to hit the fame lottery and make it big, hit the jackpot even faster. We want you to be able to go from zero to a million followers in ten days, right? And so they're competing to make that shorter and shorter and shorter. And I know about this because, you know, speaking from a Silicon Valley perspective, venture
28:47
capitalists fund these new social platforms based on how fast they can get to, like, a hundred million users. There's this famous line, I forgot what it was, but I think Facebook took like ten years to get to a hundred million users; Instagram took, you know, I don't know, four years, three years, something like that; TikTok can get there even faster. And so it's shortening, shortening, shortening, and that's what we're competing for. It's like, who can win the fame lottery faster? But is a world where everyone broadcasts to millions of people, without the responsibilities of publishers,
29:17
etc., does that produce an information environment that's healthy? And obviously the film The Social Dilemma is really about how it makes the worst of us rise to the top, right? So our hate, our outrage, our polarization, what we disagree about, black-and-white thinking, more conspiracy-oriented views of the world, QAnon, you know, Facebook groups, things like that. And we can definitely go into... there are a lot of legitimate conspiracy theories, I want to make sure I'm not categorically dismissing stuff. But that's really the point, is
29:47
that we have landed in a world where the things that we are paying attention to are not necessarily the agenda of topics that we would say, in a reflective world, are the most important.
29:59
So there's a lot of there's a lot of conversation about free will in about letting people choose whatever they choose whatever they enjoy viewing and watching and paying attention to but when you're talking about
30:17
These incredibly potent algorithms and the incredibly potent addictions that people the people develop to these these things and we're pretending the people should have the ability to just ignore it and put it away right and use your willpower. Yeah, that seems I have our kids and I have a folder on my phone called addict and it's all caps and it's at the end of my all you have to scroll through all my other apps to get to it. And so if I want to get to Twitter or
30:47
Instagram — the problem
30:48
is the app switcher, which will put it in your most recent apps. So once you switch apps and you have Twitter in your recents, it'll be right there.
30:54
Right — so if I want to go back, it's right there. Yeah. It's insanely addictive, and if you can control yourself it's not that big a deal — but how many people can control themselves?
31:11
Well, I think the thing we have to hone in on is the asymmetry of power.
31:17
As I say in the film, we're bringing this ancient brain hardware, the prefrontal cortex — which is what you use for goal-directed action, self-control, willpower, holding back. You know, the marshmallow test: don't take the marshmallow now, wait for the two marshmallows later. All of that is through our prefrontal cortex. And when you're sitting there and you think, okay, I'm going to look at this one thing on Facebook, because my friend invited me to this event, or it's this one post I have to look at — the next thing you know, you find yourself
31:47
scrolling through the thing for like an hour, right? And you say, man, that was on me, I should have had more self-control. But behind the screen, behind that glass slab, is like a supercomputer pointed at your brain, predicting the perfect thing to show you next. And this is really important: when you flick your finger, you think Facebook is just going to show you the next thing your friend said — but it's not doing that. When you flick your finger, it actually literally wakes up this
32:17
sort of supercomputer avatar, a voodoo doll version of Joe. And every click you've ever made on Facebook is like adding a little hair to the voodoo doll; every like you've ever made adds little clothing to the voodoo doll; all the watch time on videos you've ever had adds little shoes to the voodoo doll. So the voodoo doll is getting more and more accurate the more things you click on. This is in the film The Social Dilemma — if you notice, as the character is using this thing,
32:47
it builds a more and more accurate model that the AIs — the three AIs behind the screen — are kind of manipulating. And the idea is it can actually predict and prick the voodoo doll with this video, or that post from your friend, or this other thing, and it'll figure out the right thing to show you that it knows will keep you there — because it's already seen how that same video, that same post, has kept 200 million other voodoo dolls there, because you just look like another voodoo doll. So here's an example.
33:12
And this works the same on all the platforms. If you were a teen girl and you opened a dieting video on YouTube — 70% of YouTube's watch time comes from the recommendations on the right-hand side, right, the recommended videos — what did it show the girls who watched the teen dieting video? It showed anorexia videos, because those were better at keeping the teen girls' attention. Not because it said these are good for them, these are
33:42
helpful for them. It just says these tend to work at keeping their attention. So again, these
33:47
tend to work if you are already watching diet videos.
33:50
Yeah. So if you're a 13-year-old girl and you watch the diet video, YouTube wakes up its voodoo doll version of that girl and says, hey, I've got like a hundred million other voodoo dolls of 13-year-old girls, and they all tend to watch these other videos. It doesn't know what they are; it just knows that they have this word, "thinspo" — thinspiration is the name for it, inspiration for anorexia. Yeah, it's a real thing. YouTube addressed this problem a couple of years ago, but when you let
34:12
the machine run blind, all it's doing is picking stuff that's engaging.
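The values-blind loop described here — rank candidates purely by predicted engagement, with no notion of content quality or harm — can be sketched in a few lines of Python. Everything below (the data, the video names, the scoring) is a hypothetical illustration of the dynamic, not any platform's actual code:

```python
# Hedged sketch: a "values-blind" recommender that ranks candidate
# videos only by how long similar users kept watching them.
# All data and names are invented for illustration.

# watch_log[video_id] -> watch times (minutes) from users who arrived
# at that video after watching a dieting video.
watch_log = {
    "healthy_recipes": [2, 3, 1, 2],
    "extreme_diet_tips": [9, 11, 8],
    "thinspiration": [25, 30, 28],  # most "engaging" is also most harmful here
}

def rank_by_engagement(log):
    """Order candidates by mean watch time -- no quality signal at all."""
    mean = {vid: sum(times) / len(times) for vid, times in log.items()}
    return sorted(mean, key=mean.get, reverse=True)

recommendations = rank_by_engagement(watch_log)
print(recommendations)  # the longest-watched video is ranked first
```

The point of the sketch is that nothing in the objective function distinguishes "engaging" from "good for the viewer": whatever maximizes watch time rises to the top.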
34:16
Why did they choose to not let the machine run blind with one thing like
34:21
anorexia? Well, so now we're getting into the Twitter censorship conversation, the moderation conversation. This is why I don't focus on censorship and moderation: the real issue is, if you blur your eyes and zoom way out, how does the whole machine tend to operate? Like, no matter what I start with, what is it going to recommend next? So, you know, if you start
34:42
with, you know, a World War II video, YouTube would recommend a bunch of Holocaust denial videos, right? If you started teen girls with a dieting video, it would recommend these anorexia videos. In Facebook's case — there are so many different examples here — Facebook recommends groups to people based on what it thinks is most engaging for you. So if you were a new mom: you had Renée DiResta, my friend, on this podcast — we've done a bunch of work together — and she has this great example. As a new mom, she joined one
35:12
group for mothers who do do-it-yourself baby food, like organic baby food. And then Facebook has this sidebar: here are some other groups you might want to join. And what do you think was the most engaging of those? Because Facebook, again, is picking whichever group, if it got you to join it, would cause you to spend the most time there, right? So for the do-it-yourself baby food moms, which group do you think it
35:36
selected? Probably something about vaccines.
35:38
Exactly — anti-vaccines for moms. Yeah. Okay. So then if you join
35:42
that group, it runs the same process again. So now Facebook says, hey, I've got these voodoo dolls — like a hundred million voodoo dolls — and they just joined this anti-vaccine moms group. Then what do they tend to engage with for a very long time if I get them to join these other groups? Which other groups would show up?
36:01
Chemtrails. Oh, look — Pizzagate, Flat Earth. Flat Earth, absolutely. Yep. And — I'm interchangeably going from YouTube to Facebook because it's the same dynamic; they're competing for attention — YouTube recommended flat Earth conspiracy theories hundreds of millions of times. So when you're a parent during COVID and you sit your kids in front of YouTube, you're like, this is the digital pacifier, I've got to let them do their thing, I've got to do work, right? And then you come back to the dinner table
36:30
and your kid says, you know, the Holocaust didn't happen and the Earth is flat. And people are wondering why — it's because of this. And now, to your point about the moderation thing: we can take the whack-a-mole stick after the public yells — Renée and I make a bunch of noise, and there's a large community of people making noise about this — and they'll say, okay, shoot, you're right, Flat Earth, we've got to deal with that, and they'll tweak the algorithm. Then people make a bunch of noise about the thinspiration videos for anorexia, and they'll deal with that problem.
37:00
But they're doing it reactively. And again, if you zoom out, it's still just recommending stuff from the Crazy Town section. The problem is the recommendation system itself.
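The group-to-group dynamic described above — recommend whichever groups the members of your current group went on to spend the most time in — is essentially an item-to-item co-engagement lookup. A hypothetical sketch (invented data and group names, not Facebook's actual system):

```python
# Hedged sketch of item-to-item co-engagement: recommend the groups in
# which members of the current group historically spent the most time.
# All data and names are invented for illustration.

# time_spent[(joined_group, other_group)] -> total hours members of
# joined_group later spent in other_group.
time_spent = {
    ("diy_baby_food", "organic_recipes"): 120.0,
    ("diy_baby_food", "anti_vaccine_moms"): 900.0,  # wins on raw engagement
    ("anti_vaccine_moms", "chemtrails"): 700.0,
    ("anti_vaccine_moms", "flat_earth"): 650.0,
}

def recommend(group, k=2):
    """Top-k co-engaged groups for `group`, ranked by total time spent."""
    scores = {other: hrs for (g, other), hrs in time_spent.items() if g == group}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Each recommendation step feeds the next: whatever maximizes time spent
# is surfaced first, so the rabbit hole deepens with every join.
print(recommend("diy_baby_food"))
print(recommend("anti_vaccine_moms"))
```

Run on this toy data, the most extreme group wins purely because it holds people longest, and joining it triggers the next round of even more extreme suggestions — the ratchet the conversation is describing.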
37:10
That's why I don't mind that people have ridiculous ideas about Hollow Earth, because I think it's humorous. But I'm also a 53-year-old man, right? I'm not a 12-year-old boy with a limited education going, oh my God, the government is lying to us, there's lizard people that live under the Earth, right?
37:30
But that's the real argument about these conspiracy theories: they can influence young people, or the easily impressionable, or people that maybe don't have a sophisticated sense of vetting out
37:41
bullshit. Right. Well, the algorithms aren't making a distinction between who is just laughing at it and who's deeply vulnerable to it. Generally, it just finds whoever's vulnerable. Because another way I think about this is: if you're driving down the highway, and there's Facebook and Google trying to figure out, what should I give you based on
38:00
what tends to keep your attention? If you look at a car crash — and everybody driving the highway looks at the car crash — according to Facebook and Google, it's like the whole world wants car crashes, so we just feed them car crash after car crash after car crash. And what the algorithms do, as Guillaume Chaslot — the whistleblower from the YouTube recommendation system — says in the film, is find the perfect little rabbit hole for you that it knows will keep you there for five hours. And the conspiracy-theory-like dark corners of YouTube were the dark corners that tend to keep people there for five
38:30
hours. And so you have to realize that we're now something like 10 years into this vast psychology experiment, in which, in hundreds of countries and hundreds of languages, it's been steering people towards Crazy Town. When I say Crazy Town: imagine there's a spectrum on YouTube. On one side you have the calm Walter Cronkite, Carl Sagan, slow, kind of boring but educational material or something; on the other side of the spectrum you have
39:00
the craziest stuff you can find — Crazy Town. No matter where you start — you could start in Walter Cronkite or you could start in Crazy Town — if I'm YouTube and I want you to watch more, am I going to steer you towards the calm stuff, or more towards Crazy Town? Always more towards Crazy Town. So then you imagine just tilting the floor of humanity by like three degrees, right? And then you step back and you let society run its course. As Jaron Lanier says in the film, if you just tilt society by one degree,
39:30
Two degrees. That's the
39:31
whole world. That's what everyone is thinking and
39:33
believing. And so if you look at the degree to which people are deep into rabbit-hole conspiracy thinking right now — and again, I want to acknowledge COINTELPRO, Operation Mockingbird; there's a lot of real stuff, right, so I'm not categorically dismissing it — we're asking: what is the basis upon which we're believing the things we believe about the world? And increasingly, that's based on technology. And we can get into, you know, what's going on in Portland. Well, the only
40:00
way I know is that I'm looking at my social media feed, and according to that, it looks like the entire city's on fire and it's a war zone. But I called a friend there the other day, and he said, it's a beautiful day — there's actually no violence anywhere near where I am, it's just these two blocks or something like that. And this is the thing: it's warping our view of reality. And I think that's what The Social Dilemma was really trying to accomplish as a film — and I know the director, Jeff Orlowski, was trying to accomplish — is: how did society go crazy everywhere all at
40:30
once, seemingly all at once? This didn't happen by accident; it happened by design of this business model.
40:35
When did the business model get implemented? Like, when did they start using these algorithms to recommend things? Because initially YouTube was just a series of videos; it didn't have that recommended section. When was that?
40:47
You know, it's a good question. I mean, originally YouTube was just: post a video, and you can get people to go to that URL and send it around. They needed to figure
41:00
out, once the competition for attention got more intense, how am I going to keep you there? And recommending those videos on the right-hand side — I think that was there pretty early. I remember, actually, because that was some of the innovation: keeping people within this YouTube wormhole. And once people were in the YouTube wormhole, constantly seeing videos, they could offer the promise to a new video uploader: hey, if you post it here, you're going to get way more views than if you post it on Vimeo, right? And that's the thing — if I open up
41:30
TikTok right now on my
41:31
phone. You have TikTok on your phone?
41:33
Well, I'm not supposed to, obviously, but it's more for research purposes.
41:37
Research. Do you use TikTok at all? No, but my 12-year-old's obsessed. Oh, really? Oh, yeah. She can't even sit around — if she's standing still for five minutes, she just
41:47
starts, like —
41:49
she starts TikToking. Here it is: in 2012 — so the Mayans were right — the platform announced an update to its discovery system,
42:00
designed to identify the videos people actually want to watch, by prioritizing videos that hold attention throughout, as well as increasing the amount of time a user spends on the platform overall. YouTube could assure advertisers that it was providing a valuable, high-quality experience for people. Yep. So that's the beginning of the end.
42:18
Yep. So that's 2012 on YouTube's timeline. And in the Twitter and Facebook world, I think they introduced the retweet and share buttons in the 2009-to-2010
42:30
time period. So you end up with this world where the things that we're most paying attention to are based on algorithms choosing for us. And the deeper argument in the film — that I'm not sure everyone picks up on — is that these technology systems have taken control of human choice. They've taken control of humanity, because they're controlling the information that all of us are getting. Think about every election: I think of Facebook as kind of a voting machine, but it's a sort of indirect
43:00
voting machine, because it controls the information that your entire society is getting for four years, and then everyone votes based on that information. Now you could say, well, hold on, radio and television were there and were partisan before that. But actually, radio and TV are often getting their news stories from Twitter, and Twitter is recommending things based on these algorithms. So when you control the information that an entire population is getting, you're controlling their choices. I mean, literally, in military theory, if I want to screw up your military, I want to control the information
43:30
that it's getting; I want to confuse the enemy. And that information funnel is the very thing that's been corrupted. It's like the Flint water supply for our
43:38
minds. I was talking to a friend yesterday, and she was saying there are articles written about negative tweets that random people make about a celebrity doing this or that. She was quoting this article, like, look how crazy this is — this is a whole article written about
44:00
someone who decided to say something negative about some celebrity, and then it becomes this huge article, and the tweets are prominently featured, right, and then the responses to those. It's completely arbitrary, and it looks weird
44:14
because it's a values-blind system that just cares about what will get
44:17
attention. Exactly. And that's what the article was. It was just an attention
44:20
grab. It's interesting, because Prince Harry and Meghan have become very interested in these issues and are actively working on them, and I'm getting to know them just a little bit.
44:29
Are they really?
44:30
Yeah — well, because it affects them personally.
44:32
Well, it's actually interesting. I don't want to speak for them, but I think Meghan has been the target of some of the most vitriolic, hate-oriented stuff on the planet, just from the amount of criticism and scrutiny that they get. Really? Yeah. I mean, her newsfeed is filled with hate about just what she looks like, what she says — constantly.
44:51
I'm out of the loop — I've never seen anything. She's pretty; what do they think she looks
44:55
like? I honestly don't follow it myself, because I try not to fall into these attention traps. But
45:00
famous people just face the worst of it. I mean, this is the thing with teen bullying, right? I think they work on these issues because teenagers are now getting a micro version of this, where each of us is scrutinized. Think about what celebrity status does, how it screws up humans in general, right? Take an average celebrity: it warps your mind, it warps your psychology — the scrutiny you get when you're suddenly followed. And project forward a few years: each of us will have tens of thousands
45:30
to hundreds of thousands of people following what we say. That's a lot of feedback. And as Jonathan Haidt says in the film — I know you've had him here — it's made kids much more cautious, and less risk-taking, and more bullied overall, and there are huge problems in mental health around
45:47
this. Yeah, it's really bad for young girls especially, right? And I've had quite a few celebrities in here and we've discussed it. I just tell them: you can't read that stuff. Just don't read it. Yeah, like
46:00
there's no good in it. Like, I had a friend — she's a comedian, she did a show — and she was talking about this one negative comment that was inaccurate. It said she only did a half an hour and her show sucks. She's like, fuck her. And I go, why are you reading that? She's like, because it's mostly positive. I go, but we're not talking about most of it, we're talking about this one person — one negative person. We're both laughing about it; she's healthy, she's not completely fucked up by it. But this one person got into her head.
46:30
Like I'm telling you, the juice is not worth the squeeze — don't read those
46:34
things. But this is exactly right, and this is based on how our minds work. Our minds literally have something called negativity bias: if you have a hundred comments, and 99 are positive and one is negative, where does the average human's mind go? Right — to the negative.
46:48
and it also goes to the negative. Even when you shut down the
46:51
screen, your mind is sitting there looping on that negative comment. And why? Because evolutionarily, it's really important that we pay attention to social approval — negative social approval — because our reputation
47:00
is at stake in the tribe. Yeah, so it matters. Yes, but it's never been easier for not just that one comment to gain more airtime, but for it to build into a hate mob — and then you see the interconnected clicks, and you can go in and see ten other people that responded to it. And especially when you have teenagers exposed to this, and you can keep going down the tree and see the whole hate fest on you — this is the psychological environment that is the default way kids are growing up now.
47:27
Yeah, I actually
47:28
faced this recently with the film
47:30
itself. Actually, the film has gotten just crazy positive acclaim for the most part, and there are just a few negative comments. And even for myself, right — I was
47:42
glued to a few negative
47:43
comments. And then you could click, and you would see other people that you know who positively liked or responded to those comments — like, why did that person say that negative thing? I thought we were friends. That whole psychology — and we're all vulnerable to it. Yes. You learn, as you said, to tell your celebrity friends: just don't pay attention.
48:00
Even mild stuff — I've seen people fixate on even a mild disagreement or mild criticism. And it's also a problem because you realize someone is saying this, and you're not there, and you can't defend yourself. So you have this feeling of helplessness — like, hey, that's not true, I didn't — and you don't get it out of your system. You never get to express
48:23
it. And people can share that false negative stuff — I mean, not all negative stuff is false, but you can assert things
48:30
and build on the hate fest, right, and start going crazy and saying this person's a white supremacist, or this person's even worse, and that'll spread to thousands and thousands of people. Next thing you know, you check your feed again at 8:00 p.m. that night and your whole reputation has been destroyed. Yes. And you don't even know what happened to you. Well, and it's happened to teenagers too — they're anxious. A teenager will post a photo at their high school, make a dumb comment without thinking about it, and the next thing they know, at the end of the day the parents are all calling, because like 300 parents saw it
49:00
and are calling up the parent of that kid. We talk to teachers a lot in our work at the Center for Humane Technology, and they will say that on Monday morning — this was before COVID — they spend the first hour of class having to clear all the drama that happened on social media over the weekend for the kids. Sure. And again, like —
49:21
And these kids are in what age group?
49:24
This is like 8th, 9th, 10th grade, that kind of thing.
49:26
And the other
49:30
thing with these kids is there's no long history of people growing up through this kind of influence and successfully navigating it. These are the pioneers.
49:41
Yeah, and they won't know anything different, which is why — as we talk about in the film — they're growing up in this environment. And one of the simplest principles of ethics is the ethics of symmetry: doing unto others as you would do to yourself. And as we say at the end of the film, one of the easiest ways you know there's a
50:00
problem here is that many of the executives at the social media tech companies don't let their own kids use social media, right? They literally say at the end of the film: we have a rule about it, we're religious about it, we don't do it. The CEO of Lunchables foods didn't let his own kids eat Lunchables.
50:17
That's when you know. If you talk to a doctor and you say, would you get this surgery for your own kid? And they say, no, I would never do that — would you trust that doctor? Right? And it's the same thing with a lawyer. So this is the relationship we have: a relationship of asymmetry, and technology is influencing all of us. And we need a system by which — when I was growing up, I grew up on the Macintosh and technology, and I was creatively doing programming projects and whatever else. The people who built the technology I was using would have their own kids use the
50:47
things that I was using, because they were creative and they were about tools and empowerment. And that's what's changed. We don't have that anymore, because the business model took over. So instead of having just tools sitting there — like hammers waiting to be used to build creative projects, or programming to invent things, or paintbrushes or whatever — we now have a manipulation-based technology environment, where everything you use has this incentive not only to addict you, but to have you play the fame lottery and get social feedback, because those are all the things that keep people's attention.
51:16
Isn't this also a problem with
51:17
these information technologies being attached to corporations that have this philosophy of unlimited growth? Yes — no matter how much they make. I applaud Apple, because I think they are the only company that takes steps to protect privacy, to block advertisements, to make sure that at least when you use their Maps application, they're not saving your data and sending it to everybody. And it's one of the reasons why Apple
51:47
Maps is really not as good as Google Maps, right? But I use it, and that's one of the reasons why. And Apple came out recently — they were doing something to block your information being sent to other places; I forget what exactly it was in the new iOS. They released a thing
52:15
that blocks the tracking identifiers. That's
52:17
Right, and it's not actually out yet — it's going to be out in January or February, I think someone told me. And what that's doing is a good example: they're putting a tax on the advertising industry, because just by saying you can't track people individually, that takes down the value of an advertisement by like 30 percent or something. Here it is — when I use Safari, I get this whole privacy report thing, right? It says in the last seven days it's prevented a hundred and twenty-five trackers from profiling me.
52:43
Yeah, and you can opt out of that if you'd like — if you're like, no, fuck that, track me.
52:47
Yep, you can do that — you can let them send your data. But that seems to me a much more ethical approach: letting you decide whether or not these companies get your information.
52:58
I mean, those things are great. The challenge is — imagine you get the privacy equation perfectly right.
53:04
Isn't Apple working on its own search engine, as its Google ties come under scrutiny? I actually recently started using DuckDuckGo.
53:11
Yep, for that very reason. They give you the information, but they don't take your data and do anything with it.
53:21
The challenge is, let's say we get all the privacy stuff perfectly right — data protection and data controls and all that — in a system that's still based on attention, and grabbing attention, and harvesting and strip-mining our brains. You still get maximum polarization, addiction, mental health problems,
53:41
teen depression and suicide, polarization, breakdown of truth. Right. So we really focus in our work on those topics, because that's the direct influence of the business model on warping society. We need to name this: we think of it like the climate change of culture. These seem like different, disconnected topics — much like with climate change you'd say, okay, we've got species loss in the Amazon, we're losing insects, we've got melting glaciers, we've got ocean acidification, we've got
54:11
the coral reefs dying. These can feel like disconnected things until you have a unified model of how emissions drive all of those different phenomena. Right. In the social fabric, we have shortening attention spans, we have more outrage-driven news media, we have more polarization, we have more breakdown of truth, we have more conspiracy-minded thinking. These seem like separate events and separate phenomena, but they're actually all part of this attention-extraction paradigm, in which
54:41
the companies' growth, as you said, depends on extracting more of our attention — which means more polarization, more extreme material, more conspiracy thinking, and shortening attention spans. As we also say: if I want to double the size of the attention economy, I want your attention, Joe, to be split into two separate streams. I want you watching the TV, the tablet, and the phone at the same time, because now I've tripled the amount of extractable attention that I can get for advertisers. Which means that by fracking for
55:11
attention, and splitting you into thinner and thinner junk attention, we can sell that as if it's real — just like the financial crisis, where you're selling thinner and thinner financial assets as if they're real, but it's really just a junk asset. Wow. And that's kind of where we are now: a sort of junk attention economy, where we can shorten attention spans, and we're debasing the substrate that makes up our society — because everything in a democracy depends on individual sense-making and meaningful choice, meaningful free will, meaningful independent
55:41
views. But if that's all basically sold to the highest bidder, that debases the soil from which independent views grow, because all of us are jacked into this sort of matrix of social media manipulation. That's degrading our democracy. And it's a sort of invisible force — it's upstream, and it affects every other thing downstream, because if we can't agree on what's true, for example, you can't solve any problem. I think that's what you talked about in your 10-minute thing on The Social Dilemma that I saw on
56:11
YouTube.
56:11
Yeah. Your organization highlights all these issues in an amazing way, and it's very important. But do you have any
56:21
solutions?
56:24
It's hard, right? So I just want to say, this is as complex a problem as climate change, in the sense that you need to change the business model. I think of it like we're on the fossil-fuel economy, and we have to switch to something beyond that. Because so long as the business models of these companies depend on extracting attention, can you expect them to do something different?
56:48
Like, you can't. But how could you? I mean, there's so much money involved,
56:53
And now they've accumulated so much wealth that they have an amazing amount of influence. Yeah, you know
57:00
And the asymmetric influence can buy lobbyists and politicians, condition Congress, and prevent things from happening. So this is why it feels like kind of the last moment. That's right. But I think we're seeing signs of real change: we have the antitrust case that was just filed against Google, and in Congress we're seeing more hearings.
57:16
What was the basis of that case?
57:18
I have to be honest — I was actually in the middle of The Social Dilemma launch when that happened,
57:22
and my home burned down in the recent fires in Santa Rosa, so I actually missed it happening. Sorry to hear that. Yeah. That was a big thing to drop. But yeah, I know, it's awful. There's so much that's been
57:33
happening the last six months. I was evacuated three times where I lived in California. Oh, really? Yeah, they were real close to our house. "Justice Department sues monopolist Google for violating antitrust laws. Department files complaint against Google to restore competition in search and search advertising markets."
57:51
Okay, so it's all about
57:51
search? It is, right. This was a case about Google using its dominant position to privilege its own search engine in its own products and beyond, which is similar to Microsoft bundling in the Internet Explorer browser. This is all good progress, but it really misses the fundamental harm: these things are warping our society, they're warping how our minds work, and there's no congressional action against that, because it's a really hard problem to solve.
58:20
The reason the film, for me, is so important is this: look at the growth rate of how fast Facebook has been recommending people into conspiracy groups and polarizing us into separate echo chambers — which we should really break down for people, the exact mechanics of how that happens. If you compare the growth rate of all those harms to how fast Congress has passed anything to do with it, it's basically not at
58:46
all. They seem a little bit unsophisticated in that regard.
58:50
I think they understand it. Yeah, but I'm trying to be
58:54
charitable. I want to be charitable, and I want to make sure I call out Senator Mark Warner, Blumenthal, and several other Senators we've talked to who have been really on top of these issues. I think Senator Warner's white paper on how to regulate the tech platforms is one of the best; it's from two years ago, in 2018, and Rafi Martina, his staffer, is an amazing human being who works very hard on these issues. So there are some good folks, but when you look at the broad, like, the hearing yesterday,
59:20
it's mostly grandstanding to politicize the issue, right? Because you turn it into, on the right, hey, you're censoring conservatives, and on the left, hey, you're not taking down enough misinformation and dealing with the hate speech and all these kinds of things, right? And they're not actually dealing with how we would solve this problem. They're just trying to make a political point to win over their
59:40
base. Now, Facebook recently banned the QAnon pages, which I thought was kind of fascinating, because I'm like, well, this is a weird sort of slippery slope, isn't it?
59:50
It almost seemed to me like, well, throw them a bone, we'll get rid of QAnon because it's so preposterous, let's just get rid of that. But what else? If you keep going down that rabbit hole, where do you draw the line? Are you allowed to have JFK conspiracy theories? Are you allowed to have flat Earth? I mean, I guess flat Earth is not dangerous. Is that where they make the
1:00:16
distinction? So I think their policy is evolving in the direction of when
1:00:20
things are causing offline harm. When online content is known to precede offline harm, that's the standard by which platforms are
1:00:30
acting. What offline harm has been caused by the QAnon stuff, do you know?
1:00:35
There have been several incidents. We interviewed a guy on our podcast about it. There was some armed, at-gunpoint type thing, I can't remember, and there are things that are priming people to be violent. You know, as I want to say, these are
1:00:50
tricky topics, right? I think what I want to make sure we get to, though, is that there are many people manipulating the groupthink that can happen in these echo chambers. Once you're in one of these things—I studied cults earlier in my career, and the power of cults is that they're a vertically integrated persuasion stack, because they control your social relationships, they control who you're hearing from and who you're not hearing from, they give you meaning, purpose, and belonging, they have custom language, they have an internal way of referring to things—and social
1:01:20
media allows you to create a sort of decentralized cult factory, where it's easier to grab people into an echo chamber where they only hear other people's views. And Facebook, I think, even just recently announced that they're going to be promoting more of the Facebook Group content into feeds, which means that they're actually going to make it easier for that kind of manipulation to happen.
1:01:40
But do they make a distinction between group content and conspiracy groups? Like, how do you—when does group content, when does it
1:01:50
cross a line?
1:01:52
I don't know. I mean, the policy teams that work on this are coming up with their own standards; I'm not familiar with them. Think about how hard it is to come up with a law at the federal level that all states will agree to, then imagine Facebook trying to come up with a policy that will be universal to all the countries that are running Facebook. Right? Well,
1:02:12
then you imagine a company that never thought they were going to be in the position to do that, correct? And then within a decade they become the most prominent source of news.
1:02:20
And information on the planet Earth, correct, and now they have to regulate
1:02:24
it. And you know, I actually believe Zuckerberg when he says, I don't want to make these decisions, I shouldn't be in this role where my beliefs decide the whole world's views, right? He genuinely believes that. Yeah. But the problem is he created the situation where he is now in that position, and he got there very quickly, and they did it aggressively when they went into countries like Myanmar, Ethiopia, all throughout the African continent, where they gave—do you know about Free Basics? No.
1:02:50
So this is the program that I think has gotten something like 700 million accounts onto Facebook, where they do a deal with a telecommunications provider—like their version of AT&T in Myanmar or something—so when you get your smartphone, it comes with Facebook built in, and there's an asymmetry of access where it's free to access Facebook but it costs money, for the data plan, to do the other things. So you get a free Facebook account, and Facebook is the internet, basically, because it's the free thing you can do on your phone.
1:03:21
And then we know that there's fake information that's being spread
1:03:25
there. The data charges don't apply to Facebook
1:03:27
use. Yeah—like, you know, here we pay for data. Right. There you don't pay for Facebook, but you do pay for all the other things, which creates an asymmetry where of course you're going to use Facebook for most
1:03:37
things. Right. So you use Facebook Messenger.
1:03:40
Yeah. And WhatsApp? Yeah. I don't know exactly about video, because there are different
1:03:45
levels of video calls as well
1:03:47
In general they do. Yeah, I just don't know how that works in the developing world. But there's
1:03:51
a joke within Facebook—I mean, this has caused genocides, right? So in Myanmar, which is in the film, the Rohingya Muslim minority group—many Rohingya were persecuted and murdered because of fake information spread by the government on Facebook, using their asymmetric knowledge, with fake accounts. I mean, even just a couple weeks ago Facebook took down a network of I think several hundred thousand fake accounts in Myanmar, and at the time they didn't have more than something like four or five people in their extended Facebook network who even
1:04:20
Spoke the language of that
1:04:22
country. Oh God. So when you
1:04:24
realize that—I think it's like the Iraq War, Colin Powell's Pottery Barn rule, where if you go in and you break it, then you are responsible for fixing it.
1:04:34
This is Facebook
1:04:36
actively doing deals to go into Ethiopia, to go into Myanmar, to go into the Philippines or wherever, and providing these solutions, and then it breaks the society, and they're now in a position where they have to fix it. There's actually a joke within Facebook that if you want to know
1:04:50
which countries will be, quote unquote, at risk two years from now, look at which ones have Facebook Free Basics.
1:04:58
Jesus
1:05:00
And that's terrifying, that they do that and they don't have very many people who even speak the language. So there's no way they're going to be able to filter
1:05:06
it. That's right. And so now, to take it back—we were talking outside about the congressional hearing and Jack Dorsey, and the questions from the senator about, are you taking down the content from the Ayatollahs, or from the Chinese Xinjiang province about the Uyghurs—you know, when there's speech that leads to offline violence in these other countries, the issue is that these platforms are managing the information
1:05:28
And commons, for countries they don't even speak the language of, right? And if you think the conspiracy-theory dark corners, the Crazy Town of the English internet, are bad—and we've already taken out hundreds of whack-a-mole sticks, and they've hired hundreds of policy people and hundreds of engineers to deal with that problem—you go to a country like Ethiopia, where there are something like ninety-something dialects, I think, in the country, and six major languages, where one of them is the dominant Facebook sort of
1:05:58
language, and then the others get persecuted because they don't have a voice on the platform. This is really important: the people in Myanmar who got persecuted and murdered didn't have to be on Facebook for the fake information spread about them to impact them, for people to go after them, right? So this is the whole thing—I can assert something about a minority group, that minority group isn't on Facebook, but if it manipulates the dominant culture to go, we have to go
1:06:28
kill them, then they can go do it. And the same thing has happened in India, where there are videos uploaded about, hey, those Muslims—I think they're called flush killings—where they'll say these Muslims killed this cow, and in Hinduism the cows are sacred. Did I get that right? Anyway, I believe you. Yeah. They will post those, they go viral on WhatsApp, and say, we have to go lynch those
1:06:58
Muslims because they killed our sacred cows. And they went from something like five of those happening per year to now hundreds of those happening per year, because of fake news being spread, again, on Facebook and WhatsApp about them. And again, they don't have to be on the platform for this to happen to them, right? So this is critical. Imagine all of your listeners—I don't even know how many you have, like tens of millions, right?—and we all listen to this conversation, and we say we don't even want to use Facebook or Twitter or YouTube.
1:07:28
We all still—if you live in the U.S.—still live in a country where everyone else will vote based on everything that they are seeing on these platforms. If you zoom out to the global context: we don't use Facebook in Brazil, but Brazil's last election was heavily skewed by Facebook and WhatsApp, where something like 87% of people saw at least one of the major fake news stories about Bolsonaro, and he got elected, and you have people in Brazil chanting Facebook, Facebook when he wins.
1:07:58
And then he sets a new policy to wipe out the Amazon. All of us don't have to be on Facebook to be affected by a leader who wipes out the Amazon and accelerates climate change timelines, because of those interconnected effects. So, you know, we at the Center for Humane Technology are looking at this from a global perspective, where it's not just the U.S. election—Facebook manages something like 80 elections per year. And if you think they're doing all the monitoring that they are for, you know, the English-speaking American election, the most privileged society, now look at the hundreds of other countries that they're operating in.
1:08:28
Do you think they're devoting the same resources to the other
1:08:31
countries?
1:08:34
This is so crazy. It's like—is that you, Jamie? There's a weird noise.
1:08:40
There's like a squeak. Yeah, maybe it's me. I don't think so—it just might be feedback. There it is. You might be breathing, I don't know. Do you have
1:08:50
asthma? I think I have an allergy
1:08:52
coming on. Oh, it's making—sorry. What's terrifying is that we're talking about, from 2012 to 2020, YouTube implementing this program. And then what is even the birth of Facebook—what is that, like 2003 or 4?
1:09:11
This is such a short timeline to be having these massive worldwide implications from the use of these things. When you look at the future, do you look at this like a runaway train that's headed towards a
1:09:22
cliff? Yeah, and I think right now this thing is a Frankenstein. Even if Facebook is aware of all these problems, they don't have the staff—unless they hired, you know, tens or hundreds of thousands of people minimum—to try to address all the
1:09:40
problems. But the paradox we're in is that the very premise of these services is to rely on automation. It used to be we had editors and journalists—or at least editors, you know, people edited even what went on television—saying what is credible, what is true. Like, you sat here with Alex Jones even yesterday, and you're trying to check him on everything he's saying, right? You're researching and trying to look that stuff up; you're trying to do some more responsible communication. The premise of these systems is
1:10:10
that you don't do that. The reason venture capitalists find social media so profitable and such a good investment is because we generate the content for free. We are the useful idiots, right? Instead of paying a journalist $70,000 a year to write something credible, we can each be convinced to share our political views, and we'll do it for free. Actually, we don't really know we're the useful idiots—that's kind of the point. And then instead of paying an editor $100,000 a year to figure out which of those things is true, that we want to promote and give
1:10:40
reach to, you have an algorithm that says, hey, what do people click on the most, what do people like the most? And then you realize the quality of the signals that are going into the information environment that we're all sharing is a totally different process. We went from a high-quality gated process that cost a lot of money to this really crappy process that costs no money, which makes the companies so profitable. And then we fight back for territory, for values, when we raise our hands and say, hey,
1:11:10
there's a thinspiration video problem for teenagers and anorexia; hey, there's a mass-conspiracy echo chamber problem over here; hey, there are, you know, flat Earth sort of issues. And again, these get into tricky topics, because—I know we both believe in free speech, and we have this feeling that the solution to bad speech is more speech that counters the things that are said. But in a finite attention economy, we don't have the capacity for everyone who gets bad speech
1:11:40
to just have a counter-response. In fact, what happens right now is that bad speech rabbit-holes into not only worse and worse speech but more extreme versions of that view that confirm it, because once Facebook knows that that flat Earth rabbit hole is good at getting your attention back, it wants to give you just more and more of that. It doesn't want to say, here's 20 people who disagree with that thing, right? Right. So I think if you were to imagine a different system, we would ask, who are the thinkers that are most open-minded and synthesis-oriented, where they can actually steelman
1:12:10
the other side—they can say, for this speech, here is the opposite counter-argument, and show that they understand it. Imagine those people get lifted up. But notice that none of those people—I mean, we're both friends with Eric Weinstein, and I think he's one of these guys who's really good at offering the steelmanning: here's the other side of this, here's the other side of that. But the people who generally do that aren't the ones who get the tens of millions of followers on these services. It's the black-and-white, extreme, outrage-oriented thinkers and speakers.
1:12:40
They get rewarded in this attention economy. And so if I zoom way out and ask, how is the entire system behaving—just like if I zoom out and look at the climate system, how is the entire overall system behaving—it's not producing the kind of information environment on which democracy can survive. Jesus.
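The feed-ranking loop described in this exchange—show whatever got your clicks before, so each click narrows the next feed—can be sketched as a toy model. This is purely illustrative: the function, topics, and numbers below are invented for the sketch, not any platform's actual code.

```python
# Toy model of engagement-ranked feeds (illustrative only; all names
# and numbers are invented, not any platform's real system).

def rank_feed(posts, user_clicks):
    """Order posts by predicted engagement: how often this user has
    clicked each post's topic before. High-engagement topics (often
    the most extreme ones) rise to the top, so every click makes the
    next feed narrower -- the 'rabbit hole' loop."""
    def predicted_engagement(post):
        return user_clicks.get(post["topic"], 0)
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"topic": "flat_earth", "title": "THEY are hiding the truth"},
    {"topic": "gardening", "title": "Spring planting guide"},
    {"topic": "local_news", "title": "City council meets Tuesday"},
]

# A user who clicked flat-earth content a few times...
clicks = {"flat_earth": 5, "gardening": 1}
feed = rank_feed(posts, clicks)  # flat_earth ranks first

# ...and each click on the top item widens the gap next time,
# which is the feedback loop: attention in, more of the same out.
clicks["flat_earth"] += 1
```

Note that nothing in the loop measures whether a post is true or balanced—only whether it got clicked, which is the point being made about signal quality.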
1:13:01
The thing that troubles me the most is that I clearly see your thinking, and I agree with you. I don't see any holes in what you're saying. I don't know how this plays out, but it doesn't look good, and I don't see a solution. It's like if there are a thousand bison running full steam towards a cliff and they don't realize the cliff is there—I don't see how you pull them back. So I
1:13:24
think of it like we're trapped in a body that's eating itself. So it's kind of cannibalism.
1:13:30
Maybe, because our economic growth right now with these tech companies is based on eating our own organs. We're eating our own mental-health organs, we're eating the health of our children—sorry for being so gnarly about it, but it's a cannibalistic system. In a system that's hurting itself or eating itself or punching itself, if one of the neurons wakes up in the body, it's not enough to change that; it's going to keep punching itself. But if enough of the neurons wake up and say, this is stupid, why would we build our system this way? And the reason I'm so excited about the film is that if you have 40 to
1:14:00
50 million people who now recognize that we're living in this sort of cannibalistic system, in which the economic incentive is to debase the life-support systems of your democracy, we can all wake up and say, that's stupid, let's do something differently. Let's actually change the system. Let's use different platforms, let's fund different platforms, let's regulate and tame the existing Frankensteins. And I don't mean regulating speech—I mean really thoughtfully, how do we change the incentives so it doesn't go into the same race to the bottom? And we have to all recognize that we're now 10 years
1:14:30
Into
1:14:30
this hypnosis experiment of warping of the mind, and it's like, how do we snap our fingers and get people to see that there's an inflated level of polarization and hatred right now? Especially going into this election, I think we all need to be much more cautious about what's running in our brains right
1:14:47
now. Yeah, I don't think most people are generally aware of what's causing this polarization. I think they think it's the climate of society, because of the president and because of Black Lives Matter and the
1:15:00
George Floyd protests and all this jazz. But I don't think they understand that that's exacerbated in a fantastic way by social media and the last 10 years of our addictions to social media, and these echo chambers that we all exist in. Yes. So
1:15:17
I want to make sure that we're both clear—and I know you agree with this—that these things were already in society to some degree, right? So we want to make sure we're not saying social media is to blame for all of it. Absolutely not. No, no.
1:15:30
In fact, it's gasoline. Right? Exactly. It's
1:15:33
lighter fluid for sparks of polarization. It's lighter fluid for sparks of, you know, paranoia, which is ironically the opposite of what everybody
1:15:42
hoped the internet was going to be, right? Everybody hoped the internet was going to be this bottomless resource of information, where everyone was going to be educated in a way they had never experienced before in the history of the human race, where you have access to all the answers to all your questions. You know, Eric Weinstein describes it as
1:16:00
the Library of Alexandria in your pocket. Yeah, but no
1:16:04
Well, and I want to be clear that I'm not against technology or giving people access. In fact, I think a world where everyone had a smartphone and a Google search box and Wikipedia and a search-oriented YouTube, so you can look up health issues and how to do DIY fixes and anything—sure, that would be awesome. That would be great. I would love that. I just want to be really clear, because this is not an anti-technology conversation. It's about, again, this business model that depends on recommending stuff to people. Just to be clear on the polarization front.
1:16:30
So social media is more profitable when it gives you your own Truman Show that affirms your view of reality every time you flick your finger, right? That's going to be more profitable than if, every time you flick your finger, I actually show you, here's a more complex, nuanced picture that disagrees with that, here's a different way to see it. That won't be nearly as successful. The best way for people to test this—we actually recommend doing this after seeing the film—is to open up Facebook on two phones, especially, like, two partners or people who have the same friends. So you have the same
1:17:00
friends on Facebook. You would think if you scroll your feeds you'd see the same thing—the same people you're following, so why wouldn't you see the same thing? But if you swap phones, and you actually scroll through their feed for 10 minutes and they scroll through yours for 10 minutes, you'll find that you see completely different information. You'll also notice that it won't feel very compelling. My friend Emily just did this with her husband after seeing the film, and she literally has the same friends as her husband, and she scrolled through the feed and she's like, this isn't interesting, I wouldn't come back to
1:17:30
this,
1:17:32
right? And so we have to again realize how subtle this has been.
1:17:37
I wonder what would happen if I scrolled through my feed, because I literally don't use Facebook. I don't use it at all. I only use Instagram. I stopped using Twitter because it's like a bunch of mental patients throwing shit at each other. And I very rarely use it, I should say—occasionally I'll check some things to see what the climate is, the cultural climate. But
1:18:00
I use Instagram, and I used to use Instagram to post to Facebook, but I kind of stopped even doing that because it just seems gross. Yeah. It's these people in these verbose arguments about politics, the economy, and world events,
1:18:17
and we have to ask ourselves, is that medium constructive to solving these problems? It's just not at all. It's an attention casino, right? The house always wins. And,
1:18:30
you know, you might see Eric once in a thread, battling it out or sort of duking it out with someone, and maybe even reaching some convergence on something, but it just whizzes by your feed and then it's gone. Yeah. All the effort that we're putting in to make these systems work, and then it's just all gone. I mean, I try to use social media very minimally overall—luckily the work is so busy that that's easier. I want to say first, you know, on the addiction front of these things,
1:19:00
I know myself to be very sensitive and, you know, easily addicted by these things myself, and that's why I think I
1:19:06
I noticed you were saying in The Social Dilemma, it's email for you, huh?
1:19:10
Yeah, you know, for me, I refresh my email and pull to refresh like a slot machine. Sometimes I'll get invited to meet the president of such-and-such to advise on regulation, and sometimes I get a stupid newsletter from a politician I don't care about or something, right? So email is very addictive. It's funny—I talked to Daniel Kahneman, who's like the
1:19:30
founder of behavioral economics—he wrote the book Thinking, Fast and Slow, you know that one—and he said as well that email was the most addictive for him. The one thing you'll find is that the people who know the most about these persuasive, manipulative tricks will say we're not immune to them just because we know about them. Dan Ariely, another famous persuasion and behavioral-economics guy, talks about flattery, and how flattery still feels good even if I tell you I don't mean it. Like, I love that sweatshirt. That's an awesome sweatshirt. Where'd you get it?
1:19:59
You're just going to bullshit me. But that's the thing—it feels good to get flattery even if you know it's not real, right? The point being that we have so much evolutionary wiring to care about what other people think of us that, even though you know they're manipulating you with the likes or whatever, it still feels good to get those hundred extra likes on that thing you posted.
1:20:20
Yeah. When did the likes come about?
1:20:24
Well, let's see. Actually, you know, in the film, Justin Rosenstein, the
1:20:29
inventor of the like button, talks about it. I think the first version was something called Beacon, and it arrived in 2006, I think. Hmm. But then the simple one-click like button was a little bit later, like 2008, 2009.
1:20:41
Are you worried that it's going to be more and more invasive? I mean, you think about the problems we're dealing with now with Facebook and Twitter and Instagram, all these within the last decade or so. What do we have to look forward to? I mean, is there something on the horizon that's going to be even more invasive? Well, we have to change this
1:20:59
system, because as you said, technology is only going to get more immersed into our lives and infused into our lives, not less. Is technology going to get more persuasive or less persuasive? More. Sure. Is AI going to get better at predicting our next move or less good at predicting our next
1:21:15
move? It's almost like we have to eliminate that, and I mean, it would be really hard to tell them you can't use algorithms anymore that depend on people's attention spans, right? It would be really hard, but it seems like
1:21:29
the only way for the internet to be pure. Correct. I
1:21:32
think of this like the environmental movement. I mean, some people have compared the film The Social Dilemma to Rachel Carson's Silent Spring, right? That was the book that birthed the environmental movement, and it was in a Republican administration, the Nixon administration, that we created the EPA, the Environmental Protection Agency. We went from a world where we said the environment is something we don't pay attention to, to—we passed a bunch of laws, I forget which, between 1963 and 1972, over a decade.
1:21:59
We started caring about the environment; we created things that protected the national parks. And I think that's kind of what's going on here. Imagine, for example, that it is illegal to show advertising on youth-oriented social media apps between 12:00 a.m. and 6 a.m., because you're basically monetizing loneliness and lack of sleep, right? Imagine that you cannot advertise during those hours because we say that, like a national park, our children's attention between those hours is protected. This is a very minimal example, but it would be like
1:22:29
taking the most obvious piece of low-hanging fruit in plain sight and saying, let's quarantine this off and say this is
1:22:34
sacred. But isn't the problem—like, the Environmental Protection Agency resonates with most people, the idea, oh, it's protecting the world for our children, right? There aren't a lot of people profiting off of polluting the rivers, right? But when you
1:22:48
lose this—I mean, overhunting certain lands, or overfishing certain fisheries and collapsing them—there are, if you have big enough corporations that are based on an infinite-growth profit model,
1:22:59
you know, operating with less and less resources—this is a problem we've faced
1:23:03
before. For sure, for sure, but it's not the same sort of scale as 300-and-some million people, where the vast majority of them are using some form of social media. And also, this is not something that resonates in a very clear, one-plus-one-equals-two way like the Environmental Protection Agency. That makes sense. Like, if you ask people, should
1:23:29
you be able to throw garbage into the ocean, everyone's going to say, no, that's a terrible idea. Should you be able to make an algorithm that shows people what they're interested in on YouTube? Like, yeah, what's wrong with that? It's more like sugar, right? Because sugar is
1:23:44
always going to taste way better than something else, because our evolutionary heritage says that's rare and so we should pay more attention to it. This is like sugar for the fame lottery, for attention, for social approval. And so it's always going to feel good, and we need to have consciousness about it. We haven't banned
1:23:59
sugar, but we have created a new conversation about what healthy eating is, right? I mean, there's a whole new fitness movement, and sort of yoga and all these other things, where people care more about their bodies and health than they probably ever have. Many of us wouldn't have thought we'd ever get through the period of soda being at the pinnacle of popularity. I think 2013 or 2014 was the year that water crossed over as a more successful drink product than soda. Really? I think that's true. You might want to look that up, but
1:24:29
so I think we could have something like that here. I think of it this way—if you want to even get kind of, I don't know, weirdly spiritual or something about it—we are the only species that could even know that we were doing this to ourselves, right? We're the only species with the capacity for self-awareness to know that we have actually roped ourselves into this matrix—literally The Matrix—of sort of undermining ourselves through our own psychological weaknesses.
1:24:59
A lion that somehow manipulated its environment so that there are gazelles everywhere, and is overeating on gazelles, doesn't have the self-awareness to know, wait a second, if we keep doing this, this is going to cause all these other problems. It can't do that because its brain doesn't have that capacity. Our brains do have the capacity for self-awareness. We can name negativity bias, which is that if I have a hundred comments and 99 are positive, my brain goes to the negative. We can name that, and once we're aware of it, we get some agency back. We can name that we have a draw towards social approval.
1:25:29
So when I see I've been tagged in a photo, I know that they're just manipulating my social approval. We can name social reciprocity, which is when I get all those text messages and I feel I have to get back to all these people—well, that's just an inbuilt bias toward reciprocity: we have to get back to people who give stuff to us. The more we name our own biases, like confirmation bias—we can name that my brain is more likely to feel good getting information that I already agree with than information that disagrees with me—once I know that about myself,
1:26:00
I can get more agency back. And we're the only species that we know of that has the capacity to realize that we're in a self-terminating sort of system, and we have to change that by understanding our own weaknesses and that we've created this system that is undermining ourselves. And I think the film is doing that for a lot of people.
1:26:17
It certainly is, but I think it needs more. It's like inspiration—it needs a refresher on a regular basis, right? Do you feel this massive obligation to be that guy who is out there, sort of as
1:26:29
the Paul Revere of the technology-influence invasion?
1:26:36
I just see these problems and I want them to go away. Yeah, you know, I didn't desire to wake up and run a social movement, but honestly, right now that's what we're trying to do with the Center for Humane Technology. Before the success of the film, we were actually more focused on working with technologists inside the industry—I come from Silicon Valley, many of my friends are
1:26:59
engineers at the companies—and we have these inside relationships, so we focused at that level. We also work with policymakers; we were trying to speak to policymakers. We weren't trying to mobilize the whole world against this problem, but with the film, suddenly we as an organization have had to do that. And frankly, being really honest, I really wish we'd had those funnels, so that people who saw the film could have landed in, you know, a carefully designed funnel where we actually started mobilizing people to deal with this issue, because there are ways we can do it: we can pass certain laws.
1:27:30
We have to have a new cultural sort of set of norms about how we want to show up and use this system. You know, families and schools can have whole new protocols for how we want to do group migrations. Because one of the problems is that if a teenager says, by themselves, well, I saw the film, I'm going to delete my Instagram account by myself, or my TikTok account by myself, that's not enough, because all their friends are still using Instagram and TikTok, and they're still going to talk about who's dating who, or gossip about some homework or whatever, on those services.
1:28:00
And so the services — Instagram and TikTok — prey on social exclusion: you will feel excluded if you don't participate. And the way to solve that is to get whole schools or families together — like different parent groups, whatever — and do a group migration from Instagram to Signal or iMessage or some kind of group thread. Because notice that — as you said, Apple's a pretty good actor in the space — if I make a FaceTime call to you, FaceTime isn't trying to monetize my attention, right? It's just
1:28:29
sitting there being like, how can I help you? It's as close to a face-to-face, you know, conversation as
1:28:34
possible. Jamie pulled up an article earlier that was saying that Apple was creating its own search engine. Yeah, I hope that is the case, and I hope that if it is the case, they apply the same sort of ethics that they have toward sharing your information to their search engine. But I wonder if there would be some sort of value in them creating a social media plat-
1:29:00
form that doesn't rely on that sort of
1:29:03
algorithm. Yeah. Well, I think in general one of the exciting trends that has happened since the film is there's actually many more people trying to build alternative social media products that are not based on these business models. Yeah, I can name a few, but I don't want to be endorsing — I mean, there's, I believe, Marco Polo, Clubhouse; Wikipedia is trying to build a sort of nonprofit version. I always forget the names of these things, but the
1:29:29
interesting thing is that for the first time people are trying to build something else, because now there's enough people who feel disgusted by the present state of affairs — and that wouldn't be possible unless we created a kind of cultural movement, based on something like the film, that reaches a lot of people.
1:29:44
It's interesting that you made this comparison to the Environmental Protection Agency, because there's kind of a parallel in the way other countries handle the environment versus the way we do, and how it makes them competitive. And that's always been the Republican argument for not getting rid of
1:29:59
certain fossil fuels and coal and all sorts of things that have a negative consequence: that we need to be competitive with China, we need to be competitive with these other countries that don't have these regulations in effect. The concern would be — well, first of all, the problem is these companies are global, right? Like, Facebook is global. If they put these regulations on America but didn't put these regulations worldwide, then wouldn't they use the income and the algorithm
1:30:29
in other countries, unchecked, right? And have this massive negative consequence, and gather up all this money?
1:30:37
Which is why — just like sugar — everyone around the world has to understand it and be more antagonistic. Yeah, and not like sugar's evil, but you have to have a common awareness about the problem. And how
1:30:46
could you educate people? Like, if you're talking about a country like Myanmar, or these other countries that have had these, like, serious consequences because of Facebook — how could you possibly get our ideas
1:31:00
across to them if we don't even know their language? And it's just this system that's already set up in this very advantageous way for them, where Facebook comes on their phone. Like, how could you hit the brakes on that? Well,
1:31:13
I mean, first I just want to say this is an incredibly hard and depressing problem. Yeah — just the scale of it, right? Right. Um, you need something like a global — I mean, language-independent — global self-awareness about this problem. Now again, I don't want to be tooting the horn about the film, but the thing I'm excited about is
1:31:30
it launched on Netflix in a hundred and ninety countries and in 30 languages. So you should toot the horn. Oh,
1:31:35
yeah. Yeah toot it. Yeah. Well, I think you know, the
1:31:39
film was seen in 30 languages. So, you know, the cool thing is — I wish I could show the world my inbox. I think people see the film and they feel like, oh my God, this is a huge problem and I'm all alone, how are we ever going to fix this? But I get emails every day from Indonesia, Chile, Argentina, Brazil — people saying, oh my God, this
1:31:59
is exactly what's going on in my country. I mean, I've never felt more optimistic — and I felt really pessimistic for the last eight years working on this, because there really hasn't been enough movement. But I think for the first time there's a global awareness now that we could then start to mobilize. I know the EU is mobilizing, Canada's mobilizing, Australia's mobilizing, California as a state is mobilizing with Prop 24. There's a whole bunch of movement now in the space, and they have a new rhetorical arsenal for why we have to make this bigger transition. Now, you know,
1:32:29
are we going to get all the countries — the six different major dialects in Ethiopia — are they going to know about this? I don't think the film was translated into all those dialects. I think we need to do more. It's a really, really hard, messy problem. But on the topic of "if we don't do it, someone else will," you know, one interesting thing in the environmental movement — there's a great WNYC radio piece about the history of
1:32:59
lead, and when we regulated lead. Do you know anything about this?
1:33:02
Yeah, I do. Yeah. Yeah, the
1:33:04
Let's see if this matches up with your experience. My understanding is that obviously lead was this sort of miracle thing — we put it in paint, we put it in gas, it was great. And then the way we figured out that we should regulate lead out of our product supply is — there was this guy who proved that it dropped kids' IQ
1:33:29
by four points for every, I think, 10 micrograms per deciliter — in other words, for that amount of lead per deciliter of, I'm guessing, blood, it would drop the IQ of kids by four points. And they measured this by actually doing a sample on their teeth or something, because lead shows up in your bones, I think. And they proved that if the IQ dropped by four points, it would lower the future wage-
1:33:59
earning — wage-earning potential of those kids, which would then lower the GDP of the country, because it would be shifting the IQ of the entire country down by four points, if not more, based on how much lead is in the environment. If you zoom out and say, what is social media — now, let's replace the word IQ, which is also a fraught term, because there's a whole bunch of views about how it's designed in certain ways and not others, and about measuring intelligence. Let's replace IQ with problem-solving capacity.
1:34:30
What is your problem-solving capacity? — which is actually how they talk about it in this radio episode. And imagine that we have a societal IQ, or a societal problem-solving capacity. The US has a societal IQ, Russia has a societal IQ, Germany has a societal IQ: how good is a country at solving its problems? Now ask: what is social media doing to our societal IQ?
1:34:56
Well, it distorts our ideas, gives us a bunch of false narratives.
1:35:00
It fills us with misinformation and makes it impossible to
1:35:03
agree with each other. And in a democracy, if you don't agree with each other and you can't even do compromise — you have to recognize that politics was invented to avoid warfare, right? We have compromise and understanding so that we don't get physically violent with each other. We have compromise and conversation. If social media makes compromise, conversation, shared understanding, and shared truth impossible, it doesn't drop our societal IQ by four points — it
1:35:28
drops it to zero.
1:35:30
Because you can't solve any problem whether it's
1:35:32
human trafficking or poverty or climate issues or, you know, racial injustice — whatever it is that you care about, it depends on us having some shared view about what we agree on. By the way, on the optimistic side, there are countries like Taiwan that have actually built a digital democratic sort of social media type thing. Audrey Tang — you should have Audrey Tang on your show, she's amazing. She's the digital minister of Taiwan, and they've actually built a system that rewards
1:36:00
unlikely consensus. So when two people who would traditionally disagree post something online — when two people who traditionally disagree actually agree on something — that's what gets boosted to the top of the way that we look at our information feeds. Yeah, so it's about finding consensus where it's unlikely, and saying, hey,
1:36:21
actually, you know, you, Joe and Tristan — you
1:36:22
typically disagree on these six things, but you agree on these three things, and those are the things that we're going to encourage you to talk about. We hand you a menu of the things you
1:36:29
agree on.
1:36:30
And how did they manipulate that
1:36:33
Honestly, we did a great interview with her on our podcast that people can listen to. I think you should have her on — she's honest. I would love to
1:36:40
meet her. What is your podcast again? Tell people.
1:36:41
It's called Your Undivided Attention, and the interview is with Audrey Tang — that's her name. And I think this is one model of how you have, you know, sort of digital media bolted onto the top of a democracy and have it work better, as opposed to having it just degrade into kind of nonsense, polarization, and an
1:37:00
inability to agree. Honestly, such a unique
1:37:02
situation too, right? Because China doesn't recognize them, and there's a real threat that they're going to be invaded by China. Correct. So what's interesting
1:37:09
about Taiwan is — we haven't talked about the disinformation issues, but it's under, like you said, not just physical threat from China, but massive propaganda and disinformation campaigns they're trying to run there, right? Sure. And so what's amazing is that their digital media system is good at dealing with these disinformation campaigns and conspiracy theories and other things, even in the face of
1:37:30
a huge threat like China. But there's more binding energy in the country, because they all know it's a tiny island and there's a looming threat from this big country. Whereas the United States — we're not this tiny island with a looming threat elsewhere. In fact, many people don't know, or don't think, that there's actually information warfare going on. I actually think it's really important to point out to people that social media is one of our biggest national security risks, because while we're obsessed with protecting our physical borders and building walls and, you know, spending a trillion dollars redoing the nuclear fleet,
1:38:01
we left the digital border wide open. Like, if Russia or China try to fly a plane into the United States, our Pentagon and billions of dollars of defense infrastructure from Raytheon and Boeing or whatever will shoot that thing down, and it doesn't get in. If they try to come into the country, they'll get stopped by the passport control system, ideally.
1:38:18
if they try to fly if Russia or China try to
1:38:20
fly an information bomb into the country, instead of being met by the Department of Defense, they're met by a Facebook algorithm with a white glove that says exactly which zip code you want to target.
1:38:30
Like, it's the opposite of protection. So social media makes us more vulnerable. I think of it like — imagine a bank that spent billions of dollars, you know, surrounding the bank with physical bodyguards, right? Like the buffest guys on every single corner — you've totally secured the bank. But then you installed on the bank a computer system that everyone interacts with, and no one changed the default password from, like, lowercase "password." Anyone can hack in. That's what we do when we install Facebook in our
1:39:00
society — or you install Facebook in Ethiopia. Because if you think Russia or China, you know, or Iran or North Korea influencing our election is bad, just keep in mind the, like, dozens of countries throughout Africa — where we actually know recently there was a huge campaign, that the Stanford Cyber Policy Center did a report on, of Russia targeting, I think, something like seven or eight major countries, with disinformation campaigns running in those countries. Or the Facebook whistleblower who came out about a month ago — Sophie Zhang, I think, is her name —
1:39:30
saying that she personally had to step in to deal with disinformation campaigns in Honduras, Azerbaijan, I think Greece, or some other countries like that. So the scale of what these technology companies are managing — they're managing the information environments for all these countries, but they don't have the resources to do it. So they—
1:39:49
Not only that, they're not trained to do it, they're not qualified to— Correct. They're making it up as they go along, and they're way behind the curve. When I had Renée DiResta on and she detailed all
1:40:00
of the issues with the Internet Research Agency in Russia and what they did during the 2016 campaign — for both sides. I mean, the idea is they just promoted Trump, but they were basically sowing the seeds of the decline of the democracy. They were trying to figure out how to create turmoil, and they were doing it in this very bizarre, calculated way where it was hard to see, like, what's the endgame here? Well, the end-
1:40:30
game is to have everybody fight. Yeah, I mean, that's really what the endgame was. And
1:40:34
if I'm, you know, one of our major adversaries — after World War Two, there was no ability to use kinetics, like nukes or something, on the bigger countries, right? Like, that's all done. So what's the best way to take down the biggest, you know, country on the planet, on the block? You use its own internal tensions against itself. This is what Sun Tzu would tell you to do. Yeah.
1:40:58
And that's never been easier, because of Facebook and because of these platforms being open to this manipulation. And if I'm looking now — we're four days away from the U.S. elections, or something like that, when this goes out. Jesus Christ. We have never been a more destabilized country than now. I mean, this is the most destabilized, I would say, and polarized we've probably ever been — maybe people would argue the Civil War was worse, but in recent history,
1:41:29
there is maximum incentive for foreign actors to drive up — again, not one side or the other, but to drive us into conflict. So, you know, I think what we all need to do is recognize how much incentive there is to plant stories, to actually sow physical violence on the streets. I think there was just a story we were talking about this morning, that there's some kind of truck, I think in Philadelphia or DC, loaded with explosives or something like this. There's such an incentive to try
1:41:58
to, you know, throw in the agent provocateur — throw the first stone, throw the first Molotov cocktail, fire the first shot — to drive up that conflict. And I think we have to realize how much of that may be artificially
1:42:12
motivated.
1:42:14
Very much so. And the Renée DiResta podcast that I did, where she went into depth about all the different ways that they did it — the most curious one being funny memes. Yeah, there's so many of the memes that you read that you laugh at. It was just so weird that they're humorous. And she said she looked at probably a hundred thousand
1:42:35
memes. And the funny thing is you actually can agree with them, right? You can't even not
1:42:39
laugh at them. Yeah — like, oh, you know — and they're being constructed by foreign
1:42:43
agents that are doing this to try to mock certain aspects of our society, and pit people against each other, and create a mockery.
1:42:53
And, you know, back in 2016 there was very, very little collaboration between our defense industry — the CIA and DOD and people like that — and the tech platforms. And the tech platforms said it's the government's job to deal with the foreign actors doing these
1:43:07
things. How do you stop something like the IRA? Like, say, if they're creating memes in particular, and they're funny—
1:43:13
Memes was
1:43:14
one of the issues that Renée brings up — I'm just a huge fan of her and her work. As am I. Yeah. It's that if I'm, you know, China, I don't need to invent some fake news story. I just find someone in your society who's already saying what I want you to be talking about, and I just, like, amplify them up. I take that dial and I just turn it up to ten, right? So I find your Texas secessionists — and, like, oh, Texas seceding, that would be a good thing if I'm trying to rip the country apart — so I'm going to take those Texas secessionists and the California
1:43:43
secessionists, and I'm just going to dial them up to 10, so those are the ones we hear from now. If you're trying to stop me — you're Facebook, and you're the integrity team or something — on what grounds are you trying to stop me? Because it's your own people, your own free speech; I'm just the one amplifying the ones I want to be out there, right? And so that's what gets tricky about this: I think our moral concepts that we hold so dear, of free speech, are inadequate in an attention economy that is hackable. It's really more about what's getting the attention, rather than what
1:44:13
individuals are saying or can't say. And, you know, then again, they've created this Frankenstein where they're making mostly automated decisions about who's exhibiting what pattern of behavior, or coordinated inauthentic behavior here or there, and they're shutting them down. I don't know if people know this: Facebook shut down two billion fake accounts — I think this is a stat from a year ago. They shut down two billion fake accounts; they have three billion active real users. Do you think that those two billion were perfectly, you know, the real fake accounts, and
1:44:43
they didn't miss any, or they didn't overreach and take some real accounts down with them? You know our friend Bret Weinstein — he just got taken down by Facebook. You saw that?
1:44:52
That seemed calculated, though. Facebook has shut down 5.4 billion fake accounts this
1:44:57
year — and that was as of November 29th. Oh my
1:45:00
God. Oh my God. That is insane. That's so many.
1:45:05
And so again, it's the scale that these things are operating at, and that's why, you know, when Bret got his thing taken down — I didn't like that, but it's not like there's this
1:45:13
vendetta against Bret.
1:45:14
Right? No, I don't know about that. That seemed to me to be a calculated thing, because, you know, Eric actually tweeted about it — you could probably find the tweet, I retweeted it. Basically, it was reviewed by a person, so they're lying; he's like, this is not something that was taken down by an algorithm. He believes it was because it was the Unity 2020 platform, where they're trying to bring together conservatives and liberals and try to find some common ground and create, like, a third-party
1:45:43
candidate that combines the best of both worlds. I don't
1:45:47
understand what policies the Unity 2020 thing was going up against. Like, I have no idea. Going into
1:45:52
the two-party system — the idea is that it's taking away votes from Biden, and then it may help Trump win, right? They banned him off Twitter as well, you know that?
1:45:59
too. They blocked the account or something?
1:46:01
No, they banned the entire Unity 2020 account. Yeah, Unity. Yeah — I mean, literally unity, and it's like, nope, no unity. Fuck you. We want Biden. Yeah, the political bias on
1:46:13
social media is undeniable, and that's maybe the least of our concerns in the long run, but it's a tremendous issue, and it also for sure sows the seeds of discontent, and it creates more animosity, and it creates more
1:46:27
conflict. The interesting thing is that if I'm one of our adversaries, I see that there is this view that people don't like the social media platforms — and I want them to be more like that. Let's say I'm Russia or China, right, and I'm currently using Facebook and Twitter successfully to run information campaigns. And then
1:46:43
I can actually plant a story so that they end up shutting it down — shutting down conservatives, or shutting down one side — which then forces the platforms to open up more, so that I, Russia or China, can keep manipulating even more, right? Yeah. So right now they want it to be a free-for-all where there's no moderation at all, because that allows them to get in and they can weaponize the conversation against itself. Right?
1:47:10
We have to all be aware of that. I mean, even if we are all aware of
1:47:13
it, it seems so pervasive.
1:47:16
Yeah. Well, it's not just pervasive, it's, like I said — we're 10 years into this hypnosis experience. This is the largest psychological experiment we've ever run on
1:47:26
humanity. It's insane. It is insane, and it's also with tools that never existed before, evolutionarily. So, like, we really are not designed for the way these brightly lit metal and glass devices interact with your brain. They're
1:47:44
enthralling, right? We've never had to resist anything like this before. The things we've had to resist are: don't go to the bar, you know, you have an alcohol problem; stop smoking cigarettes, it'll give you cancer, right? We've never had a thing that does so much — you can call your mom, you can text a good friend, you can receive your news, you can get an amazing email about this project you're working on — and it can suck up your time staring at
1:48:10
butts. And the infusion of—
1:48:13
The things that are necessary for life, like text messaging or looking something up — they're infused right next to all of the sort of corrupt stuff.
1:48:23
Right? And if you're using it to order food, and if you're using it to get an Uber — right. But
1:48:28
imagine if we all wiped our phones of all the extractive business model stuff and we only had the
1:48:33
tools. Like, have you thought about using a Light Phone?
1:48:36
Yes, I know those guys — they just need to be brought up in my awareness more often. For those who don't know, it's like — what — it's like a—
1:48:43
One of the guys in the documentary is one of the creators of it, right?
1:48:47
No — I think you're thinking of Tim Kendall, who — he's the guy who brought in Facebook's business model of advertising, and he runs a company now called Moment that shows you the number of hours you've spent on different apps and helps you
1:48:59
use it less. Someone involved in the documentary was also a part of the Light Phone
1:49:05
team. No, no, not officially, no — I don't think so. But the Light Phone is basically a thin, black-and-white phone
1:49:12
thing.
1:49:13
text, and I think it plays music now, which — I was like, oh, that's a mistake, right? Like, that's a slippery slope.
1:49:20
That's the thing — we have to all be comfortable with losing access to things that, like, oh, maybe you do want to take notes this time, but you don't have your full keyboard to do that. Are you willing to do that? I think one thing people can do is to take, like, a digital Sabbath — one day a week off completely. Because imagine if you got several hundred million people to do that: that drops the revenue of these companies by, like, 15 percent, because that's one out of seven days that you're not on the system —
1:49:43
So long as you don't rebalance and use it more on the other
1:49:46
days. I'm inclined to think that Apple's solution is really the way out of this — the ability to opt out of all sharing of your information. And if they could come up with some sort of a social media platform that kept that as an ethic — yeah, it might allow us to communicate with each other but stop all this algorithm nonsense. Look, if anybody has the power to do it — they have so much goddamn money.
1:50:12
Totally. Well, and also,
1:50:13
they're like the — you know, people talk about the government regulating these platforms, but Apple is kind of the government that can regulate the attention economy. Because when they do this thing we talked about earlier, of saying, do you want to be tracked, right — and they give you this option, when, like, 99 percent of people are going to say no, I don't want to be tracked — when they do that, they just put a 30 percent tax on all the advertising-based businesses. Because now you don't get as personalized an ad, which means they make less money, which means that business model is less attractive to venture capitalists to fund the next thing. So they're actually
1:50:43
enacting a kind of carbon tax — but, you know, one on the polluting stuff, right? They're enacting a kind of tax on the social-media-polluting stuff; they're taxing it by 30 percent. But they could do more than that. Like, imagine — you know, they have this 30/70 split, where app developers get 70 percent of the revenue when you buy stuff and Apple keeps 30 percent. They could modify that percentage based on how much sort of social value those things are delivering to society. Hmm. So this gets a little bit weird — people may
1:51:13
not like this. But if you think about who the real customer is, and how we want things oriented: how should we fund an app developer? As an app developer, I want to make more money the more I'm helping society and helping individuals, not the more I'm extracting and stealing their time and attention. And imagine that governments in the future actually paid some kind of budget into, let's say, the App Store — there's antitrust issues with this, but you pay money into the App Store — and then, as apps started helping people with more social outcomes, like, let's say, learning programs or schools or things like Khan Academy, things like this,
1:51:43
more money flows in the direction of where people got that value, and that revenue split between Apple and the app developers ends up going more to things that end up helping people, as opposed to things that were just good at capturing attention and monetizing zombie behavior. Mmm. One of my favorite lines in the film is Justin Rosenstein, from the Like button, saying that, you know, so long as a whale is worth more dead than alive, and a tree is worth more as lumber and two-by-fours than
1:52:13
as a living tree — now we are the whale, we're the tree. We are worth more when we have predictable, zombie-like behaviors, when we're more addicted, distracted, outraged, polarized, and disinformed than if we're a living, thriving citizen or a growing child that's, like, playing with their friends. And I think that kind of distinction — just like we protect national parks, or we protect, you know, certain fisheries and we don't kill the whales in those areas or something — we have to call out what's sacred
1:52:43
to us now.
1:52:44
Yeah, it's an excellent message. My problem is that I just don't know how well that message is going to be absorbed by the people that are already in the trance. I think it's so difficult for people to put things down — like I was telling you how difficult it is for me to tell my friends, don't read the comments, right? It's hard to have that kind of discipline. Because people do get bored, and
1:53:13
they get bored. Like, if you're waiting in line somewhere, you pull out your phone; you're at the doctor's office, you pull out your phone. Like—
1:53:21
Totally. I mean — and I do that, right? This is incredibly hard. Back in the day, when I was at Google, I tried to change Google from the inside for two years before leaving. What was it like there?
1:53:35
Please share your experiences, because when you said you tried to change it from the inside — what kind of resistance were you met with?
1:53:43
What was their reaction to these thoughts that you had about the unbelievable negative consequences
1:53:48
of— Well, this was in 2013, so we didn't know about all the negative consequences.
1:53:53
but you saw the writing on the wall at least some of it
1:53:55
Some of it, yeah. I mean, the notion that things were competing for attention, which would mean that they would need to compete to get more and more persuasive and hack more and more of our vulnerabilities, and that that would grow — that was the core insight. I didn't know that it would lead to polarization or conspiracy-theory-like recommendations, but I did know, you know, more addiction,
1:54:13
and kids having weaker
1:54:16
relationships. When did it occur to you? Like what were your initial feelings? I was
1:54:22
on a hiking trip in the Santa Cruz mountains with our co-founder, now, Aza Raskin. Funny enough, Aza's dad was Jef Raskin, who started the Macintosh project at Apple — I don't know if you know the history there, but he started the Macintosh project and actually came up with the word "humane" to describe the humane interface — and that's where our name and our work come from: his
1:54:43
his father's work. He and I were in the mountains in Santa Cruz, just experiencing nature, and we came back and realized, like, all of this stuff that we've built is just distracting us from the stuff that's really important. And so, coming back from that trip, I made the first Google deck that then spread virally throughout the company, saying: never before in history have 50 designers — you know, white, 20-to-35-year-old engineers who look like me — held the collective
1:55:13
psyche of humanity. And then that presentation was released, and about, you know, 10,000 people at Google saw it. It was actually the number one meme within the company — they have this internal thing inside of Google called Moma where people can post, like, gifs and memes about various topics — and it was the number one meme: hey, we need to talk about this at this week's TGIF, which is the, like, weekly thank-God-it's-Friday-type company meeting. It didn't get talked about, but I got emails from across the company saying we definitely need to do something about this.
1:55:44
It was just very hard to get momentum on it. And really, the key interfaces to change within Google are Chrome and Android, because those are the neutral portals through which you're then using apps and notifications and websites and all of that. Like, those are the kind of governments of the attention economy that Google runs.
1:56:02
And when you worked there, did you have to use Android, or was it part of the requirement to work there?
1:56:11
No — I mean, a lot of people had Android phones; I still used an i-
1:56:13
Phone. Was it an issue?
1:56:16
No, no — I mean, because they realized that they needed products to work on all the phones. I mean, if you worked directly on Android, then you would have to use an Android phone. But we tried to get, you know, some of those things, like the screen-time features that are now launched — you know, everyone now has on their phone the thing that shows you the number of hours, or whatever you set. It's on Android as well — it is, yeah — and actually that came, I think, as a result of this advocacy, and that's shipping on a billion phones, which shows you you can change this stuff, right? Like, that goes against their
1:56:43
social interest people spending less time on their phones. He's getting blessed
1:56:47
vacation does but it doesn't
1:56:49
work. We'll correct. It doesn't actually work as the thing. Yeah, and let's separate the intention in the fact that they did it for like labels
1:56:55
on cigarettes that tell you it's going to give you cancer. Like, by the time you're buying them, you're already hooked. Correct? I mean,
1:57:01
it's even worse than you'd imagine. It's like if every cigarette box had a little pencil inside so you could mark, with little streaks, the number of days in a row you haven't smoked, and you could mark each day. It's like, it's too late, right?
1:57:14
Yeah, it's just the wrong paradigm. The fundamental thing we have to change is the incentives and how money flows, because we want money flowing in the direction of these things helping us more. Like, let's create an example. Let's say you want to learn a musical instrument, and you go to YouTube to pick up ukulele or whatever, and you're seeing how to play the ukulele. From that point, in a system that was designed in a humane, sort of time-well-spent kind of way, it would really ask you, instead of saying here are
1:57:43
more videos that are going to suck you down a rabbit hole, it would be more oriented toward what you really need help with. Like, do you need to buy a ukulele? Here's a link to Amazon to get the ukulele. Are you looking for a ukulele teacher? Let me do a quick scan of your Facebook or Twitter to find out which of those people are ukulele teachers. Do you need instant tutoring? Because there's actually a service you've never heard of, called Skillshare or something like that, where you can get instant ukulele tutoring. If we're really designing these things to be about what would most help you next... you know, we're only as good as the
1:58:13
choices on life's menu. And right now the menu is: here's something else to addict you and keep you hooked, instead of: here's a next step that would actually be on the trajectory of helping people live their lives
1:58:23
better. And you'd have to incentivize the companies, because there's so much incentive in getting you addicted, because there's so much financial reward. What would be the financial reward they could have for getting you something that would be helpful for you, like lessons or this?
1:58:38
I mean, so one way it could work is, like, let's say people pay a monthly subscription of, like,
1:58:43
you know, 20 bucks a month or something. So it's never going to... I get you, but let's say you pay something, you put money into a
1:58:49
pot. But then we have the problem. The problem is that they cost money versus free. Like, there's a company, that still exists for now, that was trying to do the Netflix of podcasting. And they approached us, and they're like, we're just going to get all these people together, and people are going to pay to use your podcast. And I'm like, why would they do that when podcasts are free? Yeah, like, that's one of the reasons why podcasts work: because they're free.
1:59:13
Right. When things are free, they're attractive; it's easy. When things cost money, you have to have something that's extraordinary, like Netflix. And when you say the Netflix of podcasting, well, Netflix makes their own shows, right? They spend millions of dollars on special effects and all these different things, and they're really, like, these enormous projects. You're just talking about people talking shit, and you want
1:59:38
money, right? So the thing is, we have to actually deliver something that is totally, qualitatively better.
1:59:43
So it would also have to be someone like you, or someone who's really aware of the issues that we're dealing with, with addictions to social media, saying: this is the best possible alternative. Like, in this environment, yes, you are paying a certain amount of money per month, but maybe it could get factored into your cell phone bill, and maybe with this sort of an ecosystem, right, you're no longer being drawn in by your
2:00:13
addictions, and, you know, it's not playing for your attention span. It's rewarding you in a very
2:00:18
productive way. And imagine, Joe, if, like, 15% more of your time was just way better spent. Like, it was actually spent on the things you cared about; it actually helped improve your life. Yeah. Like, imagine when you use email, if it was truly designed... I mean, forget email; maybe most people don't really do that, because email isn't that popular. But whatever it is that's a huge time sink for you. For me, email's a huge one. You know, web browsing or whatever is a big one. Imagine that those
2:00:43
things were so much better designed that I actually wrote back to the right emails and mostly didn't think about the rest; that when I was spending time on whatever I was spending time on, more and more of my life was a life well lived and time well spent. That's the retrospective view.
2:00:59
I keep going back to Apple, because I think they're the only social media... excuse me, the only technology company that does have these ethics, that does sort of protect privacy. Have you thought about going to them? Have you? Well, I
2:01:10
mean, I think that they've
2:01:13
made great first steps, and they were the first, along with Google, to do that screen time management
2:01:20
stuff. But that was just barely scratching the surface.
2:01:23
Like baby, baby, baby steps. What we really need them to do is radically reimagine how those incentives and how the phone fundamentally work, so it's not just all these colorful icons. One of the problems is they do have a disincentive, which is that a lot of their revenue comes from gaming, and as they move more into Apple TV, competing with HBO and Hulu and Netflix and that whole thing where they
2:01:43
need subscriptions. Apple's revenue on devices and hardware is sort of maxing out, and where they're going to get their next bout of revenue to keep their stock price up is on these
2:01:52
subscriptions. I'm less concerned with those addictions. I'm less concerned with gaming addictions than I am with information addictions, because at least gaming is not fundamentally altering your view of the world,
2:02:02
right? It's not screwing up democracy. Yeah, and it's impossible to
2:02:04
disagree with that. And this is coming from a person that's had, like, legitimate video game addictions in the past. Yeah. But, like, my wife is addicted to Subway Surfers. Look,
2:02:13
have you ever noticed that game? It's a crazy game. It's like you're riding the top of subways, you're jumping around. It's really ridiculous, but it's fun to watch, like, whoa. I don't fuck with video games, but I watch it, and those games, at least, are enjoyable. There's something silly about it. Like, fuck, you start doing it again. When I see people getting angry about things on social media, I don't see the upside, right? I don't mind them making
2:02:43
a profit off games. There is an issue, though, with games that addict children. Like, you could spend money on Roblox, and you have all these different things you spend money on, and you wind up having these enormous bills. Yeah, you leave your kid with an iPad, you come back, you have a 500-dollar bill. Like, what did you do? This is an issue for sure. But at least it's not an issue in the sense that it's changing their view of the world, right?
2:03:13
And I feel like there's a way, and I keep going back to Apple, but for a company like Apple to rethink the way... You know, they already have a walled garden, right, with iMessage and FaceTime and all these different
2:03:27
components. They can totally build those things out. I mean, iMessage and iCloud could be the basis for some new neutral social media that isn't based on instant social approval. And...
2:03:36
yeah, that's right.
2:03:36
Yeah, they could make it easier to share information with small groups of friends and have that all synced. And even, you know, in the pre-COVID days,
2:03:43
I was thinking about Apple a lot. I think you're right, by the way, to really poke on them. I think they're the one company that's in a position to lead on this.
2:03:49
And they also have a history of thinking along those
2:03:52
lines. They had this feature that's kind of hidden now, Find My Friends, right? Now it's this thing called Find My, with it all buried together so you can find your devices and find your friends. But in a pre-COVID world, imagine they really built out the 'where are my friends right now' feature, making it easier to know when you're nearby someone so you can more easily get together in person. Right now, to the extent that
2:04:13
Facebook wants to bring people closer together, and again, this is pre-COVID, they don't want to incentivize lots and lots of Facebook events. They really care about groups that keep people posting online and looking at ads, because for the category of bringing people closer together, they want to do the online, screen-time-based version of that, as opposed to the offline. Apple, by contrast: if you had little iMessage groups of friends, you could say, hey, does everyone in this little group want to opt into being able to see where each other are, where we all are, on, say, weekdays
2:04:43
between 5:00 and 8:00 p.m., or something like that? So you could time-bound it and make it easier for serendipitous connection and availability to happen. That's hard to do; it's hard to design that. But there are things like that that Apple's in a position to do if it really took on that mantle. And I think as people get more and more skeptical of these other products, they're in a better and better position to do that. One of the antitrust issues is: do we want a world where our entire well-being as a society depends on what one massive corporation worth over a trillion dollars does or doesn't do? Right? Like, we want
2:05:13
more openness to try different things, and we're really at the behest of whether one or two companies, Apple or Google, do something more radical.
2:05:22
And there has to be some massive incentive for them to do something that's really going to change, yeah, the way we interface with these devices and the way we interface with social media. And I don't know what incentive exists that's more potent than financial
2:05:34
incentives. Well, and this is where, you know, the government comes in. In the same way that we want to transition long-term from a fossil-fuel-oriented economy to something that doesn't
2:05:43
do that, that changes those kinds of pollution levels: you know, we have a hugely emitting, society-ruining kind of business model in this attention-extractive paradigm, and we could, long-term, put sort of a progressive tax on it to transition to some other thing. The government could do that, right? That's not 'who do we censor?' It's 'how do we disincentivize these businesses and make them pay for the life-support systems of society that they ruin?' A good example of this, I think, is in Australia, where
2:06:13
I think it's now regulated that Google and Facebook have to pay the publishers who they're basically hollowing out. Because one of the effects we've not talked about is the way that Google and Facebook have hollowed out the fourth estate, journalism. I mean, journalism has turned into... local news websites can't make any money except by basically producing clickbait. So even to the extent that local newspapers exist, they only exist by basically the clickbait-ification of lower- and lower-paid workers who are just generating content farms, right? So,
2:06:43
maybe that's an example: if you force those companies to pay to revitalize the fourth estate, and to make sure we have a very sustainably funded fourth estate that doesn't have to produce this clickbait stuff... that's, you know, another direction.
2:06:57
Yeah, that's interesting, that they have to
2:07:00
pay.
2:07:02
I mean, these are the wealthiest companies in, like, the history of humanity. All right, so that's the thing: we shouldn't be cautious about how much they should have to pay. But we also don't want it to happen on the other end, right? You don't want to have a world where, you know, we have Roundup making a crazy amount of money from giving everybody cancer and lymphoma from, you know, the chemical glyphosate, and then they pay everybody on the other end after a billion-dollar lawsuit, but now everyone's got cancer. Let's actually do it in a way... we don't want a world where Facebook and
2:07:32
Google profit off of the erosion of our social fabric and then pay us
2:07:36
back. How do you quantify how much money they have to pay to journalism? Yeah, it seems like it's almost a form of socialism. Yeah. I mean, this is where, like,
2:07:47
the lead and IQ example is interesting, because they were able to disincentivize and tax the lead producers once they were able to produce some results on how much lead lowered the wage-earning potential of the entire population. I mean, like,
2:08:02
how much does this cost our society? We used to say free is the most expensive business model we've ever created, because we get, for free, the downgrading of our attention spans, our mental health, our kids, our ability to agree with each other, our capacity to do anything as a democracy. Like, yeah, we got all that for free. Wonderful. Obviously we get lots of benefits, and I want to acknowledge that, but that's just not sustainable. The real question... I mean, right now we
2:08:26
have huge existential problems. We have a global power competition going on; I think China just passed the GDP of the US, I believe. If we care about the U.S. having a future in which it can lead the world in some meaningful and enlightened way, we have to deal with this problem, and we have to have a world where digital democracy out-competes digital authoritarianism, which is the China model, and which right now builds more coherence and is more efficient, and doesn't
2:08:56
fall apart the way that our current system does. I think Taiwan and Estonia and countries like that, where they are doing digital democracies, are good examples that we can learn from, but we are behind right now.
2:09:07
Well, China also has a really fascinating situation with Huawei, where Google has banned Huawei, so you can't have Google applications on Huawei phones. So now Huawei is creating their own operating system, and they have their own ecosystem now that they're building up.
2:09:26
And it's, you know, weird that there's only a few different operating systems now. I mean, there's a very small number of people using Linux phones, and then you have a large number of people using Android and iPhones. And if China becomes the first to adopt their own operating system, then they have even more unchecked rules and regulations in regards to, like, the influence that they have over their people, with an operating system that they've
2:09:56
developed and they control, and who knows what kind of backdoors and spying. Tons, yeah.
2:10:05
It's weird. And when you see this... like, it feels so futile for me, on the outside looking in, looking at it. But you, you're working on this. How long do you anticipate this being a part of your life? What does it feel like to you?
2:10:30
I mean, it's not easy, right? The film ends with this question: do you think we're going to
2:10:37
get there? Yeah,
2:10:39
I just say we have to. I mean, if you care about this going well: I wake up every day and I ask, what will it take for this whole thing to go well? And how do we orient each of our choices, as much as possible, toward this going well? And we have a whole bunch of problems. I do look a lot at the environmental issues, the permafrost methane bombs, like, the timelines that we
2:11:00
have to deal with certain problems in are crunching, and we also have certain dangerous exponential technologies that are emerging, the decentralization of CRISPR, and, like, there's a lot of existential threats. I hang out a lot with the sort of existential-threats community. That must be a lot of fun. There's a lot of psychological problems in that community, actually:
2:11:18
a lot of depression. I can only imagine. The suicide as
2:11:21
well. It's, you know, it's hard. But I think we each have a responsibility, when we see this,
2:11:30
to ask: what will it take for this to go well? And I will say that really seeing the film impact people the way that it has... I used to feel like, oh my God, how are we ever going to do this? No one cares; none of the people know. At the very least, we now have about 40 to 50 million people who are at least introduced to the problem. The question is how do we harness them into a collective movement, and that's what we're trying to do next. I mean, I'll say also that these issues get
2:12:00
more and more weird over time. My co-founder Aza Raskin will say that it's making reality more and more virtual over time, because we haven't talked about how, as technology advances at hacking our weaknesses, we start to prefer it over the real thing. For example, there's a recent VC-funded company that raised... it was worth over a hundred twenty-five million dollars, and what they make are virtual influencers. So these are, like, virtual people, virtual video, that
2:12:35
are more entertaining, more interesting, and that fans like more than real
2:12:35
people. Oh boy,
2:12:37
And it's kind of related to the deepfake world, right? Where, like, people prefer this to the real thing. And Sherry Turkle, you know, who's been working at MIT, wrote the books Reclaiming Conversation and Alone Together. She's been talking about this forever: that over time, humans will prefer connection to robots and bots and the computer-generated thing more than the real thing. Think about AI-generated music. It'll start to
2:13:00
hack our taste buds and give us exactly that thing we're looking for, better than we know ourselves. Just like YouTube can give us the perfect next video, where actually every bone in our body will say, actually, I kind of do want to watch that, even though it's a machine pointed at my brain calculating the next thing. There's an example from Microsoft: they built this chatbot called Xiaoice (I can't pronounce it), and after nine weeks people preferred that chatbot to their real friends, and twenty-
2:13:22
five... or 10 to
2:13:24
25% of their users actually said 'I love you' to the chatbot. Oh boy. And more than that, there were several who actually said
2:13:30
it convinced them not to commit suicide, to have this relationship with this chatbot.
2:13:34
So it's Her. It's Her, the movie. Exactly. Which is... so all these things are the same,
2:13:38
right? We're veering into a direction where technology is so good at meeting these underlying Paleolithic emotions that we have. The way out of it is we have to see that this is what's going on. We have to see it and reckon with ourselves: this is how I work. I have this negativity bias. If I get 99 positive comments and one negative,
2:14:00
my mind is going to go to the negative.
2:14:01
I can see you in the future wearing an overcoat. You are literally Laurence Fishburne in The Matrix, trying to tell people to wake up.
2:14:11
Well, there's a line in The Social Dilemma where I say: how do you wake up from the Matrix if you don't know you're in the Matrix?
2:14:18
That is the issue, right? And
2:14:20
even in The Matrix, we at least had a shared Matrix. The problem now is that each of us has our own Matrix. That's the real kicker. I
2:14:27
struggle with the idea that this is all inevitable, because
2:14:30
this is a natural course of progression with technology, and it's sort of figuring out the best way to have us, with as little resistance, embed ourselves into its system. And our idea is that what we are, with our emotions and with our biological issues, is just how life is and how life always should be. But this is just all we've ever known.
2:15:00
It's all we've ever known, but it's not, like, signed right into the laws of physics that social media has to exist for humanity, right? Again, the environmental movement is a really interesting example, because we passed all sorts of laws. We got rid of lead. We've changed some of our pesticides. You know, we're slow on some of these things, and the corporate interest and asymmetric power of large corporations... it shows that when you have asymmetric power and predatory systems that cause harm, they're not going to terminate themselves. They
2:15:30
have to be bound in by the public, by culture, by the state. And we just have to point to the examples where we've done that. And in this case, I think the problem is how much of our stock market is built on the back of, like, five companies generating a huge amount of wealth. So this is similar... I don't mean to overstretch this example, but there's a great book by Adam Hochschild called Bury the Chains, which is about the British abolition of slavery, in which he
2:16:00
talks about how, for the British Empire... like, if you think about it, we collectively woke up and said, this is an abhorrent practice that has to end. But at that time, in the 1700s and 1800s in Britain, slavery was what powered the entire economy. It was free labor for, you know, a huge percentage of the economy. So if you say, we can't do this anymore, we have to stop this: how do you decouple, when your entire economy is based on slavery? Right? And the book is actually inspiring, because it tracks
2:16:30
a collective movement that networked all these different groups: the Quakers in the US, the people testifying before Parliament, the former slaves who gave first-hand accounts, the graphics and art for all the people who had never seen what it looked like on a slave ship. And so, by making the invisible visceral and showing just how important this stuff was, over a period of about 60 to 70 years, the British Empire had to drop their GDP by two percent every year for 60 years, and was willing to do that to get off of slavery.
2:17:00
Now, I'm not making a moral equivalence, and I want to be really clear, for anyone who likes taking things out of context, but just that it's possible for us to do something that isn't just in the interest of economic growth. And I think that's the real challenge. It's actually something that should be on the agenda: one of the major tensions is economic growth, you know, being in conflict with dealing with many of our problems, whether it's some of the environmental issues or some of the technology issues we're talking about right
2:17:27
now.
2:17:27
Artificial intelligence is something that people are terrified of as an existential threat. They think of it as: one day you're going to turn something on, and it's going to be sentient, and it's going to be able to create other forms of artificial intelligence that are exponentially more powerful than the one we created, and we will have unleashed this beast that we cannot control. It already has. Yeah, that's my concern. My concern is that this is a slow acceptance of
2:17:57
drowning. It's like a slow... okay, I'm only up to my knees. It's fine. It's just waist-high.
2:18:05
It's like the frog in boiling water, right?
2:18:07
Exactly. Exactly. It seems like this is, like...
2:18:11
humans have to fight back to reclaim our autonomy and free will from the machines. I mean, one clear... Okay, Neo,
2:18:20
is what it is. It's very much The Matrix.
2:18:22
One of my favorite lines is actually when the Oracle says to Neo, 'Oh, and don't worry about the vase,' and he says, 'What vase?'
2:18:27
And he knocks it over, and she says, 'That vase.' And so it's like she's the AI who sees so many moves ahead on the chessboard that she can say something which will cause him to do the thing that verifies the thing she predicted would happen. Yeah. That's what AI is doing now, except it's pointed at our nervous system, figuring out the perfect thing to dangle in front of our dopamine system and get the thing to happen, which, instead of knocking off the vase, is being outraged at the other political side and being fully certain that you're right, even though it's just a machine that's calculating shit that's going to make you, you know, do the thing.
2:18:57
When you're
2:18:57
thinking about this, how much time do you spend thinking about simulation theory? The simulation, yeah: the idea that, if not currently, then one day there will be a simulation that's indiscernible from regular reality. And it seems we're on that path. I don't know if you mess around with VR at all.
2:19:13
Well, this is the point about, you know, the virtual chatbots out-competing the real thing, exactly through the technology. I mean, that's what's happening: reality is getting more and more virtual, right? Because we interact with a virtual news system, this sort
2:19:27
of clickbait-economy outrage machine, that's already a virtual political environment, which then translates into real-world action, which then becomes real, and that's the weird feedback
2:19:35
loop. Go back to 1990-whatever, when the internet became mainstream, or at least started becoming more mainstream, and the small amount of time that it took, the 20-plus years, to get to where we are now. And then think: what about the virtual world, once it becomes something that has the same sort of
2:19:57
rate of growth that the internet has experienced, or that we have experienced through the internet? I mean, we're looking at, like, 20 years from now being unrecognizable. Yeah. I mean, it almost seems like that is what life does, the same way bees create beehives. You know, a caterpillar doesn't know what the fuck's going on when it gets into that cocoon, but it's becoming a butterfly. Yeah, we seem to be a thing that creates newer and better objects,
2:20:28
correct, more effective. But
2:20:29
we have to realize AI is not conscious, and won't be conscious the way we are, and so many people think that it
2:20:36
is. Is consciousness essential? I think so, to us.
2:20:40
I don't know. Essential in the sense that we're the only ones who have it? No, I don't
2:20:43
know that. There might be more, yeah, things that have consciousness. But is it
2:20:48
essential? I mean, to the extent that choice exists, it would exist through some kind of consciousness. So is
2:20:55
choice essential?
2:20:57
It's essential to us as we know it, like, life as we know it. But my worry is that we're inessential. That we're, like... we're thinking now like single-celled organisms being like, hey, I don't want to link up with a bunch of other cells and become an organism that can walk. I like being a single-celled organism. This is a lot of fun.
2:21:16
I mean, I hear you saying, you know, are we a bootloader for the AI that then runs the ones and zeros? Yeah, that perspective. I mean, I think this is a really dangerous way to think. I mean, we have to... yes.
2:21:30
Aren't we then the teachers for it? Yeah. I mean, our...
2:21:30
But what if the next version of... But the next version is
2:21:33
being run by machines that have no values, that don't care, that don't have choice, and are just maximizing for things that were programmed in by our little miniature brains anyway. But they
2:21:41
don't cry, they don't commit suicide. But
2:21:44
then consciousness and life die. That could be the future. I think this is the last chance to try to snap out of that. And
2:21:51
is it important, in the eyes of the universe, that we do that? I don't know. It feels important. How does it? All of it feels important,
2:21:57
but I'm not the mama monkey, you know? The monkey's like, God, I'm staying in this tree, man. You guys are out of your fucking minds. I mean, this is the weird paradox of being
2:22:05
human: again, we have these lower-level emotions. We care about social approval; we can't not care. At the same time, like I said, there's this weird proposition here. We're the only species that, if this were to happen to us, would have the self-awareness to even know that it was happening. Right? Like, we can sit here in this interview and conceptualize that this thing has happened to us, right? That we have
2:22:27
built this Matrix, this external object, which has, like, AI and supercomputers and voodoo-doll versions of each of us, and it has perfectly figured out how to predictably move each of us in this Matrix.
2:22:37
Let me propose this: we are what we are now, human beings, Homo sapiens, in 2020. We are this thing that, if you believe in evolution (I'm pretty sure you do), has evolved over the course of millions of years to become who we are right now. Should we stop right here? Are we done? No, right? We should keep
2:22:56
evolving. What does that look
2:22:59
like?
2:23:03
What does it look like if we go ahead? Just forget about social media. What would you like us to be in a thousand years, or a hundred thousand years, or five hundred thousand years? You certainly wouldn't want us to be what we are right now, right? No one would
2:23:18
No. I mean, I think this is what visions of Star Trek and things like that were driving at, right? Like, hey, let's imagine humans do make it, and we become the most enlightened we can be, and we actually somehow make peace with these other, you know, alien tribes and
2:23:41
figure out, you know, space travel and all that. I mean, actually, a good heuristic that I think people can ask is: on an enlightened planet, where we did figure this out, what would that have looked
2:23:41
like? Isn't it always weird that in those movies, people are just people, but they're in some weird future? They haven't really changed that much.
2:23:51
Right. I mean, which is to say that the fundamental way that we work is just unchanging. But there are such things as more wise societies, more sustainable societies, more peaceful, more harmonious societies. But, you know...
2:24:04
Ultimately, biologically, we have to evolve as well. But our version of, like, the best version... that's probably the grey aliens, right? Maybe so. I mean, the ultimate
2:24:14
future. I mean, we're going to get into gene editing and becoming more perfect, perfect in the sense of, you know... But
2:24:21
we are going to start optimizing. And for what? What are the outcomes that we value? I think the question is, how do we actually come up with brand-new values that are wiser than anything we've thought of before, that are actually able to transcend the win-lose games that lead to omni-lose-lose, where everyone loses, if we keep playing the win-lose game at greater and greater
2:24:40
scales. Like you, I have a vested interest in the biological existence of human beings. I think people are pretty cool. I love being around them. I enjoyed talking to you today. But my fear is that we
2:24:51
are... we're a Model T, right? You know? And there's no sense in making those fucking things anymore. The brakes are terrible, they smell like shit when you drive them, they don't go very fast. We need a better version.
2:25:07
You know, the funny thing is, there's some quote by someone, I wish I could remember it, that's something about how much would be solved if we were at peace with ourselves. Like, if we were able to just be okay with nothing,
2:25:21
like just being okay with living and breathing. Hmm. I don't mean to be, you know, playing the woo new-age card. I just genuinely mean: how much of our lives is just running away from, you know, anxiety and discomfort and aversion?
2:25:34
It is, but you know, in that sense, some of the most satisfied and happy people are people that live a subsistence living, have these subsistence existences in the middle of nowhere, just chopping trees and catching
2:25:46
fish. Right, and more connection, probably. Yeah, more authentic than something else, and I think
2:25:51
Probably resonates
2:25:52
biologically, too, because the history of human beings living like that is just so much longer and greater.
2:25:58
Totally. And I think that those are more sustainable
2:26:00
societies. "We can never obtain peace in the outer world until we make peace with ourselves." Dalai Lama. Yeah, but I don't buy that guy. You know, that guy, he's a, he's an interesting case.
2:26:12
I was thinking it was a different, slightly different quote. But actually, there's one quote that I would love to, if it's
2:26:17
one of the reasons why I don't buy him. He's just chosen, they just chose that guy.
2:26:22
Yeah, also he doesn't have sex. How, yeah, how much can you be enjoying life if that's not a part of the mix, bro? You
2:26:30
wear the same outfit every day. Get the fuck out of here with your orange robes.
2:26:34
Can I, there's a, there's a really important quote that I think would really be good to share. It's from the book, if you read Amusing Ourselves to Death by Neil Postman? No. From 1982? No. So, especially when we get into big tech, when we talk about censorship a lot, and we talk about
2:26:51
Orwell. He has this really wonderful opening to this book, which was written in 1982. It literally predicts everything that's going on now. I frankly think that I'm adding nothing, and it's really just, Neil Postman called it all in 1982. He had this great opening. It says:
2:27:10
Let's see. "We were all looking out for, you know, 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held." This is like, we made it through the 1984 gap. "Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares. But we had forgotten that alongside Orwell's dark vision, there was another, slightly older, slightly less well-known, equally chilling vision: Aldous Huxley's Brave New World.
2:27:40
Contrary to common belief, even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will become overwhelmed, overcome by an externally imposed oppression. But in Huxley's vision, no Big Brother is required to deprive people of their autonomy, maturity, or history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. What Orwell feared were those who would ban books. What Huxley feared was that there would
2:28:10
be no reason to ban a book, for there would be no one who wanted to read one.
2:28:14
Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy-porgy, and the centrifugal bumblepuppy." I don't know what that means.
2:28:44
"As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny failed to take into account man's almost infinite appetite for distractions. Lastly, in 1984, Orwell added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we fear will ruin us. Huxley feared that what we desire will ruin us." Oh, shit.
2:29:14
Isn't that good?
2:29:15
That's, that's the best way to end this goddamn thing. But again,
2:29:21
if we can become aware that this is what's happened. We're the only species with the capacity to see that our own psychology, our own emotions, our own Paleolithic evolutionary system has been hijacked. I like that you're
2:29:34
optimistic. We have to be, if we, if we want to remain people. We
2:29:39
have to. Optimism is probably the only way to live in a meat-suit body and keep going. Otherwise...
2:29:44
It certainly helps. Yeah, it certainly helps.
2:29:47
Thank you very much for being here, man. I really enjoyed this, even though I'm really depressed now. I really don't want you to be depressed. I
2:29:53
really hope people, you know... I'm kidding. We, we really want to build a movement, and, you know, we're just, I wish I could give people more resources. We do have a podcast called Your Undivided Attention, and we're trying to build a movement at HumaneTech.com. Well, listen,
2:30:06
any new revelations or new developments that you have, I'd be more than happy to have you on again and we'll talk about them. And send them to me and I'll put them on social media, and whatever you need. Awesome. I'm here to help. Awesome,
2:30:16
man. Great, great to be
2:30:17
of assistance. Yeah, Tristan and us together, Team Humanity, we're in this together. Thank you, Tristan. Really, really appreciate it. Goodbye, everybody.
2:30:26
Thank you, my friends, for tuning into the show, and thank you to Quip. Start getting rewards for brushing your teeth today and go to getquip.com/Rogan right now to get your first refill for free. That's get your first refill for free at getquip.com/Rogan, spelled G-E-T-Q-U-I-P.com/Rogan. Quip: better oral health made simple and rewarding. We're also brought to you by Vuori.
2:30:56
Vuori is an investment in your happiness, and for listeners of this podcast they are offering 20% off your first purchase. Get yourself some of the most comfortable and versatile clothing on the planet at vuori.com/Rogan. That's V-U-O-R-I.com/Rogan. Not only will you receive 20% off your first purchase, but you will enjoy free shipping on any US orders over $75 and free returns. So go to
2:31:26
vuori.com/Rogan and discover the versatility of Vuori clothing. We're also brought to you by Joovv and their fantastic red light therapy. If you are interested, for a limited time Joovv wants to hook you up with an exclusive discount on your first order. Just go to joovv.com/Joe and apply my code JOE to your qualifying order. That's J-O-O-V-V.com/Joe. Exclusions apply,
2:31:56
limited time only. We're also brought to you by Liquid I.V., and Liquid I.V.'s new Hydration Multiplier + Immune Support is available at Walmart, or you can order online and get 25% off when you go to liquidIV.com and use the code JOEROGAN, all one word, at checkout. That's 25% off anything you order when you use the promo code JOEROGAN at liquidIV.com. Get better hydration today at liquidIV.com and use the promo code Joe
2:32:26
Rogan. All right, my friends, thank you very much for tuning in. Much love to you all. Bye-bye.