The Ezra Klein Show: Best of: Jia Tolentino on what happens when life is an endless performance https://www.vox.com/ezra-klein-show-podcast
Category: attention
-
So Far
I’ve deleted all my infinite-scrolling apps (FB, Instagram, LinkedIn, Twitter). I’ve stopped reading online news and only read a printed Sunday paper. I’ve paid for products to remove ads (Crunchyroll, YouTube, Pandora).
Call To Action To Take Back Attention & Time
Today, I listened to Tristan Harris, founder of the Center for Humane Technology, talk about being intentional in guarding our attention and time from social media apps. He went to school with the founders of Instagram, and he described how they were masters of stealing attention, designing the user interface in ways that hack our brains into coming back to the Instagram app over and over again.
After thinking about it a little bit, I considered buying a flip phone to use on weekends in addition to my Android Nexus 6P. However, instead of buying another gadget, which would break my vow not to buy anything for one year unless it’s perishable or a book, I’ve decided to do the following life hack on my Android phone.
Creating Clean Workspaces For Every Context
On Android, you can create multiple users, each of whom gets a completely clean home screen and a fresh copy of Android. From that clean slate, I can install only the apps that let me single-task. This is similar to creating new user accounts on your laptop.
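For anyone who would rather script this than tap through Settings, here is a minimal sketch of the same idea driven from a laptop over adb (the context names and the small Python wrapper are my own illustration, not part of the original setup; it assumes adb is installed, USB debugging is enabled, and the phone supports multiple users).

```python
# Sketch: create one Android user per context using adb's documented
# "pm create-user" and "am switch-user" commands. Illustrative only.
import subprocess

# Hypothetical context names; use whatever contexts work for you.
CONTEXTS = ["Podcast", "Work", "Learning", "Personal"]

def adb(*args: str) -> str:
    """Run an adb shell command on the connected phone and return its output."""
    result = subprocess.run(["adb", "shell", *args],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

for name in CONTEXTS:
    # Prints something like "Success: created user id 10" for each context.
    print(adb("pm", "create-user", name))

# Show all users with their numeric ids, then switch by id when needed, e.g.:
print(adb("pm", "list", "users"))
# adb("am", "switch-user", "10")  # replace 10 with an id from the list above
```

Each user still has to be set up once (sign in, install only that context’s apps), but after that, switching users is the deliberate, slightly slow step that the contexts are designed around.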
Podcast & Deep Thought Context
In this Context I have only
- Pandora – Tuned to Piano Dreams radio for me to concentrate
- BeyondPod – Listen to podcasts, with zero interruptions and no incentive to browse or read email
- Google Maps – I could be driving and listening to the Podcast
- Chariot – I may need to look up my ride to work
- Brave Browser – To replace Chrome, since Brave removes the ads Chrome would show
Nothing from Gmail, work email, Slack or SMS is interrupting my flow.
My activities
- Reading
- Listening to podcasts
- Driving
- Reading a book when I do not want interruptions
Physically In The Office Working Context
In this Context I have only
- Microsoft Outlook – work email
- Slack – work chat
- SMS – for my boss to text me for urgent issues
- Okta verify – to get into the work VPN
My activities
- During work hours on weekdays, 8am to 5pm
Nothing from Gmail, YouTube or any personal apps is interrupting my flow.
Learning Context
In this Context I have only
- Coursera
- EdX
- TED
- Udemy
- Kindle
My activities
- Learning from video training
All notifications turned off. No email, nothing.
Personal Context
This is the messiest one, with everything else: personal Gmail, WeChat, banking apps, commuting apps and personal Slack channels. This is only the first week that I’m trying this, so I may create more contexts as they make themselves known to me.
Why?
These contexts make my intentions known and clear to my brain. Each context is meant to signal what my primary purpose is. Each completely controlled context forces me to single-task: finish that portion, then move to the next context intentionally.
The contexts are discrete from each other. For example, there is no multitasking when I’m listening to a podcast; I cannot email at the same time. As I physically switch from one context to another, my brain also switches contexts and gets ready for the next focus area. Making the context switch harder and longer will hopefully signal and train my brain to focus on the single task at hand.
What is your hack to take back attention and time?
-
I have recently started a media detox program: no online news, no online blogs, and no social networks (Twitter, LinkedIn, FB, Instagram). Instead I am focusing on listening to interview-based podcasts and reading physical books.
During this journey of reassessing how I spend my attention and time, I also started powering off my cell phone after I get home until I wake up the next morning.
This is the beginning of a long journey towards changing my life habits. During this discovery phase, I have stumbled upon (virtually) several people that have influenced me deeply.
One such person is Tim Ferriss. Through his books (Tribe of Mentors, Tools of Titans) and podcasts (The Tim Ferriss Show, Tribe of Mentors), he has exposed me to the life stories of authors, athletes and thought leaders. Most important of all, Tim Ferriss has exposed me to the books that have influenced the people he interviews in his books and podcasts. For example, Maria Popova is a writer who reads, thinks and writes at brainpickings.org. She reads a lot of books and spends hundreds of hours a month researching and writing. That is her full-time ‘job’, if you can call it that.
Another interview-based podcast is Design Matters with Debbie Millman. She started out many years ago interviewing designers and artists, but has since expanded her interviews to cover bloggers, authors and other passionate ‘humans’ (I like the word). Today I discovered the books of Anand Giridharadas, author of ‘The True American’ and ‘India Calling’. He says he accidentally came across Trumpism in 2011, and that our nation is now torn between those who have ruled America for the last 400 years and the new minority majority. The next 40 years will be full of tension as the nation readjusts to the power shift. For people who want change to happen sooner: be patient. In the arc of history, we are already changing very rapidly.
Reading: I read books. I get more recommendations for more books. I read more books.
Thinking: I listen to Piano Dreamer radio on Pandora, and I think.
Writing: I write blog posts inspired directly or indirectly by the books I read.
To those who have questioned my habits: yes, I do watch videos. I watch BWF badminton videos. I watch the shows that my family wants to watch with me. That’s it.
Will you join this detox program and get back your attention span?
-
-
Taking Back Control Of My Privacy
- Goodbye Infinite Scroll Media: LinkedIn App
- Goodbye Twitter @tonytam
- Friends Don’t Let Friends’ Data Be Stolen #deleteAllFacebook (Instagram, WhatsApp)
- Life Strategy #1: Why I Am Only Reading Long Articles and Books
Google Activities
This is the hardest one yet. I love the convenience of Google Maps, suggestions based on my browsing history, Google Now voice control and Gmail. I have 7 Gmail accounts for different purposes.
Today I went to Google My Activity at https://myactivity.google.com/myactivity, turned off all activity tracking, and deleted all activity already saved on Google, for all time.
- Also remember to delete your Google Maps timeline https://www.google.com/maps/timeline
Plan Going Forward
I plan to keep all my activity private until I make a decision to trade off privacy for convenience. Fingers crossed that this will work!
-
Recently I have made choices about how I spend my time, and this has helped me decide what I say YES to and, more importantly, what I say NO to.
Optimize for time over money
If I have to choose between taking Muni or Lyft to work, I choose Lyft. Muni takes 45 minutes for the 4-mile trip to downtown San Francisco. Lyft costs more but gets me there in 20 minutes. I pull out my laptop and write a blog post while I’m in the Lyft.
Optimize for attention
I install a free ad blocker and donate $15 to it in order to get rid of ads on webpages. I pay for ad-free Hulu, ad-free YouTube and ad-free Crunchyroll.
I stop reading news and just read books and the Sunday newspaper delivered to my home.
When I write and read, I listen to Piano Dreamers radio rather than music with lyrics, because I can only single-task; no one can multitask!
Optimize for family
My family comes before almost everything except my own health. Spending time with my daughter, my wife & my parents always bubbles to the top.
Optimize for deeper, meaningful relationships
I only have 2 really close friendships; that is all I have time for. And even those I don’t spend enough time nurturing.
I spend time mentoring several women at work, and I try to spend meaningful time with them, helping them with the advice they are seeking.
Optimize for long term
My time and interactions have a long-term focus. Long term means 10 or 15 years, so I optimize for long-term gains over short-term wins.
Optimize for scope of impact
I spend time having a deep impact on a single person, but I also optimize for larger impact. For example, I may spend hours mentoring one person, but I might then distill that work into a blog like impactfulengineer.org to help hundreds later.
Optimize for no regret
I say YES to anything that I would otherwise regret not doing when I am dying on my deathbed.
-
Taking Back Attention
I’m going on a detox program with media
- Goodbye Infinite Scroll Media: LinkedIn App
- Goodbye Twitter @tonytam
- Friends Don’t Let Friends’ Data Be Stolen #deleteAllFacebook (Instagram, WhatsApp)
Thank you, brain trust member Seth Godin.
After I had done my detox, I found this gem in my email:
Your laptop and your phone work the same way. The reviews and the comments and the breaking news and the texts that you read are all coming directly into the place you live. If they’re not making things better, why let them in?
- Seth Godin
-
I have removed Facebook.
I have removed Twitter.
Feb 8th, 2018 – Today I removed the LinkedIn app from my phone. It was a difficult decision, because I had been using the LinkedIn app to catch up on posts from my professional connections as well as to message people on LinkedIn.
Why?
- I’m trying to read more long-form content. LinkedIn posts have trained me to thumb-swipe and scroll continuously and to bookmark articles that I want to read later. I don’t want to be trained anymore by infinitely scrolling apps.
- The 2nd post on LinkedIn is *always* an advertisement, which I cannot block on my mobile phone (though I do block it in Chrome with the AdBlock extension). I already pay to get rid of ads everywhere I can; LinkedIn does not offer that option.
What’s Next?
- I can still extract value from LinkedIn by connecting with my professional network for ImpactfulEngineer.org interviews. I will only use LinkedIn.com in my Chrome browser and observe how I use the site, keeping in mind that I optimize for time, attention and deep connections. See How I Optimize.
-
On March 27th, 2018, I’ve decided to remove all my accounts on the following Facebook-owned apps: Facebook, Instagram and WhatsApp.
- Facebook.com – This is easy because I only have 1 connection.
- Instagram – This is a little harder because I use the message feature to talk to 2 people on iOS devices. So now we have to figure out how to chat between iOS and Android without installing yet another messaging app.
- WhatsApp – This is the most difficult but I decided to bite the bullet and move all 95 members of my badminton social group to Slack.
The people in my social group asked why I am making this move, given that we really don’t have privacy anymore.
Ad Driven Business Models Have The Wrong Incentives
Their incentive is to mine our data and the people we are connected to, in order to grow their reach and deepen their understanding of our behaviors. I’m moving to Slack because their business model is to build a delightful product that we will pay for, not to sell ads. These 2 business models drive very different incentives for the employees who work there.
It’s Time, We Can Do It, #deleteFacebook
- My friends say they have been on Facebook for a long time now and their privacy is already lost. My argument is that this is like quitting cigarettes: just because you have been smoking for 10 years doesn’t mean you should continue.
Friends Don’t Let Friends’ Data Be Stolen
- Like the campaign “Friends Don’t Let Friends Drive Drunk”. Being on Facebook doesn’t just expose your data to advertisers. The ‘friends’ connected to you are also at risk of having their data scraped without knowing it. Even if you protect your own data, if you allow any app to connect to your Facebook account, your friends’ data will be exposed as well. See how it was done to 50 million users.
I Know It Is Hard
- I know it’s hard to change your habits. I know the FOMO is hard. I know people will ask you to come back to Facebook, Instagram, WhatsApp.
- I am on the other side now, I know it’s a better place.
- I also know that there are naysayers
- Tony, you have Google, isn’t that the same thing?
- Yes, Google knows a lot about me; I use their phone. But I’m not exposing my friends’ privacy. Google doesn’t have the same type of social network.
- Just because I’m using Google doesn’t mean I shouldn’t delete Facebook.
- Tony, you have no friends on Facebook to connect to, that’s why it’s so easy.
- The friends I want to talk with, I see them face to face or I will call them on the phone. Or I will create a private chat room focused on talking to people I have common interests with.
- 500+ friends on Facebook do not need to see me spam them.
- If they cared, they can always follow me on
- sfbadminton.org – I play badminton weekly with 95+ people
- impactfulengineer.org – I interview people and help engineers
- tonytam.org – I blog for my daughter and my future self
- Oh Tony, you are just weird!
- You can be weird, just like me! Why do you want to be like the other 2 billion Facebook users?
If you need support, ping me.
-
Recently there has been news coverage of #deleteFacebook, #metoo, #marchforourlives and #neveragain. You can track interest in them over time at Google Trends.
I wanted to know the relative traction each of these movements is actually getting. It looks like #deleteFacebook is gaining traction; check back in 7 days to see if the internet meme continues.
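If you would rather pull that comparison programmatically than eyeball the Google Trends page, here is a minimal sketch using pytrends, an unofficial third-party client for Google Trends (not an official Google API); the hashtags are the ones mentioned above.

```python
# Sketch: compare relative search interest for the four hashtags over the
# last three months using the unofficial pytrends library (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
terms = ["#deleteFacebook", "#metoo", "#marchforourlives", "#neveragain"]
pytrends.build_payload(terms, timeframe="today 3-m")

interest = pytrends.interest_over_time()  # pandas DataFrame, one column per term
print(interest.tail(7))                   # roughly the last week of relative interest (0-100)
```

Checking back in 7 days then just means re-running the script and comparing the tail of the series.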
While I did #deleteFacebook a few years ago, I got back on recently just to connect with one person, but now I’m deleting it yet again.
#deleteInstagram https://help.instagram.com/370452623149242
This page is http://bit.ly/delAllFacebook
- Facebook.com – DONE March 27th
- Instagram – DONE March 27th
- WhatsApp – I am moving my 90+ badminton social group to Slack! Done by April 30th.
-
Life Strategy #1
I optimize for time, attention and deep thought (see How I Optimize). This is part of what I will say YES to, and it will guide what I say NO to when people request my time.
Starting in March 2018, I have disconnected from reading news as a daily habit. I will read news (nytimes, Twitter) only once a week. If something is important, people will be talking about it, and it will be in my weekly newspaper.
I am replacing that hour a day of reading news with:
- Listening to podcasts
- Reading books like ‘Tell Me More’
- Planning my long term passion projects like Impactful Engineer
- Learning how to teach my teenage daughter to think through decisions, instead of telling her what to do or what not to do.
Maxwell Anderson writes about this on Medium:
There can be no real thinking in News reports because explaining takes time (i.e., space). So News is made up of statements rather than arguments, which has a serious effect on our minds. When News constitutes almost all of our reading, we fall into the habit of thinking that opinions are the same as thoughts. The News alludes to a debate but only shows us a clash of opinions. As a result, we forget how to carry on a debate, and fall back on polls.
-
http://calnewport.com/blog/2018/02/09/facebooks-desperate-smoke-screen/
The Smoke Screen
In my opinion, the first problem — the engineered addiction — is the more pressing issue surrounding social media. These services relentlessly sap time and attention from peoples’ personal and professional lives that could be directed toward more meaningful and productive pursuits, and instead package it for resale to advertisers so the value can be crystalized for a small number of major investors.
-
Facebook Doesn’t Like What It Sees When It Looks in the Mirror https://nyti.ms/2FLVBei
Turns out, an enlightened, socially engaged Facebook has a similar outlook as the amoral, audience-seeking Facebook. Each sees connecting online as key to the good life.
-
Platforms such as Facebook, Google and YouTube are persuasion architectures; even the engineers who created them do not know how the machine-learning algorithms ultimately target audiences with content. These platforms show us content on our private screens, targeted specifically at us as individuals. They know what our weaknesses are and move us in subtle but powerful ways. Our attention and time are their product!
As a technologist, I personally will not
- Work for a company whose primary business model is selling ads, because the incentive is in the wrong place. People working at these companies may have the best of intentions, but the business incentives will ultimately drive the wrong behaviors internally.
- Use a product where the content is sorted by machine learning and I am not allowed to sort it in other ways. I do not want an endlessly scrolling news feed.
- Use a product where I cannot pay to turn off advertisements. I am not your product; my time and attention are not your product.
Watch this TED talk:
Zeynep Tufekci: We’re building a dystopia just to make people click on ads
https://go.ted.com/CX4k
Here is the entire transcript:
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that’s a distant threat. Or, we fret about digital surveillance with metaphors from the past. “1984,” George Orwell’s “1984,” it’s hitting the bestseller lists again. It’s a great book, but it’s not the correct dystopia for the 21st century. What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It’s not. It’s a jump in category. It’s a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, “With prodigious potential comes prodigious risk.”
Now let’s look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We’ve all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they’re still following you around. We’re kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, “You know what? These things don’t work.” Except, online, the digital technologies are not just ads. Now, to understand that, let’s think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there’s candy and gum at the eye level of kids? That’s designed to make them whine at their parents just as the parents are about to sort of check out. Now, that’s a persuasion architecture. It’s not nice, but it kind of works. That’s why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it’s the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone’s phone private screen, so it’s not visible to us. And that’s different. And that’s just one of the basic things that artificial intelligence can do.
Now, let’s take an example. Let’s say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That’s what you would do in the past.
With big data and machine learning, that’s not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.
So what happens then is, by churning through all that data, these machine-learning algorithms — that’s why they’re called learning algorithms — they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they’re presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine. You’re thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn’t that. The problem is, we no longer really understand how these complex algorithms work. We don’t understand how they’re doing this categorization. It’s giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it’s operating any more than you’d know what I was thinking right now if you were shown a cross section of my brain. It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand.
So let’s push that Vegas example a bit. What if the system that we do not understand was picking up that it’s easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you’d have no clue that’s what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, “That’s why I couldn’t publish it.” I was like, “Couldn’t publish what?” He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.
Do you ever go on YouTube meaning to watch one video and an hour later you’ve watched 27? You know how YouTube has this column on the right that says, “Up next” and it autoplays something? It’s an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It’s not a human editor. It’s what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you’re interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn’t.
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.
Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube.
So what’s going on? Now, YouTube’s algorithm is proprietary, but here’s what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads. Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn’t even expensive. The ProPublica reporter spent about 30 dollars to target this category.
So last year, Donald Trump’s social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I’m going to read exactly what he said. I’m quoting.
Experiments show that what the algorithm picks to show you can affect your emotions. But that’s not all. It also affects political behavior. So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, “Today is election day,” the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on “I voted.” This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes. Now, Facebook can also very easily infer what your politics are, even if you’ve never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?
Now, we started from someplace seemingly innocuous — online ads following us around — and we’ve landed someplace else. As a public and as citizens, we no longer know if we’re seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we’re just at the beginning stages of this. These algorithms can quite easily infer things like people’s ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people’s sexual orientation just from their dating profile pictures.
Now, these are probabilistic guesses, so they’re not going to be 100 percent right, but I don’t see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people. And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.
So Facebook’s market capitalization is approaching half a trillion dollars. It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change.
Now, don’t get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I’ve written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world. But it’s not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it’s not the intent or the statements people in technology make that matter, it’s the structures and business models they’re building. And that’s the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don’t work on the site, it doesn’t work as a persuasion architecture, or its power of influence is of great concern. It’s either one or the other. It’s similar for Google, too.
So what can we do? This needs to change. Now, I can’t offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning’s opacity, all this indiscriminate data that’s being collected about us. We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won’t be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don’t see how we can postpone this conversation anymore. These structures are organizing how we function and they’re controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they’re free. In this context, it means that we are the product that’s being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.
-
A contrast between what people (myself included) post publicly and what they search privately on Google. Just a great reminder to look at social media posts with the awareness that they are partial truths.
On social media, the top descriptors to complete the phrase “My husband is …” are “the best,” “my best friend,” “amazing,” “the greatest” and “so cute.” On Google, one of the top five ways to complete that phrase is also “amazing.” So that checks out. The other four: “a jerk,” “annoying,” “gay” and “mean.”