Jan. 24, 2022

Let’s settle it: Is your phone actually listening to you?


Have you checked in on the FBI agent in your phone lately? Winter can be tough, you know. They could probably use a muffin basket or something.

 

I’m only kind of kidding because there isn’t actually an FBI agent in your phone, but there might as well be given the incredible amount of data that you offer up every time you paw through Instagram to find your high school nemesis or type in “perciatelli vs bucatini” on Google.

 

This week on Thinking Is Cool, I explore the depths of surveillance capitalism we’ve willingly and unwillingly submitted ourselves to. It’s pretty astounding how much we don’t know about the information we hand over in the form of anonymized data. 

 

It might not seem like a big deal when you think about it in the context of just yourself—I mean, realistically I’m 1 of about a million carbon copies of girlies in their late 20s who like to read and also like to dress like Claudia Schiffer in the ’90s and also have a fondness for matching sets and also think maybe snail mucus will cure their skin problems. But when you think about the ways the data we may or may not care about can be weaponized en masse to literally shape society…things get kind of scary.

 

This episode explores those fears. But it also offers solutions and considerations that were entirely new to me, a power internet user, when I heard them first. Listen and send to the friend in your life who’s holding out on Instagram Reels because TikTok is a Chinese-owned entity.

 

Important deets:

  • My guests are Nik Sharma, head of Sharma Brands and Twitter’s go-to DTC Guy, and Robert Reeve, a privacy tech expert whose thread on data-driven ads and surveillance went viral last year. They’re both incredibly intelligent.
  • Get your Massican here. My favorites are the Annia and the Sauvignon Blanc.
  • It would be great if you shared this episode! And even better if you shared this episode and then rated and reviewed the show. And even better than that? Sharing, rating, reviewing, and subscribing to my newsletter. Do that here.
Transcript

Sultry voice: Hi everyone…are you ready to talk about something just a little unsettling that we’ve all either willingly or unwillingly been complicit in? I thought so.

 

Hello hello hello and welcome to Thinking Is Cool, the show designed to make your next conversation better than your last. I’m your host Kinsey Grant, and I’ve got a story for you. You ready?

 

I got an ad on Instagram just before the holidays for a teapot. A millennial pink, direct-to-consumer teapot. But that’s not the whole story. The whole story is something more sinister, something more appalling, something more “how the fuck are we not talking about this all the time.”

 

That ad, innocuous as it might have seemed at the time, sent me down the rabbit hole. That teapot represents so much more than just a would-be Christmas gift. It represents, as I’ll show you over the next bit of time, the depths of surveillance capitalism here in the US of A.

 

Time to take you down that very same rabbit hole. Nothing is off limits. Everything is on the table. Take it anywhere. And remember, thinking is cool and so are you.

 

*Fade out intro music*

 

In December, I was participating in a ritual as old as time: the pre-holiday panic about what to get your boyfriend as a gift for your first Christmas together without breaking the $100 budget you had previously agreed to. I was a lost soul, wandering the desert of “10 Things Your Bae Needs This Holiday Season” listicles when it appeared like a mirage…an ad for The Qi’s Bloom glass teapot.

 

It was a perfect gift idea served to me on a silver platter, and by silver platter I mean a targeted ad on Instagram’s stories feature. See, my boyfriend, Coleman, is a bit of a tea snob. He’s always sniffing bags of leaves I’ve never heard of before and weighing tea leaves and refusing me when I offer English breakfast, the only tea I’ve ever bought myself. He’s deeply into tea.

 

But…I’m not. I Google a lot of weird shit for this job, but I’ve never to my knowledge Googled anything about tea, teapots, or tea paraphernalia. Coffee? Sure. But never tea. 

 

So I saw the ad, clicked through like a doofus, realized it was outside our predetermined gift budget, and went about my search for a gift that said “I love you so much and I definitely know you to your core but also I don’t want to rush you in our relationship.”

 

I had my ideas, but clearly fate had other plans. Because just a few days later, I saw a viral Twitter thread from May reposted on an Instagram account I follow called Shit You Should Care About. The thread explained why the author was getting ads for his mom’s toothpaste of choice after spending some time at his mom’s house.

 

That’s when it hit me. The ad for the teapot I didn’t buy was more than just a convenient coincidence. It was planted by data-driven design in my feed…for many reasons, one of them being because I spent a perhaps unhealthy amount of time with a voracious tea lover.

 

How did The Qi, the brand behind this teapot I didn’t buy, know? Had they heard Coleman beg me to go to a tea shop in Williamsburg with him? Did the FBI agent in my phone watch as I pretended to like the tea concoction Coleman gave me that looked and tasted like dung water? Was I now labeled by Big Tech as a tea lover by association?

 

Today…I’m taking you on the journey to find out. Because what started with a curious tea ad ended as a much bigger conversation. And I’m sure that’s a conversation you’ve thought about yourself—we’ve all considered the FBI agent in our phone, but today we’re taking it a step further—we’re gonna figure out what they know. 

 

*Transition music*

 

I think it’s important, now that you know about The Ad, as it shall henceforth be known, that we next run through The Thread. Spoiler alert: I interviewed Robert Reeve, who wrote this viral thread, for today’s episode. You’ll hear from him and his incredible mic, which is much better than mine, in just a few moments.

 

But for right now…I’m going to read you the thread. Because it’s important. And it’s not that long. Here we go.

 

Read: https://twitter.com/RobertGReeve/status/1397032784703655938

On the GPS tweet → Editor’s note: Robert clarified to me when I interviewed him for this episode that GPS location sharing is actually less important than who you share a Wi-Fi network with. That’s primarily how you end up being served ads for things your roommate or boyfriend might want.

Last Tweet: “So. They know my mom's toothpaste. They know I was at my mom's. They know my Twitter. Now I get Twitter ads for mom's toothpaste. Your data isn't just about you. It's about how it can be used against every person you know, and people you don't. To shape behavior unconsciously.”

 

That last part is what got me so profoundly. My data isn’t just about me, and yours isn’t just about you. Our data is part of a much larger social tapestry. I’ve always said, perhaps as an act of self-soothing, that I don’t care if international entities access my data…they can already get it if they want it.

 

But the more I think about it, the more I recognize that my attitude is a small-minded one. While my data as a 27-year-old material girl is pretty uninteresting, the fact that it can be accessed in ways I’ve consented to but don’t fully understand should give us pause.

 

We’ll get to why that is and what we can do about it in a few minutes. But before we do, I wanted to dig a little deeper into why we’re having this conversation in the first place. Why do we see so many targeted ads built on this rich network of aggregated data we’ve handed over to tech and media companies? How did we get here?

 

The short answer? Because it works so damn well.

 

KINSEY: You have a lot of institutional understanding and knowledge about the DTC space. And that is, in my personal experience, where I'm seeing so many of these targeted ads. More recently, a lot of them are coming from these cool, sexy, like, sans serif upstarts that are trying to get me to buy their product. So with that, what does targeted advertising mean today?

 

NIK: So, you know, decades ago there was this acronym, AIDA, which stands for awareness, interest, desire, and action. In theory, you're pushing somebody down this conversion funnel from awareness to action, and through these ads, they get their own inclination to go and want to buy the product or try it.

 

KINSEY: A lot of the more modern technology, it feels like it's compressing that process of going from awareness to action. There have been many times (literally, I did it within the last 24 hours) where I see something for the first time on, like, an Instagram story, I click through, and I purchase. Without ever leaving Instagram.

 

That was part of my conversation with Nik Sharma, the artist known on Twitter as "The DTC Guy" and genius behind Sharma Brands, which works with all of your favorite DTC brands like Haus, Caraway, Judy, and more. Nik is one of the smartest people I know, a fantastic networker, and often one of the first to say “would anyone do a shot?” in the best way.

 

What Nik explained in our conversation is that, with traditional ads like magazine spots for a brand like Coca-Cola, there was incredibly limited, or frankly nonexistent, reporting on how well the ads were or weren’t performing. There was no real way of knowing what worked, and it made the job of advertisers difficult.

 

But then, Mark Zuckerberg got dumped and the world was never the same again. With the advent of social media platforms like Facebook and Instagram, plus search engines like Google and whatever the fuck Amazon is, we as then-early internet users became all too comfortable handing over our very personal details in the form of mass amounts of data. Anonymized data, but data nonetheless.

 

Armed with that data, advertisers were suddenly cooking with gas. An entire industry sprang out of the internet with the sole purpose of aggregating that anonymized data and selling it as a tool to make the jobs of advertisers—targeting us as consumers with a compressed awareness to action pipeline—easier. Because no matter what their mission statements say about connecting people or building community or organizing the world’s information…it’s always been ads that make money.

 

NIK: So, uh, Kinsey, to give you an idea: I'm on Instagram, and you go look at Snif. And Instagram knows that we're close friends on Instagram. I might actually start to see ads for Snif, because there's an assumption that, because you're looking into it and we communicate, I might be the one who's also interested in that product, considering we might have similar interests. Um, and similarly, if I go to Snif and I click on an ad and get to Snif, you know, Facebook now knows, okay.

 

You know, on a scale of one to 10, let's say Nik has maybe now a 7.8 in interest because he spent 37 seconds on the site, he went two pages in, and he looked at different scents that were available for the candle. And now we know we have a pretty high probability that, if we show him this specific ad with this specific message and caption, we have a much better chance of converting him into actually being a customer.
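To make Nik’s example concrete, here’s a toy sketch of how a platform might turn on-site behavior into an interest score like that hypothetical 7.8. Every signal, weight, and cap here is invented for illustration; real ad platforms use machine-learned models over far more signals than this.

```python
# Toy interest-score heuristic: turn a visitor's on-site behavior into a
# 0-10 score a platform might use when deciding whether to retarget them.
# All signals and weights below are invented for illustration only.

def interest_score(seconds_on_site: float, pages_viewed: int,
                   product_views: int, friend_clicked_ad: bool) -> float:
    score = 0.0
    score += min(seconds_on_site / 10, 4.0)   # up to 4 pts for dwell time
    score += min(pages_viewed * 1.0, 3.0)     # up to 3 pts for browsing depth
    score += min(product_views * 0.5, 2.0)    # up to 2 pts for product interest
    if friend_clicked_ad:                     # a social-graph signal, per Nik
        score += 1.0
    return round(min(score, 10.0), 1)

# Nik's hypothetical visitor: 37 seconds on site, two pages in, a few scents viewed
print(interest_score(37, 2, 3, friend_clicked_ad=False))  # → 7.2
```

The point isn’t the arithmetic; it’s that ordinary browsing behavior decomposes into signals that add up to a sellable prediction.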

 

That’s one example of many. We’re also talking about things like…Facebook knowing how long you stop scrolling when you see a video to measure how much the targeted content holds your attention…or self-driving car companies training their machine learning algorithms by buying the datasets that you perhaps unwittingly provide every time you prove to a Captcha system that you’re not a robot because you can identify what pictures have a bridge or a sidewalk or a pedestrian in them…or email companies inserting invisible pixels into your morning newsletter to gain all kinds of information once you click open.
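That last mechanism, the invisible email pixel, is simple enough to sketch end to end. This is a minimal illustration, not any real email company’s implementation (the tracker URL and token scheme are made up): the sender embeds a unique 1x1 image per recipient, and the tracking server logs an open the moment a mail client fetches it.

```python
# Minimal sketch of email open tracking via an invisible pixel.
# The URL, token scheme, and storage are all invented for illustration.
import uuid

def make_pixel_tag(recipient_id: str, tokens: dict) -> str:
    """Build a unique 1x1 image tag to embed in one recipient's email HTML."""
    token = uuid.uuid4().hex
    tokens[token] = recipient_id          # sender remembers who got which token
    return f'<img src="https://tracker.example.com/o/{token}.gif" width="1" height="1" alt="">'

def record_open(token: str, tokens: dict, opens: set) -> None:
    """Called when the tracking server serves the image: the open is logged."""
    if token in tokens:
        opens.add(tokens[token])

tokens, opens = {}, set()
tag = make_pixel_tag("reader-42", tokens)
# When reader-42's mail client fetches the image, the server sees the token:
record_open(tag.split("/o/")[1].split(".gif")[0], tokens, opens)
print(opens)  # {'reader-42'}
```

Nothing about the email’s visible content changes; the “click open” alone is the data point.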

 

Nik gave another example that drives the point home—this isn’t just Facebook and Google and the like participating in the targeted ads economy. I’ll let him explain.

 

NIK: I can give you an example. You know, when I worked at a beverage company, we were trying to find, uh, cohorts of users who were buying competitive products. And so we went to an advertising company called Rakuten. And they have this product called Slice Intelligence, and Slice Intelligence owns another product called Unroll.me, which is an inbox management tool.

 

And it helps you kind of clear your newsletters and basically clean your inbox up. And, uh, what they do is, okay: if Kinsey bought a Harmless Harvest coconut water or a LaCroix from Amazon, they'll actually put you in a bucket. And then, you know, Slice and Rakuten have access to that, and then they'll go sell that. And so it's so hard to just stay mad at it, because it's always going to keep happening in the background.
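The Slice-style data play Nik describes boils down to cohort building: scan purchase receipts, bucket buyers by product, sell the buckets. A toy version, with made-up shoppers and products:

```python
# Toy cohort builder: group shoppers into sellable "buckets" based on what
# their email receipts show they bought. All names and data are made up.
from collections import defaultdict

receipts = [
    ("kinsey", "Harmless Harvest coconut water"),
    ("kinsey", "LaCroix sparkling water"),
    ("nik", "LaCroix sparkling water"),
]

def build_cohorts(receipts):
    cohorts = defaultdict(set)
    for buyer, product in receipts:
        cohorts[product].add(buyer)       # each product gets a bucket of buyers
    return cohorts

cohorts = build_cohorts(receipts)
# A competing beverage brand could now buy the sparkling water bucket to target:
print(sorted(cohorts["LaCroix sparkling water"]))  # ['kinsey', 'nik']
```

That’s the whole trick: an inbox-cleaning tool reads the receipts, and the buyers become the product.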

 

The thing is? It’s easy to hear those examples and get riled up—how could they! Why don’t I get a dime of the money made from my data! What gives! But…we consented to a lot of this anonymized data sharing when we said “yeah okay I agree” without reading the terms and conditions of use for all manner of tech platforms. And what’s more, we might not really like the internet if the data-sharing went away.

 

NIK: The trade-off has always been convenience, right? If you want to go find something quickly, uh, the platform that will most conveniently show it to you is also going to be in exchange, collecting your data.

 

And I mean, this has been going on for, like, a couple of decades now. You know, if you think back to even, like, Quantcast, which is, uh, one of the large advertising platforms: they know what car you drive, and they know what you watch on Hulu, and they know what emails you might open. And they use all that data collectively to then show you an ad.

 

And those ads they show you? They’re usually not that bad. I got the ad for a teapot that was an almost perfect gift for Coleman. I didn’t get an ad for, like, adult diapers or tractor parts or baby toys. I got an ad that made my life a little more convenient.

 

But at what cost? There’s a price on that convenience…and we’re going to talk about it after this quick definitely not programmed or targeted ad.

 

*Insert ad about how this is a good kind of ad here*

 

Now, before that break I said this: There’s a price on that convenience. And that price, it might seem, is humanity’s own collective self-determination. Allow me to explain by using my own ideas and those I came across in a truly enjoyable and thought-provoking conversation I had with Robert Reeve, the author of that viral Twitter thread I read before and someone who’s worked in privacy tech for a while.

 

Robert and I went through his Twitter thread together and spoke about it at length, and it was invigorating. But once we realized that we were veering into an entirely different conversation, things got really interesting.

 

What Robert and I ended up talking about most was the concept of responsibility in a world defined by the tension between liberty and freedom. Like it or not, we live in a world that could easily be described as a system of surveillance capitalism: an economic system rooted in the commodification of personal data in the singular pursuit of profit.

 

We provide data; tech overlords make money selling it. We get convenience; they get control. That’s a severely black and white means of communicating a tremendously grey idea…but it stoked in me and in Robert these questions…

 

Where do we draw the line for surveillance capitalism? How do we determine the difference between our data being used to make the world more convenient…and our data being used to Cambridge Analytica all of us again? To achieve specific and unilateral ends by any means possible? Can we determine that difference at all?

 

At the core of those questions is this one: Who’s the bad guy? If we want to best understand how we determine good and bad uses of technology…we need to know where to point fingers. We need to know what’s convenience and what’s social grooming. We need to know who’s the villain.

 

ROBERT: I guess, if I'm going to be really provocative, in some ways I feel like the villain is ourselves. You know, like, the villain is humanity. The way that tech companies have gamed our behavior and how we treat each other, um, they figured out, advertently or inadvertently, what makes people tick, and they're cranking all the dials on those things up to 11. Generally in pursuit of profit. So maybe my second controversial thing is: capitalism is the bad guy. But everybody says that these days, I think.

 

KINSEY: And I don't disagree. And next time on Thinking Is Cool? Um, no. But I think that, in a lot of ways, you know, this conversation, we can say it's very specific: we're talking about the ways that targeted ads work on the internet. But I think it's also so much bigger than that when we consider tech. In a lot of ways, you know, there's always a conversation that technology isn't good or bad.

 

It just is. I think a lot of people have suggested that the real villain, or superhero, is the way that we use it. The way that we utilize technology. In so many ways, our biggest gripe is that these tech companies are holding a mirror up to humanity.

 

Um, and that can be really unsettling, to recognize that, like, well, shit, they're just giving us what we want. We are going out every day and giving them data, numbers, like, ones and zeros, to suggest that this is actually what we want, this is what we're interested in. But there is also the counterargument that, just as often as tech is a mirror for society and for humans, humanity is shaped by a lot of these technological influences, the more and more that we move online and consider the metaverse as the future, or even just the present.

 

The last two years, in the pandemic, so much has happened online that we have to question how much society at large is being shaped by the decisions of data aggregators and the tech companies that collect data. And, I want to be clear, it's not just the Facebooks, Twitters, and Googles of the world.

 

I worked for an email company, and they knew a lot about you too. Like, even the small companies have ways of collecting this data. Um, so I'm curious to hear your perspective on that, on the ways, or perhaps the dangers, in the nature of society being something malleable, something that can be impacted by outside elements, that outside element of course being the internet.

 

ROBERT: Yeah, I think that's why big tech is the target. It is. They built things that are so vast and touch so many aspects of our lives that I think they're no longer under even their own creators' control. And it is impossible for them to predict every possible bad actor and how the systems they built can be exploited by those who have the resources and incentive to do so.

 

Data collection and aggregation is a fantastically impressive technology. To think it’s evolved so rapidly is truly astounding. But that technology can be used by bad actors who, under the cover of the internet’s vastness, are pulling the strings of society like we’re nothing but marionettes.

 

Not to get, as Robert put it in our conversation, all Charlie Day It’s All Connected on you, but…it starts with a teapot. What comes next? That ad was intelligent and targeted enough to get me to click. To get me to take an action. What other action can data aggregation beget?

 

We know that the answer to that question is incredibly difficult to articulate. Technology and the people who create it are imperfect. They’re also not fortune tellers. We can’t expect them to predict every possible utilization of their tech before it comes to fruition. But what we can do is expect them to take responsibility when tech goes wrong.

 

ROBERT: I have a thought about that: Facebook and Twitter doing more with the ad data they have, like a way to use it ethically that they don't right now. Um, the same way that we have ad profiles that tell us things like "my partner likes tea," there can be stuff attached to your profile that's like, "this person engages with a lot of content around white supremacy." And literally, like, Facebook generates these ad categories algorithmically.

 

Robert explained that, to an algorithm, we’re a collection of our interests. Those interests can be, as he noted, alluding to his own, Dungeons and Dragons. Those interests could be tea. But those interests could also be…white supremacy and anti-Semitism and racism.

 

ROBERT: Facebook had some algorithmically generated ad tags around, like, anti-Semitism, and companies went in and bought ads on those tags. And this was, like, a news story a couple of years ago.

 

Like, it turns out…and Facebook was like, we didn't build that on purpose. Our machine learning algorithms just happened to make tags for selling ads for anti-Semitism. Uh, and they're like, of course we never meant for this to happen, and it's time to shut these down, and we're shutting them down.

 

We're pulling all of this data out, and we're making sure no one can ever do this again. But, like, the thing that I would rather see is: if you have this data, why aren't you sharing it with, like, your moderation teams? Uh, if you can buy and sell ads on anti-vax propaganda, you know, like, that means you know all the people who are engaging with that, or the people who are really distributing a lot of that. Like, I think some, um, watchdogs for, like, anti-vaccine propaganda have narrowed it down to, like, 20 people on Facebook sharing, like, the vast majority of the content. And they've reached out to Facebook to be like, hey, you have an ethical responsibility here to shut these people down and shape this public health debate for the good of society. And then they're touching on, like, free speech issues too.

 

They're like, should Facebook do that? My opinion is yes, they should. But, like, really, the free speech debate is about government intervention, not about, like, your right to have a profile on a private company's social media network. So I think there are a lot of ways that this data they collect, nominally for ads, could be used for authentic good in shaping society, but it just doesn't get used that way. And that's really frustrating to me, to know that somewhere, Facebook and Twitter know who the white supremacists are, but they're like, but our growth metrics! Everyone should be here! Because we're still selling ads, like…even the worst version of conservatives buy Nikes.

 

My natural next question went something like this: Isn’t that opening Pandora’s Box? If we allow these tech platforms run by technocrats to reach in this way…what’s to suggest they won’t overreach in other ways? To potentially bend the will of society in ways far more nefarious than rooting out the propagandists…

 

The answer is somewhat unsatisfying but…they already do. Content moderation is obviously imperfect but it does exist. And sometimes it even works.

 

I guess that’s at the root of a lot of this conversation that we’re having today. Sometimes tech does work. And when it does, there are so many immense benefits to be reaped for all of us. Stumbling upon the perfect Christmas gift in between your friends’ Instagram stories. Learning how to perfectly fry an egg on TikTok. Connecting with your high school English teacher on Facebook. This is tech done right. 

 

Inevitably, there will be times when tech is done wrong, that much we know. For those most concerned with that angle of this conversation, I present this from Robert.

 

ROBERT: If you're like, uh, a prepper, I want to live in a data bunker. I never want the government to know anything I've ever searched for. And I think that Google is selling my data to the government.

 

Like, if you're one of those people, there are ways to go crazier than that. But for, like, someone who's just, like, concerned, but wants life to still be livable on the internet, who doesn't want to have to, like, reroute their traffic through the onion network: use Firefox. Um, use Firefox and switch your default search engine to DuckDuckGo, the privacy-minded search engine. Um, I was worried about it at first because, like, I was like, there's no way this is going to be as good as Google. It's pretty dang good. I was surprised. Um, in the rare instance where DuckDuckGo doesn't really get me there, I just switch to Google, and I'm like, all right, Google, you can have this data point.

 

At the end of the day, this conversation, whether it’s had on Firefox or Chrome or face to face, is far from over. Technology is never static; it’s always evolving in different ways and directions. And I certainly don’t have all the answers, but I hope listening to this got you thinking about what the teapot in your life is. What do the little things you encounter online mean about you? About your community? About your world? And how are you going to hold those responsible for said meaning accountable?

 

That’s what I’ll be thinking about for a while. And I hope you will be, too. Because it’s not about the teapot. It was never about the teapot. It’s about so much more. 

 

Thank you for listening, everyone. Have a blast today, go find some weird as fuck Instagram ads and send them to me, and remember—thinking is cool and so are you. I’m Kinsey Grant, and I’ll see you next week.