
Coaching for Leaders

Leaders Aren't Born, They're Made

Episode

718: How Leaders Can Use the Algorithms for Good, with Sandra Matz

Psychological targeting isn’t going to disappear.
https://media.blubrry.com/coaching_for_leaders/content.blubrry.com/coaching_for_leaders/CFL718.mp3

Podcast: Download

Follow:
Apple Podcasts · YouTube Podcasts · Spotify · Overcast · Pocket Casts

Sandra Matz: Mindmasters

Sandra Matz is a Columbia Business School professor, computational social scientist, and pioneering expert in psychological targeting. Her research uncovers the hidden relationships between our digital lives and our psychology with the goal of helping businesses and individuals make better decisions. She is the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior.

Algorithms are becoming more influential with each passing day. That’s why leaders must understand their power and then decide how their organizations engage. In this conversation, Sandra and I discuss where psychological targeting is at, where it’s going, and the opportunity you have to make the world a bit better.

Key Points

  • Everyone knows everything in a small town (for better or worse). In the same way, psychological targeting can be used for both evil and good.
  • Psychological targeting is already successful at identifying wealth, personality, income level, and sexual orientation – and it keeps improving.
  • None of this is going away. Understanding how the game of targeting is played can help you make it work to your advantage.
  • Leaders and organizations who use targeting responsibly can do tremendous good, including helping people save money and flagging early interventions for health crises.
  • Be transparent with what data you’re collecting and how you’re using it. Consider newer practices like federated learning that protect privacy and provide permission-based access.
  • Design systems and practices that anticipate the reality of future leaders with different values.

Resources Mentioned

  • Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior by Sandra Matz

Interview Notes

Download my interview notes in PDF format (free membership required).

Related Episodes

  • Serve Others Through Marketing, with Seth Godin (episode 381)
  • The Way to Earn Attention, with Raja Rajamannar (episode 521)
  • The Reason People Make Buying Decisions, with Marcus Collins (episode 664)

Discover More

Activate your free membership for full access to the entire library of interviews since 2011, searchable by topic. To accelerate your learning, uncover more inside Coaching for Leaders Plus.

How Leaders Can Use the Algorithms for Good, with Sandra Matz


Dave Stachowiak [00:00:00]:
Algorithms are becoming more influential with every passing day. That’s why leaders must understand their power and then decide how their organizations engage. In this episode: where psychological targeting is at, where it’s going, and the opportunity you have to make the world a bit better. This is Coaching for Leaders episode 718. Production Credit: Produced by Innovate Learning, maximizing human potential.

Dave Stachowiak [00:00:33]:
Greetings to you from Orange County, California. This is Coaching for Leaders, and I’m your host, Dave Stachowiak. Leaders aren’t born, they’re made. And this weekly show helps you discover leadership wisdom through insightful conversations. So much is changing about how we work and how we utilize technology. We’ve seen so much in the news in recent years on how organizations are using data and targeting in order to influence behavior. Today, a conversation that helps open the door for all of us to get better about what’s happening and more importantly, how we as leaders can do a better job at making choices in our organizations that really create the world that we want.

Dave Stachowiak [00:01:20]:
And I am so pleased to welcome Sandra Matz to the show. She is a Columbia Business School professor, computational social scientist, and pioneering expert in psychological targeting. Her research uncovers the hidden relationships between our digital lives and our psychology with the goal of helping businesses and individuals make better decisions. She is the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior. Sandra, what a pleasure to have you on.

Sandra Matz [00:01:50]:
Thank you so much for having me, Dave.

Dave Stachowiak [00:01:52]:
You write in the book about your experience of growing up in a small town in Germany. And as anyone who’s lived in a small town knows, everyone knows your stuff. Right? And there are some really great benefits that come from that, and there are also some real downsides too. Aren’t there?

Sandra Matz [00:02:15]:
Very much so. So yeah. You’re absolutely right. I grew up in this tiny village, 500 people in the southwest corner of Germany. Or actually, as my parents keep reminding me, it’s grown to 1,000 since I left, which I can assure you is not making that much of a difference. But you’re absolutely right. I think the feeling that I got growing up in this village, in contrast to me living in New York today, is that there are people who truly know you. Right? So it’s not just people who live next door. You see them once in a while.

Sandra Matz [00:02:45]:
You maybe say hi once or twice a day. Those are people who observe everything that you do. They know exactly what you do on the weekend. They know who you’re dating. They know which music preferences you have. And for me, actually, what it felt like was that they truly knew me. So they, in a way, put together these puzzle pieces of my existence to understand my motivations, my preferences, my fears, hopes, dreams, you name it. And then they did what village neighbors in a way do best.

Sandra Matz [00:03:14]:
Right? So sometimes they used it to offer me the most amazing advice because they knew exactly what I wanted. Right? They kind of helped me figure out what I wanted to do after school, whether to take a gap year, what to study, connected me with opportunities. But on the other hand, it also felt like there was someone pulling the strings behind my back in ways that I didn’t always appreciate. So you’re absolutely right in that. This idea that someone could really understand what I wanted and who I was had both these bright and dark sides to it.

Dave Stachowiak [00:03:45]:
It’s such an interesting analogy for psychological targeting because psychological targeting can do some amazing, incredible things. As I think we’re gonna talk about in this conversation, it can also manipulate and exploit. Right? And a line I highlighted from you in the book was this one. And I think you were talking about it in the context of growing up in a small town: “Once I understood the game that was played and had a clear sense of what I wanted out of it, I learned to play it to my advantage. Suddenly, I was winning more than I was losing.” And there’s a message there, I think, for how much we can learn about psychological targeting too, isn’t there?

Sandra Matz [00:04:25]:
Yeah. So absolutely right. It’s funny because I give this analogy of the village, and then I kind of pretty rapidly shift to technology. Right? So it almost feels like a paradoxical comparison because the village was all about individual relationships. And now that we live in this, what I think of as a digital village, it’s not necessarily neighbors understanding exactly who we are and what we want. But it’s all of these digital neighbors, like algorithms, who observe everything that we do, from what you post on social media, credit card spending, the sensor data that gets captured by your smartphone. And you’re absolutely right. It also has these two sides.

Sandra Matz [00:05:04]:
To some extent, learning something about individuals offers amazing opportunities, and we probably are gonna get to talk about this in a little bit. But it also has this exploitative, manipulative angle to it. And for me, the same way that in the village I was trying to figure out how do I amplify the positives and how do I mitigate some of these challenges and sides that I didn’t appreciate about the village, the same way I’m thinking about this in the context of technology. So how do we make technology work for people rather than against them? And I think leadership really plays a critical role

Dave Stachowiak [00:05:36]:
here. Yeah. And I think it’s such a great opportunity for leaders to be knowledgeable about this. Like, this is not going anywhere. Right? It’s not disappearing. In fact, if anything, it’s gonna become bigger and more prominent in all of our work, how organizations message us, how governments and individuals reach out to us. And so, like, understanding how this works, I think, is key. And before we even get into some of the, like, okay, what would we do as leaders? I’m just curious about some of the examples in the book because I think they’re so powerful in just illustrating how this works in the identities that we craft online, whether we’re conscious of it or not.

Dave Stachowiak [00:06:16]:
And there’s a really interesting example in the book of just – oh, there are so many examples, but one of them is the differences between what high and low income people talk about online. Could you illustrate that a bit? Because I think it’s just fascinating how that shows up.

Sandra Matz [00:06:34]:
Yeah. So for me, the interesting part is that data actually gives us an insight not just into an individual’s psychology, but it also teaches us something about human behavior. So the example of high and low income individuals was actually relating that to what they talk about on social media and the things that they like on social media. And you can imagine some of these connections are actually obvious. So high income people, they talk about the fun vacations that they go on. They talk about the luxury brands that they buy, where that’s not necessarily true for low income individuals. But there are also these more subtle nuances and these more subtle relationships that are not just interesting from a psychological point of view, but also show that there are all of these, what I think of as, hidden relationships. It’s not something that you intentionally put out there to signal your identity.

Sandra Matz [00:07:23]:
It just kind of creeps into the language that you use. So this is, for example, low income individuals being a lot more focused on the present than the future. And again, in a way that makes sense, because if you’re worried about running out of money, trying to figure out how to make ends meet, you’re probably not gonna think about next summer, next year, 10 years from now. What you’re trying to figure out is, like, how am I gonna get through this week? How am I gonna get through this month? The same is true for references to the self. So low income individuals have a lot more references to the self. And again, it’s much harder to worry about the problems of the world if you’re struggling financially. So for me, it’s both this interesting lens into human behavior, and it also teaches us something about just how hard life is when you have very little money available.

Dave Stachowiak [00:08:13]:
Yeah. Indeed. And the book is – boy, it’s so worth picking up the book just to look at the word clouds that you’ve surfaced in the research on, like, what are the kinds of words people use when they’re in a low income situation? What are the kinds of words they use in a high income situation? What are the kinds of words extroverts and introverts use? It’s really fascinating. And then, of course, the algorithms can be tuned to that to be able to target what kind of ad you see, what kind of material shows up, what shows up in the feed. It is really, really fascinating.

Sandra Matz [00:08:50]:
Yeah. And for me, the interesting part in a way is that some of it is obvious. Right? So if you look at the word clouds for extroverts and introverts, like, extroverts are out there talking about weekends and dating and so on, and introverts talk about computers and anime and manga, all of the stuff that you do by yourself. And for me, the obvious parts are actually nice because they show that it’s not rocket science. Right? The way that computers or algorithms translate your online behavior into these more holistic psychological profiles isn’t really this massive black box that AI is oftentimes made out to be. It’s in a way counting words and looking at these pretty obvious relationships. But then you also get the ones that are not as obvious and that we, as humans, might not have even been able to pick up on. So, again, like, the references to the self for low income, for example, is also true for emotional distress. So if you’re emotionally distressed and having a hard time, you just kind of think about yourself a lot more and talk about yourself a lot more because, again, you’re worried about, like, why am I feeling so bad? Am I ever gonna get better? And those are these relationships that show up in the word clouds that are not obvious, but that computers can actually detect just because they have access to so much data from so many people at the same time.
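The "counting words" idea Sandra describes can be sketched in a few lines. This is a hypothetical illustration, not her actual model: the word lists below are invented for the example, not real research lexica.

```python
# Sketch of word-category scoring: count how often a text uses words
# from a category (e.g. self-references, present focus) as a fraction
# of all tokens. The lexica here are illustrative assumptions only.

SELF_WORDS = {"i", "me", "my", "myself", "mine"}
PRESENT_WORDS = {"today", "now", "tonight", "currently"}

def category_score(text: str, lexicon: set) -> float:
    """Fraction of tokens in `text` that fall in `lexicon`."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    if not tokens:
        return 0.0
    hits = sum(t in lexicon for t in tokens)
    return hits / len(tokens)

post = "I wonder how I will get through this week, my rent is due now."
print(round(category_score(post, SELF_WORDS), 2))     # self-references
print(round(category_score(post, PRESENT_WORDS), 2))  # present focus
```

In a real system, scores like these would be features feeding a statistical model trained on many users, which is where the non-obvious relationships she mentions get picked up.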

Dave Stachowiak [00:10:03]:
And speaking of the data and how much access so many organizations have to it now, one of the really fascinating and also scary things that you talk about is the prediction that some of the algorithms are doing through photography and the photos of people online, and how it’s possible – not only possible, but really accurate in some cases – to predict personality and even sexual orientation through photographs. Could you share a bit about just, like, what the research is showing on that?

Sandra Matz [00:10:36]:
Yeah. And I would say that this is some of the most controversial research in that space because it also has pretty important implications. Right? You could always argue, well, I don’t have to post on social media if I don’t wanna be tracked, or maybe I can leave my phone at home if I don’t want the GPS records to figure out exactly where I go and who I meet. But the moment that we’re talking about your physical appearance, that could be your face, that could be anything related to grooming, there’s almost no way to escape tracking. We have cameras on pretty much every corner in New York, combined with facial recognition. It’s very easy to pick up these signals. So what this research shows essentially depends on how you look at it. Some of it is grooming. One of my favorite examples in the book is extroverts and introverts.

Sandra Matz [00:11:21]:
Like extroverted and introverted women, in this case. And just kind of seeing the differences between how they present themselves on social media. And what you see is, like, extroverts, they probably dye their hair a lot more often because their hair looks a lot blonder. They probably also wear contact lenses more often because their eyes look blue, and there’s no genetic reason for why that should be the case. They also seem to be much better at taking pictures, because in their pictures you don’t see the nostrils, which suggests that they take pictures from above to make their faces look slimmer. So there are all of these traces that are somewhat curated and groomed. Now, what the research suggests, and I think this is why it gets really creepy but also potentially interesting, is that there might be some features of your face that actually predict personality. So independent of grooming, just the features of your face that are related to some of these psychological traits.

Sandra Matz [00:12:11]:
And I think what creeps people out is that there was the pseudoscience of physiognomy for a long time that suggested, well, maybe if your nose looks a certain way, maybe you have a certain character trait, and it was certainly abused in many different contexts. Now what we can do with computer science and AI is really look at the actual differences. And there are many ways in which that could be true. I remember when I first heard about this research, I was like, this is absolutely insane. I don’t think that there’s anything to this. But then there are many ways in which our physical appearance might actually be related to psychology. If you think about hormones, for example, we know that testosterone makes you more aggressive, makes you more assertive, but it’s also related to your facial features. Or, like, one of my favorite examples, or an argument that is oftentimes made, is we just respond differently to our social environments.

Sandra Matz [00:13:02]:
Right? Imagine you’re this beautiful baby, perfect symmetry of your face. Everybody loves you. You’re constantly getting positive feedback from your environment. It’s not so surprising to imagine that maybe you’re also gonna turn out a little bit more extroverted than other people. Right? You’re constantly getting this positive social feedback. So this, I think, is the part of the science that is the most controversial, your facial features really opening, or, like, offering a window into your psychology, but also the one that is the most creepy in a way, because there’s no way that you can leave your face at home.

Dave Stachowiak [00:13:35]:
Yeah. Indeed. And like you said, even if you personally decide you’re gonna opt out, there are so many cameras, so many places. And, I mean, it’s really fascinating even some of the studies where they control for the grooming, like, have everyone do the hair the same way, position the camera the same way. Even then, like, the algorithms are pretty good. And you think about this, and it’s, like, really easy to go down a very dark tunnel very quickly of thinking, okay, how can organizations and governments manipulate this data? And we’ve seen examples of that already happening in the media.

Dave Stachowiak [00:14:07]:
Right? And there’s a both/and here, which is like, what are also the great things? Which is one of the reasons I wanted to talk to you. Like, what are the great things that can possibly come out of this? And I love some of the examples you talk about in the book, and some of them you’ve been involved with. I’m wondering if you could share the story of SaverLife and the work you did with them to highlight how this could actually be used for good.

Sandra Matz [00:14:31]:
Yeah. It’s one of my favorite examples. And in a way, it comes back to the analogy of the village. Right? So in the village, the fact that someone truly understood me was a blessing, in the way that they could really help me. They knew exactly what I wanted, and they could offer the best advice. So SaverLife was one of the collaborations that I did with a fintech company that is actually trying to help low income individuals save more. So if you look at the state of affairs in the US, the picture looks pretty grim. I think about 50% of people live paycheck to paycheck.

Sandra Matz [00:15:02]:
10% of people couldn’t even go a week without being paid. And that’s a terrible situation to be in. Because, like, the only thing that has to happen to you is your car breaks down. You can’t put it into the shop, you can’t get it fixed, you can’t get to work, you lose the job, and so on. So it’s a very easy slippery slope to losing everything. So what SaverLife was trying to do is to say, well, we know what these sneaky marketers do. Right? So we know that psychological targeting allows marketers to sell you more stuff.

Sandra Matz [00:15:30]:
If I know that you’re extroverted, I can advertise certain products, I can talk to you in a certain way. And we know from my own research that that allows marketers to essentially increase the likelihood that someone is going to buy a product. And the question that we had is, could we just flip this concept on its head? Instead of using this understanding of your psychology to get you to spend more, could we also use it to get you to save more? So with users’ consent, for SaverLife users in the app, we surveyed their personality. So in this case, we actually did it using questionnaires, with their consent, so it’s very clear and transparent what we were trying to do. And we just said, okay, now that we understand that you might be more extroverted or more agreeable, which is one of these personality traits that looks at how much people care about their social relationships, for example, can we use these insights to help you save more with messages that we send you? So we tapped into a challenge from SaverLife.

Sandra Matz [00:16:24]:
I think it’s called Race to 100. So this is a challenge that they set for their users where they encourage them to save at least $100 over the course of a month. And it might not sound like much to some of the listeners, but those were people who had less than $100 in savings. So this is, like, a massive undertaking, which is really trying to double your savings over the course of 4 weeks. And what we tried to do is we took the message that SaverLife had been trying to optimize for years. I think of it as, like, the gold standard that SaverLife was using at the time, and then we compared it to psychologically customized and psychologically tailored messages. So, again, agreeable people, for example, are not necessarily gonna be convinced by just having extra money in the bank account, because what they care about is other people and their loved ones. So for them, messages would say something like, well, if you put some money aside right now, this is an opportunity for you to make sure that your loved ones are safe now and in the future.

Sandra Matz [00:17:22]:
And what we saw is that essentially, in the number of people who managed to hit their saving goal of $100, we saw, like, a 60% increase in the group that got these psychologically customized messages compared to the gold standard. So this is a pretty high bar to clear. Right? This is SaverLife trying to figure out how do we best communicate with our users for quite some time, and still, by just tapping into people’s psychology and their motivation, we were able to increase that further. So for me, this is just a very nice what-if scenario: we can use the same technology to either get people to spend more and reach deeper into their pockets, or we can use it to help them save, which is typically something that a lot of us aspire to do but have a hard time with.
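The tailoring step described above can be sketched as choosing message copy by a user's strongest personality trait. This is a hypothetical illustration: the trait names, scores, threshold, and message text are invented for the example and are not SaverLife's actual system or copy.

```python
# Hypothetical trait-tailored messaging: pick the copy matching the
# user's strongest trait, else fall back to a generic "gold standard".
# All traits, scores, and messages here are illustrative assumptions.

MESSAGES = {
    "agreeable": "Put a little aside now to keep your loved ones safe, "
                 "today and in the future.",
    "extroverted": "Hit $100 this month and celebrate the win with friends!",
}
DEFAULT = "Save $100 this month and build your safety net."

def pick_message(profile: dict, threshold: float = 0.6) -> str:
    """Return tailored copy for the strongest trait above threshold,
    otherwise the generic fallback message."""
    trait, score = max(profile.items(), key=lambda kv: kv[1])
    if score >= threshold and trait in MESSAGES:
        return MESSAGES[trait]
    return DEFAULT

print(pick_message({"agreeable": 0.8, "extroverted": 0.3}))  # tailored
print(pick_message({"agreeable": 0.4, "extroverted": 0.5}))  # fallback
```

The design point is the one Sandra makes: the selection logic is identical whether the goal is spending or saving; only the message content and the metric being optimized change.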

Dave Stachowiak [00:18:07]:
It’s an amazing example of how there’s so much good that can come from this as well. And I think about some of the examples you cite in the book about the chatbots and helping to support mental health for folks. And there are some examples of how they fail spectacularly, of course. Right? But there are also some really amazing examples of, like, how targeting can provide interventions and opportunities to connect with people before even their spouse or a partner or a friend or someone else would notice what’s happening. Right?

Sandra Matz [00:18:48]:
Yeah. So for me, like, the ability – and those systems are getting better and better. Right? If you look at the last 5 years, the development that we’ve seen with generative AI and these chatbots is just mind blowing. And what these chatbots can do is essentially two things in the context of mental health. One is tracking, and I think that’s an important part because we know that still so many people go undiagnosed. Right? The number of suicides every year that could be prevented if someone was only diagnosed with something like depression is sky high. It’s just terrible numbers. And what happens typically is once you enter a full depressive episode, for example, it’s really hard for you to reach out to people, because one of the signs of depression is you’re essentially very much inward focused.

Sandra Matz [00:19:32]:
You are having a hard time reaching out to the people who could support you. So what technology can do is essentially passively say, well, we see that there’s some deviation from your typical baseline. That could be anything from social media (again, like, how do you talk about your emotional life?) to something like the GPS records captured by your smartphone. And that could be, well, we see that you’re not leaving your house as much anymore. There’s much less physical activity. Maybe you’re not taking as many calls. And it might be nothing, maybe you’re just on vacation and that’s why there’s much less physical activity.

Sandra Matz [00:20:08]:
But why don’t you look into this? So instead of saying we have to wait until someone is, like, deep in the valley, deep into a depression, we could intervene much earlier and say, okay, again, maybe it’s nothing. It’s not a diagnostic tool, but there seem to be some deviations. And maybe I’m gonna try to point you towards the right resources. But maybe there’s also a way in which, especially if you know that you have a history of mental health problems, why don’t you nominate someone that you trust and love, that could be siblings, could be spouses, who gets alerted once we see that there’s this deviation. And again, it’s just flagging it. It’s like an early warning system, but they could then reach out and say, hey, everything okay? Is there anything that I could do to support you? So the tracking part, I think, is already critical.
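The deviation-from-baseline flagging Sandra describes can be sketched as a simple statistical check: compare today's signal (steps, calls, outings) to a personal baseline and flag a large negative deviation. This is a toy illustration under made-up numbers and thresholds, not a diagnostic tool or any real product's logic.

```python
# Toy early-warning flag: raise a flag when today's value sits far
# below the person's own recent baseline (z-score check). The data
# and the 2-sigma cutoff are illustrative assumptions.

from statistics import mean, stdev

def flag_deviation(baseline, today, z_cut=2.0):
    """True if `today` is more than `z_cut` standard deviations
    below the mean of the personal `baseline` values."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return today < mu
    z = (today - mu) / sigma
    return z < -z_cut

steps_last_week = [8000, 7500, 9000, 8200, 7800, 8500, 7900]
print(flag_deviation(steps_last_week, 1200))  # far below baseline
print(flag_deviation(steps_last_week, 8000))  # within normal range
```

As in the conversation, a flag like this is only a prompt to check in or point to resources, not a diagnosis; the vacation case is exactly the kind of benign deviation it would also catch.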

Sandra Matz [00:20:52]:
And then, as you mentioned, I think the treatment part, with just having this chatbot as an alternative for people who otherwise can’t access mental health care, which is most people. Right? According to the World Health Organization, I think there are 13 professional therapists for every 100,000 people looking for treatment. So there’s this huge gap in supply and demand. And even though those chatbots are maybe not perfect, they’re certainly better than not having any care at all.

Dave Stachowiak [00:21:19]:
And available at 3 in the morning when someone’s in crisis, too, when a traditional therapist wouldn’t be. Like, you go to great lengths in the book to say this is not a replacement for a traditional therapist necessarily. But, boy, the potential of the both/and here: to have a resource that’s available and to flag and to alert others. I mean, it’s really powerful. And this just brings me to maybe the obvious big question, which is, wow, what do we do with this? Right? As leaders. Because, of course, many people listening have organizations that have done some version of this in recent years, of doing psychological targeting and collecting data. And there’s a lot of incentive for organizations to collect as much data about people as possible, right, to be able to get them to do whatever they want, either good or evil.

Dave Stachowiak [00:22:13]:
Right? And for someone who’s listening, who’s thinking, okay, maybe my organization is thinking about doing this, maybe we’ve done a lot of this collection of data before, and we’re thinking about how we do this in the future: where do we start? Like, how do we think about this differently in a way that really supports the good we wanna create in the world with this?

Sandra Matz [00:22:30]:
Yeah. I think there are two main avenues that I see. The first one, and this is a recommendation that I make whenever I work with companies, is that to the extent that you can, I would always involve customers and end users. Because, first of all, when you use these predictive models, right, translating data into psychological insights and then using them to try and shift behavior in a certain way, you’re gonna make mistakes. So those models are pretty damn accurate on average. But at the individual level, you’re gonna make mistakes. And that’s not only annoying for your customer and user, but it’s also very costly to you. Because if I think that you’re someone who you’re not, and I’m now optimizing my entire product offering, servicing, and the way that I communicate with you through that false profile, that’s just a waste of money, and it’s annoying for you, and I’m most likely gonna lose you. And it’s also a question of trust. Right? I think we’ve seen enough examples now where consumers figure out that companies collect all of this data in a way that they don’t appreciate.

Sandra Matz [00:23:29]:
So whenever you can, make it part of the conversation that you have with users. My favorite example is actually a project that we did over 10 years ago now with Hilton. They were trying to figure out how do you integrate some of these insights into the recommendations for vacations. Right? How do you create the best personalized experience for users? Instead of saying, I’m just gonna passively grab some data, predict your psychology, and then try to get you to spend more, they created an experience for their users and their customers to say, hey, why don’t you help us understand who you are? We’re gonna create a traveler profile by you logging into your social media. And so it’s all based on consent, and it’s part of the value proposition. Right? It’s saying, well, by giving us access to your data, we’re not only gonna show you the predictions that we make, but we’re also then directly creating value for you by tapping into these psychological profiles. So this is number one.

Sandra Matz [00:24:30]:
I think it’s just, to the extent that you can, I would always involve your customers and users. And number two is this argument, and I really like the way that you described it, because I think it’s a philosophy that a lot of companies have and that was pushed for a long time: the more that you collect, the better. Because, first of all, even if you don’t need it right now, it’s always good to have it, because it’s very easy and it’s cheap to collect. So why wouldn’t you?

Dave Stachowiak [00:24:56]:
Right.

Sandra Matz [00:24:57]:
And I think that’s no longer true. It’s no longer true that you can only offer personalization, that you can only offer the best service, if you collect all of this data. And that’s because we have new technologies that allow you to essentially extract intelligence from data without collecting it. The technology that I’m referring to is often called federated learning. And it’s the idea that, traditionally, take Netflix, for example. So traditionally, what happened is Netflix is trying to create these recommendation algorithms that figure out which movies you might be interested in based on your past viewing history and what everybody else is into. And it used to be the case that you just had to send all of your data to Netflix.

Sandra Matz [00:25:38]:
They process it on a central server. They create these models, and that’s how you benefit. Now, the fact is that we do have these insane supercomputers in our pockets. Right? Our phone is so much more powerful than the computers that were in the Challenger, for example, that we used to launch into space. So we have these extremely powerful machines in our pockets. And what Netflix can do is, instead of grabbing our data, this data can stay on our phone and Netflix can just send the intelligence. So Netflix sends the model to my phone locally. It updates.

Sandra Matz [00:26:13]:
It sees which movies I like. So it improves my recommendations locally on the phone. And then, instead of sending back the data, it just sends back the intelligence to Netflix and says, okay, now I’ve learned something about which movies go together, but I’ve never seen the actual data. And so this is an incredible technology where you can keep the data safe, but you can still generate all of these insights. And the question that I oftentimes get is, why would companies do that? Why would companies not collect the data themselves, but instead go to technologies like federated learning? And I think there is a very good argument to be made. If you’re in the business of selling customer data, then you’re probably gonna collect as much data as you want no matter what.
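Sandra’s Netflix description is essentially the federated averaging idea: the server sends the model out, each client improves it on data that never leaves the device, and only updated parameters come back to be averaged. A minimal sketch of that loop, assuming a deliberately toy one-weight model and made-up client data (nothing here reflects Netflix’s actual system):

```python
# Federated averaging sketch: clients hold private (x, y) pairs drawn from
# y = 2x; the server only ever sees model parameters, never the raw data.

def local_update(weight, data, lr=0.01, steps=20):
    """Client-side step: improve the model on-device via gradient descent."""
    for _ in range(steps):
        # gradient of mean squared error for the prediction y_hat = weight * x
        grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
        weight -= lr * grad
    return weight  # only the parameter leaves the device

def federated_round(global_weight, client_datasets):
    """Server-side step: send the model out, average the returned parameters."""
    updates = [local_update(global_weight, data) for data in client_datasets]
    return sum(updates) / len(updates)

# Three clients with private data; the server never reads these lists directly.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0), (4.0, 8.0)],
    [(5.0, 10.0), (1.5, 3.0)],
]

w = 0.0
for _ in range(30):  # repeated communication rounds
    w = federated_round(w, clients)
# w converges toward 2.0 without the server ever seeing the raw pairs
```

Real systems (recommendation models, keyboards, health apps) swap in large models and add safeguards like secure aggregation, but the privacy shape is the same: intelligence moves, data stays put.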

Sandra Matz [00:26:57]:
But if that’s not the case, you’re much better off saying, well, I can provide the best service with the best personalization, but I’m not responsible for safeguarding all of the user data. Right? Netflix is a simple example, but think about this in the medical space: medical histories, genetic data. The moment that you collect this centrally on your server, that’s a huge responsibility to protect it. And we’ve seen data breaches on the rise across the board. It’s extremely expensive for companies to deal with these security risks, both financially and reputationally. So if you are, again, not in the business of selling data, you’re much better off saying, I can provide the same service, maybe even make it part of my value proposition: we are a company that gives you the same product without the risk of your data being out there. Instead of hoarding it all in one place, where now you’re responsible and you have this huge risk of data breaches.

Sandra Matz [00:27:52]:
So I think this is a trend that we already see playing out, and there are some pretty powerful players behind it. Apple and Google have both developed some of these federated learning ecosystems, open source, by the way. And I think this is gonna be the future for many companies that have the best interest of consumers at heart and also wanna reduce the risk of some of these data breaches.

Dave Stachowiak [00:28:15]:
It’s such a critical way to think about this. In our own organization, and we’re a tiny, tiny organization, Sandra, we have the principle of let’s keep as little data as we possibly can about people, for exactly that reason. And that’s easy for us because it’s my wife and I, and we have a few contractors. Like, it’s easy for us to have that policy and to implement it. One of the things you pointed out to me, though, is that in a larger organization, where maybe a CEO or an executive director is really aligned on good practices with data, you also have to think about the future in systems, because sometimes someone else steps in, the next CEO, the next executive director, the next whoever, and they have a very different opinion about how data should be used. Right? And so I think part of this is thinking through for the future. How do you design systems knowing that people are gonna change those seats sometimes?

Sandra Matz [00:29:11]:
Yeah. Exactly. It’s one of my favorite examples from industry, and it’s coming from Apple. A friend at Apple at some point told me about the evil Steve test that they apparently make their teams go through. And the idea of the evil Steve test is that they know when they have teams working on products, everybody’s excited. Right? You’re putting out a new product, and most of the people there are thinking, this is the data that we collect, but we’re collecting it to make the product better, and here’s how we’re using it. Now that means that oftentimes there’s a devil’s advocate that says, well, but wait a second. What would happen if this data was actually being abused now that we’ve collected it? It’s really hard to play the devil’s advocate.

Sandra Matz [00:29:54]:
So with the evil Steve test, they essentially make all of the teams go through this thought experiment: well, maybe you have the user’s best interest at heart right now with the way that you’re designing the product and the data that you’re collecting. But what would happen if tomorrow we have a different CEO? That’s where the evil Steve comes from. We have a different CEO with completely different values. They’re not trying to help our users. They’re trying to exploit them the best they can. Would you still feel comfortable collecting the data that you’re collecting today and setting up the system in the way that you’re setting it up right now? And if the answer is no, if the answer is, well, there is a lot of potential for abuse, and I wouldn’t want anyone whose intentions are opposed to what we are trying to do here to get hold of that data, then you go back to the drawing board and say, okay, maybe there’s a way in which we can do this that doesn’t put our users at risk of the data being exploited in the future.

Sandra Matz [00:30:51]:
And I think it’s just such a nice thought experiment, where you don’t necessarily have to have a single team member that says, but wait, what about all of the risks? Because usually that team member isn’t hugely popular. So it’s a systemic intervention that you can implement as a leader. It says, okay, we’re collectively thinking through the risk, and we’re bringing the future into the now to see if we can do better.

Dave Stachowiak [00:31:17]:
Yeah. It’s so critical, and so much of leadership is answering the question of change. Right? We are all called to have the responsibility to think, okay, not just today, but 5 years from now, 10 years from now, how do we put in place systems with the best possible practices? Fascinating. Just fascinating. I hope folks will get the book, Mindmasters, to get into the details of thinking about this and understanding how organizations are making decisions. And just on a personal level, like, I consider myself a pretty savvy person on privacy and data and targeting and all that. And I still went into my smartphone after reading the book and started changing a bunch of settings, because how you think about this really does show up differently.

Dave Stachowiak [00:32:03]:
And speaking of showing up differently, I’m curious: in putting this book together and doing all the research you’ve done over the last couple of years, and so much is changing on this, of course, what, if anything, have you changed your mind on in the last year or 2?

Sandra Matz [00:32:18]:
Yeah. It’s a great question. And one of the questions that I think is related, and you mentioned it, right, you going in and changing the permissions on some of the applications. I think I’ve become a lot more pessimistic, just observing my own behavior, in terms of how much we can do as consumers. If you look at some of the data protection regulations in Europe and California, they all tried to empower consumers by advocating for transparency and control. So we explain to users and consumers what’s happening with their data, and then we give them the control to manage it. And it’s a really, really nice idea in principle, and I do think that we need control.

Sandra Matz [00:32:57]:
But just looking at my own behavior, it’s an impossible burden. It’s not really a right to say, oh, now I get to read all of the terms and conditions of all of the products and services that I’m using, and I have to keep up with all the latest technology to see how I might be exploited by companies. And, by the way, it’s also a full-time job, because if you really wanna do it for everything that you use, you’re no longer gonna share a meal with your family. All you do is read through the terms and conditions and manage permissions. And I don’t do it. Right? I’ve thought about this topic almost every day for the last 10, 15 years, and I constantly catch myself saying yes to stuff where I’m like, I just don’t have the time. I don’t have the energy to now manage all the cookies. I’m just gonna mindlessly say yes.

Sandra Matz [00:33:40]:
And to me, it suggests that we just need different solutions. Right? That could be anything from privacy by design at the regulation level, which makes it a lot easier for people to protect their data, to something like federated learning, where you don’t have a trade-off between, I want the service and convenience, but I also want to maintain a certain level of privacy and self-determination. So I think that’s really something that’s changed in my thinking: I don’t think putting the responsibility on users is gonna do the trick. I really do think that we need regulators to step up, and business leaders.

Dave Stachowiak [00:34:16]:
Sandra Matz is the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior. Sandra, thank you so much for your work.

Sandra Matz [00:34:26]:
Thank you so much, Dave, for having me.

Dave Stachowiak [00:34:32]:
If this conversation was helpful to you, there are 3 related episodes I’d recommend. One of them is episode 381, Serve Others Through Marketing. Seth Godin was my guest on that episode. He’s been on the show a few times over the years. I’ve been following Seth’s work for more than 20 years, and it’s where I first learned about the principle of permission marketing: getting people’s permission before your organization engages with them and their messaging. You heard echoes of Seth’s message in this conversation with Sandra.

Dave Stachowiak [00:35:04]:
So critical on that as well. Episode 381, a good complement to the conversation today. Also recommended: episode 521, The Way to Earn Attention. Raja Rajamannar was my guest on that episode, chief marketing officer at Mastercard. He talked about many principles of data and marketing in that conversation. One of the points he made is that he and his team at Mastercard think about relationships with consumers as affinity, not loyalty, at least not anymore. He talked about that distinction in that conversation, plus a lot more on the importance of engaging people in the story. Episode 521 for that.

Dave Stachowiak [00:35:42]:
And then finally, I’d recommend the episode with Marcus Collins, episode 664, The Reason People Make Buying Decisions. When people are making a buying decision, or a decision to engage, donate, or affiliate with an organization, there’s always a decision to connect with the brand, whether that brand’s a school, a nonprofit, a for-profit, or even a government agency. There’s a brand behind it. One of the points that Marcus makes in his work and in that conversation is that it’s not really about the brand. It’s really not about you. It’s about how people view themselves. Such an important principle to consider when thinking about how to engage and earn people’s attention. Episode 664 for that.

Dave Stachowiak [00:36:27]:
All of those episodes, you can find on the coachingforleaders.com website. And if you haven’t already, I’m inviting you today to set up your free membership at coachingforleaders.com, because that’s gonna give you access to the entire library of episodes that I’ve aired since 2011. There are several areas you can dive into to find exactly what you’re looking for. We’re filing this episode under marketing, also under social media, and under data and analytics. I mention that because there are a ton of other conversations in all of those categories inside the episode library. It’s hard to find that on the apps, but that’s why we’ve built the website: to support you in finding exactly what you’re looking for right now. Go over to coachingforleaders.com and set up your free membership. When you do, you’ll have full access to that.

Dave Stachowiak [00:37:16]:
Plus all of the other benefits of free membership that you’ll see right there on the homepage. And if you’re looking for a bit more, you may want to find out about Coaching for Leaders Plus. One thing I’m doing every single week is writing a journal entry. It’s short form, takes just about 2 minutes to read, but it is my thoughts specifically on a reflection from one of the guest experts who’s been on the show, or on a situation one of our members has found themselves in recently. That’s what happened this past week. One of our members said, I’m about to move into a new role. I’m the heir apparent for this new leadership position, but because of the bureaucracy, it’s gonna take a couple of months for it all to play out. What do I do in the meantime? Do I start doing things? Do I start putting in initiatives? Or do I wait? Well, as we all know, there’s a lot you can’t do when you don’t officially have the role, even if you’re the person who’s going to have it eventually, but there’s also a whole bunch you can do in the meantime.

Dave Stachowiak [00:38:13]:
I talked about that in one of my recent journal entries. To find out more about that, plus all of the past entries, which are in a database on the website and searchable by topic, go over to coachingforleaders.plus. You’ll find out more about the entries and all of the benefits inside of Coaching for Leaders Plus. Coaching for Leaders is edited by Andrew Kroeger. Production support is provided by Sierra Priest. Thank you, as always, for the privilege to support you, and I’ll be back next Monday for our next conversation. Have a great week.

Topic Areas: Data and Analytics, Marketing, Social Media

Coaching for Leaders Podcast

This Monday show helps you discover leadership wisdom through insightful conversations. Independently produced weekly since 2011, Dave Stachowiak brings perspective from a thriving, global leadership academy of managers, executives, and business owners, plus more than 15 years of leadership at Dale Carnegie.

Activate Your Free Membership Today

Access our entire library of Coaching for Leaders episodes from 2011, searchable by topic.
Listen to the exclusive Coaching for Leaders MemberCast with bonus content available only to members.
Start Dave’s free audio course, 10 Ways to Empower the People You Lead.
Download our weekly leadership guide, including podcast notes and advice from our expert guests.

... and much more inside the membership!

Activate Your Free Membership
Copyright © 2025 · Innovate Learning, LLC