ITIF - The Information Technology and Innovation Foundation

06/28/2024 | News release | Archived content

Information Technology Is Increasingly Critical and Increasingly Demonized, With Daniel Castro

Over the last several years, public opinion on technology and the use of data has shifted from excitement to skepticism to fear. Rob and Jackie sat down with Daniel Castro, Vice President of ITIF and Director of the Center for Data Innovation, to discuss the negative effect of techlash on human outcomes.

Auto-Transcript

Rob Atkinson: Welcome to Innovation Files. I'm Rob Atkinson, founder and president of the Information Technology and Innovation Foundation.

Jackie Whisman: And I'm Jackie Whisman. I head development at ITIF, which I'm proud to say is the world's top ranked think tank for science and technology policy.

Rob Atkinson: And this podcast is about the kinds of issues we cover at ITIF, from the broad economics of innovation to specific policy and regulatory questions about new technologies. Today we're going to do something a little different. We're going to talk about how information technology and data are increasingly critical to the world we live in, but at the same time increasingly demonized by a wide array of folks. And which side's going to win: the good side, the angels, or the demons? That's what we're going to talk about.

Jackie Whisman: We have our first repeat guest. We think he's an angel, but I guess it's up to the audience to decide. Daniel Castro is Vice President at the Information Technology and Innovation Foundation, ITIF, and director of our Center for Data Innovation. He writes and speaks on a variety of issues related to information technology and internet policy, including privacy, security, intellectual property, internet governance, e-government, and accessibility for people with disabilities. That doesn't even seem like all you do, but it's a long list. We're happy you're back, Daniel.

Daniel Castro: Glad to be back.

Jackie Whisman: My first question is very important. How do you pronounce the word data?

Daniel Castro: Well, not surprisingly, I've actually looked into this quite a bit, and the answer is, it kind of depends. The word is plural, so you have words like medium and media, or forum and fora. So if you have one datum [DAY-tum] and then you have another datum, then you have data [DAY-tuh]. But you can also have one datum [DAT-um] and then another datum, and then you have data [DAT-uh]. So the debate continues.

Jackie Whisman: No.

Daniel Castro: On Star Trek they did say data [DAY-tuh], so that's the most popular pronunciation. That's the one we use. It's used pretty much everywhere except New Zealand. So I think we can go with data.

Jackie Whisman: Now that we have that very important thing out of the way.

Rob Atkinson: Now that we have data on this.

Jackie Whisman: No.

Daniel Castro: Data [DAY-tuh]. In Texas, it's data [DAT-uh].

Rob Atkinson: Data.

Jackie Whisman: Why don't we talk about the work of ITIF's Center for Data Innovation? It seems like over the last decade or so, public opinion on tech and the use of data in particular has shifted from excitement to skepticism and oftentimes downright fear. We've talked about this a lot on the podcast, and I imagine the shift has driven much of your work in recent years. Is that right?

Daniel Castro: Yeah, absolutely. I think if you look back 20 or 25 years, tech was geeky but cool. Tech had a lot of promise. Nobody thought it was perfect. You had things like the infamous blue screen of death. There was the Y2K bug that people were all concerned about, and sometimes industry obviously over-promised and under-delivered. But by and large, tech was seen as this force for good. If you think back to the Clinton years, the Bush years, and even the early Obama years, tech was something that these administrations wanted to be associated with. Tech was seen as the best of America. It was something to celebrate and emulate around the world. But I think since then there's been this relentless assault on tech. And it's more than just the pendulum swinging back a little bit. It's open hostility. And some of this is driven by the media.

In fact, we did a study a few years back that looked at how press coverage of tech has gotten more negative over time, and public opinion shapes public policy. So instead of focusing on common-sense regulations to address genuine consumer concerns, like, how do we have better cybersecurity? How can we create a baseline set of privacy laws? We instead have proposals that are really focused on punishing tech companies for being large and successful. We're seeing policymakers really trying to get their pound of flesh, to embarrass a tech CEO at a congressional hearing. And that's not, I think, where we should be in this conversation around tech.

Rob Atkinson: Couldn't agree more. I think there's sort of two big issues here. One of the issues is that, and Daniel, you and I covered this for many years. We had a summer reading list for tech books for like, I don't know, six or seven years. And we'd have really good books on there, good meaning accurate and interesting. And then we'd have a list of bad books. These are the books you shouldn't read. And every year there were more and more of these, and they were basically just people writing these books because if they'd written a book that said, "Hey, tech is pretty cool, it's pretty good. It's going to make the world a little better," who cares?

But if you write a book that says tech is going to make us stupid or tech is evil, then you're going to get your podcast appearances, and your book is going to be covered in The New York Review of Books. So there was a lot of that going on, just people sort of cashing in on getting their cred. And then the other thing is that as the tech companies became big, they became targets for shakedowns, essentially. "Well, these companies are so big, we better go after them." But with all of that, we seem to have lost some semblance of reality. How do you see that?

Daniel Castro: Yeah, I think what you say is exactly right. There's been this vilification of the tech companies over time. I mean, you see it play out every time there's a discussion of tech. Take Mark Zuckerberg, because he's the guy everyone likes to hate on, and it's easy to go after him. But it misses the point that these are companies that have created enormous value for consumers, leading to significant benefits across the economy, in healthcare, education, government, on down the line. Tech is just this easy target. Whenever there's something wrong, well, why don't we blame it on tech? And we've seen this again and again.

If anything's going wrong in society, you can guarantee there will be an article asking, "Well, how did social media contribute to this? How did tech contribute to this?" It's kind of the low-hanging fruit, but it misses the fact that there's so much more going on. It also discourages, I think, really talented people from going into the industry. I mean, think about someone who's deciding where to go to college this year: if they're the best and the brightest, why would they want to be part of a sector that is just vilified across America? They make it seem like you're working for a tobacco company or something similar.

Rob Atkinson: If you want to go into a vilified industry, at least you should make a boatload of money. So they're going to go to hedge funds and be quants at hedge funds. You make like 10 times more than you would even working in Silicon Valley. I actually had a little cartoon in my office, and it had a picture of a bus, and it said, "Luddites unite. Go to luddites.com." But literally, I saw somebody a while back who had a Facebook page that was opposing Facebook. Okay, that makes a lot of sense.

Daniel Castro: Yeah, we've seen these campaigns against targeted ads, for example, where you go through the list of members of parliament who are supportive of these campaigns to ban targeted ads, and they're using targeted ads for their own fundraising.

Jackie Whisman: Well, I'm always frustrated when we're having these conversations, both work conversations and social ones, that these fear-based conversations don't ever really acknowledge the clear social benefits that data does provide. You alluded to it, but I'd like to go back and talk more about it, mostly because I'm having dinner tonight and I could use some of these data points for my arguments, if you don't mind giving some examples.

Daniel Castro: Sure. I mean, I think the most important thing is that better data leads to better decisions, right? And this is true in your personal life, at the business level, and at the societal level. You can make smarter decisions, faster decisions, you can automate more decisions. And we see this, as I mentioned, in virtually every sector. Think about healthcare, or health and prescription data. Using this data, you can identify problems like the opioid epidemic sooner. You can figure out where there are problems, where there's over-prescription. You can look at problems like emerging diseases and figure out where they're going, how they're being transmitted. You can figure out how to get the right resources to the right places. Or look at something like college data. There are always these questions students have about where they should go to school, what makes sense. There are questions right now around student loans and what students and parents should be putting their money toward.

Right now, you don't have very good data to make a decision about where you should go to school, what kind of financial impact that will have on your future, or what degree you should get at that school. More data over time on where people go to school, what degrees they get, what they earn, and how effectively they pay off student loans: that type of data could be really empowering for individuals to make better decisions. More recently, one of the things we saw during the last couple of years is around mobility data. There's always this question of, "Oh, your cell phone's tracking everywhere you go."

And there is a lot of data from different apps that capture, in aggregate, where different populations of people are moving. That data has been incredibly valuable in understanding the pandemic and its impact on businesses. It's been very useful, for example, when refugees are coming in from different countries, in understanding where they go and where resources need to go. That data has been instrumental. These are all uses of data that have enormous societal value, but instead it's usually just portrayed as, "Oh, big government or big brother, these big tech companies are just stealing your data because they want to sell you more ads." That's really not all that's going on here.

Rob Atkinson: Well, the other thing is I think we should have a rule that says if you are against technology and big tech companies, you should be required to give up five applications and never use them again. And one of them would be any kind of mapping system. Anytime I go anywhere in my car, which isn't all that often because I ride my bike (even riding my bike, I use an app), I would say, okay, no more Google Maps. Or you could have Google Maps, but it won't have traffic on it. Good luck with that. We just went on vacation, and we used this pretty cool app that we use all the time called Vrbo, V-R-B-O, Vacation Rental by Owner, and it's fantastic. We found this great place, we saved money. But how would you possibly have done that in the past?

You'd maybe look through some catalog and you'd call people. So I agree, Daniel, there are all these really fundamental things at that kind of societal level, but also just the simple things that we do in our lives. One thing I would like you to talk to for a moment, because another thing that really drives me crazy about all this, and you see it almost all the time in the paper, is this claim that these companies are selling your data. Almost all the time, they're not selling your data. They're selling the placement of somebody's ad in front of your eyeball. So when I get an ad from, I don't know, Procter and Gamble or Ford Motor Company or whatever, they don't know that it's Rob Atkinson looking at the ad. What they know is that somebody's looking at the ad who, for some reason, the algorithm thought would be a good fit for that ad. That's all they know.

Daniel Castro: Exactly. And that's where I think there's so much confusion, right? Because if all you hear is that companies are stealing your data, that they're selling your data, you never get any of that nuance in these articles that make these claims. It's almost insane that you have newspapers that do fact-checking, and yet they repeatedly put out these claims that are just completely, factually wrong. And it gets to the fact that these are complicated ecosystems. The ad-supported internet involves a lot of different players. Someone will create a service that works with a mobile app and allows the app developer to easily monetize their app. So there's a lot of data exchange going on in the background. But you're right, at the end of the day, they're not selling consumer data. They're only selling information that says, "This user is interested in buying shoes, so show them a shoe ad." They're not really selling anything else.

Rob Atkinson: I've been in a lot of these soirees, meetings where this comes up, and I ask people, by the way, how much money do you pay monthly for your Gmail account or your Facebook account or Google or whatever it is? And obviously the answer is zero. Then they always say, "Oh, well, I'm paying with my data." Well, no, you're not. But then most of these people say, "Well, I don't mind paying. I would pay it." I'm like, "Well, that's because you're an upper-middle-class, highly compensated professional; if you wanted to pay 50 bucks a month for each of these apps, it would be no skin off your nose." But what about the tens of millions of people who are making median income and below? Asking them to pay 250, 400, 500 dollars a year for these things is a huge, huge imposition. And so that really gets into this whole question of digital opportunity, and also what I know you and your colleagues are working on: this whole question of the data divide.

Daniel Castro: Absolutely. And there have been some studies where you ask people not only whether they want to use these things, but how much someone would have to pay them not to use them. Those numbers get quite high. "How much would I have to pay you not to use Google Maps for a year? How much would I have to pay you not to have email for a year?" It goes up quite high. So the value to consumers is quite high, and the cost to them is quite low, and that's why so many make this trade-off. And yeah, you mentioned the data divide, something we've been doing a lot of work on. That's around the fact that for so many people, the biggest issue is not privacy. It's not that too much data is being collected about them; it's that too little data is being collected about them.

And because they or their community are underrepresented or invisible in data sets, they're missing out on opportunities. Maybe there isn't good information about the sidewalks in their neighborhood, so they don't get very good walking directions when they try to navigate somewhere. Maybe there isn't very good environmental data collection in their neighborhood, so when decisions are made about what kinds of interventions to make, nothing happens there. Maybe there isn't good health data collected about their community, so health research isn't going where it should. These are really important issues. And some of these privacy laws that are maybe well-intentioned end up having this hugely negative effect, because they limit data collection, they discourage data collection, because of the narrative around it.

And the end result is that you're keeping very useful tools and processes out of the hands of people who could use them. A very concrete example of this came a few years ago with a nonprofit called inBloom, which the Gates Foundation had funded to basically help schools do back-end data processing for their students so they could connect various new types of apps to their student data systems, provide different types of analytics to teachers, and help parents understand whether their students were doing well and where they might need additional tutoring. All these opportunities, at basically no cost to these schools. But because there was concern about data being stored in the cloud and what that might mean for privacy, they shut the entire thing down. And that's the kind of techlash we talk about that has these real-world implications.

Rob Atkinson: Well, also, to me it stems from this view where each individual citizen sees the world through, "Okay, it's my data and I don't want anybody to see it," but they're much less likely to see the world through the lens of, "Oh, there's this data system out there that would help us all." inBloom is a great example of that. "Well, maybe I don't want my kid's data to be there, but boy, I'm not thinking at all about how much better the education system could be if it were being run on data analytics." I'm reminded of that event we did years ago with Catherine Tucker from MIT and her colleague from the University of Virginia, where they looked at data privacy limitations on health data by state.

And what they showed, using a regression model, was that states that restricted data sharing had higher rates of infant mortality among Black, low-income populations, because you just couldn't share data when you needed to between clinicians and hospitals and the like. That's a real-world impact. So we hear all this stuff about how data is almost seen as your fundamental, core right: we should never, ever share it. And yet if we do share it in ways that are respectful of privacy, including using de-identification, there are enormous collective societal benefits that we will all share in, especially, as you point out, in terms of the data divide, making sure that underrepresented or lower-income populations can also benefit.

Daniel Castro: And that's why we're seeing, even in Europe, where they had the General Data Protection Regulation, which they've kind of held up as this gold standard for data, they've recognized it didn't work so great, and now they're trying to pass all these new laws, the Data Governance Act, the Data Act, and a few others, where they're basically saying, "Wait a second, we need to figure out how to make data sharing work, because it's not working right now." They saw it during the pandemic. They've seen it in many different applications. And as technologies like artificial intelligence and machine learning become more important, they realize that if they don't have data, they'll never succeed in the algorithmic economy. So they need to fix these things. And that's a lesson for a lot of countries, including the United States as it thinks about doing its own privacy law.

Rob Atkinson: Well, this is great, Daniel. What's your prognosis, I guess? It seems to me we're really on this cusp of moving to a world that is vastly more intelligent. I think of healthcare: healthcare is essentially not a very intelligent system. If we knew so much more about the drugs people took, all the interactions, the health outcomes, all that, we could really drive significant improvements in healthcare. Same thing in education, same thing in logistics. Are we going to eventually get there? Are these just bumps in the road, or something more serious?

Daniel Castro: I think we eventually get there, but I do think these bumps have real-world consequences. When we're delaying the development of new drugs because we have laws that don't allow people to share their health data, when we're delaying bringing better education to schools because we're not able to deploy analytics in the schools, these are real-world consequences. So I think the delays matter. Whether it's a five-year delay or a 10-year delay, I don't think we know yet, but it's significant. I do think we eventually get to the point, though, where so much of this technology is pervasive enough that it's hard to have opposition to it, because it's just there.

Kind of like we don't see much opposition anymore to computer chips. They're just out there. We don't see the opposition to RFID. It's just out there. Eventually, the algorithms and AI will be out there enough that people will have moved on to the next thing. But I do think it's a problem if we have continued technological resistance. So hopefully the work that we're doing around these issues will help educate the public and policymakers on how to embrace innovation in a realistic way in the future.

Rob Atkinson: So my second-favorite podcast, besides Innovation Files, is the Pessimists Archive. It's a great podcast because it goes through all these similar kinds of things. There were panics over the Walkman, panics over teenagers reading novels, panics over women riding bicycles. I mean, you name it. Almost every technology has been demonized. And then people get used to it, like, "Yeah, let's move on to the next one." So let's keep our fingers crossed that that's what's going to happen here. Daniel, thank you. That was great. Really interesting discussion.

Daniel Castro: Thanks for having me.

Jackie Whisman: And that's it for this week. If you liked it, please be sure to rate us and subscribe. Feel free to email show ideas or questions to [email protected]. You can find the show notes and sign up for our weekly email newsletter on our website itif.org. And follow us on Twitter, Facebook, and LinkedIn @ITIFdc.

Rob Atkinson: Yeah, don't follow us though if you care about your privacy because clearly Facebook, Twitter, LinkedIn, and all the other ones just sell your data, and you'll never have any privacy again. So we have more episodes lined up. Great episodes, great guests. Continue to tune in.

Jackie Whisman: Talk to you soon.