Surveillance Won’t Protect Students

Chris Gilliard

Notes

Paris Marx is joined by Chris Gilliard to discuss the push to expand surveillance technologies in schools during the pandemic and in response to school shootings, and why they’re making life worse for students without addressing the problems they claim to solve.

Guest

Chris Gilliard is a Just Tech Fellow at the Social Science Research Council and a recurring columnist at Wired. Follow Chris on Twitter at @hypervisible.

Support the show

Venture capitalists aren’t funding critical analysis of the tech industry — that’s why the show relies on listener support.

Become a supporter on Patreon to ensure the show can keep promoting critical tech perspectives. That will also get you access to the Discord chat, a shoutout on the show, some stickers, and more!

Links

Transcript

Paris Marx: Chris, welcome back to Tech Won’t Save Us.

Chris Gilliard: Oh, thank you very much for having me. It’s always an absolute pleasure.

PM: I always love having you on the show, of course. And I feel like the topic that we're discussing today is in line with some of the things that we've talked about in the past. Our first conversation was, to a large degree, on your work on digital redlining. And part of that had to do with the filtering that was happening at a college that you were teaching at when students were trying to access things that were online and do their research. And then more recently, at the end of last year, we talked about all of these surveillance devices that are being pushed on us increasingly, in particular in our homes, and really what that means for how we understand the home and what it means for our communities more broadly. And recently you wrote a column for Wired looking at the push to increase the surveillance in schools, and what is driving that. And I think that this is a really important topic, especially in this moment when there is such a big discussion about schools, when schools are so impacted by the aftermath of the pandemic and we're still talking about the impact of school shootings. And really, that conversation has been ongoing for several decades now; it's certainly not a new thing. And so I guess I wanted to start with a broader question, and then we can dig into many of the specifics that you've talked about, both in your piece and more broadly on Twitter. Why is this topic of school surveillance — if it's not an obvious question or an obvious answer — why are you so concerned about this topic? Why is this something that we should be paying more attention to, and really wanting to push back on?

CG: Well, I think it's not obvious. It's a pretty layered question, and hopefully I can provide a layered answer. One of the things I try to get people to think about is just how important this is. It's a pillar of a functioning society, whether or not you can protect the most vulnerable, the youngest children, and in a lot of ways we're failing at that. I think there are a lot of ways that people are hoping to address it. And one of the things that happens, I think, is that barring an answer, or barring the political will to do things that we know might work, that other societies, other countries have done, people are grasping at anything they think will result in safer schools and safer children. And so what we see is a lot of tech solutionism applied to these issues. I'll be fair to companies in a way I'm usually not, when I say that there are a lot of people looking for solutions. There are also some people who are just in it for the cash grab. And they often promise things about their technology and its ability to ferret out danger or possible school shooters, or to predict certain kinds of behavior and things like that. They make all kinds of wild promises about what their technology can and can't do. And institutions, schools, are desperate for solutions, because again, it applies to an essential element of society, which is keeping children safe. People will do almost anything to do that. And they'll glom on to anything that seems like a solution in order to move forward.

PM: It makes perfect sense. Obviously, people are seeing, in particular when we're talking about the mass shootings at schools, these children die. And at least for some people, there's a really kind of inherent reaction that we should want to keep these children safe, that we should not want to have this happen in schools. But I guess part of what I wonder is how this conversation has evolved over the course of a couple of decades now, because obviously, the most recent shooting in Texas was not something that was entirely new. There's a long history of this going back to Columbine, and I'm sure even before, in the United States. So how has that conversation about dealing with this in schools actually evolved over that time period? Has it always been this kind of response?

CG: Well, that's an interesting question, and I'm not sure I know the full answer to it. I do know, as someone who does a lot of research looking at surveillance and the history of surveillance, that the way this has evolved is the way that a lot of technologies, or "solutions" in quotation marks, have evolved over the last 20, 30, 40 years, which is the notion that more surveillance is somehow going to increase the level of safety. As we've had more technologies available at somewhat cheaper prices and things like that: more cameras, more things that people are calling artificial intelligence, more machine learning, more systems that can be installed on young people's computers. As these have become more available, they've been pitched more and more as some kind of solution to this problem. But again, to go back to that history of surveillance, there's a very long-standing assertion by advocates of surveillance that more surveillance equals safety. That has not borne out. There are a lot of examples we can use, whether we want to look at the prevalence of CCTV in the UK, or police body cams in the US, or the general level of surveillance in the United States. All of that doesn't equal more safety, but people often feel like it does. And I think that gets to part of your question, which is why these systems have become so prevalent in schools in the United States, and not only in the United States, but that's where most of my focus typically is.

PM: Absolutely, I think it's an important point, and I think you're completely spot on. When I was reading your piece, I thought about some of the things that have been added to schools, or that I've read about being added to schools, in the past decade or two in response to this desire to stop school shootings or find some way to stop them. Have they really been successful? I don't think so. Things like ID cards, or adding onsite security people, whether that's police or someone who's just hired to be security at the school, probably someone who has a gun as well, because of course one of the slogans of the NRA is that you need a good guy with a gun. The metal detectors in schools to try to find someone bringing a gun into the school; the clear plastic backpacks that some of the students have to wear so you can see the contents of the backpack; security cameras, of course, as you say, with CCTV being rolled out. And I'm sure there are many other examples. I don't know if you have any thoughts on that, or anything that you'd want to add. But there does seem to be this long history of trying to add these different layers of security or policing or surveillance to the school. And really, that doesn't address the inherent, fundamental problem.

CG: Absolutely. We saw, and we continue to see, that after some of the more recent shootings there was an increase in talk about hardening schools, using military language. The way to think about it, or the way I think about it — I don't love this term, but I need to talk about it in the ways that people are talking about it in order to kind of tease it out — is making the school a more difficult target; that's what they mean when they talk about hardening. And what that essentially means is turning schools into prisons or military installations. And again, that gets to the thing that I mentioned early on, which is — how shall I say this? — that it's a very sad indicator of where we are as a society that the proposed solution to making schools safer is to essentially turn them into prisons or military installations.

PM: It also makes you incredibly sad, just to think about how the school is supposed to be, at least if we imagine it to be, this place that is welcoming of students, where they can learn, explore different ideas, learn to be creative, explore their different passions, figure out what they're interested in and learn about the world. And then to think about that kind of space, which should be playful and creative and fun and engaging and enriching, increasingly being turned into, as you say, a kind of prison atmosphere, where students are constantly being surveilled, where they need to feel worried about what they share, even on their devices, because someone at the school might see it and take it as an example of them being a potential threat to the school. It just seems completely counter to what we want a school to be. And then I also wonder, when we think about how this affects different people, is this something that is just happening to people in public schools, while wealthy people who go to private schools wouldn't have to deal with something like this?

CG: Yes, so the Center for Democracy and Technology just did a really interesting study about school surveillance systems, and especially some of the software that's put on kids' computers and things like that. And one of the things they noted is that something like 50% of the students who they talked to said that they did not feel able to express themselves freely and openly when they knew that they were being watched. This number increased when we're talking about queer and trans students, or people who are in some ways exploring their gender or their sexual identity. And so, in the bedrock of how we think about schools and what the most fertile ground is for people to learn, one of the first things that you have to take care of is people feeling safe. By hardening these schools and increasing the level of surveillance, I think you fail to do one thing and you do another thing that people are trying not to do. There's no evidence that these things actually make schools safer; the independent studies that I've seen suggest that they do not, and that in fact they empirically make certain segments of the school population less safe. But in addition to not doing the thing it's supposed to do, by adding this constant surveillance you're making people feel less safe, not more safe. So not only does it fail to do the most important thing, the thing that boosters of these technologies and these practices and behaviors propose that it does, but in some ways it makes the situation worse, because young people do not feel free to express themselves and do not feel safe in a place where that's one of the most important things you can have if you want learning and growth.

PM: It's so concerning, because it's the complete opposite of what you actually want, of how you want the students to feel and what you want the school to be. Obviously, there are renewed proposals for the expansion of this kind of surveillance in schools, in particular after the recent shooting in Texas. Can you walk us through what the vision for this kind of hyper-surveilled school, which is supposed to be super safe as a result, looks like? What other measures do these people want to see added to schools that they claim are going to make the school safe?

CG: Well, if I could paint a picture, it's going to be very extreme, or it's going to sound very extreme to someone who doesn't follow along with this tech and these narratives. But essentially, it would look like a prison or a military installation. There might be guards out front; there would be metal detectors; there'd be cameras everywhere, not only to get in, but cameras inside everywhere; there might be microphones that detect sounds; these cameras would maybe feed into some fusion center; they'd be equipped with facial recognition; there would be some kind of what people call artificial intelligence studying people's movements, their biometrics, their gait, in order to predict some kind of threat. There might be some additional layer of some kind of drone, or it could be a robot dog, that would seek to intervene in the case of a shooting. But also, there'd be software: as I mentioned, these systems would probably be equipped with some kind of threat detection, but there's also software that goes on students' computers, that looks at their emails, and their messages, and their social media, and the little notes they type in Google Docs, and their browsing habits. And so when I paint this picture, it might seem as if I'm being hyperbolic, but many of these things already exist and are widely deployed at schools.

And many of these things have been proposed — and credibly proposed — by companies who sell this stuff. And what I mean by credibly is that these companies often have the pull, the influence, to make these claims into reality. An example I would give is that the CEO of Axon — briefly, I might add; this was pulled off the table after wide backlash, for now — proposed the idea of equipping schools with drones that were armed with Tasers. To back up, if people are not familiar with Axon, they're best known for being the company that makes a lot of police body cams and produces the Taser. And so he proposed that we start installing in schools drones equipped with Tasers that, in the event of a potential school shooter, could be released from their housing, fly to the site of the threat, and distract and disable the potential shooter. He even put out a graphic novel depicting this future that he envisioned. There was wide backlash, including much of the civilian board of Axon resigning, because he had proposed it to them, many of them told him it was a bad idea, and he moved forward with his plans anyway. But when this became public, there was widespread backlash, and he eventually kind of pulled that off the table for now. I don't know that that is permanent.

But one of the things the CEO of Axon said, and I'm paraphrasing, and I think it's really important — so this is not a direct quote — but he said something along the lines of: I felt like we needed to do something. And I think this is really important, because I'm going to take him at his word for a moment, in that he's looking at the situation, which we all agree, each time it happens, is an atrocity. So he's taking a look at the situation and feeling like he needs to do something. And part of the other thing that he said is that the gun issue was kind of a non-starter. And bluntly, part of the reason this happens here in the US, in ways that it doesn't happen as often or as gruesomely elsewhere, is the ready availability of guns. And so even the CEO of Axon sees the gun issue as something kind of intractable, to the point that he'll reach for what many people see as a dystopian fantasy in order to try to alleviate a thing which everyone agrees is an atrocity and needs to be stopped. It's a very gruesome picture, overall, to think about sending our children to these places in order to learn, to send them to these hardened institutions where they're constantly surveilled. But then, and this is one thing I left out, depending on who they are and whether or not they can afford their own device, they'll also be constantly surveilled at home, and this might even extend to other family members who may use that device because they don't have their own. I'll leave that there for now; we could get into some more stuff, but I'll leave it at that for now.

PM: There's plenty in there that I want to pick up on. I'll start with this, though, because I think it's really important, and I think it gets to something that both you and I talk about a lot in our work. When we're talking about the issue of gun violence in schools, it's inherently a political issue, an issue that requires a political solution. Because, as you say, as much as the NRA and people like that would like to say differently, it's happening because there are so many guns, and because guns are so easy to access in the United States. That's really the driving force behind all of this gun violence, and so you need a political solution in order to deal with that problem. But because there's not the political willingness to do that, these other solutions, techno-solutions in particular, especially more recently, step into that void and say: The politics isn't going to work, or, maybe if you're someone on the right wing, you don't need the political solution. Instead, we're going to solve this by saying: All we need to do is put these new technologies into the school, and that will make everything safe, and that will solve the problem.

CG: Absolutely. I mean, it's right there in the name of the podcast — the tech will not save us in this instance, and in most others, for that matter. And again, I think there's wide agreement on this: if there's a thing that would make children safer, most people across a very wide spectrum agree that it's a thing we should do. But there is no evidence that these things actually do make children safer. And that is the sort of dichotomy of the situation: people are clamoring for solutions, tech companies are offering solutions, and there's no proof that they actually work as solutions. And again, they make people less safe, particularly some of the most marginalized and vulnerable populations within a set of students.

PM: It seems like in that case, those solutions, whether it's adding an armed guard, or putting in metal detectors, or adopting these security cameras or AI systems or what have you, not only offer a tech solution or some other policing solution in place of a political solution, but also give politicians and other groups cover to say: Look, we don't need to change the gun laws or the Second Amendment or what have you, because we have these other ways to solve the problem. And every time one of these supposed solutions is shown to not work, there's just a new one waiting in the wings to be implemented instead, to further this kind of dystopian trajectory of schools, so that you never have to actually look at the political problem and the root of the problem.

CG: Absolutely. I've made this observation many times: a thing that you can consistently notice about surveillance is that each time it fails, what's offered is that more deep and detailed surveillance is needed. So it's never that the tech can't do a particular thing, that it is inherently never going to do that thing; it's that we need more of it. The failures of surveillance are always met with calls for more and deeper surveillance, more data, more cameras, improved AI. In an environment where people say we have to do something, it seems as if it's a solution. What I run into, what I even ran into when I was writing that Wired piece, is the excuse or the refrain of: It's better than nothing. And I actually don't know that that's true. Actually, I don't think that it is true. It's not better than nothing, because it only appears to be, or offers the illusion of, a solution. But again, it does a thing that makes certain students less safe. One of the things I referenced, that Center for Democracy and Technology study, found in its research that something like 70% of the time these systems were used for discipline, not for safety. And so this is a thing, again, that we see when we think about surveillance creep in these systems. The claim is that they're there to make students and the institution more safe, but one of the things they're actually used for is to perform their more carceral function, which is cracking down on students.

PM: I think there are two really important points there. First of all, especially in a moment after a mass shooting, it's difficult to be the person who says: No, we shouldn't have more surveillance in the schools, because the accepted wisdom is that this is going to make things safer, even though the evidence often shows otherwise. And then these systems are presented as if they are targeted at the shooters, the people who are making the school unsafe or posing a threat to the school, and so we need to deploy these technologies so that we can identify them and keep the schools safe. But then when we actually look at the implementation, those technologies are not pointed at the shooter, because shooters are rare and hard to predict, but are actually turned against the students themselves. And so they are subject to this more carceral, more punitive system, and the school becomes a place that's not about learning and being yourself and discovering things, but where you feel less safe and surveilled and that you can't be yourself.

CG: Absolutely. And it's only ever going to be this way. And I know that there's probably an air of tech determinism, or it may sound that way when I say this, but I frame it a different way. I've said in the past that surveillance always finds its level. And what I mean by that is that the nature of surveillance, not the technology, but the nature of the practice of surveillance, is that it is an attempt to exert control. And so we get controlled through discipline, or the claim is that control is achieved through discipline. So when these systems are put into place, they are instances of a particular set of ideologies about control; they are going to be used in ways that attempt to control students. And what that looks like, often, is discipline. And again, when I say discipline, people might think that what I'm talking about are things that lead to violence in schools. But it might be something like dress code; it might be something like eating in the hallway; it might be talking loudly; all these things that are an attempt to control students and that often have very little to do with their actual safety when they're at school.

PM: It's such an important point. I want to pivot a little bit to something else that you mentioned, because we've been talking about the physical infrastructure of the school to a large degree, and maybe the software that's implemented within that physical infrastructure. But you also talked about the real surveillance of the students themselves through the devices that they receive. And I think this became something that was particularly noted by people, or really came to people's attention in a significant way, in the first stage of the pandemic, when there were lockdowns, to some degree at least, and many students were not in school. And so they were given devices to be able to continue their schoolwork and their learning at home. But those devices had software installed on them that allowed the school or the teacher to see everything that the student was doing on the computer. So can you talk to us a little bit about that, what it actually looks like and what it means for the students to have these devices that can see everything they're doing?

CG: There's a wide range of technologies that fall into that category. And I should note, too, that the last couple of years have also shown us an uptick in the use of this kind of monitoring in the workplace as well. I don't think that's an accident; they're parallel for a lot of reasons. But these systems possess a wide range of capabilities: seeing everything that a student is looking at while they're browsing, the ability to turn a student's camera on or off, the ability to shut down a browser if a student is looking at things that the instructor doesn't think they should be looking at, the ability to monitor the student's social media and messages and who they're communicating with. And along with that, often these systems are tied either to third parties, poorly trained individuals who are monitoring children and are in no way trained or expert at this, or to machine learning systems that attempt to look at the ways that teens talk and communicate and figure out whether or not those patterns or those communications pose a particular danger. And I think audiences who have listened to your podcast will be well aware of some of the difficulties of machine learning trying to predict things.

I think there's an added layer of difficulty — I would say a degree of impossibility — when we're talking about young people who are constantly shifting the ways that they speak and talk and communicate, partially because they want to do those things outside of the watchful eye of adults, but also, frankly, outside of the watchful eye of machine learning systems and platforms. And so there's great potential for false positives. But also, we've got to think a little bit about what it means to be young, what kind of experiences you're going through, what kind of growth you're going through, what kind of experimenting you might be doing, and think about how that doesn't match up with how certain parties do or do not think about safety. And so I'll be very explicit when I say this: young people are in the process of figuring out their sexuality, their gender; they are often looking for information on sexual health; they often might confide in people about those things; they might confide in their peers about those things. There are large pockets of the population who don't want students to have access to this information, and who think it is a danger to them to even express those ideas. I am not part of that segment of the population.

I think that a necessary and important, and I would argue essential, element of being young is being able to get information about those things, being able to experiment, to wonder, to confide. And these systems potentially rob students of all of those things. Todd Feathers did an interesting piece in The Markup about a month ago. He reached out to some of these companies and asked them whether their heat lists, or lists of blacklisted terms and things like that, included things about, say, sexual health. Most of the companies denied it. I won't mention the companies' names, because I'm not absolutely convinced that they're telling the truth. And as we've seen with the overturning of Roe, what a particular company is doing at this moment doesn't necessarily speak to what they will do when the laws change. And so I think it's very likely that one of the things these systems will do is start looking at those terms too, and notifying parents, notifying teachers, notifying police when students exchange certain terms. I think in a lot of cases that can be very dangerous.

PM: I think it's a really important point that you've outlined. And just to note for international listeners who might not be as familiar: when you're referring to Roe, that's the overturning of abortion rights at the federal level in the United States, which likely means, and has already started to mean, that access to abortion, the right to abortion, has been rolled back and criminalized in some US states. And so that could very clearly impact some of these students who might be looking for abortion information on these devices that would be surveilled in this way by this software. You also mentioned, I believe it was in your piece, or maybe it was a different piece that I read on Wired, that there was a student who was outed to their parents as a result of this software. In some cases that might be perfectly fine, the parents might be cool with it, but in other cases that could be a real problem.

CG: It's not an uncommon occurrence. This is another thing that came up repeatedly in that study I referenced from the Center for Democracy and Technology. I don't have the stat in front of me, but these surveillance systems that are placed on student devices have a very high potential to out students. There's a very high potential for involuntary disclosure of people's gender and sexuality in ways that students did not consent to.

PM: One of the other things that stood out was that in many cases, these notifications are not even sent to the parents, but are actually sent directly to the police and reported to the police. And that can obviously lead to some very concerning, very harmful interactions once the police get involved.

CG: Absolutely. And we know there have been multiple studies that talk about the ways there's a higher danger to certain populations when police or school resource officers are involved. By a large margin, Black girls are over-policed in schools; it falls on Black girls most severely and most often, but this also applies to Black boys, and to gay, queer and trans teens and young people. And so, again, it's really important to note that not only do these systems not do the things that they claim to do, but they also pose a significant risk and create significantly more risk for students who are already marginalized in some sense. The other thing I would add, and this is getting away from students for a second, is that I don't want to understate the connection between this and employee monitoring. I bring up this example quite often, but one of the biggest companies that sells this kind of technology bragged in its promotional materials that it could anticipate, predict and prevent organizing within the population of teachers. Now, they later deleted this, so there's a reason I don't say which company it is. But you can put it in the show notes; it's on record.

PM: I can certainly add that link.

CG: They openly articulated that one of the things their product could be used for is to prevent union activity amongst teachers. And so when I talk about surveillance systems, I try to always point out that while it's true these systems are often going to be used to control and surveil particular populations, and that the harms are going to fall the earliest and the most often on the most marginalized; this is true. But the other thing that's true, and that I always encourage people to think about, is the way in which these systems will eventually be turned on everyone. And so we can think about the ways that this is going to harm students, and it is currently harming students. But I also encourage instructors and professors and administrators to think about the ways that these systems will also be turned on them as well.

PM: It goes back to your writing with David Golumbia on luxury surveillance, and who is actually subjected to the surveillance and who thinks they're not, until it actually turns on them as well, which it very often does. And I'm happy you brought up the point about the teachers as well. It's shocking to hear that a company would — maybe it's actually not so shocking — come out and actively say they want to stop you from organizing and offer you a tool to do that. But especially when you think about the context of teachers in the United States, where I would say in many cases they're already treated quite poorly, especially in a lot of public school systems: very underpaid, in many cases not unionized, with a lot of increasingly draconian restrictions placed on what they can say to students or talk to students about without possibly being subjected to some kind of punishment or even prosecution. I've been seeing some reporting lately that, in some states, they're having trouble even finding teachers now. And that just seems not surprising, seeing how things are advancing.

CG: And I think it's always relevant and important to think about surveillance systems as an attempt to exert control. As a parent, but also just in general, I think it's wrongheaded to look at young people and think that your goal should be to control them. Not even speaking as a parent, but just having been a young person at one point myself, I think that the idea that you're going to control them, or that you should control them, as a means of keeping them safe or making sure they develop well and properly, is the wrong way to go about it. But there are all sorts of things that come with that, too. Because these are systems of control, it's not going to just stop with students. These mechanisms are also going to be used in an attempt to crack down on the workforce and control them, even outside of things like curriculum, to control their ability to have an influence on their workplace and things like that. So, it's pretty dismal.

PM: Absolutely. It also brings to mind what David Noble wrote about technology, and how the development of technology was shaped by both the need to make profits for companies and the need to increase control, in particular control of the workforce. And so when you can see the ways that these technologies are moving, it's almost natural that this is ultimately how these things progress, because that is built into the incentives behind the development of this tech.

CG: Absolutely. I think this extends to a thing we haven't talked about, which is remote proctoring systems. Whenever we talk about school surveillance, these methods and technologies need to be included in that discussion, because they do that thing, too: they're also an attempt to exert control. And as I try to note whenever I talk about them, the end goal is not just that they'll only be trained on students — and by trained on, what I mean is focused on — but that they will end up being focused on instructors, teachers, professors as well, in an attempt to control them. People may be familiar with a viral example from sometime in the pandemic, where a school put out a contract that said you had to prove you were not taking care of your child while you were also teaching. Now, I'm forgetting some of the specifics, but essentially the school was saying that you couldn't both be teaching online and have your child in the room. And so we've seen all kinds of instances of this. And I think people tend to think of them as isolated incidents rather than the canary in the coal mine, so to speak. When we see these things, we need to recognize that, again, they may start with a particular population, one that's more vulnerable, that has less ability to object, but they're eventually coming for almost everyone. And I so wish that more people would recognize that.

PM: Hearing you describe that also forces me to think back to the promises that were made about these technologies back in the day. They were going to be about personal liberation, about freeing the individual and all this kind of stuff. It was all going to be fantastic; we were going to have this great utopia offered by these digital technologies and the internet. And then to increasingly see how these things are actually implemented in our lives, the great divide between that future that was sold to us in order to get us to buy into this privatized vision of the Internet and digital technology, and the way our society is increasingly shaped by this surveillance, by these monopolistic forces that have taken advantage of it, leaves me quite angry and frustrated.

CG: One of the CEOs of one of the proctoring companies, when he was quoted in a piece, said: We're the cops; we're the police. And that's a perfect distillation of how they think about their product, and of its carceral nature. And so it's disappointing to me as well. But, as you know, you've studied the history of this stuff very well, and so, right, we saw this coming. It's just been very hard to be the person, or part of the group of people, who says to people: Hey, this is what's going to happen! Hey, this is what's going to happen! Hey, this is gonna happen! And then it happens. And I don't want to be reduced to an 'I told you so.' But much of this is playing out exactly in the ways that people had predicted and have been predicting for 50 years.

PM: Absolutely. And it brings to mind Dan Greene's work as well. He was on the show last year talking about his book, and about how, at the moment the Internet was rolling out in the United States, there was also this gutting of welfare programs and this expansion of the carceral state alongside it. And I guess you can see that built into what we're talking about now, these kinds of trajectories just continuing, and we still see it today. I think there's a final point on what we're talking about with these technologies that I want to mention before we start to close off our conversation. And once again, it's really the class element that is built into this, as we've already kind of talked about. But just to really cement the point for the listeners: there are certain students who are very reliant on these devices, and even their families are reliant on these devices that the schools are giving them, and everything they do is surveilled. Even if they plug their phones into the computers, what happens on their phones can also be picked up by these systems and seen by the schools. And then on the other hand, you have wealthier students; we've already said how private schools might not be subject to this to the same degree, but if you're more well-off, you're likely not using the school-provided devices once you leave the school, and so you're also not being subjected to this surveillance in the same way. And so it very much has a differential impact. Some people are being impacted much more than others because of their class position.

CG: Absolutely. There was a story that ran in The Guardian about three years ago about a group of parents in, I think, the Virginia and DC area, who decided that they were going to petition the school and the tech companies to delete, at the end of the year, all the data they had on their children. It was the DC and Virginia area, so you can imagine lots of federal employees, probably people from federal law enforcement, judges, politicians, things like that. And they were able to go to the school and say: Do this, and go to the tech companies and say: Do this. And the school and the tech companies did it. They deleted the information that they had on these children. Now, think about having the juice to do that. But also, think about the difference between that and what you were just referring to: the students who are having massive amounts of their data extracted, used for who knows what, maintained in some cloud in perpetuity. There is such a chasm between those two groups of people and the ways that these products and processes are deployed against them.

PM: I appreciate you outlining so many different aspects of this for the listeners, to really show how harmful it is to have these systems continue to roll out. As you said, these are not just ideas; they are being installed in many schools in the United States, and I'm sure other parts of the world are starting to grab onto them as well. The United States will be the testing place, and then others will follow suit. What do you think might be the next stage of this? Do you see this further entrenching? And what tech might they try to put into schools next, do you think?

CG: So unfortunately, I see that in some ways it is going to get worse before it gets better. And I would love to be wrong about this; I hope I'm wrong about this. One of the things we'll see is that as more and more people return to school in person, and as we move to our next stage of dealing with the pandemic, which unfortunately often means ignoring it, a lot of the tech that was ramped up and purchased during the pandemic is now going to be used for its more surveillance purposes. And when I say that, I don't mean disease surveillance. An example is that, for a time, people thought it was essential to, say, take people's temperature before they entered a building, and there are lots of companies that are selling these kiosks and things like that to schools. They consistently try to upsell schools with not only temperature detection but facial recognition. And so now that these devices are in schools, they can say: Oh, well, you don't have to put this thing in a closet; you spent all this money on it, and so now you can continue to use it, because it has facial recognition. So we'll see a lot of that.

We'll see any technology that was deployed particularly in an effort to stem the pandemic used for its other surveillance purposes. I think, as long as there is this persistent myth about the abilities of artificial intelligence and machine learning to predict things, as long as people are able to keep perpetuating that myth, we'll see these systems move further and further down the line of prediction technologies that claim they can identify a potential threat before that threat manifests. I think we'll see a lot more of that. The other thing is, I think we'll see a lot more petitioning and approaching of social media companies for the messages and communications of students, whether that be geofence warrants and things like that, or, as we've seen really recently, requests or warrants to Facebook for messages that people have exchanged. I'm unfortunately often in the position of telling people why something's bad, or how it's going to get worse. And again, I'd love to be wrong. But I think we're going to see more of that in the near future.

PM: Sadly, I think you're right. And I remember talk of how these technologies that were adopted to deal with COVID could later be turned to uses that were not their original purpose, but that allow them to be entrenched and continue to be used because the investment was already made. And unfortunately, this is one example of that. And to close our conversation, maybe on a more hopeful note, if there is a more hopeful note to be found: do you see any kind of positive steps toward opposing these technologies? Do you see good ways to try to stop this development? Or is it really that, until there's the willingness to have a political solution, we're constantly going to be served these techno-fantasies of surveillance that are just going to keep making things worse and worse and worse?

CG: Well, I don't think any of this is inevitable. And one of the things I think is really important: I was approached by a group of students who are working on this stuff. And I think young folks — in direct opposition to prevailing myths about whether or not they care about privacy and how they use technology — are keenly invested in and aware of how these systems work, how they don't do the things that they're supposed to do, how they make them less safe, and many of them are advocating and organizing against them. I think that's really important, because it speaks to the ways in which they're resisting and speaking out against things that are supposedly done in their name. And so that is a place where I have tremendous hope: the ways in which young people are saying: No, this isn't for me, and it doesn't work for me, and it doesn't work anyway. It's unfortunate, too. I hate to pin hopes on young folks, because that sort of inverts some of the — I mean, they have agency, which is a thing I'm glad they have — but it's unfortunate to pin hopes on them to create a better and more workable society. That should not be on them. But unfortunately, in some ways it currently is. And that is a place where I have seen a lot of action.

One other thing I want to go back to, a thing we didn't mention that I'm sure you've seen and I wanted to make sure we got to, is these systems like the Pasco County example, for instance, where they did their own sort of version of predictive policing, but targeting children, looking at things like attendance, and grades, and whether or not there had been violence in the home, and using that to predict, in their words, which young folks were likely to become criminals. This is their language, not mine. And what they did with that information was to then harass those families in the hopes that they would move out of the area. The Tampa Bay Times did a Pulitzer-winning series on this: the Pasco County Sheriff's Department got federal money to develop its own predictive policing algorithm, aimed it at students, and used it to harass those young folks and their families in hopes of driving them out of the area. And I think it's important, again, when I mention these things, that they're not one-offs. So often we see the worst episodes, the worst instances, and people are able to dismiss them as one-offs, whether it's a remote proctoring system that can't see students and therefore shines bright lights in their faces, students who fail the bar because of the wonky technology, or students who have some kind of forced disclosure about disability or gender. These are not one-offs. These are the ones that we hear about, but the nature of these systems is that there are many, many, many more that we're not hearing about. And so, when we hear about these, it's important not to think about them as isolated incidents, but as part of a pattern that, as these systems are more and more deployed against students, will happen increasingly.

PM: I think it’s a really important point for you to make. And even though there’s that hopefulness to see the students pushing back against these things, it really shouldn’t be on them to have to lead this charge and lead this fight in order to push back against technologies and systems that are just inherently oppressive and harmful, and that shouldn’t exist in our society anyway. And so Chris, I always love to have you on the show. I’m so happy that you came back on to discuss this with us. Thanks so much.

CG: Oh, it is again absolutely my pleasure. Thank you.
