Lately

Yes, your boss is tracking you

Episode Summary

Lately, our bosses are going further than reading our emails. New technologies that can track our motions and our moods are ushering in a new age of workplace surveillance. Is this productivity hacking, or counterproductive micromanagement?

Episode Notes

Lately, our bosses are going further than reading our emails. New technologies that can track our motions and our moods are ushering in a new age of workplace surveillance. Is this productivity hacking, or counterproductive micromanagement?

Our guest, David Murakami Wood, is the Canada Research Chair in Critical Surveillance and Security Studies and a professor at the University of Ottawa. He joins the show to walk us through recent mind-blowing advances in employee tracking technology and whether all this surveillance actually makes workplaces more efficient. He also explains why he didn’t get a cell phone until two years ago.

Also, Vass and Katrina undergo theoretical brain surgery.

Subscribe to the Lately newsletter, where the Globe’s online culture reporter Samantha Edwards unpacks more of the latest in business and technology.

Find the transcript of today’s episode here.

We’d love to hear from you. Send your comments, questions or ideas to lately@globeandmail.com.

Episode Transcription

Vass Bednar [00:00:00] I'm Vass Bednar and I host this Globe and Mail podcast, Lately.

Katrina Onstad [00:00:03] And I'm Katrina Onstad. I'm the show's executive producer.

Vass Bednar [00:00:07] Katrina, have you been watching season two of Severance?

Katrina Onstad [00:00:11] No, I saw season one. But I don't have my apps in order, and I've not been able to access a second season, so I want to.

Vass Bednar [00:00:17] Okay. I'm glad you're generally familiar with the show because our introduction is predicated on it.

Vass Bednar [00:00:22] It's so great. I think even if people haven't seen the show, they've sort of caught on to the gist by now. Right. So the concept: Lumon Industries, an employer where employees have gone through a somewhat mysterious procedure that separates their brains, their selves, right, into their innies, who are their more formal work selves, and then their outies, their home selves. And since you are familiar with the show, I was wondering, if H.R. offered this service, would you go for it?

Katrina Onstad [00:00:54] Yeah, I think maybe. I mean, I've definitely had jobs where that sounds very appealing. I like the idea of hitting a switch and never thinking about work after 6 p.m. I realize this is not the message of that show, by the way, but it's the end of the struggle for work-life balance. And so, would you? You wouldn't do it. You never would. I know you wouldn't.

Vass Bednar [00:01:13] No, I wouldn --

Katrina Onstad [00:01:15] You love work!

Vass Bednar [00:01:15] That's a separate episode on my relationship with typing. But I could maybe activate, like, my weekend self or something like that, but it doesn't really appeal to me. I find it hard to separate my work self from my other self, whoever she is. But that's between me and my A.I. chatbot. Handing over your brain to your workplace is actually not just science fiction for streamers, right? Brain monitoring on the job, hate to break it to you, is here. We're not quite in the space of implants and surgeries, but there are wearable neurotechnologies, like this smart cap that monitors employee fatigue. I worry that I would really light that thing up.

Katrina Onstad [00:02:00] Yeah.

Vass Bednar [00:02:02] Another example: Apple is working on AirPods that can monitor your brainwave activity.

Katrina Onstad [00:02:08] My gosh. Wow. Yeah.

Vass Bednar [00:02:10] So maybe eventually your boss will know how fast you're picking up a new skill, or how excited or not excited you are about that new project. And I don't want to seem like a conspiracy theorist, but I'm telling you, all of this leads straight to Elon Musk's Neuralink.

Katrina Onstad [00:02:27] Yeah. Yeah. Okay, so explain this connection, because I keep hearing this too.

Vass Bednar [00:02:30] So, okay. Neuralink is that BCI, brain-computer interface, implant that he's been developing, right? It's like a coin-sized brain chip that's surgically implanted under your skull, and it can monitor and stimulate brain activity using electrical current. So the promise, and kind of the best possible application, is that it can read signals from a paralyzed patient's brain and then transmit that data to, say, a phone or a computer. That means the patient can control devices with just their thoughts. Right? No swiping, no typing. And they can, you know, do things they weren't able to do before.

Katrina Onstad [00:03:09] There's obviously extraordinary medical possibilities here. And it was recently greenlit by Health Canada for a clinical trial here, focusing on helping people with quadriplegia. Great. But there's also some speculation about whether or not this could end up in the workplace. Right. Imagine this: Elon Musk in our brains. And maybe that's just the inevitable leveling up, because, less dramatically, we are already being monitored. Right? Our emails, websites, chats, phone calls. You can pretty much assume that everything you do or write or say on company time and across their devices is fair game, especially every Slack you compose. No? Yes. Ours are often about your little obsession with CleanTok videos.

Vass Bednar [00:03:48] That's very healthy.

Katrina Onstad [00:03:50] Every one of those is delightfully being reviewed, and probably, you know, they can and they might read it. And A.I. is just going to allow employers to get through a greater volume of employee output. And we know this, we don't like it, but for the most part we seem to kind of passively accept it.

Vass Bednar [00:04:05] I think sometimes people don't fully appreciate kind of what's going on or what's possible on that back end. But the neurotechnology market is here and it's growing. We are using the definition that says neurotech is any technology that interfaces with the nervous system, and the market is estimated to grow to $21 billion by 2026. And these technologies are expected to completely change workplaces, right? Now, management has always argued that surveillance tech holds this promise of improving productivity. But what about the dystopian invasion-of-privacy part?

Katrina Onstad [00:04:42] Yeah. Yeah, exactly. Where is the Privacy Commissioner on all of this?

Vass Bednar [00:04:46] You know, the bar is pretty low. They don't really have our backs quite yet. It turns out that Ontario is the only province with a mandate: if a company of more than 25 employees, so already a pretty decent size, is using a surveillance tool, all it needs to do is disclose that to workers.

Katrina Onstad [00:05:04] Right. So we might not know, or it might have been in small print in our contracts, exactly what kind of surveillance is at work, right? Yeah. So it seems like there isn't much to protect employees from this, quote unquote, "bossware."

Vass Bednar [00:05:16] You know, "bossware" makes it sound so much sassier than it is. To make sense of the promise and peril of this increasingly sophisticated surveillance, we reached out to David Murakami Wood. He's the Canada Research Chair in Critical Surveillance and Security Studies and a professor at the University of Ottawa. I was pretty productive during our interview.

Katrina Onstad [00:05:37] I know you were. Because I had the implant telling me.

Vass Bednar [00:05:42] Well, Big Brother called and he wants his spyware back. This is lately.

Vass Bednar [00:06:03] David, why did you wait until just two years ago to get a mobile phone?

David Murakami Wood [00:06:08] How did you know that?

Vass Bednar [00:06:10] I've been surveilling you. 

David Murakami Wood [00:06:11] Yeah. I didn't have a mobile phone for most of my life, and it was partly because of surveillance and partly because I was a surveillance researcher, and I was well aware of the, you know, the potential hazards involved in this. I mean, for most of the 90s, before I was a researcher, I was an activist for environment, peace and social justice. So I was well aware that not only could I be watched and could I be monitored, but I had been in the past. I was part of a group that won damages against the British government for illegal surveillance of their activities. So, you know, I was well aware of those things, and I've had to give in in the last couple of years for personal reasons. And I now have a mobile phone that I use basically as a phone. I mean, I don't use any of the kind of things on my phone that many other people would use, and I have the most basic model of phone that you can still get and still link up to any networks. But yeah, I mean, this shows the problem: if you want to reject it, if you want to stay outside, it's now almost impossible to do so with the realities of life, across the whole range of our living spectrum, whether it's personal to workplace. You know, it is now almost impossible to stay outside of surveillance and say, I reject surveillance and I'm going to go somewhere else. So I had to give in.

Vass Bednar [00:07:32] How common is surveillance at work?

David Murakami Wood [00:07:36] Well, it's probably used by almost all workplaces in some way. One of the things is that we don't often recognize the existing forms of workplace surveillance that are already there, predating the digital. So we have a massive surge of digital techniques, new techniques of surveillance, and people are very worried about them, and they should be. But let's not forget that, you know, almost every aspect of the workplace as an institution is set up to be a surveillance entity. Right? This is exactly what capital has always wanted, and managers have always wanted: to have complete control over people to ensure maximum productivity. Right. So the first thing is we shouldn't forget that all workplaces are fundamentally controlled by surveillance, and that's the way it's always been.

Vass Bednar [00:08:23] You mentioned some of the newer technologies in this space as well. I noticed that CNBC reported some huge employers, like Walmart, T-Mobile, Starbucks, are actually monitoring their staff's Slack messages. Right? So not necessarily reading them in real time, but using computer software, using algorithms, to monitor them. And they're monitoring them, and this is a quote, to track employee "sentiment and toxicity." That's at least what the creator of the program said about it. What are corporations trying to do with this information?

David Murakami Wood [00:08:57] Well, I think there's two fundamental things they're doing. One is that many of these corporations, especially in the US context, but also in Canada, are fundamentally interested in either preventing or limiting the power of unions and unionization. So they want to stop workers organizing together, but also to stop any kind of criticism of the company or workplace conditions. And this, of course, has been only accelerated by the kind of platform tech boom, because many of these companies, especially the CEOs, have shown themselves to be utterly opposed to unions as an idea. Elon Musk is one example of this. He's, you know, completely and utterly opposed to the idea of unions in any shape or form. And many other tech CEOs have similar views to that. So that's the first thing: it's a union-busting move. The other one is that many of these companies are, you know, interested in trying to predict essentially how employees will react to certain things. This includes workplace transformations, changes to working practices. And they can see, they argue, in advance how people are going to react to these things, whether there's going to be discontent, where there's going to be trouble. And even on a personal level, how individual workers, you know, are seen to be overly negative, for example, or, you know, making statements that might be seen as, even if not directly critical, just damaging to some kind of idea of workplace morale or productivity. And of course, this is incredibly subjective and incredibly skewed towards the side of what management thinks is an appropriate or otherwise attitude in the workplace.

Vass Bednar [00:10:32] The other thing that this kind of technology can do is attempt to make inferences about your mood based on the tone of your written messages, right? So now we know, even if we don't really like it, that, you know, our bosses could maybe read an email or a Slack message that we send through a company system. But now it seems like it's not just what you say, but how you say it, both in written form and aloud, right, through voice. Why are we treating this layer like it matters?

David Murakami Wood [00:11:07] I don't think we should, and I don't think we are when we talk about ourselves. I think this is another way in which bosses think they can access some kind of hidden self that we can keep away from their conventional forms of surveillance. So they think, if we can elude all these more superficial forms of surveillance, somehow with these newer forms they can get below the surface, they can start to dig in and find out what the real truth is. I mean, we saw this in policing decades ago. In the 90s and 2000s, we started to see the rise of new forms of biometric monitoring connected to CCTV. So facial recognition has been the next big thing for the last 25 years, right? This is not new. I wrote an article about facial recognition in 2003, I think, about the dangers of facial recognition. It wasn't new then. I wasn't the first person to be writing about this. And yet we're still getting the same thing, saying that facial recognition will solve these problems, that all these new systems attached to it will solve all our problems. At that time, the FBI in the USA were pioneering what they called micro-expression recognition. What they were saying is that there are tiny sort of muscular tells that would be available by close analysis of the face that, again, could tell whether somebody was about to do something, whether somebody was, you know, basically doing something against what they were saying or what they were expressing in other ways. And they argued this was like something you couldn't spoof. These were like subdermal cues that were moving tiny groups of muscles around the eyes and things. Most of this has been almost completely discredited.
You know, in fact, micro-expression recognition and micro-expression analysis is regarded with the same kind of, you know, skepticism as lie detection in many ways now, which is an equally supposedly scientific method that was shown to be completely bogus, in the sense that it just measured stress. Right. And people can be stressed for all kinds of reasons. So I think the key thing with a lot of this stuff is, just because it's being automated, just because we're having a layer of A.I. added to it, the magic fairy dust of A.I. being sprinkled on it, it doesn't mean that suddenly the underlying science is sound. It's not. In fact, it hides the fact that the underlying science is bogus to begin with. You know, you don't get to look inside any more; the classic black-boxing of A.I. creates another black box around the existing black box, you know?

Vass Bednar [00:13:29] So why spy on your workers in that way? Like, why try to be listening in to all sorts of things that people are saying at the workplace?

David Murakami Wood [00:13:40] Paranoia, fear. I mean, fundamentally, managers are afraid that they will lose control. You know, this is one of the fundamental dynamics of power relations in any situation. Lewis Mumford, the great scholar of urbanism, famously said, you know, that cities were an expression of the paranoia of kingship. The creation of cities was designed to concentrate people in one space, and that immediately creates a kind of paranoia that your subjects will rebel, will overthrow you. You know, they can talk, they can discuss, they can organize amongst themselves. And workplaces, you know, are the same kind of entities in that sense. They are spaces in which there is utter control, but there is also incredible paranoia amongst those in control that they will lose control. And, you know, it's not that these people are like dictators; in many cases, we're talking about middle management. We're talking about people whose own jobs depend on them having a sense of control and, in fact, demonstrating a sense of control, demonstrating increased productivity, demonstrating these things. The managers themselves are also subject to metrics and data fixation, and so they pass that on down. You know, it goes down the chain.

Vass Bednar [00:14:49] Maybe we could talk about the evolution of surveillance at the workplace a little bit more. I know you were telling me it's kind of always been part of the fabric of work, but it's becoming more digitized now. Because employers have sort of always watched workers, should we treat this as something that's new?

David Murakami Wood [00:15:08] I don't think we should ever treat it as something that's normal. It's not new, but it shouldn't be regarded as normal. You know, pre-industrial conditions meant that workers were often very much more independent. In many cases, especially, you know, people who worked in the weaving industry, for example, back in the 18th century, were relatively independent. They could control their own production. And these are the people who became, you know, popularly known as Luddites, who resisted being forced to move into factories, resisted the centralization of their work. And this wasn't about them being anti-technology. It was precisely about them losing their independence, losing their ability to control their own work. And this is exactly what surveillance is designed to enforce: that sense that people are not in control of their own work, that they are subject to management dictates, they are subject to orders, and they have to buckle under and obey orders. And, you know, you saw with the pandemic just how much this was true. When people were forced to move to remote work, you saw a massive increase in digital surveillance: the imposition of spyware on workers' computers, you know, active monitoring, all these kinds of different systems. A lot of them came in during the pandemic on the grounds that basically managers were no longer in the same space and therefore were not able to sort of more physically control workers, so they had to turn to these new digital technologies. Fundamentally, that really demonstrated that same kind of idea, that this was always about the sense of control.

Vass Bednar [00:16:36] You mentioned spyware and that monitoring kind of moving. And it does sort of seem like what we're talking about now is a migration of practices, right? Sort of something that we became used to as consumers being tracked online, being watched kind of passively. And that's moving into the workplace, right? Now, employers are sort of collecting data from us or from workers in the same way that a company like Amazon collects information about shoppers. Does that spill over sort of sound right to you?

David Murakami Wood [00:17:06] Absolutely. And it works both ways. And what I mean by that is that not only are the same kinds of practices that were developed for market research and for surveillance of consumers being deployed on workers, but also the kind of information that bosses are collecting about workers is exactly the kind of information that used to be considered private and in the realm of the domestic or the personal. So, you know, when we were talking earlier about things like sentiment analysis, this is something that, you know, would in the past, and probably, I hope, still would, have been regarded as private information, as personal information that's outside the realm of the workplace and of what bosses are legitimately allowed to know. But we've seen an increasing move of companies into the private sphere of workers. And I think this started probably with drug testing in the workplace. So this is something that US companies frequently deploy. And it started obviously in industries where there was, you know, genuine danger of people being impaired. So, for example, if you're operating a forklift truck in a workplace, or, you know, anything you could drive, or heavy machinery, there's a real concern about impairment, right? Whatever that is, from alcohol or drugs. But US companies have seen an expansion of that into many other industries, and especially hair testing of workers is something that expanded in the 90s and 2000s. And there's been several court cases about the viability of hair testing, not least because hair testing is, first of all, incredibly unreliable. But also, in the case of some particular narcotics and substances, like, for example, cannabis, it stays for a very long time in hair and actually can also remain in hair from other passive sources. So it's an incredibly unreliable method.
And this is the problem with many of the newer methods that get used: the reliability of many of these methods is not generally assessed in practice before they're implemented. We have laboratory tests, we have basically unrealistic conditions under which these things have been tested, in many cases even just pure simulations in the case of digital surveillance. But the actual testing happens in the workplace, right? The workplace it was introduced into is the test.

Vass Bednar [00:19:15] So then, as subjects here, why do you think we tend to sort of accept, even begrudgingly, that companies are grabbing onto our data when we shop or browse online, but then we tend to recoil at that same practice when it's applied in the workplace?

David Murakami Wood [00:19:31] Because we have no choice. I mean, fundamentally, even with online shopping, we know we have other options and in some ways it's treated as sort of frivolous as a consumer activity, something that's not serious. But, you know, the whole point about the workplace relationship is that that's your life. You know, if you don't have a job, you don't get paid. That's everything.

Vass Bednar [00:19:49] One of the ways I've seen the implementation of this technology dressed up is in the name of efficiency. Do you think that surveillance is actually efficient? Do these technologies make people more productive, keep them more engaged with their tasks at work?

David Murakami Wood [00:20:06] Well, the evidence seems to suggest, and there's been a lot of research about this, that in fact that is not the case. You know, it would seem in commonsense terms to be obvious that more control equals more productivity. You know, you remember the classic movie Modern Times, with Charlie Chaplin working on the production line. And the idea is that, you know, the supervisor stands behind them and makes sure they continue to work in a very productive and efficient way. But even, of course, in that early example, way back in the 1920s, 30s, everything starts to go wrong. Things don't end up being more efficient. People object, people fail, systems fail. And if there's one rule about any of these things, it's that almost all the expectations about efficiency are not borne out by the research. And in fact, it's actually worse than that. In many cases, these things are actively counterproductive. They make workers feel paranoid. They make workers feel like they're under watch and therefore react against it in various ways. Workers find new and ingenious ways to spoof systems. They find ways to drag their heels, to sabotage, to kick back. And there's been some excellent research on this kind of thing, you know, going right back to James Scott's fantastic work on everyday forms of peasant resistance, in which he showed how peasants in Malaysia under colonialism didn't fight back by forming, you know, kind of active resistance. They did it by passive forms, by just making things slower, by dragging their heels, by little acts of petty sabotage. And this happens everywhere. The more surveillance you put on people, the more people react against it and find ways to counteract it and to resist it.

Vass Bednar [00:21:45] So where do you see that current resistance taking place? Either, you know, place-based, through algorithms, through computer systems. Where is it happening?

David Murakami Wood [00:21:54] Well, funnily enough, I mean, you mentioned Slack earlier on. One of the ways, and one of the reasons why the bosses want to monitor Slack and Discord and other kinds of, you know, co-working tools, is precisely because those were the places, or have been the places, where sort of shared stories and resistance and tips, you know, about how to kind of avoid surveillance, how to do things in ways that seem to be conforming but are not, are actually shared. One of the classic ones, of course, is when keystroke monitoring was introduced, especially in back offices. Right. So we're talking about, you know, things like customer support centers and things like this. Keystroke monitoring was used to check that workers were actually using the keyboard a certain amount. The early versions weren't very sophisticated, and basically you could spoof them by just randomly hitting keys while you were doing something else. So you could be chatting with your coworker, drinking a cup of coffee, and then just hitting keys with your fingers like the cat in the video, you know. And that worked quite well until bosses, of course, worked out this was the case. And then you have sort of more active forms. So then you introduce secondary forms of surveillance to check that the primary form of surveillance is being conformed to. So then you have keystroke monitoring plus maybe, you know, eye-recognition types of software that check that your eyes are looking in a certain direction. So then, of course, workers work out that you can also spoof that: you can look at the camera and still do this, and still be talking to someone else. So then you have to record conversations, and so on and so on. So I mean, this is one of the other sort of, if you like, laws of surveillance: every form of surveillance fails in ways that give rise to a new form of surveillance, to make sure that the first form either works or, you know, is supplemented.

Vass Bednar [00:23:42] We've been talking a little bit more about white collar jobs and the application of surveillance. We do know that surveillance has been more common, or maybe most common, in blue collar jobs. I'm thinking of Amazon workers and those infamous monitored bathroom breaks. Gig workers on apps like Uber Eats having their delivery speed and their routes kind of constantly tracked. Is this stuff true? Is it important? Why is it still happening?

David Murakami Wood [00:24:11] You know, I'm a relatively privileged, older, male, white researcher. You know, I'm exactly the kind of person who has nothing to hide and nothing to fear. Right. Because most of the surveillance capacities are not targeted at people like me. But I mean, you know, fundamentally, bosses don't trust workers. They don't think workers are going to voluntarily give their all to the company. And they're quite right, they won't. So, you know, they actually have to introduce measures by which they think workers will conform involuntarily. And Amazon is one of the best examples of this. It's not just the things they have introduced; they hold all kinds of patents for different technologies that they could introduce, some of which are far more dystopian than what we've seen them actually using in practice. You know, they also had that famous incident where they tried to ameliorate this by introducing a kind of confessional box, where you could actually enter the box and kind of, like, escape. They had an escape box, essentially, that they introduced into some of the facilities, where you could actually have temporary respite from surveillance and observation, basically. Some people described it as like a screaming box.

Vass Bednar [00:25:15] Oh, my God, go in

David Murakami Wood [00:25:16] there and scream and escape for a second. But, you know, that just shows to some extent that even they recognized the extent to which they've gone, how far they've gone, in workplace surveillance. They even felt they needed to provide workers with, like, a tiny space where they could escape from this surveillance. It's, you know, it's insane.

Vass Bednar [00:25:34] You've said that we think we've done pretty well, but I think on the productivity measures we haven't, and that, you know, our need for more productivity could be the Trojan horse for more of these technologies. At the same time, I have to wonder if maybe we're overthinking it, right? Like, if you're on company time and if you're doing nothing wrong, you have nothing to hide. Does that sort of implied logic hold for you?

David Murakami Wood [00:26:01] I don't think so. And I think, you know, that doesn't hold in any circumstance. I think that's a recipe for mistrust, and it's the slippery slope. And once you've gone down there, you end up with a surveillance society in which, you know, surveillance is always the answer because you never trust anybody. Right. But I think the fundamental question is a key one: the idea that productivity in that sense is, first of all, the measure we should judge these things by, but also that productivity isn't necessarily increased by repressive measures. Right. So we know from research that the best thing for increasing productivity is reducing the five-day week to a four-day week. We know this, right? We know that in Scandinavian countries, or in France, where they tried these measures, productivity does not decrease. In fact, people are happier, people report higher levels of satisfaction, and they actually work harder in the times that they are at work. And this actually more than compensates for the fact that there's a loss of a day. All of those countries also have better welfare systems, they have better minimum wages, they have better support for workers in the workplace. So, you know, I think the answer to most of these things is not to go down the American neoliberal route of cutting support, cutting state benefits, cutting regulation, and then using, you know, basically force and surveillance to increase productivity. The answer is actually to go back to an idea that workers have a stake in society and in companies, and to give workers greater control. Pay workers more. Give workers more time off. In general, improve the lives of working people. And I think you'll find that then people actually are more inclined to work harder, you know, in a way that's more voluntary, and that's what you want in the end.

Vass Bednar [00:27:42] I don't know if you know this about me, but I am pretty into speculative fiction. And when I'm not reading books about this kind of stuff, I've been reading about facial recognition as part of performance reviews at work. Right. And this technology is sort of here; it's often referred to as emotion AI, right, a new technology. Can you talk a little bit more about this evolution? Because I want to know when my resting bitch face became a career liability.

David Murakami Wood [00:28:13] Well, exactly. And for those of us who don't communicate in a conventional way and have slightly non-neurotypical mannerisms, which I do. Basically, I've always been told that I should look at people when I talk to them. And, you know, there's this whole idea from American policing that if you don't look at the person you're talking to, you're lying or untrustworthy. It's one of their classic tells of a guilty party. And yet I spend all my time looking elsewhere. I find it very difficult to concentrate on looking at somebody directly. But you can probably tell, right, we're doing that right now. So even when I'm paying close attention to somebody, I'm listening to everything they're saying, I'm looking all over the place, and I'm watching squirrels outside as well, as there is one right now. It doesn't mean I'm not paying attention. But, you know, if you take the kind of classic policing sort of view, that makes me guilty, that makes me potentially a criminal. And so a lot of these technologies, it's not just about the technologies themselves. It's about the things on which they are based, the sort of pseudo-psychological research on which the assumptions that derive from this are based. A lot of this is based on really untested and assumed sort of forms of pseudoscience about what you can read from human faces. It's also based on a really normative idea of what human faces are and what they do, and the person behind those faces and how it's manifest in their faces. So people who are non-neurotypical in all kinds of ways, and, you know, we tend to assume that means autistic or on the autistic spectrum, but there's a whole variety of non-neurotypical types of person. You know, I live with mental illness. I have, you know, that in my life, and it means I'm not exactly the same. I don't react the same, I don't facially react the same as some other people.
You know, and these assumptions, these ideas about normativity, are built into facial recognition systems, which then claim that they can judge not only how people think now, but how they might think in the future. It should be treated with the same contempt as we now treat phrenology -- you know, the idea that bumps on your head determine your future. This should be treated with the same complete contempt as that.

Vass Bednar [00:30:21] I was reading about some surveillance tech at the workplace that's really safety-oriented, like a high-tech helmet worn by people working in a mine, or a wearable for firefighters that measured smoke exposure, and just sort of wondered, is there an upside here? Like, is surveillance at a workplace ever appropriate?

David Murakami Wood [00:30:45] Yeah, I think these kinds of environmental forms of surveillance fit into that category of, you know, caring forms of surveillance that actually are about worker safety. In the case of things like contaminant detection or gas detection in mines, you can't say this is a bad thing, right? If people don't die in mines, that's a good thing. It might be a form of surveillance, but not every form of surveillance is necessarily repressive or taking away workers' rights. This is fundamentally about the interaction between the environment and the worker, and it's protecting the worker from environmental hazards. That's a good thing. If it's used to reduce other forms of supervision and monitoring that would otherwise have protected the worker, then it becomes more problematic. So sometimes some of these technologies are used to actually reduce the number of people working in a particular environment. You have very high-tech-equipped workers who can do the job of, say, two or three other people that used to be in this industry, and so you get a reduction in worker numbers. So it's often used as a justification for rationalization -- you know, again, productivity. But of course, let's not forget the other side of the productivity debate is always getting rid of workers, right? If you can make people do more for less, you can reduce the number of workers you have. That's also a productivity gain as far as the company is concerned, and certainly a profit gain. So, yes, it's good, but you've got to look at what it's really being used for as part of a whole package.


Vass Bednar [00:32:15] Dr. Wood, thank you for answering our call and for this conversation today.


David Murakami Wood [00:32:19] It's been a real pleasure.


Vass Bednar [00:32:33] You've been listening to Lately, a Globe and Mail podcast. Our executive producer is Katrina Onstad. The show is produced by Jay Cockburn, and our sound designer is Cameron McIvor. I'm your host, Vass Bednar. In our show notes, you can subscribe to the Lately newsletter, where the Globe's online culture reporter Samantha Edwards unpacks more of the latest in business and technology. A new episode of Lately comes out every Friday, wherever you get your podcasts.