
Engage: a Genetec podcast - Episode 5 - "Eyes Wired Open"

From pervasive video cameras to facial recognition and the privacy issues surrounding personal information prompted by the pandemic, government and civil liberty organizations on both sides of the Atlantic are responding with very different strategies to balance the need for security with the privacy of citizens. In this episode of Engage, “Eyes Wired Open,” we speak with former UK Surveillance Camera Commissioner Tony Porter and with Cindy Cohn, executive director of the Electronic Frontier Foundation, to get their thoughts on the UK's and the US's video surveillance strategies.



Episode 5 - Eyes Wired Open - Transcript

David Chauvin [00:00:05] Welcome to Engage: a Genetec podcast.  

Tony Porter [00:00:28] This technology, I believe, is here and it's here to stay to support communities, not spy on them.  

David Chauvin [00:00:37] From pervasive video surveillance to facial recognition and temperature sensors prompted by the pandemic, government and civil liberty organizations on both sides of the pond are responding with very different strategies to balance the need for security and the privacy of their citizens. Outgoing UK Surveillance Camera Commissioner Tony Porter's take is rooted in a national strategy to set the guardrails for how surveillance should be governed. Meanwhile, in the US, digital rights groups like the Electronic Frontier Foundation look to set the same checks in court.

Cindy Cohn [00:01:15] So instead of actually trying to pass legislation that would create incentives for people to build secure systems, the legislation that they're passing with the cybersecurity label on, is just all about more surveillance.  

Kelly Lawetz [00:01:27] That's my guest, Cindy Cohn, executive director of the Electronic Frontier Foundation, a leading voice in the battle over digital rights and data protection. Among her many key efforts, Cohn was the lead attorney in the landmark case to protect computer code as free speech. In the second half of the show, I'll talk to Cohn about the challenges of ensuring clear, transparent, and responsible use of potentially invasive technologies.  

David Chauvin [00:01:53] But first, going boldly where no one has gone before. The UK was the world's first nation to establish a permanent video camera surveillance commission, and the first person to take that job is our first guest, Commissioner Tony Porter. I'm David Chauvin.  

Kelly Lawetz [00:02:10] And I'm Kelly Lawetz.  

David Chauvin [00:02:20] While the US has passed some of the world's strictest laws banning government procurement of Chinese state-owned surveillance equipment, the UK has taken a decidedly different route, focused on regulating use and establishing strict certification processes in an effort to build public trust. As Commissioner Porter now passes the torch of his unprecedented mandate, I asked him what he found to be the biggest challenges in achieving his priorities when he took the reins four years ago.

Tony Porter [00:02:52] There was no playbook as to how to do it. It took me a good six months, I think, of speaking to manufacturers, installers, police, local authorities, national crime agencies, all the people that use this surveillance, to get an understanding as to the direction of travel. Effectively, there was too little communication between the components. Manufacturers weren't talking to the police. They weren't saying, what are the regulations you need to comply with? How do we look towards supporting you? How do you ensure video format is accessible to your officers and then out into the courts? So it became pretty clear after a while that I wanted to bring all those components together, because it was too disjointed. And really, I wanted to drive standards up, to take them north. That was my first priority: to somehow bring them all together.

David Chauvin [00:03:49] And in hindsight, what would you say were some of the highlights of your work?  

Tony Porter [00:03:55] Well, it has been tremendously hard work. I'm not just saying that because it's me; the volume has been immense. What I'm most pleased about was, I suppose, targeting the problems identified in the first question. I pulled together senior leaders from 11 areas and developed a national surveillance camera strategy. That spanned police, local authorities, manufacturers, businesses that voluntarily adopt the code, training leads, and horizon-scanning scientists, and I plonked them in a room. And we've been plonked in that room now for three and a half years, and that work is underway. The strategy is on my website. There are strong leads that report into me, and I report into Parliament. And what it has done, without being too fancy, is allow all those component parts to talk. We've identified targets and worked through them together, and I believe we've helped to lead the industry. I introduced a certification scheme that spans the whole industry, so it started off with end-users, with ... But we then brought on installers, and we're bringing on consultants in January for their own certification scheme. And the idea is that a member of the public who's buying this kit can see the kitemark and go, well, actually, that's the government's gold standard. I feel reassured; I think I'll buy into that.

David Chauvin [00:05:30] Yeah, I read a lot about the certification. It was a huge undertaking. Another thing you were part of, a pretty ambitious mission, was the shift from older analog cameras to more sophisticated IP-based video. Considering the scale of the system, what were some of the largest challenges in trying to make that shift?

Tony Porter [00:05:49] Well, it was interesting. We were finding, certainly in the publicly sponsored space, people losing faith in the surveillance cameras, the old CCTV as it were, and finance was being removed. And what we did was introduce engagement with police and crime commissioners. And really, I started the conversation about the real value of surveillance. It wasn't now just a camera pointed at people. It was the capability for interconnectivity. It was the speed of transmission, the clarity of transmission, the ability to deploy resources. It was quite significant. And whilst there was a lessening of financial contributions seven years ago, we've now got a lot more engagement because of the IP element, the reliability, and the capability and capacity.

David Chauvin [00:06:46] I know that another big challenge, especially when expanding a surveillance system in the public sector, and especially in liberal democracies, is public relations and public perception. What was your approach to communicating both the mission and the vision of the program to the people of England and Wales?

Tony Porter [00:07:09] OK, another great question. I think what I would say is that from the get-go, I used the line that surveillance should be there to support the community, not to spy on it. And that's a nice thing to say, but it has to be followed with words and deeds. And I was very hard in certain sectors with the police where I felt that the visibility to the communities wasn't sufficient, where I felt there was a lack of transparency, where I felt that the public had a right to be a little bit suspicious, and I called that out. The response from law enforcement, I have to say, was magnificent. It was tough yards, but it did trigger several responses from the police to open up. For example, I ran a National Surveillance Camera Day, where many police forces opened their doors to members of the public to show them what they use for surveillance, how they do it, and how it's in their interests. So that was interesting. But another interesting thing was that one of our most senior human rights professors, well known in the world of surveillance, approached me and actually said: somehow, somewhere, the camera commissioner's office is conducting voodoo, because across the board, across the spectrum from civil rights lawyers to the police and the state, there is general support for the role. People trust and have confidence in the role because it's seen to be fair and impartial, and it actually speaks truth to power. And I think that, if anything, is one lesson for anybody involved in the surveillance business: we're not undercover people going around in dark masks, unless, of course, you're working for your intelligence agencies or on a police operation. Generally, surveillance is there to support people, so let's bring it out into the open. And that has probably been one of the strongest messages, and the one I'm most pleased about.

David Chauvin [00:09:05] So clearly, transparency for you is one of those core values, right?  

Tony Porter [00:09:09] Well, absolutely it is, because I actually believe that whilst generally there is no enabling legislation for this type of technology, in most jurisdictions it's through common law; it's not enabled by statute specifically. So you do need the public's support, because otherwise you lose it. And if there's abuse of power, abuse of surveillance, then I think you'd find pretty quickly that elected officials, police and crime commissioners in the UK, elected sheriffs and mayors across America and Canada, would bring that down if it was abused.

David Chauvin [00:09:46] When you came into the commissioner's chair, the surveillance camera code of practice had been in place for just a year, and you brought quite a few recommendations on how it could be improved. What were some of your key recommendations, and why was that so important to you?

Tony Porter [00:10:04] One of the things was that I felt the code needed to be expanded to incorporate a broader range of authorities mandated to comply with it, such as health and education. Now, you will know that in your jurisdiction it's very difficult, devilishly difficult, to change a statute. What I was able to do was circumvent the statute and bring in a voluntary scheme, whereby we have been successful in bringing education and health, the ambulance and health services, to certify voluntarily to the standards. And that has worked very well. I also asked the government to expand the code of practice; I felt that the code, even when I took over, required a bigger reboot. As I say, these things do take time, but we've just had a global first in one of our most senior courts, the Court of Appeal, where there was a facial recognition trial, and the Court of Appeal agreed with me that the surveillance camera code of practice needs to be rebooted. It needs to be reprovisioned. And I'm pleased to say the home secretary has also agreed with that. So I'm looking forward to seeing it undergo some kind of reboot in the next six months to a year.

David Chauvin [00:11:27] In North America, and especially in the US, a lot of cities have outright banned the use of facial recognition, both for forensic purposes and for live video. Do you think that by banning the technology they're actually legislating the right thing? Or would you say they should find more effective ways to govern around the technology, build policies that actually embrace it, and share those practices with the public?

Tony Porter [00:11:52] A friend of mine, Lord David Anderson, is an eminent leader in the field of surveillance. In fact, he drafted a seminal report in 2015, titled 'A Question of Trust', that focused on surveillance. On the question of live facial recognition, he said: I've got confidence that the state has the capability to draft a set of rules and regulations that can control this technology whilst at the same time delivering the benefits to the citizen. And surely that has to be the right position, because whilst I know that the opponents of the technology are very vocal, there needs to be a counter-debate, because this technology can save lives. It can identify lost people in the malls. It can pick up people who are vulnerable, who are perhaps subject to threats themselves. There needs to be a sensible discussion, and there need to be parameters around what the police and law enforcement are allowed to do. There needs to be a very close look at the private sector to make sure that they don't run off creating watch lists that can be disseminated across the country and hold innocent people to account. Of course that is the case. But this technology, I believe, is here and it's here to stay. And I think we need to join with the regulators, lawmakers, and policymakers to make sure that society benefits from it.

David Chauvin [00:13:18] Now, moving from software to hardware, another area where we've seen legislation expand in the US was, a few years ago, the John McCain NDAA, an outright procurement ban on specific state-owned technology vendors, mainly camera manufacturers from China, because of national security concerns. Where do you stand on the subject of banning the procurement of state-owned hardware?

Tony Porter [00:13:45] This is a matter of some political debate in the UK. It's at once a geopolitical issue, a macroeconomic issue, and an ethical issue. So you look at China and some of its companies: they push millions of pounds' worth of equipment into the UK, US, and Canada, and there is clearly an ethical dimension to that trade. We read of the terrible plight of the Uighur Muslims being suppressed in China. That in itself is an outrage! But the more complex and difficult issue is the use of some of the technology of the equipment providers. And I think we have a classic balance here between the geopolitical and macroeconomic on one side and the ethical on the other. We know that between the UK, Canada, and America there's probably upwards of, I think, just under seven hundred and fifty billion pounds' worth of trade. And that is a factor. But what I find very interesting in the UK is that MPs have started writing to the home secretary and to the prime minister, complaining about the risk of a back door into our surveillance infrastructure through Chinese state-owned cameras. And I think there are arguments there. I am also aware that an eminent voice on security threats in the UK, Admiral Lord West, raised objections to Chinese technology being used in Portcullis House in Westminster for those very reasons. So whilst the issue stretches beyond my regulatory remit, which runs up to Hadrian's Wall and to the Irish Sea and the South Coast, I do think there's an increasing question for people around the ethical purchase of equipment. I think that's a matter for individuals, and we're already seeing it in certain organizations. And that's really probably as far as I can take that discussion.

David Chauvin [00:15:44] As you move into the next phase of your career, what are some of the challenges and the priorities that you see for your successor? And if you could, what advice would you give that person?  

Tony Porter [00:15:54] Well, I think the challenge my successor will have, which I didn't have at the get-go, is the interconnectivity of this technology: the merger between the law enforcement elements and the private sector, and the fact that the private sector, in many regards, is more powerful than the state sector in terms of surveillance. These are dynamics that the new commissioner will have to grasp. I'm already drafting policy guidance on the use of facial recognition, which will be published next week, before the end of my commission. But I think these are going to be enduring challenges for the new individual. The advice I would give would be to remain connected with the industry. The industry is very powerful and very supportive; I have had nothing but energy, which has enabled me, with such a small team, to have an impact on legislation and policy. So keep engaged with the industry, keep engaged with the end-users and the different components, maintain the national strategy, maintain a high profile, make a difference, and basically work your socks off. Because as a public commissioner, it's not a free ride on the taxpayer. You should really leave the field with nothing left. And I hope that's what will be said about me at the end of my commission.

David Chauvin [00:17:19] Well, Commissioner, thank you so much. This was a fantastic interview. I really appreciate your time.  

Tony Porter [00:17:23] Great pleasure. And all the best to you and all your listeners over there.  

John Oliver (clip) [00:17:35] Our main story tonight is government surveillance, and I realize most people would rather have a conversation about literally any other topic.

Kelly Lawetz [00:17:44] Most people, or perhaps many people, Mr. Oliver, but certainly not our next guest, Cindy Cohn, who heads up the Electronic Frontier Foundation, the world's oldest and largest digital rights organization, turning 30 this year, which makes it older than the World Wide Web itself. I jumped right in by asking Cohn if she thinks all video surveillance is bad.

Cindy Cohn [00:18:08] Well, no, but I want to qualify that. I mean, if you define surveillance as, you know, anybody seeing something about you, then no, it's not all bad. If you define surveillance as government or corporate spying on you without your permission or knowledge, then, yeah, I would say it's all bad. Certainly if the government goes and gets a warrant and is surveilling someone who is doing illegal activities, there's a space for surveillance that is legal. What we have now, though, are forms of mass surveillance where a lot of us are being spied on even though we're not all suspects: surveillance kind of everywhere, without a lot of permission or knowledge, by both governments and companies. And I think that there's a lot wrong with that.

Kelly Lawetz [00:19:01] In our first episode, I spoke with Dr. Ann Cavoukian and she was the architect of privacy by design and the former privacy commissioner of Ontario. And she really underscored the false dichotomy, the idea that there is a choice between privacy and security. How do you see that relationship?  

Cindy Cohn [00:19:21] Well, I'm a big fan of Dr. Cavoukian, and I completely agree with her. People who are interested in normalizing surveillance will tell you that you have to trade off freedom from surveillance, or privacy, for your security. And it's just not true. In fact, if you really unpack it, it's kind of ridiculously backwards. One of the things that makes you most secure is encryption; as I said, I started my career freeing up encryption from government control. That makes you secure against all sorts of bad guys. Being able to lock your own front door and make sure nobody can get in increases your security, it doesn't decrease it. The idea that the government has to be able to get in so they might be able to watch you in case you're a future criminal, that's their theory, and I think it makes you less secure.

Kelly Lawetz [00:20:12] So why is that belief still in play?  

Cindy Cohn [00:20:15] Well, very powerful people want you to believe it. So that's the start of it. And they're not wrong that there are some situations in which the government needs to do surveillance. The question is, again, it gets presented as an either-or thing, and I think that's wrong; the world is much more gray than that. But either-or messages are very easy to put out, and the more nuanced message about due process and control is actually just a harder message to get out to people. Most importantly, though, the people who have the big megaphones are the governments and the big companies, and so they're able to get messages out in ways we can't. You know, I'm very proud of the EFF. I think we're an extremely important player, but we're nowhere near the size of the workforce or the reach of the governments we're trying to hold to account, let alone the Facebooks and Googles and others who engage in surveillance as part of their business model.

Kelly Lawetz [00:21:17] Anyone reading your biography would soon learn about the landmark case of Bernstein versus the Justice Department, and you were the lead attorney in that case. Why is that case so important as it relates to tech and its development?

Cindy Cohn [00:21:32] Well, the case settled a couple of things early on, and I think it set the tone for a lot of what came afterwards. The case involved a guy who was a graduate student at the time, Dan Bernstein, who had written a computer program that did encryption. A very simple program, actually called Snuffle. And Snuffle, while not particularly groundbreaking in terms of encryption, was groundbreaking because it was the vehicle by which we freed up encryption from government control. One of the few things that gives you security in a digital network is encryption. And this is just secret codes, right? Encoding something in a way that strangers who see it, whether it's traveling from one place to another or sitting at rest, can't get access to it. You maintain control over who has access to your messages because you lock them up and only let certain people have the key. This technology is pretty old; Julius Caesar had a famous cipher, it's really, really old. And in the 1990s, it was controlled by the government like munitions, in the US and lots of places around the world. It was on the list of things you couldn't export from the United States: surface-to-air missiles, tanks, software with the capability of maintaining secrecy. So our job in the case was to free up the ability to keep a secret in a digital network, in order to let the Internet thrive, so that you could buy things online, for instance, or do your banking, or just have a private conversation with someone you love. And so we set about this case in order to free up the publication of this kind of computer program, so that you could have the kind of security this technology brings. Along the way, we freed up encryption, and we were part of a much broader movement; I don't want to say we did it alone.
We were in the courts, there were movements in the legislatures, and the efforts inside the administration and around the world were pretty strong to take this out of the realm of munitions and governmental control and into something we could develop tools from freely. So, one, we succeeded in that. But along the way, we had to establish that computer programs were protected speech: freedom of expression, for purposes of the international context, but in the United States we talk about freedom of speech, because that's how our First Amendment is written. And so we had to convince the court that when you're writing code, you're expressing yourself. It's a language to express scientific and mathematical and even beautiful ideas, the same way that a writer or a poet is expressing ideas in the way they write. Computer programming languages are really languages, rarefied ways to communicate ideas both to other people and to machines. And so we were able, through that case, to establish this baseline idea that what programmers do is protected by the First Amendment, or internationally by freedom of expression, so that if the government wants to control what coders do, it has to meet a very high standard, a much higher standard than if we were going to treat this like a weapon or like it wasn't anything expressive. Our case helped establish that. Because of some legal technicalities, other cases are the ones usually cited for that proposition, a case called Junger and a later case involving 2600 Magazine; that's just the way litigation works in the United States. But our case was really the first one where a judge found this. And this, again, has set the tone for putting what coders do on the side of freedom.
It raised the barrier, the effort the government has to go to to say, no, you can't code that, much higher than it would otherwise be. And this freed up programmers in the United States, and around the world, to not have to go on bended knee for permission from a government before they write a computer program. And I think, arguably, it set the tone for how we think about the Internet now: as a place where, if you've got a good idea and you can code it up, you can release it to the whole world.

Kelly Lawetz [00:26:10] You know, one of the issues we've grappled with as a company is the introduction of state-sponsored devices. How would you advise governments in the U.K. and other countries in Europe about managing this problem?

Cindy Cohn [00:26:23] Yeah, well, I think it's a piece of a bigger problem, where governments are being pretty sneaky and pretty pervasive in trying to control how the tools you and I rely on work, and whether they work just for us or whether they also have a second mission, which in this instance is working for the Chinese government. But in other instances, again in the encryption fight in the United States, we have the US government pounding on the door of lots of tech companies trying to get them to do, frankly, something quite similar: making sure that there are back doors into their systems so that if the government wanted to come in, it could. So I think it's very serious, but it's important to be honest about this: it's serious no matter which government is doing it. Because user privacy should be the most important thing, and our governments, your Canadian government too, should stand up for user privacy, and not, on the one hand, criticize the Chinese for this and, on the other hand, ask for essentially the same thing, perhaps with more due process protections, but similarly dumbing down our technology. So yeah, I think it's really serious, and this is one where governments can do a lot to help protect people around the world by basically saying, look, you can't sell into the market unless you make some basic privacy promises that are real promises. That's just part of qualifying to sell into a particular market. And governments have this extra lever that's really important: governments buy a lot of technology. So if a government says, look, we're not going to buy any of these technologies for our use, that's a good start. But then they may need to pass, you know, basic privacy law. We need basic privacy protections for users.
And we need to take the position that technologies that don't meet basic privacy standards are not viable, or, to put it the other way, that people can sue and can protect themselves against any of these technologies.

Kelly Lawetz [00:28:42] What is the EFF working on in 2021? What are the plans going forward?

Cindy Cohn [00:28:48] Well, we have a few things that we know are coming up. I would say that at EFF we usually keep somewhere between 50 and 80 percent of our capacity on patrol; our job, and what we try to maintain the nimbleness to do, is to be able to jump in when there's a problem. I think we are going to see a lot of continued attacks on encryption, and it's really important that we protect this technology and recognize it for the security technology that it is. I think we're going to see attacks on platforms, specifically on a law called Section 230, which makes a platform not responsible, or at least partially not responsible, for what users do on the platform. It only applies to certain things, but I think we're going to see more attacks on it, because in the United States, anyway, both people on the right and people on the left are upset about this immunity law. The tricky thing is they're upset about it for almost exactly opposite reasons, which is going to be interesting in terms of what they do: the Democrats on the left think that the platforms are not censoring enough, and the Republicans on the right think that the platforms are censoring too much. So this doesn't lead to a very easy answer about what they might do, but I think it's going to be front and center. And then we're going to continue to look at some of the issues around your privacy in COVID times, and also accessibility. One of the things the COVID pandemic has made clear is how important real broadband at home is, and how reliant we all are on it to do our work. We've got school kids driving to parking lots to get access to the Internet, and that's just crazy in 2021. So we're going to continue to push for more equity in how technology is deployed and used. I also think we're going to continue reckoning with racial justice in the United States and around the world.
And I suspect there will be lots of things going on around technology, specifically protecting the technology that lets us see what's going on. There's still a lot of effort to try to make it illegal to film the police; France just had an experience with such a law, and there's still litigation in the United States. Sometimes it flies under the radar, but there are still efforts to prevent us from seeing what our police do. And that's a lot of the work we do with local groups all across the country, and sometimes around the world: making sure we maintain transparency in how the police and law enforcement do their job. Because democracy starts with the idea that people are in control, and people can't be in control of things they don't know about.

Kelly Lawetz [00:31:33] So a lot to be vigilant for.  

Cindy Cohn [00:31:36] And I didn't even touch on the tech giants, right? That's a whole other area where we're beginning to see a lot of work to try to promote competition in a lot of tech spaces. You know, I didn't get involved with the Internet so that it would just be five companies at the back end of everything. There have been some antitrust actions started in the United States that we're quite supportive of, and there are some in Europe. And our interest in this is not just making the companies pay or something; it's to try to figure out how we get from here to a more distributed, more egalitarian Internet, back to where we started, where anybody with a good idea can code it up and present it to the world. We've really lost a lot of that in the way that big tech has become so centralized, especially if people want funding. I don't know if your company has struggled with this, you're in a little different space, but there's this thing called the kill zone around the big tech companies, where nobody who's competing with them can get any funding. So we think there's a lot of work that can be done, both in the context of the antitrust cases, where primarily lawyers will be participating, and in thinking about ways we could really build towards a more distributed infrastructure: making more technologies interoperable, and making sure that data is privacy-protected but that you can also leave. Right now, Facebook doesn't have customers, it has hostages, and I'd like to see that change. So there's a lot of technical-side work to make the kind of giant silos that are the big tech companies much more permeable and much more amenable to interoperating tools, and that, I think, is an important piece of the story around combating the tech giants that EFF is particularly positioned to address.
So, yeah, it's a big agenda. You know, we were kind of a little ragtag team of activists when we started 30 years ago. But now we have to be much bigger, because the issues facing the Internet, and the importance of those issues, are much, much bigger.

Kelly Lawetz [00:33:49] Cindy Cohn, thank you so much for your time today. I really enjoyed this talk and all the best in the New Year.  

Cindy Cohn [00:33:56] Thank you. You too. Thank you for inviting me. It was an honor to be on.  

Kelly Lawetz [00:33:59] And that was Cindy Cohn, executive director of the Electronic Frontier Foundation, based in San Francisco. That's it for this week's show. We hope you enjoy these conversations as much as we have. As always, thanks for listening to Engage. And we hope you can join us again.  

[00:34:16] Engage, a Genetec podcast, is produced by Brendan Tully Walsh; the associate producer is Angele Paquette. Sound design is provided by Vladislav Pronin, with production assistance from Caroline Shaughnessy. The show's executive producer is Tracey Ades. Engage, a Genetec podcast, is a production of Genetec Inc. The views expressed by the guests are not necessarily those of Genetec, its partners, or customers. For more episodes, visit our website, listen on your favorite podcasting app, or ask your smart speaker to play Engage, a Genetec podcast.