Smile, you’re on camera! On the safety and dangers of surveillance
Govanhill has more public CCTV cameras than wards many times its size. But who’s being watched? As faulty systems persist and facial recognition expands, troubling links emerge to Israeli surveillance firms tied to the oppression of Palestinians.
By Dylan Beck | Illustration by Jillian Mendoza | Photo by Rob Reid
…on how many cameras?
With a total of between 434 and 552 cameras (numbers vary between reports and can shift as the network is updated), Glasgow has the largest public space CCTV network in Scotland. Out of the 441 recorded in an open data map two years ago, 14 are in the Southside Central ward. In fact, a tiny area with Victoria Road to the west, Calder Street to the north, Garturk Street to the east and Dixon Avenue to the south accounts for 1.6 percent of the city’s CCTV network. A total of seven cameras, to be precise: one camera more than the whole ward of Maryhill, an area more than 50 times the size. The entirety of Partick East/Kelvindale ward, meanwhile, boasts a total of one camera. While these are only a drop in the ocean – the vast majority of CCTV cameras in Britain are privately owned and the data is difficult to acquire – the numbers point to a rather disturbing reality: while we all live in the age of surveillance, some of us are being watched more than others.
Some councillors have recently presented arguments as to why, in fact, we are not being watched enough. See, our dear city, while hospitable to greenery, extends much less kindness to technology: due to the all-too-familiar weather conditions, our CCTV cameras are prone to malfunction. In the absence of regular maintenance, anywhere between 8 and 25 percent of the city’s network is now considered faulty. Councillor Laura Doherty, the city convenor for neighbourhood services and assets, explained that camera and network faults are currently addressed as quickly as possible, but that significant investment is overdue, given the system was last upgraded 12 years ago. Councillor Kevin Lalley, meanwhile, has argued for an increase in the daily hours allocated to monitoring the footage, which, while recorded 24/7, is currently only monitored live for 12 hours a day.
New technologies, old harms
But while the local hardware is facing some challenges, the software behind it has progressed leaps and bounds with the advancements of Artificial Intelligence (AI). Live facial recognition – a technology whereby live video footage is compared to a predetermined database of faces in search of matches – is being considered in Scotland. Examples from South Wales and London, where this is already in place, serve as a cautionary tale of what this would mean. In South Wales, where the system produced over 2,800 false alerts, the Court of Appeal found South Wales Police’s use of facial recognition technology to have been in breach of privacy rights, data protection laws and equality laws. In London, a previous report found the Metropolitan Police’s AI facial recognition system to have been correct on only 19 percent of occasions: one of many misidentifications was that of Shaun Thompson, a Black anti-knife crime community worker, who was wrongly flagged, detained and questioned by the Met.
Camera in Glasgow’s Southside
While the use of AI technology has implications for the privacy of everyone, some pay a higher cost, as such errors are not simply random but systemic. Rather than undoing it, technology reproduces existing injustice. Lower income neighbourhoods, and especially those with a higher population of people of colour – like Govanhill – are more likely to be subjected to dense surveillance coverage. At the same time, institutional racism in policing (which the police themselves admit to: following the findings of a review in 2023, the chief constable of Police Scotland admitted that the force is institutionally racist and discriminatory) means that racialised people are overrepresented in police records to begin with. Facial recognition technology adds a whole other layer to this: a report by the National Physical Laboratory found the police’s live facial recognition software to be less accurate for people of colour, and Black women in particular. This is in line with previous research in the United States, where AI systems were shown to be particularly bad at recognising the faces of anyone who is not white.
It is perhaps then unsurprising that the technology exacerbating such harms frequently also has harmful origins. In 2021, at least half of London’s boroughs were reported to be deploying surveillance systems made by Hikvision or Dahua, companies linked to the genocide of Uyghur people in China. Meanwhile a recent local campaign, ‘Stop Israeli Tech Watching Glasgow’, has uncovered that the surveillance software is supplied to Glasgow City Council, via a subcontractor, by an Israeli company. Their technology was developed by former members of a cyber spy unit tasked with surveilling Palestinian people living under Israeli apartheid. It continues to be used in Palestine now, as Israel conducts its genocide in Gaza. [Ed: An ongoing genocide is being committed towards the Palestinians by Israel, as evidenced by human rights organisations, a United Nations special committee and Amnesty International]
Security without surveillance?
An area being surveilled by cameras might provide a sense of safety – but it is evident that the technologies in question actively make many people’s lives more unsafe. The presence of CCTV cameras has not been shown to reduce violence, either: for this, strategies addressing issues such as poverty and marginalisation are infinitely more important. Given the increase in shoplifting due to the cost of living crisis, it would be safe to assume that ensuring people have the essentials they need would help with this, too. There are no quick fixes, but recognising that surveillance is not one either seems like as good a place to start as any. When it comes to where the public money is spent, health and education might prove a more worthwhile investment than a whole new CCTV system, let alone funding a genocidal state.
As for the more immediate sense of safety – the ways to build this are endless, and often easily available: from the abstract, like interrogating our preconceived notions of what safety or danger look like, to the practical, such as learning about bystander intervention, de-escalation and conflict resolution, and self-defence. Community initiatives, such as Govanhill Food Not Bombs or Community Litter Pick, and spaces such as Agnew Lane, are good places to start: ultimately, it is about coming together with our neighbours and building trust and community where we all look out for each other, rather than watching one another through a doorbell camera.