At the Security and Counter Terrorism Expo (SCTX) 2019, Dronestream CEO Harry Howe presented his firm’s collaboration with artificial intelligence (AI) firm Skylark Labs: new surveillance capabilities for autonomous crowd scanning and threat detection, delivered via drones as well as mounted CCTV cameras.
Skylark uses AI and machine learning to autonomously scan crowds of people in real time and identify a wide variety of potential threats. Dronestream specialises in low-latency live-streamed video feeds, adding tools such as mapping software and alert functions, contained within one app or website, to help users quickly locate a threat that Skylark’s AI system has detected.
According to its developers, the Skylark system can detect almost anything fed into its dataset, including missing persons and persons of interest, potential weapons, and criminal ‘buzz words’. It does so through rapid real-time analysis of faces, gaits and objects, and also has a lip-reading capability.
The software can identify a missing person from a childhood photo, and in the absence of any photo can identify a missing person or person of interest by analysing photos of the individual’s mother and father.
On paper, the technology sounds impressive, and Howe gave a variety of visual examples of the working system during his presentation. However, testing so far has been limited and the system is yet to gain accreditation in the UK, or be trialled by the UK police, although there have been partnerships with the Indian Government and the Canadian police.
Drone surveillance: a solution to a UK problem?
Skylark’s drone surveillance technology could be of particular interest to the UK’s security industry, where the government’s austerity programme has led to reductions in policing capabilities.
“We are in a position now where there are 20,000 fewer police officers on the street than there were in 2010,” said Howe. “There is nothing yet to plug those gaps other than software.”
Skylark’s software can be adapted for use with drones or applied to one of the four million CCTV cameras across the UK.
“If you look at how many people are in that street, it would be quite difficult for a CCTV operator to really quickly identify if there was a fight breaking out in the crowd,” Howe said, giving a visual example. “The system does it automatically and alerts [users] in real time exactly when that happens. Now even if the drone is at 300ft, it can still operate effectively.”
Around 50% of UK police use drones for activities such as crowd monitoring, and identifying persons of interest or potential threats. An AI co-pilot in the form of Skylark’s technology could reduce the size of the police team to just one operator, according to Howe.
Skyface and weapons detection
Two unique features of the system are the Skyface age-invariant face recognition platform and the weapon detection capability.
“Say you have only got one photo of this person aged 20. That was 20 years ago, he is now 40 years old and he looks completely different,” Howe said. “What the age-invariant face recognition can do is use the photograph that you’ve given from 20 years ago and still identify him successfully at around a 90% accuracy rate.”
Furthermore, if the operator is looking for various kinds of knives and weapons, the system can identify these when they are in shot too, even if the imaging quality is low.
Howe added: “Even if it is quite a grainy shot of a knife against a background, it can still identify that it is a weapon. And in some cases tell you exactly what type of weapon it is.”
Flexibility is key
The key to the Skylark system is flexibility, according to the developers. For example, if you have fed in the photograph of a potential suspect, as soon as that suspect comes into shot, a green bounding box will appear onscreen and an alert will be sent to the relevant authorities.
“It is very straightforward,” Howe said. “What Skylark really excels in is that the data annotation is tailored to the datasets you give it. This works in real time, instantly. When he walks into shot he is identified, providing that the data has been fed in previously.”
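The detect-and-alert loop described above can be sketched in a few lines of Python. This is a hypothetical, simplified illustration only; the function names (`match_suspect`, `process_frame`), the similarity scores, and the 0.9 alert threshold are all assumptions, standing in for Skylark’s actual recognition model and Dronestream’s alerting layer.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    label: str
    confidence: float
    box: tuple  # (x, y, w, h) bounding box in frame coordinates


def match_suspect(frame_faces, suspect_id, threshold=0.9):
    """Hypothetical stand-in for the recogniser: keep only faces whose
    similarity to the enrolled suspect photo exceeds the alert threshold."""
    return [Detection(suspect_id, score, box)
            for score, box in frame_faces if score >= threshold]


def process_frame(frame_faces, suspect_id, send_alert):
    """For each confident match, the UI would draw a green bounding box
    around det.box and an alert would be pushed to the operator."""
    detections = match_suspect(frame_faces, suspect_id)
    for det in detections:
        send_alert(f"{det.label} detected at {det.box} "
                   f"(confidence {det.confidence:.0%})")
    return detections


# Usage: two candidate faces in a frame; only one matches above threshold.
alerts = []
frame_faces = [(0.95, (120, 40, 64, 64)), (0.55, (300, 80, 60, 60))]
process_frame(frame_faces, "suspect-001", alerts.append)
print(alerts)
```

The essential design point from the presentation is that the system only alerts on data it has been fed beforehand: no enrolled photo, no match.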
The system can also be modified to scan large areas while still adhering to UK privacy laws, if necessary.
“In the UK, we are very, very tight on this. It all depends on what country you are using this software in, and it can be tailored to meet the specifications of a specific country,” Howe responded. “What Skylark do is, say for example you don’t want to see through a set of windows, the system can identify those windows and perhaps blur them out. The system can be as flexible as you need it to be.”
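The window-blurring behaviour Howe describes amounts to redacting excluded regions of each frame before analysis. Below is a minimal sketch of that idea, assuming frames arrive as 2D pixel grids and that the excluded rectangles (the `regions` list) are supplied by an upstream detector; a production system would blur rather than blank the pixels, as the code comments note.

```python
def redact_regions(frame, regions, fill=0):
    """Blank out privacy-excluded regions (e.g. windows) in a frame
    before it is passed on to the analysis pipeline.

    frame   -- 2D list of pixel values (rows of columns)
    regions -- list of (x, y, w, h) rectangles to redact
    """
    for x, y, w, h in regions:
        for row in range(y, min(y + h, len(frame))):
            for col in range(x, min(x + w, len(frame[row]))):
                frame[row][col] = fill  # real system would blur, not blank
    return frame


# Usage: a 4x6 toy frame with one "window" rectangle redacted.
frame = [[1] * 6 for _ in range(4)]
redact_regions(frame, [(2, 1, 2, 2)])
for row in frame:
    print(row)
```

Because the redaction runs before detection, nothing behind an excluded window can trigger a match or an alert, which is how per-country privacy rules could be enforced in software.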
Ironing out potential flaws
At first glance, it seems like a very simple, yet effective system. A threat is detected, a bounding box appears, and an alert is sent to a police officer or security official, giving the human more time to react to a threat or respond to an incident.
But what if a person is incorrectly identified as a suspect? What if a fairly mundane object is detected as something more sinister? These are just some of the issues that Skylark and Dronestream are trying to address.
During the presentation at SCTX 2019, a few potential issues were raised that questioned the effectiveness of the drone surveillance system.
Asked how effective the system would be when monitoring a large crowd, such as 90,000 people leaving a football stadium, for example, Howe conceded that the sheer scale of a crowd will affect the drone camera’s ability to detect faces and weapons, but the system can be set up to analyse a wider area.
“If it’s a drone [camera], if you’re working at 300ft and there’s 1,000 people below you then it’s going to be very difficult to get an accurate representation because as people get bunched in together, they just become heads,” Howe said. “So in that case you would have to rely on fixed cameras and from different angles but it is certainly scalable.”
Another issue concerned the margin of error and the issue of culpability after detecting and arresting an individual based on the AI system when it has wrongly identified a suspect, or a weapon has been incorrectly identified, such as a chef’s work knives or child’s toy gun.
“That’s why people cannot be completely removed from the frame because a person’s intention can only go so far with the machine,” Howe noted. “What you can do is train the machine algorithm to identify the aggressive movements that go along with holding a weapon. So, perhaps if the gun is drawn, the body posture or the way they are moving, or age group, you can narrow it down. That’s something that the machine can’t do.”
The technology has the potential to become a real asset to the defence and security sector in terms of crowd and open-space surveillance, but Skylark and Dronestream will need to further mature the technology and, most importantly, prove the system works to a level of accuracy that the security industry can rely on.