Cameras and machine learning technologies are helping the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and Sandia National Laboratories create a more precise drone detection capability based on visuals alone.
“If you have a video of something, you can kind of identify it based on certain characteristics,” explained Jeff Randorf, an S&T engineering advisor. “You can train neural networks to recognize patterns, and the algorithm can begin to pick up on certain features.”
Until now, video-based drone detection was limited to raw data analysis: capturing footage and learning from the video alone. The novel temporal frequency analysis (TFA) being tested at Sandia dives deeper into the image. Instead of heat signatures, acoustic signatures, or taking video at face value, TFA analyzes how the intensity of each pixel fluctuates over time, eventually deriving a “temporal frequency signature” for the drones it has observed. Pairing robust imaging systems with machine learning in this way makes seamless discrimination only a matter of time.
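The article does not disclose Sandia’s implementation, but the core idea of a temporal frequency signature can be sketched: treat each pixel’s intensity across a stack of video frames as a time series, transform it into the frequency domain, and aggregate the per-pixel spectra into one signature. Below is a minimal illustrative sketch in Python with NumPy; the function name, the grayscale-frame input, and the simple pixel-averaging step are assumptions for illustration, not Sandia’s code.

```python
import numpy as np

def temporal_frequency_signature(frames, fps):
    """Illustrative sketch (not Sandia's algorithm).

    frames: array of shape (T, H, W), grayscale pixel intensities
            across T consecutive video frames.
    fps:    camera frame rate in Hz.

    Returns the frequency bins and an averaged magnitude spectrum
    describing how pixel intensities fluctuate over time.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Subtract each pixel's mean so the DC component doesn't dominate.
    detrended = frames - frames.mean(axis=0, keepdims=True)
    # FFT along the time axis yields a per-pixel frequency spectrum.
    spectrum = np.abs(np.fft.rfft(detrended, axis=0))
    # Average over all pixels to get a single signature for the scene.
    signature = spectrum.mean(axis=(1, 2))
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    return freqs, signature
```

In principle, a multirotor’s spinning blades modulate pixel intensities at characteristic rates, so such a signature would differ from that of a flapping bird or a passing car.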
“Current systems rely upon exploiting electronic signals emitted from drones,” said Bryana Woo of Sandia National Laboratories, “but they can’t detect drones that do not transmit and receive these signals.”
Previously, drones could be spotted by picking up the radio signal between a remote control and the drone itself, but as drones become autonomous, that capability may quickly vanish. TFA, by contrast, captures tens of thousands of frames of video, so a machine can learn what an object is from how it moves through time. If mastered, TFA could be the most precise discrimination method to date.
The Sandia tests consisted of capturing three different multirotor drones on a streaming video camera. Each drone traveled forward and back, side to side, and up and down while the camera recorded its spatial and temporal location, and a machine learning algorithm was trained on the captured frames. Ultimately, the analysis renders the full flight path of the target object in every direction.
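The article does not name the model Sandia trained, so the following is only a generic sketch of that training step, assuming one temporal frequency signature per video clip (as computed above) and using scikit-learn’s RandomForestClassifier as a stand-in for whatever algorithm Sandia actually used; the data here is a random placeholder.

```python
# Hypothetical training sketch: the article only says a machine learning
# algorithm was trained on captured frames. Sandia's model is not public,
# so this uses a generic classifier on placeholder signature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# X: one temporal frequency signature per clip; y: 1 = drone, 0 = clutter
# (birds, cars, helicopters). Both arrays are random stand-ins here.
rng = np.random.default_rng(0)
X = rng.random((200, 64))        # 200 clips, 64 frequency bins each
y = rng.integers(0, 2, 200)      # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```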
To challenge the system, testers introduced more complex data, filling the environment with clutter: birds, cars, and helicopters around the drone. Over time, Sandia observed considerable improvement in the system’s ability to discern whether an object was a drone or a bird.
TFA work with Sandia is part of a larger S&T effort to stay abreast of the latest drone technologies. The number of commercial and personal drones in the sky is expected to nearly triple within the current decade, raising concerns about how their traffic will be managed, how nefarious drones can be identified, and even how to tell drones apart from their surroundings.
There could always be new barriers to detection, which is why S&T has taken on the nefarious drone issue from multiple angles: evaluating drones for law enforcement components through demonstrations at Camp Shelby, Mississippi, developing a DHS interface for the future Unmanned Aircraft System (UAS) Traffic Management System, and keeping up with state-of-the-art counter-UAS capabilities.
TFA continues to be explored. When a machine can easily identify drones in flight using temporal frequency analysis, it will mark a step forward for S&T and its partners in securing the skies for the safety of hobbyists, commercial industry, and everyone below. The airspace can then be kept free of danger, open only to the delivery and recreational drones promising to make life more enjoyable for Americans.
The technology developed by Sandia is the subject of U.S. Patent Application Serial No. 16/141,385, entitled “UNMANNED AIRCRAFT SYSTEM (UAS) DETECTION AND ASSESSMENT VIA TEMPORAL INTENSITY ALIASING,” filed September 24, 2018.
For more information:
https://www.dhs.gov/science-and-technology/news/2018/11/02/snapshot-detecting-drones-through-machine-learning-cameras