
Rain ruined the NYPD’s first chance to use security drones at a major event

The New York Police Department (NYPD) discovered the limitations of current drone technology on New Year’s Eve when rain forced it to cancel plans to use the machines as part of its security operation during the celebrations in Times Square.

NYPD Chief Terence Monahan confirmed in a tweet that the police department’s quadcopters wouldn’t be taking to the skies because of the lousy weather that had descended on the city.


The plan had been to use the camera-equipped drones to monitor the huge crowds celebrating the arrival of the new year in Times Square, but the inclement weather posed a risk to the drones’ stability, and the last thing the NYPD wanted was to have one of its flying machines falling from the sky and possibly onto someone’s head.

It would have been the first time the NYPD had deployed its drones for security purposes at a large-scale event.

The NYPD’s drone fleet comprises 11 DJI Mavic Pro quadcopters, two DJI Matrice 210 RTK quadcopters, and one DJI Inspire 1 quadcopter, but all of the remotely controlled aircraft were grounded during Monday night’s celebrations.

When the NYPD first announced its plan to use drones as part of its New Year’s Eve security operation, Deputy Commissioner of Intelligence and Counterterrorism John Miller said the machines would offer “visual aid and flexibility,” able to fly quickly over a large crowd to any spot deemed a location of interest while supporting the operation’s 1,225 portable and stationary cameras.

But now the police department will have to wait another day to deploy its drones at a major event.

The decision to ground the drones for fear that they would malfunction in rain and strong winds highlights the technology’s current limitations for securing events that draw large crowds.

The NYPD announced in December 2018 that it would begin using drones for some of its work, which besides event security could also include search-and-rescue operations, crime scene investigations where the location is hard to access, hostage situations, and incidents where hazardous materials are present.

The equipment is operated by licensed police officers of the Technical Assistance Response Unit (TARU), each of whom has received extensive training, the NYPD said.
