AI drone that could hunt and kill people built in just hours by scientist 'for a game'

Article
Archive


The scientist who configured a small drone to target people with facial recognition and chase them at full speed warns we have no defenses against such weapons.

[Image: Luis Wenus, an entrepreneur and engineer, incorporated AI and facial recognition into the small drone so it could chase people down at full speed.]

It only takes a few hours to configure a small, commercially available drone to hunt down a target by itself, a scientist has warned.

Luis Wenus, an entrepreneur and engineer, incorporated an artificial intelligence (AI) system into a small drone to chase people around "as a game," he wrote in a post on March 2 on X, formerly known as Twitter. But he soon realized it could easily be configured to contain an explosive payload.

Collaborating with Robert Lukoszko, another engineer, he configured the drone to use an object-detection model to find people and fly toward them at full speed, he said. The engineers also built facial recognition into the drone, which works at a range of up to 33 feet (10 meters). This means a weaponized version of the drone could be used to attack a specific person or set of targets.

"This literally took just a few hours to build, and made me realize how scary it is," Wenus wrote. "You could easily strap a small amount of explosives on these and let 100's of them fly around. We check for bombs and guns but THERE ARE NO ANTI-DRONE SYSTEMS FOR BIG EVENTS & PUBLIC SPACES YET."

Wenus described himself as an "open source absolutist," meaning he believes in always sharing code and software through open source channels. He also identifies as an "e/acc," a school of thought among AI researchers that favors accelerating AI research regardless of the downsides, out of a belief that the upsides will always outweigh them. He said, however, that he would not publish any code relating to this experiment.

He also warned that a terror attack using this kind of technology could be orchestrated in the near future. While engineering such a system still requires technical knowledge, writing the software will only get easier over time, partly because of AI coding assistants, he noted.

Wenus said his experiment showed that society urgently needs to build anti-drone systems for civilian spaces where large crowds could gather. There are several countermeasures society can build, according to Robin Radar, including cameras, acoustic sensors and radar to detect drones. Disrupting them, however, could require technologies such as radio-frequency jammers, GPS spoofers, net guns and high-energy lasers.
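For the camera-based detection piece mentioned above, here is a minimal sketch of how a venue camera feed might be scanned for drones. It assumes the Ultralytics YOLO and OpenCV Python packages and a hypothetical set of weights fine-tuned on drone imagery; the article names no specific tools, so this is purely illustrative.

```python
# Sketch only: camera-based drone detection, one of the countermeasures listed above.
# Assumes the ultralytics and opencv-python packages and hypothetical weights
# ("drone-detector.pt") fine-tuned on drone imagery; none of this is named in the article.
import cv2
from ultralytics import YOLO

model = YOLO("drone-detector.pt")     # hypothetical fine-tuned detector
cap = cv2.VideoCapture(0)             # camera pointed at the airspace over a venue

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]      # run detection on this frame
    for box in result.boxes:
        label = result.names[int(box.cls)]
        confidence = float(box.conf)
        if label == "drone" and confidence > 0.5:
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            print(f"Drone at ({x1},{y1})-({x2},{y2}), confidence {confidence:.2f}")
            # A real system would raise an alert here and hand off to a
            # disruption layer (RF jamming, net gun, etc.) rather than print.

cap.release()
```

Detection alone only raises an alert; the jammers, net guns and lasers mentioned above would still be needed to actually bring a drone down.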

While such weapons haven't been deployed in civilian environments, they have been previously conceptualized and deployed in the context of warfare. Ukraine, for example, has developed explosive drones in response to Russia's invasion, according to the Wall Street Journal (WSJ).

The U.S. military is also working on ways to build and control swarms of small drones that can attack targets. This follows the U.S. Navy's 2017 demonstration that it could control a swarm of 30 drones carrying explosives, according to MIT Technology Review.
 
People are stupid if they think the government hasn't had this tech for years and hasn't already been using it.

There are a variety of defenses, too. They can't recognize targets very well if you hide your face with a mask and disguise the human form with a cloak or robe.

They're vulnerable to ECM, microwaves, etc. Shotguns and/or nets can disable/immobilize them. And most of them are loud as fuck.
He also identifies as an "e/acc," a school of thought among AI researchers that favors accelerating AI research regardless of the downsides
Ah, so he's an idiot who wants either a singularity or the government to perfect China's surveillance state. Got it.
 
as if acquiring manure & ammonium is hard for anyone not in a concrete jungle
Fertilizer bombs are way too heavy for a drone bomb that's any more deadly than just shooting someone with a gun. You need an efficient high explosive for that, and you're not getting one of those without a visit from a couple dozen heavily armored house guests.

Basically, anyone who has the resources to build a drone bomb that's a novel threat also has access to RPGs, grenades, and other things that make the drone aspect largely unnecessary.
 
Posting the words succulents, daffodils and tulips in general just to alleviate any keyword alerts on those F & A words. I love gardening, it's so relaxing~♪
"Corn" is a heavy feeder so you're going to need a lot of "nitrogen" if you want a bountiful "harvest."
 
  • Like
  • Informative
Reactions: SIMIΔN and AFAB
Nobody's posted the obvious yet?
He also identifies as an "e/acc," a school of thought among AI researchers that favors accelerating AI research regardless of the downsides, out of a belief that the upsides will always outweigh them.
This man is so obsessed with whether or not he could that he never stopped to ask whether or not he should.
 
  • Winner
Reactions: NoSpiceLife
Fertilizer bombs are way too heavy for a drone bomb that's any more deadly than just shooting someone with a gun. You need an efficient high explosive for that, and you're not getting one of those without a visit from a couple dozen heavily armored house guests.

Basically, anyone who has the resources to build a drone bomb that's a novel threat also has access to RPGs, grenades, and other things that make the drone aspect largely unnecessary.
It doesn't take a lot of explosive next to a person's face to kill them.

You can buy tons of fireworks without getting a visit from anyone. You can buy nail polish remover and laundry detergent without raising any eyebrows.
 
In 10 years drones will be banned. They’re already a security nightmare. That’s only going to get worse as people get more creative with weaponizing them.
 
  • Agree
Reactions: Orkeosaurus
The scientist who configured a small drone to target people with facial recognition and chase them at full speed warns we have no defenses against such weapons.
Pfffft, only scrubs use the Bomb Drone Killstreak in COD.
 
It doesn't take a lot of explosive next to a person's face to kill them.
Or explosives at all. 00 buckshot is nasty.
In 10 years drones will be banned. They’re already a security nightmare. That’s only going to get worse as people get more creative with weaponizing them.
DIY drones (optionally strapped to pipe shotguns and smartphones) are essentially impossible to regulate out of existence.
 
Drones as weapons have existed for a long time now, and I don't see how facial recognition AI is going to significantly improve them unless you have them hovering close to the ground scanning everyone's face, so this is just another poorly written AI shill article.
 
  • Like
Reactions: IAmNotAlpharius
Lol @ the journo calling him out for being a hypocritical faggot
"To know and not to know, to be conscious of complete truthfulness while telling carefully constructed lies, to hold simultaneously two opinions which cancelled out, knowing them to be contradictory and believing in both of them, to use logic against logic, to repudiate morality while laying claim to it, to believe that democracy was impossible and that the Party was the guardian of democracy, to forget whatever it was necessary to forget, then to draw it back into memory again at the moment when it was needed, and then promptly to forget it again, and above all, to apply the same process to the process itself—that was the ultimate subtlety: consciously to induce unconsciousness, and then, once again, to become unconscious of the act of hypnosis you had just performed. Even to understand the word—doublethink—involved the use of doublethink."
 
  • Thunk-Provoking
Reactions: IAmNotAlpharius