Viral Video Shows Hobbyist-Built, AI-Assisted Rifle in Action

A viral TikTok video featured a homemade AI-guided robotic rifle, raising questions about how artificial intelligence could be used in the wrong hands.

Key Facts:

– The inventor, known online as STS 3D, designed a robotic arm that fires a rifle based on voice commands.
– OpenAI’s ChatGPT initially powered the system’s responses, but the inventor’s access was later revoked.
– OpenAI ended its explicit prohibition on “military and warfare” projects in January 2024.
– That same year, OpenAI entered into a partnership with Anduril, a defense contractor known for AI weapons.

The Rest of The Story:

In the widely shared video, STS 3D calmly instructs ChatGPT: “We’re under attack from the front left and front right. Respond accordingly.”

The rifle swings left, then right, firing blanks as ChatGPT obligingly confirms it is ready to do more if needed.

While the demonstration took place in a garage, observers reacted with a mix of shock and fascination, likening it to scenes from science fiction movies.

OpenAI quickly intervened by cutting off STS 3D’s use of ChatGPT.

Company officials pointed to policies that forbid employing AI tools for the “use of weapons.”

However, OpenAI’s own stance on military applications appears to be evolving.

In January 2024, the company removed a direct ban on uses posing a "high risk of physical harm," including weapons development.

The company soon partnered with Anduril, a California-based defense firm specializing in AI-powered drones and other sophisticated military technologies.

Critics see this trend as evidence that AI-guided weaponry is moving from research labs into mainstream development.

Governments worldwide already employ some level of AI in drones and surveillance equipment, but the idea of a fully autonomous “killer robot” has alarmed ethicists and security experts.

Hobbyist demonstrations like STS 3D’s project hint that this technology may soon be more readily available, creating concerns that even local conflicts could escalate through AI-driven weaponry.

What could possibly go wrong?

The Bottom Line:

A single homemade invention can spotlight the speed at which AI and weaponry can intersect.

As big tech firms like OpenAI pursue more military partnerships, many believe we are nearing an era in which automated and autonomous weapons become the norm.


Lawmakers and private companies alike will be under increasing pressure to establish guidelines that balance national security with ethical responsibilities.

For now, the question remains: how far should AI's role in warfare be allowed to go? We've all seen the movies that show how this could turn out.