OpenAI quietly updated its policies, lifting the ban on military and warfare applications.


OpenAI may be paving the way toward exploring its AI's military potential. As first reported by the Intercept on Jan. 12, a new company policy change has completely removed previous language banning "activity that has high risk of physical harm," which included the specific examples of "weapons development" and "military and warfare."

As of Jan. 10, OpenAI's usage guidelines no longer included a prohibition on "military and warfare" uses in the existing language obligating users to prevent harm. The policy now retains only a ban on using OpenAI technology, such as its large language models (LLMs), to "develop or use weapons."

Subsequent reporting on the policy edit pointed to the immediate possibility of lucrative partnerships between OpenAI and defense departments seeking to utilize generative AI in administrative or intelligence operations.



In Nov. 2023, the U.S. Department of Defense issued a statement on its mission to promote "the responsible military use of artificial intelligence and autonomous systems," citing the country's endorsement of the international Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy — an American-led set of "best practices," announced in Feb. 2023, developed to monitor and guide the development of AI military capabilities.

“Military AI capabilities include not only weapons but also decision support systems that help defense leaders at all levels make better and more timely decisions, from the battlefield to the boardroom, and systems relating to everything from finance, payroll, and accounting to the recruiting, retention, and promotion of personnel, collection, and fusion of intelligence, surveillance, and reconnaissance data,” the statement explains.

AI has already been used by the American military in the Russia-Ukraine war and in the development of AI-powered autonomous military vehicles. Elsewhere, AI has been incorporated into military intelligence and targeting systems, including an AI system known as "The Gospel," which Israeli forces have used to pinpoint targets and reportedly "reduce human casualties" in attacks on Gaza.

AI watchdogs and activists have consistently expressed concern over the increasing incorporation of AI technologies in both cyber conflict and combat, fearing an escalation of armed conflict on top of long-documented biases in AI systems.

In a statement to the Intercept, OpenAI spokesperson Niko Felix explained the change was intended to streamline the company’s guidelines: “We aimed to create a set of universal principles that are both easy to remember and apply, especially as our tools are now globally used by everyday users who can now also build GPTs. A principle like ‘Don’t harm others’ is broad yet easily grasped and relevant in numerous contexts. Additionally, we specifically cited weapons and injury to others as clear examples.”

OpenAI introduces its usage policies in a similarly pared-down spirit: "We aim for our tools to be used safely and responsibly while maximizing your control over how you use them."
