OpenAI has announced ChatGPT Gov, an expansion of its flagship ChatGPT product built specifically for U.S. government agencies. The new tool gives agencies access to the company's premier AI models in a form OpenAI says can be used securely.
The product has not yet been approved for government use, though OpenAI hopes the Trump administration will accelerate the process.
The chatbot repeated false claims 30% of the time and gave vague or unhelpful answers 53% of the time in response to news-related prompts, for an overall fail rate of 83%, according to a report published Wednesday by the trustworthiness rating service NewsGuard.
OpenAI on Wednesday said that Chinese companies are actively attempting to replicate its advanced AI models following the release of new models by startup DeepSeek. The company said it was working with Washington to protect "the most capable models from efforts by adversaries and competitors to take US technology."
As noted by OpenAI, government agencies can deploy ChatGPT Gov within their own Microsoft Azure cloud instance, making it easier to meet security and privacy requirements. OpenAI says the launch could help advance the use of its tools "for the handling of non-public sensitive data."
Chinese AI chatbot DeepSeek has displaced OpenAI's ChatGPT as the most downloaded app on the Apple App Store, and the market is panicking. Shares of major AI-connected companies such as NVIDIA fell on Monday morning following the news.
Security experts are urging caution for anyone considering the emerging AI chatbot DeepSeek, citing the app's links to China and the potential implications for personal data.
However, the emerging consensus is that DeepSeek outperforms ChatGPT on more technical tasks. If you use AI chatbots for logical reasoning, coding, or mathematics, DeepSeek's outputs may be worth trying for comparison.
Chinese AI startup DeepSeek's release of new AI models spurred a selloff in U.S. tech stocks, but some investors think the competitive concerns may be overblown.