Insider Brief

Researchers at Technische Universität Berlin report that prompting large language models to reason more like humans significantly improved their ability to provide medical care-seeking advice, according to a study published in JMIR Biomedical Engineering. The study focused on a growing problem surrounding AI health tools such as ChatGPT: the tendency to recommend […]
If you haven’t heard of Arm, you haven’t been paying attention to how ubiquitous the chipmaker has become. Arm’s processor designs power Macs, iPhones, and every other major smartphone line. Queries made through ChatGPT, Gemini, or Claude pass through an Arm-based chip at some point.
For more than 40 years, Arm’s focus was on chip design. Major device and AI chip makers then licensed those designs and turned them into hardware.
But the company’s focus is changing: Arm is now making hardware built on its own CPUs, chips that OpenAI and Meta will use and that will put the chipmaker itself in competition with the likes of Apple, Intel, Nvidia, Amazon, and Google.
Arm envisions its new Performix software suite using “recipes” and AI insights to help engineers identify suspect code and CPU hotspots.
Alex Spinelli, who leads Arm’s software initiatives as senior vice president for AI and developer platforms, is as AI-native an engineer as you’ll find; he played a central role in the TensorFlow st […]
The lawsuit against OpenAI could intensify scrutiny on AI safety protocols, potentially influencing future regulations and legal frameworks.
The post OpenAI faces federal lawsuit over ChatGPT’s role in FSU shooting appeared first on Crypto Briefing.
In brief

- A federal lawsuit alleges that OpenAI’s ChatGPT provided firearms guidance and tactical advice to the gunman in the April 2025 Florida State University mass shooting that killed multiple victims.
- The family claims ChatGPT failed to detect threat indicators despite extensive conversations about weapons, mass shootings, and attack planning methods.
- Florida’s Attorney General has launched a criminal investigation into OpenAI’s role in the incident.

Vandana Joshi, whose husband was killed in the April 2025 Florida State University mass shooting, filed a federal lawsuit against OpenAI Sunday, alleging that ChatGPT enabled the attack by providing firearms guidance and tactical advice to the shooter. The lawsuit alleges Phoenix Ikner shared images of firearms with ChatGPT and received instructions on how to use them in the weeks before April 17, 2025. According […]
OpenAI has introduced a Trusted Contact feature for ChatGPT that allows users to designate a friend or family member to receive automated alerts if conversations suggest self-harm risk. The company said human reviewers aim to assess safety notifications within one hour before deciding whether to contact the designated person via email, text, or in-app message. […]
Journalist Jamie Bartlett on the people trying to get AI to say things it shouldn’t … for the safety of us all
All the major AI chatbots – from ChatGPT to Gemini to Grok to Claude – have things they should and shouldn’t say.
Hate speech, criminal material, exploitation of vulnerable users – all of this is content that the most successful large language models in the world shouldn’t produce, that their safety features should guard against.