Foundation models excel at general tasks, but customizing them with business knowledge is crucial. Amazon introduces reinforcement fine-tuning for its Nova models, shifting from an imitation-based to an evaluation-based learning paradigm and offering tailored solutions for code generation and math reasoning.
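The imitation-versus-evaluation distinction can be made concrete with a toy grader: instead of matching reference answers token-by-token, each model output receives a reward from a scoring function. This is a minimal sketch of that signal only; `grade_math_answer` is a hypothetical grader, not the Nova RFT API.

```python
# Evaluation-style training signal: score outputs, don't imitate them.
# (Hypothetical grader for illustration; the Nova RFT interface differs.)

def grade_math_answer(model_output: str, expected: str) -> float:
    """Return 1.0 when the final answer is exactly right, else 0.0."""
    return 1.0 if model_output.strip() == expected.strip() else 0.0

rewards = [
    grade_math_answer(" 42 ", "42"),  # whitespace is ignored
    grade_math_answer("41", "42"),
]
print(rewards)  # → [1.0, 0.0]
```

In practice graders are richer (unit tests for code, partial credit for reasoning steps), but the loop is the same: sample, score, update toward higher-reward outputs.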
Modern large language models (LLMs) face rising inference costs as context lengths grow. AWS's new LMCache support delivers cost reductions and performance gains for long-context inference workloads, transforming how organizations handle repetitive data "hot spots."
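The "hot spot" idea is that repeated prompt prefixes (a shared system prompt, a common document) pay the expensive prefill cost only once. A toy sketch of that caching pattern, assuming a hypothetical `PrefixKVCache` class; LMCache's real API and storage tiers are not shown here:

```python
import hashlib

class PrefixKVCache:
    """Toy KV-cache reuse for repeated prompt prefixes (illustrative only)."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, prefix_tokens):
        return hashlib.sha256(str(prefix_tokens).encode()).hexdigest()

    def get_or_compute(self, prefix_tokens, compute_kv):
        key = self._key(prefix_tokens)
        if key in self._store:
            self.hits += 1           # reuse: no prefill needed
            return self._store[key]
        self.misses += 1
        kv = compute_kv(prefix_tokens)  # expensive prefill runs only once
        self._store[key] = kv
        return kv

cache = PrefixKVCache()
system_prompt = [101, 2023, 2003]  # shared "hot spot" prefix (token IDs)
for _ in range(3):
    cache.get_or_compute(system_prompt, lambda toks: {"kv": len(toks)})
print(cache.hits, cache.misses)  # → 2 1
```

Real systems cache actual attention key/value tensors and can spill them to CPU memory or disk, but the hit/miss economics are the same.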
Workers report feeling devalued by AI and fear a decline in the quality of their work. An IMF analysis predicts AI will affect 40% of jobs globally, likening the impact to a labor-market tsunami.
AI is accelerating COBOL modernization, but success requires complete context and platform-aware input for forward engineering. Mainframe modernization hinges on reverse engineering and a traceable foundation for AI coding assistants.
Multi-LoRA serving lets organizations efficiently share GPU capacity across fine-tuned variants of MoE models like GPT-OSS. Amazon's optimizations also improve performance for hosting dense models.
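The reason multi-LoRA serving shares GPU capacity so well is that every tenant reuses one base weight matrix and stores only two small low-rank factors. A minimal NumPy sketch of that arithmetic, with made-up tenant names and dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 16, 16, 4

# One shared base weight matrix serves every tenant.
W_base = rng.standard_normal((d_in, d_out))

# Each adapter stores only two small factors (A, B) instead of a full matrix.
adapters = {
    name: (rng.standard_normal((d_in, rank)) * 0.01,
           rng.standard_normal((rank, d_out)) * 0.01)
    for name in ("tenant_a", "tenant_b")
}

def forward(x, adapter_name):
    A, B = adapters[adapter_name]
    # Base projection plus the adapter's low-rank correction: x @ (W + A @ B)
    return x @ W_base + (x @ A) @ B

x = rng.standard_normal((1, d_in))
out_a = forward(x, "tenant_a")
out_b = forward(x, "tenant_b")
```

Per adapter this costs `(d_in + d_out) * rank` parameters (128 here) versus `d_in * d_out` (256) for a full fine-tuned layer, and the gap widens rapidly at real model sizes.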
Meta's AI moderation software is inundating US ICAC (Internet Crimes Against Children) task forces with low-quality reports, hindering child-abuse investigations. A New Mexico lawsuit alleges Meta prioritizes profits over child safety, while the company defends the changes it has made to platform protections.
MIT researchers developed a method to accelerate training of large language models by putting idle processors to work: a smaller model is trained to predict the outputs of a larger one, doubling training speed without sacrificing accuracy.
Researchers at the Broad Institute of MIT and Harvard and ETH Zurich/Paul Scherrer Institute developed an AI framework that analyzes cell data from different measurements to provide a holistic view, aiding in understanding diseases like cancer and Alzheimer's. Lead author Xinyi Zhang emphasizes the importance of combining multiple measurement modalities to gain a fuller picture of a cell's state.
GenAI models often lack an understanding of physics, leading to impractical 3D designs. MIT's PhysiOpt system incorporates physics simulations to produce structurally sound objects, letting users create unique, functional items with ease.
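Folding physics into generative design can be pictured as adding a penalty term: candidates are scored on the design objective, then penalized when a simulation says the part would fail. A toy version with a crude one-line "stress model" standing in for a real simulation; none of this reflects PhysiOpt's actual formulation.

```python
# Toy physics-aware design scoring (illustrative stand-in, not PhysiOpt).

def stress(thickness_mm, load_n=100.0):
    # crude stand-in for a physics simulation: thinner parts see more stress
    return load_n / thickness_mm

def score(thickness_mm, max_stress=40.0):
    aesthetic = -abs(thickness_mm - 2.0)   # designer prefers ~2 mm walls
    penalty = max(0.0, stress(thickness_mm) - max_stress) * 10.0
    return aesthetic - penalty

candidates = [1.0, 2.0, 3.0, 4.0]
best = max(candidates, key=score)
print(best)  # → 3.0 (the 2 mm favorite fails the stress check)
```

The aesthetically preferred 2 mm design is rejected because it exceeds the stress limit, illustrating how a simulation term steers generation toward structurally sound outputs.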
Nvidia continues to exceed Wall Street's expectations, with higher-than-expected revenues from its data-center business driven by AI infrastructure investments. The chipmaker's market dominance is underlined by 75% year-over-year growth and a staggering $120bn in total profit for the fiscal year.
AI assistants at events often lack personalized guidance. Amazon Bedrock AgentCore enables rapid deployment of intelligent event assistants that enhance attendee experiences.
Tech-equity campaigners criticize the government for involving private tech companies in AI deployment. Ministers have consulted Tony Blair's thinktank, companies including IBM and Accenture, and former Google and Facebook executives.
The US is struggling with delays and cancellations of new datacenters amid the AI boom, owing to supply-chain issues, energy shortages, and local opposition. Investor caution about a potential AI bubble is weighing on infrastructure expansion.
NVIDIA's survey shows healthcare embracing AI for medical imaging, drug discovery, and cost reduction, with open-source software and agentic AI gaining traction. AI adoption is rising across all healthcare sectors, with digital healthcare leading at 78% and generative AI the top workload.
Facebook owner Meta is buying $60bn of AI chips from AMD, part of a $660bn US tech AI spending trend and a 'big bet' on artificial intelligence. An analyst suggests the deal may signal a pivot in Meta's AI strategy.