Mira Murati's Thinking Machines Unveils Real‑Time "Interaction Models" for AI Collaboration

The Verge

Key Points

  • Thinking Machines announces development of "interaction models" that process audio, video and text in real time.
  • Current AI models operate in a single‑threaded mode, pausing until users finish input.
  • New models aim to eliminate the perception freeze, enabling continuous, multimodal collaboration.
  • Demo videos show live animal‑mention detection, real‑time speech translation, and posture alerts.
  • Limited research preview planned for the coming months; broader release targeted for later this year.
  • Founder Mira Murati launched the startup in February 2025 after departing OpenAI; it has since faced notable staff departures.
  • Potential applications span education, remote work, accessibility and beyond.

Thinking Machines, the artificial‑intelligence startup founded by former OpenAI CTO Mira Murati, announced Monday that it is developing "interaction models"—systems that process audio, video and text simultaneously and respond in real time. The company says current AI models operate in a single‑threaded fashion, creating a bottleneck that limits natural human‑AI collaboration. Murati’s team showcased the new tech with demos ranging from live animal‑mention detection to real‑time speech translation and posture alerts. A limited research preview is slated for the coming months, with a broader release expected later this year.

Thinking Machines, the AI venture launched by ex‑OpenAI chief technology officer Mira Murati, revealed on Monday that it is building what it calls "interaction models." The firm describes these models as capable of ingesting audio, video and text streams at once, then thinking, responding and acting without the pauses that characterize today’s generative systems.

Current models, according to the company, wait for a user to finish speaking or typing before they generate a response. During that pause, the model’s perception freezes, missing any new cues. "That creates a narrow channel for human‑AI collaboration," the company wrote, likening it to trying to resolve a heated debate over email instead of face‑to‑face.

Interaction models aim to eliminate that bottleneck. By staying aware of a conversation in real time, they can adapt to shifts in tone, gesture or context as they happen. The approach, Murati said, lets AI meet people where they are rather than forcing users to contort themselves to fit the AI’s limited interface.
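The difference between the two modes can be sketched in a few lines. This is an illustrative toy only, not Thinking Machines' actual system or API: a turn-based function that cannot respond until the whole utterance has arrived, versus a streaming handler that reacts to each word as it is heard, mirroring the animal-mention demo described below. All names and the toy vocabulary are invented for the example.

```python
# Hypothetical sketch: turn-based vs. continuous ("interaction-model style")
# processing. Not Thinking Machines' real implementation.

ANIMALS = {"fox", "owl", "bear"}  # toy vocabulary for the animal-mention demo


def turn_based(transcript: str) -> list[str]:
    """Conventional model: perception 'freezes' until the full input arrives,
    then a single response is produced after the fact."""
    return [word for word in transcript.split() if word in ANIMALS]


def streaming(words):
    """Interaction-model style: handle each word the moment it is heard,
    emitting feedback mid-stream instead of waiting for the turn to end."""
    for word in words:
        if word in ANIMALS:
            yield f"mention: {word}"  # immediate alert, conversation continues


story = "the fox met an owl near the river"
print(turn_based(story))               # one batched answer at the end
print(list(streaming(story.split())))  # incremental alerts as words arrive
```

The structural point is the generator: `streaming` can interleave its output with ongoing input, which is the property the company says today's wait-for-the-prompt models lack.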

Live demos illustrate the promise

Thinking Machines shared several proof‑of‑concept videos. In one, the model listens to a storyteller and highlights every mention of an animal, demonstrating continuous auditory processing. Another clip shows the system translating spoken language on the fly, while a third alerts a participant when they begin to slouch, using visual cues to provide instant feedback. The demos underscore the company’s claim that multimodal, real‑time interaction can make AI feel more like a collaborative partner.

Murati, who founded Thinking Machines in February 2025 after departing OpenAI, acknowledged that the startup has already weathered significant staff turnover, with some key engineers leaving for Meta and others returning to OpenAI. "We’ve learned a lot about building resilient teams while pushing the frontier of AI," she said.

The company isn’t offering the technology to the public just yet. It plans a "limited research preview" in the coming months, targeting select partners who can help refine the models. A wider release is slated for later this year, though the company gave no firm date.

Industry observers note that real‑time, multimodal AI could open new applications in education, remote work and accessibility. If successful, interaction models might also pressure larger players to accelerate similar capabilities, potentially reshaping how developers integrate AI into everyday tools.

For now, Thinking Machines invites interested researchers to sign up for updates on its website. The firm promises more detailed technical documentation in the weeks ahead, offering a glimpse into a future where AI responds as fluidly as a human collaborator.

#ArtificialIntelligence #MachineLearning #MiraMurati #ThinkingMachines #InteractionModels #RealTimeAI #MultimodalTechnology #TechStartups #ResearchPreview #AICollaboration
