Meta develops 'Hatch' AI agent for Instagram shopping, aims to rival TikTok Shop

Key Points
- Meta is developing an AI agent called Hatch, inspired by OpenClaw.
- Hatch will let Instagram users buy items seen in Reels with a single command.
- The prototype has been tested on simulated versions of DoorDash, Reddit and Outlook.
- Meta plans to use its own Muse Spark model for the agent after early Anthropic testing.
- Launch is targeted for the end of the year, aiming to compete with TikTok Shop.
- CFO Susan Li hinted at future integration of agents with Ray‑Ban Meta glasses.
- Hiring attempts include the OpenClaw creator (who joined OpenAI) and Moltbook founders.

Meta is building an AI assistant named Hatch, modeled after the open‑source platform OpenClaw, to let users shop directly from Instagram Reels and interact with third‑party services such as DoorDash and Outlook. The company tested the prototype on simulated versions of external apps and plans to roll it out before the end of the year, positioning it as a counter to TikTok Shop. Meta CEO Mark Zuckerberg highlighted the goal of making agents that understand user goals and work continuously on their behalf, while CFO Susan Li hinted at future integration with the firm’s Ray‑Ban Meta glasses.
During the company’s earnings call last week, Meta CEO Mark Zuckerberg announced that the firm is developing new AI agents for both consumers and businesses on its platforms. A follow‑up report from The Information revealed that the project, internally dubbed "Hatch," draws inspiration from the open‑source OpenClaw framework and is being built to operate inside Meta’s own apps, most notably Instagram.
Hatch is designed to make shopping on Instagram more seamless. By tapping the agent, users could purchase items they see in Reels with a single command, a capability Meta hopes will help it close the gap with TikTok Shop. The move follows a recent policy allowing creators to tag up to 30 products in a single video, laying the groundwork for a more robust in‑app commerce experience.
The prototype has already been tested on simulated versions of popular third‑party services, including DoorDash, Reddit and Microsoft Outlook. While the report does not detail how Hatch will integrate with services that Meta does not own, the testing suggests the agent could act as a bridge between the social platform and external tools, handling tasks like food delivery orders or email management on behalf of users.
Zuckerberg emphasized that the goal is to deliver agents that "understand your goals and then work day and night to help you achieve them." He noted that OpenClaw, while "exciting," remains too complex for most people to set up, prompting Meta to pursue a more user‑friendly solution. The company reportedly evaluated Anthropic models for the early stages of Hatch, with plans to eventually run the agents on Meta’s own Muse Spark model.
Meta’s ambitions may extend beyond smartphones. In a recent analyst call, CFO Susan Li described the Ray‑Ban Meta glasses as offering "the best form factor for agentic interactions," though she cautioned that the technology is still in its infancy. If successful, the glasses could provide a hands‑free interface for agents like Hatch, further blurring the line between augmented reality and everyday digital assistants.
The rollout is not expected to happen immediately. Sources say Meta is targeting a launch toward the end of the year, giving the company time to refine the technology and integrate it with its broader ecosystem. Hiring efforts underscore the seriousness of the push: Meta tried to recruit the creator of OpenClaw, who ultimately joined OpenAI instead, though it did bring on the founders of Moltbook, a short‑lived AI‑agent forum.
While the exact timeline remains uncertain, Hatch represents Meta’s most concrete step yet toward embedding AI agents into its social and commerce products. If the agent can deliver on its promise of effortless, goal‑driven assistance, it could reshape how users interact with both Meta’s own services and the wider internet.