Anthropic Announces Claude’s New Computer-Use Capabilities with Built‑In Safeguards

Key Points
- Anthropic adds a computer‑use feature that lets Claude interact directly with a user's desktop.
- Safeguards are built in to block risky actions such as moving money, modifying files, or handling sensitive data.
- Access to certain high‑risk apps, like investment and cryptocurrency platforms, is restricted by default.
- Anthropic advises users to start with trusted applications and avoid using sensitive information during the preview.
- The launch follows similar desktop‑control tools from Perplexity, Manus, and Nvidia.
- OpenClaw’s viral popularity earlier this year led OpenAI to hire its creator to develop next‑gen personal agents.
Anthropic introduced a computer‑use feature for its Claude AI model, allowing the system to interact directly with a user's desktop. The company emphasized a set of safeguards designed to block risky actions such as moving money, modifying files, or accessing sensitive data, though it warned that these protections are not absolute. Users are advised to start with trusted applications and avoid handling sensitive information during the preview phase. Anthropic’s rollout follows similar moves by Perplexity, Manus, and Nvidia, and comes after the viral spread of OpenClaw, which prompted OpenAI to hire its creator to advance personal agents.
Anthropic’s New Computer‑Use Feature for Claude
Anthropic has added a capability that lets its Claude AI model take control of a computer’s desktop environment. When activated, Claude can see everything displayed on the screen, including personal data, sensitive documents, and private information. This functionality is part of a research preview that aims to explore how AI agents can assist users by performing tasks directly on their machines.
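This article covers the consumer-facing preview, but Anthropic also exposes a computer-use tool through its developer Messages API, which gives a sense of how the capability works under the hood. The following is a minimal sketch using the Python SDK, assuming the beta `computer_20250124` tool type and a placeholder model id; details may differ from the desktop feature described here.

```python
import anthropic

# Reads ANTHROPIC_API_KEY from the environment.
client = anthropic.Anthropic()

response = client.beta.messages.create(
    model="claude-sonnet-4-5",            # placeholder; any computer-use-capable model id
    max_tokens=1024,
    tools=[
        {
            "type": "computer_20250124",   # beta computer-use tool definition
            "name": "computer",
            "display_width_px": 1280,
            "display_height_px": 800,
        }
    ],
    messages=[
        {"role": "user", "content": "Open the quarterly report on the desktop and summarize it."}
    ],
    betas=["computer-use-2025-01-24"],
)

# Claude replies with tool_use blocks (screenshot, click, type, ...); the calling
# application must execute each one itself and feed the results back in a loop.
for block in response.content:
    print(block.type)
```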
Built‑In Safeguards and Limitations
The company stresses that Claude ships with multiple safeguards intended to block “risky operations” such as moving or investing money, modifying files, scraping facial images, or entering sensitive data. Anthropic also notes that certain “off‑limits” applications, including investment and trading platforms and cryptocurrency services, are restricted by default. Even so, the company acknowledges that these protections are not perfect and that Claude may occasionally act outside the intended boundaries.
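Anthropic has not described how these checks are implemented. Purely as an illustrative sketch, with all category and action names below hypothetical, a deny-by-default policy over restricted app categories and risky operations could be modeled like this:

```python
# Hypothetical illustration of a deny-by-default policy; not Anthropic's implementation.
RESTRICTED_APP_CATEGORIES = {"investment", "trading", "cryptocurrency"}
RISKY_OPERATIONS = {"transfer_money", "modify_file", "enter_sensitive_data"}

def action_allowed(app_category: str, operation: str) -> bool:
    """Block off-limits app categories and risky operations; allow everything else."""
    if app_category in RESTRICTED_APP_CATEGORIES:
        return False
    if operation in RISKY_OPERATIONS:
        return False
    return True

print(action_allowed("cryptocurrency", "read_screen"))   # False: app category restricted by default
print(action_allowed("spreadsheet", "transfer_money"))   # False: risky operation blocked
print(action_allowed("spreadsheet", "read_screen"))      # True
```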
Guidance for Users During the Preview
Anthropic recommends that users begin by granting Claude access only to applications they trust and avoid using the feature with sensitive data. This precaution is intended to mitigate potential exposure of private information while the technology is still being evaluated. The company’s support page reinforces the advice to start with trusted apps and to be cautious about the types of data Claude can see.
Industry Context and Comparable Efforts
The announcement arrives shortly after a wave of similar offerings from other technology firms. Perplexity introduced a Personal Computer feature, Manus launched My Computer, and Nvidia released NemoClaw, each allowing AI agents to control a desktop directly. These developments follow the viral spread of OpenClaw earlier in the year, which demonstrated the appeal of personal AI agents capable of interacting with a user’s computer. In response to that momentum, OpenAI hired the creator of OpenClaw, Peter Steinberger, to lead the next generation of personal agents.
Implications and Outlook
Anthropic’s rollout reflects a broader industry trend toward integrating AI assistants more tightly with everyday computing tasks. By coupling advanced language models with direct desktop control, companies aim to create more seamless and productive user experiences. At the same time, the emphasis on safeguards highlights ongoing concerns about privacy, security, and the potential for unintended actions by AI systems. As the research preview progresses, Anthropic and other firms will likely continue refining these features to balance utility with safety.