
Today, Anthropic announced Cowork. It’s refreshing to see a major AI company building a general-purpose local agent for a broader set of knowledge workers—whereas Claude Code has largely remained a power-user tool for a relatively small, geeky audience.
I’ve been thinking about what it takes for desktop-first, general local agents to really matter:
- Integrate deep local search. A single query should be able to retrieve not only text files, but also images and even video frames. With that kind of retrieval, a local “knowledge base” can go much deeper than typical cloud-centric setups.
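A toy sketch of what "one query across modalities" could look like: everything here (the class names, the hash-based stand-in for a real embedding model, the file paths) is hypothetical illustration, not any actual local-search API. The key idea is that text files, image captions, and video-frame descriptions all land in one shared vector space, so a single query ranks them together:

```python
import hashlib
import math

def embed(text: str, dim: int = 256) -> list[float]:
    # Toy stand-in for a real embedding model: hash each token into
    # a fixed-size bag-of-words vector, then L2-normalize.
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

class LocalIndex:
    """One index over mixed modalities: text files, image captions,
    and video-frame transcripts all map into the same vector space."""
    def __init__(self):
        self.items = []  # (path, modality, vector)

    def add(self, path: str, modality: str, description: str) -> None:
        self.items.append((path, modality, embed(description)))

    def search(self, query: str, k: int = 3) -> list[tuple[str, str]]:
        q = embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[2]), reverse=True)
        return [(path, modality) for path, modality, _ in ranked[:k]]

index = LocalIndex()
index.add("notes/roadmap.md", "text", "2025 product roadmap planning notes")
index.add("photos/whiteboard.jpg", "image", "whiteboard photo of roadmap sketch")
index.add("meetings/standup.mp4#t=120", "video-frame", "frame where roadmap slide is shown")
index.add("recipes/pasta.md", "text", "weeknight pasta recipe")

# One query returns hits across text, image, and video-frame entries.
print(index.search("roadmap"))
```

In a real system the `embed` function would be a local multimodal model and the descriptions would come from OCR, captioning, or transcription, but the retrieval shape stays the same.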
- Hardware upgrades are a recurring validation loop for local agents. As long as users continue to upgrade their computers, local agents will keep proving their value. People usually upgrade because (a) heavy local workloads are maxing out resources, or (b) the browser has become the “OS” for serious work and is itself consuming a lot of compute. Either way, it signals the same thing: a large amount of user context lives on-device. Even browser login sessions are effectively local context, not something the cloud inherently owns.
- Software-embedded AI can’t cross application boundaries. An AI feature inside one app rarely has the authority, context, or integration hooks to orchestrate workflows across other apps—and real work is fragmented across many tools.
- Likewise, OS-level AI can’t cross system boundaries. Even if the operating system gets smarter, it still can’t seamlessly operate across multiple operating systems or environments users rely on (e.g., separate machines, work vs. personal setups, VMs, remote desktops).
- The long-term moat is “data + execution,” and most of it stays opaque. Software vendors won’t expose their core data and execution logic as a white box. That means OS-level AI can’t truly “reach into” applications; it can only call public interfaces and consume outputs. Over time, those interfaces may evolve toward more structured, DSL-like or agent-friendly forms. Local calls can be simpler and more reliable than network calls—and the more workflows depend on those calls, the stronger the application’s moat becomes. That creates room for local agents like Cowork to thrive, especially if vendors increasingly optimize for local, agent-oriented integrations.
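To make the "structured, agent-friendly interface" idea concrete, here is a hypothetical sketch (the application, its commands, and the `describe`/`call` shape are all invented for illustration): the vendor keeps data and execution logic opaque, and a local agent can only discover and invoke the narrow command surface the app chooses to declare:

```python
import json

class InvoiceAppAgentSurface:
    """Hypothetical DSL-like command surface a local agent may call.
    Internal storage and business rules stay private to the vendor."""
    def __init__(self):
        # Opaque internals: never exposed directly to the agent.
        self._invoices = {"INV-1": {"total": 120.0, "status": "draft"}}

    def describe(self) -> str:
        # Machine-readable contract: the only thing the agent can rely on.
        return json.dumps({
            "commands": {
                "get_status": {"args": ["invoice_id"]},
                "send": {"args": ["invoice_id"]},
            }
        })

    def call(self, command: str, **args):
        # Only declared commands are callable; everything else is refused.
        if command == "get_status":
            return {"status": self._invoices[args["invoice_id"]]["status"]}
        if command == "send":
            self._invoices[args["invoice_id"]]["status"] = "sent"
            return {"ok": True}
        raise ValueError(f"undeclared command: {command}")

app = InvoiceAppAgentSurface()
agent_view = json.loads(app.describe())
print(sorted(agent_view["commands"]))          # ['get_status', 'send']
app.call("send", invoice_id="INV-1")
print(app.call("get_status", invoice_id="INV-1"))  # {'status': 'sent'}
```

Because these are in-process (or at least on-device) calls rather than network round-trips, they can be faster and more reliable, which is exactly why workflows built on them deepen the application's moat.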
- I’m skeptical of the “digital me” narrative. I don’t think AI will primarily become a perfect proxy acting on my behalf. Instead, I expect AI to operate more like an independent identity. Eventually, systems and platforms will have to treat AI as a first-class citizen. A simple intuition: even with my most trusted subordinate or my closest family member, I still wouldn’t share all my data. Full delegation is rarer than people assume.
- Local agents benefit heavy local work the most. People with substantial on-device workloads gain the most from them. In frontier tech companies, cloud-native workflows and online collaboration are more common, which can dilute some of the perceived value of local agents. Still, I think the overall value of local-first agents remains high.