In my opinion, sites that want agent access should expose server-side MCP: the server owns the tools, with no browser middleman. That already works today.
Sites that don’t want it will keep blocking. WebMCP doesn’t change that.
Your point about Selenium is absolutely right. WebMCP is an unnecessary standard: the same developer effort as server-side MCP, but routed through the browser, creating a copy that drifts from the actual UI. For the long tail that won’t build any agent interface, the browser should just get smarter at reading what’s already there.
Not something new. They were recording audio in the Facebook app and Messenger for the longest time without people knowing, using the microphone. They were tracking people using network data. The list is pretty long.
Facial recognition would be able to detect all the strangers around you, whereas audio would surely only pick up people near the device, and presumably wouldn't be able to tell people apart or identify them. You're right about network data: if they're using Wi-Fi/BT probes, then they can already find and identify everyone in the vicinity.
I'm curious why you're using past tense by the way?
Oh yeah, I agree it’s bad. I just meant the company has no morals. It is very data hungry and doesn’t care about people’s privacy.
And with respect to past tense: I don’t know if they still do these things now that they’ve been caught red-handed on some of them. Unless there is a court order, I’m sure they still do, but I have no proof.
That's a good question. I would recommend MCP for the bulk of 'chatty' soft data to keep the database clean. However, you should selectively ingest 'high value' data into ClickHouse for vector search.
For example, you wouldn't ingest every 'good morning' message. But once an incident is resolved, you could ETL specific threads (filtering out the noise) and the resulting RCA into ClickHouse as a vectorized document. That way, the copilot can recall the solution six months later without depending on Slack.
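A minimal sketch of that selective-ingestion step: filter the noise out of a resolved thread, then package what's left with the RCA as one document ready for embedding and insertion into ClickHouse. All names, the noise list, and the word-count threshold here are hypothetical, not from any particular pipeline.

```python
# Hypothetical filter for one-word acknowledgements and greetings.
NOISE = {"good morning", "thanks", "ok", "+1", "lgtm"}

def is_substantive(message: str) -> bool:
    """Keep a message only if it isn't a known pleasantry and
    carries more than a couple of words."""
    text = message.strip().lower().rstrip("!.")
    return text not in NOISE and len(text.split()) > 2

def build_incident_doc(thread: list[str], rca: str) -> str:
    """Join the filtered thread with the RCA into a single document.
    The result is what you'd embed and insert into ClickHouse."""
    kept = [m for m in thread if is_substantive(m)]
    return "\n".join(kept + ["RCA: " + rca])

thread = [
    "good morning",
    "DB latency spiked after the 09:40 deploy",
    "+1",
    "Rolling back the connection-pool change fixed it",
]
doc = build_incident_doc(thread, "Pool size regression in the deploy")
```

The embedding and the actual `INSERT` into a ClickHouse table with a vector column would happen downstream of `build_incident_doc`; the point is that only the distilled document, not the raw thread, ever reaches the database.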
The interesting part is that only one of them is software-only. I get that it's economists, but I guess it's also telling that while AI is the talk of the town in Silicon Valley, long term it's just one part of the future, a minor one if I may say so.
We are a service that helps brands navigate the new world of AI agents. We're currently focused on helping them increase visibility in AI search, but we plan to go beyond that.
> Unlike most prompt injections, the researchers said Shadow Leak executed on OpenAI’s cloud infrastructure and leaked data directly from there. This makes it invisible to standard cyber defenses, they wrote.
Wrote about it here: https://open.substack.com/pub/manveerc/p/webmcp-false-econom...