Will Web Browsing Last?
Affordances for Flesh and Blood
Humans are visual animals, wired for survival through pattern recognition and storytelling. Our brains devote massive resources to processing images—shapes, colors, motion—far faster than text. Pictures encode into memory quicker, stick longer (thanks to the picture superiority effect), and reduce cognitive load by hooking emotion and imagination in ways raw data never could.
Visual patterns and intuitive flows aren't optional; they're how we learn, navigate, and remember. As designers of digital experiences, we shape those cues—hierarchies, affordances, gentle nudges—that dramatically boost comprehension and comfort.
The World Wide Web itself grew from this same human impulse: in 1989 at CERN, Tim Berners-Lee proposed a simple hypertext system to let scattered researchers link and follow thoughts across documents. The hyperlink was empathy in code: a quiet "come this way."
Then came the rest—grids, gradients, micro-interactions, skeleton screens, the endless parade of affordances that whisper "you are safe here, keep scrolling." Every pixel was calibrated to human psychology: finite attention, pattern recognition, the comfort of familiarity. UX was translation; the machine spoke JSON, we needed poetry.
That whole cathedral of visual intuition only exists because the reader is flesh and blood.
Affordances for Silicon Minds
Now imagine the extreme: agents become the sole users of every digital service. No humans scanning screens, no visual comfort required—just autonomous entities chaining actions at machine speed.
Their ideal interface discards every human crutch. No pixels, no hover states, no emotional cues. The affordances shift to pure invocation: discoverable capabilities, predictable typed contracts, composable primitives, and adaptive semantics that enable seamless chaining, negotiation, and error recovery. The "interface" invites orchestration at scale, not wandering or intuition.
We're already glimpsing what this looks like in today's best practices for agent tool calls and in protocols like MCP. These emerging patterns expose a vast gap between agent-native design and modern APIs: REST and GraphQL remain rigid, human-oriented, and documentation-dependent, while agent-native interfaces prioritize dynamic discovery, token-efficient responses, and resilient execution.
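To make the contrast concrete, here is a minimal sketch of an agent-facing capability, loosely modeled on MCP-style tool declarations. The tool name, fields, and the `validate_call` helper are all invented for illustration; real clients would use a full JSON Schema validator rather than this simplified check.

```python
import json

# Hypothetical agent-facing tool contract: the capability describes itself,
# its inputs form a typed schema, and the result shape is predictable enough
# for an agent to chain calls without reading human documentation.
FLIGHT_SEARCH_TOOL = {
    "name": "search_flights",          # discovered by listing, not by docs
    "description": "Find flights between two airports on a given date.",
    "inputSchema": {                   # typed contract the agent can validate
        "type": "object",
        "properties": {
            "origin": {"type": "string", "description": "IATA code, e.g. SFO"},
            "destination": {"type": "string"},
            "date": {"type": "string", "format": "date"},
        },
        "required": ["origin", "destination", "date"],
    },
}

def validate_call(tool: dict, args: dict) -> list[str]:
    """Minimal pre-flight check an agent might run before invoking:
    which required fields are missing? Returns [] when the call is valid."""
    return [k for k in tool["inputSchema"]["required"] if k not in args]

# The agent discovers the tool, validates its own call, then invokes it.
print(validate_call(FLIGHT_SEARCH_TOOL, {"origin": "SFO", "date": "2025-06-01"}))
# → ['destination']
```

The point of the sketch is the affordance shift: nothing here is rendered, hovered, or scrolled; the interface is a self-describing contract that invites validation and composition at machine speed.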
In this world the web dissolves completely into pure invocation—headless, schema-first, invoked rather than visited. The visual layer is reduced to legacy overhead.
Yet humans still need to delegate intent to these silicon minds.
But where? And how?
The Ultimate Human Surface
We still sit at the apex, the originators of intent. Delight and regret remain ours alone. The machine, for all its silicon precision, needs a bridge—not for its own sake, but to meet us where we are.
What must this final interface do? First, it must channel human will into bytes with minimal friction. The fastest protocol we have is language itself: natural, ambiguous, rich with context and nuance. Speak, type, or think aloud, and the intent flows—unfiltered by menus or forms, translated seamlessly into agentic action.
Second, it must summon a dynamic canvas: a flexible space where the agent conjures exactly the UI needed for the moment. No more, no less. For a flight booking, a calendar grid emerges; for data scrutiny, an interactive table; for creative exploration, a sketchpad or branching tree. This isn't a static app—it's emergent, adaptive, molding to the task at hand. Every digital flow that ever was or will be—editing, comparing, deciding—manifests precisely when required, interactive yet ephemeral.
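The "dynamic canvas" idea can be sketched as an intent-to-surface mapping: instead of routing the user to a fixed app, the agent returns a declarative UI spec the client renders and then discards. Every component name and the mapping itself are invented here, purely to illustrate the shape of the idea.

```python
# Hypothetical sketch of an emergent canvas: the agent conjures exactly the
# UI the moment needs, expressed as data rather than as a pre-built screen.
def surface_for(intent: str) -> dict:
    """Map a classified user intent to an ephemeral, renderable UI spec."""
    if intent == "book_flight":
        return {"component": "calendar_grid", "props": {"selectable": "dates"}}
    if intent == "inspect_data":
        return {"component": "interactive_table", "props": {"sortable": True}}
    if intent == "explore_ideas":
        return {"component": "branching_tree", "props": {"editable": True}}
    # Fall back to plain conversation when no richer surface would help.
    return {"component": "chat", "props": {}}

print(surface_for("book_flight")["component"])  # → calendar_grid
```

The design choice that matters is that the UI is a return value, not a destination: interactive while the task lasts, gone when it ends.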
We never chase the UI again.
The UI comes to us.
The end of one-for-all: the age of static interfaces that force users to adapt.
The beginning of all-for-one: a living canvas that meets us where we stand, learns our habits, and personalizes every interaction.
Intent spoken; the surface arrives, adapts, remembers, and awaits the next call.
UX ceases to be a fixed frame.
It becomes a mirror—reflecting, adapting, anticipating.