7777777phil · 4 days ago
API prices dropped 97% in two years, so the model layer is already a commodity. The question is which context layer actually sticks. The OpenClaw example in the article (400K lines to 4K) is a nice proof point for what happens when context replaces code. I've been arguing for some time that the durable asset is the "organizational world model": the accumulated process knowledge unique to each company, which is genuinely hard to replicate. I did a full "report" on the six-layer decomposition here: https://philippdubach.com/posts/dont-go-monolithic-the-agent...
steveBK123 · 6 hours ago
The way many corporates are using the models nearly interchangeably as relative quality/value shifts release to release, plus the API price drops, makes me question what the model moat even is. If LLMs are going to make intelligence a commodity in some sense, the question is where the value ends up accruing. Picks-and-shovels companies and all the end-user products being delivered?

The mainframe's value didn't primarily accrue to DEC. The PC's value didn't really accrue to IBM. The internet's value didn't accrue to Netscape. Mobile's value didn't accrue only to Apple.

One reminder: new efficiency / greatly lowered costs sometimes doesn't replace work (or at least not 1:1) but simply makes things possible that were never economical. For example, you hear about AI agents that will basically behave like a personal assistant. 99% of the rich world cannot afford a human personal assistant today, but I'd guess they'd use one if it came as part of their Apple Intelligence / Google something / Office365 subscription.

We seem to be continually creating new types of jobs. Only a few generations ago, 75% of people worked on farms. Farm jobs still exist; you just don't need as many people. The type of work my father and grandfather did still exists, but my father's job didn't really exist in his father's time, and the work I do didn't exist as an option during their careers. The next generation will be doing some other type of work for some other type of company that hasn't been imagined yet.
apsurd · 6 hours ago
From your link:

> Closing that gap, building systems that capture and encode process knowledge rather than just decision records, is the highest-value problem in enterprise AI right now.

I buy this. But what exactly is the export artifact that encodes this built-up context? Is it the entire LLM conversation log? My casual understanding of MCP is service/agent-to-agent "just in time" context, which is different from "world model" context; is that right? I'm curious whether there's an entirely new format for this data evolving, or if it's as blunt as exporting the entire conversation log, or embeddings of the log, across AIs.
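For what it's worth, the bluntest version of that artifact is the chunked conversation log itself, retrieved just-in-time per task. A minimal sketch of the idea, with plain keyword overlap standing in for embedding similarity (all names and the sample log are illustrative, not any real product's format):

```python
from collections import Counter

def chunk_log(log: str, max_lines: int = 2) -> list[str]:
    """Split a raw conversation log into small retrievable chunks."""
    lines = [l for l in log.splitlines() if l.strip()]
    return [" ".join(lines[i:i + max_lines]) for i in range(0, len(lines), max_lines)]

def score(query: str, chunk: str) -> int:
    """Toy relevance score: shared-token count (a stand-in for an embedding dot product)."""
    q, c = Counter(query.lower().split()), Counter(chunk.lower().split())
    return sum(min(q[t], c[t]) for t in q)

def just_in_time_context(query: str, log: str, k: int = 2) -> list[str]:
    """Return the k log chunks most relevant to the current task."""
    chunks = chunk_log(log)
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

# Illustrative accumulated process knowledge, captured as a conversation log.
log = """user: how do we invoice EU customers?
assistant: VAT is added per the reverse-charge rule.
user: and refunds?
assistant: refunds go through the finance queue, 5 day SLA."""

print(just_in_time_context("EU invoice VAT", log, k=1))
```

In practice the scoring would be embeddings and the chunks would carry metadata, but the export artifact itself can start out this crude: the log, chunked, plus a retrieval index over it.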
mlcruz · 3 hours ago
Hi Phil, your article is great! As someone working in this space, your points just improved our presentation and pitch a lot. We've been talking with C-level finance executives about building semantic layers, and I can confidently say that the way you presented the value proposition of the context layer is going to improve our conversion rates. Thank you so much! This is one of the best analyses I've heard on the subject.
energy123 · 5 hours ago
It's not a commodity, for the simple reason that the revenue run rates of the frontier labs are growing exponentially and gross margins are still fine. It's easy to just declare it a commodity, but reality keeps violating that narrative.
martin_drapeau · 5 hours ago
100%. We're currently integrating an AI assistant with read tools (Retrieval-Augmented Generation, or RAG, as they say). Much of the policy we're writing provides context: what the entities are and how they relate. Projecting ahead to when we add write tools, context is everything.
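The "read tools" pattern can be sketched very simply: each tool only fetches context (entity definitions and their relations) and mutates nothing, and the assistant assembles what it read into a prompt preamble. A minimal illustration under assumed names and toy data (nothing here is a real product's API):

```python
# Illustrative entity registry: what the entities are and how they relate.
ENTITIES = {
    "invoice": {"description": "A bill issued to a customer.", "relates_to": ["customer", "payment"]},
    "customer": {"description": "A party that purchases services.", "relates_to": ["invoice"]},
    "payment": {"description": "Settlement of an invoice.", "relates_to": ["invoice"]},
}

def read_entity(name: str) -> dict:
    """Read tool: fetch one entity's definition and relations. No side effects."""
    return ENTITIES.get(name, {"description": "unknown entity", "relates_to": []})

def build_context(question: str) -> str:
    """Assemble retrieved entity context into a prompt preamble for the model."""
    mentioned = [e for e in ENTITIES if e in question.lower()]
    lines = [
        f"{e}: {read_entity(e)['description']}"
        f" (relates to: {', '.join(read_entity(e)['relates_to'])})"
        for e in mentioned
    ]
    return "Context:\n" + "\n".join(lines)

print(build_context("Why is this invoice unpaid by the customer?"))
```

Write tools would follow the same shape but with side effects, which is exactly why the retrieved context has to be right before you add them.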