The Cognition Stack for AI-native Companies & Why Sales, Support, & Social Are Converging

  • In contrast, the next-generation company will be run by a combination of inference engines (real-time computational reasoning that runs every function and drives actions across the business), leveraging a variety of pre-trained AI models and a deep store of proprietary data at the center of the company.

  • Old companies were designed to help people work efficiently together. New companies will be designed for cognition - the way a brain works. ChatGPT defines “cognition” as “the processes and activities involved in acquiring, processing, storing, and using knowledge. It encompasses a wide range of mental functions—such as perception, attention, learning, memory, language comprehension, reasoning, decision-making, and problem-solving…the ensemble of all the mental actions and processes through which information is perceived, integrated, and acted upon.”

  • Rules for obeying laws and regulations across different geographies, rules for adherence to company policies, rules for brand compliance, rules for pricing and business management, and the list goes on. I expect to see “rules engines,” delivered in the form of APIs, emerge and become an indispensable part of the cognition-driven company stack (if you know of a startup doing this, let me know!).
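    To make the idea concrete, here is a minimal sketch of what a "rules engine" delivered as an API might look like. Everything here (the `Rule` shape, the domain names, the sample rules) is illustrative, not a real product:

    ```python
    # Hypothetical sketch of a rules engine exposed as a simple API.
    # All names and rules are illustrative assumptions.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        name: str
        domain: str                      # e.g. "regulatory", "brand", "pricing"
        check: Callable[[dict], bool]    # returns True when the action complies

    def evaluate(action: dict, rules: list[Rule]) -> list[str]:
        """Return the names of rules the proposed action violates."""
        return [r.name for r in rules if not r.check(action)]

    rules = [
        # Example regulatory rule: a price floor that applies only in the EU.
        Rule("eu_price_floor", "regulatory",
             lambda a: a.get("region") != "EU" or a["price"] >= 10),
        # Example brand rule: never promise a "guarantee" in customer copy.
        Rule("brand_tone", "brand",
             lambda a: "guarantee" not in a.get("copy", "").lower()),
    ]

    violations = evaluate(
        {"region": "EU", "price": 8, "copy": "We guarantee results!"}, rules)
    print(violations)  # ['eu_price_floor', 'brand_tone']
    ```

    The point of the sketch is the shape: every AI-driven action gets checked against declarative rules (regulatory, brand, pricing) before it executes, and the engine returns which rules it would violate.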

  • Orchestrators: An “orchestration designer” is the evolution of a product leader and designer for the AI era who works above the nucleus. An “orchestration engineer” works within the nucleus. I call these people “orchestrators” because they primarily, if not exclusively, work with components that represent AI models, APIs, and very complex prompts. They are conductors of logic, leveraging or instrumenting the various components in a cognico’s nucleus. An orchestration designer understands the capabilities of pre-trained AI models for their node in particular, as well as their own company’s APIs and third-party software components that are ultimately stitched together to execute and optimize the functions performed in their node of responsibility.
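    A rough sketch of what "conducting logic" might mean in practice: composable components (a model call, an internal API, a prompt template) wired into a pipeline. The component names and the stand-in logic below are hypothetical placeholders, not real services:

    ```python
    # Hypothetical sketch of orchestration: models, APIs, and prompts
    # treated as composable components. All names are illustrative.
    from typing import Callable

    Component = Callable[[dict], dict]  # each node takes and returns a context dict

    def pipeline(*steps: Component) -> Component:
        """Chain components so each one's output context feeds the next."""
        def run(ctx: dict) -> dict:
            for step in steps:
                ctx = step(ctx)
            return ctx
        return run

    # Stand-ins for a model call, an internal API, and a prompt template:
    def classify_intent(ctx):
        return {**ctx, "intent": "support" if "help" in ctx["msg"] else "sales"}

    def fetch_account(ctx):
        return {**ctx, "account": {"tier": "pro"}}

    def draft_reply(ctx):
        return {**ctx, "reply": f"[{ctx['intent']}] Drafted for {ctx['account']['tier']} user"}

    handle = pipeline(classify_intent, fetch_account, draft_reply)
    result = handle({"msg": "I need help with billing"})
    print(result["reply"])  # [support] Drafted for pro user
    ```

    An orchestrator's work lives in choosing, ordering, and tuning these components, swapping a model here, a prompt there, rather than writing the low-level logic inside each node.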

  • Most importantly, orchestration designers have great taste. They will try creative things to unlock new edges in the market or attract attention through meaning and ingenuity.

  • They are wildly imaginative and able to explore all sorts of permutations of workflows, models, and components using orchestration tools. They are the final mile of human selection and decision-making that will ultimately help differentiate the output of every company competing to win in every industry.

  • They must set the overall goals. They must include leaders of each function charged with thinking about the future of that function in a creative way that models "trained on the average of what's been done before" cannot. They must gut-check every decision. They must declare the double bottom line for doing good in a world that tends to optimize for profit. They must also prioritize the parts of business that aren't intended to scale - the art of business.

  • Every company will find, engage, and serve its customers via agent-based experiences and contextual UI powered by AI. This "logic layer," powered by agents and user interfaces tailor-made for specific needs, is the ultimate interface layer (an obsession of mine since 2014).

  • If your customers are employees in companies, your services will likely be delivered as agents within customer workflows, or perhaps through new interfaces for conducting work that don't even exist yet.

  • While today we see top technical talent wanting to work at the major LLM model companies, I think we'll see the next wave of top talent flock to each industry to help drive these transformations. No doubt, the major LLM model companies are rapidly improving AND undercutting each other in price, accelerating the path to commoditization — especially given the rise of local and open-source models. What if too much attention (and capital) is being deployed to companies that MAKE the AI models, while the majority of value accrues to the first who use them in each industry?

  • Instead of paying for seats, we will pay for generative credits (essentially marked-up compute, or shall we call it "cooked compute?"), outcomes, and performance.

  • As for the companies that choose to stay small and efficient, we will see 10x or 100x more of them. There are so many products and services that people would want but that are far too niche to be built and deployed profitably. We're starting to see very small services emerge, from research and consulting to career support and production shops, founded and led by just a few people without the intention of ever scaling.

    Finally, let's talk about the return of crafts. In the era ahead, humans will crave scarce, authentic, and offline experiences more than ever before. We will crave small restaurant experiences with proud chefs. We will crave one-of-a-kind art infused with human story. We will crave theater and emotional films with deep meaning. We will crave shared experiences and live music. In the age of AI, there will be rampant demand for stuff that only humans can create.