  • This all made me wonder: as we come to trust AI more than ourselves, will its guidance and predictions become self-fulfilling prophecies? If AI tells us, based on everything it knows about our finances, family, actions, and values, that we should make a particular purchase decision, career move, or vote for a particular candidate, at what point does our trust become blind faith? Will there still be people willing to test the AI’s effectiveness by disregarding its advice?

    The interface matters: AI materializing as a chat product is why people trust it — engagement creates trust. But if AI were used as a form of creative capture rather than conversation, it would function differently: not as an oracle delivering prophecies but as an intelligent weaver of the user’s own ideas. The difference between self-fulfilling prophecy and genuine insight may depend on which interface wins.
  • So much of modern society is designed to suppress or channel our more destructive primal defaults, with mixed success. But as I considered the accelerating pace of AI model innovation and Jack Dorsey’s proposal that we all choose our own algorithms, I wondered when (not if) these AI-powered algorithms will become effective enough to override our factory default settings. When might these algorithms of persuasive content and doses of personalized social proof influence our hunger (say, convincing us to use supplements instead of naturally satiating our appetites)? When might algorithms influence culture and society’s definition of “beauty” faster than the biological factors that drive procreation (and evolution) via natural selection? When might other decisions we normally…

    The distinction between push and pull in technology design maps onto Dorsey’s proposal: art creation is push-based (the creator initiates), but media consumption is pull-based (the user requests). If consumption were also push-based — shaped by someone’s deliberate curation rather than algorithmic response — what people consume would align more closely with good storytelling. The more the media stream mirrors someone’s creative push, the less consumption feels like a primal reflex and the more it feels like the product of editorial intention.
  • But what we gain in eliminating human error, we lose in agency. Sure, we technically have a choice, but going against the guidance of AI will increasingly look reckless, if not self-destructive — much like turning off the headlights while driving at night.

    Taste is born of error — developed through critique and discourse, not through frictionless optimization. If AI eliminates the possibility of going wrong, it also eliminates the conditions under which taste develops. The question for craft: how to preserve the discursive elements — the debate, the misstep, the correction — when the system’s incentive is to route around them.
  • If you want to disrupt any industry — or prevent your own disruption — go up the stack of user experience.

    Going up the UX stack is one strategy; another is building at the interoperability layer, where UX can bridge products within an industry. Protocols like MCP demonstrate this: each product exposes a small set of capabilities that, when merged across products, create a step change and allow people to grow curious rather than merely efficient (see the sketch below).
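
    To make “a small set of capabilities” concrete, here is a minimal sketch of one product exposing a single capability over MCP, using the TypeScript SDK (@modelcontextprotocol/sdk) and its high-level McpServer API. The server name, tool name, and stubbed behavior are hypothetical, purely for illustration.

    ```typescript
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // Hypothetical product server: it publishes one narrow, well-scoped
    // capability that any MCP-aware client can discover and call.
    const server = new McpServer({ name: "media-library", version: "0.1.0" });

    // "search_library" is an illustrative name, not a real product API.
    server.tool(
      "search_library",
      { query: z.string().describe("Free-text search over the user's library") },
      async ({ query }) => ({
        // A real server would query its own backend; this stub just echoes.
        content: [{ type: "text", text: `Results for: ${query}` }],
      })
    );

    // Speak the shared protocol over stdio so other products can interoperate.
    await server.connect(new StdioServerTransport());
    ```

    The point is the shape, not the specifics: each product publishes a few narrow capabilities, and the step change arrives when a client merges capabilities from several such servers into one experience.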
  • What started as fun led to the discovery of utility, and ultimately Slack became a mission-critical technology for the team. I saw the same phenomenon during the rollout of a virtual conference room technology in 2005 while working at an investment bank. Nobody used it until one member of the team summoned everyone to “jump on audio conf room 1” to secretly make fun of a partner’s tie. When we play with new technology, we become socialized to its use cases. Now, in the age of AI, the same pattern is repeating itself, and we must let our teams play to discover the utility.

    The Slack adoption pattern suggests that technology sticks when play comes first and utility is discovered later — not the reverse. Play without deliverables, without a defined start and end, going to the depths of rabbit holes: not to build habits or self-improve, but to capture and see where it leads. The discovery of utility through play is stickier than utility imposed from above.
  • The user experience determines whether a new customer can survive the first mile of the product, whether the product’s functionality is even used, whether the customer is willing to pay, and whether the product grows. The secret of any successful and honest product leader is their design partner. When you empower designers at every part of the process of building products — and companies — you stack the deck in your favor. What you’ll also learn is that design can compensate for technical shortcomings.

  • You need to hear about a product many times before you’re willing to try it, and any product with a learning curve, once overcome, is far stickier than the average entrepreneur imagines.

    PKM workflows illustrate this stickiness: users who have learned a tool like Obsidian the hard way are unlikely to switch — the learning curve itself is the moat. But the prerequisite is letting go of the idea that shiny workflows and smooth corners build lasting habits. The stickiness comes from the struggle, not from the polish.