  • Once you understand how these algorithms are designed, it is easier to understand why they reinforce our strongest views and hide nuanced, more reasonable "both sides of the argument" content: agreeable and reasonable content doesn't perform, and thus never gets displayed.

  • More recently, I've been thinking about the loss of nuance, inspired in part by a Palestinian friend and fellow builder. In a recent conversation, it was especially clear how much nuance is lost in the public narrative, thanks to the algorithms. The business models of journalism simply do not support nuanced reporting.
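    To make that mechanism concrete, here's a toy sketch in TypeScript. The signal names and weights are entirely made up, not any platform's real ranker, but they show the core dynamic: when the feed sorts purely by predicted engagement, measured "both sides" content with modest engagement falls below the cutoff and simply never appears.

    ```ts
    // Hypothetical engagement signals; real systems use far more.
    interface Post {
      id: string;
      predictedClicks: number;
      predictedShares: number;
      predictedDwellSeconds: number;
    }

    // Score is a weighted sum of engagement signals. Note that nuance or
    // accuracy carries no weight at all in this objective.
    function engagementScore(p: Post): number {
      return 1.0 * p.predictedClicks + 2.0 * p.predictedShares + 0.1 * p.predictedDwellSeconds;
    }

    // The feed surfaces only the top-k posts; everything below the cut,
    // however reasonable, is never displayed.
    function rankFeed(posts: Post[], k: number): Post[] {
      return [...posts]
        .sort((a, b) => engagementScore(b) - engagementScore(a))
        .slice(0, k);
    }
    ```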

  • If you’re a developer, or paying attention to any of the new AI startup onboarding experiences, you’ll notice that every company is trying to sync everyone’s data. New companies in the enterprise search and agent space often launch with dozens, if not hundreds, of “connectors” that let you instantly query and index real-time data from other apps.
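    A rough sketch of what a "connector" tends to look like, in TypeScript. Every name below is illustrative, not any vendor's actual API, but the shape is the point: each source implements the same small sync interface, so adding the hundredth connector is as cheap as adding the first.

    ```ts
    // A normalized record pulled from some external app.
    interface SyncedDoc {
      id: string;
      source: string;   // e.g. "drive", "slack", "crm" (hypothetical sources)
      text: string;
      updatedAt: Date;
    }

    // The common contract: pull everything changed since the last sync.
    interface Connector {
      source: string;
      sync(since: Date): Promise<SyncedDoc[]>;
    }

    // Fan out to every connected app and merge results into one indexable pool.
    async function syncAll(connectors: Connector[], since: Date): Promise<SyncedDoc[]> {
      const batches = await Promise.all(connectors.map((c) => c.sync(since)));
      return batches.flat();
    }
    ```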

  • As we interact and do more of our everyday thinking alongside LLMs, they will get to know us better than we know ourselves. This “memory” of us becomes an instant and portable force of personalization.
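    One way to picture why that memory is "instant and portable": if it's just distilled facts stored as plain data, it can be exported from one assistant and injected into another's context. A minimal sketch, with invented names, assuming memory works as facts prepended to the prompt:

    ```ts
    // A distilled fact the model has learned about you over time.
    interface MemoryEntry {
      fact: string;      // e.g. "prefers concise answers" (illustrative)
      learnedAt: Date;
    }

    class UserMemory {
      private entries: MemoryEntry[] = [];

      remember(fact: string): void {
        this.entries.push({ fact, learnedAt: new Date() });
      }

      // Portability: memory is plain data, so it can move between assistants.
      export(): MemoryEntry[] {
        return [...this.entries];
      }

      // Personalization is just context: the model sees what it knows about you.
      toPrompt(userMessage: string): string {
        const known = this.entries.map((e) => `- ${e.fact}`).join("\n");
        return `What you know about this user:\n${known}\n\nUser: ${userMessage}`;
      }
    }
    ```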

  • These roaming machines that permeate our lives will offer a bounty of data unlike any other. My bet is that data captured by robots will greatly outweigh and outperform data captured from the internet for training the next generation of industrial-grade models.

  • Long-tail, specialized, and unspoiled data sets will unlock precision automation. Another type of data you won’t find on the internet comes from very purposeful sources with meticulous collections.

    Marginalia is content in the long tail