FT: “[Jensen] Huang’s take on AI economics is based around the production, consumption and monetisation of tokens. These are the most basic units of output from large language models: it takes about 1,300 tokens to generate 1,000 words of text. The key metric, he argues, is the cost per token of output. And as the main input into AI-powered services, he adds, tokens translate directly into revenue.”
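Huang's back-of-envelope token economics can be sketched in a few lines. The 1,300-tokens-per-1,000-words ratio is from the FT excerpt; the $10-per-million-tokens price below is a made-up placeholder for illustration, not any vendor's actual rate.

```python
TOKENS_PER_1000_WORDS = 1300  # conversion cited in the FT excerpt

def cost_per_1000_words(price_per_million_tokens: float) -> float:
    """Translate a per-token output price into a per-1,000-words cost."""
    cost_per_token = price_per_million_tokens / 1_000_000
    return cost_per_token * TOKENS_PER_1000_WORDS

# At a hypothetical $10 per million output tokens, 1,000 words of
# generated text costs 1,300 tokens * $0.00001/token ≈ $0.013.
print(round(cost_per_1000_words(10.0), 4))
```

The point of the metric: because tokens are both the unit of output and the unit billed, driving down cost per token directly widens the margin on every AI-generated word.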
WSJ: “Claws are autonomous agents and can plan and execute tasks on their own, and, critically, spin up their own subagents to tackle specialized tasks, access files and themselves delegate tasks to other subagents. They represent a big leap beyond question-and-answer-style AI chatbots as well as recent iterations of AI agents, which typically have narrow use cases and run for a set amount of time—although claws also come with a new set of security concerns. For claws to work as a true personal assistant, they need access to all of a user’s data. So what are people using them for today?”
WSJ: “AI tools like Anthropic’s Claude Code, Cursor and OpenAI’s Codex can now write and debug software, unlocking huge new sources of revenue. That success is pushing their makers toward a bigger ambition: automating our entire lives. What began as a way to autocomplete code quickly evolved into semiautonomous AI bots, or “agents,” that can work for hours on end with little human oversight. We can tell a bot to create a presentation for work, coordinate the family’s schedules and pick a March Madness bracket, all while it learns our personal preferences, no coding needed…The shift has permanently changed the lives of coders and sparked a $1 trillion market selloff as investors and executives contemplate the technology’s potential to reshape industries, including finance, legal and healthcare. Tens of thousands of job cuts have already been attributed to AI.”
Pete Boettke: “In the late 19th century, Italian economist Vilfredo Pareto (1848-1923) expanded on this point, observing that co-ordinating even a modest economy and matching resources to uses and preferences would soon cause an explosion in the number of equations to be solved. But today’s computers can handle quintillions of computations per second, more than Pareto could possibly have imagined. Doesn’t that make a difference? This is where Nobel laureate economist Friedrich Hayek (1899-1992) comes in. Hayek explained that the problem is not merely that the relevant knowledge is decentralized — spread out across millions of individuals — but that it is often tacit. Local shopkeepers’ understanding of their customers’ buying habits cannot be translated into one data point to feed into an AI or any other kind of model. Nor can we predict the emergence of an entrepreneur dreaming up a product that did not exist before…Prices are not lying around in the wild, waiting to be harvested and fed into an algorithm. Rather, they are the result of constantly evolving discovery. Without this process of discovery, the knowledge embedded in a price simply doesn’t come into existence…As powerful and helpful a tool as AI can be to improve logistics, better manage inventories and analyze markets, it remains just that, a tool. It can help us gain a better understanding of markets but only markets themselves can predict and co-ordinate the results of the billions and billions of voluntary exchanges that take place every day.”


