VentureBeat: “A new paper by researchers at Google claims to give large language models (LLMs) the ability to work with text of infinite length. The paper introduces Infini-attention, a technique that configures language models in a way that extends their “context window” while keeping memory and compute requirements constant…Infini-attention keeps the classic attention mechanism in the transformer block and adds a “compressive memory” module to address extended inputs. Once the input grows beyond its context length, the model stores the old attention states in the compressive memory component, which maintains a constant number of memory parameters for computational efficiency. To compute the final output, Infini-attention aggregates the compressive memory and the local attention contexts. “Such a subtle but critical modification to the Transformer attention layer enables a natural extension of existing LLMs to infinitely long contexts via continual pre-training and finetuning,” the researchers write.”
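The mechanics described above can be sketched in a few lines. This is a minimal, single-head NumPy illustration of my reading of the Infini-attention paper, not the authors' code: per segment, the model computes ordinary local attention, retrieves from a fixed-size compressive memory via a linear-attention readout, blends the two with a learned gate, and then folds the segment's keys/values into the memory. The function name `infini_attention_segment`, the scalar gate parameter `beta`, and the `1e-6` stabilizer are my own; the ELU+1 nonlinearity, the `M + σ(K)ᵀV` memory update, and the sigmoid-gated blend follow the paper's description as I understand it.

```python
import numpy as np

def elu1(x):
    # sigma(x) = ELU(x) + 1, the nonlinearity used for the linear-attention memory
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_segment(Q, K, V, M, z, beta):
    """Process one input segment.

    Q, K, V: (seg_len, d) query/key/value matrices for this segment.
    M: (d, d) compressive memory; z: (d,) normalization term.
    beta: learned scalar controlling the memory/local blend.
    Returns the attention output plus the updated (M, z).
    """
    d = Q.shape[-1]

    # Local (standard) softmax attention over the current segment only.
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    A_local = weights @ V

    # Retrieve long-range context from the compressive memory
    # (linear-attention readout; small epsilon guards the empty-memory case).
    sQ = elu1(Q)
    A_mem = (sQ @ M) / (sQ @ z + 1e-6)[:, None]

    # Gated aggregation of memory retrieval and local attention.
    g = 1.0 / (1.0 + np.exp(-beta))  # sigmoid gate
    A = g * A_mem + (1.0 - g) * A_local

    # Fold this segment's states into memory; M and z stay fixed-size,
    # so memory cost is constant no matter how many segments stream through.
    sK = elu1(K)
    M = M + sK.T @ V
    z = z + sK.sum(axis=0)
    return A, M, z
```

Because `M` and `z` have shapes that depend only on the model dimension `d`, streaming an arbitrarily long input through this loop segment-by-segment keeps memory constant, which is the "infinite context at constant cost" claim in the excerpt.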
WaPo on how AI is transforming baseball: “[Kyle] Boddy has the most practical definition of AI I’ve heard. “It’s the best translator ever,” he says. “In the literal sense, we communicate with our athletes in Japanese and Korean and Spanish with a ChatGPT plug-in that translates baseball slang flawlessly in real time. But from a technology perspective — poring over code bases, switching between PHP or Python, none of that matters anymore. … AI takes totally different code or data or insights and harmonizes it. Numbers become words. Words can become images. Everything can talk to everything.” Boddy and his engineering team now rely on AI to blend dozens of data streams to create customized coaching regimens. I cannot emphasize enough how little this is like your weekly personal training session. Video analysis breaks down individual muscles and movements by the inch. Hardware (bats and balls) is equipped with software (sensors) that tracks every baseball action and renders it into equations that measure force and torque. Like all data-gobbling AI software, the process gets smarter as it goes; Driveline has collected enough historical performance data that it can correlate five non-baseball-related physical tests into dead solid predictions of fastball velocity and bat speed.”
Thomas Sowell: “Whenever there is a proposal for a tax cut, media pundits demand to know how you are going to pay for it. But when there are proposals for more spending on social programs, those same pundits are strangely silent.” [via CafeHayek]
WSJ: “In the next five years, significant upgrades to the batteries in electric vehicles should finally hit the market. In the works for decades, these changes are likely to mean that by 2030, gas vehicles will cost more than their electric equivalents; some EVs will charge as quickly as filling up at a gas station; and super long-range EVs will make the phrase “range anxiety” seem quaint. Almost all of these coming developments are upgrades to the same tried-and-true lithium batteries that others have promised to disrupt. This gives them a huge advantage: They can be manufactured in existing facilities, and fit into existing supply chains. This matters because previous investments in battery-manufacturing capacity are so enormous—more than $30 billion in 2023 alone, according to BloombergNEF—that they make it that much harder for any technology that can’t be manufactured in those facilities to be competitive.”
Gordon Ritter: “Founders building in generative AI today can take many paths, but I see two ways forward to build a lasting, defensible business. First, understand your and your team’s core strength; then choose one of these options: (1) Major in generative AI’s emerging technical capabilities and hunt for a function or vertical problem that benefits from your insights. Or (2) Major in deep domain expertise and market relationships, and form a team to add in the technical knowledge…In the age of generative AI, one lesson remains true: the tighter your context and industry, the wiser your model and product.”