Thinks 1300

WSJ: “Mismatched schedules, wasted commutes and too many ways to communicate. Getting in sync with co-workers isn’t getting easier…There is a name for all of the time we spend on the job puzzling out who’s on Zoom, who’s coming from down the hall and who’s messaging from three time zones away: the “coordination tax.” The term used to refer to the logistical challenges of a growing enterprise. Now it is gaining traction among executives and workplace consultants to describe the increasing amounts of time workers spend getting in sync since millions began toggling between work-from-home arrangements and the office. “You show up, and nobody else from your team is there; then you’re on back-to-back Zoom meetings, which you could have done at home,” says Brian Elliott, a leadership adviser and former Slack executive.”

NYTimes on the use of AI in drug development: “Companies are leveraging the new technology — which learns from huge amounts of data to generate answers — to try to remake drug discovery. They are moving the field from a painstaking artisanal craft to more automated precision, a shift fueled by A.I. that learns and gets smarter. “Once you have the right kind of data, the A.I. can work and get really, really good,” said Jacob Berlin, co-founder and chief executive of Terray. Most of the early business uses of generative A.I., which can produce everything from poetry to computer programs, have been to help take the drudgery out of routine office tasks, customer service and code writing. Yet drug discovery and development is a huge industry that experts say is ripe for an A.I. makeover. A.I. is a “once-in-a-century opportunity” for the pharmaceutical business, according to the consulting firm McKinsey & Company.”

FT: “Industrial policy works if it changes the structure of the economy in a beneficial direction. Unfortunately, there are well-known reasons why the attempt could fail. Lack of information is one. Capture by a range of special interests is another. Thus, governments may fail to pick winners, while losers may succeed in picking governments. The more money is on the table, the more the latter is likely to be true.”

Nat Friedman: “What Apple’s done in fact is they’ve got a little LLM kernel on the device that’s listening to your requests and figuring out what to do with them, and it can decide to try to handle them itself. It can decide to invoke parts of apps locally and it can decide to dispatch parts of the work or all of the work to either Apple’s models in their cloud, or now it can invoke — after getting your approval — ChatGPT. What I think Apple’s done with this architecture is they’ve provided themselves both kind of a hedge on the quality of local models and a ramp that they can use. As local models improve, they can handle more requests locally as appropriate, and as their own models running on their chips in their cloud improve, they can use those, and then to the extent that it makes sense for them, they could use third party models. I think Daniel’s probably right that local will improve dramatically; it doesn’t have to be able to do everything for Apple’s strategy to work though, because they can smoothly dispatch. They have a little router on the phone, and I think Apple’s dream is to have a 2B, a 3B model that runs on your phone, that’s primarily a kind of tool-use model. It basically does function calling.”
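The routing architecture Friedman describes can be sketched in a few lines of Python. This is purely illustrative — the class names, capability scores, and thresholds below are my own assumptions, not Apple’s actual implementation — but it shows the “hedge and ramp” idea: a tiny on-device router handles what it can, escalates harder requests to the cloud, and asks for approval before going to a third party.

```python
from dataclasses import dataclass
from enum import Enum


class Route(Enum):
    ON_DEVICE = "on-device model"
    PRIVATE_CLOUD = "first-party cloud model"
    THIRD_PARTY = "third-party model (needs user approval)"


@dataclass
class Request:
    text: str
    needs_world_knowledge: bool  # open-ended tasks a small tool-use model can't do
    complexity: int              # hypothetical difficulty score, 1 (easy) to 10 (hard)


def route(req: Request, local_capability: int = 3) -> Route:
    """Decide where a request runs.

    `local_capability` stands in for the quality ceiling of the
    on-device model; raising it as local models improve shifts more
    traffic on-device -- the "ramp" in the quote above.
    """
    if req.needs_world_knowledge:
        # Beyond a small function-calling model: dispatch externally.
        return Route.THIRD_PARTY
    if req.complexity <= local_capability:
        return Route.ON_DEVICE
    return Route.PRIVATE_CLOUD
```

Under this sketch, a simple command like “set a timer” stays on the device, a long-document summary goes to the first-party cloud, and an open-ended creative request is dispatched to a third-party model after approval.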

Published by

Rajesh Jain

An entrepreneur based in Mumbai, India.