Calm Down, Wordbros. "Compute" Works.
Learning to love a noun for what it tells us about the world today.
Earlier this week I was DealBook famous, appearing towards the end of The New York Times’ illustrious newsletter about Money (The Making Of). I was asked about the history of “compute” as a noun. Tech titans like Jeff Bezos, Sam Altman, and Sundar Pichai had used it while onstage at the DealBook Summit, the Times’ live event about Money (TMO).
Being a thoroughly modern guy, I shamelessly posted the clip quoting me to various social media outlets. It created a modest firestorm of bile, largely from writers, at this linguistic abomination. A verb, compute (“I compute, you compute, he/she/it computes”), has been turned into a noun (“we just need more compute to solve the problem”). Mumbo jumbo! A marketing-tipped cruise missile aimed at rational thought! Ugly!!!
First the Kindle Fire phone. Now this. Jeff Bezos uses “compute” as a noun at the DealBook Summit. (Closed caption from a video of the interview.)
Remain calm, wordbros. Languages change constantly,1 and this change arose not out of some PR scheme but organically, from changes in the way technology is used. A little background on how it happened explains much, not just about this word, but about our world.
“Compute” refers to computing power, or the combined output of the transistors, chips, accelerators, architectural software, pods, and other elements of a computing system. For many decades, if you wanted computing power, you purchased a mainframe, minicomputer, microcomputer, personal computer, or computer server, depending on the configuration that best met your overall need.
In 2006 that changed. Amazon began renting data storage online, and soon after, computing power, so companies no longer had to buy and maintain their own computers. Amazon’s products, called S3 (for Simple Storage Service) and EC2 (Elastic Compute Cloud), were the first retail cloud computing offerings to get really big. Microsoft,2 with Azure, and Google, first with App Engine and then with Google Cloud,3 followed not long after.
This was a really big deal, because it meant people could rent data storage and computers (and, in time, the many other things you need to make modern computing work) to test out an idea, say, or to build and grow a business, without an enormous outlay for servers, storage systems, a place to put that stuff, and people to install and maintain it.
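To make the rental concrete, here is a minimal sketch of what renting storage and a server looks like today with boto3, AWS’s Python SDK. The bucket name and machine image ID below are placeholders, not real resources, and running it would require an AWS account and credentials; the point is simply that storage and compute are now a few API calls rather than a purchase order.

```python
# A minimal sketch of "renting compute" with boto3, the AWS SDK for Python.
# The bucket name and AMI ID are placeholders, not real resources, and this
# assumes AWS credentials are already configured.
import boto3

# Storage for rent: put an object in S3 instead of buying a disk array.
s3 = boto3.client("s3")
s3.put_object(Bucket="my-example-bucket", Key="hello.txt", Body=b"Hello, cloud")

# Computing power for rent: start a small EC2 server instead of buying one.
ec2 = boto3.client("ec2")
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",          # a small, cheap slice of compute
    MinCount=1,
    MaxCount=1,
)
```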
Consider a business like Uber. It requires computers for mapping (picking people up and taking them someplace), for payments, for social networking (how you rate drivers and riders), and for data analysis (there was a lot of early Artificial Intelligence at Uber, mostly to find efficiencies). If Uber had been forced to buy its own computers for all that, it would have had to spend tens of millions of dollars on people and hardware before writing a line of code, let alone picking up a passenger. Airbnb’s story is much the same.4 Without the ability to buy computing power, now called compute, these companies would not exist.
As consumers, we all benefit from this innovation. Uber, Airbnb, and all the other apps on our phones now rely on the cloud. Pretty much anyone anywhere on the planet, with a smartphone costing as little as $30,5 can get the use of a supercomputer.
And, owing to the success of cloud computing, the big three cloud companies all run systems that span the planet and contain tens of millions of individual computer servers. Each machine is engineered to run several small jobs on its own or to join thousands of other machines for a really big one, depending on the needs of the moment. Computing power dials up and down, detached from what any particular machine does. Sounds a bit like a noun, doesn’t it?
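A toy way to see compute detach from the machines, sketched below with local processes standing in for cloud servers (real clouds schedule work across data centers, not one laptop): the same batch of work runs on one worker or sixteen, and nothing about the jobs themselves changes.

```python
# Toy illustration: the same work runs on 1 worker or on many.
# Local processes stand in for cloud servers here; real clouds
# spread jobs across thousands of machines the same way.
from multiprocessing import Pool

def small_job(n: int) -> int:
    return n * n

if __name__ == "__main__":
    work = list(range(1_000))
    for workers in (1, 4, 16):  # "dialing compute up"
        with Pool(processes=workers) as pool:
            results = pool.map(small_job, work)
        print(f"{workers:>2} workers -> {len(results)} jobs done")
```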
In business terms, computers shifted from being a fixed capital expense (an asset you buy, which depreciates) to a flexible operational expense (something you spend money on as needed). Once you distinguish purchasing computing power from purchasing the computers themselves, the idea of “compute” as a noun becomes more plausible.
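A back-of-the-envelope comparison shows why the operational model won. The prices below are made-up round numbers, not real quotes; what matters is the shape of the result: owning costs the same whether the machine is busy or idle, while renting tracks actual use.

```python
# Capex vs. opex with illustrative, made-up numbers.
# Assumptions: a server costs $10,000 up front, depreciated over 4 years;
# renting an equivalent machine costs $0.50 per hour.
SERVER_PRICE = 10_000              # dollars, capital expense
USEFUL_LIFE_YEARS = 4
RENTAL_RATE = 0.50                 # dollars per hour, operational expense
HOURS_PER_YEAR = 24 * 365

own_per_year = SERVER_PRICE / USEFUL_LIFE_YEARS
rent_full_time = RENTAL_RATE * HOURS_PER_YEAR
rent_part_time = rent_full_time * 0.10  # machine needed only 10% of the time

print(f"Own:  ${own_per_year:,.0f}/yr, busy or idle")
print(f"Rent: ${rent_full_time:,.0f}/yr running around the clock")
print(f"Rent: ${rent_part_time:,.0f}/yr at 10% utilization")
```

With these invented numbers, owning only wins if the machine stays busy, which is exactly why startups with spiky, unpredictable demand flocked to the cloud.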
For several years now we’ve been transfixed by the latest wave of Artificial Intelligence, generative AI, and here’s where compute, the noun, really comes into its own. Gen AI improves in direct relationship to the amount of data and computing power (compute) it has to work with, wherever it comes from and however it’s packaged. The implications are big and pervasive enough that, at minimum, technology leaders are thinking about the meaning of computing power in new ways.
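How literally is compute treated as a quantity? A common rule of thumb in the scaling-law literature (an approximation I’m supplying here, not something from the summit) is that training a transformer costs about six floating-point operations per parameter per training token. A tiny sketch with illustrative figures:

```python
# Rule of thumb from the scaling-law literature: training compute
# C ≈ 6 * N * D, where N is parameter count and D is training tokens.

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate training compute in floating-point operations."""
    return 6 * n_params * n_tokens

# Illustrative figures, not any company's disclosed numbers:
# a 70-billion-parameter model trained on 1.4 trillion tokens.
print(f"~{training_flops(70e9, 1.4e12):.1e} FLOPs")  # ~5.9e+23
```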
Bezos compared it to electricity at the DealBook Summit. Pichai treated it as the means to a new end. Altman has talked about it as a potential currency that will be traded.
The changing ways of thinking about compute won’t end there. Last month Nvidia was briefly the most valuable company in the world,6 not for what its chips do in any particular computer, but for the larger economic implications of those chips as a means to AI everywhere.
Perhaps we could scold the industry into changing its ways and saying “computing power” instead of “compute.” Perhaps we could also start saying “the power generation has gone off,” or “the process of sitting inside an airplane in flight is such a pain during the holidays.”
That goes against what people want, though, which is to talk about the world dynamically, the way it is now.
1. For example, “computers” used to refer to people who did computations with the assistance of adding machines. The term was later applied to machines that did all kinds of calculations. Somehow we all survived.
2. A moment of appreciation here for Ray Ozzie, who succeeded Bill Gates as Microsoft’s chief software architect. Ozzie had a deep background in networked computer systems, and he got CEO Steve Ballmer to sign the checks for what became the Azure data centers. Without him, Satya Nadella would have had much less to work with when he succeeded Ballmer as Microsoft’s CEO.
3. It’s odd that Google, which pioneered so much of the technology behind cloud computing, was so late to the retail business. For years it operated the world’s largest cloud for its own use, and it could have killed AWS early on. Maybe it thought it was better to keep the technology proprietary. Maybe it was easier for Amazon to move from a 5% margin retail business to a 30% margin cloud business than it was for Google to move from an 80% margin search business to a 30% margin cloud business. Maybe it was simply too much caution, similar to the way Google invented much of the technology behind Gen AI but missed its commercialization. It is still the smartest company I’ve worked for, though.
4. It’s a different topic, but Uber took cars and temporarily turned them into taxis, while Airbnb took homes and temporarily turned them into hotels. An important technology in cloud computing is called virtualization, which makes it possible for one computer server to perform many different tasks at the same time. In a sense, Uber was virtualizing cars and Airbnb was virtualizing apartments. If I may use “virtual” in that way.
5. That’s about the cost of the Shivansh LYF C459, sold in India. In the U.S., the cheapest smartphone is about $180.
6. The 10% of market capitalization Nvidia has surrendered since then is as great as the full market capitalization of Bank of America.