Computing for AI has three main pieces. The first is data pre-processing: organizing a large dataset before you can do anything with it. This may involve labeling the data or cleaning it up, but essentially you are creating some structure in it. Once the data is pre-processed, you can 'train' the AI, which is like teaching it how to interpret the data. Finally comes what we call AI inference: running the trained model in response to user queries.
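As a rough illustration of those three stages (not from the original text), here is a minimal sketch using scikit-learn; the dataset, scaler, and model are placeholders chosen only to keep the example self-contained, and real AI pipelines operate at vastly larger scale.

```python
# Minimal sketch of the three stages: pre-processing, training, inference.
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# 1. Data pre-processing: clean and structure the raw data (here, just scaling features).
X, y = load_iris(return_X_y=True)
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# 2. Training: fit the model so it learns how to interpret the data.
model = LogisticRegression(max_iter=200)
model.fit(X_scaled, y)

# 3. Inference: run the trained model on a new "query".
new_sample = scaler.transform(X[:1])
print(model.predict(new_sample))
```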
This ability to gather data from different places, use probabilistic models to weigh its relevance to the task at hand, integrate that information, and then produce an output that in many instances uncannily resembles a human's is what sets AI apart from traditional computing.
As AI products like ChatGPT and Bing become more popular, the nature of computing is becoming more inference-based. This is a slight departure from earlier machine-learning workloads, where most of the computing effort went into training rather than answering user queries.
This shift brings two pressures: the need for more memory storage and the need for more energy. Regarding memory, an estimate from the Semiconductor Research Corporation, a consortium of all the major semiconductor companies, posits that if data continues to grow at this rate, the silicon-based memory needed to store it will outpace the total amount of silicon produced globally each year. So, pretty soon we will hit a wall where our silicon supply chains won't be able to keep up with the amount of data being generated.
Couple this with the fact that in 2018 our computers consumed roughly 1-2% of the global electricity supply; by 2020, that figure was estimated at around 4-6%. If we continue at this rate, the share is projected to rise to between 8% and 21% by 2030, further exacerbating the current energy crisis.
The rise of AI has led to significant growth in data centers: facilities dedicated to housing the IT infrastructure for data processing, management, and storage.
Both the power drawn by data centers and the carbon emissions associated with them doubled between 2017 and 2020.
Each facility consumes on the order of 20 to 40 megawatts of power, and most of the time data centers run at 100% utilization, meaning all the processors are kept busy with some work. So a 20-megawatt facility probably draws 20 megawatts fairly consistently (enough to power roughly 16,000 households), computing as much as it can to amortize the costs of the data center, its servers, and its power-delivery systems.
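A quick back-of-the-envelope check of that household figure, assuming an average continuous household draw of about 1.25 kW (roughly 11,000 kWh per year; actual averages vary by country):

```python
# Sanity-check the "20 MW ~ 16,000 households" comparison under assumed averages.
facility_power_kw = 20_000        # 20 MW facility, assumed to run near full load
avg_household_draw_kw = 1.25      # assumed average continuous household load

households_powered = facility_power_kw / avg_household_draw_kw
print(f"~{households_powered:,.0f} households")          # ~16,000 households

# Annual energy use if the facility really draws 20 MW around the clock.
hours_per_year = 24 * 365
annual_energy_gwh = facility_power_kw * hours_per_year / 1e6
print(f"~{annual_energy_gwh:.0f} GWh per year")          # ~175 GWh per year
```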
Data centers also offer economies of scale. In the past, many businesses would build their own facilities, which meant paying for construction, IT equipment, server-room management, and so on. Nowadays, it is much easier to simply 'rent' capacity from a provider like Amazon Web Services, which is why cloud computing has taken off over the last decade.
Still, we have an 800-pound gorilla in the room: our computers and other devices are becoming insatiable energy beasts that we continue to feed.
We just need to remain cognizant of the effects and keep pushing for more sustainable approaches to design, manufacturing, and consumption.