Conventional AI/machine-learning systems consume staggering amounts of power owing to their inefficient digital architectures. Sustaining AI development, and securing a sovereign presence in the AI arena, will require a shift to intelligent analog solutions.


The AI enterprise of automating intelligent activity appears to be thriving, with an ever-expanding buffet of assistive and augmentative systems: industrial robots, edge-computing systems, self-driving cars, language processors, IoT devices, and a variety of services in social, commercial, and military contexts.  

However, digitization, which in the popular imagination has become nearly synonymous with the ICT revolution, may be the very undoing of the AI dream, owing to the unsustainable power demands of conventional digital architectures.  

Researchers find that the energy consumed by ML language models has been increasing exponentially since 2017 [1]. The cost of a single training run of the Transformer language model, around 27 kWh in June 2017, rose to 656,347 kWh by January 2019, corresponding to about 626,155 pounds of CO2. This is roughly equal to the total lifetime carbon footprint of five cars, about 60 times the CO2 produced by a human over a full lifetime, and about 300 times the emissions associated with a NY-SF flight. ML models are put through hundreds of such runs and are finally deployed in, say, a self-driving vehicle, where inference accounts for the remaining 80% of the model's lifetime energy cost [2]. 
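A quick way to sanity-check these comparisons is to divide the cited CO2 total by each cited multiple. The inputs below are the article's own figures; the derived per-car, per-human, and per-flight footprints are simply what those ratios imply, not independently sourced numbers.

```python
# All inputs are the figures cited above [1][2]; the derived per-unit
# footprints are implied by those ratios, not independently sourced.
TRAINING_CO2_LB = 626_155     # CO2 for one large training run, in pounds
CARS = 5                      # "five cars' total lifetime footprint"
HUMAN_LIFETIME_MULT = 60      # "60 times a human lifetime"
NY_SF_FLIGHTS = 300           # "300 NY-SF flights"

per_car = TRAINING_CO2_LB / CARS
per_human = TRAINING_CO2_LB / HUMAN_LIFETIME_MULT
per_flight = TRAINING_CO2_LB / NY_SF_FLIGHTS

print(f"Implied lifetime CO2 per car:    {per_car:,.0f} lb")
print(f"Implied lifetime CO2 per human:  {per_human:,.0f} lb")
print(f"Implied CO2 per NY-SF flight:    {per_flight:,.0f} lb")
```

The implied per-flight figure (around 2,000 lb per passenger) is of the right order for a long-haul round trip, which suggests the cited multiples are mutually consistent.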

Cambridge University’s energy tracker estimates that Blockchain applications, involving the mining of cryptocurrencies such as Bitcoin, consume around 121.36 terawatt-hours (1 tera = 10^12) of energy per year, which exceeds the energy consumption of the entire country of Argentina, and the figure continues to rise [3]. 

Always-ON IoT systems such as home assistants and burglar alarms spend 70-90% of their battery capacity processing irrelevant data, owing to wasteful digital architectures [4].

Only a handful of corporations, and countries such as the US and China, can afford to support power-guzzling Red AI [5] systems. If India is to contribute to the global AI effort, then more efficient and sustainable training methods and systems will have to be put in place. 

The solution to these unsustainably high power demands lies in the use of intelligent analog technology. For instance, Aspinity’s new analog RAMP chip [5] drastically reduces the volume of irrelevant data generated, by up to 100 times, along with the corresponding power required to process data in Always-ON systems. Whereas conventional ML-ANN systems employ hundreds of thousands of transistors to carry out large numbers of simple digital multiply-accumulate operations, RAMP makes do with just a handful of transistors, enabling the digital component to remain in a very low-power state about 80% of the time [5].
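The power saving from keeping the digital component asleep most of the time can be illustrated with a simple duty-cycle model. All the power figures below are illustrative assumptions, not Aspinity specifications; only the "awake about 20% of the time" fraction comes from the claim above.

```python
# Back-of-the-envelope average-power model for an always-on sensing node.
# The milliwatt figures are illustrative assumptions, not vendor specs.
P_DIGITAL_ACTIVE_MW = 10.0   # assumed digital core power when awake
P_DIGITAL_SLEEP_MW = 0.1     # assumed deep-sleep power
P_ANALOG_MW = 0.02           # assumed analog pre-classifier power

def avg_power_mw(awake_fraction: float) -> float:
    """Average power when the digital core is awake `awake_fraction` of the time."""
    return (P_ANALOG_MW
            + awake_fraction * P_DIGITAL_ACTIVE_MW
            + (1 - awake_fraction) * P_DIGITAL_SLEEP_MW)

always_on = avg_power_mw(1.0)   # conventional: digital core always awake
gated = avg_power_mw(0.2)       # analog front end keeps it asleep ~80% of the time
print(f"always-on: {always_on:.2f} mW, analog-gated: {gated:.2f} mW, "
      f"saving ~{always_on / gated:.1f}x")
```

Under these assumed figures the analog-gated node draws roughly a fifth of the power; the actual saving depends on how rarely the analog front end wakes the digital core.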

Aspinity’s AnalogML core (Fig 1) provides low-resource alternatives for digital home, IoT, consumer, industrial, and biomedical applications, drastically reducing the carbon footprint of ML models and extending battery life by 10 times or more [6].  

Another promising development is the application of analog Phase Change Memory (PCM) technology to deep learning, with the analog hardware using roughly 10,000 to 2,000,000 times less power than its digital counterpart. PCM eliminates the costly memory-access operations of traditional von Neumann computers (a memory access can consume 1,000 times more power than a 32-bit floating-point multiplication [7]) by storing data as the physical state of the semiconductor (amorphous/crystalline) [8]. Intel and IBM have begun developing PCM-based ANNs [9]. 
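The impact of eliminating memory fetches can be sketched with a per-operation energy budget. The absolute energy values below are assumptions chosen for illustration; only the roughly 1,000x fetch-versus-multiply ratio comes from the figure cited above [7].

```python
# Illustrative energy budget for one multiply-accumulate on a von Neumann
# machine vs. in-memory analog compute. Absolute picojoule values are
# assumptions; only the ~1000x fetch/multiply ratio is from the article [7].
E_FP32_MULT_PJ = 4.0                      # assumed energy of one 32-bit FP multiply
E_MEM_FETCH_PJ = E_FP32_MULT_PJ * 1000    # cited ~1000x ratio

# Conventional: fetch two operands from memory, then multiply.
von_neumann_pj = 2 * E_MEM_FETCH_PJ + E_FP32_MULT_PJ
# In-memory (PCM): the weight *is* the device state, so no fetch is needed.
in_memory_pj = E_FP32_MULT_PJ

print(f"von Neumann: {von_neumann_pj:,.0f} pJ, in-memory: {in_memory_pj:.0f} pJ "
      f"(~{von_neumann_pj / in_memory_pj:.0f}x)")
```

Even this crude model shows the memory traffic, not the arithmetic, dominating the energy bill, which is why moving computation into the memory itself pays off so dramatically.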

Analog image sensors for face detection, predictive maintenance, behaviour monitoring, barcode recognition, leak detection, cell-phone imaging, visual health monitoring, user-occupancy-based services, and object tracking offer lower latency, energy consumption, and cost, supporting longer lifetimes for battery-powered products [10].

At a time when the world, having woken up rather late to the looming threat of catastrophic global warming, is scrambling to address the damage done so far, the AI carbon footprint continues to expand at an alarming rate. And this is while AI development is still in its relative infancy.  Not only is the AI dream at risk of collapsing in on itself but the collateral devastation to the environment would be enormous.

Fig 1: Aspinity’s multipurpose AnalogML core [6] (top) and the analog computing machine at the Lewis Flight Propulsion Laboratory, circa 1949 (image source: NASA) [7] (bottom)

Strubell et al. estimate that, “… we must cut carbon emissions by half over the next decade to deter escalating rates of natural disaster. Model training and development likely make up a substantial portion of the greenhouse gas emissions…” [1]

Analog technology for AI and communication systems holds great promise but has yet to go mainstream. Early indigenous research into these systems would give India the technological clout to offer the world a more mature alternative to delinquent state and self-serving corporate actors who remain unmindful of the tremendous and urgent risk that conventional Red AI systems pose to the survival of the planet, and to the AI enterprise itself. 

Open Questions:

  1. Might citizen self-regulation be encouraged to help maximize the availability of power for groundbreaking AI research activities?  
  2. Might research into power-light or power-free technology help offset the environmental cost of AI systems, such as self-cooling native architectures and rock-based purification systems?

References 

  1. https://arxiv.org/pdf/1906.02243.pdf
  2. https://www.forbes.com/sites/robtoews/2020/06/17/deep-learnings-climate-change-problem/?sh=6046829d6b43
  3. https://www.bbc.com/news/technology-56012952
  4. https://www.eejournal.com/article/aspinitys-awesome-analog-artificial-neural-networks-aanns/
  5. https://www.eejournal.com/industry_news/aspinity-enables-10x-less-power-for-always-on-sensing/
  6. https://www.aspinity.com/analogml_core
  7. https://semiengineering.com/an-increasingly-complicated-relationship-with-memory/
  8. https://www.electronics-notes.com/articles/electronic_components/semiconductor-ic-memory/pram-phase-change-memory-storage.php and https://iopscience.iop.org/article/10.1088/1361-6463/ab7794
  9. https://analog-ai-demo.mybluemix.net/hardware
  10. https://www.eejournal.com/article/an-ai-storm-is-coming-as-analog-ai-surfaces-in-sensors/

Views expressed by the author are personal and need not reflect or represent the views of Centre for Public Policy Research.

Dr Monika Krishan
Dr Monika Krishan's academic background includes a Master’s in Electrical Engineering from the Indian Institute of Science, Bangalore, India, and a Ph.D. in Cognitive Psychology from Rutgers University, New Jersey, USA. Her research interests include image processing, psychovisual perception of textures, perception of animacy, goal-based inference, perception of uncertainty, and invariance detection in visual and non-visual domains. Her areas of study also include the impact of artificial-intelligence devices on human cognition, from the developmental stages of the human brain through adulthood and the aging process, and the resulting impact on the socio-cognitive health of society. She has worked on several projects on the cognitive aspects of the use and misuse of technology in social and antisocial contexts at SERC, IISc, as well as on the development of interactive graphics for Magnetic Resonance Imaging systems at Siemens. She is a member of Ohio University’s Consortium for the Advancement of Cognitive Science. She has offered services at economically challenged schools and hospitals for a number of years and continues to be an active community volunteer in the fields of education and mental health.
