Monday, April 15, 2024

Tiny AI: Downsizing And Refining Existing Deep Learning

Sometimes decluttering is progress. Simplification has gained traction; even human lifestyles are adapting to the minimalism movement. As technology grows more advanced and sophisticated, it becomes encumbered with complexity, and there is now a turn toward refining, downsizing, and descaling. Early classical computers filled entire rooms to carry out algorithms that our phones now complete in seconds. Artificial intelligence is following the same trend, as tiny AI becomes the next evolutionary step along the AI continuum.

Tiny AI Refines The Existing Deep Learning Models

As artificial intelligence becomes ‘smarter’ and more powerful, it relies on ever greater amounts of computational power to run its myriad algorithms and process big data.

Tiny AI counters these excessive power demands by refining existing deep learning models without sacrificing their learning capabilities. It reduces the amount of code and the number of parameters that AI currently operates on, simplifying the algorithm without compromising functionality. As Leonardo da Vinci famously said, “Simplicity is the ultimate sophistication”.
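One common way such downsizing is done in practice is magnitude pruning: zeroing out the weights that contribute least to a model's output. A minimal NumPy sketch (the function name and threshold rule are illustrative, not taken from any particular library):

```python
import numpy as np

def prune_weights(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights (illustrative sketch)."""
    k = int(w.size * sparsity)  # number of weights to drop
    if k == 0:
        return w.copy()
    # The k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    # Note: ties at the threshold may prune slightly more than `sparsity`
    return np.where(np.abs(w) <= threshold, 0.0, w)

w = np.array([0.05, -0.8, 0.02, 1.3, -0.01, 0.4])
print(prune_weights(w, sparsity=0.5))  # small weights become 0.0
```

In real systems the pruned model is usually fine-tuned afterwards to recover any lost accuracy, and the resulting sparse weights can be stored and executed far more cheaply.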

The Key Role Of Reducing The Processing Power

Nick McQuire, vice president at CCS Insight, commented that “Over the past few years there’s been an arms race of sorts in AI and an effect of this competition is that we’ve seen some ML models become enormous in the race to achieve high performance”.

Reducing the processing power is imperative, since the sheer complexity of artificial intelligence algorithms requires extremely large amounts of computational power. Google introduced BERT (Bidirectional Encoder Representations from Transformers) in 2018, a language-processing model now used in Google Search.

BERT has 340 million parameters, and training it once consumes roughly as much electricity as a typical US household uses in 50 days. Training an AI means passing copious amounts of data through the algorithm so that it can learn. Microsoft introduced its Turing Natural Language Generation model early this year; at more than 17 billion parameters, the largest of any language model to date, it underlines the need for tiny AI.


How Does Tiny AI Achieve Its Processing Power Edge?

There is also a cost factor, driven largely by cloud usage. As McQuire noted, “We’ve seen customers that for every dollar they spend on AI, spend $10-15 more on cloud computing to support the application”.

Tiny AI achieves its processing power edge by reducing the need for a centralised cloud network. Cutting out the cloud middleman improves latency and reduces power consumption.

Andrew White, a patent attorney at Mathys & Squire, explained that “Tiny AI involves building algorithms into hardware at the periphery of a network, such as the sensors themselves. The idea is that they can be integrated into hardware to perform data analytics at low power, avoiding the need to send data back to the cloud for processing”.

Development Of TinyBERT

TinyBERT was developed by Huazhong University of Science and Technology together with Huawei’s Noah’s Ark Lab; as the name suggests, it applies tiny-AI techniques to the BERT algorithm.

TinyBERT is 7.5 times smaller, and as a result it is 9.4 times faster than the original BERT model. The downscaling can be attributed to ‘distillation’: a compression technique in which a smaller network is taught by a larger, fully trained neural network, much as a student is taught by a teacher.
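The teacher–student idea can be illustrated with the soft-target loss commonly used in knowledge distillation. A NumPy sketch (the temperature value and function names here are illustrative, not TinyBERT's actual training code):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T softens the distribution,
    # exposing the teacher's relative confidence in the "wrong" classes
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's softened outputs and the student's;
    # the student minimises this alongside the usual hard-label loss
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)))

teacher = np.array([[5.0, 1.0, 0.5]])
print(distillation_loss(teacher, teacher))  # identical outputs -> 0.0
```

Because the loss is zero only when the student reproduces the teacher's full output distribution, the small network learns nuances that plain hard labels would not convey.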

Improved Environmental Effect

Alongside the reduced clutter of the code, and the faster processing, there is also an improved environmental effect due to reduced required computational power. Training an AI includes running the algorithm through many data points, which comes at a hidden cost to the environment.

A University of Massachusetts study concluded that training a single large AI model can emit 284 tonnes of carbon dioxide equivalent, roughly five times the lifetime emissions of an average car, or the equivalent of around 300 return flights between San Francisco and New York.

Tiny AI therefore cuts emissions through more efficient processing, pushing towards greener and more environmentally friendly technological advances. Roy Schwartz, of the Allen Institute, said “We don’t want to reach a state where AI will become a significant contributor to global warming”, underlining the need for tiny AI.

Apple Integrating The Tiny AI

Tiny AI is slowly working its way into existing technology. Apple’s Siri has followed the trend, with some of its processing running locally on the device as of iOS 12.

This shortens Siri’s response time, since it no longer needs to reach the cloud to run its deep learning models. Reduced latency also allows autonomous vehicles to make decisions more quickly, which matters because there is no room for error in vehicle automation.

Tiny AI Improves Data Privacy

Furthermore, tiny AI improves data privacy, since dissociation from the cloud keeps personal data stored locally. These privacy benefits extend to privacy-conscious sectors such as healthcare and banking.

Tiny AI outlines a potential future for artificial intelligence. As traditional AI models continue to develop and grow, the transition to tiny AI will only become more appealing with time.

As the world becomes more environmentally conscientious, tiny AI will become even more enticing given the smaller carbon footprint it leaves. The tiny AI space is still in its infancy, and there are few experts in the field, but it has great potential.

Jon is a writer for RegTech Global with a background in Computer Science, Zoology, Finance, and Neuroscience. He is interested in biotechnology and green tech and pursues these fields in his professional life. Outside of writing, Jon is passionate about the outdoors, enjoying hiking, surfing, and skiing.



