Open-source software makes bioimaging AI models faster and more energy-efficient


by Sara Rebein
Leibniz-Institut für Analytische Wissenschaften – ISAS – e. V.

Semantic 3D segmentation of osteocytes in mouse bones (images acquired with a light sheet fluorescence microscope). Credit: Prof. Dr. Anika Grüneboom, ISAS

Artificial intelligence (AI) has become an essential part of the analysis of microscopic data. However, as AI models become more powerful and complex, the computing power they require, and with it their energy consumption, grows as well.

Researchers at the Leibniz-Institut für Analytische Wissenschaften (ISAS) and Peking University have therefore developed free compression software that enables scientists to run existing bioimaging AI models faster and with significantly lower energy consumption.

The researchers have presented their user-friendly toolbox, called EfficientBioAI, in a paper published in Nature Methods.

Modern microscopy methods produce vast quantities of high-resolution images, and individual data sets can comprise thousands of them. Scientists typically rely on AI-assisted software to analyze these data sets accurately. However, as AI models become more complex, the latency (processing time) per image can increase considerably.

"High latency, for example with particularly large images, requires more computing power and ultimately leads to higher energy consumption," stated Dr. Jianxu Chen, head of the junior research group AMBIOM (Analysis of Microscopic BIOMedical Images) at ISAS.

A well-known technique finds new applications

To avoid high latency in image analysis, especially on devices with limited computing power, researchers use special algorithms to compress AI models. This means they reduce the number of computations in the models while preserving comparable prediction accuracy.

"Model compression is a technique that is widely used in digital image processing, known as computer vision, and in AI to make models leaner and greener," explained Chen.

Researchers combine several strategies to reduce memory consumption, to speed up model inference (the "thought process" of the model), and consequently to save energy. Pruning, for instance, is used to remove redundant nodes from the neural network.
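To make the pruning idea concrete, the sketch below removes low-magnitude weights from a toy convolutional network using PyTorch's built-in pruning utilities. It is a generic illustration of the technique, not code from EfficientBioAI; the toy model and the 30% pruning ratio are arbitrary choices for demonstration.

```python
# Generic sketch of unstructured pruning with PyTorch's built-in utilities.
# Illustrative only; this is not the EfficientBioAI implementation.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small stand-in network; a trained bioimaging model would take its place.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=3, padding=1),
)

# Zero out the 30% of weights with the smallest magnitude in each conv layer.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the sparsity permanent

# The pruned model still produces predictions, now with fewer effective weights.
with torch.no_grad():
    output = model(torch.randn(1, 1, 64, 64))
print(output.shape)  # torch.Size([1, 1, 64, 64])
```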

"These techniques are often still unfamiliar in the bioimaging community. We therefore wanted to provide a ready-to-use, uncomplicated solution for applying them to common AI tools in bioimaging," said Yu Zhou, first author of the paper and Ph.D. student at AMBIOM.

Energy savings of up to approximately 81%

To evaluate their new toolbox, the researchers led by Chen tested the software on several real-life applications. Across different hardware and a range of bioimaging analysis tasks, the compression methods significantly reduced latency and cut energy consumption by between 12.5% and 80.6%.
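As a rough illustration of how such latency reductions could be checked, the following sketch times the average forward pass of a PyTorch model before and after compression. The helper function and the commented-out names (original_model, compressed_model, image_batch) are hypothetical; the paper's actual benchmarking and energy-measurement protocol may differ.

```python
# Hypothetical sketch for comparing inference latency of two PyTorch models.
import time
import torch

def mean_latency_ms(model, example, n_runs=50):
    """Average forward-pass time in milliseconds over n_runs repetitions."""
    model.eval()
    with torch.no_grad():
        # Warm-up runs so one-off setup costs do not skew the measurement.
        for _ in range(5):
            model(example)
        start = time.perf_counter()
        for _ in range(n_runs):
            model(example)
    return (time.perf_counter() - start) / n_runs * 1000.0

# Hypothetical usage with an original and a compressed model:
# print(mean_latency_ms(original_model, image_batch))
# print(mean_latency_ms(compressed_model, image_batch))
```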

"Our experiments show that EfficientBioAI can considerably increase the efficiency of neural networks in bioimaging without compromising the accuracy of the models," summarized Chen.

He illustrates the energy savings using the widely used Cellpose model as an example: If many users were to use the toolbox to compress the model and apply it to the JUMP Target ORF data set (around one million microscope images of cells), they could save energy equivalent to the emissions of a car journey of roughly 7,300 miles (approximately 11,750 kilometers).

No special expertise required

The authors want to make EfficientBioAI accessible to as many scientists in biomedical research as possible. Researchers can install the software and integrate it seamlessly into existing PyTorch libraries (PyTorch is an open-source machine learning library for the Python programming language).

For some widely used models, such as Cellpose, researchers can use the software straight away without having to change any code themselves. For specific customization requests, the group also provides several demos and tutorials. With only a few modified lines of code, the toolbox can then also be applied to custom AI models.
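As an illustration of how little code post-training compression can require, the sketch below applies PyTorch's own dynamic quantization to a toy model. It uses only standard PyTorch calls and does not reproduce EfficientBioAI's actual interface, which is covered by the group's demos and tutorials.

```python
# Generic sketch of post-training dynamic quantization with standard PyTorch.
# Shown only to illustrate the "few lines of code" idea, not EfficientBioAI's API.
import torch
import torch.nn as nn

# A small stand-in model; in practice this would be a pre-trained bioimaging network.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))

# Convert the Linear layers to 8-bit dynamic quantization for faster CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    print(quantized(torch.randn(1, 256)).shape)  # torch.Size([1, 10])
```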

About EfficientBioAI

EfficientBioAI is a ready-to-use and open-source compression software for AI models in the domain of bioimaging. The plug-and-play toolbox is kept simple for standard use, but offers adjustable functions. These include adaptable compression levels and straightforward switching between the central processing unit (CPU) and graphics processing unit (GPU).

The researchers are continuously refining the toolbox and are currently working to make it available for macOS in addition to Linux (Ubuntu 20.04, Debian 10) and Windows 10. For now, the toolbox focuses on improving the inference efficiency of pre-trained models rather than on efficiency during the training phase.
