Machine learning at speed

Technology developed through a KAUST-led collaboration with Intel, Microsoft and the University of Washington can dramatically increase the speed of machine learning on parallelized computing systems.
Credit: © 2021 KAUST; Anastasia Serin

Inserting lightweight optimization code in high-speed network devices has enabled a KAUST-led collaboration to increase the speed of machine learning on parallelized computing systems five-fold.

This “in-network aggregation” technology, developed with researchers and systems architects at Intel, Microsoft and the University of Washington, can provide dramatic speed improvements using readily available programmable network hardware.

The capability that gives artificial intelligence (AI) so much power to “understand” and interact with the world is the machine-learning step, in which the model is trained on large sets of labeled training data. The more data the AI is trained on, the better the model is likely to perform when exposed to new inputs.

The recent burst of AI applications is largely due to better machine learning and the use of larger models and more diverse datasets. Performing the machine-learning computations, however, is an enormously taxing task that increasingly relies on large arrays of computers running the learning algorithm in parallel.

“How to train deep-learning models at a large scale is a very challenging problem,” says Marco Canini from the KAUST research team. “The AI models can consist of billions of parameters, and we can use hundreds of processors that need to work efficiently in parallel. In such systems, communication among processors during incremental model updates easily becomes a major performance bottleneck.”
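To make the scale of that bottleneck concrete, here is a rough back-of-envelope estimate in Python. The figures used (one billion float32 parameters, 100 Gb/s links) are illustrative assumptions, not numbers from the article:

```python
# Back-of-envelope estimate of gradient-exchange traffic in data-parallel
# training. Model size and link speed are illustrative assumptions.

PARAMS = 1_000_000_000    # one billion model parameters
BYTES_PER_PARAM = 4       # float32 gradients
LINK_GBPS = 100           # per-worker network bandwidth, gigabits/s

gradient_bytes = PARAMS * BYTES_PER_PARAM                 # bytes per worker per step
seconds_per_exchange = gradient_bytes * 8 / (LINK_GBPS * 1e9)

print(f"{gradient_bytes / 1e9:.1f} GB per worker per update step")
print(f"{seconds_per_exchange:.2f} s just to move gradients at {LINK_GBPS} Gb/s")
```

Even at 100 Gb/s, every worker spends roughly a third of a second per step just moving gradients, which is why the network quickly dominates once hundreds of processors synchronize frequently.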

The team found a potential solution in new network technology developed by Barefoot Networks, a division of Intel.

“We use Barefoot Networks’ new programmable dataplane networking hardware to offload part of the work performed during distributed machine-learning training,” explains Amedeo Sapio, a KAUST alumnus who has since joined the Barefoot Networks team at Intel. “Using this programmable networking hardware to do more than just move data means that we can perform computations along the network paths.”

The key innovation of the team’s SwitchML platform is to let the network hardware perform the data aggregation task at each synchronization step during the model update phase of the machine-learning process. Not only does this offload part of the computational load, it also significantly reduces the amount of data transmitted.
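The aggregation itself is conceptually simple: the switch element-wise sums the gradient vectors arriving from each worker and sends the single aggregate back, so each worker exchanges one vector with the network instead of one per peer. A minimal sketch of that operation, with names and structure that are illustrative rather than SwitchML’s actual API:

```python
# Minimal sketch of in-network aggregation: the switch element-wise sums
# the gradient vectors arriving from each worker, then multicasts the one
# aggregate back. Illustrative only; not SwitchML's real packet format.

def switch_aggregate(worker_gradients):
    """Element-wise sum of equal-length gradient vectors, as the switch
    would compute it on the data path."""
    total = [0] * len(worker_gradients[0])
    for grad in worker_gradients:
        for i, g in enumerate(grad):
            total[i] += g
    return total

# Three workers each contribute a gradient; every worker then receives
# the same aggregate instead of exchanging vectors pairwise.
grads = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(switch_aggregate(grads))  # [12, 15, 18]
```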

“Although the programmable switch dataplane can do operations very quickly, the operations it can do are limited,” says Canini. “So our solution had to be simple enough for the hardware and yet flexible enough to solve challenges such as limited onboard memory capacity. SwitchML addresses this challenge by co-designing the communication network and the distributed training algorithm, achieving an acceleration of up to 5.5 times compared to the state-of-the-art approach.”
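One way to picture the co-design Canini describes: a switch has only a small pool of on-chip aggregation slots, far too little to hold an entire model, so workers stream their gradients in fixed-size chunks and a slot is recycled as soon as every worker’s contribution to that chunk has been summed and returned. The slot size and control flow below are illustrative assumptions, not the real protocol:

```python
# Sketch of chunk-streamed aggregation under limited switch memory: only
# one chunk-sized slot of "switch state" exists at a time, and it is
# reused for the next chunk once its aggregate has been sent back.
# Chunk size and flow are illustrative assumptions.

CHUNK = 4  # elements aggregated per slot (real switches work per packet)

def stream_aggregate(workers, chunk=CHUNK):
    """Aggregate equal-length gradient vectors chunk by chunk, holding
    only one chunk-sized slot of state at a time."""
    length = len(workers[0])
    result = []
    for start in range(0, length, chunk):
        slot = [0] * min(chunk, length - start)    # the only switch-side state
        for w in workers:                          # each worker's packet for this chunk
            for i, g in enumerate(w[start:start + chunk]):
                slot[i] += g
        result.extend(slot)                        # aggregate sent back; slot freed
    return result

grads = [list(range(10)), list(range(10, 20))]
print(stream_aggregate(grads))  # [10, 12, 14, 16, 18, 20, 22, 24, 26, 28]
```

Because the training algorithm is written to tolerate this chunk-at-a-time rhythm, the limited on-switch memory stops being a blocker, which is the essence of co-designing the network and the training loop.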

Media Contact

Michael Cusack
King Abdullah University of Science & Technology (KAUST)
