EachPod

Compressed Federated Learning of Tiny Language Models

Author: Neural Intelligence Network
Published: Sun 25 May 2025
Episode Link: https://podcasters.spotify.com/pod/show/neuralintelpod/episodes/Compressed-Federated-Learning-of-Tiny-Language-Models-e338e4o

This document details research into improving Federated Learning (FL) efficiency in autonomous mobile networks by incorporating tiny language models (TLMs) for predicting network performance features. It focuses on the challenge of communication overhead in FL due to frequent neural network data exchanges. The paper proposes and evaluates the use of NNCodec, an implementation of the ISO/IEC Neural Network Coding (NNC) standard, to compress the data shared between participating network cells (clients) and a central server. Experimental results using the Berlin V2X dataset demonstrate that NNCodec can significantly reduce communication data while maintaining the prediction performance of the TLMs.
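The core mechanism described above, clients compressing their model updates before a server aggregates them, can be illustrated with a minimal sketch. This is not NNCodec's actual API; a simple uniform quantizer stands in for the standard's entropy-coded compression, and all function names here are hypothetical.

```python
import random

def quantize(update, step=0.05):
    """Uniform quantization of a weight update: a crude, illustrative
    stand-in for the compression NNCodec applies to exchanged data."""
    return [round(w / step) * step for w in update]

def federated_round(global_weights, client_updates, step=0.05):
    """One FL round: each client (network cell) sends a quantized
    weight update; the server averages them into the global model."""
    compressed = [quantize(u, step) for u in client_updates]
    n = len(compressed)
    avg = [sum(ws) / n for ws in zip(*compressed)]
    return [g + a for g, a in zip(global_weights, avg)]

# Example: 3 clients, a hypothetical 4-parameter "tiny" model
random.seed(0)
global_w = [0.0] * 4
updates = [[random.uniform(-0.1, 0.1) for _ in range(4)]
           for _ in range(3)]
global_w = federated_round(global_w, updates)
```

Coarser quantization (a larger `step`) shrinks the data each cell must transmit at the cost of update precision; the paper's contribution is showing that NNCodec manages this trade-off without degrading the TLMs' prediction performance.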
