Federated Learning Using Shared Functional Basis Expansions for Privacy-Preserving Classification

July 2, 2025

Motivation

Federated Learning (FL) allows collaborative model training across decentralized clients without sharing raw data. However, high-dimensional or continuous data such as time series, longitudinal measurements, or signals (e.g., EEG, proteomics) introduce computational and privacy challenges in FL settings. Functional Data Analysis (FDA) provides a natural framework to represent such data using basis expansions (e.g., B-splines, Fourier), reducing the dimensionality and capturing the underlying structure. This project proposes using a shared functional basis across clients, so that each participant projects its data locally into coefficient space and the federated model is trained on these low-dimensional representations. ...
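To make the local projection step concrete, the following is a minimal Python sketch, assuming all clients sample their curves on a common grid rescaled to [0, 1] and agree on a Fourier basis. The function names (fourier_basis, project_to_coefficients), the basis size n_basis, and the two example clients are illustrative assumptions, not part of the proposal.

    # Minimal sketch of the local projection step: each client evaluates the
    # agreed-upon basis on a common grid and shares only basis coefficients.
    # Names, basis choice, and sizes are illustrative assumptions.
    import numpy as np


    def fourier_basis(grid, n_basis):
        """Shared Fourier basis evaluated on a common grid rescaled to [0, 1]."""
        cols = [np.ones_like(grid)]                 # constant term
        k = 1
        while len(cols) < n_basis:                  # add sin/cos pairs up to n_basis columns
            cols.append(np.sin(2 * np.pi * k * grid))
            if len(cols) < n_basis:
                cols.append(np.cos(2 * np.pi * k * grid))
            k += 1
        return np.column_stack(cols)                # shape: (n_points, n_basis)


    def project_to_coefficients(curves, basis):
        """Least-squares projection of locally held curves onto the shared basis.

        curves: (n_samples, n_points) raw functional observations (never shared).
        Returns an (n_samples, n_basis) coefficient matrix, the only quantity
        that leaves the client.
        """
        coef, *_ = np.linalg.lstsq(basis, curves.T, rcond=None)
        return coef.T


    # Two hypothetical clients holding private curves sampled on the same grid.
    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 100)
    basis = fourier_basis(grid, n_basis=7)          # basis agreed on by all clients
    client_a = np.sin(2 * np.pi * grid) + rng.normal(0.0, 0.1, size=(20, 100))
    client_b = np.cos(4 * np.pi * grid) + rng.normal(0.0, 0.1, size=(30, 100))

    coef_a = project_to_coefficients(client_a, basis)   # (20, 7), low-dimensional
    coef_b = project_to_coefficients(client_b, basis)   # (30, 7)

In a full pipeline, a federated classifier would then be trained on these coefficient matrices rather than on the raw curves, so only low-dimensional coefficients (and model updates derived from them) ever leave a client.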
