Privacy and Data Reconstruction in Low-Rank Personalized Federated Learning

Motivation Federated learning enables multiple clients to train models collaboratively without sharing raw data. When each participating centre aims to obtain its own personalized model rather than a single global model, the setting is referred to as personalized federated learning (pFL). Although raw data is never transmitted, research has shown that gradients and model updates can still leak sensitive information, enabling partial reconstruction of private training data. Recent approaches based on low-rank matrix factorization promise communication efficiency and improved personalization. However, it remains unclear whether such structured parameterizations strengthen or weaken privacy protection. ...
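
A rough numerical sketch of what such a low-rank parameterization looks like (the rank, the factor shapes, and the choice of what is shared versus kept local are illustrative assumptions, not details fixed by the proposal): the client's update to a weight matrix is expressed as a product of two thin factors, and only those factors would ever leave the client.

```python
import numpy as np

# Illustrative only: a client's personalized update to a d_out x d_in weight
# matrix is parameterized as A @ B with a small rank r, so the client would
# communicate the factors A and B instead of the dense update.
rng = np.random.default_rng(0)
d_out, d_in, r = 64, 128, 4

A = rng.normal(scale=0.01, size=(d_out, r))   # low-rank factor (d_out x r)
B = rng.normal(scale=0.01, size=(r, d_in))    # low-rank factor (r x d_in)
delta_W = A @ B                                # effective dense update, never sent

# Communication cost of the factored update vs. the dense one.
print("dense params:   ", d_out * d_in)        # 8192
print("factored params:", r * (d_out + d_in))  # 768

# The structure is visible to anyone who observes the factors: the communicated
# update has numerical rank r, which any privacy analysis has to account for.
s = np.linalg.svd(delta_W, compute_uv=False)
print("effective rank of the update:", int(np.sum(s > 1e-8)))
```

Whether an observer of the factors can reconstruct more or less about the private training data than from a dense gradient is exactly the open question the project raises.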

November 3, 2025

Federated Learning Using Shared Functional Basis Expansions for Privacy-Preserving Classification

Motivation Federated Learning (FL) allows collaborative model training across decentralized clients without sharing raw data. However, high-dimensional or continuous data such as time series, longitudinal measurements, or signals (e.g., EEG, proteomics) introduce computational and privacy challenges in FL settings. Functional Data Analysis (FDA) provides a natural framework for representing such data through basis expansions (e.g., B-splines, Fourier), reducing the dimensionality and capturing the underlying structure. This project proposes using a shared functional basis across clients, so that each participant projects its data locally into coefficient space and the federated model is trained on these low-dimensional representations. ...
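
A minimal sketch of the client-side projection step, assuming curves sampled on a common grid and a truncated Fourier basis (the basis family, its size, and the helper names fourier_basis and project are assumptions made for illustration): each client fits its curves to the shared basis by least squares and keeps only the coefficients.

```python
import numpy as np

def fourier_basis(t, n_basis):
    """Shared basis evaluated on the sampling grid t: constant term plus sin/cos pairs."""
    cols = [np.ones_like(t)]
    for k in range(1, n_basis // 2 + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    return np.column_stack(cols)[:, :n_basis]    # shape (len(t), n_basis)

def project(curves, basis):
    """Least-squares coefficients of each curve (one per row) w.r.t. the shared basis."""
    coeffs, *_ = np.linalg.lstsq(basis, curves.T, rcond=None)
    return coeffs.T                              # shape (n_curves, n_basis)

# Toy client data: 5 noisy curves sampled at 200 time points on [0, 1].
t = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(1)
curves = np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(5, t.size))

basis = fourier_basis(t, n_basis=7)              # identical basis on every client
coeffs = project(curves, basis)                  # low-dimensional local representation
print(curves.shape, "->", coeffs.shape)          # (5, 200) -> (5, 7)
```

Only the coefficient matrix (here 5 x 7 instead of 5 x 200) would enter the federated training step; a B-spline design matrix could be substituted for the Fourier basis without changing the rest of the sketch.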

July 2, 2025