BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.15.15//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://isdm.umontpellier.fr
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20240331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20241027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20250527T160000
DTEND;TZID=Europe/Paris:20250527T170000
DTSTAMP:20260502T012431Z
CREATED:20250521T131218Z
LAST-MODIFIED:20250521T131218Z
UID:6930-1748361600-1748365200@isdm.umontpellier.fr
SUMMARY:Vertical Federated Learning with Missing Features During Training and Inference
DESCRIPTION:Machine Learning in Montpellier\, Theory & Practice – Pedro Valdeira \nVertical federated learning trains models from feature-partitioned datasets across multiple clients\, who collaborate without sharing their local data. Standard approaches assume that all feature partitions are available during both training and inference. Yet\, in practice\, this assumption rarely holds\, as for many samples only a subset of the clients observe their partition. Not utilizing incomplete samples during training harms generalization\, and not supporting them during inference limits the utility of the model. Moreover\, if any client leaves the federation after training\, its partition becomes unavailable\, rendering the learned model unusable. Missing feature blocks are therefore a key challenge limiting the applicability of vertical federated learning in real-world scenarios. To address this\, we propose LASER-VFL\, a vertical federated learning method for efficient training and inference of split neural network-based models that is capable of handling arbitrary sets of partitions. Our approach is simple yet effective\, relying on the sharing of model parameters and on task-sampling to train a family of predictors. We show that LASER-VFL achieves a convergence rate for nonconvex objectives and\, under the Polyak-Łojasiewicz inequality\, it achieves linear convergence to a neighborhood of the optimum. Numerical experiments show improved performance of LASER-VFL over the baselines. Remarkably\, this is the case even in the absence of missing features. For example\, for CIFAR-100\, we see an improvement in accuracy of % when each of four feature blocks is observed with a probability of 0.5 and of % when all features are observed. The code for this work is available at https://github.com/Valdeira/LASER-VFL. \nVideo conference\nLearn more
URL:https://isdm.umontpellier.fr/event/vertical-federated-learning-with-missing-features-during-training-and-inference/
CATEGORIES:Seminar
ATTACH;FMTTYPE=image/jpeg:https://isdm.umontpellier.fr/wp-content/uploads/2025/02/ml-mpt-1.jpg
END:VEVENT
END:VCALENDAR