Zero-LEAD: Source-Free Universal Domain Adaptation for Abdominal Multi-Organ Segmentation
Ahmed El-Sayed · Marwan Torki
Abstract
Cross-modality medical image segmentation is critical for diagnosis and treatment planning, yet domain shifts and source data restrictions pose significant challenges. This paper introduces Zero-LEAD, the first unified framework for source-free universal domain adaptation (SF-UniDA) in segmentation, addressing all four UDA scenarios (closed-set, partial-set, open-set, and universal-set) without access to source data. Zero-LEAD integrates (1) Label-Efficient Adaptive Decomposition (LEAD), which decomposes features into source-known and source-unknown components, and (2) a zero-shot segmentation module that leverages anatomical priors and semantic attributes to segment novel target classes. Extensive experiments across four datasets (Synapse, CHAOS, BTCV, and FLARE22) demonstrate strong performance across all adaptation settings. Zero-LEAD achieves 0.9159 Dice in closed-set (Synapse$\rightarrow$CHAOS), 0.8721 Dice in partial-set (BTCV$\rightarrow$CHAOS), 0.7801 Dice in open-set (Synapse$\rightarrow$BTCV), and 0.7716/0.6866 Dice in universal-set (BTCV$\leftrightarrow$FLARE22), significantly outperforming state-of-the-art baselines. Ablation studies confirm the complementary contributions of the LEAD and zero-shot modules, and qualitative analysis highlights improved boundary precision and robustness under both domain and label shifts.
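To make the two components named in the abstract concrete, the following is a minimal conceptual sketch (not the authors' implementation) of a LEAD-style decomposition of encoder features into source-known and source-unknown parts, plus a zero-shot head that scores the unknown component against semantic attribute prototypes. All module and variable names (FeatureDecomposer, ZeroShotHead, attr_prototypes) are illustrative assumptions, as are the 1x1-convolution projections and cosine-similarity scoring.

```python
# Conceptual sketch only: assumed module names and design choices, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureDecomposer(nn.Module):
    """Split per-pixel features into source-known and source-unknown components."""

    def __init__(self, dim: int):
        super().__init__()
        self.known_proj = nn.Conv2d(dim, dim, kernel_size=1)
        self.unknown_proj = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, feats: torch.Tensor):
        known = self.known_proj(feats)      # component aligned with the source label space
        unknown = self.unknown_proj(feats)  # residual component, candidate novel content
        return known, unknown


class ZeroShotHead(nn.Module):
    """Score pixels of the unknown component against semantic attribute prototypes."""

    def __init__(self, attr_prototypes: torch.Tensor):
        super().__init__()
        # attr_prototypes: (num_novel_classes, dim), e.g. built from anatomical priors
        self.register_buffer("prototypes", F.normalize(attr_prototypes, dim=1))

    def forward(self, unknown_feats: torch.Tensor):
        b, c, h, w = unknown_feats.shape
        pix = F.normalize(unknown_feats.flatten(2).transpose(1, 2), dim=-1)  # (B, HW, C)
        logits = pix @ self.prototypes.t()                                   # (B, HW, K)
        return logits.transpose(1, 2).reshape(b, -1, h, w)


# Toy usage: 8-channel features and two hypothetical novel-organ prototypes.
feats = torch.randn(1, 8, 16, 16)
decomposer = FeatureDecomposer(dim=8)
zs_head = ZeroShotHead(attr_prototypes=torch.randn(2, 8))
known, unknown = decomposer(feats)
novel_logits = zs_head(unknown)
print(known.shape, novel_logits.shape)  # (1, 8, 16, 16) and (1, 2, 16, 16)
```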