HodgeCover solves the triplet-merge problem in sparse MoE compression
A new learning-free compression method solves the triplet-merge problem in sparse Mixture-of-Experts by treating expert compatibility as a topological structure and isolating harmonic barriers.
HodgeCover, a learning-free compression technique from a May 2026 arXiv preprint, addresses a structural blind spot in existing sparse Mixture-of-Experts compressors. The method treats expert compatibility as a 2-complex—vertices are experts, edges carry KL merge barriers, triangles carry triplet barriers—and applies Hodge decomposition to isolate the harmonic kernel, the exact mathematical object that blocks naive pairwise merging. Three experts can be compatible in every pair yet form an irreducible cycle when all three are merged at once; every prior compressor that ranks experts on pairwise signals alone misses these triplet conflicts.
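The triplet obstruction has a concrete topological signature. Here is a minimal sketch, not from the preprint, of how a harmonic cycle can be detected on a toy 2-complex of three experts: the harmonic space is the kernel of the Hodge 1-Laplacian L1 = B1ᵀB1 + B2B2ᵀ, where B1 and B2 are the edge and triangle boundary matrices. Excluding the triangle (a high triplet barrier, in this illustration) leaves an irreducible cycle; including it kills the obstruction.

```python
import numpy as np

# Toy 2-complex: three experts (vertices) whose pairwise merge
# edges form a cycle. Whether the triangle is "filled" stands in
# for whether the triplet barrier admits the 2-simplex.
# All matrices here are illustrative, not taken from the preprint.

B1 = np.array([[-1,  0, -1],   # vertex-edge incidence (edge boundaries)
               [ 1, -1,  0],
               [ 0,  1,  1]], dtype=float)
B2 = np.array([[ 1],           # edge-triangle incidence (boundary of
               [ 1],           # the single triangle, when included)
               [-1]], dtype=float)

def harmonic_dim(B1, B2=None):
    """Dimension of ker(L1): the harmonic 1-cycles of the complex."""
    L1 = B1.T @ B1
    if B2 is not None:
        L1 = L1 + B2 @ B2.T
    return L1.shape[0] - np.linalg.matrix_rank(L1)

# Triangle excluded (high triplet barrier): the 3-cycle is harmonic.
print(harmonic_dim(B1))       # 1 -> an irreducible merge obstruction
# Triangle filled (low triplet barrier): the cycle is contractible.
print(harmonic_dim(B1, B2))   # 0 -> the triplet can be merged
```

A pairwise-only selector inspects only the edges of this complex and never sees the kernel of L1, which is exactly why it cannot distinguish the two cases above.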
Tested on three open-weight sparse MoE backbones under aggressive expert reduction, HodgeCover matches state-of-the-art learning-free baselines on the expert-reduction axis and leads on the aggressive-compression frontier when paired with off-the-shelf weight pruning. A hybrid variant combines the topological selector with weight pruning applied to the surviving experts, and is the only method reported to balance retained mass across all four Hodge components. The preprint does not publish absolute perplexity numbers or name the three backbones tested, but the results demonstrate that exposing the harmonic kernel of a learned MoE structure changes which compressor wins in the aggressive-compression regime.
