
Learning Generalizable 3D Reassembly
for Real-World Fractures

1New York University 2Yale University 3Max Planck Institute
*, † Equal Contribution, ✉ Corresponding Author

GARF is a generalizable flow matching-based 3D reassembly method trained on 1.9 million fractures, enabling precise real-world fragment pose alignment. It achieves strong performance across extensive benchmarks, with concise code and efficient inference.

Bowl
Plate
Pot
Vase
Vase

Note: All fragments shown above are obtained from real 3D scans. Even when presented with unfamiliar fragments or missing parts, GARF successfully assembles them and accurately aligns their poses. Try rotating and zooming in on the 3D models to see how seamlessly the pieces fit together!

Abstract

3D reassembly is a challenging spatial intelligence task with broad applications across scientific domains. While large-scale synthetic datasets have fueled promising learning-based approaches, their generalizability to different domains is limited. Critically, it remains uncertain whether models trained on synthetic datasets can generalize to real-world fractures, where breakage patterns are more complex. To bridge this gap, we propose GARF, a generalizable 3D reassembly framework for real-world fractures. GARF leverages fracture-aware pretraining to learn fracture features from individual fragments, with flow matching enabling precise 6-DoF alignments. At inference time, we introduce one-step preassembly, improving robustness to unseen objects and varying numbers of fractures. In collaboration with archaeologists, paleoanthropologists, and ornithologists, we curate FRACTURA, a diverse dataset for the vision and learning communities, featuring real-world fracture types across ceramics, bones, eggshells, and lithics. Comprehensive experiments show that our approach consistently outperforms state-of-the-art methods on both synthetic and real-world datasets, achieving 82.87% lower rotation error and 25.15% higher part accuracy. This sheds light on training on synthetic data to advance real-world 3D puzzle solving, demonstrating strong generalization across unseen object shapes and diverse fracture types.

Methodology

Overview Image

Overall illustration of GARF. Our framework comprises two main components: (i) fracture-aware pretraining, which leverages 14x more data than previous methods to learn local fracture features via fracture-point segmentation, and (ii) flow-based reassembly on SE(3), which leverages the SO(3) manifold for precise rotation estimation. At inference time, a one-step preassembly strategy provides better initial poses, enhancing robustness to unseen objects and increasing numbers of fractures.
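To make the inference procedure concrete, below is a minimal sketch of flow-based pose integration on SE(3), with rotations updated on the SO(3) manifold via the exponential map and a single large Euler step serving as the one-step preassembly. This is an illustrative simplification, not GARF's actual implementation: the `velocity_field` here stands in for the learned network, and all function names and step counts are assumptions.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: map an axis-angle vector (so(3)) to a rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def integrate_pose_flow(velocity_field, R0, t0, n_steps=20):
    """Euler-integrate a velocity field over flow time [0, 1] on SE(3).

    velocity_field(R, t, time) -> (omega, v): angular velocity (3-vector in
    so(3)) and translational velocity. Each rotation update is retracted back
    onto SO(3) through the exponential map, so R stays a valid rotation.
    """
    R, t = R0.copy(), t0.copy()
    dt = 1.0 / n_steps
    for step in range(n_steps):
        omega, v = velocity_field(R, t, step * dt)
        R = so3_exp(omega * dt) @ R
        t = t + v * dt
    return R, t

def one_step_preassembly(velocity_field, R0, t0):
    """A single Euler step with dt = 1: a coarse pose used to initialize
    the full integration (the one-step preassembly idea, sketched)."""
    omega, v = velocity_field(R0, t0, 0.0)
    return so3_exp(omega) @ R0, t0 + v

# Toy usage: a constant field rotating 90 degrees about z while translating.
def toy_field(R, t, time):
    return np.array([0.0, 0.0, np.pi / 2]), np.array([1.0, 0.0, 0.0])

R, t = integrate_pose_flow(toy_field, np.eye(3), np.zeros(3), n_steps=10)
```

Retracting each tangent-space step through `so3_exp` is what keeps the estimate on the rotation manifold; naively adding velocity to a rotation matrix would drift off SO(3).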

Assembly Results

Explore our 3D reassembly results interactively. The models below demonstrate how GARF accurately aligns fragments across different material categories. You can rotate, zoom, and inspect the reassembled objects from any angle.

Breaking Bad Dataset

Everyday
Everyday
Everyday
Everyday
Artifact
Artifact
Artifact
Artifact

FRACTURA Dataset (Synthetic Fracture Subset)

Following the simulated-fracture procedure used in Breaking Bad, we also apply simulated breakage to the complete objects in our FRACTURA dataset.

Fantastic Break Dataset

Although the objects in the Fantastic Break dataset break into only two pieces, the challenge lies in the fact that the fracture surfaces are obtained from real scans, not simulations.

BibTeX

@article{Li2025GARF,
 title={GARF: Learning Generalizable 3D Reassembly for Real-World Fractures},
 author={Sihang Li and Zeyu Jiang and Grace Chen and Chenyang Xu and Siqi Tan and Xue Wang and Irving Fang and Kristof Zyskowski and Shannon McPherron and Radu Iovita and Chen Feng and Jing Zhang},
 year={2025},
 journal={arXiv preprint arXiv:2412.00138},
}

Acknowledgements

We gratefully acknowledge the Physical Anthropology Unit, Universidad Complutense de Madrid for providing access to the human skeletons under their curation. This work was supported in part through NSF grants 2152565, 2238968, 2322242, and 2426993, and the NYU IT High Performance Computing resources, services, and staff expertise.