Journal of Prosthodontics News
January 13, 2026
Automated reconstruction of missing tooth morphology using a transformer-based implicit neural network
Now online in the Journal of Prosthodontics: a report on the development and evaluation of a novel transformer-based model for automated tooth morphology reconstruction, co-authored by ACP Fellow Wei-Shao Lin, DDS, PhD, MBA, FACP.
In this study, the implicit neural network (INN) framework was augmented with a transformer-based self-structure enhancement module tailored to the missing tooth region, and 2D depth maps were incorporated as a complementary feature modality to enrich geometric detail and improve reconstruction accuracy across different tooth positions.
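To illustrate the general idea, the sketch below shows one plausible way an implicit neural network could be conditioned on transformer-attended context from the surrounding dentition and on pooled depth-map features before predicting occupancy at sampled 3D points. This is a minimal, hypothetical example, not the authors' implementation; the module names, feature dimensions, and pooling choices are assumptions for illustration only.

```python
# Hypothetical sketch (not the published model): an implicit decoder that
# predicts occupancy for 3D query points, conditioned on transformer-attended
# context tokens from neighboring teeth and on pooled 2D depth-map features.
import torch
import torch.nn as nn

class ImplicitCrownDecoder(nn.Module):
    def __init__(self, ctx_dim=256, depth_dim=128, hidden=256):
        super().__init__()
        # Self-attention over geometric tokens from the adjacent teeth
        # (a simplified stand-in for a self-structure enhancement module).
        self.attn = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=ctx_dim, nhead=8, batch_first=True),
            num_layers=2,
        )
        # MLP mapping a 3D query point plus conditioning features to an
        # occupancy probability (inside/outside the reconstructed crown).
        self.mlp = nn.Sequential(
            nn.Linear(3 + ctx_dim + depth_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, points, ctx_tokens, depth_feat):
        # points:     (B, N, 3)        sampled 3D query coordinates
        # ctx_tokens: (B, T, ctx_dim)  tokens from neighboring-tooth geometry
        # depth_feat: (B, depth_dim)   pooled features from 2D depth maps
        ctx = self.attn(ctx_tokens).mean(dim=1)            # (B, ctx_dim)
        cond = torch.cat([ctx, depth_feat], dim=-1)        # (B, ctx_dim+depth_dim)
        cond = cond.unsqueeze(1).expand(-1, points.size(1), -1)
        logits = self.mlp(torch.cat([points, cond], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)           # (B, N) occupancy
```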
Digital full-arch casts with intact target and adjacent teeth were collected, comprising 500 first molars, 600 first premolars, and 700 central incisors after data augmentation. The transformer-based INN model was trained with either 12,000 or 50,000 sampling points and, after training and evaluation, was used to generate crown designs. Generated crowns (GC) were compared with original crowns (OC) and technician-designed crowns (TC) in terms of 3D morphological deviations.
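For readers unfamiliar with this type of evaluation, the snippet below sketches one common way to quantify 3D morphological deviation between a generated crown and a reference crown: sample points on the generated surface and measure their distances to the closest points on the reference mesh. The file names, sample count, and choice of metrics are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch of a surface-deviation comparison between a generated
# crown (GC) and a reference crown (OC or TC), using the trimesh library.
import numpy as np
import trimesh

def surface_deviation(generated_path, reference_path, n_samples=50_000):
    gen = trimesh.load(generated_path)
    ref = trimesh.load(reference_path)
    # Sample points uniformly over the generated crown surface.
    pts, _ = trimesh.sample.sample_surface(gen, n_samples)
    # Unsigned distance from each sample to the closest point on the reference.
    _, dists, _ = trimesh.proximity.closest_point(ref, pts)
    return {
        "mean_deviation_mm": float(np.mean(dists)),
        "rms_deviation_mm": float(np.sqrt(np.mean(dists ** 2))),
        "max_deviation_mm": float(np.max(dists)),
    }

# Example usage (paths are placeholders):
# print(surface_deviation("gc_molar_01.stl", "oc_molar_01.stl"))
```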
The model trained with 50,000 sampling points exhibited superior reconstruction performance, with high similarity to natural tooth morphology. By integrating multi-view 2D depth maps and transformer-based attention mechanisms, the model achieved efficient training, improved reconstruction accuracy, and consistent performance across anterior and posterior tooth positions. While the results support the model's feasibility for clinical application, further studies with more diverse dentition cases and functional criteria are warranted to ensure reliability in complex clinical scenarios.
Wang Y, Shi Y, Li N, Lin W-S, Tan J, Chen L. Automated reconstruction of missing tooth morphology using a transformer-based implicit neural network: A multi-tooth position evaluation. J Prosthodont. 2025; 1–10. https://doi.org/10.1111/jopr.70084