Few-shot font generation via denoising diffusion and component-level fine-grained style
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | en |
| Published: | Elsevier, 2025 |
| Subjects: | |
| Online Access: | http://psasir.upm.edu.my/id/eprint/122473/1/122473.pdf http://psasir.upm.edu.my/id/eprint/122473/3/122473-Accepted.pdf http://psasir.upm.edu.my/id/eprint/122473/ https://linkinghub.elsevier.com/retrieve/pii/S0957417425026041 |
| Summary: | Few-shot font generation aims to create new fonts from only a small number of style examples, and it has attracted increasing attention because it greatly reduces the labor cost of font design. Existing methods rely on GAN-based image-to-image style-transfer frameworks, which are prone to training collapse and struggle to keep character content consistent with style. Moreover, they capture only the global style while overlooking fine-grained features of radicals, components, and strokes. To address these challenges, we propose a diffusion-model-based image-to-image font generation method. We explicitly model the component-level styles shared between content glyphs and reference glyphs, assigning appropriate fine-grained styles to content glyphs through a multi-character style aggregation module. Additionally, to better preserve the integrity of character structures during the iterative denoising process, we propose an offset-enhanced multi-head attention mechanism that adaptively samples multi-scale glyph content features and embeds them into the diffusion model. Comprehensive experiments demonstrate that our method outperforms existing font generation methods both qualitatively and quantitatively. |
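
The multi-character style aggregation module is described only at the level of the abstract. Below is a minimal PyTorch sketch of one plausible reading, in which component-level tokens of the content glyph cross-attend over tokens pooled from K style reference glyphs; the class name, tensor shapes, and residual fusion are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class StyleAggregator(nn.Module):
    """Cross-attend content-component tokens over tokens from K style references
    (a sketch of a multi-character style aggregation module, not the paper's code)."""
    def __init__(self, dim: int = 256, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, content_feats: torch.Tensor, ref_feats: torch.Tensor) -> torch.Tensor:
        # content_feats: (B, Nc, D) component-level tokens of the content glyph
        # ref_feats:     (B, K*Nr, D) component tokens from K reference glyphs, flattened
        style, _ = self.attn(content_feats, ref_feats, ref_feats)
        # Residual fusion: each content component keeps its structure while
        # absorbing the style of its best-matching reference components.
        return self.norm(content_feats + style)

# Hypothetical usage: 6 content components, K=4 references with 6 components each.
agg = StyleAggregator()
out = agg(torch.randn(2, 6, 256), torch.randn(2, 4 * 6, 256))  # -> (2, 6, 256)
```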

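Similarly, the record specifies the offset-enhanced multi-head attention only as a mechanism that "adaptively samples multi-scale glyph content features". The sketch below assumes a deformable-attention-style design, reduced to a single head and a single feature scale for brevity; the offset scale factor and all names are hypothetical rather than taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OffsetEnhancedAttention(nn.Module):
    """For each query token, predict 2-D offsets around a reference location,
    bilinearly sample the content feature map there, and fuse the samples with
    learned per-point weights (single head and single scale shown)."""
    def __init__(self, dim: int = 256, points: int = 4):
        super().__init__()
        self.points = points
        self.offset_head = nn.Linear(dim, 2 * points)  # (dx, dy) per sampling point
        self.weight_head = nn.Linear(dim, points)      # attention weight per point
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, queries, ref_points, content_map):
        # queries:     (B, N, D) denoiser tokens at the current diffusion step
        # ref_points:  (B, N, 2) base locations in [-1, 1] grid coordinates
        # content_map: (B, D, H, W) encoded content-glyph features
        B, N, D = queries.shape
        offsets = self.offset_head(queries).view(B, N, self.points, 2).tanh()
        # 0.1 bounds the sampling neighborhood; the value is an assumption.
        grid = (ref_points.unsqueeze(2) + 0.1 * offsets).clamp(-1.0, 1.0)
        # grid_sample with a (B, N, P, 2) grid returns (B, D, N, P).
        sampled = F.grid_sample(content_map, grid, align_corners=False)
        sampled = sampled.permute(0, 2, 3, 1)           # (B, N, P, D)
        w = self.weight_head(queries).softmax(dim=-1)   # (B, N, P)
        fused = (sampled * w.unsqueeze(-1)).sum(dim=2)  # (B, N, D)
        return queries + self.out_proj(fused)           # residual content injection
```

In this reading, injecting the sampled content features residually at each denoising step is what preserves character structure while the style is imposed elsewhere; a multi-scale variant would repeat the sampling over a feature pyramid and sum the fused outputs.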