Floral Scarf Pattern Generation Method Based on Generative Adversarial Networks and Stable Diffusion Models

CLC number: TS106; TP18    Document code: A    Article ID: 1673-3851(2025)07-0556-15
Citation format: ,. Floral scarf pattern generation method based on generative adversarial networks and stable diffusion models[J]. Journal (Natural Sciences), 2025, 53(4): 556-570.
Abstract: Taking floral scarf patterns as the research object, this study proposed a dual-stage collaborative generation method combining generative adversarial networks (GANs) and stable diffusion models for rapid scarf pattern generation. First, we constructed an SDXL model-based scarf pattern augmentation workflow, establishing a floral scarf pattern dataset through systematic pattern collection, preprocessing, and data augmentation. Subsequently, in the first stage of pattern generation, we improved conventional GANs by integrating both self-attention and border-attention mechanisms into the StyleGAN framework, developing the SAB-StyleGAN model to generate base floral scarf patterns. Finally, in the second stage of pattern generation, we built an image-to-image workflow based on the SDXL model, effectively grafting the detailed rendering capabilities of stable diffusion models onto GANs to produce refined floral scarf patterns with enhanced controllability and precision. Experimental results demonstrated that the generated refined floral scarf patterns exhibited superior clarity, achieving an FID value as low as 41.25, and closely resembled authentic designer samples. This method provides an efficient solution for rapid scarf pattern generation, significantly reducing enterprise design costs, enhancing production efficiency, and advancing digital transformation in the fashion industry.
Key words: silk scarf pattern; pattern generation method; generative adversarial networks (GANs); stable diffusion models; image-to-image translation; data augmentation
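The dual-stage workflow summarized in the abstract composes a GAN stage (base pattern) with a diffusion stage (detail refinement). A minimal structural sketch is given below; the two stage implementations are illustrative stand-ins only, since the paper's actual SAB-StyleGAN and SDXL image-to-image models are not shown in this excerpt.

```python
from dataclasses import dataclass
from typing import Callable, List

# Placeholder type: a grayscale pattern represented as a 2-D grid of pixel values.
Image = List[List[int]]

@dataclass
class TwoStagePipeline:
    """Two-stage scarf-pattern generation: GAN base pattern, then diffusion refinement."""
    generate_base: Callable[[int], Image]  # stage 1: SAB-StyleGAN (stand-in here)
    refine: Callable[[Image], Image]       # stage 2: SDXL image-to-image (stand-in here)

    def run(self, seed: int) -> Image:
        base = self.generate_base(seed)  # coarse floral layout from the GAN stage
        return self.refine(base)         # detail rendering by the diffusion stage

# Hypothetical stand-in stages, for illustration only.
def fake_gan(seed: int) -> Image:
    return [[(seed + r + c) % 256 for c in range(4)] for r in range(4)]

def fake_refine(img: Image) -> Image:
    return [[min(255, v + 1) for v in row] for row in img]

pipeline = TwoStagePipeline(generate_base=fake_gan, refine=fake_refine)
result = pipeline.run(seed=7)
```

The point of the composition is that the GAN output becomes the conditioning image for the diffusion stage, which is what the abstract calls "grafting" the diffusion model's rendering capability onto the GAN.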
0 Introduction
As a classic accessory, the silk scarf holds an important place in the fashion world.