Camouflage pattern adversarial generation model using color and semantic constraints
11 July 2024
Hao Zhang, Zhenping Xie
Abstract
To address the challenge that traditional camouflage design methods struggle to evade detection by modern unmanned aerial reconnaissance, we propose an adversarial camouflage pattern generation network model constrained by color and semantics. A reference image generation model using color-semantic constraints is established: guided by a sentence encoding model, it generates reference images with fundamental texture and color features by incorporating adversarial loss, texture loss, and pixel-level loss. We further design a color standardization processing strategy based on the SimCLR framework. With respect to the reference images, this model generates semantic camouflage images in batches through tailored data augmentation strategies, positive-negative sample similarity measurement strategies, and a sample structural similarity algorithm. Qualitative and quantitative experimental results demonstrate that the proposed method achieves strong camouflage performance across different environmental settings.
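As a minimal sketch (not the authors' code), the snippet below illustrates the two loss families named in the abstract: a composite generator objective combining adversarial, texture, and pixel-level terms, and a SimCLR-style positive-negative similarity measure (NT-Xent). The weighting factors lambda_tex and lambda_pix, the temperature, and the Gram-matrix texture term are assumptions introduced for illustration only.

```python
import torch
import torch.nn.functional as F

def generator_loss(d_fake_logits, fake_feats, real_feats, fake_img, real_img,
                   lambda_tex=1.0, lambda_pix=10.0):
    """Illustrative composite objective for the reference-image generator:
    adversarial term + texture (Gram-matrix) term + pixel-level L1 term.
    Loss weights are hypothetical, not taken from the paper."""
    # Adversarial loss: non-saturating GAN loss on the discriminator's logits.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))

    # Texture loss: match Gram matrices of intermediate feature maps.
    def gram(feat):                       # feat: (B, C, H, W)
        b, c, h, w = feat.shape
        f = feat.reshape(b, c, h * w)
        return f @ f.transpose(1, 2) / (c * h * w)
    tex = F.mse_loss(gram(fake_feats), gram(real_feats))

    # Pixel-level loss: L1 distance between generated and reference images.
    pix = F.l1_loss(fake_img, real_img)

    return adv + lambda_tex * tex + lambda_pix * pix


def nt_xent(z1, z2, temperature=0.5):
    """SimCLR-style positive/negative similarity measure between embeddings
    z1, z2 of two augmented views of the same batch of camouflage patches."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                    # (2N, D)
    sim = z @ z.t() / temperature                     # scaled cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))        # drop self-similarity
    # The positive of sample i is its other augmented view at index (i + n) mod 2n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```

In the same spirit, the paper's sample structural similarity algorithm could be approximated with an SSIM score between a generated pattern and its reference image; the exact formulation used by the authors is not given in the abstract.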
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Hao Zhang and Zhenping Xie "Camouflage pattern adversarial generation model using color and semantic constraints", Proc. SPIE 13210, Third International Symposium on Computer Applications and Information Systems (ISCAIS 2024), 132101X (11 July 2024); https://doi.org/10.1117/12.3034945
KEYWORDS
Camouflage
Image processing
Design
Education and training
Semantics
Generative adversarial networks
Visualization