Publication page

Object Texture Transform Based on Improved CycleGAN

Authors: Zhang L., Pan B., Cherkashin E., Zhang X., Ruan X., Li Q.

Journal: Communications in Computer and Information Science

Volume: 1335

Issue:

Year: 2020

Reporting year: 2020

Publisher:

Publisher location:

URL:

Projects:

DOI: 10.1007/978-981-33-4929-2_6

Abstract: Image style transfer is an image-to-image translation method that is widely used in computer vision and remains an important research direction. When applied to object deformation, traditional algorithms suffer from imprecise localization and difficulty separating the object from the background. This paper proposes an improved cycle-consistent generative adversarial network (CycleGAN). The model adds a self-attention layer to the discriminator, using cues from all feature locations to establish global dependence, capture the global geometric features of the image, and model long-range correlations. Compared with the original CycleGAN algorithm, the results show that the proposed algorithm learns to generate style-transferred images with clear texture and full rendering, with no noticeable loss of detail. Compared with the original algorithm, convergence during training is clearly faster, the oscillation amplitude of the loss function is smaller, and the number of steps required for the network to converge is reduced. During testing, the algorithm stylizes 1163 images, and the time consumed per generation increases by only 17 s.
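
The abstract describes the architectural change only at a high level. The sketch below is a minimal, hypothetical PyTorch illustration of the idea: a SAGAN-style self-attention block inserted into a 70x70 PatchGAN discriminator of the kind used in CycleGAN. The layer placement, channel sizes, and module names are assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Self-attention over spatial positions: each location attends to all feature locations."""

    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned weight of the attention branch

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, HW, C//8)
        k = self.key(x).flatten(2)                      # (B, C//8, HW)
        attn = torch.softmax(q @ k, dim=-1)             # (B, HW, HW): global dependence map
        v = self.value(x).flatten(2)                    # (B, C, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                     # residual: start from local features


class Discriminator(nn.Module):
    """70x70 PatchGAN discriminator with one self-attention layer inserted (assumed placement)."""

    def __init__(self, in_channels=3, base=64):
        super().__init__()

        def block(cin, cout, norm=True):
            layers = [nn.Conv2d(cin, cout, 4, stride=2, padding=1)]
            if norm:
                layers.append(nn.InstanceNorm2d(cout))
            layers.append(nn.LeakyReLU(0.2, inplace=True))
            return layers

        self.model = nn.Sequential(
            *block(in_channels, base, norm=False),
            *block(base, base * 2),
            SelfAttention(base * 2),        # hypothetical insertion point of the attention layer
            *block(base * 2, base * 4),
            nn.Conv2d(base * 4, base * 8, 4, stride=1, padding=1),
            nn.InstanceNorm2d(base * 8),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base * 8, 1, 4, stride=1, padding=1),  # patch-wise real/fake map
        )

    def forward(self, x):
        return self.model(x)


if __name__ == "__main__":
    d = Discriminator()
    print(d(torch.randn(1, 3, 256, 256)).shape)  # torch.Size([1, 1, 30, 30])
```

In this sketch the attention output is mixed back into the convolutional features through the learned scalar gamma, initialized to zero, so the discriminator starts from purely local PatchGAN behaviour and gradually incorporates global context during training.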

Indexed in WOS: No

Indexed in Scopus: No

Indexed in УБС: No

Indexed in РИНЦ: Yes

Indexed in ВАК: No

Indexed in CORE: No

Publication in press: 0