Conference Paper, Year: 2022

Improved Transformer-Based Implicit Latent GAN with Multi-headed Self-attention for Unconditional Text Generation

Ziyun Jiao
  • Function: Author
  • PersonId: 1405994
Xin Kang
  • Function: Author
  • PersonId: 1275130

Abstract

Generative Adversarial Networks (GANs) are widely used in computer vision for tasks such as image generation. In recent years, GANs have also been applied to unconditional text generation. In this work, we improve TILGAN for unconditional text generation by refactoring the generator: we replace its Linear and BatchNorm layers with multi-headed self-attention to give the generator stronger text generation capabilities. Our model consists of three components: a transformer autoencoder, a multi-headed self-attention based generator, and a linear discriminator. The encoder of the transformer autoencoder produces the distribution of real samples, and the decoder decodes real or generated sentence vectors into text. The loss functions for the autoencoder and the GAN are cross-entropy and KL divergence, respectively. On the MS COCO dataset, the proposed model achieves a better BLEU score than TILGAN. Our ablation experiments also demonstrate the effectiveness of the proposed generator network for unconditional text generation.
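
The abstract describes a generator in which multi-headed self-attention replaces Linear and BatchNorm layers. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation: the layer names, dimensions, and the noise-to-token reshaping are assumptions made only for illustration.

```python
# Minimal sketch (assumed dimensions and layer names) of a generator that uses
# multi-headed self-attention instead of Linear + BatchNorm layers to map a
# noise vector to a sentence latent vector for the decoder/discriminator.
import torch
import torch.nn as nn

class SelfAttentionGenerator(nn.Module):
    def __init__(self, noise_dim=128, latent_dim=512, num_tokens=4, num_heads=8):
        super().__init__()
        self.num_tokens = num_tokens
        self.latent_dim = latent_dim
        # Project the noise vector into a short sequence of "tokens"
        # so that self-attention can operate over it.
        self.input_proj = nn.Linear(noise_dim, num_tokens * latent_dim)
        self.attention = nn.MultiheadAttention(
            embed_dim=latent_dim, num_heads=num_heads, batch_first=True
        )
        self.norm = nn.LayerNorm(latent_dim)
        # Collapse the attended tokens into a single sentence vector.
        self.output_proj = nn.Linear(num_tokens * latent_dim, latent_dim)

    def forward(self, z):
        # z: (batch, noise_dim) -> tokens: (batch, num_tokens, latent_dim)
        tokens = self.input_proj(z).view(-1, self.num_tokens, self.latent_dim)
        attended, _ = self.attention(tokens, tokens, tokens)
        attended = self.norm(attended + tokens)  # residual connection
        return self.output_proj(attended.flatten(start_dim=1))

# Usage: sample noise and produce fake sentence vectors for the discriminator.
z = torch.randn(16, 128)
fake_latents = SelfAttentionGenerator()(z)  # shape: (16, 512)
```

In this sketch, the generated latent vector would be scored by the linear discriminator against sentence vectors from the transformer encoder, and decoded into text by the transformer decoder.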
Main file: 537972_1_En_18_Chapter.pdf (361.88 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04666458, version 1 (01-08-2024)

Cite

Fuji Ren, Ziyun Jiao, Xin Kang. Improved Transformer-Based Implicit Latent GAN with Multi-headed Self-attention for Unconditional Text Generation. 5th International Conference on Intelligence Science (ICIS), Oct 2022, Xi'an, China. pp.166-173, ⟨10.1007/978-3-031-14903-0_18⟩. ⟨hal-04666458⟩