
NeuralGen 2019

Methods for Optimizing and Evaluating Neural Language Generation

Workshop co-located with NAACL 2019 in Minneapolis!

Email: neuralgen2019@gmail.com

Call For Papers

Advances in training deep neural networks for sequence modeling have led to active and rising interest in language generation research. New approaches based on deep learning have allowed the community to achieve impressive results on numerous tasks such as machine translation [1], summarization [2], program generation [3], and more [4,5]. However, prevailing neural methods remain imperfect. Conditional language modeling losses are unable to capture global context [6]. Automatic evaluation metrics correlate poorly with human judgements, biasing development away from the model behavior we actually want [7], while high-quality human evaluation remains costly. Effective transfer across tasks, languages, and domains holds promise [8, 9], but remains only preliminarily explored.
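
For concreteness, the conditional language modeling loss referenced above is the standard teacher-forcing maximum-likelihood objective; a sketch in standard notation, where x is the conditioning input and y_1, ..., y_T the target sequence:

    \mathcal{L}_{\mathrm{MLE}}(\theta) = -\sum_{t=1}^{T} \log p_\theta(y_t \mid y_{<t}, x)

Because this loss decomposes into per-token terms, each conditioned on the gold prefix, it provides no direct training signal about sequence-level properties such as global coherence.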

We invite members of the community to participate in a discussion on addressing these issues and to contribute work that explores the next frontiers in neural language generation. We propose this workshop to explore commonalities and differences among the various approaches to these issues, and we are excited to receive submissions on the following topics:

Modeling Advancements:
  • Beyond maximum likelihood training (e.g., risk loss, reinforcement learning objectives, variational approaches, adversarial training, pretrained discriminators, other novel loss functions; see the sketch after this list)
  • Unsupervised, weakly supervised, and semi-supervised language generation
  • Editing models
  • Mixing neural and template-based generation
  • Human-in-the-loop learning
  • Beyond teacher-forcing (beam search during training, non-autoregressive generation)
Evaluation:
  • New automatic metrics for evaluating different characteristics of coherent language
  • Evaluation using pretrained models
  • Proposing better human evaluation strategies
Generalization:
  • Transfer learning (unsupervised pre-training for generation, low-resource generation, domain adaptation)
  • Multi-task learning
  • Model distillation
Analysis:
  • Model analysis, interpretability and/or visualizations
  • Error analysis of machine-generated language
  • Analysis of evaluation metrics
  • Benefits/drawbacks of different loss functions
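
To make the first modeling topic above concrete, here is a minimal sketch of a REINFORCE-style sequence-level objective, one instance of training beyond maximum likelihood. PyTorch is an assumption here (the workshop prescribes no framework), and the function name and tensor shapes are illustrative only:

    import torch

    def reinforce_loss(log_probs, rewards, baseline=None):
        """Sequence-level policy-gradient loss (illustrative, not prescriptive).
        log_probs: (batch, seq_len) log-probabilities of sampled tokens.
        rewards:   (batch,) scalar sequence rewards, e.g. an automatic metric.
        baseline:  optional (batch,) baseline for variance reduction."""
        seq_log_prob = log_probs.sum(dim=1)  # log p(sampled sequence | input)
        b = baseline if baseline is not None else rewards.mean()
        advantage = (rewards - b).detach()   # no gradient through the reward
        return -(advantage * seq_log_prob).mean()

    # Toy usage with random stand-ins for model outputs:
    lp = torch.rand(4, 10, requires_grad=True).log()  # fake per-token log-probs
    r = torch.rand(4)                                 # fake sequence rewards
    reinforce_loss(lp, r).backward()                  # gradients reach lp

Unlike the token-level loss sketched earlier, this objective rewards entire sampled sequences, which is what makes metric-driven or coherence-driven training signals expressible.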

Important Dates:

All deadlines are 11:59 PM GMT-12 (anywhere-on-earth time).

Call For Papers: January 7, 2019
Deadline for submission: March 13, 2019 (extended from March 11, 2019)
Notification of acceptance: March 29, 2019
Deadline for camera-ready version: April 5, 2019
Workshop Date: June 6, 2019

Financial Assistance:

Partial financial assistance will be available to authors of papers who demonstrate significant financial need.

Submission Guidelines:

Submissions may contain between 4 and 8 pages of content, with unlimited references. They will undergo double-blind review and must therefore not identify the authors or their affiliations. Please format your papers using the standard NAACL style files. Submissions of work published or submitted elsewhere are also permitted; already-published work will not undergo review, will not be included in the workshop proceedings, and will be accepted conditioned on its relevance to the goals of the workshop. The presentation format and schedule will be announced before the camera-ready deadline.

Clarification about non-archival submissions: Submissions that authors wish to keep non-archival are also permitted. The submission page will include an option for authors to be excluded from the official workshop proceedings.

Submit via the Softconf START system.

References:

[1] Vaswani, Ashish, et al. "Attention is all you need." In NIPS. 2017.
[2] Celikyilmaz, Asli, et al. "Deep communicating agents for abstractive summarization." In NAACL. 2018.
[3] Ling, Wang, et al. "Latent predictor networks for code generation." In ACL. 2016.
[4] Bosselut, Antoine, et al. "Discourse-Aware Neural Rewards for Coherent Text Generation." In NAACL. 2018.
[5] Wiseman, Sam, Stuart M. Shieber, and Alexander M. Rush. "Challenges in data-to-document generation." In EMNLP. 2017.
[6] Holtzman, Ari, et al. "Learning to Write with Cooperative Discriminators." In ACL. 2018.
[7] Novikova, Jekaterina, et al. "Why We Need New Evaluation Metrics for NLG." In EMNLP. 2017.
[8] Ramachandran, Prajit, et al. "Unsupervised Pretraining for Sequence to Sequence Learning." In EMNLP. 2017.
[9] Gu, Jiatao, et al. "Meta-Learning for Low-Resource Neural Machine Translation." In EMNLP. 2018.