Title: NNLG: Neural Natural Language Generation

In recent years we have seen a shift in the NLP community towards Deep Learning and applications of Neural Networks to downstream tasks, ranging from document classification and sentiment analysis to language modelling, syntactic parsing, and machine translation. Perhaps the architecture most applicable to the field of Natural Language Generation is the class of models referred to as recurrent neural networks (RNNs). Their capacity both to memorize long patterns in the input and to model arbitrarily long histories of generated output, together with the ease of training them without resorting to feature engineering, makes them very successful in sequential tasks, often outperforming traditional non-neural models. In this talk we will examine a few recent successful applications of RNNs to concept-to-text generation, code-to-language generation, storytelling, and generation from meaning representations of language. We will also highlight issues specific to NLG tasks and discuss ways to address them in the future.
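To make the recurrent generation idea concrete, here is a minimal NumPy sketch of a vanilla RNN decoder sampling tokens autoregressively; the hidden state carries the history of everything generated so far. The toy vocabulary, weight initialization, and function names are purely illustrative assumptions, not any specific system discussed in the talk.

    import numpy as np

    # Toy vocabulary; purely illustrative.
    vocab = ["<s>", "the", "cat", "sat", "</s>"]
    V, H, E = len(vocab), 16, 8  # vocab, hidden, embedding sizes

    rng = np.random.default_rng(0)
    Wxh = rng.normal(scale=0.1, size=(H, E))  # input-to-hidden weights
    Whh = rng.normal(scale=0.1, size=(H, H))  # hidden-to-hidden recurrence (the "memory")
    Why = rng.normal(scale=0.1, size=(V, H))  # hidden-to-output weights
    Emb = rng.normal(scale=0.1, size=(V, E))  # token embeddings
    bh, by = np.zeros(H), np.zeros(V)

    def softmax(z):
        z = z - z.max()  # numerical stability
        e = np.exp(z)
        return e / e.sum()

    def generate(max_len=10):
        """Sample autoregressively: each step conditions on the full history via h."""
        h = np.zeros(H)
        tok = vocab.index("<s>")
        out = []
        for _ in range(max_len):
            x = Emb[tok]
            h = np.tanh(Wxh @ x + Whh @ h + bh)  # recurrent state update
            p = softmax(Why @ h + by)            # distribution over next token
            tok = rng.choice(V, p=p)
            if vocab[tok] == "</s>":
                break
            out.append(vocab[tok])
        return " ".join(out)

    print(generate())

With untrained random weights the output is gibberish, but the loop illustrates the key property named above: the fixed-size hidden state h is updated at every step, letting the model condition on an arbitrarily long generation history without hand-crafted features.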