In recent years, we have seen tremendous advances in the field of natural language processing through the use of neural networks. In fact, they have done so well that they have almost succeeded in rewriting the field as we knew it. In this talk, I examine the state of the field and its link to the past, with a focus on language generation in its many forms. I ask where neural networks have been particularly successful, where approaches from the past might still be valuable, and where we need to turn in the future if we are to go beyond our current successes. To answer these questions, this talk will feature clips from a series of interviews I carried out with experts in the field.