Inproceedings

Improving Response Selection in Multi-Turn Dialogue Systems by Incorporating Domain Knowledge

Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 497--507, Brussels, Belgium. Association for Computational Linguistics, October 2018.
DOI: 10.18653/v1/K18-1048

Abstract

Building systems that can communicate with humans is a core problem in Artificial Intelligence. This work proposes a novel neural network architecture for response selection in an end-to-end multi-turn conversational dialogue setting. The architecture applies context-level attention and incorporates additional external knowledge provided by descriptions of domain-specific words. It uses a bi-directional Gated Recurrent Unit (GRU) for encoding context and responses and learns to attend over the context words given the latent response representation and vice versa. In addition, it incorporates external domain-specific information using another GRU for encoding the domain keyword descriptions. This allows a better representation of domain-specific keywords in responses and hence improves the overall performance. Experimental results show that our model outperforms all other state-of-the-art methods for response selection in multi-turn conversations.
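The architecture described in the abstract amounts to three bi-directional GRU encoders (context, response, and domain-keyword descriptions) tied together by cross-attention between context and response. Below is a minimal PyTorch sketch of that idea; the class name, hyperparameters, last-state summaries, mean-pooled keyword encoding, additive fusion, and bilinear scoring head are illustrative assumptions, not the paper's exact formulation.

    # Minimal sketch of the described architecture (assumptions noted above).
    import torch
    import torch.nn as nn


    class KnowledgeAwareResponseSelector(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            # Bi-directional GRUs for context, response, and keyword descriptions
            self.context_gru = nn.GRU(emb_dim, hidden, bidirectional=True, batch_first=True)
            self.response_gru = nn.GRU(emb_dim, hidden, bidirectional=True, batch_first=True)
            self.keyword_gru = nn.GRU(emb_dim, hidden, bidirectional=True, batch_first=True)
            # Bilinear matching score between context and response representations
            self.score = nn.Bilinear(2 * hidden, 2 * hidden, 1)

        @staticmethod
        def attend(query, keys):
            # Attend over `keys` (B, T, 2H) given one query vector per example (B, 2H)
            weights = torch.softmax(torch.bmm(keys, query.unsqueeze(2)).squeeze(2), dim=1)
            return torch.bmm(weights.unsqueeze(1), keys).squeeze(1)  # (B, 2H)

        def forward(self, context_ids, response_ids, keyword_ids):
            ctx_states, _ = self.context_gru(self.embed(context_ids))    # (B, Tc, 2H)
            rsp_states, _ = self.response_gru(self.embed(response_ids))  # (B, Tr, 2H)
            kw_states, _ = self.keyword_gru(self.embed(keyword_ids))     # (B, Tk, 2H)

            # Simplified summaries: last hidden state of each sequence
            ctx_vec = ctx_states[:, -1]
            rsp_vec = rsp_states[:, -1]

            # Cross-attention: attend over context words given the response
            # representation, and over response words given the context.
            ctx_attended = self.attend(rsp_vec, ctx_states)
            rsp_attended = self.attend(ctx_vec, rsp_states)

            # Fold the encoded domain-keyword descriptions into the response
            # representation (simple mean pooling and addition in this sketch).
            kw_vec = kw_states.mean(dim=1)
            rsp_aug = rsp_attended + kw_vec

            return self.score(ctx_attended, rsp_aug).squeeze(1)  # matching logit per pair

The cross-attention mirrors the abstract's description (context words attended given the latent response representation and vice versa); how the keyword encodings are actually fused in the paper may differ from the simple addition used here.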
