This thesis describes in detail how we apply deep neural networks to several natural language processing problems, including Word Sense Disambiguation (WSD), word representation, and knowledge-base completion. We apply bidirectional Long Short-Term Memory (LSTM) networks and self-attention models to the MSH WSD dataset, a benchmark for medical WSD. Experiments show that our models achieve state-of-the-art performance on this dataset. In addition, we provide a comprehensive mathematical analysis of the learning rules of word2vec, the first continuous word representation model, and show that these learning rules have deep connections with competitive learning. Finally, we apply novel algorithms to generate low-dimensional synset embeddings that preserve the semantic relationships in WordNet, one of the most popular linguistic knowledge bases of English. Our synset embedding system is the first that can explicitly and completely preserve the hypernym tree structure in WordNet.
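The competitive-learning flavor of word2vec's update rules mentioned in the abstract can be illustrated with a minimal skip-gram negative-sampling step. This is an illustrative sketch in plain Python, not the thesis's implementation; the function name `sgns_update` and the toy dimensions are assumptions for the example. The positive context vector is pulled toward the center word's vector while sampled noise vectors are pushed away, a winner-take-most dynamic.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sgns_update(center, context, negatives, lr=0.1):
    """One skip-gram negative-sampling step (illustrative sketch).

    center:    input vector of the center word (updated in place)
    context:   output vector of the observed context word (label 1)
    negatives: output vectors of sampled noise words (label 0)
    """
    grad_center = [0.0] * len(center)
    pairs = [(context, 1.0)] + [(n, 0.0) for n in negatives]
    for vec, label in pairs:
        score = sigmoid(sum(c * v for c, v in zip(center, vec)))
        g = lr * (label - score)          # positive pulls, negative pushes
        for i in range(len(center)):
            grad_center[i] += g * vec[i]  # accumulate gradient for center
            vec[i] += g * center[i]       # move the output vector
    for i in range(len(center)):
        center[i] += grad_center[i]       # move the input vector

# Repeated updates increase similarity to the true context word and
# decrease similarity to the noise word.
random.seed(0)
center = [random.uniform(-0.5, 0.5) for _ in range(5)]
context = [random.uniform(-0.5, 0.5) for _ in range(5)]
noise = [random.uniform(-0.5, 0.5) for _ in range(5)]
for _ in range(50):
    sgns_update(center, context, [noise])
```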
Knowledge-Base, Language Understanding, Machine Learning, Natural Language Processing, Word Representation
Date of Defense
July 1, 2020.
A Dissertation submitted to the Department of Mathematics in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
Includes bibliographical references.
Xiuwen Liu, Professor Co-Directing Dissertation; Richard Bertram, Professor Co-Directing Dissertation; Frank Johnson, University Representative; Martin Bauer, Committee Member; Monica Hurdal, Committee Member.
Florida State University
Zhang, C. (2020). Natural Language Processing by Deep Neural Networks. Retrieved from https://purl.lib.fsu.edu/diginole/2020_Summer_Fall_Zhang_fsu_0071E_16000