An Approximate Perspective on Word Prediction in Context: Ontological Semantics meets BERT

Abstract

This paper presents an analysis of a large neural network model, BERT, by placing its word-prediction-in-context capability under the framework of Ontological Semantics. BERT has reportedly performed well in tasks that require semantic competence, despite having no explicit semantic inductive bias. We posit that word prediction in context can be interpreted as the task of inferring the meaning of an unknown word, a practice employed by several papers following the Ontological Semantic Technology (OST) approach to Natural Language Understanding. Using this approach, we deconstruct BERT's output for an example sentence and interpret it using OST's fuzziness-handling mechanisms, revealing the degree to which each output satisfies the sentence's constraints.
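The capability under analysis, word prediction in context, corresponds to BERT's masked language modeling objective. The following is a minimal sketch of how such predictions can be obtained with the HuggingFace transformers fill-mask pipeline; the model choice and example sentence are illustrative assumptions, not taken from the paper.

```python
from transformers import pipeline

# Load a fill-mask pipeline backed by the original BERT base model.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical sentence with one unknown (masked) word.
sentence = "The chef sliced the onions with a sharp [MASK]."

# BERT returns candidate fillers for the masked slot, ranked by probability.
for candidate in fill_mask(sentence, top_k=5):
    print(f"{candidate['token_str']:>10}  p = {candidate['score']:.4f}")
```

In OST terms, each ranked candidate can then be checked against the sentence's ontological constraints and assigned a fuzzy degree of satisfaction, which is the interpretive step the paper describes.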

Publication
In Fuzzy Information Processing 2020, Proceedings of NAFIPS'2020
Kanishka Misra