BERT, have you no manners? : representations of troponymy and verb hypernymy in BERT


Title: BERT, have you no manners? : representations of troponymy and verb hypernymy in BERT
Author: Narkevich, Dmitry
Contributor: University of Helsinki, Faculty of Arts
Publisher: Helsingin yliopisto
Date: 2021
Language: eng
Thesis level: master's thesis
Degree program: Master's Programme in Linguistic Diversity in the Digital Age
Specialisation: Language Technology
Abstract: Hypernymy is a lexical relationship between two words in which the hyponym carries a more specific meaning and entails a hypernym that carries a more general meaning. A particular kind of verbal hypernymy is troponymy, where troponyms are verbs that encode a particular manner of doing something: "whisper", for example, means "to speak in a quiet manner". Recently, contextualized word vectors have emerged as a powerful tool for representing the semantics of words in a given context, in contrast to earlier static embeddings, where every word is represented by a single vector regardless of sense. BERT, a pre-trained language model that uses contextualized word representations, achieved state-of-the-art performance on various downstream NLP tasks such as question answering. Previous research identified knowledge of scalar adjective intensity in BERT, but no systematic knowledge of nominal hypernymy.

In this thesis, we investigate systematic knowledge of troponymy and verbal hypernymy in the base English version of BERT. We compare the similarity of vector representations for manner verbs and adverbs of interest to see whether troponymy is represented in the vector space. Then, we evaluate BERT's predictions on cloze tasks involving troponymy and verbal hypernymy. We also attempt to train supervised probing models to recover this knowledge from the vector representations. Lastly, we perform clustering analyses on vector representations of words in hypernymy pairs. Data on troponymy and hypernymy relationships is extracted from WordNet and HyperLex, and sentences containing instances of the relevant words are obtained from the ukWaC corpus.

We were unable to identify any systematic knowledge of troponymy and verb hypernymy in BERT. It was reasonably successful at predicting hypernyms in the masking experiments, but its general inability to go in the other direction (predicting hyponyms from hypernyms) suggests that this knowledge is not systematic. Our probing models were unsuccessful at recovering information related to hypernymy and troponymy from the representations. In contrast with previous work that finds type-level semantic information in the lower layers of BERT, our cluster-based analyses suggest that the upper layers contain stronger or more accessible representations of hypernymy.
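The similarity comparison described in the abstract can be sketched as follows. This is a minimal illustration only: the vectors below are short toy placeholders rather than actual BERT representations (which for bert-base have 768 dimensions), and the compositional comparison of "whisper" against "speak" plus "quietly" is one hypothetical way such a troponymy test could be framed, not the thesis's exact procedure.

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


# Toy stand-ins for contextualized vectors extracted from one BERT layer.
v_whisper = [0.9, 0.1, 0.3]
v_speak = [0.8, 0.2, 0.4]
v_quietly = [0.7, 0.0, 0.5]

# If troponymy were encoded compositionally, the troponym "whisper" might sit
# close to the sum of its hypernym "speak" and the manner adverb "quietly".
v_composed = [a + b for a, b in zip(v_speak, v_quietly)]

print(round(cosine(v_whisper, v_speak), 3))
print(round(cosine(v_whisper, v_composed), 3))
```

In practice, each word's vector would be averaged over its occurrences in corpus sentences (e.g. from ukWaC) before comparison, so that a single type-level representation per layer can be tested.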
Subject: lexical semantics; pre-trained language models

Files in this item

Narkevich_Dmitry_thesis_2021.pdf (1.469 MB, PDF)

