
dc.contributor.author: Æsøy, Kristoffer
dc.date.accessioned: 2023-06-12T23:54:23Z
dc.date.available: 2023-06-12T23:54:23Z
dc.date.issued: 2023-06-02
dc.date.submitted: 2023-06-12T22:02:23Z
dc.identifier.uri: https://hdl.handle.net/11250/3071012
dc.description.abstract: Machine learning models are applied to a multitude of tasks that require some form of reasoning. Language models are highly capable of capturing the patterns and regularities found in natural language, but their ability to perform logical reasoning has come under scrutiny. In contrast, automated reasoning systems excel at logic-based reasoning but require their input to be expressed as logical rules. The problem is that building such systems and producing adequate rules are time-consuming processes that few have the skill set to perform. We therefore investigate the Transformer architecture's ability to translate natural language sentences into logical rules. We perform neural machine translation experiments on the DKET dataset of definitory sentences from the literature, and we construct a dataset of if-then statements from the Atomic knowledge bank using an algorithm we have developed, on which we also run experiments.
dc.language.iso: eng
dc.publisher: The University of Bergen
dc.rights: Copyright the Author. All rights reserved
dc.title: Rule learning of the Atomic dataset using Transformers
dc.type: Master thesis
dc.date.updated: 2023-06-12T22:02:23Z
dc.rights.holder: Copyright the Author. All rights reserved
dc.description.degree: Master's thesis in Informatics (Masteroppgave i informatikk)
dc.description.localcode: INF399
dc.description.localcode: MAMN-PROG
dc.description.localcode: MAMN-INF
dc.subject.nus: 754199
fs.subjectcode: INF399
fs.unitcode: 12-12-0

