Categorical representation learning: morphism is all you need

Sheshmani, Artan and You, Yi-Zhuang (2022) Categorical representation learning: morphism is all you need. Machine Learning: Science and Technology, 3 (1). 015016. ISSN 2632-2153

Text
Sheshmani_2022_Mach._Learn.__Sci._Technol._3_015016.pdf - Published Version

Download (868kB)

Abstract

We provide a construction for categorical representation learning and introduce the foundations of the 'categorifier'. The central theme in representation learning is the idea of 'everything to vector'. Every object in a dataset $\mathcal{S}$ can be represented as a vector in $\mathbb{R}^n$ by an encoding map $E: \mathcal{O}bj(\mathcal{S})\to\mathbb{R}^n$. More importantly, every morphism can be represented as a matrix $E: \mathcal{H}om(\mathcal{S})\to\mathbb{R}^{n\times n}$. The encoding map $E$ is generally modeled by a deep neural network. The goal of representation learning is to design appropriate tasks on the dataset to train the encoding map (assuming that an encoding is optimal if it universally optimizes the performance on various tasks). However, this is still a set-theoretic approach. The goal of the current article is to promote representation learning to a new level via a category-theoretic approach. As a proof of concept, we provide an example of a text translator equipped with our technology, showing that our categorical learning model outperforms current deep learning models by a factor of 17. The content of the current article is part of a US provisional patent application filed by QGNai, Inc.
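The abstract's encoding scheme — objects as vectors in $\mathbb{R}^n$, morphisms as matrices in $\mathbb{R}^{n\times n}$ acting on those vectors — can be sketched in a few lines. This is a minimal illustration under our own assumptions, not the paper's implementation: the object names, the relations `is_a` and `grows_on`, and the random (untrained) parameters are all hypothetical, and a real categorifier would learn them with a neural network.

```python
import numpy as np

n = 4  # embedding dimension (illustrative choice)
rng = np.random.default_rng(0)

# Object encoding E: Obj(S) -> R^n, here a simple lookup table of vectors.
objects = ["apple", "fruit", "tree"]
E_obj = {o: rng.standard_normal(n) for o in objects}

# Morphism encoding: Hom(S) -> R^{n x n}, one matrix per morphism.
# "is_a" and "grows_on" are hypothetical relations for illustration.
M_is_a = rng.standard_normal((n, n))
M_grows_on = rng.standard_normal((n, n))

# A morphism f: a -> b acts on object embeddings by matrix multiplication;
# training would drive || M_f @ E(a) - E(b) || toward zero for observed pairs.
def apply_morphism(M, a):
    return M @ E_obj[a]

# Composition of morphisms becomes a matrix product, so the encoding
# respects categorical structure by construction: M_{g∘f} = M_g @ M_f.
M_composed = M_grows_on @ M_is_a
v_stepwise = M_grows_on @ apply_morphism(M_is_a, "apple")
v_composed = M_composed @ E_obj["apple"]
assert np.allclose(v_stepwise, v_composed)
```

The point of the sketch is the last assertion: because morphisms are matrices, composition in the category maps onto matrix multiplication, which is what distinguishes this setup from a purely set-theoretic embedding of objects alone.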

Item Type: Article
Subjects: STM Open Library > Multidisciplinary
Depositing User: Unnamed user with email support@stmopenlibrary.com
Date Deposited: 09 Jul 2023 03:56
Last Modified: 15 Mar 2024 12:27
URI: http://ebooks.netkumar1.in/id/eprint/1880
