Surrogate-based optimization of learning strategies for additively regularized topic models

Logic Journal of the IGPL 31 (2):287-299 (2023)

Abstract

Topic modelling is a popular unsupervised method for text processing that provides interpretable document representations. One of the most advanced approaches is additive regularization of topic models (ARTM), which achieves better quality than other methods thanks to its flexibility and rich regularization capabilities. However, finding an optimal learning strategy that produces high-quality topics is challenging: the user must select the regularizers together with their coefficient values and determine the order in which they are applied. Moreover, the search may require many model training runs, which makes the task time-consuming. There is currently little research on parameter optimization for ARTM-based models. Our work proposes an approach that formalizes the learning strategy as a vector of parameters, which can then be optimized with an evolutionary algorithm. We also propose a surrogate-based modification that uses machine-learning models to make the parameter search time-efficient. We investigate different optimization algorithms (evolutionary and Bayesian) and their surrogate-assisted modifications, applied to topic-model optimization through the proposed learning-strategy formalization. An experimental study on English and Russian datasets indicates that the proposed approaches find high-quality parameter solutions for ARTM and substantially reduce the execution time of the search.
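The surrogate-based idea described in the abstract, screening candidate parameter vectors with a cheap learned model and spending expensive model-training runs only on the most promising ones, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: `true_fitness` is a hypothetical stand-in for training an ARTM model and measuring topic quality, and the surrogate here is a simple k-nearest-neighbour estimate built from already-evaluated points.

```python
import random

def true_fitness(params):
    # Hypothetical stand-in for the expensive step (training an ARTM model
    # and scoring topic quality). Toy objective: best at params == [0.5]*dim.
    return -sum((p - 0.5) ** 2 for p in params)

def surrogate_estimate(params, archive, k=3):
    # Cheap surrogate: average fitness of the k nearest evaluated points.
    if not archive:
        return 0.0
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(params, p)), f)
        for p, f in archive
    )
    nearest = dists[:k]
    return sum(f for _, f in nearest) / len(nearest)

def mutate(params, rng, sigma=0.1):
    # Gaussian mutation, clipped to the [0, 1] parameter range.
    return [min(1.0, max(0.0, p + rng.gauss(0, sigma))) for p in params]

def surrogate_assisted_search(dim=3, pop=8, gens=20, screen=4, seed=0):
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(dim)] for _ in range(pop)]
    # Archive of (params, true fitness) pairs; these are the only points
    # on which the expensive evaluation is ever spent.
    archive = [(p, true_fitness(p)) for p in population]
    for _ in range(gens):
        offspring = [mutate(rng.choice(population), rng) for _ in range(pop * 2)]
        # Pre-screen offspring with the surrogate; truly evaluate only the
        # few that look most promising.
        offspring.sort(key=lambda p: surrogate_estimate(p, archive), reverse=True)
        for cand in offspring[:screen]:
            archive.append((cand, true_fitness(cand)))
        # Elitist selection over everything evaluated so far.
        archive.sort(key=lambda pf: pf[1], reverse=True)
        population = [p for p, _ in archive[:pop]]
    return archive[0]

best_params, best_fit = surrogate_assisted_search()
```

With these settings the loop spends 8 + 20 x 4 = 88 true evaluations instead of the 328 that evaluating every offspring would cost, which is the source of the time savings the paper reports for the surrogate-assisted variants.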

Links

PhilArchive




Similar books and articles

The ‘Big Picture’: The Problem of Extrapolation in Basic Research.Tudor M. Baetu - 2016 - British Journal for the Philosophy of Science 67 (4):941-964.
Wiring optimization explanation in neuroscience.Sergio Daniel Barberis - 2019 - Theoria: Revista de Teoría, Historia y Fundamentos de la Ciencia 34 (1):89-110.
Wiring optimization explanation in neuroscience: What is special about it?Sergio Daniel Barberis - 2019 - Theoria: An International Journal for Theory, History and Foundations of Science 1 (34):89-110.

Analytics

Added to PP
2022-04-09

