BERT-Sort: How to use language models to semantically order categorical values
The AutoML Podcast · November 24, 2022 · 00:40:37 · 27.92 MB

Today Ankush Garg is talking to Mehdi Bahrami about his recent project: BERT-Sort.

BERT-Sort is an example of how large language models can add useful semantic context to tabular datasets and to AutoML systems.
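
For a rough sense of the mechanics, here is a minimal sketch of the zero-shot idea behind semantically ordering categorical values with a masked language model. It assumes the Hugging Face transformers fill-mask pipeline, a bert-base-uncased model, and a made-up probe sentence and anchor words; these are illustrative assumptions, not the templates from the BERT-Sort paper. Each categorical value is placed in a probe sentence and ranked by how strongly the model associates it with a positive versus a negative anchor word.

from transformers import pipeline

# Illustrative model choice; any masked language model could be used here.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical anchor words marking the two ends of the ordinal scale.
POS_ANCHOR, NEG_ANCHOR = "good", "bad"

def semantic_order(values):
    # Rank categorical values by how positively the MLM reads them.
    scored = []
    for value in values:
        # The probe sentence is an assumption, not the paper's template.
        probe = f"the service was {value}, which is [MASK]."
        preds = {p["token_str"]: p["score"]
                 for p in fill_mask(probe, targets=[POS_ANCHOR, NEG_ANCHOR])}
        # Higher score means the model ties the value to the positive anchor.
        scored.append((preds[POS_ANCHOR] - preds[NEG_ANCHOR], value))
    return [v for _, v in sorted(scored)]

print(semantic_order(["terrible", "excellent", "okay", "awful", "great"]))
# Prints the values roughly ordered from most negative to most positive
# (the exact ordering is model-dependent).

In practice, BERT-Sort builds on this kind of zero-shot scoring to produce ordinal encodings that AutoML systems can consume; see the paper linked below for the actual method and evaluation.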

Mehdi is a Member of Research Staff at Fujitsu. As he describes, he began using AutoML systems for his research but ran into some crucial limitations of existing solutions. The modifications he made point to a promising future for the relationship between language models and AutoML, and it's a direction we're going to continue to explore on the show.

References:
BERT-Sort: A Zero-shot MLM Semantic Encoder on Ordinal Features for AutoML: https://proceedings.mlr.press/v188/bahrami22a.html

PyTorrent: A Python Library Corpus for Large-scale Language Models: https://arxiv.org/abs/2110.01710

AugmentedCode: Examining the Effects of Natural Language Resources in Code Retrieval Models: https://arxiv.org/abs/2110.08512