FT-Transformer

FT-Transformer (Feature Tokenizer + Transformer) is a deep learning architecture that leverages the Transformer's self-attention mechanism to model complex interactions within tabular data.

This model is a simple yet powerful adaptation of the standard Transformer encoder for tabular data, introduced in the 2021 paper "Revisiting Deep Learning Models for Tabular Data". The key innovation is the Feature Tokenizer: it converts both categorical and numerical features into distinct vector embeddings, treating each feature like a token. These tokens are then fed into the Transformer's self-attention layers, allowing the model to learn complex, non-linear relationships between all features. FT-Transformer has demonstrated competitive performance against established Gradient Boosting Decision Tree (GBDT) methods such as XGBoost and LightGBM across a range of datasets, establishing a more universal deep learning baseline for the field.
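The tokenization step can be sketched in a few lines. The following is a minimal NumPy illustration of the idea (not the authors' reference implementation; all shapes, names, and the random initialization are assumptions for demonstration): each numerical feature gets its own learned vector scaled by the feature's value, each categorical feature gets an embedding lookup, and a [CLS] token is prepended for the downstream prediction head, as in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                       # embedding dimension per feature token (assumed)
n_num = 3                   # number of numerical features (assumed)
cat_cardinalities = [4, 5]  # distinct values per categorical feature (assumed)

# "Learnable" parameters, randomly initialized here for illustration only
W_num = rng.normal(size=(n_num, d))               # per-feature scale vectors
b_num = rng.normal(size=(n_num, d))               # per-feature biases
E_cat = [rng.normal(size=(c, d)) for c in cat_cardinalities]  # embedding tables
b_cat = rng.normal(size=(len(cat_cardinalities), d))
cls_token = rng.normal(size=(1, d))               # [CLS] token

def tokenize(x_num, x_cat):
    """Map one sample's features to a (1 + n_features, d) token matrix."""
    # Numerical feature j becomes x_num[j] * W_num[j] + b_num[j]
    num_tokens = x_num[:, None] * W_num + b_num                # (n_num, d)
    # Categorical feature j becomes an embedding lookup plus a bias
    cat_tokens = np.stack(
        [E_cat[j][x_cat[j]] + b_cat[j] for j in range(len(x_cat))]
    )                                                           # (n_cat, d)
    # Prepend [CLS]; the full sequence is what self-attention operates on
    return np.concatenate([cls_token, num_tokens, cat_tokens])  # (1+n_num+n_cat, d)

tokens = tokenize(np.array([0.5, -1.2, 3.0]), np.array([2, 4]))
print(tokens.shape)  # (6, 8): [CLS] + 3 numerical + 2 categorical tokens
```

Because every feature, numerical or categorical, ends up as one token of the same dimension, the resulting sequence can be passed directly to a standard Transformer encoder, with the final [CLS] representation feeding the prediction head.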

https://arxiv.org/abs/2106.11959