Commit 34b8273
committed: update name in readme
1 parent f5407d8 commit 34b8273

1 file changed: README.md
Lines changed: 57 additions & 33 deletions
````diff
@@ -2,24 +2,24 @@
 <img src="./docs/images/logo/mamba_tabular.jpg" width="400"/>
 
 
-[![PyPI](https://img.shields.io/pypi/v/mambular)](https://pypi.org/project/mambular)
-![PyPI - Downloads](https://img.shields.io/pypi/dm/mambular)
-[![docs build](https://readthedocs.org/projects/mambular/badge/?version=latest)](https://mambular.readthedocs.io/en/latest/?badge=latest)
-[![docs](https://img.shields.io/badge/docs-latest-blue)](https://mambular.readthedocs.io/en/latest/)
-[![open issues](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/basf/mamba-tabular/issues)
+[![PyPI](https://img.shields.io/pypi/v/deeptabular)](https://pypi.org/project/deeptabular)
+![PyPI - Downloads](https://img.shields.io/pypi/dm/deeptabular)
+[![docs build](https://readthedocs.org/projects/deeptabular/badge/?version=latest)](https://deeptabular.readthedocs.io/en/latest/?badge=latest)
+[![docs](https://img.shields.io/badge/docs-latest-blue)](https://deeptabular.readthedocs.io/en/latest/)
+[![open issues](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/OpenTabular/DeepTabular/issues)
 
 
-[📘Documentation](https://mambular.readthedocs.io/en/latest/index.html) |
-[🛠️Installation](https://mambular.readthedocs.io/en/latest/installation.html) |
-[Models](https://mambular.readthedocs.io/en/latest/api/models/index.html) |
-[🤔Report Issues](https://github.com/basf/mamba-tabular/issues)
+[📘Documentation](https://deeptabular.readthedocs.io/en/latest/index.html) |
+[🛠️Installation](https://deeptabular.readthedocs.io/en/latest/installation.html) |
+[Models](https://deeptabular.readthedocs.io/en/latest/api/models/index.html) |
+[🤔Report Issues](https://github.com/OpenTabular/DeepTabular/issues)
 </div>
 
 <div style="text-align: center;">
-<h1>Mambular: Tabular Deep Learning Made Simple</h1>
+<h1>DeepTabular: Tabular Deep Learning Made Simple</h1>
 </div>
 
-Mambular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP inspired tabular models.
+DeepTabular is a Python library for tabular deep learning. It includes models that leverage the Mamba (State Space Model) architecture, as well as other popular models like TabTransformer, FTTransformer, TabM and tabular ResNets. Check out our paper `Mambular: A Sequential Model for Tabular Deep Learning`, available [here](https://arxiv.org/abs/2408.06291). Also check out our paper introducing [TabulaRNN](https://arxiv.org/pdf/2411.17207) and analyzing the efficiency of NLP inspired tabular models.
 
 <h3>⚡ What's New ⚡</h3>
 <ul>
````
````diff
@@ -48,10 +48,10 @@ Mambular is a Python library for tabular deep learning. It includes models that
 
 
 # 🏃 Quickstart
-Similar to any sklearn model, Mambular models can be fit as easy as this:
+Similar to any sklearn model, DeepTabular models can be fit as easily as this:
 
 ```python
-from mambular.models import MambularClassifier
+from deeptabular.models import MambularClassifier
 # Initialize and fit your model
 model = MambularClassifier()
 
````
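The quickstart in the hunk above leans on every model implementing scikit-learn's estimator contract (`fit` returns `self`, `predict` maps inputs to labels). As a library-free sketch of that contract, assuming a hypothetical `MajorityClassifier` stand-in rather than any class from the package:

```python
# Minimal sketch of the sklearn-style fit/predict contract the quickstart
# relies on. MajorityClassifier is a hypothetical stand-in, not part of
# the deeptabular package.
from collections import Counter


class MajorityClassifier:
    """Baseline that predicts the most frequent class seen during fit."""

    def fit(self, X, y, **fit_params):
        # Real models train a network here; this baseline memorizes the mode.
        self.majority_ = Counter(y).most_common(1)[0][0]
        return self  # returning self mirrors the sklearn convention

    def predict(self, X):
        return [self.majority_ for _ in X]


model = MajorityClassifier()
model.fit([[0.1], [0.2], [0.3]], y=["a", "a", "b"])
preds = model.predict([[0.5], [0.6]])
print(preds)  # ['a', 'a']
```

Because the call shape is identical (`fit(X, y, ...)` then `predict(X)`), any estimator following this contract drops into existing sklearn pipelines.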
````diff
@@ -60,7 +60,7 @@ model.fit(X, y, max_epochs=150, lr=1e-04)
 ```
 
 # 📖 Introduction
-Mambular is a Python package that brings the power of advanced deep learning architectures to tabular data, offering a suite of models for regression, classification, and distributional regression tasks. Designed with ease of use in mind, Mambular models adhere to scikit-learn's `BaseEstimator` interface, making them highly compatible with the familiar scikit-learn ecosystem. This means you can fit, predict, and evaluate using Mambular models just as you would with any traditional scikit-learn model, but with the added performance and flexibility of deep learning.
+DeepTabular is a Python package that brings the power of advanced deep learning architectures to tabular data, offering a suite of models for regression, classification, and distributional regression tasks. Designed with ease of use in mind, DeepTabular models adhere to scikit-learn's `BaseEstimator` interface, making them highly compatible with the familiar scikit-learn ecosystem. This means you can fit, predict, and evaluate using DeepTabular models just as you would with any traditional scikit-learn model, but with the added performance and flexibility of deep learning.
 
 
 # 🤖 Models
````
````diff
@@ -98,9 +98,9 @@ You can find the Mamba-Tabular API documentation [here](https://mambular.readthe
 
 # 🛠️ Installation
 
-Install Mambular using pip:
+Install DeepTabular using pip:
 ```sh
-pip install mambular
+pip install deeptabular
 ```
 
 If you want to use the original mamba and mamba2 implementations, additionally install mamba-ssm via:
````
````diff
@@ -120,7 +120,7 @@ pip install mamba-ssm
 
 <h2> Preprocessing </h2>
 
-Mambular uses pretab preprocessing: https://github.com/OpenTabular/PreTab
+DeepTabular uses pretab preprocessing: https://github.com/OpenTabular/PreTab
 
 Hence, datatypes etc. are detected automatically and all preprocessing methods from pretab as well as from Sklearn.preprocessing are available.
 Additionally, you can specify that each feature is preprocessed differently, according to your requirements, by setting the `feature_preprocessing={}` argument during model initialization.
````
````diff
@@ -144,10 +144,10 @@ For an overview over all available methods: [pretab](https://github.com/OpenTabu
 
 
 <h2> Fit a Model </h2>
-Fitting a model in mambular is as simple as it gets. All models in mambular are sklearn BaseEstimators. Thus the `.fit` method is implemented for all of them. Additionally, this allows for using all other sklearn inherent methods such as their built in hyperparameter optimization tools.
+Fitting a model in deeptabular is as simple as it gets. All models in deeptabular are sklearn BaseEstimators. Thus the `.fit` method is implemented for all of them. Additionally, this allows for using all other sklearn-inherent methods such as their built-in hyperparameter optimization tools.
 
 ```python
-from mambular.models import MambularClassifier
+from deeptabular.models import MambularClassifier
 # Initialize and fit your model
 model = MambularClassifier(
     d_model=64,
````
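The hunk above notes that BaseEstimator compatibility unlocks sklearn's hyperparameter optimization tools (e.g. grid search). A dependency-free sketch of what such a search does, with illustrative names only (`score_model` stands in for cross-validated scoring of a fitted model):

```python
# Sketch of a hyperparameter grid search over estimator settings;
# sklearn's GridSearchCV automates this for any BaseEstimator.
# The grid keys and score_model below are illustrative assumptions.
from itertools import product

grid = {"d_model": [32, 64], "n_layers": [2, 4]}


def score_model(params):
    # Stand-in for cross-validated scoring of a model fitted with params;
    # here larger settings simply score higher, for illustration.
    return params["d_model"] + params["n_layers"]


# Enumerate every combination in the grid and keep the best-scoring one.
candidates = [dict(zip(grid, values)) for values in product(*grid.values())]
best = max(candidates, key=score_model)
print(best)  # {'d_model': 64, 'n_layers': 4}
```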
````diff
@@ -243,12 +243,12 @@ Or use the built-in bayesian hpo simply by running:
 best_params = model.optimize_hparams(X, y)
 ```
 
-This automatically sets the search space based on the default config from ``mambular.configs``. See the documentation for all params with regard to ``optimize_hparams()``. However, the preprocessor arguments are fixed and cannot be optimized here.
+This automatically sets the search space based on the default config from ``deeptabular.configs``. See the documentation for all params with regard to ``optimize_hparams()``. However, the preprocessor arguments are fixed and cannot be optimized here.
 
 
 <h2> ⚖️ Distributional Regression with MambularLSS </h2>
 
-MambularLSS allows you to model the full distribution of a response variable, not just its mean. This is crucial when understanding variability, skewness, or kurtosis is important. All Mambular models are available as distributional models.
+MambularLSS allows you to model the full distribution of a response variable, not just its mean. This is crucial when understanding variability, skewness, or kurtosis is important. All DeepTabular models are available as distributional models.
 
 <h3> Key Features of MambularLSS: </h3>
 
````
````diff
@@ -277,10 +277,10 @@ These distribution classes make MambularLSS versatile in modeling various data t
 
 <h3> Getting Started with MambularLSS: </h3>
 
-To integrate distributional regression into your workflow with `MambularLSS`, start by initializing the model with your desired configuration, similar to other Mambular models:
+To integrate distributional regression into your workflow with `MambularLSS`, start by initializing the model with your desired configuration, similar to other DeepTabular models:
 
 ```python
-from mambular.models import MambularLSS
+from deeptabular.models import MambularLSS
 
 # Initialize the MambularLSS model
 model = MambularLSS(
````
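As background for the distributional-regression hunk above: instead of predicting a single value, the network outputs the parameters of a distribution (e.g. mean and scale of a normal) and is trained to minimize the negative log-likelihood. A minimal pure-Python sketch of that loss, not the library's implementation:

```python
# Sketch of the loss behind distributional regression with a normal family:
# the model predicts a mean (mu) and scale (sigma) per sample and is trained
# to minimize the Gaussian negative log-likelihood. Illustration only.
import math


def normal_nll(y, mu, sigma):
    """Negative log-likelihood of observation y under N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (y - mu) ** 2 / (2 * sigma**2)


# A prediction whose mean is closer to the target incurs a lower loss,
# so gradient descent on this loss fits both location and spread.
close = normal_nll(y=1.0, mu=1.1, sigma=0.5)
far = normal_nll(y=1.0, mu=3.0, sigma=0.5)
print(close < far)  # True
```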
````diff
@@ -305,18 +305,18 @@ model.fit(
 
 # 💻 Implement Your Own Model
 
-Mambular allows users to easily integrate their custom models into the existing logic. This process is designed to be straightforward, making it simple to create a PyTorch model and define its forward pass. Instead of inheriting from `nn.Module`, you inherit from Mambular's `BaseModel`. Each Mambular model takes three main arguments: the number of classes (e.g., 1 for regression or 2 for binary classification), `cat_feature_info`, and `num_feature_info` for categorical and numerical feature information, respectively. Additionally, you can provide a config argument, which can either be a custom configuration or one of the provided default configs.
+DeepTabular allows users to easily integrate their custom models into the existing logic. This process is designed to be straightforward, making it simple to create a PyTorch model and define its forward pass. Instead of inheriting from `nn.Module`, you inherit from DeepTabular's `BaseModel`. Each DeepTabular model takes three main arguments: the number of classes (e.g., 1 for regression or 2 for binary classification), `cat_feature_info`, and `num_feature_info` for categorical and numerical feature information, respectively. Additionally, you can provide a config argument, which can either be a custom configuration or one of the provided default configs.
 
-One of the key advantages of using Mambular is that the inputs to the forward passes are lists of tensors. While this might be unconventional, it is highly beneficial for models that treat different data types differently. For example, the TabTransformer model leverages this feature to handle categorical and numerical data separately, applying different transformations and processing steps to each type of data.
+One of the key advantages of using DeepTabular is that the inputs to the forward passes are lists of tensors. While this might be unconventional, it is highly beneficial for models that treat different data types differently. For example, the TabTransformer model leverages this feature to handle categorical and numerical data separately, applying different transformations and processing steps to each type of data.
 
-Here's how you can implement a custom model with Mambular:
+Here's how you can implement a custom model with DeepTabular:
 
 1. **First, define your config:**
 The configuration class allows you to specify hyperparameters and other settings for your model. This can be done using a simple dataclass.
 
 ```python
 from dataclasses import dataclass
-from mambular.configs import BaseConfig
+from deeptabular.configs import BaseConfig
 
 @dataclass
 class MyConfig(BaseConfig):
````
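The dataclass-config pattern in step 1 above (defaults declared on the class, individually overridable at construction) can be sketched with the standard library alone. The `BaseConfig` below is a minimal hypothetical stand-in, defined only to make the example self-contained:

```python
# Sketch of the dataclass-config pattern from step 1. BaseConfig here is
# a hypothetical stand-in for the library's BaseConfig, with assumed
# field names (lr, max_epochs, d_model, n_layers) for illustration.
from dataclasses import dataclass, asdict


@dataclass
class BaseConfig:
    lr: float = 1e-4
    max_epochs: int = 150


@dataclass
class MyConfig(BaseConfig):
    d_model: int = 64
    n_layers: int = 4


# Override one field at construction; all inherited defaults are kept.
config = MyConfig(d_model=128)
print(asdict(config))
# {'lr': 0.0001, 'max_epochs': 150, 'd_model': 128, 'n_layers': 4}
```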
````diff
@@ -332,8 +332,8 @@ Here's how you can implement a custom model with Mambular:
 Define your custom model just as you would for an `nn.Module`. The main difference is that you will inherit from `BaseModel` and use the provided feature information to construct your layers. To integrate your model into the existing API, you only need to define the architecture and the forward pass.
 
 ```python
-from mambular.base_models.utils import BaseModel
-from mambular.utils.get_feature_dimensions import get_feature_dimensions
+from deeptabular.base_models.utils import BaseModel
+from deeptabular.utils.get_feature_dimensions import get_feature_dimensions
 import torch
 import torch.nn
 
````
````diff
@@ -372,19 +372,19 @@ Here's how you can implement a custom model with Mambular:
         return output
 ```
 
-3. **Leverage the Mambular API:**
-You can build a regression, classification, or distributional regression model that can leverage all of Mambular's built-in methods by using the following:
+3. **Leverage the DeepTabular API:**
+You can build a regression, classification, or distributional regression model that can leverage all of DeepTabular's built-in methods by using the following:
 
 ```python
-from mambular.models.utils import SklearnBaseRegressor
+from deeptabular.models.utils import SklearnBaseRegressor
 
 class MyRegressor(SklearnBaseRegressor):
     def __init__(self, **kwargs):
         super().__init__(model=MyCustomModel, config=MyConfig, **kwargs)
 ```
 
 4. **Train and evaluate your model:**
-You can now fit, evaluate, and predict with your custom model just like with any other Mambular model. For classification or distributional regression, inherit from `SklearnBaseClassifier` or `SklearnBaseLSS` respectively.
+You can now fit, evaluate, and predict with your custom model just like with any other DeepTabular model. For classification or distributional regression, inherit from `SklearnBaseClassifier` or `SklearnBaseLSS` respectively.
 
 ```python
 regressor = MyRegressor(numerical_preprocessing="ple")
````
````diff
@@ -395,6 +395,30 @@ Here's how you can implement a custom model with Mambular:
 
 
 
+# 🏷️ Citation
+
+If you find this project useful in your research, please consider citing:
+```BibTeX
+@article{thielmann2024mambular,
+  title={Mambular: A Sequential Model for Tabular Deep Learning},
+  author={Thielmann, Anton Frederik and Kumar, Manish and Weisser, Christoph and Reuter, Arik and S{\"a}fken, Benjamin and Samiee, Soheila},
+  journal={arXiv preprint arXiv:2408.06291},
+  year={2024}
+}
+```
+
+If you use TabulaRNN, please consider citing:
+```BibTeX
+@article{thielmann2024efficiency,
+  title={On the Efficiency of NLP-Inspired Methods for Tabular Deep Learning},
+  author={Thielmann, Anton Frederik and Samiee, Soheila},
+  journal={arXiv preprint arXiv:2411.17207},
+  year={2024}
+}
+```
+
+
+
 # 🏷️ Citation
 
 If you find this project useful in your research, please consider cite:
````
