Commit 2280a5c

adjust configs

1 parent c3006b8 commit 2280a5c

3 files changed, 4 additions & 2 deletions

mambular/configs/fttransformer_config.py
Lines changed: 1 addition & 1 deletion

@@ -27,4 +27,4 @@ class DefaultFTTransformerConfig:
     bias: bool = True
     transformer_activation: callable = nn.SELU()
     layer_norm_eps: float = 1e-05
-    transformer_dim_feedforward: int = 2048
+    transformer_dim_feedforward: int = 512
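
The smaller feedforward width presumably lands in the encoder's position-wise FFN. Below is a minimal sketch of how the config could be consumed, assuming a standard PyTorch nn.TransformerEncoderLayer; d_model and nhead are hypothetical values, not taken from this commit. The identical TabTransformer change further down would be wired the same way.

# Hedged sketch: assumes the config feeds PyTorch's standard encoder layer.
# d_model and nhead are hypothetical, not part of this commit.
import torch.nn as nn
from mambular.configs.fttransformer_config import DefaultFTTransformerConfig

cfg = DefaultFTTransformerConfig()
layer = nn.TransformerEncoderLayer(
    d_model=128,                                      # hypothetical embedding dim
    nhead=8,                                          # hypothetical head count
    dim_feedforward=cfg.transformer_dim_feedforward,  # 512 after this commit
    activation=cfg.transformer_activation,            # the nn.SELU() instance
    layer_norm_eps=cfg.layer_norm_eps,
    batch_first=True,
)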

mambular/configs/mambular_config.py
Lines changed: 2 additions & 0 deletions

@@ -32,3 +32,5 @@ class DefaultMambularConfig:
     head_use_batch_norm: bool = False
     layer_norm_after_embedding: bool = False
     pooling_method: str = "avg"
+    bidirectional: bool = True
+    use_learnable_interaction: bool = False
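
Both new flags are plain booleans on the dataclass; the diff does not show how the model consumes them. As a minimal sketch, bidirectional=True could mean running the same sequence block in both directions and merging the results. Here block is a hypothetical stand-in for a Mamba layer, not code from this repo:

# Hedged sketch of a bidirectional pass over a feature sequence; the actual
# Mambular wiring may differ.
import torch

def bidirectional_pass(block, x):           # x: (batch, seq_len, dim)
    fwd = block(x)                           # left-to-right pass
    bwd = block(torch.flip(x, dims=[1]))     # same block on the reversed sequence
    return fwd + torch.flip(bwd, dims=[1])   # re-align and combine

Overriding the new defaults would follow the usual dataclass pattern, e.g. DefaultMambularConfig(bidirectional=False, use_learnable_interaction=True).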

mambular/configs/tabtransformer_config.py
Lines changed: 1 addition & 1 deletion

@@ -27,4 +27,4 @@ class DefaultTabTransformerConfig:
     bias: bool = True
     transformer_activation: callable = nn.SELU()
     layer_norm_eps: float = 1e-05
-    transformer_dim_feedforward: int = 2048
+    transformer_dim_feedforward: int = 512
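
Dropping dim_feedforward from 2048 to 512 is a 4x reduction in the FFN's weight count per encoder layer, since a standard two-linear-layer FFN holds roughly 2 * d_model * dim_feedforward weights. A quick check with a hypothetical d_model of 128 (not from this commit):

# Rough per-layer FFN weight count; d_model=128 is an assumed value.
d_model = 128
before = 2 * d_model * 2048  # 524,288 weights per layer
after = 2 * d_model * 512    # 131,072 weights per layer (4x smaller)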

0 commit comments
