
Commit fb7e973

speediedan and Borda authored
Fine-Tuning Scheduler Tutorial Update for Lightning/PyTorch 2.1.0 (#278)
Co-authored-by: Jirka Borovec <6035284+Borda@users.noreply.github.com>
1 parent a22e269 commit fb7e973

2 files changed

Lines changed: 1 addition & 23 deletions

File tree

lightning_examples/finetuning-scheduler/RteBoolqModule_ft_schedule_deberta_base.yaml

Lines changed: 0 additions & 17 deletions
This file was deleted.

lightning_examples/finetuning-scheduler/finetuning-scheduler.py

Lines changed: 1 addition & 6 deletions
```diff
@@ -46,8 +46,6 @@
 # [default schedule](#The-Default-Fine-Tuning-Schedule) and proceed to fine-tune according to the generated schedule,
 # using default [FTSEarlyStopping](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.FTSEarlyStopping) and [FTSCheckpoint](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.FTSCheckpoint) callbacks with ``monitor=val_loss``.
 #
-# </div>
-#
 # ```python
 # import lightning as L
 # from finetuning_scheduler import FinetuningScheduler
```
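The deleted `</div>` lines above were stray HTML remnants in the tutorial commentary. For context, a minimal runnable sketch of the quickstart usage the surrounding comments describe — the `Trainer` wiring is an assumption drawn from the Fine-Tuning Scheduler docs, since this hunk shows only the imports:

```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler

# With no explicit schedule supplied, FinetuningScheduler generates a default
# schedule and attaches its default FTSEarlyStopping and FTSCheckpoint
# callbacks with monitor="val_loss".
trainer = L.Trainer(callbacks=[FinetuningScheduler()])
```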
```diff
@@ -455,8 +453,6 @@ def configure_optimizers(self):
 # large transformer-based language models. The values used here have some justification
 # in the referenced literature but have been largely empirically determined and while a good
 # starting point could be further tuned.
-#
-# </div>
 
 # %%
 optimizer_init = {"weight_decay": 1e-05, "eps": 1e-07, "lr": 1e-05}
```
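For readers following along, a sketch of how an init dict like `optimizer_init` is typically unpacked inside `configure_optimizers` (the hunk header confirms the tutorial defines that method; the `AdamW` choice and the `RteBoolqModuleSketch` name are hypothetical illustrations, not taken from this diff):

```python
import torch
import lightning as L

class RteBoolqModuleSketch(L.LightningModule):  # hypothetical stand-in for the tutorial's module
    def configure_optimizers(self):
        # Hyperparameters as defined above; AdamW itself is an assumption,
        # since the optimizer class is not shown in this hunk.
        optimizer_init = {"weight_decay": 1e-05, "eps": 1e-07, "lr": 1e-05}
        return torch.optim.AdamW(self.parameters(), **optimizer_init)
```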
```diff
@@ -474,8 +470,7 @@ def configure_optimizers(self):
 #
 # [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) also supports both optimizer and LR scheduler
 # reinitialization in explicit and implicit finetuning schedule modes. See the advanced usage documentation ([LR scheduler reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/lr_scheduler_reinitialization.html), [optimizer reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/optimizer_reinitialization.html)) for explanations and demonstration of the extension's support for more complex requirements.
-# </div>
-
+#
 
 # %%
 lr_scheduler_init = {"T_0": 1, "T_mult": 2, "eta_min": 1e-07}
```
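The keys in `lr_scheduler_init` (`T_0`, `T_mult`, `eta_min`) match the signature of `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts`; pairing the dict with that scheduler is an assumption based on the parameter names, sketched here with a dummy optimizer so the example is self-contained:

```python
import torch

lr_scheduler_init = {"T_0": 1, "T_mult": 2, "eta_min": 1e-07}

# Dummy parameter and optimizer purely so the sketch runs on its own.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.AdamW(params, weight_decay=1e-05, eps=1e-07, lr=1e-05)

# Cosine annealing with warm restarts: first restart after T_0 epochs,
# period doubling thereafter (T_mult=2), floor LR of eta_min.
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, **lr_scheduler_init)
```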

0 commit comments
