From f21713ca4f272879127da3064a7959a2d76312b7 Mon Sep 17 00:00:00 2001
From: m574s
Date: Tue, 28 May 2024 10:47:51 +0200
Subject: [PATCH 1/3] Fixed Finetuning Docu

---
 documentation/pretraining_and_finetuning.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/documentation/pretraining_and_finetuning.md b/documentation/pretraining_and_finetuning.md
index 5360eb74e..0d6d8226c 100644
--- a/documentation/pretraining_and_finetuning.md
+++ b/documentation/pretraining_and_finetuning.md
@@ -38,7 +38,7 @@ nnUNetv2_extract_fingerprint -d SOURCE_DATASET
 Now we can take the plans from the target dataset and transfer it to the source:
 
 ```bash
-nnUNetv2_move_plans_between_datasets -s TARGET_DATASET -t SOURCE_DATASET -sp TARGET_PLANS_IDENTIFIER -tp SOURCE_PLANS_IDENTIFIER
+nnUNetv2_move_plans_between_datasets -s TARGET_DATASET -t SOURCE_DATASET -sp SOURCE_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
 ```
 
 `SOURCE_PLANS_IDENTIFIER` is hereby probably nnUNetPlans unless you changed the experiment planner in

From ff9dda8558ded10806d9f6d104002d7033543c0b Mon Sep 17 00:00:00 2001
From: m574s
Date: Tue, 28 May 2024 10:54:12 +0200
Subject: [PATCH 2/3] Fix docu of move_plans_between_datasets

---
 documentation/pretraining_and_finetuning.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/documentation/pretraining_and_finetuning.md b/documentation/pretraining_and_finetuning.md
index 0d6d8226c..1049f19d2 100644
--- a/documentation/pretraining_and_finetuning.md
+++ b/documentation/pretraining_and_finetuning.md
@@ -38,7 +38,7 @@ nnUNetv2_extract_fingerprint -d SOURCE_DATASET
 Now we can take the plans from the target dataset and transfer it to the source:
 
 ```bash
-nnUNetv2_move_plans_between_datasets -s TARGET_DATASET -t SOURCE_DATASET -sp SOURCE_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
+nnUNetv2_move_plans_between_datasets -s SOURCE_DATASET -t TARGET_DATASET -sp SOURCE_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
 ```
 
 `SOURCE_PLANS_IDENTIFIER` is hereby probably nnUNetPlans unless you changed the experiment planner in

From 46da54c990951cd8306a43e882b97b12485f2373 Mon Sep 17 00:00:00 2001
From: m574s
Date: Wed, 29 May 2024 10:02:38 +0200
Subject: [PATCH 3/3] More changes to pretraining docu

---
 documentation/pretraining_and_finetuning.md | 22 ++++++++++----------
 1 file changed, 11 insertions(+), 11 deletions(-)

diff --git a/documentation/pretraining_and_finetuning.md b/documentation/pretraining_and_finetuning.md
index 1049f19d2..5c3f4d0c2 100644
--- a/documentation/pretraining_and_finetuning.md
+++ b/documentation/pretraining_and_finetuning.md
@@ -2,7 +2,7 @@
 
 ## Intro
 
-So far nnU-Net only supports supervised pre-training, meaning that you train a regular nnU-Net on some source dataset
+So far nnU-Net only supports supervised pre-training, meaning that you train a regular nnU-Net on some pretraining dataset
 and then use the final network weights as initialization for your target dataset.
 
 As a reminder, many training hyperparameters such as patch size and network topology differ between datasets as a
@@ -16,11 +16,11 @@ how the resulting weights can then be used for initialization.
 
 Throughout this README we use the following terminology:
 
-- `source dataset` is the dataset you intend to run the pretraining on
+- `pretraining dataset` is the dataset you intend to run the pretraining on (former: source dataset)
 - `target dataset` is the dataset you are interested in; the one you wish to fine tune on
 
-## Pretraining on the source dataset
+## Training on the pretraining dataset
 
 In order to obtain matching network topologies we need to transfer the plans from one dataset to another. Since we are
 only interested in the target dataset, we first need to run experiment planning (and preprocessing) for it:
 
@@ -29,19 +29,19 @@ only interested in the target dataset, we first need to run experiment planning
 nnUNetv2_plan_and_preprocess -d TARGET_DATASET
 ```
 
-Then we need to extract the dataset fingerprint of the source dataset, if not yet available:
+Then we need to extract the dataset fingerprint of the pretraining dataset, if not yet available:
 
 ```bash
-nnUNetv2_extract_fingerprint -d SOURCE_DATASET
+nnUNetv2_extract_fingerprint -d PRETRAINING_DATASET
 ```
 
-Now we can take the plans from the target dataset and transfer it to the source:
+Now we can take the plans from the target dataset and transfer it to the pretraining dataset:
 
 ```bash
-nnUNetv2_move_plans_between_datasets -s SOURCE_DATASET -t TARGET_DATASET -sp SOURCE_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
+nnUNetv2_move_plans_between_datasets -s PRETRAINING_DATASET -t TARGET_DATASET -sp PRETRAINING_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
 ```
 
-`SOURCE_PLANS_IDENTIFIER` is hereby probably nnUNetPlans unless you changed the experiment planner in
+`PRETRAINING_PLANS_IDENTIFIER` is hereby probably nnUNetPlans unless you changed the experiment planner in
 nnUNetv2_plan_and_preprocess. For `TARGET_PLANS_IDENTIFIER` we recommend you set something custom in order to not
 overwrite default plans.
 
@@ -51,16 +51,16 @@ work well (but it could, depending on the schemes!).
 Note on CT normalization: Yes, also the clip values, mean and std are transferred!
 
-Now you can run the preprocessing on the source task:
+Now you can run the preprocessing on the pretraining dataset:
 
 ```bash
-nnUNetv2_preprocess -d SOURCE_DATSET -plans_name TARGET_PLANS_IDENTIFIER
+nnUNetv2_preprocess -d PRETRAINING_DATASET -plans_name TARGET_PLANS_IDENTIFIER
 ```
 
 And run the training as usual:
 
 ```bash
-nnUNetv2_train SOURCE_DATSET CONFIG all -p TARGET_PLANS_IDENTIFIER
+nnUNetv2_train PRETRAINING_DATASET CONFIG all -p TARGET_PLANS_IDENTIFIER
 ```
 
 Note how we use the 'all' fold to train on all available data. For pretraining it does not make sense to split the data.
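As an aside for readers less familiar with this format: the series above is standard `git format-patch` output and can be replayed with `git am`. Below is a minimal, self-contained sketch of that round trip on a throwaway repository; every repository, file, and commit name in it is invented for illustration and is not part of the patches above.

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# A throwaway repository with two commits on one file.
git init -q upstream
cd upstream
git config user.email demo@example.com
git config user.name demo
echo "old line" > doc.md
git add doc.md
git commit -q -m "add doc"
echo "new line" > doc.md
git commit -q -am "fix doc"

# Export the last commit in the same mbox format as the series above ...
git format-patch -q -1 -o ../patches

# ... and replay it onto a clone that does not yet have the fix.
cd ..
git clone -q upstream downstream
cd downstream
git config user.email demo@example.com
git config user.name demo
git reset -q --hard HEAD~1          # drop the fix locally
git am -q ../patches/0001-*.patch   # re-apply it from the patch file
```

Against the real repository the equivalent is `git am` on the three patch files in order; if a patch fails to apply, `git am --abort` rolls the working tree back.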