diff --git a/.github/workflows/deploy.yml b/.github/workflows/deploy.yml
index d26025f..5974ccf 100644
--- a/.github/workflows/deploy.yml
+++ b/.github/workflows/deploy.yml
@@ -2,6 +2,12 @@ name: Deploy to GitHub Pages
on:
workflow_call:
+ inputs:
+ dry-run:
+ description: "Whether to do a dry run (test the deployment without actually publishing)"
+ required: false
+ default: false
+ type: boolean
# Allow this job to clone the repo and create a page deployment
permissions:
@@ -93,6 +99,7 @@ jobs:
deploy:
needs: build
runs-on: ubuntu-latest
+ if: ${{ !inputs.dry-run }}
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
diff --git a/.github/workflows/pr_checks.yml b/.github/workflows/pr_checks.yml
index d710d06..092261d 100644
--- a/.github/workflows/pr_checks.yml
+++ b/.github/workflows/pr_checks.yml
@@ -5,74 +5,12 @@ on:
- main
jobs:
- markdown-components:
- name: markdown-modules
- runs-on: ubuntu-latest
- steps:
- - uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
- with:
- repository: nf-neuro/modules
- ref: main
- path: nf-neuro
- fetch-depth: 0
- - name: Setup python
- uses: actions/setup-python@v5.4.0
- with:
- python-version: "3.10"
- cache: pip
- - name: Install dependencies
- run: |
- pip install pyyaml jinja2
- - name: Generate markdown
- id: generate-module
- run: |
- shopt -s globstar
- function convert_module () {
- local file="$1"
- local module_name=$(basename "$(dirname "$file")")
- local category=$(basename "$(dirname "$(dirname "$file")")")
- mkdir -p modules/$category
- cd nf-neuro
- python docs/astro/convert_module.py ../$(dirname "$file") ${{ github.sha }} ../modules/${category}/${module_name}.md
- }
- function convert_subworkflow () {
- local file="$1"
- local sbwf_name=$(basename "$(dirname "$file")")
- cd nf-neuro
- python docs/astro/convert_subworkflow.py ../$(dirname "$file") ${{ github.sha }} ../subworkflows/${sbwf_name}.md
- }
- export -f convert_module
- export -f convert_subworkflow
-
- mkdir -p modules/ subworkflows/
- parallel --jobs $(nproc) convert_module ::: nf-neuro/modules/nf-neuro/**/meta.yml
- parallel --jobs $(nproc) convert_subworkflow ::: nf-neuro/subworkflows/nf-neuro/**/meta.yml
-
- - name: Pack into artifact
- uses: actions/upload-artifact@v4
- with:
- name: markdown
- path: |
- modules/
- subworkflows/
-
- build:
- needs: markdown-components
- runs-on: ubuntu-latest
- steps:
- - name: Checkout website code
- uses: actions/checkout@v4
- with:
- repository: nf-neuro/website
-
- - name: Download components markdown
- uses: actions/download-artifact@v4
- with:
- name: markdown
- path: src/content/docs/api/
-
- - name: Install, build, and upload your site
- uses: withastro/action@v2
- with:
- node-version: 22
- package-manager: npm@latest
+ pr_checks:
+ if: github.repository == 'nf-neuro/website'
+ permissions:
+ contents: read
+ pages: write
+ id-token: write
+ uses: ./.github/workflows/deploy.yml
+ with:
+ dry-run: true
diff --git a/src/content/docs/contribute/create-your-subworkflow/3-optional-inputs.mdx b/src/content/docs/contribute/create-your-subworkflow/3-optional-inputs.mdx
index 78b3490..9e5d352 100644
--- a/src/content/docs/contribute/create-your-subworkflow/3-optional-inputs.mdx
+++ b/src/content/docs/contribute/create-your-subworkflow/3-optional-inputs.mdx
@@ -68,7 +68,7 @@ main:
:::
-## Skipping a component execution given an optional input
+## Skipping a component execution
If a brain mask is given as input, then the brain masking modules included in the subworkflow should not be executed. This is
not as simple as doing a conditional test on `ch_brain_mask`, it will return true even if
diff --git a/src/content/docs/contribute/create-your-subworkflow/4-configuration.mdx b/src/content/docs/contribute/create-your-subworkflow/4-configuration.mdx
index b22de59..3dec304 100644
--- a/src/content/docs/contribute/create-your-subworkflow/4-configuration.mdx
+++ b/src/content/docs/contribute/create-your-subworkflow/4-configuration.mdx
@@ -37,32 +37,43 @@ For subworkflows, configuration acts on two fronts :
2. Configuring the **behavior of its included components** to fit its use-case, and allows end-users to modify it.
-## Define subworkflow behaviors
-
:::caution
-**NEVER** define a configuration logic using the **content of a channel**. Channels evaluate to true
-nonetheless, even when empty. You need to use configuration through `params` or `task.ext`, or apply
-[nextflow operators](https://www.nextflow.io/docs/latest/reference/operator.html) to unravel the logic at runtime by
-removing data that should not be processed.
+**NEVER** define configuration logic using the **content of a channel**, such as :
+
+```groovy
+if (ch_brain_mask) {
+ // do something
+}
+```
+
+Channels always evaluate to true, even when empty. You need to use one of the following instead :
+- configuration through an [input `options` map](#define-subworkflow-behaviors)
+- configuration [parameters (`params`) or module arguments (`task.ext`)](#change-components-behaviors)
+- [logic on the channel's content](/contribute/create-your-subworkflow/3-optional-inputs) using
+  [nextflow operators](https://www.nextflow.io/docs/latest/reference/operator.html), as sketched below
+
:::
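+For the operator-based option, here is a minimal sketch (using the `ch_anatomical` and `ch_brain_mask` channel names from the
+`preproc_anat` examples below ; the [optional inputs page](/contribute/create-your-subworkflow/3-optional-inputs) covers the full
+pattern) of how to split items based on whether an optional mask is actually present, instead of testing the channel object itself :
+
+```groovy
+// Join the optional mask channel, then branch on each item's content :
+// items that carry a mask go one way, items without a mask go the other.
+ch_split = ch_anatomical
+    .join(ch_brain_mask, remainder: true)
+    .branch { meta, image, mask ->
+        with_mask: mask
+        without_mask: !mask
+    }
+// ch_split.with_mask and ch_split.without_mask can then feed different modules.
+```
+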
-Subworkflows parameters are defined using the `params` configuration scope. This scope is available in all **_.nf_** files, but
-`params` defined there can be overriden by **_.config_** files at execution, or by other subworkflows that would include yours.
-For the `preproc_anat` subworkflow, there are no subworkflow parameters defined for now, but we could, for example, add one to
+## Define subworkflow behaviors
+
+Changing the behavior of a subworkflow (the code contained in its `main` block) is done via an **input `options` map**. This `options` input is
+available to other subworkflows and pipelines that include yours, so they can customize its execution. It is a
+[map data structure](https://groovy-lang.org/syntax.html#_maps) with `string` keys and values of any type. In the context of the `preproc_anat`
+subworkflow, no options are defined yet, but we could, for example, add one to
**skip the denoising step** :
```diff lang="groovy" title="main.nf"
...
-+params.preproc_anat_denoise = true
-
subworkflow preproc_anat {
take:
ch_anatomical // Structure : [ [id: string] , path(anat_image) ]
...
++ options // Structure : map(options), optional
main:
++ def preproc_anat_denoise = options.withDefault{ true }.preproc_anat_denoise as Boolean
...
-+ if (params.preproc_anat_denoise) {
++ if (preproc_anat_denoise) {
ch_denoising_nlmeans = ch_anatomical
.join(ch_brain_mask, remainder: true)
.map{ meta, image, mask -> [meta, image, [], mask ?: []] }
@@ -82,12 +93,36 @@ main:
```
:::tip[What is happening here ?]
-- The DENOISING_NLMEANS module is only run if the parameter `preproc_anat_denoise` is set
- to true. When it is, it changes the channel assigned to `ch_anatomical` for the output
- of the module.
+- The DENOISING_NLMEANS module is only run if the option `preproc_anat_denoise` is set
+  to true in the input `options` map. When it is, the channel assigned to `ch_anatomical`
+  is replaced with the module's output.
+- We want the DENOISING_NLMEANS module to run by default. To do so, we use the
+  [Groovy `withDefault` method](https://docs.groovy-lang.org/latest/html/groovy-jdk/java/util/Map.html#withDefault(groovy.lang.Closure)) to
+  fall back to `true` when `preproc_anat_denoise` is absent from the `options` map (a short standalone example follows this callout).
- The other downstream components then access this variable, instead of calling directly.
:::
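+To illustrate what `withDefault` does, here is a small standalone sketch in plain Groovy, outside of any Nextflow context :
+
+```groovy
+// Keys explicitly present in the map keep their value ;
+// absent keys fall back to the closure's return value instead of null.
+def options = [preproc_anat_denoise: false].withDefault { true }
+
+assert options.preproc_anat_denoise == false  // value provided by the caller wins
+assert options.preproc_anat_n4 == true        // missing key, so the default applies
+```
+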
+### Setting defaults from metadata
+
+You'll edit the **subworkflow metadata** later. For now, you only need to know that **you will define all options' default values there**. To ease
+linking those defaults in your subworkflow, we created a utility, `utils_options`. In the context of `preproc_anat`, it looks like this :
+
+```diff lang="groovy" title="main.nf"
+...
++include { getOptionsWithDefaults } from '../utils_options/main'
+
+workflow PREPROC_ANAT {
+take:
+ ch_anatomical // Structure : [ [id: string] , path(anat_image) ]
+ ...
+ options // Structure : map(options), optional
+main:
++ options = getOptionsWithDefaults(options, "${moduleDir}/meta.yml")
+
+ ...
+}
+```
+
## Change components behaviors
Changing the behavior of included components is not as straightforward as it seems.
@@ -109,8 +144,7 @@ you will document its existence and still use it profusely in tests :
```groovy title="nextflow.config"
params {
- preproc_anat_denoise = true
- preproc_anat_n4 = true
+
}
```
@@ -121,8 +155,6 @@ and only module, using a **process selector** :
```diff title="nextflow.config" lang="groovy" {"1": 4} {"2": 7-11}
params {
- preproc_anat_denoise = true
- preproc_anat_n4 = true
+ preproc_anat_nlmeans_number_of_coils = 1
}
@@ -155,9 +187,7 @@ configuration job for `preproc_anat`, but the implementation would be more compl
:::
-Configuration is done using the `params` scope directly in the **_main.nf_** file of your subworkflow, **after** the sub-components inclusion and
-**before** the `workflow` definition. **Use the same parameter names** as the ones defined in the subworkflow you include and they will
-**overwrite their values**.
+Configuration is done using the input `options` map. Simply pass the map to child subworkflows for them to unpack and apply.
@@ -174,10 +204,8 @@ include { BETCROP_ANTSBET } from '../../../modules/nf-neuro/betcrop/ant
include { PREPROC_N4 } from '../../../modules/nf-neuro/preproc/n4/main'
// SUBWORKFLOWS
include { ANATOMICAL_SEGMENTATION } from '../anatomical_segmentation/main'
-
-params.preproc_anat_denoise = true
-params.preproc_anat_bet_before_n4 = true
-params.preproc_anat_n4 = true
+// UTILITY
+include { getOptionsWithDefaults } from '../utils_options/main'
workflow PREPROC_ANAT {
take:
@@ -189,10 +217,12 @@ take:
ch_freesurferseg // Structure : [ [id: string] , path(aparc+aseg) , path(wmparc) ], optional
ch_lesion // Structure : [ [id: string] , path(lesion) ], optional
ch_fs_license // Structure : [ path(license) ], optional
+ options // Structure : map(options) , optional
main:
ch_versions = Channel.empty()
+ options = getOptionsWithDefaults(options, "${moduleDir}/meta.yml")
- if (params.preproc_anat_denoise) {
+ if (options.preproc_anat_denoise) {
ch_denoising_nlmeans = ch_anatomical
.join(ch_brain_mask, remainder: true)
.map{ meta, image, mask -> [meta, image, [], mask ?: []] }
@@ -203,7 +233,7 @@ main:
}
ch_brain_pre_mask = Channel.empty()
- if (params.preproc_anat_bet_before_n4) {
+ if (options.preproc_anat_bet_before_n4) {
ch_betcrop_synthbet = ch_anatomical
.join(ch_brain_mask, remainder: true)
.filter{ meta, image, mask -> !mask }
@@ -215,7 +245,7 @@ main:
ch_brain_pre_mask = ch_brain_mask.mix(BETCROP_SYNTHBET.out.brain_mask)
}
- if (params.preproc_anat_n4) {
+ if (options.preproc_anat_n4) {
ch_preproc_n4 = ch_anatomical
.join(ch_n4_reference, remainder: true)
.join(ch_brain_pre_mask, remainder: true)
@@ -239,7 +269,8 @@ main:
ch_anatomical,
ch_freesurferseg,
ch_lesion,
- ch_license
+ ch_license,
+ options // Pass the options map to child subworkflows for customization
)
emit:
ch_anatomical = ch_anatomical // channel: [ [id: string] , path(image) ]
@@ -258,10 +289,6 @@ emit:
```groovy title="nextflow.config"
params {
- preproc_anat_denoise = true
- preproc_anat_bet_before_n4 = true
- preproc_anat_n4 = true
-
// Configure DENOISING_NLMEANS
preproc_anat_nlmeans_number_of_coils = 1
preproc_anat_nlmeans_sigma = 0.5
@@ -276,10 +303,6 @@ params {
// Configure PREPROC_N4
preproc_anat_n4_knots_per_voxel = 1
preproc_anat_n4_shrink_factor = 1
-
- // Configure ANATOMICAL_SEGMENTATION
- run_synthbet = false // Reusing the same name, we could have changed if wanted
-
}
process {
diff --git a/src/content/docs/contribute/create-your-subworkflow/5-metadata.mdx b/src/content/docs/contribute/create-your-subworkflow/5-metadata.mdx
index 4e246c7..6e3f6bc 100644
--- a/src/content/docs/contribute/create-your-subworkflow/5-metadata.mdx
+++ b/src/content/docs/contribute/create-your-subworkflow/5-metadata.mdx
@@ -11,7 +11,7 @@ and dependencies** of your subworkflow, as well as other useful fields.
usage outputs for command line interfaces (CLI)**. All this really helps users and developers understand the purpose of
your subworkflow, its configuration and its behaviors.
-### The `description` field
+## The `description` field
Each subworkflow requires a thorough description of its main purpose, its configuration and its behaviors. In addition,
the description should contain a **concise explanation of the dataflow between the modules and other subworkflows used**.
@@ -29,7 +29,7 @@ description: |
+ [ ANATOMICAL_SEGMENTATION ] Segment the anatomical image into different tissue types (GM, WM, CSF).
```
-### The `keywords` field
+## The `keywords` field
The keywords field is a list of keywords that describe the subworkflow and **will help users find it through search tools**
such as the ones included on this website. **The keywords should be relevant to the subworkflow and its purpose**. For
@@ -51,7 +51,7 @@ description: |
+ - segmentation
```
-### The `components` field
+## The `components` field
The components field is a list of all the modules, then all subworkflows, included in the **_main.nf_** file :
@@ -72,7 +72,7 @@ keywords:
+ - anatomical_segmentation
```
-### The `args` field
+## The `args` field
The args field is a list of all the parameters (`params`) defined for the subworkflow. **Parameters defined in included
components MUST also be described here, unless they SHOULD be hidden**. For the `preproc_anat` subworkflow, use the following :
@@ -87,18 +87,6 @@ keywords:
components:
...
+args:
-+ - preproc_anat_denoise:
-+ type: boolean
-+ description: Denoise the anatomical image with the Non-Local Means algorithm.
-+ default: true
-+ - preproc_anat_bet_before_n4:
-+ type: boolean
-+ description: Perform brain extraction before intensity normalization.
-+ default: true
-+ - preproc_anat_n4:
-+ type: boolean
-+ description: Perform intensity normalization on the anatomical image.
-+ default: true
+ - preproc_anat_nlmeans_number_of_coils:
+ type: int
+ description: Number of receive coils used to acquire the anatomical image.
@@ -137,16 +125,13 @@ components:
+ - preproc_anat_n4_shrink_factor:
+ type: int
+ description: Resolution downsampling factor to use for intensity normalization.
-+ - run_synthbet:
-+ type: boolean
-+ description: (ANATOMICAL_SEGMENTATION) Run the synthetic brain extraction step.
-+ default: false
```
-### The `input` field
+## The `input` field
The input field lists all input channels **in order**. **Each entry defines the content structure of an input channel in
-a list of maps**. For the `preproc_anat` subworkflow, there are 8 inputs to document :
+a list of maps**. For the `preproc_anat` subworkflow, there are 8 inputs to document (excluding the `options` map, which is
+described just after) :
```diff lang="yaml"
---
@@ -234,7 +219,38 @@ args:
+ description: FreeSurfer license file.
```
-### The `output` field
+### Input `options` map
+
+To document this input, you need to describe and assign **default values** to all potential options. This is done using the `entries`
+key of its metadata field. The content of each entry follows the same form as for [arguments](#the-args-field). Below is an example
+for `preproc_anat` :
+
+```diff lang="yaml"
+input:
+...
++ - options:
++ description: Map of options for the preproc_anat subworkflow.
++ mandatory: false
++ entries:
++ preproc_anat_denoise:
++ type: boolean
++ description: Denoise the anatomical image with the Non-Local Means algorithm.
++ default: true
++ preproc_anat_bet_before_n4:
++ type: boolean
++ description: Perform brain extraction before intensity normalization.
++ default: true
++ preproc_anat_n4:
++ type: boolean
++ description: Perform intensity normalization on the anatomical image.
++ default: true
++ run_synthbet:
++ type: boolean
++ description: (ANATOMICAL_SEGMENTATION) Run the synthetic brain extraction step.
++ default: false
+```
+
+## The `output` field
Finally, it's time to document the `output`, the last metadata section ! They are **mostly structured the same way as the
`input`**, with `files` having an **additional `pattern` field** describing their naming convention and possible formats.