Command-line interface

Pipeline2app's command-line interface consists of a number of sub-commands under the pipeline2app command. To save on keystrokes, the main command is also aliased to p2a.
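
For example, the following two invocations are equivalent (the target and spec path shown are illustrative):

    pipeline2app make xnat ./specs
    p2a make xnat ./specs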

pipeline2app bootstrap

Generate a YAML specification file for a Pipeline2app App

pipeline2app bootstrap [OPTIONS] OUTPUT_FILE

Options

-t, --title <title>

The title of the image

-u, --docs-url <docs_url>

URL explaining the tool/workflow that is being wrapped into an app

-r, --registry <registry>

The Docker registry of the image

-d, --description <description>

A longer form description of the tool/workflow implemented in the pipeline

-a, --author <name> <email>

The name and email address of an author of the image

-b, --base-image <attr> <value>

Set one of the attributes of the base-image, e.g. '--base-image name debian', '--base-image package_manager apt', '--base-image tag focal', '--base-image conda_env base', or '--base-image python /usr/bin/python3.7'

-v, --version <version>

The version of the image

-t, --command-task <command_task>

The command to execute in the image

-y, --packages-pip <package-name>[==<version>]

Packages to install via pip

-s, --packages-system <package-name>[==<version>]

Packages to install via the system package manager

-n, --packages-neurodocker <package-name>[==<version>]

Packages to install via NeuroDocker

-i, --command-input <name> <attrs>

Input specifications, name and attribute pairs. Attributes are comma-separated name/value pairs, e.g. 'datatype=str,configuration.argstr=,configuration.position=0,help=The input image'

-o, --command-output <name> <attrs>

Output specifications, name and attribute pairs. Attributes are comma-separated name/value pairs, e.g. 'datatype=str,configuration.argstr=,configuration.position=1,help=The output image'

-p, --command-parameter <name> <attrs>

Parameter specifications, name and attribute pairs. Attributes are comma-separated name/value pairs, e.g. 'datatype=str,help=compression level'

-c, --command-configuration <name> <value>

Command configuration value

-f, --frequency <frequency>

The level in the data tree that the pipeline will operate on, e.g. common:Clinical[session] designates that the pipeline runs on 'sessions' as opposed to 'subjects'

-l, --license <license-name> <path-to-license-file> <info-url> <description>

Licenses that are required at runtime within the image. The name is used to refer to the license when providing a license file at build time, or alternatively when installing the license in the data store. The path is where the license file will be installed within the image. The info URL is where details of the license can be found and where it can be acquired. The description gives a brief account of the license and what it is required for

--name <name>

The name of the command

Arguments

OUTPUT_FILE

Required argument
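
A hypothetical invocation combining the options above (all names, versions and attribute values are illustrative, following the formats given in the option descriptions):

    pipeline2app bootstrap \
        --title "My App" \
        --version 0.1.0 \
        --author "Jane Doe" jane.doe@example.com \
        --base-image name debian \
        --base-image tag focal \
        --packages-pip numpy==1.26.4 \
        --command-input in_file 'datatype=str,configuration.argstr=,configuration.position=0,help=The input image' \
        --command-output out_file 'datatype=str,configuration.argstr=,configuration.position=1,help=The output image' \
        my-app.yaml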

pipeline2app make

Construct and build a Docker image containing a pipeline to be run on data stored in a data repository or structure (e.g. XNAT Container Service pipeline or BIDS App)

TARGET is the type of image to build. For standard images, only the name of the pipeline2app sub-package is required (e.g. 'xnat' or 'common'). However, specific App subclasses can be specified using the <module-path>:<app-class-name> format, e.g. pipeline2app.xnat:XnatApp

SPEC_PATH is the file system path to the specification to build, or directory containing multiple specifications

pipeline2app make [OPTIONS] TARGET SPEC_PATH

Options

--registry <registry>

The Docker registry to deploy the pipeline to

--build-dir <build_dir>

Specify the directory to build the Docker image in. Defaults to .build in the directory containing the YAML specification

--release <release-name> <release-version>

Name and version of the release for the package as a whole (i.e. for all pipelines)

--tag-latest, --dont-tag-latest

Whether to tag the release as "latest" or not

--save-manifest <save_manifest>

File path at which to save the build manifest

--logfile <logfile>

Log output to file instead of stdout

--loglevel <loglevel>

The level to display logs at

--use-local-packages, --dont-use-local-packages

Use locally installed Python packages, instead of pulling them down from PyPI

--install-extras <install_extras_str>

Install extras to use when installing Pipeline2app inside the container image. Typically only used in tests to provide the 'test' extra

--for-localhost, --not-for-localhost

Build the image so that it can be run in Pipeline2app's test configuration (only for internal use)

--raise-errors, --log-errors

Raise exceptions instead of logging failures

--generate-only, --build

Just create the build directory and Dockerfile; do not build the image

--license <license-name> <path-to-license-file>

Licenses provided at build time to be stored in the image (instead of downloaded at runtime)

--license-to-download <license_to_download>

Specify licenses that are not provided at build time and are instead downloaded from the data store at runtime in order to satisfy their conditions

--check-registry, --dont-check-registry

Check the registry to see if an existing image with the same tag is present, and if so whether the specification matches (and can be skipped) or not (raise an error)

--push, --dont-push

Push built images to the registry

--clean-up, --dont-clean-up

Remove built images after they are pushed to the registry

--spec-root <spec_root>

The root path to consider the specs to be relative to, defaults to CWD

-s, --source-package <source_package>

Path to a local Python package to be included in the image. Needs to have a package definition that can be built into a source distribution and the name of the directory needs to match that of the package to be installed. Multiple packages can be specified by repeating the option.

-e, --export-file <internal-dir> <external-dir>

Path to be exported from the Docker build directory for convenience. Multiple files can be specified by repeating the option.

--resource <name> <path>

Supply resources to be copied into the image; the name should match the name of a resource within the spec

--resources-dir <path>

A directory containing resources to be copied into the image; the names of its sub-directories should match the names of resources within the spec

--stream-logs, --no-stream-logs

Stream the build logs to stdout as they are generated. Defaults to True if the log-level <= info

Arguments

TARGET

Required argument

SPEC_PATH

Required argument
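
A hypothetical invocation that builds all specifications under an illustrative ./specs directory for the 'xnat' sub-package and pushes the resulting images (the registry shown is illustrative):

    pipeline2app make \
        --registry ghcr.io \
        --push \
        --loglevel info \
        xnat ./specs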

pipeline2app make-docs

Build docs for one or more YAML wrappers

SPEC_PATH is the path of a YAML spec file or directory containing one or more such files.

The generated documentation will be saved to OUTPUT.

pipeline2app make-docs [OPTIONS] SPEC_PATH OUTPUT

Options

--registry <registry>

The Docker registry to deploy the pipeline to

--flatten, --no-flatten

--loglevel <loglevel>

The level to display logs at

--default-axes <default_axes>

The default axes to assume if it isn't explicitly stated in the command

--spec-root <spec_root>

The root path to consider the specs to be relative to, defaults to CWD

Arguments

SPEC_PATH

Required argument

OUTPUT

Required argument
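
An illustrative invocation (both paths are hypothetical):

    pipeline2app make-docs ./specs ./docs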

pipeline2app list-images

Walk through the specification paths and list the tags of the images that will be built from them.

SPEC_ROOT is the file system path to the specification to build, or directory containing multiple specifications

DOCKER_ORG is the Docker organisation the images should belong to

pipeline2app list-images [OPTIONS] SPEC_ROOT

Options

--registry <registry>

The Docker registry to deploy the pipeline to

Arguments

SPEC_ROOT

Required argument
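
An illustrative invocation (registry and path are hypothetical):

    pipeline2app list-images --registry ghcr.io ./specs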

pipeline2app inspect-docker-exec

Extract the executable from a Docker image

pipeline2app inspect-docker-exec [OPTIONS] IMAGE_TAG

Arguments

IMAGE_TAG

Required argument
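
An illustrative invocation (the image tag shown is hypothetical):

    pipeline2app inspect-docker-exec ghcr.io/myorg/my-app:1.0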

pipeline2app required-packages

Detect the Python packages required to run the specified workflows and return them and their versions

pipeline2app required-packages [OPTIONS] [TASK_LOCATIONS]...

Arguments

TASK_LOCATIONS

Optional argument(s)
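
An illustrative invocation, assuming task locations are given in <module-path>:<task-name> form, in line with the TARGET format described under pipeline2app make (the location shown is hypothetical):

    pipeline2app required-packages mypackage.tasks:my_task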

pipeline2app changelog

Displays the changelogs found in the release manifest of a deployment build

MANIFEST_JSON is a JSON file containing a list of container images built in the release and the commands present in them

pipeline2app changelog [OPTIONS] MANIFEST_JSON [IMAGES]...

Arguments

MANIFEST_JSON

Required argument

IMAGES

Optional argument(s)
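
An illustrative invocation (the manifest path and image name are hypothetical):

    pipeline2app changelog release-manifest.json myorg/my-app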

pipeline2app pipeline-entrypoint

Loads/creates a dataset, then applies and launches a pipeline in a single command. To be used within the command configuration of an XNAT Container Service ready Docker image.

ADDRESS is a string containing the nickname of the data store, the ID of the dataset (e.g. an XNAT project ID or file-system directory) and the dataset's name, in the format <store-nickname>//<dataset-id>[@<dataset-name>]

pipeline2app pipeline-entrypoint [OPTIONS] ADDRESS

Options

-i, --input <col-name> <match-criteria>

The match criteria to pass to the column

-o, --output <col-name> <output-path>

The path in which to store the output of the pipeline

-p, --parameter <name> <value>

Sets a parameter of the workflow

--dataset-name <dataset_name>

The name of the dataset. Will be created if not present. FrameSet names are used to separate different analyses performed on the same data into different namespaces

--overwrite, --no-overwrite

Whether to overwrite a saved pipeline with the same name, and its parameterisation, if present

--command <command>

Which command to run, defaults to the first (or only) command installed.

-w, --work <work_dir>

The location of the directory where the working files created during the pipeline execution will be stored

--export-work <export_work>

Export the work directory to another location after the task/workflow exits (used for post-hoc investigation of completed workflows in situations where the scratch space is inaccessible after the workflow exits)

--plugin <plugin>

The Pydra plugin with which to process the task/workflow

--loglevel <loglevel>

The level of detail at which logging information is presented

--ids <ids>

List of IDs to restrict the pipeline execution to (i.e. don't execute over the whole dataset)

--dataset-hierarchy <dataset_hierarchy>

Comma-separated hierarchy of the dataset (see http://pipeline2app.readthedocs.io/data_model.html)

--raise-errors, --catch-errors

Raise exceptions instead of capturing them to suppress the call stack

--keep-running-on-errors, --exit-on-errors

Keep the pipeline running in an infinite loop on error (it will need to be killed manually). This can be useful when the enclosing container is removed on completion and you need to be able to 'exec' into the container to debug.

--spec-path <spec_path>

Used to specify a path to the spec file other than the default one written into the image (typically used in debugging/testing)

Arguments

ADDRESS

Required argument
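
A hypothetical invocation following the ADDRESS format above (the store nickname, dataset ID, column names and values are all illustrative):

    pipeline2app pipeline-entrypoint \
        --input T1w '.*t1.*' \
        --output brain_mask 'derivs/brain_mask' \
        --parameter threshold 0.5 \
        --work /tmp/work \
        xnat-central//MYPROJECT@myanalysis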