Execute a darfix workflow without a GUI#

[1]:
import os
from orangecontrib.darfix import tutorials

From the CLI using ewoks#

You can execute darfix workflows like any ewoks workflow using the ewoks execute command. Parameter values can be provided with the --parameter option (or the -p alias). For example, to execute a workflow named my_darfix_workflow.ows while setting two parameters, the detector HDF5 dataset and the HDF5 positioner dataset (i.e. the metadata), use the following command:

ewoks execute /home/esrf/payno/Documents/my_darfix_workflow.ows --parameter filenames=silx:///data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5?path=/1.1/instrument/pco_ff/data --parameter metadata_url=silx:///data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5?path=/1.1/instrument/positioners

Warning: the HDF5 datasets must be provided as silx URLs.
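The silx URLs above pack a file path and an HDF5 data path into a single string. As a rough sketch of that shape only (silx.io.url.DataUrl is the proper way to build and parse such URLs; the helper below is not part of silx):

```python
def silx_url(file_path: str, data_path: str) -> str:
    """Build a silx-style URL string: silx://<file_path>?path=<data_path>.

    With an absolute file path this yields the triple-slash form
    silx:///... seen in the command above.
    """
    return f"silx://{file_path}?path={data_path}"

url = silx_url(
    "/data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5",
    "/1.1/instrument/pco_ff/data",
)
print(url)
```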

From python#

Step 1: Define task arguments which are not defined in the workflow#

In this example we focus on an HDF5 dataset.

[2]:
from silx.io.url import DataUrl

root_dir = "/tmp/darfix"
os.makedirs(root_dir, exist_ok=True)
hdf5_file = os.path.join(tutorials.__path__[0], "hdf5_dataset", "strain.hdf5")
assert os.path.exists(hdf5_file)
filenames = (
    DataUrl(
        file_path=hdf5_file,
        data_path="1.1/instrument/my_detector/data",
        scheme="silx",
    ),
)
metadata_url = DataUrl(
    file_path=hdf5_file,
    data_path="1.1/instrument/positioners",
    scheme="silx",
)

Step 2: Execute the workflow#

For more information on executing ewoks workflows, see https://workflow.gitlab-pages.esrf.fr/ewoks/ewoks/

[3]:
from ewoks import execute_graph
graph = os.path.join(tutorials.__path__[0], "darfix_example2.ows")
results = execute_graph(
    graph,
    inputs=[
        {"name": "filenames", "value": filenames,},
        {"name": "metadata_url", "value": metadata_url,},
        {"name": "in_memory", "value": False,},
    ],
    output_tasks=True,
)
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] workflow started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '0'] [task 'darfix.core.process.DataSelection'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '0'] [task 'darfix.core.process.DataSelection'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '6'] [task 'darfix.core.process.MetadataTask'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '6'] [task 'darfix.core.process.MetadataTask'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '5'] [task 'darfix.core.process.DimensionDefinition'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '5'] [task 'darfix.core.process.DimensionDefinition'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '1'] [task 'darfix.core.process.RoiSelection'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '1'] [task 'darfix.core.process.RoiSelection'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '2'] [task 'darfix.core.process.NoiseRemoval'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '2'] [task 'darfix.core.process.NoiseRemoval'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '3'] [task 'darfix.core.process.ShiftCorrection'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '3'] [task 'darfix.core.process.ShiftCorrection'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '4'] [task 'darfix.core.process.ZSum'] task started
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] [node '4'] [task 'darfix.core.process.ZSum'] task finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] [workflow 'Darfix example 2'] workflow finished
INFO:ewokscore.events.global_state:[job '2ebe92f3-838f-49a0-a63e-9e86e22304d6'] job finished
Applying roi |████████████████████████████████████████████████████████████████████████████████████████████████████| 100.0%
Applying background subtraction |████████████████████████████████████████████████████████████████████████████████████████████████████| 100.0%
Applying hot pixel removal |████████████████████████████████████████████████████████████████████████████████████████████████████| 100.0%
Applying threshold |████████████████████████████████████████████████████████████████████████████████████████████████████| 100.0%

Inspect the results#

[4]:
for node_id, task in results.items():
    assert task.succeeded, f"task {node_id} failed"
    print(task.get_output_values())
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9c9cd0>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9c9cd0>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9c9cd0>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9df280>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9eb550>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9eb550>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.Dataset object at 0x7f3a6b9eb550>, indices=None, bg_indices=None, bg_dataset=None)}
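With output_tasks=True, execute_graph returns a mapping from node id to the executed task object. As a minimal sketch of working with such a mapping (FakeTask and its outputs are stand-ins for this illustration only, not darfix or ewoks classes), a single node's output can be retrieved by its id:

```python
class FakeTask:
    """Stand-in for an executed ewoks task object (illustration only)."""

    def __init__(self, outputs):
        self.succeeded = True
        self._outputs = outputs

    def get_output_values(self):
        return dict(self._outputs)

# A results mapping shaped like the one returned by execute_graph(..., output_tasks=True)
results = {
    "0": FakeTask({"dataset": "selected"}),
    "4": FakeTask({"dataset": "zsum"}),
}

# Retrieve the output of one specific node, e.g. node "4"
assert results["4"].succeeded
zsum_output = results["4"].get_output_values()
print(zsum_output)
```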

Note#

The same can be done for EDF datasets. In that case, the metadata_url parameter should be omitted and the filenames parameter should contain the list of EDF image paths:

from glob import glob
root_dir = "/tmp/darfix"
os.makedirs(root_dir, exist_ok=True)
filenames = glob(os.path.join(tutorials.__path__[0], "*.edf"))