Execute a darfix workflow without GUI#
[1]:
import os
from orangecontrib.darfix import tutorials
From CLI using ewoks#
You can execute darfix workflows like any other ewoks workflow using the ewoks execute command. Parameter values are provided with the --parameter option (or its -p alias). For example, to execute a workflow named my_darfix_workflow.ows while setting two parameters, the detector HDF5 dataset and the HDF5 positioner dataset (a.k.a. metadata), run the following command:
ewoks execute /home/esrf/payno/Documents/my_darfix_workflow.ows --parameter filenames=silx:///data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5?path=/1.1/instrument/pco_ff/data --parameter metadata_url=silx:///data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5?path=/1.1/instrument/positioners
Warning: the HDF5 datasets must be provided as silx URLs.
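Writing silx URLs by hand is error-prone; silx can build them for you. A minimal sketch, assuming silx is installed (it is a darfix dependency):
from silx.io.url import DataUrl

# Build the silx URL of the detector dataset used in the command above
detector_url = DataUrl(
    file_path="/data/scisoft/darfix/datasets/bliss_hdf5/Silicon_111_reflection_0003/Silicon_111_reflection_0003.h5",
    data_path="/1.1/instrument/pco_ff/data",
    scheme="silx",
)
print(detector_url.path())  # silx:///...?path=/1.1/instrument/pco_ff/data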
From Python#
Step 1: Define task arguments which are not defined in the workflow#
In this example we focus on an HDF5 dataset.
[2]:
root_dir = "/tmp/darfix"
os.makedirs(root_dir, exist_ok=True)
hdf5_file = os.path.join(tutorials.__path__[0], "hdf5_dataset", "strain.hdf5")
assert os.path.exists(hdf5_file)
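Optionally, you can check that the HDF5 paths expected by the workflow exist before running it. A minimal sketch, assuming h5py is available (it is installed together with silx):
import h5py

# The detector data and positioners group used by the workflow below
with h5py.File(hdf5_file, "r") as h5:
    assert "/1.1/instrument/my_detector/data" in h5
    assert "/1.1/instrument/positioners" in h5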
Step 2: Execute the workflow#
For more information on executing ewoks workflows, see https://workflow.gitlab-pages.esrf.fr/ewoks/ewoks/
[3]:
from ewoks import execute_graph
graph = os.path.join(tutorials.__path__[0], "darfix_example_hdf.ows")
results = execute_graph(
graph,
inputs=[
{
"name": "raw_input_file",
"value": hdf5_file,
},
{
"name": "raw_detector_data_path",
"value": "/1.1/instrument/my_detector/data",
},
{"name": "raw_positioners_group_path", "value": "/1.1/instrument/positioners"},
{
"name": "in_memory",
"value": True,
},
],
output_tasks=True,
)
Loading data in memory: 100%|██████████| 2/2 [00:00<00:00, 1288.37it/s]
WARNING:/usr/local/lib/python3.9/site-packages/darfix/tasks/roi.py:Cannot apply a ROI if origin ([]) or size ([]) is empty. Dataset is unchanged.
Moments: compute mean 1/4: 100%|██████████| 2/2 [00:00<00:00, 83.43it/s]
Moments: compute var 2/4: 100%|██████████| 2/2 [00:00<00:00, 338.43it/s]
Moments: compute skew 3/4: 100%|██████████| 2/2 [00:00<00:00, 43.66it/s]
Moments: compute kurt 4/4: 100%|██████████| 2/2 [00:00<00:00, 64.52it/s]
Inspect the results#
[4]:
for node_id, task in results.items():
assert task.succeeded, f"task {node_id} failed"
print(task.get_output_values())
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
{}
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
{'zsum': array([[178, 214, 218, ..., 217, 208, 221],
[215, 219, 212, ..., 227, 225, 222],
[377, 223, 221, ..., 217, 217, 211],
...,
[167, 188, 197, ..., 191, 199, 190],
[185, 184, 181, ..., 178, 183, 191],
[176, 198, 203, ..., 181, 195, 187]], dtype=uint64)}
{'dataset': Dataset(dataset=<darfix.core.dataset.ImageDataset object at 0x7f2bd1d01e50>, indices=None, bg_indices=None, bg_dataset=None)}
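Specific outputs can also be picked out of the task objects. A minimal sketch that collects the zsum array printed above and saves it under root_dir (saving to root_dir is our own choice here, not something the workflow requires):
import numpy

for node_id, task in results.items():
    outputs = task.get_output_values()
    if "zsum" in outputs:
        # Save the zsum array produced by one of the workflow tasks
        numpy.save(os.path.join(root_dir, "zsum.npy"), outputs["zsum"])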
Note#
The same can be done for EDF datasets. In this case, the metadata_url parameter should be omitted and the filenames parameter should contain the list of EDF image paths:
from glob import glob
root_dir = "/tmp/darfix"
os.makedirs(root_dir, exist_ok=True)
filenames = glob(os.path.join(tutorials.__path__[0], "*.edf"))
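A minimal sketch of the corresponding execute_graph call, assuming a workflow file built for EDF data (the name darfix_example_edf.ows is hypothetical, adapt it to your own .ows file) and reusing the in_memory option from the HDF5 example:
from ewoks import execute_graph

graph = os.path.join(tutorials.__path__[0], "darfix_example_edf.ows")  # hypothetical name
results = execute_graph(
    graph,
    inputs=[
        # EDF data: pass the image paths directly; no metadata_url is needed
        {"name": "filenames", "value": filenames},
        {"name": "in_memory", "value": True},
    ],
    output_tasks=True,
)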