Coding patterns

Use coding best practices to ensure reliable and maintainable Custom Scripts.

Use the coding patterns below to improve script reliability, prevent data overwrites, and make pipeline execution easier to debug and maintain.

Use unique file paths for each step’s output file

Custom Scripts enables you to write output data files to the job's storage so that subsequent steps in the pipeline can access them. For example, your script can generate a `.json` artifact that another step can parse. When you write output files to the job storage, use file paths unique to that step's execution to prevent the files from being overwritten, which can happen if your script runs multiple times in a pipeline. Use the `{{system.stepId}}` parameter to build such paths.
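As a minimal sketch of this pattern: the step ID below is a hypothetical placeholder value, since the platform substitutes `{{system.stepId}}` before the script runs, and the `/tmp/workspace` base directory is illustrative.

```python
import json
import os

# Hypothetical placeholder: in a real Custom Script the platform substitutes
# {{system.stepId}} with the actual step ID before the script runs.
step_id = "step-1234"

# A per-step directory keeps repeated executions of the same script
# from overwriting each other's output files.
output_dir = os.path.join("/tmp/workspace", step_id)
os.makedirs(output_dir, exist_ok=True)
output_file = os.path.join(output_dir, "output.json")

with open(output_file, "w") as f:
    json.dump({"status": "done"}, f)

print(output_file)
```

Because each run writes under its own step-scoped directory, two executions of the same script in one pipeline produce separate files instead of racing on a shared path.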

Use error codes

Use logs and exit codes to ensure your scripts catch errors and to help your pipeline users understand what went wrong. Use the `exit` command to stop script execution and return an exit code. An exit code is an integer in the range 0–255 that helps you identify why script execution stopped; an exit code of `0` indicates that the execution was successful, and any non-zero code indicates a failure.
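The sketch below illustrates the idea by running a small child script in a subprocess and reading back its exit code; the child's logic and the choice of exit code `2` are hypothetical, not values defined by the platform.

```python
import subprocess
import sys

# Hypothetical child script: it logs an error and exits with a non-zero
# code when its input is missing. Any value in 1-255 can encode a reason.
child = (
    "import sys\n"
    "data = ''\n"  # pretend the step received empty input
    "if not data:\n"
    "    print('ERROR: no input data', file=sys.stderr)\n"
    "    sys.exit(2)\n"
)

# Run the child and inspect the exit code the pipeline would see.
result = subprocess.run([sys.executable, "-c", child])
print(result.returncode)
```

In a shell-based step you can inspect the same value with `echo $?` immediately after the script exits.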

Python code example

This is an example of a Custom Script that demonstrates how to save pipeline execution data to a JSON file. This code example shows the following:
  • Pattern for logging
  • Referencing system parameters
  • Getting all pipeline parameters
  • Pattern for saving output data
  • Pattern for error handling
```python
import json
import logging
import sys

_LOG_PREFIX = "[My Custom Script]"

logging.basicConfig(
    level=logging.INFO,
    format=f"{_LOG_PREFIX} - %(asctime)s - %(levelname)s - %(message)s",
    datefmt="%Y/%m/%d %H:%M:%S",
)
logger = logging.getLogger(__name__)


def _save_output():
    """Saves output data to a JSON file."""
    output_file = '/workspace/{{system.stepId}}/output.json'
    logger.info(f"Saving the following output data to {output_file}:")

    # If there are no parameters, {{inputs.parameters}} evaluates to "null".
    # In Python you can handle this by converting "null" to None.
    pipeline_params = {{inputs.parameters}} if """{{inputs.parameters}}""" != "null" else None

    output = {
        "step_id": "{{system.stepId}}",
        "job_id": "{{system.jobId}}",
        "random_uuid": "{{system.newUUID}}",
        "organization_id": "{{system.organizationId}}",
        "pipeline_params": pipeline_params,
    }
    logger.info(output)

    with open(output_file, 'w') as f:
        json.dump(output, f, ensure_ascii=False, indent=4)


if __name__ == "__main__":
    logger.info("My custom script initialized.")
    try:
        _save_output()
    except Exception as e:
        logger.error(f"Action failed: {e}")
        sys.exit(1)
```