The submit_job_on_scheduler.py module
Summary
Send the SLURM job configuration to the REST endpoint.
Description
Submit an HFSS job to the scheduler back-end service.
The service must be running (default: localhost:8080) before this script is executed. To start it:
python -m pyedb.workflows.job_manager.backend.job_manager_handler
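
If the service is not listening, submission fails with a connection error. The following is a minimal pre-flight sketch using only the standard library; service_is_up is a hypothetical helper for illustration, not part of this module:

import socket

def service_is_up(host="localhost", port=8080, timeout=2.0):
    # Return True if a TCP connection to the back-end service succeeds.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not service_is_up():
    raise SystemExit(
        "Back end not reachable; start it with: "
        "python -m pyedb.workflows.job_manager.backend.job_manager_handler"
    )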
Usage
# Get help
python submit_job_on_scheduler.py --help

# Explicit values
python submit_job_on_scheduler.py --host 127.0.0.1 --port 8080 --project-path "/tmp/jobs/job1.aedb" --partition default --nodes 1 --cores-per-node 8

# Use defaults (localhost:8080, 1 node, 1 core, partition default)
python submit_job_on_scheduler.py --project-path "/tmp/jobs/job1.aedb"
Module detail
- async submit_job_on_scheduler.submit_slurm_job(*, host: str, port: int, project_path: str, partition: str, nodes: int, cores_per_node: int, run_limit: str) → None
Send the SLURM job configuration to the REST endpoint (see the call sketch after this list).
- submit_job_on_scheduler.parse_cli() → argparse.Namespace
Parse the command-line arguments.
- submit_job_on_scheduler.DEFAULT_HOST = 'localhost'
- submit_job_on_scheduler.DEFAULT_PORT = 8080
- submit_job_on_scheduler.DEFAULT_PARTITION = 'default'
- submit_job_on_scheduler.DEFAULT_NODES = 1
- submit_job_on_scheduler.DEFAULT_CORES_PER_NODE = 1
- submit_job_on_scheduler.DEFAULT_RUN_LIMIT = 'unlimited'
- submit_job_on_scheduler.logger
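
For programmatic use, submit_slurm_job is a coroutine and can be driven with asyncio. The following is a minimal sketch: the import path is an assumption inferred from the handler module shown above, and the argument values mirror the "Explicit values" CLI example:

import asyncio

# Import path is an assumption based on the handler module path above.
from pyedb.workflows.job_manager.backend.submit_job_on_scheduler import submit_slurm_job

# All parameters are keyword-only, matching the signature documented above.
asyncio.run(
    submit_slurm_job(
        host="localhost",        # DEFAULT_HOST
        port=8080,               # DEFAULT_PORT
        project_path="/tmp/jobs/job1.aedb",
        partition="default",     # DEFAULT_PARTITION
        nodes=1,
        cores_per_node=8,
        run_limit="unlimited",   # DEFAULT_RUN_LIMIT
    )
)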