braket.jobs.quantum_job_creation module

braket.jobs.quantum_job_creation.prepare_quantum_job(device: str, source_module: str, entry_point: str | None = None, image_uri: str | None = None, job_name: str | None = None, code_location: str | None = None, role_arn: str | None = None, hyperparameters: dict[str, Any] | None = None, input_data: str | dict | S3DataSourceConfig | None = None, instance_config: InstanceConfig | None = None, distribution: str | None = None, stopping_condition: StoppingCondition | None = None, output_data_config: OutputDataConfig | None = None, copy_checkpoints_from_job: str | None = None, checkpoint_config: CheckpointConfig | None = None, aws_session: AwsSession | None = None, tags: dict[str, str] | None = None, reservation_arn: str | None = None) → dict

Creates a hybrid job by invoking the Braket CreateJob API.

Parameters:
  • device (str) – Device ARN of the QPU that receives priority quantum task queueing once the hybrid job begins running. Each QPU has a separate hybrid jobs queue so that only one hybrid job runs at a time. The device string is accessible in the hybrid job instance as the environment variable “AMZN_BRAKET_DEVICE_ARN”. When using embedded simulators, you may provide the device argument as a string of the form: “local:<provider>/<simulator_name>”.

  • source_module (str) – Path (absolute, relative, or an S3 URI) to a Python module to be tarred and uploaded. If source_module is an S3 URI, it must point to a tar.gz file. Otherwise, source_module may be a file or directory.

  • entry_point (str | None) – A str that specifies the entry point of the hybrid job, relative to the source module. The entry point must be in the format importable.module or importable.module:callable. For example, source_module.submodule:start_here indicates the start_here function contained in source_module.submodule. If source_module is an S3 URI, entry_point must be specified. Default: source_module’s name.

  • image_uri (str | None) – A str that specifies the ECR image to use for executing the hybrid job. The image_uris.retrieve_image() function may be used to retrieve the ECR image URIs for the containers supported by Braket. Default: <Braket base image_uri>.

  • job_name (str | None) – A str that specifies the name with which the hybrid job is created. The hybrid job name must be between 1 and 50 characters long and cannot contain underscores. Default: f’{image_uri_type}-{timestamp}’.

  • code_location (str | None) – The S3 prefix URI where custom code will be uploaded. Default: f’s3://{default_bucket_name}/jobs/{job_name}/script’.

  • role_arn (str | None) – A str providing the IAM role ARN used to execute the script. Default: IAM role returned by AwsSession’s get_default_jobs_role().

  • hyperparameters (dict[str, Any] | None) – Hyperparameters accessible to the hybrid job. The hyperparameters are made accessible as a Dict[str, str] to the hybrid job. For convenience, this accepts other types for keys and values, but str() is called to convert them before being passed on. Default: None.

  • input_data (str | dict | S3DataSourceConfig | None) – Information about the training data. A dictionary maps channel names to local paths or S3 URIs. Contents found at any local paths will be uploaded to S3 at f’s3://{default_bucket_name}/jobs/{job_name}/data/{channel_name}’. If a local path, S3 URI, or S3DataSourceConfig is provided, it is given the default channel name “input”. Default: {}.

  • instance_config (InstanceConfig | None) – Configuration of the instance(s) for running the classical code for the hybrid job. Defaults to InstanceConfig(instanceType='ml.m5.large', instanceCount=1, volumeSizeInGB=30).

  • distribution (str | None) – A str that specifies how the hybrid job should be distributed. If set to “data_parallel”, the hyperparameters for the hybrid job will be set to use data parallelism features for PyTorch or TensorFlow. Default: None.

  • stopping_condition (StoppingCondition | None) – The maximum length of time, in seconds, and the maximum number of quantum tasks that a hybrid job can run before being forcefully stopped. Default: StoppingCondition(maxRuntimeInSeconds=5 * 24 * 60 * 60).

  • output_data_config (OutputDataConfig | None) – Specifies the location for the output of the hybrid job. Default: OutputDataConfig(s3Path=f’s3://{default_bucket_name}/jobs/{job_name}/data’, kmsKeyId=None).

  • copy_checkpoints_from_job (str | None) – A str that specifies the hybrid job ARN whose checkpoint you want to use in the current hybrid job. Specifying this value copies the checkpoint data from copy_checkpoints_from_job’s checkpoint_config s3Uri to the current hybrid job’s checkpoint_config s3Uri, making it available at checkpoint_config.localPath during the hybrid job execution. Default: None.

  • checkpoint_config (CheckpointConfig | None) – Configuration that specifies the location where checkpoint data is stored. Default: CheckpointConfig(localPath=’/opt/jobs/checkpoints’, s3Uri=f’s3://{default_bucket_name}/jobs/{job_name}/checkpoints’).

  • aws_session (AwsSession | None) – AwsSession for connecting to AWS Services. Default: AwsSession()

  • tags (dict[str, str] | None) – Dict specifying the key-value pairs for tagging this hybrid job. Default: {}.

  • reservation_arn (str | None) – The reservation window ARN provided by Braket Direct to reserve exclusive usage of the device on which the hybrid job runs. Default: None.

Returns:

dict – Hybrid job tracking the execution on Amazon Braket.

Raises:

ValueError – If the parameters are not valid.
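
As a rough sketch of the coercion described for the hyperparameters parameter (keys and values of any type are converted with str() before being passed to the hybrid job), assuming a hypothetical helper name that is not SDK code:

```python
def coerce_hyperparameters(hyperparameters: dict) -> dict[str, str]:
    # Mirror the documented behavior: str() is applied to every key and
    # value so the hybrid job receives a Dict[str, str].
    return {str(key): str(value) for key, value in hyperparameters.items()}

coerce_hyperparameters({"shots": 1000, "learning_rate": 0.01})
# → {"shots": "1000", "learning_rate": "0.01"}
```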