batch {paws.compute}    R Documentation
AWS Batch
Description
Batch
Using Batch, you can run batch computing workloads on the Amazon Web Services Cloud. Batch computing is a common means for developers, scientists, and engineers to access large amounts of compute resources. Batch uses the advantages of batch computing to remove the undifferentiated heavy lifting of configuring and managing required infrastructure. At the same time, it also adopts a familiar batch computing software approach. You can use Batch to efficiently provision resources, work toward eliminating capacity constraints, reduce your overall compute costs, and deliver results more quickly.
As a fully managed service, Batch can run batch computing workloads of any scale. Batch automatically provisions compute resources and optimizes workload distribution based on the quantity and scale of your specific workloads. With Batch, there's no need to install or manage batch computing software. This means that you can focus on analyzing results and solving your specific problems instead.
Usage
batch(config = list(), credentials = list(), endpoint = NULL, region = NULL)
Arguments
config: Optional configuration of credentials, endpoint, and/or region.

credentials: Optional credentials shorthand for the config parameter.

endpoint: Optional shorthand for the complete URL to use for the constructed client.

region: Optional shorthand for the AWS Region used in instantiating the client.
Value
A client for the service. You can call the service's operations using
syntax like svc$operation(...)
, where svc
is the name you've assigned
to the client. The available operations are listed in the
Operations section.
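For instance, a minimal sketch of this call pattern, assuming default credentials and a Region are already configured in your environment:

svc <- batch()
# Operations are functions attached to the client; describe_job_queues
# has no required arguments and returns the job queues in your account.
resp <- svc$describe_job_queues()
resp$jobQueues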
Service syntax
svc <- batch(
  config = list(
    credentials = list(
      creds = list(
        access_key_id = "string",
        secret_access_key = "string",
        session_token = "string"
      ),
      profile = "string",
      anonymous = "logical"
    ),
    endpoint = "string",
    region = "string",
    close_connection = "logical",
    timeout = "numeric",
    s3_force_path_style = "logical",
    sts_regional_endpoint = "string"
  ),
  credentials = list(
    creds = list(
      access_key_id = "string",
      secret_access_key = "string",
      session_token = "string"
    ),
    profile = "string",
    anonymous = "logical"
  ),
  endpoint = "string",
  region = "string"
)
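In practice you supply only the fields you want to override; the credentials, endpoint, and region arguments are shorthands for the corresponding entries in config. A minimal sketch, using a placeholder profile name and Region:

# Construct a client for a specific Region using a named profile
svc <- batch(
  config = list(
    credentials = list(profile = "my-profile"),
    region = "us-east-1"
  )
)

# The same client using the shorthand arguments
svc <- batch(credentials = list(profile = "my-profile"), region = "us-east-1")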
Operations
cancel_job: Cancels a job in a Batch job queue

create_compute_environment: Creates a Batch compute environment

create_job_queue: Creates a Batch job queue

create_scheduling_policy: Creates a Batch scheduling policy

delete_compute_environment: Deletes a Batch compute environment

delete_job_queue: Deletes the specified job queue

delete_scheduling_policy: Deletes the specified scheduling policy

deregister_job_definition: Deregisters a Batch job definition

describe_compute_environments: Describes one or more of your compute environments

describe_job_definitions: Describes a list of job definitions

describe_job_queues: Describes one or more of your job queues

describe_jobs: Describes a list of Batch jobs

describe_scheduling_policies: Describes one or more of your scheduling policies

list_jobs: Returns a list of Batch jobs

list_scheduling_policies: Returns a list of Batch scheduling policies

list_tags_for_resource: Lists the tags for a Batch resource

register_job_definition: Registers a Batch job definition

submit_job: Submits a Batch job from a job definition

tag_resource: Associates the specified tags to a resource with the specified resourceArn

terminate_job: Terminates a job in a job queue

untag_resource: Deletes specified tags from a Batch resource

update_compute_environment: Updates a Batch compute environment

update_job_queue: Updates a job queue

update_scheduling_policy: Updates a scheduling policy
Examples
## Not run:
svc <- batch()
# This example cancels a job with the specified job ID.
svc$cancel_job(
jobId = "1d828f65-7a4d-42e8-996d-3b900ed59dc4",
reason = "Cancelling job."
)
## End(Not run)
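A sketch of a typical submit-and-track workflow, assuming a job queue and job definition already exist in your account (the names below are placeholders):

## Not run:
svc <- batch()
# Submit a job to an existing queue using a registered job definition
resp <- svc$submit_job(
  jobName = "example-job",
  jobQueue = "my-job-queue",
  jobDefinition = "my-job-definition"
)
# Look up the submitted job by ID and inspect its current status
status <- svc$describe_jobs(
  jobs = list(resp$jobId)
)
status$jobs[[1]]$status
## End(Not run)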