Exports {sevenbridges2} | R Documentation
R6 Class representing storage exports endpoints
Description
R6 Class representing storage exports resource endpoints.
Super class
sevenbridges2::Resource
-> Exports
Public fields
URL
List of URL endpoints for this resource.
Methods
Public methods
Method new()
Create a new Exports object.
Usage
Exports$new(...)
Arguments
...
Other response arguments.
Method query()
This call lists export jobs initiated by a particular user. Note that when you export a file from a project on the Platform into a volume, you write to your cloud storage bucket.
Usage
Exports$query(
  volume = NULL,
  state = NULL,
  limit = getOption("sevenbridges2")$limit,
  offset = getOption("sevenbridges2")$offset,
  ...
)
Arguments
volume
Volume id or Volume object. List all exports into this particular volume. Optional.
state
The state of the export job. Possible values are:
- PENDING: the export is queued;
- RUNNING: the export is running;
- COMPLETED: the export has completed successfully;
- FAILED: the export has failed.
Example:
state = c("RUNNING", "FAILED")
limit
The maximum number of collection items to return for a single request. Minimum value is 1, the maximum value is 100 and the default value is 50. This is a pagination-specific attribute.
offset
The zero-based starting index in the entire collection of the first item to return. The default value is 0. This is a pagination-specific attribute.
...
Other arguments that can be passed to the core api() function, like 'fields', etc.
Returns
Collection of Export objects.
Examples
\dontrun{
  exports_object <- Exports$new(
    auth = auth
  )

  # List all your running or failed export jobs on the volume
  exports_object$query(volume = volume, state = c("RUNNING", "FAILED"))
}
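A minimal pagination sketch using the documented limit and offset arguments, assuming the same authenticated auth object and an existing volume as in the example above:

\dontrun{
  exports_object <- Exports$new(auth = auth)

  # Fetch export jobs on the volume in pages of 10
  first_page <- exports_object$query(volume = volume, limit = 10, offset = 0)
  second_page <- exports_object$query(volume = volume, limit = 10, offset = 10)
}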
Method get()
This call will return the details of an export job.
Usage
Exports$get(id, ...)
Arguments
id
The export job identifier (id).
...
Other arguments that can be passed to the core api() function, like 'fields', etc.
Returns
Export object.
Examples
\dontrun{
  exports_object <- Exports$new(
    auth = auth
  )

  # Get export job by ID
  exports_object$get(id = id)
}
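A sketch of chaining query() and get(), under the assumption that the returned Collection exposes its results in an items field and that Export objects carry an id field; adjust to the actual Collection and Export interfaces:

\dontrun{
  # Assumption: Collection stores its results in `items` and each Export has an `id`
  running_exports <- exports_object$query(volume = volume, state = "RUNNING")
  first_export <- running_exports$items[[1]]
  exports_object$get(id = first_export$id)
}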
Method submit_export()
This call lets you queue a job to export a file from a
project on the Platform into a volume. The file selected for export must
not be a public file or an alias. Aliases are objects stored in your
cloud storage bucket which have been made available on the Platform.
The volume you are exporting to must be configured for read-write access. To do this, set the access_mode parameter to RW when creating or modifying a volume.
Essentially, the call writes to your cloud storage bucket via the
volume. If this call is successful, the original project file will
become an alias to the newly exported object on the volume.
The source file will be deleted from the Platform and, if no more
copies of this file exist, it will no longer count towards your total
storage price on the Platform.
In summary, once you export a file from the Platform to a volume, it is
no longer part of the storage on the Platform and cannot be exported
again.
Read more about this operation in our documentation.
If you want to export multiple files, the recommended way is to do it in bulk, considering the API rate limit; see the bulk_submit_export() method below.
Usage
Exports$submit_export(
  source_file,
  destination_volume,
  destination_location,
  overwrite = FALSE,
  copy_only = FALSE,
  properties = NULL,
  ...
)
Arguments
source_file
File id or File object you want to export to the volume.
destination_volume
Volume id or Volume object you want to export files into.
destination_location
Volume-specific location to which the file will be exported. This location should be recognizable to the underlying cloud service as a valid key or path to a new file. Please note that if this volume has been configured with a prefix parameter, the value of prefix will be prepended to the location before attempting to create the file on the volume.
If you would like to export the file into a folder on the volume, please add the folder name as a prefix before the file name, in the form <folder-name>/<file-name>.
overwrite
Set to TRUE if you want to overwrite the item if another one with the same name already exists at the destination.
copy_only
If TRUE, the file will be copied to the volume but the source file will remain on the Platform.
properties
Named list of additional volume properties, like:
- sse_algorithm - S3 server-side encryption to use when exporting to this bucket. Supported values: AES256 (SSE-S3 encryption), aws:kms, null (no server-side encryption). Default: AES256.
- sse_aws_kms_key_id - Applies to type: s3. If AWS KMS encryption is used, this should be set to the required KMS key. If not set and aws:kms is set as sse_algorithm, the default KMS key is used.
- aws_canned_acl - S3 canned ACL to apply on the object during export. Supported values: any one of the S3 canned ACLs; null (do not apply canned ACLs). Default: null.
...
Other arguments that can be passed to the core api() function, like 'fields', etc.
Returns
Export object.
Examples
\dontrun{
  exports_object <- Exports$new(
    auth = auth
  )

  # Submit export job
  exp_job1 <- exports_object$submit_export(
    source_file = test_file,
    destination_volume = vol1,
    destination_location = "new_volume_file.txt"
  )
}
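A sketch combining the documented overwrite, copy_only and properties arguments, assuming the same test_file and vol1 as in the example above; the destination path and property value are placeholders:

\dontrun{
  # Keep the source file on the Platform (copy_only) and request SSE-S3 encryption
  exp_job2 <- exports_object$submit_export(
    source_file = test_file,
    destination_volume = vol1,
    # Exported into a folder, using the <folder-name>/<file-name> form
    destination_location = "backups/new_volume_file.txt",
    overwrite = TRUE,
    copy_only = TRUE,
    properties = list(sse_algorithm = "AES256")
  )
}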
Method delete()
Deleting export jobs is not possible.
Usage
Exports$delete()
Method bulk_get()
This call returns the details of a bulk export job. When you export files from a project on the Platform into a volume, you write to your cloud storage bucket. This call obtains the details of that job.
Usage
Exports$bulk_get(exports)
Arguments
exports
The list of export job IDs, as returned by the call to start a bulk export job, or a list of Export objects.
Returns
Collection with a list of Export objects.
Examples
\dontrun{
  exports_object <- Exports$new(
    auth = auth
  )

  # List export jobs
  exports_object$bulk_get(
    exports = list("export-job-id-1", "export-job-id-2")
  )
}
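Since the exports argument also accepts Export objects (as documented above), a sketch assuming exp_job1 and exp_job2 are Export objects previously returned by submit_export():

\dontrun{
  # Fetch details for previously submitted export jobs
  exports_object$bulk_get(exports = list(exp_job1, exp_job2))
}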
Method bulk_submit_export()
Bulk export files from your project on the Seven Bridges
Platform into your volume. One call can contain up to 100 items.
Files selected for export must not be public files or aliases.
Aliases are objects stored in your cloud storage bucket which have
been made available on the Platform. The volume you are exporting to
must be configured for read-write access. To do this, set the access_mode parameter to RW when creating or modifying a volume.
Essentially, the call writes to your cloud storage bucket via the volume. If this call is successful, the original project files will become aliases to the newly exported objects on the volume. Source files will be deleted from the Platform and, if no more copies of the files exist, they will no longer count towards your total storage price on the Platform. In summary, once you export files from the Platform to a volume, they are no longer part of the storage on the Platform and cannot be exported again.
Learn more about using the Volumes API for Amazon S3 and for Google Cloud Storage.
Usage
Exports$bulk_submit_export(items, copy_only = FALSE)
Arguments
items
Nested list of elements containing information about each file to be exported. For each element, users must provide:
- source_file - File ID or File object you want to export to the volume,
- destination_volume - Volume ID or Volume object you want to export files into,
- destination_location - Volume-specific location to which the file will be exported. This location should be recognizable to the underlying cloud service as a valid key or path to a new file. Please note that if this volume has been configured with a prefix parameter, the value of prefix will be prepended to the location before attempting to create the file on the volume. If you would like to export the file into a folder on the volume, please add the folder name as a prefix before the file name, in the form <folder-name>/<file-name>,
- overwrite - Set to TRUE if you want to overwrite the item with the same name if it already exists at the destination,
- properties - Named list of additional volume properties, like:
  - sse_algorithm - S3 server-side encryption to use when exporting to this bucket. Supported values: AES256 (SSE-S3 encryption), aws:kms, null (no server-side encryption). Default: AES256.
  - sse_aws_kms_key_id - Applies to type: s3. If AWS KMS encryption is used, this should be set to the required KMS key. If not set and aws:kms is set as sse_algorithm, the default KMS key is used.
  - aws_canned_acl - S3 canned ACL to apply on the object during export. Supported values: any one of the S3 canned ACLs; null (do not apply canned ACLs). Default: null.

Example of the list:

items <- list(
  list(
    source_file = "test_file-id",
    destination_volume = "volume-id",
    destination_location = "new_volume_file.txt"
  ),
  list(
    source_file = "test_file_obj",
    destination_volume = "test_volume_obj",
    destination_location = "/volume_folder/exported_file.txt",
    overwrite = TRUE
  ),
  list(
    source_file = "project_file_3_id",
    destination_volume = "volume-id",
    destination_location = "project_file_3.txt",
    properties = list(
      sse_algorithm = "AES256"
    )
  )
)

Read more on how to export files from your project to a volume or a volume folder.
The utility function prepare_items_for_bulk_export() can help you prepare the items parameter for the bulk_submit_export() method.
copy_only
If set to TRUE, the files will be copied to the volume but the source files will remain on the Platform.
Returns
Collection with a list of Export objects.
Examples
\dontrun{
  exports_object <- Exports$new(
    auth = auth
  )

  # Submit new bulk export into a volume
  exports_object$bulk_submit_export(
    items = list(
      list(
        source_file = "test_file-id",
        destination_volume = "volume-id",
        destination_location = "new_volume_file.txt"
      ),
      list(
        source_file = test_file_obj,
        destination_volume = test_volume_obj,
        destination_location = "/volume_folder/exported_file.txt",
        overwrite = TRUE
      ),
      list(
        source_file = "project_file_3_id",
        destination_volume = "volume-id",
        destination_location = "project_file_3.txt",
        properties = list(
          sse_algorithm = "AES256"
        )
      )
    ),
    copy_only = TRUE
  )
}
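A sketch of building the items list programmatically from a vector of file IDs, exporting each file into a volume folder via the documented <folder-name>/<file-name> form; the file IDs, volume ID and folder name are placeholders:

\dontrun{
  file_ids <- c("project-file-id-1", "project-file-id-2")

  # Build one item per file; one call can contain up to 100 items
  items <- lapply(file_ids, function(fid) {
    list(
      source_file = fid,
      destination_volume = "volume-id",
      destination_location = paste0("exports_folder/", fid, ".txt")
    )
  })

  exports_object$bulk_submit_export(items = items)
}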
Method clone()
The objects of this class are cloneable with this method.
Usage
Exports$clone(deep = FALSE)
Arguments
deep
Whether to make a deep clone.
Examples
## ------------------------------------------------
## Method `Exports$query`
## ------------------------------------------------
## Not run:
exports_object <- Exports$new(
auth = auth
)
# List all your running or failed export jobs on the volume
exports_object$query(volume = volume, state = c("RUNNING", "FAILED"))
## End(Not run)
## ------------------------------------------------
## Method `Exports$get`
## ------------------------------------------------
## Not run:
exports_object <- Exports$new(
auth = auth
)
# Get export job by ID
exports_object$get(id = id)
## End(Not run)
## ------------------------------------------------
## Method `Exports$submit_export`
## ------------------------------------------------
## Not run:
exports_object <- Exports$new(
auth = auth
)
# Submit export job
exp_job1 <- exports_object$submit_export(
source_file = test_file,
destination_volume = vol1,
destination_location = "new_volume_file.txt"
)
## End(Not run)
## ------------------------------------------------
## Method `Exports$bulk_get`
## ------------------------------------------------
## Not run:
exports_object <- Exports$new(
auth = auth
)
# List export jobs
exports_object$bulk_get(
exports = list("export-job-id-1", "export-job-id-2")
)
## End(Not run)
## ------------------------------------------------
## Method `Exports$bulk_submit_export`
## ------------------------------------------------
## Not run:
exports_object <- Exports$new(
auth = auth
)
# Submit new bulk export into a volume
exports_object$bulk_submit_export(items = list(
list(
source_file = "test_file-id",
destination_volume = "volume-id",
destination_location = "new_volume_file.txt"
),
list(
source_file = test_file_obj,
destination_volume = test_volume_obj,
destination_location = "/volume_folder/exported_file.txt",
overwrite = TRUE
),
list(
source_file = "project_file_3_id",
destination_volume = "volume-id",
destination_location = "project_file_3.txt",
properties = list(
sse_algorithm = "AES256"
)
)
), copy_only = TRUE
)
## End(Not run)