Transfers data between Google Cloud Storage buckets or from a data source external to Google to a Cloud Storage bucket.
Returns the Google service account that is used by Storage Transfer Service to access buckets in the project where transfers run or in other projects. Each Google service account is associated with one Google Cloud Platform Console project. Users should add this service account to the Google Cloud Storage bucket ACLs to grant access to Storage Transfer Service. This service account is created and owned by Storage Transfer Service and can only be used by Storage Transfer Service.
Request passed to GetGoogleServiceAccount.
The ID of the Google Cloud Platform Console project that the Google service account is associated with. Required.
Google service account
Required.
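A minimal sketch of retrieving this account with the Google API Python client (`google-api-python-client`); the method path and the `accountEmail` response field are assumptions based on the descriptions above, not verified signatures, and the project ID is a placeholder.

```python
# Sketch only: assumes the discovery-based client and application
# default credentials; `accountEmail` is the assumed response field.
from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")

# "my-project-id" is a placeholder Cloud Platform Console project ID.
account = client.googleServiceAccounts().get(projectId="my-project-id").execute()

# Grant access by adding this account to your bucket ACLs.
print(account["accountEmail"])
```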
Creates a transfer job that runs periodically.
Request passed to CreateTransferJob.
The job to create. Required.
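For illustration, a hedged sketch of creating a recurring job with the Python discovery client; the JSON field names mirror the `TransferJob` resource described later in this document but should be treated as assumptions, and all IDs and credentials are placeholders.

```python
# Sketch under assumed field names; not a verified request shape.
from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")

transfer_job = {
    "description": "Nightly S3 to GCS sync",  # free-form, user-provided
    "projectId": "my-project-id",             # placeholder project
    "status": "ENABLED",                      # status MUST be set on create
    "schedule": {
        "scheduleStartDate": {"year": 2017, "month": 1, "day": 1},
    },
    "transferSpec": {
        "awsS3DataSource": {
            "bucketName": "my-s3-bucket",
            "awsAccessKey": {
                "accessKeyId": "AKIA-EXAMPLE",
                "secretAccessKey": "example-secret",
            },
        },
        "gcsDataSink": {"bucketName": "my-gcs-bucket"},
    },
}

job = client.transferJobs().create(body=transfer_job).execute()
print(job["name"])  # globally unique name assigned by the service
```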
Updates a transfer job. Updating a job's transfer spec does not affect transfer operations that are already running. Updating the scheduling of a job is not allowed.
Request passed to UpdateTransferJob.
The name of the job to update. Required.
The ID of the Google Cloud Platform Console project that owns the job. Required.
The job to update. `transferJob` is expected to specify only three fields: `description`, `transferSpec`, and `status`. An UpdateTransferJobRequest that specifies other fields will be rejected with an error `INVALID_ARGUMENT`. Required.
The field mask of the fields in `transferJob` that are to be updated in this request. Fields in `transferJob` that can be updated are: `description`, `transferSpec`, and `status`. To update the `transferSpec` of the job, a complete transfer specification must be provided. An incomplete specification missing any required field is rejected with the error `INVALID_ARGUMENT`.
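A sketch of the corresponding request body; `updateTransferJobFieldMask` is the assumed JSON name for the field mask described above, and the job name and project ID are placeholders.

```python
# Disables future scheduling without touching running operations.
from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")

update_request = {
    "projectId": "my-project-id",
    # Only description, transferSpec, and status may appear here.
    "transferJob": {"status": "DISABLED"},
    "updateTransferJobFieldMask": "status",  # assumed field name
}
client.transferJobs().patch(
    jobName="transferJobs/1234567890", body=update_request
).execute()
```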
Gets a transfer job.
Request passed to GetTransferJob.
The job to get. Required.
The ID of the Google Cloud Platform Console project that owns the job. Required.
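A fetch-by-name sketch with placeholder identifiers, assuming the same discovery client as in the earlier sketches:

```python
from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")
job = client.transferJobs().get(
    jobName="transferJobs/1234567890", projectId="my-project-id"
).execute()
```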
Lists transfer jobs.
`project_id`, `job_names`, and `job_statuses` are query parameters that can be specified when listing transfer jobs.
A list of query parameters specified as JSON text, in the form `{"project_id":"my_project_id", "job_names":["jobid1","jobid2",...], "job_statuses":["status1","status2",...]}`. Since `job_names` and `job_statuses` support multiple values, their values must be specified with array notation. `project_id` is required. `job_names` and `job_statuses` are optional. The valid values for `job_statuses` are case-insensitive: `ENABLED`, `DISABLED`, and `DELETED`.
The list page size. The max allowed value is 256.
The list page token.
Response from ListTransferJobs.
A list of transfer jobs.
The list next page token.
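A paginated listing sketch; the filter JSON keys come from the description above, while the `filter`, `pageSize`, and response field names are assumptions about the REST surface.

```python
import json

from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")

# Only project_id is required; job_names and job_statuses are optional arrays.
filter_text = json.dumps({
    "project_id": "my-project-id",
    "job_statuses": ["ENABLED"],  # case-insensitive
})

request = client.transferJobs().list(filter=filter_text, pageSize=256)
while request is not None:
    response = request.execute()
    for job in response.get("transferJobs", []):
        print(job["name"])
    # list_next follows the next page token until the listing is exhausted.
    request = client.transferJobs().list_next(request, response)
```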
Pauses a transfer operation.
Request passed to PauseTransferOperation.
The name of the transfer operation. Required.
Resumes a transfer operation that is paused.
Request passed to ResumeTransferOperation.
The name of the transfer operation. Required.
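Both calls take only the operation name; a sketch with a placeholder name, assuming empty request bodies:

```python
from googleapiclient import discovery

client = discovery.build("storagetransfer", "v1")

op_name = "transferOperations/example-operation-id"  # placeholder
client.transferOperations().pause(name=op_name, body={}).execute()
client.transferOperations().resume(name=op_name, body={}).execute()
```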
AWS access key (see [AWS Security Credentials](http://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
Used in:
AWS access key ID. Required.
AWS secret access key. This field is not returned in RPC responses. Required.
An AwsS3Data can be a data source, but not a data sink. In an AwsS3Data, an object's name is the S3 object's key name.
Used in:
S3 Bucket name (see [Creating a bucket](http://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)). Required.
AWS access key used to sign the API requests to the AWS S3 bucket. Permissions on the bucket must be granted to the access ID of the AWS access key. Required.
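How the two messages fit together inside a transfer spec, as a sketch with assumed JSON field names and placeholder credentials:

```python
# AwsS3Data can only appear as a source, never as a sink.
aws_s3_data_source = {
    "bucketName": "my-s3-bucket",             # required
    "awsAccessKey": {
        "accessKeyId": "AKIA-EXAMPLE",        # required
        "secretAccessKey": "example-secret",  # never echoed in responses
    },
}
```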
An entry describing an error that has occurred.
Used in:
A URL that refers to the target (a data source, a data sink, or an object) with which the error is associated. Required.
A list of messages that carry the error details.
A summary of errors by error code, plus a count and sample error log entries.
Used in:
Required.
Count of this type of error. Required.
Error samples.
In a GcsData, an object's name is the Google Cloud Storage object's name and its `lastModificationTime` refers to the object's updated time, which changes when the content or the metadata of the object is updated.
Used in:
Google Cloud Storage bucket name (see [Bucket Name Requirements](https://cloud.google.com/storage/docs/bucket-naming#requirements)). Required.
An HttpData specifies a list of objects on the web to be transferred over HTTP. The information of the objects to be transferred is contained in a file referenced by a URL. The first line in the file must be "TsvHttpData-1.0", which specifies the format of the file. Subsequent lines specify the information of the list of objects, one object per list entry. Each entry has the following tab-delimited fields:

* HTTP URL - The location of the object.
* Length - The size of the object in bytes.
* MD5 - The base64-encoded MD5 hash of the object.

For an example of a valid TSV file, see [Transferring data from URLs](https://cloud.google.com/storage/transfer/create-url-list). When transferring data based on a URL list, keep the following in mind:

* When an object located at `http(s)://hostname:port/<URL-path>` is transferred to a data sink, the name of the object at the data sink is `<hostname>/<URL-path>`.
* If the specified size of an object does not match the actual size of the object fetched, the object will not be transferred.
* If the specified MD5 does not match the MD5 computed from the transferred bytes, the object transfer will fail. For more information, see [Generating MD5 hashes](https://cloud.google.com/storage/transfer/#md5).
* Ensure that each URL you specify is publicly accessible. For example, in Google Cloud Storage you can [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get a link to it.
* Storage Transfer Service obeys `robots.txt` rules and requires the source HTTP server to support `Range` requests and to return a `Content-Length` header in each response.
* [ObjectConditions](#ObjectConditions) have no effect when filtering objects to transfer.
Used in:
The URL that points to the file that stores the object list entries. This file must allow public access. Currently, only URLs with HTTP and HTTPS schemes are supported. Required.
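As a concrete illustration of the entry format, a small stdlib-only sketch that builds TSV lines with the required base64-encoded MD5; the URL and payload are placeholders.

```python
import base64
import hashlib

def tsv_entry(url: str, data: bytes) -> str:
    # Tab-delimited fields: HTTP URL, length in bytes, base64-encoded MD5.
    md5_b64 = base64.b64encode(hashlib.md5(data).digest()).decode("ascii")
    return "\t".join([url, str(len(data)), md5_b64])

lines = ["TsvHttpData-1.0"]  # mandatory first line
lines.append(tsv_entry("https://example.com/logs/requests.gz", b"example-bytes"))
print("\n".join(lines))
```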
Conditions that determine which objects will be transferred.
Used in:
If unspecified, `minTimeElapsedSinceLastModification` takes a zero value and `maxTimeElapsedSinceLastModification` takes the maximum possible value of Duration. Objects that satisfy the object conditions must either have a `lastModificationTime` greater than or equal to `NOW` - `maxTimeElapsedSinceLastModification` and less than `NOW` - `minTimeElapsedSinceLastModification`, or not have a `lastModificationTime` at all.
`maxTimeElapsedSinceLastModification` is the complement to `minTimeElapsedSinceLastModification`.
If `includePrefixes` is specified, objects that satisfy the object conditions must have names that start with one of the `includePrefixes` and that do not start with any of the `excludePrefixes`. If `includePrefixes` is not specified, all objects except those that have names starting with one of the `excludePrefixes` must satisfy the object conditions. Requirements:

* Each include-prefix and exclude-prefix can contain any sequence of Unicode characters, of max length 1024 bytes when UTF8-encoded, and must not contain Carriage Return or Line Feed characters. Wildcard matching and regular expression matching are not supported.
* Each include-prefix and exclude-prefix must omit the leading slash. For example, to include the `requests.gz` object in a transfer from `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include prefix as `logs/y=2015/requests.gz`.
* None of the include-prefix or the exclude-prefix values can be empty, if specified.
* Each include-prefix must include a distinct portion of the object namespace, i.e., no include-prefix may be a prefix of another include-prefix.
* Each exclude-prefix must exclude a distinct portion of the object namespace, i.e., no exclude-prefix may be a prefix of another exclude-prefix.
* If `includePrefixes` is specified, then each exclude-prefix must start with the value of a path explicitly included by `includePrefixes`.

The max size of `includePrefixes` is 1000.
`excludePrefixes` must follow the requirements described for `includePrefixes`. The max size of `excludePrefixes` is 1000.
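A sketch of an object-conditions block satisfying those requirements; the `"3600s"` string is the standard proto3 JSON encoding of a Duration, assumed to apply here.

```python
object_conditions = {
    "includePrefixes": ["logs/y=2015/"],             # no leading slash
    "excludePrefixes": ["logs/y=2015/tmp/"],         # starts with an included path
    "minTimeElapsedSinceLastModification": "3600s",  # skip the last hour
}
```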
Transfers can be scheduled to recur or to run just once.
Used in:
The first day the recurring transfer is scheduled to run. If `scheduleStartDate` is in the past, the transfer will run for the first time on the following day. Required.
The last day the recurring transfer will be run. If `scheduleEndDate` is the same as `scheduleStartDate`, the transfer will be executed only once.
The time of day, in UTC, at which the transfer is scheduled to start. Transfers may start later than this time. If not specified, recurring and one-time transfers that are scheduled to run today will run immediately; recurring transfers that are scheduled to run on a future date will start at approximately midnight UTC on that date. Note that when configuring a transfer with the Cloud Platform Console, the transfer's start time of day is specified in your local time zone.
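A schedule sketch for a daily transfer through the end of 2017; `startTimeOfDay` is the assumed JSON name for the in-day start time described above.

```python
schedule = {
    "scheduleStartDate": {"year": 2017, "month": 1, "day": 1},
    "scheduleEndDate": {"year": 2017, "month": 12, "day": 31},
    # 02:00 UTC; omit to start immediately for a job scheduled today.
    "startTimeOfDay": {"hours": 2, "minutes": 0},
}
```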
A collection of counters that report the progress of a transfer operation.
Used in:
Objects found in the data source that are scheduled to be transferred, excluding any that are filtered based on object conditions or skipped due to sync.
Bytes found in the data source that are scheduled to be transferred, excluding any that are filtered based on object conditions or skipped due to sync.
Objects found only in the data sink that are scheduled to be deleted.
Bytes found only in the data sink that are scheduled to be deleted.
Objects in the data source that are not transferred because they already exist in the data sink.
Bytes in the data source that are not transferred because they already exist in the data sink.
Objects that are copied to the data sink.
Bytes that are copied to the data sink.
Objects that are deleted from the data source.
Bytes that are deleted from the data source.
Objects that are deleted from the data sink.
Bytes that are deleted from the data sink.
Objects in the data source that failed during the transfer.
Bytes in the data source that failed during the transfer.
Objects that failed to be deleted from the data sink.
Bytes that failed to be deleted from the data sink.
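A rough progress calculation from a returned counters map; the camelCase counter names are assumptions inferred from the descriptions above, and the sample values are illustrative only.

```python
counters = {
    "bytesFoundFromSource": "1048576",  # int64 counters arrive as strings
    "bytesCopiedToSink": "524288",      # in proto3 JSON (assumed here)
}

found = int(counters.get("bytesFoundFromSource", 0))
copied = int(counters.get("bytesCopiedToSink", 0))
if found:
    print(f"copied {copied}/{found} bytes ({100.0 * copied / found:.1f}%)")
```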
This resource represents the configuration of a transfer job that runs periodically.
Used as response type in: StorageTransferService.CreateTransferJob, StorageTransferService.GetTransferJob, StorageTransferService.UpdateTransferJob
Used as field type in:
A globally unique name assigned by Storage Transfer Service when the job is created. This field should be left empty in requests to create a new transfer job; otherwise, the requests result in an `INVALID_ARGUMENT` error.
A description provided by the user for the job. Its max length is 1024 bytes when Unicode-encoded.
The ID of the Google Cloud Platform Console project that owns the job.
Transfer specification.
Schedule specification.
Status of the job. This value MUST be specified for `CreateTransferJobRequests`. NOTE: The effect of the new job status takes place during a subsequent job run. For example, if you change the job status from `ENABLED` to `DISABLED`, and an operation spawned by the transfer is running, the status change would not affect the current operation.
This field cannot be changed by user requests.
This field cannot be changed by user requests.
This field cannot be changed by user requests.
The status of the transfer job.
Used in:
Zero is an illegal value.
New transfers will be performed based on the schedule.
New transfers will not be scheduled.
This is a soft delete state. After a transfer job is set to this state, the job and all the transfer executions are subject to garbage collection.
A description of the execution of a transfer.
A globally unique ID assigned by the system.
The ID of the Google Cloud Platform Console project that owns the operation. Required.
Transfer specification. Required.
Start time of this transfer execution.
End time of this transfer execution.
Status of the transfer operation.
Information about the progress of the transfer operation.
Summarizes errors encountered with sample error log entries.
The name of the transfer job that triggers this transfer operation.
The status of a TransferOperation.
Used in:
Zero is an illegal value.
In progress.
Paused.
Completed successfully.
Terminated due to an unrecoverable failure.
Aborted by the user.
TransferOptions uses three boolean parameters to define the actions to be performed on objects in a transfer.
Used in:
Whether overwriting objects that already exist in the sink is allowed.
Whether objects that exist only in the sink should be deleted. Note that this option and `deleteObjectsFromSourceAfterTransfer` are mutually exclusive.
Whether objects should be deleted from the source after they are transferred to the sink. Note that this option and `deleteObjectsUniqueInSink` are mutually exclusive.
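A sketch showing the mutual exclusion; the two delete flags are named in the descriptions above, while `overwriteObjectsAlreadyExistingInSink` is the assumed JSON name for the overwrite flag.

```python
transfer_options = {
    "overwriteObjectsAlreadyExistingInSink": True,
    "deleteObjectsUniqueInSink": True,
    # "deleteObjectsFromSourceAfterTransfer": True,  # mutually exclusive
    #                                                # with the flag above
}
```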
Configuration for running a transfer.
Used in:
The read source of the data.
A Google Cloud Storage data source.
An AWS S3 data source.
An HTTP URL data source.
The write sink for the data.
A Google Cloud Storage data sink.
Only objects that satisfy these object conditions are included in the set of data source and data sink objects. Object conditions based on objects' `lastModificationTime` do not exclude objects in a data sink.
If the option `deleteObjectsUniqueInSink` is `true`, object conditions based on objects' `lastModificationTime` are ignored and do not exclude objects in a data source or a data sink.
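Pulling the pieces together, a complete transfer-spec sketch with assumed JSON field names and placeholder buckets:

```python
transfer_spec = {
    "gcsDataSource": {"bucketName": "source-bucket"},   # exactly one source
    "gcsDataSink": {"bucketName": "sink-bucket"},       # the write sink
    "objectConditions": {"includePrefixes": ["logs/"]},
    "transferOptions": {"deleteObjectsUniqueInSink": True},
}
```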