package google.events.cloud.dataflow.v1beta3

enum AutoscalingAlgorithm

data.proto:344

Specifies the algorithm used to determine the number of worker processes to run at any given point in time, based on the amount of data left to process, the number of workers, and how quickly existing workers are processing data.

Used in: AutoscalingSettings

message AutoscalingSettings

data.proto:142

Settings for WorkerPool autoscaling.

Used in: WorkerPool
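
A sketch of these two autoscaling types, reconstructed from the public google.dataflow.v1beta3 protos that this events package mirrors. Here and in the sketches that follow, field names, numbers, and comments are assumptions drawn from that mirror (proto3 syntax, usual google.protobuf imports), not authoritative definitions:

enum AutoscalingAlgorithm {
  AUTOSCALING_ALGORITHM_UNKNOWN = 0;  // Algorithm unknown or unspecified.
  AUTOSCALING_ALGORITHM_NONE = 1;     // Disable autoscaling.
  AUTOSCALING_ALGORITHM_BASIC = 2;    // Scale up to the maximum worker count.
}

message AutoscalingSettings {
  AutoscalingAlgorithm algorithm = 1;  // Algorithm to use for autoscaling.
  int32 max_num_workers = 2;           // Upper bound on the number of workers.
}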

message BigQueryIODetails

data.proto:565

Metadata for a BigQuery connector used by the job.

Used in: JobMetadata
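
A plausible shape, per the same mirror:

message BigQueryIODetails {
  string table = 1;       // Table accessed in the connection.
  string dataset = 2;     // Dataset accessed in the connection.
  string project_id = 3;  // Project accessed in the connection.
  string query = 4;       // Query used to access data, if one was used.
}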

message BigTableIODetails

data.proto:553

Metadata for a Cloud Bigtable connector used by the job.

Used in: JobMetadata
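
Likely shape (assumed field numbers):

message BigTableIODetails {
  string project_id = 1;   // Project the Bigtable instance lives in.
  string instance_id = 2;  // Instance accessed in the connection.
  string table_id = 3;     // Table accessed in the connection.
}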

message DatastoreIODetails

data.proto:529

Metadata for a Datastore connector used by the job.

Used in: JobMetadata
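
Sketch, same assumptions:

message DatastoreIODetails {
  string namespace = 1;   // Namespace used in the connection.
  string project_id = 2;  // Project accessed in the connection.
}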

message DebugOptions

data.proto:271

Describes any options that have an effect on the debugging of pipelines.

Used in: Environment
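
The mirror suggests a single flag:

message DebugOptions {
  bool enable_hot_key_logging = 1;  // When true, hot keys are logged to Cloud Logging.
}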

enum DefaultPackageSet

data.proto:325

The default set of packages to be staged on a pool of workers.

Used in: WorkerPool
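
Assumed values:

enum DefaultPackageSet {
  DEFAULT_PACKAGE_SET_UNKNOWN = 0;  // Unknown or unspecified.
  DEFAULT_PACKAGE_SET_NONE = 1;     // Stage no packages by default.
  DEFAULT_PACKAGE_SET_JAVA = 2;     // Packages for workers running Java.
  DEFAULT_PACKAGE_SET_PYTHON = 3;   // Packages for workers running Python.
}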

message Environment

data.proto:27

Describes the environment in which a Dataflow Job runs.

Used in: Job
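
An abridged sketch; most fields elided, numbers assumed:

message Environment {
  string temp_storage_prefix = 1;       // Where temporary files are staged (gs:// path).
  repeated WorkerPool worker_pools = 4; // Pools of workers the job runs on.
  string service_account_email = 10;    // Identity the workers run as.
  FlexResourceSchedulingGoal flex_resource_scheduling_goal = 11;
  string worker_region = 13;
  string worker_zone = 14;
  ShuffleMode shuffle_mode = 15;
  DebugOptions debug_options = 17;
  // ... remaining fields elided in this sketch.
}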

message ExecutionStageState

data.proto:648

A message describing the state of a particular execution stage.

Used in: Job
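
Sketch (assumed layout):

message ExecutionStageState {
  string execution_stage_name = 1;                   // Name of the stage.
  JobState execution_stage_state = 2;                // State the stage is in.
  google.protobuf.Timestamp current_state_time = 3;  // When this state was set.
}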

message FileIODetails

data.proto:547

Metadata for a File connector used by the job.

Used in: JobMetadata
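
Per the mirror, a single pattern field:

message FileIODetails {
  string file_pattern = 1;  // Pattern used to access files, e.g. a gs:// glob.
}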

enum FlexResourceSchedulingGoal

data.proto:295

Specifies the resource to optimize for in Flexible Resource Scheduling.

Used in: Environment
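
Values, as assumed from the mirror:

enum FlexResourceSchedulingGoal {
  FLEXRS_UNSPECIFIED = 0;      // Run in the default mode.
  FLEXRS_SPEED_OPTIMIZED = 1;  // Optimize for lower execution time.
  FLEXRS_COST_OPTIMIZED = 2;   // Optimize for lower cost.
}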

message Job

data.proto:388

Defines a job to be run by the Cloud Dataflow service. Do not enter confidential information when you supply string values using the API.

Fields stripped from the source Job proto:
- steps
- pipeline_description
- transform_name_mapping

Used in: JobEventData
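
An abridged sketch of Job; numbers assumed, many fields elided:

message Job {
  string id = 1;          // Unique ID assigned by the service.
  string project_id = 2;  // Project the job belongs to.
  string name = 3;        // User-supplied job name.
  JobType type = 4;       // Batch or streaming.
  Environment environment = 5;
  JobState current_state = 7;
  google.protobuf.Timestamp current_state_time = 8;
  google.protobuf.Timestamp create_time = 11;
  map<string, string> labels = 17;
  string location = 18;   // Regional endpoint that owns the job.
  repeated ExecutionStageState stage_states = 20;
  JobMetadata job_metadata = 21;
  // ... remaining fields elided; steps, pipeline_description, and
  // transform_name_mapping are stripped from this events mirror.
}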

message JobEventData

data.proto:752

The data within all Job events.

Used in: JobStatusChangedEvent
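
Presumably a thin wrapper:

message JobEventData {
  Job payload = 1;  // The Job whose status changed.
}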

message JobExecutionInfo

data.proto:661

Additional information about how a Cloud Dataflow job will be executed that isn't contained in the submitted job.

Used in: Job

message JobExecutionStageInfo

data.proto:669

Contains information about how a particular [google.dataflow.v1beta3.Step] will be executed.

Used in: JobExecutionInfo
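
A sketch of the two execution-info types together:

message JobExecutionInfo {
  map<string, JobExecutionStageInfo> stages = 1;  // Keyed by execution stage name.
}

message JobExecutionStageInfo {
  repeated string step_name = 1;  // Steps that run in this stage.
}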

message JobMetadata

data.proto:624

Metadata available primarily for filtering jobs. It is included in the ListJobs response and the Job SUMMARY view.

Used in: Job
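
Sketch tying together the IO-details messages listed on this page:

message JobMetadata {
  SdkVersion sdk_version = 1;                       // SDK the job was launched with.
  repeated SpannerIODetails spanner_details = 2;
  repeated BigQueryIODetails bigquery_details = 3;
  repeated BigTableIODetails big_table_details = 4;
  repeated PubSubIODetails pubsub_details = 5;
  repeated FileIODetails file_details = 6;
  repeated DatastoreIODetails datastore_details = 7;
}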

enum JobState

data.proto:678

Describes the overall state of a [google.dataflow.v1beta3.Job].

Used in: ExecutionStageState, Job
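
Assumed value set:

enum JobState {
  JOB_STATE_UNKNOWN = 0;
  JOB_STATE_STOPPED = 1;     // Not yet started to run.
  JOB_STATE_RUNNING = 2;
  JOB_STATE_DONE = 3;        // Terminal: completed successfully.
  JOB_STATE_FAILED = 4;      // Terminal: failed.
  JOB_STATE_CANCELLED = 5;   // Terminal: cancelled by the user.
  JOB_STATE_UPDATED = 6;     // Terminal: replaced by a newer job.
  JOB_STATE_DRAINING = 7;    // Finishing in-flight work before stopping.
  JOB_STATE_DRAINED = 8;     // Terminal: drain completed.
  JOB_STATE_PENDING = 9;     // Received but not yet started.
  JOB_STATE_CANCELLING = 10;
  JOB_STATE_QUEUED = 11;     // FlexRS job queued for a delayed start.
  JOB_STATE_RESOURCE_CLEANING_UP = 12;
}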

message JobStatusChangedEvent

events.proto:32

The CloudEvent raised when a Job status changes.
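
A sketch of the event envelope; by this package's conventions it likely wraps the event data in a single field, and the associated CloudEvent type string is presumably google.cloud.dataflow.job.v1beta3.statusChanged (an assumption, not confirmed by this page):

message JobStatusChangedEvent {
  JobEventData data = 1;  // Payload carried by the CloudEvent.
}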

enum JobType

data.proto:281

Specifies the processing model used by a [google.dataflow.v1beta3.Job], which determines the way the Job is managed by the Cloud Dataflow service (how workers are scheduled, how inputs are sharded, etc.).

Used in: Job
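
The two processing models, plus an unknown default:

enum JobType {
  JOB_TYPE_UNKNOWN = 0;
  JOB_TYPE_BATCH = 1;      // Reads input, processes it, writes output, and stops.
  JOB_TYPE_STREAMING = 2;  // Processes unbounded input continuously.
}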

message Package

data.proto:128

The packages that must be installed in order for a worker to run the steps of the Cloud Dataflow job that will be assigned to its worker pool. This is the mechanism by which the Cloud Dataflow SDK causes code to be loaded onto the workers. For example, the Cloud Dataflow Java SDK might use this to install jars containing the user's code and all of the various dependencies (libraries, data files, etc.) required in order for that code to run.

Used in: WorkerPool
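
Sketch:

message Package {
  string name = 1;      // Name of the package.
  string location = 2;  // Where the package is staged, e.g. a gs:// object path.
}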

message PubSubIODetails

data.proto:538

Metadata for a Pub/Sub connector used by the job.

Used in: JobMetadata
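
Assumed shape:

message PubSubIODetails {
  string topic = 1;         // Topic accessed in the connection.
  string subscription = 2;  // Subscription used in the connection.
}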

message SdkHarnessContainerImage

data.proto:151

Defines an SDK harness container for executing Dataflow pipelines.

Used in: WorkerPool
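
Sketch (numbers assumed):

message SdkHarnessContainerImage {
  string container_image = 1;              // Container image to run.
  bool use_single_core_per_container = 2;  // One core per container rather than per VM.
  string environment_id = 3;               // Environment ID within the pipeline proto.
  repeated string capabilities = 4;        // Capability URNs the harness supports.
}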

message SdkVersion

data.proto:592

The version of the SDK used to run the job.

Used in: JobMetadata

enum SdkVersion.SdkSupportStatus

data.proto:594

The support status of the SDK used to run the job.

Used in: SdkVersion
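
A combined sketch of SdkVersion and its nested status enum:

message SdkVersion {
  enum SdkSupportStatus {
    UNKNOWN = 0;
    SUPPORTED = 1;    // Actively supported.
    STALE = 2;        // A newer release exists.
    DEPRECATED = 3;   // Deprecated; support ending.
    UNSUPPORTED = 4;  // No longer supported.
  }
  string version = 1;                       // e.g. "2.48.0".
  string version_display_name = 2;          // Human-readable form of the version.
  SdkSupportStatus sdk_support_status = 3;  // Support status of this version.
}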

enum ShuffleMode

data.proto:371

Specifies the shuffle mode used by a [google.dataflow.v1beta3.Job], which determines how data is shuffled during processing. More details at: https://cloud.google.com/dataflow/docs/guides/deploying-a-pipeline#dataflow-shuffle

Used in: Environment
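
The modes, as assumed from the mirror:

enum ShuffleMode {
  SHUFFLE_MODE_UNSPECIFIED = 0;
  VM_BASED = 1;       // Shuffle on the worker VMs.
  SERVICE_BASED = 2;  // Shuffle in the Dataflow service (Dataflow Shuffle).
}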

message SpannerIODetails

data.proto:580

Metadata for a Spanner connector used by the job.

Used in: JobMetadata
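
Plausible shape:

message SpannerIODetails {
  string project_id = 1;   // Project accessed in the connection.
  string instance_id = 2;  // Spanner instance.
  string database_id = 3;  // Spanner database.
}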

enum TeardownPolicy

data.proto:308

Specifies what happens to a resource when a Cloud Dataflow [google.dataflow.v1beta3.Job] has completed.

Used in: WorkerPool
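
Assumed policy values:

enum TeardownPolicy {
  TEARDOWN_POLICY_UNKNOWN = 0;
  TEARDOWN_ALWAYS = 1;      // Always tear workers down.
  TEARDOWN_ON_SUCCESS = 2;  // Tear down only on successful completion.
  TEARDOWN_NEVER = 3;       // Never tear workers down automatically.
}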

enum WorkerIPAddressConfiguration

data.proto:356

Specifies how IP addresses should be allocated to the worker machines.

Used in: WorkerPool
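
Likely values:

enum WorkerIPAddressConfiguration {
  WORKER_IP_UNSPECIFIED = 0;  // The service decides.
  WORKER_IP_PUBLIC = 1;       // Workers get public IP addresses.
  WORKER_IP_PRIVATE = 2;      // Workers get private IP addresses only.
}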

message WorkerPool

data.proto:176

Describes one particular pool of Cloud Dataflow workers to be instantiated by the Cloud Dataflow service in order to perform the computations required by a job. Note that a workflow job may use multiple pools, in order to match the various computational requirements of the various stages of the job.

Used in: Environment
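
An abridged sketch of WorkerPool; numbers assumed, several fields elided:

message WorkerPool {
  string kind = 1;                     // Kind of pool, e.g. "harness".
  int32 num_workers = 2;               // Initial number of worker VMs.
  repeated Package packages = 3;       // Packages to stage on each worker.
  DefaultPackageSet default_package_set = 4;
  string machine_type = 5;             // e.g. "n1-standard-1".
  TeardownPolicy teardown_policy = 6;
  int32 disk_size_gb = 7;
  string zone = 9;
  AutoscalingSettings autoscaling_settings = 14;
  string network = 17;
  string subnetwork = 19;
  WorkerIPAddressConfiguration ip_configuration = 21;
  repeated SdkHarnessContainerImage sdk_harness_container_images = 22;
  // ... remaining fields elided in this sketch.
}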