package google.cloud.dialogflow.v2beta1

service Agents

agent.proto:65

Agents are best described as Natural Language Understanding (NLU) modules that transform user requests into actionable data. You can include agents in your app, product, or service to determine user intent and respond to the user in a natural way.

After you create an agent, you can add [Intents][google.cloud.dialogflow.v2beta1.Intents], [Contexts][google.cloud.dialogflow.v2beta1.Contexts], [Entity Types][google.cloud.dialogflow.v2beta1.EntityTypes], [Webhooks][google.cloud.dialogflow.v2beta1.WebhookRequest], and so on to manage the flow of a conversation and match user input to predefined intents and actions.

You can create an agent using both Dialogflow Standard Edition and Dialogflow Enterprise Edition. For details, see [Dialogflow Editions](https://cloud.google.com/dialogflow/docs/editions).

You can save your agent for backup or versioning by exporting the agent by using the [ExportAgent][google.cloud.dialogflow.v2beta1.Agents.ExportAgent] method. You can import a saved agent by using the [ImportAgent][google.cloud.dialogflow.v2beta1.Agents.ImportAgent] method.

Dialogflow provides several [prebuilt agents](https://cloud.google.com/dialogflow/docs/agents-prebuilt) for common conversation scenarios such as determining a date and time, converting currency, and so on.

For more information about agents, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/agents-overview).
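
A minimal sketch of the export workflow described above, assuming the `google-cloud-dialogflow` Python client for v2beta1; the project ID and bucket URI are placeholders, not part of the API:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

agents_client = dialogflow.AgentsClient()
parent = "projects/my-project"  # placeholder project ID

# ExportAgent is a long-running operation; result() blocks until the
# exported agent is written to the given Cloud Storage URI.
operation = agents_client.export_agent(
    request={"parent": parent, "agent_uri": "gs://my-bucket/agent.zip"}
)
print(operation.result().agent_uri)
```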

service Contexts

context.proto:52

A context represents additional information included with user input or with an intent returned by the Dialogflow API. Contexts are helpful for differentiating user input which may be vague or have a different meaning depending on additional details from your application such as user settings and preferences, previous user input, where the user is in your application, geographic location, and so on.

You can include contexts as input parameters of a [DetectIntent][google.cloud.dialogflow.v2beta1.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2beta1.Sessions.StreamingDetectIntent]) request, or as output contexts included in the returned intent. Contexts expire when an intent is matched, after the number of `DetectIntent` requests specified by the `lifespan_count` parameter, or after 20 minutes if no intents are matched for a `DetectIntent` request.

For more information about contexts, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/contexts-overview).
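
For illustration, a minimal sketch of attaching a context to a session with the same assumed Python client; the project and session IDs and the `ordering-pizza` context name are placeholders:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

contexts_client = dialogflow.ContextsClient()
session = "projects/my-project/agent/sessions/my-session-id"  # placeholders

context = dialogflow.Context(
    name=f"{session}/contexts/ordering-pizza",
    lifespan_count=5,  # expires after 5 DetectIntent requests (or 20 minutes)
)
created = contexts_client.create_context(
    request={"parent": session, "context": context}
)
print(created.name)
```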

service Documents

document.proto:38

Manages documents of a knowledge base.

service EntityTypes

entity_type.proto:64

Entities are extracted from user input and represent parameters that are meaningful to your application. For example, a date range, a proper name such as a geographic location or landmark, and so on. Entities represent actionable data for your application.

When you define an entity, you can also include synonyms that all map to that entity. For example, "soft drink", "soda", "pop", and so on.

There are three types of entities:

*   **System** - entities that are defined by the Dialogflow API for common data types such as date, time, currency, and so on. A system entity is represented by the `EntityType` type.
*   **Developer** - entities that are defined by you that represent actionable data that is meaningful to your application. For example, you could define a `pizza.sauce` entity for red or white pizza sauce, a `pizza.cheese` entity for the different types of cheese on a pizza, a `pizza.topping` entity for different toppings, and so on. A developer entity is represented by the `EntityType` type.
*   **User** - entities that are built for an individual user such as favorites, preferences, playlists, and so on. A user entity is represented by the [SessionEntityType][google.cloud.dialogflow.v2beta1.SessionEntityType] type.

For more information about entity types, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/entities-overview).
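
A minimal sketch of defining a developer entity type with synonyms, assuming the `google-cloud-dialogflow` Python client; the project ID and the `pizza.topping` entries are placeholder values:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

entity_types_client = dialogflow.EntityTypesClient()
parent = "projects/my-project/agent"  # placeholder project ID

# A map-kind entity type: each entry maps synonyms onto a canonical value.
entity_type = dialogflow.EntityType(
    display_name="pizza.topping",
    kind=dialogflow.EntityType.Kind.KIND_MAP,
    entities=[
        dialogflow.EntityType.Entity(value="mushroom", synonyms=["mushroom", "mushrooms"]),
        dialogflow.EntityType.Entity(value="pepperoni", synonyms=["pepperoni", "salami"]),
    ],
)
created = entity_types_client.create_entity_type(
    request={"parent": parent, "entity_type": entity_type}
)
print(created.name)
```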

service Intents

intent.proto:72

An intent represents a mapping between input from a user and an action to be taken by your application. When you pass user input to the [DetectIntent][google.cloud.dialogflow.v2beta1.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2beta1.Sessions.StreamingDetectIntent]) method, the Dialogflow API analyzes the input and searches for a matching intent. If no match is found, the Dialogflow API returns a fallback intent (`is_fallback` = true).

You can provide additional information for the Dialogflow API to use to match user input to an intent by adding the following to your intent.

*   **Contexts** - provide additional context for intent analysis. For example, if an intent is related to an object in your application that plays music, you can provide a context to determine when to match the intent if the user input is "turn it off". You can include a context that matches the intent when there is previous user input of "play music", and not when there is previous user input of "turn on the light".
*   **Events** - allow for matching an intent by using an event name instead of user input. Your application can provide an event name and related parameters to the Dialogflow API to match an intent. For example, when your application starts, you can send a welcome event with a user name parameter to the Dialogflow API to match an intent with a personalized welcome message for the user.
*   **Training phrases** - provide examples of user input to train the Dialogflow API agent to better match intents.

For more information about intents, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/intents-overview).
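
As a sketch only (same assumed Python client; the display name, phrases, and response text below are placeholders), creating an intent with training phrases and a text response looks roughly like this:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

intents_client = dialogflow.IntentsClient()
parent = "projects/my-project/agent"  # placeholder project ID

intent = dialogflow.Intent(
    display_name="order.pizza",
    training_phrases=[
        dialogflow.Intent.TrainingPhrase(
            parts=[dialogflow.Intent.TrainingPhrase.Part(text="I want a pizza")]
        ),
        dialogflow.Intent.TrainingPhrase(
            parts=[dialogflow.Intent.TrainingPhrase.Part(text="order a pizza for me")]
        ),
    ],
    messages=[
        dialogflow.Intent.Message(
            text=dialogflow.Intent.Message.Text(text=["Sure, what toppings?"])
        )
    ],
)
created = intents_client.create_intent(request={"parent": parent, "intent": intent})
print(created.name)
```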

service KnowledgeBases

knowledge_base.proto:36

Manages knowledge bases. Allows users to set up and maintain knowledge bases with their knowledge data.

service SessionEntityTypes

session_entity_type.proto:51

Entities are extracted from user input and represent parameters that are meaningful to your application. For example, a date range, a proper name such as a geographic location or landmark, and so on. Entities represent actionable data for your application. Session entity types are referred to as **User** entity types and are entities that are built for an individual user such as favorites, preferences, playlists, and so on. You can redefine a session entity type at the session level. Session entity methods do not work with Google Assistant integration. Contact Dialogflow support if you need to use session entities with Google Assistant integration. For more information about entity types, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/entities-overview).
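
A hedged sketch of redefining an entity type at the session level, assuming the same Python client; the IDs and the `pizza.topping` type are placeholders carried over from the entity-type example above:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

client = dialogflow.SessionEntityTypesClient()
session = "projects/my-project/agent/sessions/my-session-id"  # placeholders

# Supplement the agent-level "pizza.topping" entity type for this session only.
session_entity_type = dialogflow.SessionEntityType(
    name=f"{session}/entityTypes/pizza.topping",
    entity_override_mode=(
        dialogflow.SessionEntityType.EntityOverrideMode.ENTITY_OVERRIDE_MODE_SUPPLEMENT
    ),
    entities=[dialogflow.EntityType.Entity(value="pineapple", synonyms=["pineapple"])],
)
created = client.create_session_entity_type(
    request={"parent": session, "session_entity_type": session_entity_type}
)
print(created.name)
```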

service Sessions

session.proto:46

A session represents an interaction with a user. You retrieve user input and pass it to the [DetectIntent][google.cloud.dialogflow.v2beta1.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2beta1.Sessions.StreamingDetectIntent]) method to determine user intent and respond.
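
For example, a minimal text query against a session (assuming the `google-cloud-dialogflow` Python client; project and session IDs are placeholders):

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "my-session-id")  # placeholders

query_input = dialogflow.QueryInput(
    text=dialogflow.TextInput(text="I want a large pepperoni pizza", language_code="en-US")
)
response = session_client.detect_intent(
    request={"session": session, "query_input": query_input}
)

result = response.query_result
print(result.intent.display_name, result.intent_detection_confidence)
print(result.fulfillment_text)
```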

message Agent

agent.proto:193

Represents a conversational agent.

Used as response type in: Agents.GetAgent, Agents.SetAgent

Used as field type in: SearchAgentsResponse, SetAgentRequest

enum Agent.ApiVersion

agent.proto:215

API version for the agent.

Used in: Agent

enum Agent.MatchMode

agent.proto:201

Match mode determines how intents are detected from user queries.

Used in: Agent

enum Agent.Tier

agent.proto:230

Represents the agent tier.

Used in: Agent

enum AudioEncoding

audio_config.proto:37

Audio encoding of the audio content sent in the conversational query request. Refer to the [Cloud Speech API documentation](https://cloud.google.com/speech-to-text/docs/basics) for more details.

Used in: InputAudioConfig

message BatchUpdateEntityTypesResponse

entity_type.proto:407

The response message for [EntityTypes.BatchUpdateEntityTypes][google.cloud.dialogflow.v2beta1.EntityTypes.BatchUpdateEntityTypes].

message BatchUpdateIntentsResponse

intent.proto:1379

The response message for [Intents.BatchUpdateIntents][google.cloud.dialogflow.v2beta1.Intents.BatchUpdateIntents].

message Context

context.proto:166

Represents a context.

Used as response type in: Contexts.CreateContext, Contexts.GetContext, Contexts.UpdateContext

Used as field type in: CreateContextRequest, Intent, ListContextsResponse, QueryParameters, QueryResult, UpdateContextRequest, WebhookResponse

message Document

document.proto:148

A document resource. Note: The `projects.agent.knowledgeBases.documents` resource is deprecated; only use `projects.knowledgeBases.documents`.

Used as response type in: Documents.GetDocument

Used as field type in: CreateDocumentRequest, ListDocumentsResponse, UpdateDocumentRequest

enum Document.KnowledgeType

document.proto:150

The knowledge type of document content.

Used in: Document

message EntityType

entity_type.proto:200

Represents an entity type. Entity types serve as a tool for extracting parameter values from natural language queries.

Used as response type in: EntityTypes.CreateEntityType, EntityTypes.GetEntityType, EntityTypes.UpdateEntityType

Used as field type in: BatchUpdateEntityTypesResponse, CreateEntityTypeRequest, EntityTypeBatch, ListEntityTypesResponse, UpdateEntityTypeRequest

enum EntityType.AutoExpansionMode

entity_type.proto:249

Represents different entity type expansion modes. Automated expansion allows an agent to recognize values that have not been explicitly listed in the entity (for example, new kinds of shopping list items).

Used in: EntityType

message EntityType.Entity

entity_type.proto:202

An **entity entry** for an associated entity type.

Used in: BatchCreateEntitiesRequest, BatchUpdateEntitiesRequest, EntityType, SessionEntityType

enum EntityType.Kind

entity_type.proto:228

Represents kinds of entities.

Used in: EntityType

message EntityTypeBatch

entity_type.proto:483

This message is a wrapper around a collection of entity types.

Used in: BatchUpdateEntityTypesRequest

message EventInput

session.proto:610

Events allow for matching intents by event name instead of the natural language input. For instance, input `<event: { name: "welcome_event", parameters: { name: "Sam" } }>` can trigger a personalized welcome response. The parameter `name` may be used by the agent in the response: `"Hello #welcome_event.name! What can I do for you today?"`.

Used in: QueryInput, WebhookResponse
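
A sketch of triggering the `welcome_event` from the example above via `DetectIntent` (same assumed Python client and placeholder IDs):

```python
from google.cloud import dialogflow_v2beta1 as dialogflow
from google.protobuf import struct_pb2

# Event parameters are a protobuf Struct; "Sam" mirrors the example above.
parameters = struct_pb2.Struct()
parameters["name"] = "Sam"

query_input = dialogflow.QueryInput(
    event=dialogflow.EventInput(
        name="welcome_event", parameters=parameters, language_code="en-US"
    )
)

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "my-session-id")  # placeholders
response = session_client.detect_intent(
    request={"session": session, "query_input": query_input}
)
print(response.query_result.fulfillment_text)
```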

message ExportAgentResponse

agent.proto:373

The response message for [Agents.ExportAgent][google.cloud.dialogflow.v2beta1.Agents.ExportAgent].

message GcsSource

gcs.proto:31

Google Cloud Storage location for a single input.

Used in: ReloadDocumentRequest

message InputAudioConfig

audio_config.proto:178

Instructs the speech recognizer on how to process the audio content.

Used in: QueryInput

message Intent

intent.proto:164

Represents an intent. Intents convert a number of user expressions or patterns into an action. An action is an extraction of a user command or sentence semantics.

Used as response type in: Intents.CreateIntent, Intents.GetIntent, Intents.UpdateIntent

Used as field type in: BatchDeleteIntentsRequest, BatchUpdateIntentsResponse, CreateIntentRequest, IntentBatch, ListIntentsResponse, QueryResult, UpdateIntentRequest

message Intent.FollowupIntentInfo

intent.proto:1107

Represents a single followup intent in the chain.

Used in: Intent

message Intent.Message

intent.proto:286

Corresponds to the `Response` field in the Dialogflow console.

Used in: Intent, QueryResult, WebhookResponse

message Intent.Message.BasicCard

intent.proto:362

The basic card message. Useful for displaying information.

Used in: Message

message Intent.Message.BasicCard.Button

intent.proto:364

The button object that appears at the bottom of a card.

Used in: BasicCard, TableCard

message Intent.Message.BasicCard.Button.OpenUriAction

intent.proto:366

Opens the given URI.

Used in: Button

message Intent.Message.BrowseCarouselCard

intent.proto:796

Browse Carousel Card for Actions on Google. https://developers.google.com/actions/assistant/responses#browsing_carousel

Used in: Message

message Intent.Message.BrowseCarouselCard.BrowseCarouselCardItem

intent.proto:798

Browsing carousel tile.

Used in: BrowseCarouselCard

message Intent.Message.BrowseCarouselCard.BrowseCarouselCardItem.OpenUrlAction

intent.proto:800

Actions on Google action to open a given url.

Used in: BrowseCarouselCardItem

enum Intent.Message.BrowseCarouselCard.BrowseCarouselCardItem.OpenUrlAction.UrlTypeHint

intent.proto:802

Type of the URI.

Used in: OpenUrlAction

enum Intent.Message.BrowseCarouselCard.ImageDisplayOptions

intent.proto:843

Image display options for Actions on Google. This should be used for when the image's aspect ratio does not match the image container's aspect ratio.

Used in: BrowseCarouselCard

message Intent.Message.Card

intent.proto:313

The card response message.

Used in: Message

message Intent.Message.Card.Button

intent.proto:315

Optional. Contains information about a button.

Used in: Card

message Intent.Message.CarouselSelect

intent.proto:443

The card for presenting a carousel of options to select from.

Used in: Message

message Intent.Message.CarouselSelect.Item

intent.proto:445

An item in the carousel.

Used in: CarouselSelect

message Intent.Message.ColumnProperties

intent.proto:900

Column properties for [TableCard][google.cloud.dialogflow.v2beta1.Intent.Message.TableCard].

Used in: TableCard

enum Intent.Message.ColumnProperties.HorizontalAlignment

intent.proto:902

Text alignments within a cell.

Used in: ColumnProperties

message Intent.Message.Image

intent.proto:294

The image response message.

Used in: Message, BasicCard, BrowseCarouselCard.BrowseCarouselCardItem, CarouselSelect.Item, ListSelect.Item, MediaContent.ResponseMediaObject, TableCard

message Intent.Message.LinkOutSuggestion

intent.proto:409

The suggestion chip message that allows the user to jump out to the app or website associated with this agent.

Used in: Message

message Intent.Message.ListSelect

intent.proto:419

The card for presenting a list of options to select from.

Used in: Message

message Intent.Message.ListSelect.Item

intent.proto:421

An item in the list.

Used in: ListSelect

message Intent.Message.MediaContent

intent.proto:756

The media content card for Actions on Google.

Used in: Message

message Intent.Message.MediaContent.ResponseMediaObject

intent.proto:758

Response media object for media content card.

Used in: MediaContent

enum Intent.Message.MediaContent.ResponseMediaType

intent.proto:779

Format of response media type.

Used in: MediaContent

enum Intent.Message.Platform

intent.proto:939

Represents different platforms that a rich message can be intended for.

Used in: Intent, Message

message Intent.Message.QuickReplies

intent.proto:304

The quick replies response message.

Used in: Message

message Intent.Message.RbmCardContent

intent.proto:606

Rich Business Messaging (RBM) Card content.

Used in: RbmCarouselCard, RbmStandaloneCard

message Intent.Message.RbmCardContent.RbmMedia

intent.proto:625

Rich Business Messaging (RBM) Media displayed in Cards. The following media-types are currently supported:

**Image Types**

*   image/jpeg
*   image/jpg
*   image/gif
*   image/png

**Video Types**

*   video/h263
*   video/m4v
*   video/mp4
*   video/mpeg
*   video/mpeg4
*   video/webm

Used in: RbmCardContent

enum Intent.Message.RbmCardContent.RbmMedia.Height

intent.proto:627

Media height.

Used in: RbmMedia

message Intent.Message.RbmCarouselCard

intent.proto:538

Carousel Rich Business Messaging (RBM) rich card. Rich cards allow you to respond to users with more vivid content, e.g. with media and suggestions. For more details about RBM rich cards, please see: https://developers.google.com/rcs-business-messaging/rbm/guides/build/send-messages#rich-cards. If you want to show a single card with more control over the layout, please use [RbmStandaloneCard][google.cloud.dialogflow.v2beta1.Intent.Message.RbmStandaloneCard] instead.

Used in: Message

enum Intent.Message.RbmCarouselCard.CardWidth

intent.proto:540

The width of the cards in the carousel.

Used in: RbmCarouselCard

message Intent.Message.RbmStandaloneCard

intent.proto:568

Standalone Rich Business Messaging (RBM) rich card. Rich cards allow you to respond to users with more vivid content, e.g. with media and suggestions. For more details about RBM rich cards, please see: https://developers.google.com/rcs-business-messaging/rbm/guides/build/send-messages#rich-cards. You can group multiple rich cards into one using [RbmCarouselCard][google.cloud.dialogflow.v2beta1.Intent.Message.RbmCarouselCard] but carousel cards will give you less control over the card layout.

Used in: Message

enum Intent.Message.RbmStandaloneCard.CardOrientation

intent.proto:570

Orientation of the card.

Used in: RbmStandaloneCard

enum Intent.Message.RbmStandaloneCard.ThumbnailImageAlignment

intent.proto:583

Thumbnail preview alignment for standalone cards with horizontal layout.

Used in: RbmStandaloneCard

message Intent.Message.RbmSuggestedAction

intent.proto:708

Rich Business Messaging (RBM) suggested client-side action that the user can choose from the card.

Used in: RbmSuggestion

message Intent.Message.RbmSuggestedAction.RbmSuggestedActionDial

intent.proto:711

Opens the user's default dialer app with the specified phone number but does not dial automatically (https://goo.gl/ergbB2).

Used in: RbmSuggestedAction

message Intent.Message.RbmSuggestedAction.RbmSuggestedActionOpenUri

intent.proto:723

Opens the user's default web browser app to the specified uri (https://goo.gl/6GLJD2). If the user has an app installed that is registered as the default handler for the URL, then this app will be opened instead, and its icon will be used in the suggested action UI.

Used in: RbmSuggestedAction

message Intent.Message.RbmSuggestedAction.RbmSuggestedActionShareLocation

intent.proto:730

Opens the device's location chooser so the user can pick a location to send back to the agent (https://goo.gl/GXotJW).

Used in: RbmSuggestedAction

(message has no fields)

message Intent.Message.RbmSuggestedReply

intent.proto:696

Rich Business Messaging (RBM) suggested reply that the user can click instead of typing in their own response.

Used in: RbmSuggestion

message Intent.Message.RbmSuggestion

intent.proto:683

Rich Business Messaging (RBM) suggestion. Suggestions allow the user to easily select/click a predefined response or perform an action (like opening a web URI).

Used in: RbmCardContent, RbmText

message Intent.Message.RbmText

intent.proto:521

Rich Business Messaging (RBM) text response with suggestions.

Used in: Message

message Intent.Message.SelectItemInfo

intent.proto:465

Additional info about the select item for when it is triggered in a dialog.

Used in: CarouselSelect.Item, ListSelect.Item

message Intent.Message.SimpleResponse

intent.proto:338

The simple response message containing speech or text.

Used in: SimpleResponses

message Intent.Message.SimpleResponses

intent.proto:356

The collection of simple response candidates. This message in `QueryResult.fulfillment_messages` and `WebhookResponse.fulfillment_messages` should contain only one `SimpleResponse`.

Used in: Message

message Intent.Message.Suggestion

intent.proto:396

The suggestion chip message that the user can tap to quickly post a reply to the conversation.

Used in: Suggestions

message Intent.Message.Suggestions

intent.proto:402

The collection of suggestions.

Used in: Message

message Intent.Message.TableCard

intent.proto:879

Table card for Actions on Google.

Used in: Message

message Intent.Message.TableCardCell

intent.proto:933

Cell of [TableCardRow][google.cloud.dialogflow.v2beta1.Intent.Message.TableCardRow].

Used in: TableCardRow

message Intent.Message.TableCardRow

intent.proto:924

Row of [TableCard][google.cloud.dialogflow.v2beta1.Intent.Message.TableCard].

Used in: TableCard

message Intent.Message.TelephonyPlayAudio

intent.proto:476

Plays audio from a file in Telephony Gateway.

Used in: Message

message Intent.Message.TelephonySynthesizeSpeech

intent.proto:499

Synthesizes speech and plays back the synthesized audio to the caller in Telephony Gateway. Telephony Gateway takes the synthesizer settings from `DetectIntentResponse.output_audio_config` which can either be set at request-level or can come from the agent-level synthesizer config.

Used in: Message

message Intent.Message.TelephonyTransferCall

intent.proto:512

Transfers the call in Telephony Gateway.

Used in: Message

message Intent.Message.Text

intent.proto:288

The text response message.

Used in: Message

message Intent.Parameter

intent.proto:246

Represents intent parameters.

Used in: Intent

message Intent.TrainingPhrase

intent.proto:172

Represents an example that the agent is trained on.

Used in: Intent

message Intent.TrainingPhrase.Part

intent.proto:174

Represents a part of a training phrase.

Used in: TrainingPhrase

enum Intent.TrainingPhrase.Type

intent.proto:195

Represents different types of training phrases.

Used in: TrainingPhrase

enum Intent.WebhookState

intent.proto:1118

Represents the different states that webhooks can be in.

Used in: Intent

message IntentBatch

intent.proto:1396

This message is a wrapper around a collection of intents.

Used in: BatchUpdateIntentsRequest

enum IntentView

intent.proto:1404

Represents the options for views of an intent. An intent can be a sizable object. Therefore, we provide a resource view that does not return training phrases in the response by default.

Used in: BatchUpdateIntentsRequest, CreateIntentRequest, GetIntentRequest, ListIntentsRequest, UpdateIntentRequest

message KnowledgeAnswers

session.proto:313

Represents the result of querying a Knowledge base.

Used in: QueryResult

message KnowledgeAnswers.Answer

session.proto:315

An answer from Knowledge Connector.

Used in: KnowledgeAnswers

enum KnowledgeAnswers.Answer.MatchConfidenceLevel

session.proto:318

Represents the system's confidence that this knowledge answer is a good match for this conversational query.

Used in: Answer

message KnowledgeBase

knowledge_base.proto:116

Represents a knowledge base resource. Note: The `projects.agent.knowledgeBases` resource is deprecated; only use `projects.knowledgeBases`.

Used as response type in: KnowledgeBases.CreateKnowledgeBase, KnowledgeBases.GetKnowledgeBase, KnowledgeBases.UpdateKnowledgeBase

Used as field type in: CreateKnowledgeBaseRequest, ListKnowledgeBasesResponse, UpdateKnowledgeBaseRequest

message KnowledgeOperationMetadata

document.proto:271

Metadata in google::longrunning::Operation for Knowledge operations.

enum KnowledgeOperationMetadata.State

document.proto:273

States of the operation.

Used in: KnowledgeOperationMetadata

message OriginalDetectIntentRequest

webhook.proto:123

Represents the contents of the original request that was passed to the `[Streaming]DetectIntent` call.

Used in: WebhookRequest

message OutputAudioConfig

audio_config.proto:329

Instructs the speech synthesizer how to generate the output audio content.

Used in: DetectIntentRequest, DetectIntentResponse, StreamingDetectIntentRequest, StreamingDetectIntentResponse

enum OutputAudioEncoding

audio_config.proto:310

Audio encoding of the output audio format in Text-To-Speech.

Used in: OutputAudioConfig

message QueryInput

session.proto:204

Represents the query input. It can contain one of:

1.  An audio config which instructs the speech recognizer how to process the speech audio.
2.  A conversational query in the form of text.
3.  An event that specifies which intent to trigger.

Used in: DetectIntentRequest, StreamingDetectIntentRequest
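
The three variants of that oneof, as a sketch with the same assumed Python client (the encoding, sample rate, and names are placeholder choices):

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

# 1. Audio config: instructs the recognizer how to process audio sent later.
audio_query = dialogflow.QueryInput(
    audio_config=dialogflow.InputAudioConfig(
        audio_encoding=dialogflow.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )
)

# 2. Text: a conversational query as a string.
text_query = dialogflow.QueryInput(
    text=dialogflow.TextInput(text="what is the weather", language_code="en-US")
)

# 3. Event: trigger a specific intent by event name instead of language input.
event_query = dialogflow.QueryInput(
    event=dialogflow.EventInput(name="welcome_event", language_code="en-US")
)
```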

message QueryParameters

session.proto:157

Represents the parameters of the conversational query.

Used in: DetectIntentRequest, StreamingDetectIntentRequest

message QueryResult

session.proto:219

Represents the result of conversational query or event processing.

Used in: DetectIntentResponse, StreamingDetectIntentResponse, WebhookRequest

message Sentiment

session.proto:641

The sentiment, such as positive/negative feeling or association, for a unit of analysis, such as the query text.

Used in: SentimentAnalysisResult

message SentimentAnalysisRequestConfig

session.proto:625

Configures the types of sentiment analysis to perform.

Used in: QueryParameters

message SentimentAnalysisResult

session.proto:634

The result of sentiment analysis as configured by `sentiment_analysis_request_config`.

Used in: QueryResult

message SessionEntityType

session_entity_type.proto:176

Represents a session entity type. Extends or replaces a developer entity type at the user session level (we refer to the entity types defined at the agent level as "developer entity types"). Note: session entity types apply to all queries, regardless of the language.

Used as response type in: SessionEntityTypes.CreateSessionEntityType, SessionEntityTypes.GetSessionEntityType, SessionEntityTypes.UpdateSessionEntityType

Used as field type in: CreateSessionEntityTypeRequest, ListSessionEntityTypesResponse, QueryParameters, UpdateSessionEntityTypeRequest, WebhookResponse

enum SessionEntityType.EntityOverrideMode

session_entity_type.proto:178

The types of modifications for a session entity type.

Used in: SessionEntityType

message SpeechContext

audio_config.proto:83

Hints for the speech recognizer to help with recognition in a specific conversation state.

Used in: InputAudioConfig

enum SpeechModelVariant

audio_config.proto:116

Variant of the specified [Speech model][google.cloud.dialogflow.v2beta1.InputAudioConfig.model] to use. See the [Cloud Speech documentation](https://cloud.google.com/speech-to-text/docs/enhanced-models) for which models have different variants. For example, the "phone_call" model has both a standard and an enhanced variant. When you use an enhanced model, you will generally receive higher quality results than for a standard model.

Used in: InputAudioConfig

message SpeechWordInfo

audio_config.proto:153

Information for a word recognized by the speech recognizer.

Used in: StreamingRecognitionResult

enum SsmlVoiceGender

audio_config.proto:248

Gender of the voice as described in [SSML voice element](https://www.w3.org/TR/speech-synthesis11/#edef_voice).

Used in: VoiceSelectionParams

message StreamingRecognitionResult

session.proto:532

Contains a speech recognition result corresponding to a portion of the audio that is currently being processed, or an indication that this is the end of the single requested utterance.

Example:

1.  transcript: "tube"
2.  transcript: "to be a"
3.  transcript: "to be"
4.  transcript: "to be or not to be" is_final: true
5.  transcript: " that's"
6.  transcript: " that is"
7.  message_type: `END_OF_SINGLE_UTTERANCE`
8.  transcript: " that is the question" is_final: true

Only two of the responses contain final results (#4 and #8, indicated by `is_final: true`). Concatenating these generates the full transcript: "to be or not to be that is the question".

In each response we populate:

*   for `TRANSCRIPT`: `transcript` and possibly `is_final`;
*   for `END_OF_SINGLE_UTTERANCE`: only `message_type`.

Used in: StreamingDetectIntentResponse
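
A hedged sketch of consuming these results from `StreamingDetectIntent` (same assumed Python client; the audio chunks and IDs are placeholders):

```python
from google.cloud import dialogflow_v2beta1 as dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "my-session-id")  # placeholders

def request_stream(audio_chunks):
    # The first request carries the session and audio config;
    # every later request carries a chunk of raw audio bytes.
    yield dialogflow.StreamingDetectIntentRequest(
        session=session,
        query_input=dialogflow.QueryInput(
            audio_config=dialogflow.InputAudioConfig(
                audio_encoding=dialogflow.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
                sample_rate_hertz=16000,
                language_code="en-US",
            )
        ),
    )
    for chunk in audio_chunks:
        yield dialogflow.StreamingDetectIntentRequest(input_audio=chunk)

chunks = [b"\x00\x00"]  # placeholder for real 16-bit linear PCM frames
for response in session_client.streaming_detect_intent(requests=request_stream(chunks)):
    rec = response.recognition_result
    if rec.message_type == dialogflow.StreamingRecognitionResult.MessageType.TRANSCRIPT:
        print(rec.transcript, "(final)" if rec.is_final else "(interim)")
```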

enum StreamingRecognitionResult.MessageType

session.proto:534

Type of the response message.

Used in: StreamingRecognitionResult

message SynthesizeSpeechConfig

audio_config.proto:278

Configuration of how speech should be synthesized.

Used in: OutputAudioConfig

message TextInput

session.proto:593

Represents the natural language text to be processed.

Used in: QueryInput

message ValidationError

validation_result.proto:31

Represents a single validation error.

Used in: ValidationResult

enum ValidationError.Severity

validation_result.proto:33

Represents a level of severity.

Used in: ValidationError

message VoiceSelectionParams

audio_config.proto:264

Description of which voice to use for speech synthesis.

Used in: SynthesizeSpeechConfig

message WebhookRequest

webhook.proto:36

The request message for a webhook call.

message WebhookResponse

webhook.proto:61

The response message for a webhook call.