package google.cloud.dialogflow.v2

service Agents

agent.proto:64

Agents are best described as Natural Language Understanding (NLU) modules that transform user requests into actionable data. You can include agents in your app, product, or service to determine user intent and respond to the user in a natural way.

After you create an agent, you can add [Intents][google.cloud.dialogflow.v2.Intents], [Contexts][google.cloud.dialogflow.v2.Contexts], [Entity Types][google.cloud.dialogflow.v2.EntityTypes], [Webhooks][google.cloud.dialogflow.v2.WebhookRequest], and so on to manage the flow of a conversation and match user input to predefined intents and actions.

You can create an agent using either Dialogflow Standard Edition or Dialogflow Enterprise Edition. For details, see [Dialogflow Editions](https://cloud.google.com/dialogflow/docs/editions).

You can save your agent for backup or versioning by exporting it with the [ExportAgent][google.cloud.dialogflow.v2.Agents.ExportAgent] method. You can import a saved agent with the [ImportAgent][google.cloud.dialogflow.v2.Agents.ImportAgent] method.

Dialogflow provides several [prebuilt agents](https://cloud.google.com/dialogflow/docs/agents-prebuilt) for common conversation scenarios such as determining a date and time, converting currency, and so on.

For more information about agents, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/agents-overview).

service Contexts

context.proto:54

A context represents additional information included with user input or with an intent returned by the Dialogflow API. Contexts are helpful for differentiating user input which may be vague or have a different meaning depending on additional details from your application such as user settings and preferences, previous user input, where the user is in your application, geographic location, and so on.

You can include contexts as input parameters of a [DetectIntent][google.cloud.dialogflow.v2.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2.Sessions.StreamingDetectIntent]) request, or as output contexts included in the returned intent.

Contexts expire when an intent is matched, after the number of `DetectIntent` requests specified by the `lifespan_count` parameter, or after 20 minutes if no intents are matched for a `DetectIntent` request.

For more information about contexts, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/contexts-overview).
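The expiry rules above can be sketched in plain Python. This is an illustrative model only, not the API's actual types: the `Context` dataclass, `tick`, and the 20-minute `ttl_seconds` default mirror the documented behavior (lifespan decremented per `DetectIntent` request, time-based expiry) under the assumption that the service tracks both counters per context.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Context:
    """Illustrative stand-in for a Dialogflow context (not the real API type)."""
    name: str
    lifespan_count: int          # remaining DetectIntent requests before expiry
    created_at: float = field(default_factory=time.time)

    def is_expired(self, now: float, ttl_seconds: float = 20 * 60) -> bool:
        # A context expires when its lifespan runs out, or after 20 minutes
        # without a matched intent.
        return self.lifespan_count <= 0 or (now - self.created_at) > ttl_seconds

def tick(contexts: list[Context]) -> list[Context]:
    """Simulate one DetectIntent request: decrement lifespans, drop expired."""
    now = time.time()
    for ctx in contexts:
        ctx.lifespan_count -= 1
    return [c for c in contexts if not c.is_expired(now)]

active = [Context("play-music", lifespan_count=2), Context("one-shot", lifespan_count=1)]
active = tick(active)   # "one-shot" reaches 0 and expires
print([c.name for c in active])  # → ['play-music']
```

In the real API, `lifespan_count` is set per output context; a context with a higher lifespan survives more conversational turns.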

service EntityTypes

entity_type.proto:64

Entities are extracted from user input and represent parameters that are meaningful to your application. For example, a date range, a proper name such as a geographic location or landmark, and so on. Entities represent actionable data for your application.

When you define an entity, you can also include synonyms that all map to that entity. For example, "soft drink", "soda", "pop", and so on.

There are three types of entities:

* **System** - entities that are defined by the Dialogflow API for common data types such as date, time, currency, and so on. A system entity is represented by the `EntityType` type.
* **Developer** - entities that are defined by you that represent actionable data that is meaningful to your application. For example, you could define a `pizza.sauce` entity for red or white pizza sauce, a `pizza.cheese` entity for the different types of cheese on a pizza, a `pizza.topping` entity for different toppings, and so on. A developer entity is represented by the `EntityType` type.
* **User** - entities that are built for an individual user such as favorites, preferences, playlists, and so on. A user entity is represented by the [SessionEntityType][google.cloud.dialogflow.v2.SessionEntityType] type.

For more information about entity types, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/entities-overview).
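The synonym mechanism can be sketched as a simple mapping from each synonym to a canonical entity value. This is a minimal illustration, not Dialogflow's actual extraction algorithm; `DRINK_SYNONYMS` and `extract_entities` are hypothetical names reusing the "soft drink"/"soda"/"pop" example from above.

```python
# Map each synonym to its canonical entity value, then scan user input.
DRINK_SYNONYMS = {
    "soft drink": "soda",
    "soda": "soda",
    "pop": "soda",
}

def extract_entities(text: str, synonyms: dict[str, str]) -> list[str]:
    """Return canonical values for every synonym found in the input."""
    lowered = text.lower()
    found = []
    # Check longer synonyms first so "soft drink" is matched as a whole phrase.
    for phrase in sorted(synonyms, key=len, reverse=True):
        if phrase in lowered:
            found.append(synonyms[phrase])
            lowered = lowered.replace(phrase, "")
    return found

print(extract_entities("Can I get a pop with that?", DRINK_SYNONYMS))  # → ['soda']
```

In the real API, the synonym-to-value mapping lives in `EntityType.Entity`, and the service performs the extraction as part of intent detection.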

service Intents

intent.proto:72

An intent represents a mapping between input from a user and an action to be taken by your application. When you pass user input to the [DetectIntent][google.cloud.dialogflow.v2.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2.Sessions.StreamingDetectIntent]) method, the Dialogflow API analyzes the input and searches for a matching intent. If no match is found, the Dialogflow API returns a fallback intent (`is_fallback` = true).

You can provide additional information for the Dialogflow API to use to match user input to an intent by adding the following to your intent:

* **Contexts** - provide additional context for intent analysis. For example, if an intent is related to an object in your application that plays music, you can provide a context to determine when to match the intent if the user input is "turn it off". You can include a context that matches the intent when there is previous user input of "play music", and not when there is previous user input of "turn on the light".
* **Events** - allow for matching an intent by using an event name instead of user input. Your application can provide an event name and related parameters to the Dialogflow API to match an intent. For example, when your application starts, you can send a welcome event with a user name parameter to the Dialogflow API to match an intent with a personalized welcome message for the user.
* **Training phrases** - provide examples of user input to train the Dialogflow API agent to better match intents.

For more information about intents, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/intents-overview).
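The match-or-fallback contract can be sketched as follows. This is a naive substring matcher for illustration only; real Dialogflow uses NLU models, not substring checks, and the `Intent` dataclass and `detect_intent` function here are hypothetical stand-ins. The one faithful part is the contract: when nothing matches, the intent returned has `is_fallback` set to true.

```python
from dataclasses import dataclass, field

@dataclass
class Intent:
    """Illustrative stand-in for an intent (not the real API type)."""
    display_name: str
    training_phrases: list[str] = field(default_factory=list)
    is_fallback: bool = False

def detect_intent(text: str, intents: list[Intent]) -> Intent:
    """Return the first intent with a matching training phrase; otherwise
    return the fallback intent, mirroring the is_fallback contract."""
    lowered = text.lower()
    for intent in intents:
        if any(phrase in lowered for phrase in intent.training_phrases):
            return intent
    return next(i for i in intents if i.is_fallback)

intents = [
    Intent("music.play", ["play music", "put on a song"]),
    Intent("Default Fallback Intent", is_fallback=True),
]
print(detect_intent("please play music", intents).display_name)   # → music.play
print(detect_intent("what is the weather", intents).is_fallback)  # → True
```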

service SessionEntityTypes

session_entity_type.proto:53

Entities are extracted from user input and represent parameters that are meaningful to your application. For example, a date range, a proper name such as a geographic location or landmark, and so on. Entities represent actionable data for your application.

Session entity types are referred to as **User** entity types and are entities that are built for an individual user such as favorites, preferences, playlists, and so on. You can redefine a session entity type at the session level.

Session entity methods do not work with Google Assistant integration. Contact Dialogflow support if you need to use session entities with Google Assistant integration.

For more information about entity types, see the [Dialogflow documentation](https://cloud.google.com/dialogflow/docs/entities-overview).

service Sessions

session.proto:44

A session represents an interaction with a user. You retrieve user input and pass it to the [DetectIntent][google.cloud.dialogflow.v2.Sessions.DetectIntent] (or [StreamingDetectIntent][google.cloud.dialogflow.v2.Sessions.StreamingDetectIntent]) method to determine user intent and respond.
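Each session is addressed by a resource name of the form `projects/<Project ID>/agent/sessions/<Session ID>`. A minimal helper for building that path (the function name `session_path` is our own; the official client libraries provide an equivalent helper):

```python
def session_path(project: str, session_id: str) -> str:
    """Build the resource name the Sessions API expects:
    projects/<Project ID>/agent/sessions/<Session ID>."""
    return f"projects/{project}/agent/sessions/{session_id}"

print(session_path("my-project", "abc-123"))
# → projects/my-project/agent/sessions/abc-123
```

Using a stable session ID per user conversation is what lets contexts and session entity types persist across consecutive `DetectIntent` calls.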

message Agent

agent.proto:176

Represents a conversational agent.

Used as response type in: Agents.GetAgent, Agents.SetAgent

Used as field type in: SearchAgentsResponse, SetAgentRequest

enum Agent.ApiVersion

agent.proto:192

API version for the agent.

Used in: Agent

enum Agent.MatchMode

agent.proto:178

Match mode determines how intents are detected from user queries.

Used in: Agent

enum Agent.Tier

agent.proto:207

Represents the agent tier.

Used in: Agent

enum AudioEncoding

audio_config.proto:37

Audio encoding of the audio content sent in the conversational query request. Refer to the [Cloud Speech API documentation](https://cloud.google.com/speech-to-text/docs/basics) for more details.

Used in: InputAudioConfig

message BatchUpdateEntityTypesResponse

entity_type.proto:427

The response message for [EntityTypes.BatchUpdateEntityTypes][google.cloud.dialogflow.v2.EntityTypes.BatchUpdateEntityTypes].

message BatchUpdateIntentsResponse

intent.proto:876

The response message for [Intents.BatchUpdateIntents][google.cloud.dialogflow.v2.Intents.BatchUpdateIntents].

message Context

context.proto:109

Represents a context.

Used as response type in: Contexts.CreateContext, Contexts.GetContext, Contexts.UpdateContext

Used as field type in: CreateContextRequest, Intent, ListContextsResponse, QueryParameters, QueryResult, UpdateContextRequest, WebhookResponse

message EntityType

entity_type.proto:190

Represents an entity type. Entity types serve as a tool for extracting parameter values from natural language queries.

Used as response type in: EntityTypes.CreateEntityType, EntityTypes.GetEntityType, EntityTypes.UpdateEntityType

Used as field type in: BatchUpdateEntityTypesResponse, CreateEntityTypeRequest, EntityTypeBatch, ListEntityTypesResponse, UpdateEntityTypeRequest

enum EntityType.AutoExpansionMode

entity_type.proto:244

Represents different entity type expansion modes. Automated expansion allows an agent to recognize values that have not been explicitly listed in the entity (for example, new kinds of shopping list items).

Used in: EntityType

message EntityType.Entity

entity_type.proto:197

An **entity entry** for an associated entity type.

Used in: BatchCreateEntitiesRequest, BatchUpdateEntitiesRequest, EntityType, SessionEntityType

enum EntityType.Kind

entity_type.proto:223

Represents kinds of entities.

Used in: EntityType

message EntityTypeBatch

entity_type.proto:523

This message is a wrapper around a collection of entity types.

Used in: BatchUpdateEntityTypesRequest

message EventInput

session.proto:476

Events allow for matching intents by event name instead of the natural language input. For instance, input `<event: { name: "welcome_event", parameters: { name: "Sam" } }>` can trigger a personalized welcome response. The parameter `name` may be used by the agent in the response: `"Hello #welcome_event.name! What can I do for you today?"`.
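The example above can be made concrete with a plain-dict sketch of the event input and the parameter reference in the response template. The dict shape loosely mirrors the `EventInput` fields (`name`, `parameters`, `language_code`); `event_query_input` is a hypothetical helper, and the string substitution stands in for Dialogflow's own `#event.parameter` resolution.

```python
# Shape of an event-driven query input, expressed as a plain dict for illustration.
def event_query_input(name: str, parameters: dict, language_code: str = "en-US") -> dict:
    return {"event": {"name": name,
                      "parameters": parameters,
                      "language_code": language_code}}

query_input = event_query_input("welcome_event", {"name": "Sam"})

# The agent's response template can reference the event parameter:
template = "Hello #welcome_event.name! What can I do for you today?"
rendered = template.replace("#welcome_event.name",
                            query_input["event"]["parameters"]["name"])
print(rendered)  # → Hello Sam! What can I do for you today?
```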

Used in: QueryInput, WebhookResponse

message ExportAgentResponse

agent.proto:379

The response message for [Agents.ExportAgent][google.cloud.dialogflow.v2.Agents.ExportAgent].

message InputAudioConfig

audio_config.proto:82

Instructs the speech recognizer how to process the audio content.

Used in: QueryInput

message Intent

intent.proto:157

Represents an intent. Intents convert a number of user expressions or patterns into an action. An action is an extraction of a user command or sentence semantics.

Used as response type in: Intents.CreateIntent, Intents.GetIntent, Intents.UpdateIntent

Used as field type in: BatchDeleteIntentsRequest, BatchUpdateIntentsResponse, CreateIntentRequest, IntentBatch, ListIntentsResponse, QueryResult, UpdateIntentRequest

message Intent.FollowupIntentInfo

intent.proto:603

Represents a single followup intent in the chain.

Used in: Intent

message Intent.Message

intent.proto:278

Corresponds to the `Response` field in the Dialogflow console.

Used in: Intent, QueryResult, WebhookResponse

message Intent.Message.BasicCard

intent.proto:354

The basic card message. Useful for displaying information.

Used in: Message

message Intent.Message.BasicCard.Button

intent.proto:356

The button object that appears at the bottom of a card.

Used in: BasicCard

message Intent.Message.BasicCard.Button.OpenUriAction

intent.proto:358

Opens the given URI.

Used in: Button

message Intent.Message.Card

intent.proto:305

The card response message.

Used in: Message

message Intent.Message.Card.Button

intent.proto:307

Contains information about a button.

Used in: Card

message Intent.Message.CarouselSelect

intent.proto:435

The card for presenting a carousel of options to select from.

Used in: Message

message Intent.Message.CarouselSelect.Item

intent.proto:437

An item in the carousel.

Used in: CarouselSelect

message Intent.Message.Image

intent.proto:286

The image response message.

Used in: Message, BasicCard, CarouselSelect.Item, ListSelect.Item

message Intent.Message.LinkOutSuggestion

intent.proto:401

The suggestion chip message that allows the user to jump out to the app or website associated with this agent.

Used in: Message

message Intent.Message.ListSelect

intent.proto:411

The card for presenting a list of options to select from.

Used in: Message

message Intent.Message.ListSelect.Item

intent.proto:413

An item in the list.

Used in: ListSelect

enum Intent.Message.Platform

intent.proto:468

Represents different platforms that a rich message can be intended for.

Used in: Intent, Message

message Intent.Message.QuickReplies

intent.proto:296

The quick replies response message.

Used in: Message

message Intent.Message.SelectItemInfo

intent.proto:457

Additional info about the select item for when it is triggered in a dialog.

Used in: CarouselSelect.Item, ListSelect.Item

message Intent.Message.SimpleResponse

intent.proto:330

The simple response message containing speech or text.

Used in: SimpleResponses

message Intent.Message.SimpleResponses

intent.proto:348

The collection of simple response candidates. This message in `QueryResult.fulfillment_messages` and `WebhookResponse.fulfillment_messages` should contain only one `SimpleResponse`.

Used in: Message

message Intent.Message.Suggestion

intent.proto:388

The suggestion chip message that the user can tap to quickly post a reply to the conversation.

Used in: Suggestions

message Intent.Message.Suggestions

intent.proto:394

The collection of suggestions.

Used in: Message

message Intent.Message.Text

intent.proto:280

The text response message.

Used in: Message

message Intent.Parameter

intent.proto:238

Represents intent parameters.

Used in: Intent

message Intent.TrainingPhrase

intent.proto:164

Represents an example that the agent is trained on.

Used in: Intent

message Intent.TrainingPhrase.Part

intent.proto:166

Represents a part of a training phrase.

Used in: TrainingPhrase

enum Intent.TrainingPhrase.Type

intent.proto:187

Represents different types of training phrases.

Used in: TrainingPhrase

enum Intent.WebhookState

intent.proto:614

Represents the different states that webhooks can be in.

Used in: Intent

message IntentBatch

intent.proto:898

This message is a wrapper around a collection of intents.

Used in: BatchUpdateIntentsRequest

enum IntentView

intent.proto:906

Represents the options for views of an intent. An intent can be a sizable object. Therefore, we provide a resource view that does not return training phrases in the response by default.

Used in: BatchUpdateIntentsRequest, CreateIntentRequest, GetIntentRequest, ListIntentsRequest, UpdateIntentRequest

message OriginalDetectIntentRequest

webhook.proto:115

Represents the contents of the original request that was passed to the `[Streaming]DetectIntent` call.

Used in: WebhookRequest

message OutputAudioConfig

audio_config.proto:231

Instructs the speech synthesizer on how to generate the output audio content.

Used in: DetectIntentRequest, DetectIntentResponse, StreamingDetectIntentRequest, StreamingDetectIntentResponse

enum OutputAudioEncoding

audio_config.proto:247

Audio encoding of the output audio format in Text-To-Speech.

Used in: OutputAudioConfig

message QueryInput

session.proto:168

Represents the query input. It can contain exactly one of the following:

1. An audio config which instructs the speech recognizer how to process the speech audio.
2. A conversational query in the form of text.
3. An event that specifies which intent to trigger.
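In the proto, these three alternatives form a `oneof`: a valid `QueryInput` carries exactly one of them. A small validator sketch (dict-based, for illustration; the real client libraries enforce this via the generated message types):

```python
def validate_query_input(query_input: dict) -> str:
    """A QueryInput must carry exactly one of the three input kinds."""
    kinds = [k for k in ("audio_config", "text", "event") if k in query_input]
    if len(kinds) != 1:
        raise ValueError(f"expected exactly one input kind, got {kinds}")
    return kinds[0]

print(validate_query_input({"text": {"text": "hi", "language_code": "en-US"}}))  # → text
```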

Used in: DetectIntentRequest, StreamingDetectIntentRequest

message QueryParameters

session.proto:128

Represents the parameters of the conversational query.

Used in: DetectIntentRequest, StreamingDetectIntentRequest

message QueryResult

session.proto:183

Represents the result of conversational query or event processing.

Used in: DetectIntentResponse, StreamingDetectIntentResponse, WebhookRequest

message Sentiment

session.proto:507

The sentiment, such as positive/negative feeling or association, for a unit of analysis, such as the query text.

Used in: SentimentAnalysisResult

message SentimentAnalysisRequestConfig

session.proto:491

Configures the types of sentiment analysis to perform.

Used in: QueryParameters

message SentimentAnalysisResult

session.proto:500

The result of sentiment analysis as configured by `sentiment_analysis_request_config`.

Used in: QueryResult

message SessionEntityType

session_entity_type.proto:131

Represents a session entity type. Extends or replaces a developer entity type at the user session level (we refer to the entity types defined at the agent level as "developer entity types"). Note: session entity types apply to all queries, regardless of the language.

Used as response type in: SessionEntityTypes.CreateSessionEntityType, SessionEntityTypes.GetSessionEntityType, SessionEntityTypes.UpdateSessionEntityType

Used as field type in: CreateSessionEntityTypeRequest, ListSessionEntityTypesResponse, QueryParameters, UpdateSessionEntityTypeRequest, WebhookResponse

enum SessionEntityType.EntityOverrideMode

session_entity_type.proto:138

The types of modifications for a session entity type.

Used in: SessionEntityType

enum SpeechModelVariant

audio_config.proto:131

Variant of the specified [Speech model][google.cloud.dialogflow.v2.InputAudioConfig.model] to use. See the [Cloud Speech documentation](https://cloud.google.com/speech-to-text/docs/enhanced-models) for which models have different variants. For example, the "phone_call" model has both a standard and an enhanced variant. When you use an enhanced model, you will generally receive higher quality results than for a standard model.

Used in: InputAudioConfig

enum SsmlVoiceGender

audio_config.proto:215

Gender of the voice as described in [SSML voice element](https://www.w3.org/TR/speech-synthesis11/#edef_voice).

Used in: VoiceSelectionParams

message StreamingRecognitionResult

session.proto:416

Contains a speech recognition result corresponding to a portion of the audio that is currently being processed, or an indication that this is the end of the single requested utterance.

Example:

1. transcript: "tube"
2. transcript: "to be a"
3. transcript: "to be"
4. transcript: "to be or not to be", is_final: true
5. transcript: " that's"
6. transcript: " that is"
7. message_type: `END_OF_SINGLE_UTTERANCE`
8. transcript: " that is the question", is_final: true

Only two of the responses contain final results (#4 and #8, indicated by `is_final: true`). Concatenating these generates the full transcript: "to be or not to be that is the question".

In each response we populate:

* for `TRANSCRIPT`: `transcript` and possibly `is_final`.
* for `END_OF_SINGLE_UTTERANCE`: only `message_type`.
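The concatenation rule above can be sketched directly from the example. The responses are modeled as plain dicts (the real type is the `StreamingRecognitionResult` message); only results marked `is_final` contribute to the full transcript.

```python
# The interim/final responses from the example above, in order.
responses = [
    {"transcript": "tube"},
    {"transcript": "to be a"},
    {"transcript": "to be"},
    {"transcript": "to be or not to be", "is_final": True},
    {"transcript": " that's"},
    {"transcript": " that is"},
    {"message_type": "END_OF_SINGLE_UTTERANCE"},
    {"transcript": " that is the question", "is_final": True},
]

# Only final results contribute to the full transcript; interim results are
# superseded by later responses.
full = "".join(r["transcript"] for r in responses if r.get("is_final"))
print(full)  # → to be or not to be that is the question
```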

Used in: StreamingDetectIntentResponse

enum StreamingRecognitionResult.MessageType

session.proto:418

Type of the response message.

Used in: StreamingRecognitionResult

message SynthesizeSpeechConfig

audio_config.proto:182

Configuration of how speech should be synthesized.

Used in: OutputAudioConfig

message TextInput

session.proto:459

Represents the natural language text to be processed.

Used in: QueryInput

message VoiceSelectionParams

audio_config.proto:168

Description of which voice to use for speech synthesis.

Used in: SynthesizeSpeechConfig

message WebhookRequest

webhook.proto:36

The request message for a webhook call.

message WebhookResponse

webhook.proto:58

The response message for a webhook call.