package yggdrasil_decision_forests.model.decision_tree.proto


message Categorical

decision_tree.proto:530

How to handle categorical input features.

Used in: DecisionTreeTrainingConfig

message Categorical.CART

decision_tree.proto:572

Used in: Categorical

(message has no fields)

message Categorical.OneHot

decision_tree.proto:574

Used in: Categorical

message Categorical.Random

decision_tree.proto:579

Used in: Categorical
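The strategies above (CART, OneHot, Random) differ in how candidate categorical splits are enumerated. As one illustration, the classic CART heuristic sorts categories by their mean label value and scans prefix subsets. A minimal sketch for binary labels (the sorting-and-scanning scheme is the textbook CART heuristic, not read from this proto; names are illustrative):

```python
from collections import defaultdict

def cart_categorical_split(values, labels):
    """Toy CART-style categorical splitter for binary labels.

    Sorts categories by their positive-label rate, then scans sorted
    prefixes as candidate "value in {subset}" conditions and returns
    the subset with the lowest weighted Gini impurity.
    """
    stats = defaultdict(lambda: [0, 0])  # category -> [count, positives]
    for v, y in zip(values, labels):
        stats[v][0] += 1
        stats[v][1] += y
    # Classic CART heuristic: order categories by mean label.
    order = sorted(stats, key=lambda c: stats[c][1] / stats[c][0])

    def gini(pos, n):
        if n == 0:
            return 0.0
        p = pos / n
        return 2 * p * (1 - p)

    total_n, total_pos = len(values), sum(labels)
    best_score, best_subset = float("inf"), None
    left_n = left_pos = 0
    for i in range(len(order) - 1):  # split between positions i and i+1
        left_n += stats[order[i]][0]
        left_pos += stats[order[i]][1]
        right_n, right_pos = total_n - left_n, total_pos - left_pos
        score = (left_n * gini(left_pos, left_n)
                 + right_n * gini(right_pos, right_n)) / total_n
        if score < best_score:
            best_score, best_subset = score, set(order[: i + 1])
    return best_subset
```

OneHot instead tests each category individually, and Random samples category subsets at random.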

message Condition

decision_tree.proto:86

The sub-messages of "ConditionParams" are the different types of condition that can be attached to a node.

Used in: NodeCondition

message Condition.ContainsBitmap

decision_tree.proto:104

Condition of the type: (value ∩ elements) != empty_set, where elements is stored as a bitmap over the possible values.

Used in: Condition

message Condition.ContainsVector

decision_tree.proto:98

Condition of the type: (value ∩ elements) != empty_set.

Used in: Condition
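The two "contains" conditions above test the same predicate with different storage for `elements`: an explicit list of category indices (ContainsVector) versus one bit per possible category (ContainsBitmap). A minimal sketch (the bit ordering within the bitmap is an assumption for illustration, not taken from the proto):

```python
def eval_contains_vector(value, elements):
    """ContainsVector-style check: (value ∩ elements) != empty_set,
    with `elements` stored as an explicit list of category indices."""
    return not set(value).isdisjoint(elements)

def eval_contains_bitmap(value, bitmap):
    """ContainsBitmap-style check: `bitmap` is a bytes object with one
    bit per possible categorical value (least-significant bit first)."""
    return any(bitmap[v // 8] & (1 << (v % 8)) for v in value)
```

The bitmap form trades O(|elements|) storage for O(1) membership tests, which pays off when the vocabulary is large: `bytes([0b00000110])` marks categories 1 and 2 as selected.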

message Condition.DiscretizedHigher

decision_tree.proto:110

Condition of the type: indexed_value >= indexed_threshold.

Used in: Condition

message Condition.Higher

decision_tree.proto:93

Condition of the type: value >= threshold.

Used in: Condition

message Condition.NA

decision_tree.proto:89

Next ID: 6

Condition of the type: value == NA (i.e. missing).

Used in: Condition

(message has no fields)

message Condition.NumericalVectorSequence

decision_tree.proto:133

Used in: Condition

message Condition.NumericalVectorSequence.Anchor

decision_tree.proto:134

Used in: CloserThan, ProjectedMoreThan

message Condition.NumericalVectorSequence.CloserThan

decision_tree.proto:139

Used in: NumericalVectorSequence

message Condition.NumericalVectorSequence.ProjectedMoreThan

decision_tree.proto:151

Used in: NumericalVectorSequence

message Condition.Oblique

decision_tree.proto:114

Used in: Condition

message Condition.TrueValue

decision_tree.proto:91

Condition of the type: value == True.

Used in: Condition

(message has no fields)

message DecisionTreeTrainingConfig

decision_tree.proto:25

Training configuration for a single decision tree, shared by the tree-based learning algorithms listed below.

Next ID: 26

Used in: cart.proto.CartTrainingConfig, gradient_boosted_trees.proto.GradientBoostedTreesTrainingConfig, isolation_forest.proto.IsolationForestTrainingConfig, random_forest.proto.RandomForestTrainingConfig

message DecisionTreeTrainingConfig.AxisAlignedSplit

decision_tree.proto:162

See "split_axis".

Used in: DecisionTreeTrainingConfig

(message has no fields)

message DecisionTreeTrainingConfig.Honest

decision_tree.proto:370

Used in: DecisionTreeTrainingConfig

message DecisionTreeTrainingConfig.Internal

decision_tree.proto:396

Used in: DecisionTreeTrainingConfig

enum DecisionTreeTrainingConfig.Internal.SortingStrategy

decision_tree.proto:399

How sorted values (non-discretized numerical values) are computed.

Used in: Internal

message DecisionTreeTrainingConfig.MHLDObliqueSplit

decision_tree.proto:290

Used in: DecisionTreeTrainingConfig

enum DecisionTreeTrainingConfig.MissingValuePolicy

decision_tree.proto:85

Method used to handle missing attribute values.

Used in: DecisionTreeTrainingConfig

message DecisionTreeTrainingConfig.NumericalVectorSequence

decision_tree.proto:380

Used in: DecisionTreeTrainingConfig

message DecisionTreeTrainingConfig.SparseObliqueSplit

decision_tree.proto:165

See "split_axis".

Used in: DecisionTreeTrainingConfig

message DecisionTreeTrainingConfig.SparseObliqueSplit.BinaryWeights

decision_tree.proto:242

Weights sampled in {-1, 1} (the default in "Sparse Projection Oblique Random Forests", Tomita et al., 2020).

Used in: SparseObliqueSplit

(message has no fields)

message DecisionTreeTrainingConfig.SparseObliqueSplit.ContinuousWeights

decision_tree.proto:246

Weights sampled in [-1, 1]. Consistently gives better quality models than binary weights.

Used in: SparseObliqueSplit

(message has no fields)

message DecisionTreeTrainingConfig.SparseObliqueSplit.IntegerWeights

decision_tree.proto:258

Weights sampled uniformly in the integer range [minimum, maximum].

Used in: SparseObliqueSplit

enum DecisionTreeTrainingConfig.SparseObliqueSplit.Normalization

decision_tree.proto:267

Used in: SparseObliqueSplit

message DecisionTreeTrainingConfig.SparseObliqueSplit.PowerOfTwoWeights

decision_tree.proto:252

Weights sampled uniformly in exponent space, i.e. the weights are of the form $s * 2^i$ with the integer exponent $i$ sampled uniformly in [min_exponent, max_exponent] and the sign $s$ sampled uniformly in {-1, 1}.

Used in: SparseObliqueSplit
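The weight messages above select how each coefficient of a sparse oblique projection is drawn. A minimal sketch of the four sampling schemes (function name, bound defaults, and scheme labels are illustrative, not the proto's field names):

```python
import random

def sample_oblique_weight(scheme, rng, minimum=-5, maximum=5,
                          min_exponent=-3, max_exponent=3):
    """Samples one projection weight under one of the schemes above."""
    if scheme == "binary":        # BinaryWeights: {-1, 1}
        return rng.choice([-1, 1])
    if scheme == "continuous":    # ContinuousWeights: uniform in [-1, 1]
        return rng.uniform(-1, 1)
    if scheme == "integer":       # IntegerWeights: uniform integer
        return rng.randint(minimum, maximum)
    if scheme == "power_of_two":  # PowerOfTwoWeights: s * 2^i
        s = rng.choice([-1, 1])
        i = rng.randint(min_exponent, max_exponent)
        return s * 2.0 ** i
    raise ValueError(f"unknown scheme: {scheme}")
```

Power-of-two weights keep the projection exactly representable in floating point, which can make the resulting conditions cheaper to store and evaluate.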

message DecisionTreeTrainingConfig.Uplift

decision_tree.proto:307

Used in: DecisionTreeTrainingConfig

enum DecisionTreeTrainingConfig.Uplift.EmptyBucketOrdering

decision_tree.proto:346

How to order buckets that have no values for one of the treatments. This parameter is used exclusively to sort buckets during the generation of some of the candidate splits, e.g. for categorical features with the CART splitter.

Used in: Uplift

enum DecisionTreeTrainingConfig.Uplift.SplitScore

decision_tree.proto:322

Splitter score, i.e. the score optimized by the splitters. Changing the splitter score will impact the trained model. The following scores are introduced in "Decision trees for uplift modeling with single and multiple treatments" (Rzepakowski et al.). Notation: p is the probability of the positive outcome (categorical outcome) or the average value of the outcome (numerical outcome) in the treatment group; q is the corresponding probability / average value in the control group.

Used in: Uplift
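For reference, the binary-outcome divergences defined in Rzepakowski et al. with this p/q notation are the following (which of them this enum exposes is not stated here, so treat the list as background rather than a field-by-field mapping):

```latex
D_{KL}(p \,\|\, q) = p \log\frac{p}{q} + (1-p)\log\frac{1-p}{1-q}
\qquad
D_{E}(p, q) = (p-q)^2 + \bigl((1-p)-(1-q)\bigr)^2
\qquad
D_{\chi^2}(p, q) = \frac{(p-q)^2}{q} + \frac{\bigl((1-p)-(1-q)\bigr)^2}{1-q}
```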

message GreedyForwardCategoricalSet

decision_tree.proto:507

Used in: DecisionTreeTrainingConfig

message GrowingStrategyGlobalBest

decision_tree.proto:597

Specifies the global best growing strategy.

Used in: DecisionTreeTrainingConfig

message GrowingStrategyLocalBest

decision_tree.proto:594

Specifies the local best growing strategy. No extra configuration needed.

Used in: DecisionTreeTrainingConfig

(message has no fields)
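The difference between the two strategies above: local best commits to every positive-gain split as nodes are visited (depth-first), while global best repeatedly splits the open leaf with the highest gain anywhere in the tree, typically under a budget on the number of nodes. A best-first sketch of the global strategy (the `expand` callback and dict-free node ids are hypothetical scaffolding):

```python
import heapq

def grow_global_best(root_gain, expand, max_leaves=8):
    """Best-first ("global best") growth sketch.

    `expand(leaf_id)` returns the two children created by splitting a
    leaf, as (child_id, gain) pairs; gain <= 0 means the child cannot
    be usefully split further.
    """
    heap = [(-root_gain, 0)]  # max-heap via negated gains
    opened = []               # leaves actually split, in order
    n_leaves = 1
    while heap and n_leaves < max_leaves:
        neg_gain, leaf = heapq.heappop(heap)
        if -neg_gain <= 0:
            break  # the best remaining split has no gain: stop globally
        opened.append(leaf)
        n_leaves += 1  # splitting turns one leaf into two
        for child, gain in expand(leaf):
            heapq.heappush(heap, (-gain, child))
    return opened
```

With the same candidates, the local strategy may spend its node budget on deep low-gain splits of one branch, whereas the global strategy spreads it across the most promising leaves.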

message LabelStatistics

decision_tree.proto:603

Statistics about the label values used to operate a splitter algorithm.

Used in: distributed_decision_tree.proto.Split, distributed_gradient_boosted_trees.proto.Checkpoint, distributed_gradient_boosted_trees.proto.WorkerRequest.SetInitialPredictions, distributed_gradient_boosted_trees.proto.WorkerResult.GetLabelStatistics, distributed_gradient_boosted_trees.proto.WorkerResult.StartNewIter

message LabelStatistics.Classification

decision_tree.proto:612

Used in: LabelStatistics

message LabelStatistics.Regression

decision_tree.proto:617

Used in: LabelStatistics

message LabelStatistics.RegressionWithHessian

decision_tree.proto:622

Used in: LabelStatistics
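The three variants above carry the sufficient statistics a splitter needs for its task. A minimal sketch of what such statistics contain (dictionary keys are illustrative, not the proto's field names):

```python
from collections import Counter

def classification_statistics(labels):
    """Classification label statistics: example count plus the
    per-class label distribution, as a splitter would consume them."""
    return {"num_examples": len(labels),
            "class_counts": dict(Counter(labels))}

def regression_statistics(labels):
    """Regression label statistics: count, sum, and sum of squares,
    from which a splitter derives the mean and variance of any
    candidate child node without rescanning the examples."""
    n = len(labels)
    s = sum(labels)
    ss = sum(y * y for y in labels)
    return {"count": n, "sum": s, "sum_squares": ss,
            "mean": s / n, "variance": ss / n - (s / n) ** 2}
```

The hessian variant extends the regression statistics with second-order terms, as used by gradient-boosted trees with hessian-aware losses.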

message Node

decision_tree.proto:202

Node in a decision tree (without the information about the children).

Used in: distributed_gradient_boosted_trees.proto.WorkerRequest.EndIter.Tree

message NodeAnomalyDetectionOutput

decision_tree.proto:77

Output of a node in an anomaly detection tree.

Next ID: 2

Used in: Node

message NodeClassifierOutput

decision_tree.proto:23

Output of a node in a classification tree.

Used in: Node

message NodeCondition

decision_tree.proto:179

Binary condition attached to a non-leaf node.

Used in: Node, distributed_decision_tree.proto.Split
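At inference time, each non-leaf node evaluates its attached binary condition and routes the example to the positive or negative child until a leaf's output is reached. A toy sketch using a Higher-style condition (the dict encoding is hypothetical, not the proto's wire structure):

```python
def predict(tree, example):
    """Routes an example down a toy decision tree and returns the
    value stored in the leaf it reaches."""
    node = tree
    while "condition" in node:
        feature, threshold = node["condition"]  # Higher: value >= threshold
        if example[feature] >= threshold:
            node = node["pos_child"]
        else:
            node = node["neg_child"]
    return node["value"]

# A depth-1 stump testing "x >= 0.5".
stump = {
    "condition": ("x", 0.5),
    "pos_child": {"value": 1.0},
    "neg_child": {"value": -1.0},
}
```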

message NodeRegressorOutput

decision_tree.proto:32

Output of a node in a regression tree.

Used in: Node

message NodeUpliftOutput

decision_tree.proto:49

Output of a node in an uplift tree with either binary categorical or numerical outcome. The fields have the same definition as the fields in the message "UpliftCategoricalLabelDistribution".

Used in: Node

message NumericalSplit

decision_tree.proto:475

How to find numerical splits.

Used in: DecisionTreeTrainingConfig

enum NumericalSplit.Type

decision_tree.proto:476

Used in: NumericalSplit