ClassifierPornQueryClassifierOutput

GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryClassifierOutput

SEO Impact: 6 out of 10 (Medium). Generic output for one vertical.

SEO Analysis

AI Generated

Related to safe-content filtering: this model affects whether content appears in filtered search results. ClassifierPornQueryClassifierOutput carries SEO-relevant attributes, most notably score, which is filled by all verticals; the csaiClassification field is only filled for the CSAI vertical.

Actionable Insights for SEOs

  • Understanding this model helps SEOs grasp how Google's safe-search layer classifies queries internally
  • Consider how this classifier's score and positive/negative verdict might interact with other ranking signals

Attributes (4)
csaiClassification (string)
Default: nil. Full type: String.t

This field is only filled for the CSAI vertical.

debug (string)
Default: nil. Full type: String.t

Human-readable debug information about the classification. This field is only set if output_debug is set in the classification input.

isPositive (boolean)
Default: nil. Full type: boolean()

A flag indicating whether the classifier produced a positive classification for the input query, set by thresholding the score against a recommended threshold.

score (number)
Default: nil. Full type: number()

The score that the classifier assigned to the input query. This is filled by all verticals.
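To make the attribute list concrete, here is a minimal, hypothetical sketch of what a decoded struct from the `google_api_content_warehouse` Elixir package might look like. The field names come from the attribute list above; the values are invented for illustration only.

```elixir
# Hypothetical example values; only `score` is guaranteed to be filled
# for every vertical, per the descriptions above.
output = %GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryClassifierOutput{
  score: 0.92,              # assigned by the classifier; filled by all verticals
  isPositive: true,         # score thresholded against a recommended threshold
  csaiClassification: nil,  # only filled for the CSAI vertical
  debug: nil                # only set if output_debug was set in the input
}
```

Reading `output.isPositive` alongside `output.score` is how a consumer would distinguish a borderline query from a confidently classified one.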