ClassifierPornQueryClassifierOutput

GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryClassifierOutput
SEO Analysis
Related to safe-content filtering; affects whether content appears in filtered search results. This model (Classifier Porn Query Classifier Output) contains SEO-relevant attributes, most notably score. Note that the csaiClassification field is only filled for the CSAI vertical.
Actionable Insights for SEOs
- Understanding this model helps SEOs grasp Google's internal data architecture
- Consider how this system might interact with other ranking signals
Attributes
- csaiClassification (type: String.t, default: nil) - This field is only filled for the CSAI vertical.
- debug (type: String.t, default: nil) - Human-readable debug information about the classification. This field is only set if output_debug is set in the classification input.
- isPositive (type: boolean(), default: nil) - The bit that shows if this classifier outputs a positive classification for the input query. Set by thresholding the score with a recommended threshold.
- score (type: number(), default: nil) - The score that the classifier assigned to the input query. This is filled by all verticals.
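To make the shape of this model concrete, here is a minimal sketch of parsing one of these classifier outputs from JSON. It assumes the wire format uses the camelCase field names listed above; the class name and example values are illustrative, not part of the documented API.

```python
import json
from dataclasses import dataclass
from typing import Optional


@dataclass
class PornQueryClassifierOutput:
    """Illustrative mirror of ClassifierPornQueryClassifierOutput."""
    score: Optional[float] = None              # filled by all verticals
    is_positive: Optional[bool] = None         # score thresholded with a recommended threshold
    csai_classification: Optional[str] = None  # only filled for the CSAI vertical
    debug: Optional[str] = None                # only set if output_debug was requested

    @classmethod
    def from_json(cls, raw: str) -> "PornQueryClassifierOutput":
        data = json.loads(raw)
        return cls(
            score=data.get("score"),
            is_positive=data.get("isPositive"),
            csai_classification=data.get("csaiClassification"),
            debug=data.get("debug"),
        )


# Hypothetical payload: only score and isPositive are present,
# so csaiClassification and debug stay nil (None).
raw = '{"score": 0.87, "isPositive": true}'
out = PornQueryClassifierOutput.from_json(raw)
print(out.score, out.is_positive, out.csai_classification)
```

All four fields are optional (default nil), so consuming code should treat each one as possibly absent rather than assuming every vertical fills every field.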