PornFlagData
Porn Flag · SafeSearch · GoogleApi.ContentWarehouse.V1.Model.PornFlagData
SEO Analysis
AI-generated analysis: This model relates to safe-content filtering and affects whether content appears in filtered search results. PornFlagData carries SEO-relevant attributes including finalOffensiveScore, finalViolenceScore, and finalViolenceScoreVersion. Key functionality includes the debugInfo field, which stores debug information from the overall classifier and allows, for instance, updating counters related to blacklisting without running the full classifier again.
Actionable Insights for SEOs
- Monitor for changes in rankings that may correlate with updates to this system
- Consider how your content strategy aligns with what this signal evaluates
Attributes
- debugInfo (list(GoogleApi.ContentWarehouse.V1.Model.ImagePornDebugInfo.t), default: nil): Stores debug information from the overall classifier. This allows, for instance, updating counters related to blacklisting without running the full classifier again.
- finalOffensiveScore (number(), default: nil): Final offensive score, based on image salient terms and image OCR vulgar and offensive scores.
- finalViolenceScore (number(), default: nil): Final violence score, based on several image signals (brain pixel score, co-clicked images violence score, navboost queries score, etc.).
- finalViolenceScoreVersion (String.t, default: nil): A string indicating the version of the SafeSearch classifier used to compute final_violence_score.
- internalSignals (GoogleApi.ContentWarehouse.V1.Model.SafesearchInternalImageSignals.t, default: nil): A proto that stores SafeSearch internal signals that are not exported to clients. The SafeSearch team does not provide any guarantees about the presence or the semantics of these signals in the future.
- numberFaces (integer(), default: nil): Number of faces.
- ocrAnnotation (GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOCRAnnotation.t, default: nil): Information about image OCR text. For details see image/safesearch/content/public/ocr_annotation.proto.
- offensiveSymbolDetection (GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOffensiveSymbolDetection.t, default: nil): QuimbyCongas-based detection of offensive symbols in the image (currently the swastika and the Nazi yellow badge).
- photodnaHash (String.t, default: nil): Binary version of the PhotoDNA hash (144 bytes long). If not set (has_photodna_hash() == false), the hash was not computed; if empty (has_photodna_hash() == true && photodna_hash() == ""), the computation failed (the hash cannot be computed for images smaller than 50 x 50). The three cases are distinguished in the sketch after this list.
- pornWithHighConfidence (boolean(), default: nil): Set to true when the classifier is quite confident that the image is porn (with higher precision than the img_porn_moderate restrict). In particular, the image might be demoted for non-porn queries even when SafeSearch is Off.
- qbstOffensiveScore (number(), default: nil): QBST-based image offensive score (Navboost-based).
- qbstSpoofScore (number(), default: nil): QBST-based image spoof score (Navboost-based); unrelated to the pixel-based score in PornAnnotation.
- queryStats (GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryStats.t, default: nil): Query statistics from Navboost logs. For more details see classifier/porn/proto/image_porn_classifier_signals.proto.
- queryTextViolenceScore (number(), default: nil): Aggregated Navboost query violence score.
- referer (String.t, default: nil): URL of the referer page.
- referrerCounts (GoogleApi.ContentWarehouse.V1.Model.ClassifierPornReferrerCounts.t, default: nil): Information about referrers and their porn classification. For details see classifier/porn/proto/image_porn_classifier_signals.proto.
- semanticSexualizationScore (number(), default: nil): Starburst-based score predicting the sexualization level of the image.
- url (String.t, default: nil): URL of the image.
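To make the shape of this model concrete, here is a minimal Elixir sketch. It assumes the google_api_content_warehouse Hex package (the source of the type names on this page), builds a PornFlagData struct with invented field values, and distinguishes the three photodnaHash states described above. The PhotoDnaStatus helper is hypothetical and assumes the proto's "not set" case surfaces as nil in the decoded struct; treat this as an illustration under those assumptions, not documented library usage.

```elixir
alias GoogleApi.ContentWarehouse.V1.Model.PornFlagData

# Hypothetical helper distinguishing the photodnaHash tri-state described in
# the attribute list. Assumption: "not set" surfaces as nil in the decoded
# struct and "set but empty" as "" (mirroring has_photodna_hash() semantics).
defmodule PhotoDnaStatus do
  def of(%PornFlagData{photodnaHash: nil}), do: :not_computed
  def of(%PornFlagData{photodnaHash: ""}), do: :computation_failed  # e.g. image under 50 x 50
  def of(%PornFlagData{photodnaHash: _hash}), do: :present
end

# All field values below are invented for illustration.
flag_data = %PornFlagData{
  finalOffensiveScore: 0.12,
  finalViolenceScore: 0.03,
  finalViolenceScoreVersion: "example-classifier-v1",  # hypothetical version string
  numberFaces: 2,
  pornWithHighConfidence: false,
  photodnaHash: nil,  # nil: the hash was never computed
  url: "https://example.com/image.jpg",
  referer: "https://example.com/gallery.html"
}

IO.inspect(PhotoDnaStatus.of(flag_data))  # => :not_computed
```

In practice these generated clients usually produce such structs by decoding an API response (the elixir-google-api clients follow the Poison.decode!(body, as: %PornFlagData{}) pattern) rather than building them by hand; the literal struct above is only for illustration.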