PornFlagData


GoogleApi.ContentWarehouse.V1.Model.PornFlagData

SEO Impact: 7 out of 10 (High)
A protocol buffer that stores the URL, referrer, and porn flag for a URL, plus an optional image score. Next available tag id: 51.
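As an illustration of the structure described above, here is a sketch of a subset of the model's attributes as a Python dataclass. The real model is an Elixir struct in the GoogleApi.ContentWarehouse package; the dataclass form, and the subset of fields chosen, are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: the real model is the Elixir struct
# GoogleApi.ContentWarehouse.V1.Model.PornFlagData. Field names below
# mirror the attributes documented on this page; all default to
# nil/None, matching the documented defaults.
@dataclass
class PornFlagData:
    url: Optional[str] = None                   # URL of the image
    referer: Optional[str] = None               # URL of the referrer page
    finalOffensiveScore: Optional[float] = None
    finalViolenceScore: Optional[float] = None
    finalViolenceScoreVersion: Optional[str] = None
    numberFaces: Optional[int] = None
    photodnaHash: Optional[bytes] = None        # 144-byte binary PhotoDNA hash
    pornWithHighConfidence: Optional[bool] = None

# Example: only the fields you set diverge from the nil/None default.
flag = PornFlagData(url="https://example.com/img.jpg", numberFaces=2)
```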

SEO Analysis

AI Generated

Related to safe content filtering; it affects whether content appears in filtered search results. This model (PornFlagData) contains SEO-relevant attributes including finalOffensiveScore, finalViolenceScore, and finalViolenceScoreVersion. Key functionality includes: DebugInfo stores debug information from the overall classifier, which allows, for instance, counters related to blacklisting to be updated without running the full classifier again.

Actionable Insights for SEOs

  • Monitor for ranking changes that may correlate with updates to this system.
  • Consider how your content strategy aligns with what this signal evaluates.

Attributes

18 attributes

debugInfo
Default: nil. Full type: list(GoogleApi.ContentWarehouse.V1.Model.ImagePornDebugInfo.t)

DebugInfo stores debug information from the overall classifier. This allows for instance to update counters related to blacklisting without running the full classifier again.

finalOffensiveScore
Default: nil. Full type: number()

Final offensive score based on image salient terms and image OCR vulgar and offensive scores.

finalViolenceScore
Default: nil. Full type: number()

Final violence score based on some image signals (brain pixel score, co-clicked images violence score, navboost queries score, etc.).

finalViolenceScoreVersion
Default: nil. Full type: String.t

A string that indicates the version of SafeSearch classifier used to compute final_violence_score.

internalSignals
Default: nil. Full type: GoogleApi.ContentWarehouse.V1.Model.SafesearchInternalImageSignals.t

A proto that stores SafeSearch internal signals that are not exported to clients. SafeSearch team does not provide any guarantees about the presence or the semantics of these signals in the future.

numberFaces
Default: nil. Full type: integer()

Number of faces detected in the image.

ocrAnnotation
Default: nil. Full type: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOCRAnnotation.t

Information about image OCR text. For details see image/safesearch/content/public/ocr_annotation.proto.

offensiveSymbolDetection
Default: nil. Full type: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOffensiveSymbolDetection.t

QuimbyCongas-based detection of offensive symbols in the image (currently swastika and Nazi yellow badge).

photodnaHash
Default: nil. Full type: String.t

Binary version of the PhotoDNA hash (144 bytes long). If not set (has_photodna_hash() == false) it means that it was not computed, if empty (has_photodna_hash() == true && photodna_hash() == "") it means that the computation failed (cannot be computed for images smaller than 50 x 50).
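The three states described above (unset, empty, and populated) can be made concrete with a small helper. This function is purely illustrative and is not part of the API; it just restates the documented semantics in code.

```python
def photodna_status(photodna_hash):
    """Interpret the documented photodnaHash semantics (illustrative only).

    None       -> the hash was never computed (has_photodna_hash() == false)
    empty      -> computation failed, e.g. image smaller than 50 x 50
                  (has_photodna_hash() == true && photodna_hash() == "")
    otherwise  -> the 144-byte binary PhotoDNA hash is present
    """
    if photodna_hash is None:
        return "not computed"
    if photodna_hash == b"":
        return "computation failed"
    return "computed"
```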

pornWithHighConfidence
Default: nil. Full type: boolean()

This field is set to true when we are pretty confident that the image is porn (with higher precision than the img_porn_moderate restrict). In particular, it means that the image might be demoted for non-porn queries when SafeSearch is Off.
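The demotion condition described above can be sketched as a boolean predicate. This is not Google's actual ranking logic, only a hedged restatement of the field's documented description; the function name and parameters are hypothetical.

```python
def may_demote_for_query(porn_with_high_confidence, query_is_porn_seeking, safesearch_off):
    """Hypothetical illustration of the documented behavior: with SafeSearch
    Off, an image flagged porn-with-high-confidence may still be demoted for
    queries that are not porn-seeking."""
    return bool(porn_with_high_confidence) and safesearch_off and not query_is_porn_seeking
```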

qbstOffensiveScore
Default: nil. Full type: number()

QBST-based image offensive score (Navboost-based).

qbstSpoofScore
Default: nil. Full type: number()

QBST-based image spoof score (Navboost-based), unrelated to the pixel-based score in PornAnnotation.

queryStats
Default: nil. Full type: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryStats.t

Query statistics from Navboost logs. For more details see classifier/porn/proto/image_porn_classifier_signals.proto.

queryTextViolenceScore
Default: nil. Full type: number()

Aggregated Navboost query violence score.

referer
Default: nil. Full type: String.t

URL of the referring page.

referrerCounts
Default: nil. Full type: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornReferrerCounts.t

Information about referrers and their porn classification. For details see classifier/porn/proto/image_porn_classifier_signals.proto.

semanticSexualizationScore
Default: nil. Full type: number()

Starburst-based score predicting sexualization level of the image.

url
Default: nil. Full type: String.t

URL of the image.