ImageSafesearchContentBrainPornAnnotation

Image SafeSearch

GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation

SEO Impact: 10 out of 10 (Critical)
Don't change the field names. The names are used as sparse feature labels in client projects.

SEO Analysis (AI Generated)

Related to safe content filtering, this model affects whether content appears in filtered search results. This model (ImageSafesearchContentBrainPornAnnotation) contains SEO-relevant attributes including childScore, csaiScore, and csamA1Score. For example, childScore estimates the probability that the youngest person in the image is a child.
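
For readers working with the Elixir client this model comes from, here is a minimal sketch of what a populated annotation struct might look like. All score values below are invented for illustration, and field access follows the standard conventions for generated structs in this library:

    # Hypothetical values; field names match the attributes listed below.
    annotation = %GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation{
      childScore: 0.02,
      pornScore: 0.91,
      racyScore: 0.76,
      version: "example-version"
    }

    # Scores are plain numeric fields on the struct.
    annotation.pornScore
    # => 0.91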

Actionable Insights for SEOs

  • Monitor for changes in rankings that may correlate with updates to this system
  • Consider how your content strategy aligns with what this signal evaluates

Attributes (14)
  • childScore (type: number(), default: nil) - The probability that the youngest person in the image is a child.
  • csaiScore (type: float(), default: nil) - This score correlates with potential child abuse. Google confidential!
  • csamA1Score (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
  • csamAgeIndeterminateScore (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
  • iuInappropriateScore (type: number(), default: nil) - The probability that an image is inappropriate for Images Universal, according to this policy: go/iupolicy.
  • medicalScore (type: number(), default: nil)
  • pedoScore (type: number(), default: nil)
  • pornScore (type: float(), default: nil)
  • racyScore (type: number(), default: nil) - This score relates to an image being sexually suggestive.
  • semanticSexualizationScore (type: number(), default: nil) - This score relates to racy/sexual images; scores carry semantic meaning from 0 to 1.
  • spoofScore (type: number(), default: nil)
  • version (type: String.t, default: nil)
  • violenceScore (type: number(), default: nil)
  • ytPornScore (type: number(), default: nil) - Deprecated; use porn_score instead. The most recent model version no longer produces this field.
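
To make the deprecation note concrete, here is a hedged Elixir sketch of how a consumer might fold these scores into a filtered-results decision. The module name SafeSearchCheck and both thresholds are invented for illustration; Google's actual cutoffs and combination logic are not public.

    defmodule SafeSearchCheck do
      alias GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation,
        as: Annotation

      # Illustrative thresholds only; not Google's real values.
      @porn_threshold 0.8
      @racy_threshold 0.7

      # Returns true when the annotation suggests excluding the image
      # from filtered (SafeSearch-on) results.
      def filtered?(%Annotation{} = ann) do
        # ytPornScore is deprecated; fall back to it only when pornScore is nil.
        porn = ann.pornScore || ann.ytPornScore || 0.0
        racy = ann.racyScore || 0.0

        porn >= @porn_threshold or racy >= @racy_threshold
      end
    end

Pattern matching on the struct keeps the check explicit about which model produced the scores, and the || fallback mirrors the documented migration from ytPornScore to pornScore.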