ImageSafesearchContentBrainPornAnnotation
GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation
SEO Analysis
AI Generated: This model relates to safe content filtering and affects whether content appears in filtered search results. ImageSafesearchContentBrainPornAnnotation (Image Safesearch Content Brain Porn Annotation) exposes SEO-relevant attributes including childScore, csaiScore, and csamA1Score. Key functionality includes estimating the probability that the youngest person in the image is a child.
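For orientation, here is a minimal sketch of what a decoded annotation struct might look like when using the Elixir client. The field values below are purely hypothetical; only the module name and field names come from this page.

```elixir
# Hypothetical example values; the struct itself is defined by the
# google_api_content_warehouse Hex package.
alias GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation

annotation = %ImageSafesearchContentBrainPornAnnotation{
  childScore: 0.02,
  pornScore: 0.01,
  racyScore: 0.35,
  violenceScore: 0.0,
  version: "example-version"
}
```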
Actionable Insights for SEOs
- Monitor for changes in rankings that may correlate with updates to this system
- Consider how your content strategy aligns with what this signal evaluates
Attributes
The model exposes 14 attributes:

- childScore (type: number(), default: nil) - The probability that the youngest person in the image is a child.
- csaiScore (type: float(), default: nil) - This score correlates with potential child abuse. Google confidential!
- csamA1Score (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
- csamAgeIndeterminateScore (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
- iuInappropriateScore (type: number(), default: nil) - This field contains the probability that an image is inappropriate for Images Universal, according to this policy: go/iupolicy.
- medicalScore (type: number(), default: nil)
- pedoScore (type: number(), default: nil)
- pornScore (type: float(), default: nil)
- racyScore (type: number(), default: nil) - This score is related to an image being sexually suggestive.
- semanticSexualizationScore (type: number(), default: nil) - This score is related to racy/sexual images where scores have semantic meaning from 0 to 1.
- spoofScore (type: number(), default: nil)
- version (type: String.t, default: nil)
- violenceScore (type: number(), default: nil)
- ytPornScore (type: number(), default: nil) - Deprecated, use porn_score instead. The most recent model version does not produce this anymore.
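As a concrete illustration of how these fields might be consumed, here is a minimal Elixir sketch that buckets an annotation by its scores. The SafesearchCheck module and the threshold values are hypothetical assumptions for illustration, not anything Google or the library defines; the field names come from the attribute list above.

```elixir
defmodule SafesearchCheck do
  # Hypothetical helper module; the thresholds below are illustrative,
  # not values used by Google's SafeSearch.
  alias GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation

  @porn_threshold 0.8
  @racy_threshold 0.5

  # Buckets an annotation into a coarse label. All score fields
  # default to nil, so missing scores are treated as 0.0.
  def classify(%ImageSafesearchContentBrainPornAnnotation{} = a) do
    cond do
      (a.pornScore || 0.0) >= @porn_threshold -> :explicit
      (a.racyScore || 0.0) >= @racy_threshold -> :racy
      true -> :clean
    end
  end
end
```

Note that the sketch branches on pornScore rather than ytPornScore, since the latter is deprecated and no longer produced by the most recent model version.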