Create or edit an Image Analysis profile
Select Create New to open the Create Image Analysis Profile window.
To open the Edit Image Analysis Profile window, select a profile and then click Edit.
Configure the following settings and then click OK:
Name
Enter a name for this profile.

Comments
Optional description of the profile.

Image Skip Size
Enter a value between 0 and 2,048. Images smaller than this size, in kilobytes, are skipped by the image scan unit. Images that are too small are difficult to scan and are more likely to be rated incorrectly by the image scan engine. The default value is 1.

Image Skip Width
Images narrower than this width, in pixels, are skipped by the image scan unit. Images that are too small are difficult to scan and are more likely to be rated incorrectly by the image scan engine. The default value is 30 pixels; the minimum value is 5 pixels.

Image Skip Height
Images shorter than this height, in pixels, are skipped by the image scan unit. Images that are too small are difficult to scan and are more likely to be rated incorrectly by the image scan engine. The default value is 30 pixels; the minimum value is 5 pixels.

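The three skip settings can be pictured as a single pre-scan filter. The following sketch is purely illustrative (the function, its name, and its exact comparison logic are assumptions, not FortiGate source code); it only shows how a size floor and pixel floors could combine so that tiny images such as tracking pixels bypass the scan engine.

```python
# Hypothetical sketch of the pre-scan skip filter described above.
# The function and its strict "smaller than" comparisons are assumptions
# for illustration; they are not taken from the product's implementation.
def should_scan(size_kb: float, width_px: int, height_px: int,
                skip_size_kb: float = 1,      # Image Skip Size default
                skip_width: int = 30,          # Image Skip Width default
                skip_height: int = 30) -> bool:  # Image Skip Height default
    """Return True if the image is large enough to be worth rating."""
    if size_kb < skip_size_kb:
        return False  # too few bytes to rate reliably
    if width_px < skip_width or height_px < skip_height:
        return False  # too few pixels to rate reliably
    return True

print(should_scan(0.5, 16, 16))    # a 16x16 tracking pixel is skipped
print(should_scan(250, 640, 480))  # a normal photo is scanned
```

With the defaults, anything under 1 KB or smaller than 30x30 pixels never reaches the scan engine, which is why the minimums exist at all: ratings on such images would mostly be noise.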
Rating Error Action
Select whether to Pass or Block an image that exceeds the rating threshold. The default is Pass.

Replace Image
Select a replacement image. To specify the replacement image, go to System > Replacement Messages and select Manage Images.

Log Option
Select All to log all content, or Violation to log only content that exceeds any of the strictness levels.

Saving Blocked Images
Enable to save blocked images.

Block Strictness Level
For each category, select whether to Allow, Deny, or Monitor content that exceeds the strictness level, and set the level between 0 and 100. The higher the image score, the more likely the image is to be explicit. The challenge with this setting is balance: if you set the level too low, legitimate images may be blocked; if you set it too high, explicit images may be allowed through. If the image score is above this setting, the Rating Error Action is taken. The default value is 30.
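The threshold logic above can be sketched in a few lines. This is a hypothetical illustration only (the function and the example scores are assumptions, not product code); it shows how a 0-100 score compares against the configured level of 30.

```python
# Hypothetical sketch of the strictness-level comparison described above.
# Neither the function nor the scores come from FortiGate; they only
# illustrate the "score above level triggers the action" rule.
STRICTNESS_LEVEL = 30  # the GUI default

def exceeds_level(image_score: int, level: int = STRICTNESS_LEVEL) -> bool:
    """True when the score is above the level, so the configured action applies."""
    return image_score > level

print(exceeds_level(12))  # low score: image is passed through
print(exceeds_level(85))  # high score: the configured action is taken
```

The trade-off follows directly from the comparison: lowering the level makes more images exceed it (more false positives), while raising it lets higher-scoring images through untouched.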
|
|
Alcohol
The alcohol category is designed to identify images containing alcoholic brands and beverages, people drinking alcohol, frat parties, keg stands, bars and nightclubs, party aftermaths, shots, beer pong, kegs, and plastic cups associated with drinking.

Drugs
The drugs category is designed to identify images containing illegal and legal drugs, drug use, drug paraphernalia, and plants and symbols relating to drugs.

Extremism
The extremism category is designed to identify images containing terrorist militants, beheadings, executions, propaganda, acts of terrorism, KKK rallies, Hitler, insignia related to Nazism, the KKK, and ISIS, and white supremacy icons.

Gambling
The gambling category is designed to identify images containing gambling.

Gore
The gore, or graphic violence, category is designed to identify images containing gore, graphic violence, self-harm, suicide, horrific imagery, bloody wounds, accident victims, shooting victims, beatings, mutilation, decapitation, and images that contain blood and guts.

Porn
The pornography category is designed to identify images and videos containing commercial pornography, amateur pornography, sexting selfies, nudity, sex acts, grayscale pornographic images, sexually explicit cartoons, and manga.

Swim Underwear
The swim and underwear, or risqué, category is designed to identify images containing people wearing swimwear or beachwear, underwear, and lingerie.

Weapons
The weapons category is designed to identify images containing rifles, machine guns, handguns, grenade launchers, swords, knives, and people holding handheld weapons.

Alternatively, use the config image-analyzer profile command to configure an image analysis profile in the CLI.
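A minimal CLI sketch follows. Only the config image-analyzer profile command comes from this page; the profile name and the set line are illustrative assumptions, and the edit/next/end scaffolding is the standard FortiOS table-editing pattern. Use the CLI's set ? help inside the profile to list the real option keywords.

```
config image-analyzer profile
    edit "example-profile"               # hypothetical profile name
        # The keyword below is an illustrative guess, not verified syntax;
        # type "set ?" at this prompt to see the actual settings.
        set comment "Image analysis example"
    next
end
```

Settings configured this way correspond to the GUI fields described above; verify each keyword against the CLI help before scripting it.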