Image Moderation

Image Moderation detects and recognizes inappropriate and unsafe content in images. The criteria include suggestive or explicit racy content, nudity, violence, gore, bloodshed, and the presence of weapons and drugs.

You can learn more from the Image Moderation help page.

You can provide a .jpg/.jpeg or .png file as the input. Refer to the API documentation for the request and response formats.

You can optionally set the moderation mode to BASIC, MODERATE, or ADVANCED. The image is processed in the ADVANCED mode by default.

The response returns the probability of each criterion along with its confidence score, and a final prediction of whether the image is safe_to_use or unsafe_to_use.

Ensure the following packages are imported:

import java.io.File;
import java.util.List;

import com.zc.component.ml.ZCAnalyseMode;
import com.zc.component.ml.ZCImageModerateData;
import com.zc.component.ml.ZCImageModerationConfidence;
import com.zc.component.ml.ZCImageModerationOptions;
import com.zc.component.ml.ZCImageModerationPrediction;
import com.zc.component.ml.ZCML;
File file = new File("{filePath}"); //Specify the file path
ZCImageModerationOptions options = ZCImageModerationOptions.getInstance().setAnalyseMode(ZCAnalyseMode.ADVANCED); //Set the moderation mode
ZCImageModerateData imData = ZCML.getInstance().moderateImage(file, options); //Call moderateImage() with the input file and options
ZCImageModerationPrediction prediction = imData.getPrediction(); //To get the final prediction
Double predictionConfidence = imData.getConfidence(); //To get the confidence score of the final prediction
List<ZCImageModerationConfidence> confidences = imData.getImageModerationConfidenceList(); //To get the confidence score of each predicted criterion
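A common follow-up to the steps above is to act on the per-criterion confidence scores yourself, for example flagging an image when any unsafe criterion exceeds a threshold. The sketch below shows only that post-processing logic; the criterion names, scores, and the 0.8 threshold are illustrative assumptions, not values defined by the SDK, and in practice the scores would come from getImageModerationConfidenceList().

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ModerationCheck {

    //Returns true when any criterion's confidence score exceeds the threshold
    static boolean isUnsafe(Map<String, Double> confidences, double threshold) {
        for (Map.Entry<String, Double> entry : confidences.entrySet()) {
            if (entry.getValue() > threshold) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        //Illustrative scores; replace with values read from the moderation response
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("racy", 0.12);
        scores.put("nudity", 0.03);
        scores.put("violence", 0.85);
        scores.put("weapon", 0.40);

        System.out.println(isUnsafe(scores, 0.8) ? "unsafe_to_use" : "safe_to_use");
    }
}
```

You could also rely solely on the prediction returned by getPrediction(); a manual threshold like this is only useful when your application needs stricter or looser rules than the default verdict.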