Package com.google.genai.types
Class SafetySetting
java.lang.Object
com.google.genai.JsonSerializable
com.google.genai.types.SafetySetting
A safety setting that affects the safety-blocking behavior. A SafetySetting consists of a harm
category and a threshold for that category.
Nested Class Summary
Nested Classes
Field Summary
Fields inherited from class com.google.genai.JsonSerializable
MAX_READ_LENGTH_PROPERTY
Constructor Summary
Constructors
Method Summary
Modifier and Type	Method	Description
static SafetySetting.Builder	builder()	Instantiates a builder for SafetySetting.
abstract Optional<HarmCategory>	category()	Required.
static SafetySetting	fromJson(String jsonString)	Deserializes a JSON string to a SafetySetting object.
abstract Optional<HarmBlockMethod>	method()	Optional.
abstract Optional<HarmBlockThreshold>	threshold()	Required.
abstract SafetySetting.Builder	toBuilder()	Creates a builder with the same values as this instance.
Methods inherited from class com.google.genai.JsonSerializable
fromJsonNode, fromJsonString, objectMapper, setMaxReadLength, stringToJsonNode, toJson, toJsonNode, toJsonString
Constructor Details
SafetySetting
public SafetySetting()
Method Details
category
public abstract Optional<HarmCategory> category()
Required. The harm category to be blocked.
method
public abstract Optional<HarmBlockMethod> method()
Optional. The method for blocking content. If not specified, the default behavior is to use the probability score. This field is not supported by the Gemini API.
threshold
public abstract Optional<HarmBlockThreshold> threshold()
Required. The threshold for blocking content. If the harm probability exceeds this threshold, the content is blocked.
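The fields above are typically set through the builder rather than the no-arg constructor. A minimal sketch, assuming the builder setters mirror the getter names (category, threshold) and accept the String spellings of the corresponding enum values (the String overloads and the value names HARM_CATEGORY_HATE_SPEECH and BLOCK_LOW_AND_ABOVE are assumptions, not confirmed by this page):

```java
import com.google.genai.types.SafetySetting;

public class SafetySettingExample {
  public static void main(String[] args) {
    // Assumed builder setters: category(...) and threshold(...),
    // named after the getters documented above.
    SafetySetting setting =
        SafetySetting.builder()
            .category("HARM_CATEGORY_HATE_SPEECH")
            .threshold("BLOCK_LOW_AND_ABOVE")
            .build();

    // category() and threshold() return Optionals per the method summary.
    System.out.println(setting.category().isPresent());
    System.out.println(setting.threshold().isPresent());
  }
}
```

Because method() is optional and unsupported by the Gemini API, it is left unset here; the service then defaults to the probability score.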
builder
public static SafetySetting.Builder builder()
Instantiates a builder for SafetySetting.
toBuilder
public abstract SafetySetting.Builder toBuilder()
Creates a builder with the same values as this instance.
fromJson
public static SafetySetting fromJson(String jsonString)
Deserializes a JSON string to a SafetySetting object.
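A sketch of a JSON round trip using fromJson together with toJson inherited from JsonSerializable. The lower-camel-case field names in the JSON string are an assumption about the wire format, not confirmed by this page:

```java
import com.google.genai.types.SafetySetting;

public class SafetySettingJsonExample {
  public static void main(String[] args) {
    // Assumed JSON field names matching the accessor names above.
    String json =
        "{\"category\": \"HARM_CATEGORY_HARASSMENT\","
            + " \"threshold\": \"BLOCK_MEDIUM_AND_ABOVE\"}";

    // Deserialize, then serialize back via the inherited toJson().
    SafetySetting setting = SafetySetting.fromJson(json);
    System.out.println(setting.toJson());
  }
}
```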