React Native Firebase
@react-native-firebase/ai
Enumeration HarmCategory
Harm categories that would cause prompts or candidates to be blocked.
Enumeration Members

HARM_CATEGORY_DANGEROUS_CONTENT: "HARM_CATEGORY_DANGEROUS_CONTENT"

HARM_CATEGORY_HARASSMENT: "HARM_CATEGORY_HARASSMENT"

HARM_CATEGORY_HATE_SPEECH: "HARM_CATEGORY_HATE_SPEECH"

HARM_CATEGORY_SEXUALLY_EXPLICIT: "HARM_CATEGORY_SEXUALLY_EXPLICIT"
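Since each member is a plain string literal, the enum is typically used to label entries in a safety-settings list passed when constructing a generative model. The sketch below mirrors the enum locally so it is self-contained; in an app you would import it from `@react-native-firebase/ai` instead, and the `threshold` values shown are assumptions not documented on this page:

```typescript
// Local mirror of the HarmCategory members listed above, so this snippet
// runs without the package installed. In an app, prefer:
//   import { HarmCategory } from '@react-native-firebase/ai';
const HarmCategory = {
  HARM_CATEGORY_DANGEROUS_CONTENT: 'HARM_CATEGORY_DANGEROUS_CONTENT',
  HARM_CATEGORY_HARASSMENT: 'HARM_CATEGORY_HARASSMENT',
  HARM_CATEGORY_HATE_SPEECH: 'HARM_CATEGORY_HATE_SPEECH',
  HARM_CATEGORY_SEXUALLY_EXPLICIT: 'HARM_CATEGORY_SEXUALLY_EXPLICIT',
} as const;

// Hypothetical safety-settings array: each entry pairs a harm category with
// a blocking threshold (the threshold strings are illustrative assumptions).
const safetySettings = [
  { category: HarmCategory.HARM_CATEGORY_HARASSMENT, threshold: 'BLOCK_MEDIUM_AND_ABOVE' },
  { category: HarmCategory.HARM_CATEGORY_HATE_SPEECH, threshold: 'BLOCK_LOW_AND_ABOVE' },
];
```

A prompt or candidate matching one of these categories at or above the configured threshold would then be blocked, per the description above.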