React Native Firebase

    If the prompt was blocked, this will be populated with blockReason and the relevant safetyRatings.

    interface PromptFeedback {
        blockReason?: BlockReason;
        blockReasonMessage?: string;
        safetyRatings: SafetyRating[];
    }

    Properties

    blockReason?: BlockReason
    blockReasonMessage?: string

    A human-readable description of the blockReason.

    This property is only supported in the Vertex AI Gemini API (VertexAIBackend).

    safetyRatings: SafetyRating[]
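
A minimal sketch of how a `PromptFeedback` value might be inspected after a generation call. The `BlockReason` and `SafetyRating` shapes below are simplified assumptions for illustration; in your app they come from the SDK, and `blockReasonMessage` is only populated by the Vertex AI Gemini API (`VertexAIBackend`).

```typescript
// Simplified stand-ins for the SDK types (assumption, for illustration only).
type BlockReason = 'SAFETY' | 'OTHER';

interface SafetyRating {
  category: string;
  probability: string;
}

interface PromptFeedback {
  blockReason?: BlockReason;
  blockReasonMessage?: string; // Vertex AI Gemini API (VertexAIBackend) only
  safetyRatings: SafetyRating[];
}

// Returns a human-readable summary if the prompt was blocked, else null.
// `describeBlock` is a hypothetical helper, not part of the SDK.
function describeBlock(feedback?: PromptFeedback): string | null {
  if (!feedback?.blockReason) {
    return null; // prompt was not blocked (or no feedback was returned)
  }
  const detail = feedback.blockReasonMessage ?? 'no message';
  const ratings = feedback.safetyRatings
    .map((r) => `${r.category}=${r.probability}`)
    .join(', ');
  return `Blocked (${feedback.blockReason}: ${detail}) [${ratings}]`;
}
```

In practice you would pass `response.promptFeedback` from a `generateContent` result into such a helper, falling back gracefully when `blockReasonMessage` is absent (for example, when not using `VertexAIBackend`).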