FB4D Reference: IGeminiAIRequest
This interface is used for more complex requests to Gemini AI that go beyond simple text prompts. However, it can also be used to build a simple text request with the following function:
function Prompt(const PromptText: string): IGeminiAIRequest;
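A minimal sketch of a simple text request built with this interface. The unit names and the class name TGeminiAIRequest with its parameterless constructor are assumptions here; consult the FB4D source for the actual request factory. Sending the request to Gemini AI is not shown on this page.
uses
  FB4D.Interfaces, // declares IGeminiAIRequest
  FB4D.GeminiAI;   // assumed unit of the concrete request class

var
  Request: IGeminiAIRequest;
begin
  // Assumed factory -- replace with the constructor your FB4D version provides.
  Request := TGeminiAIRequest.Create;
  // Prompt returns the request interface, so further configuration can be chained.
  Request.Prompt('Summarize the main features of the FB4D library in three sentences.');
end;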
For a prompt that refers to a media file (document, picture, video, or audio), the following method can be used:
function PromptWithMediaData(const PromptText, MimeType: string; MediaStream: TStream): IGeminiAIRequest;
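A sketch of a prompt that refers to a PDF document. Request stands for an already created IGeminiAIRequest, and the file name is illustrative. Whether the stream may be freed immediately after the call depends on the implementation; the sketch assumes its content is copied into the request.
uses
  System.Classes, System.SysUtils;

var
  PdfStream: TFileStream;
begin
  PdfStream := TFileStream.Create('Invoice.pdf', fmOpenRead or fmShareDenyWrite);
  try
    // The MIME type must match the content of the stream.
    Request.PromptWithMediaData('Extract the invoice total and the due date.',
      'application/pdf', PdfStream);
  finally
    PdfStream.Free;
  end;
end;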
For applications based on FireMonkey or the FGX framework, the following method can be used for images; it extracts the MIME type automatically from the media stream. Keep in mind that this works for images only; PDF, video, and audio requests need to be constructed with the method above.
function PromptWithImgData(const PromptText: string; ImgStream: TStream): IGeminiAIRequest;
Note: VCL does not offer this functionality; therefore, you always need to use the method PromptWithMediaData.
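A FireMonkey sketch that attaches the bitmap of a TImage control. Image1 is illustrative; SaveToStream writes the FMX bitmap as PNG by default, from which the MIME type is detected.
uses
  System.Classes, FMX.Objects, FMX.Graphics;

var
  ImgStream: TMemoryStream;
begin
  ImgStream := TMemoryStream.Create;
  try
    Image1.Bitmap.SaveToStream(ImgStream); // FMX saves the bitmap as PNG by default
    ImgStream.Position := 0;
    Request.PromptWithImgData('Describe what is visible in this picture.', ImgStream);
  finally
    ImgStream.Free;
  end;
end;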
The following method sets the model parameters for the request.
function ModelParameter(Temperatur, TopP: double; MaxOutputTokens, TopK: cardinal): IGeminiAIRequest;
Temperatur controls the randomness of the generated text, TopP controls its diversity, MaxOutputTokens limits the maximum number of tokens in the generated text, and TopK controls the number of candidate tokens considered during generation.
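For example, a request that favors deterministic, compact answers could be configured as follows (the values are illustrative only):
// Low Temperatur and TopP for predictable output, at most 256 output tokens,
// and only the 20 most likely candidate tokens considered at each step.
Request.ModelParameter(0.2, 0.8, 256, 20);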
The following method sets stop words or stop sequences. Stop sequences cause Gemini AI to stop text generation as soon as such a word or sequence of words is generated.
function SetStopSequences(StopSequences: TStrings): IGeminiAIRequest;
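A sketch that stops generation as soon as the model emits the text 'END OF REPORT'. The sketch assumes the request copies the sequences, so the list can be freed afterwards.
uses
  System.Classes;

var
  Stops: TStringList;
begin
  Stops := TStringList.Create;
  try
    Stops.Add('END OF REPORT');
    Request.SetStopSequences(Stops);
  finally
    Stops.Free;
  end;
end;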
The following method sets the safety settings for one category of harm. Call it on the same request interface for every category that needs to be handled differently from the model's default.
function SetSafety(HarmCat: THarmCategory; LevelToBlock: TSafetyBlockLevel): IGeminiAIRequest;
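Because the method returns the request interface, several categories can be adjusted by chaining calls. The enumeration values below are illustrative; use the actual THarmCategory and TSafetyBlockLevel identifiers declared in FB4D.
// Illustrative identifiers -- check FB4D.Interfaces for the exact enumeration names.
Request.SetSafety(hcHateSpeech, sblOnlyHigh)
       .SetSafety(hcDangerousContent, sblLowAndAbove);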
When using chats, add the model's answer to the next request with the following method.
procedure AddAnswerForNextRequest(const ResultAsMarkDown: string);
In chats, after adding the last answer from the model, the next question from the user can be added with the following method.
procedure AddQuestionForNextRequest(const PromptText: string);
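A sketch of one chat turn: the model's previous answer (as Markdown, taken from the last response) and the user's next question are appended before the request is sent again. LastAnswerMarkDown is an illustrative variable.
// Continue an existing chat on the same request object.
Request.AddAnswerForNextRequest(LastAnswerMarkDown);
Request.AddQuestionForNextRequest('And how does this compare to the previous quarter?');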
Within a chat, the following function provides a working request for calculating the number of tokens in the prompt.
function CloneWithoutCfgAndSettings(Request: IGeminiAIRequest): IGeminiAIRequest;
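A sketch that clones the running chat request so the token count of the prompt can be determined without touching the original configuration. ChatRequest stands for the running chat request; the exact calling pattern and the subsequent token-count call are assumptions, so check the FB4D source and the IGeminiAI reference.
var
  CountingRequest: IGeminiAIRequest;
begin
  // The clone carries the prompt history, but no model parameters or safety settings.
  CountingRequest := ChatRequest.CloneWithoutCfgAndSettings(ChatRequest);
  // Pass CountingRequest to the token-counting method of the Gemini AI interface.
end;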
You can use the following method to instruct Gemini AI to respond only with a JSON object that conforms to a supplied JSON schema. This is suitable in cases where you need structured data from queries that will be processed further in code.
function SetJSONResponseSchema(Schema: IGeminiSchema): IGeminiAIRequest;
To learn how to declare the JSON schema, see the page IGeminiSchema.
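A sketch that requests structured output. The construction of the schema is omitted here; build it as described on the IGeminiSchema page.
var
  Schema: IGeminiSchema;
begin
  // Schema := ...;  built as described on the IGeminiSchema page
  Request.Prompt('List the three largest cities of Switzerland with their population.')
         .SetJSONResponseSchema(Schema);
end;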
Have you discovered an error? Or is something unclear? Please let us know in the discussion forum.
Schneider Infosystems Ltd. CH-6340 Baar, Switzerland, www.schneider-infosys.ch