| 6 minute read | Level: Advanced | Last Updated: December 2025 |
File Search Using OpenAI Assistants
Glific’s File Search using OpenAI Assistant enables users to upload documents and get AI-generated answers to user questions. The system uses a method called Retrieval Augmented Generation (RAG), where the assistant searches through your files to give accurate, helpful responses, including answers to follow-up questions.
Use this when:
- Users want to ask questions based on your PDFs, reports, or manuals.
- There is a need to build an automated knowledge assistant for your organisation.
- You want to help users get instant responses.
This document guides you through three main parts:
- Creating an OpenAI Assistant
- Using the Assistant in your Flows (including handling follow-up questions)
- Handling Voice Inputs and Responses
How to Create an OpenAI Assistant in Glific
Step 1: Create a new AI Assistant
Click on AI Assistant from the left sidebar, then select Create Assistant to generate a blank assistant.
Step 2: Fill in the Assistant Details
Define the following parameters:
- Choose the most relevant model from the first dropdown.
- Provide a name for the assistant.
- Provide a system prompt in the `Instructions` field. Click Here to read more on prompt engineering.
- Upload files (PDF, DOCX, etc.) by clicking `Manage Files`. These files will be used by the assistant to generate responses. Click Here to know the file formats supported by the OpenAI APIs.
- Set the `Temperature` (between 0 and 2). A higher value increases creativity/randomness. Recommended: keep the temperature at 0.
Note: The quality of the bot’s response depends on the prompt. Give appropriate prompts based on your use case.
Step 3: Save Your Assistant
Once the files are added, click on Add. This completes the Assistant setup.
Click on the Save button after making any changes.
Step 4: Copy the Assistant ID
Once created, copy the Assistant ID shown below the assistant name.
This ID will be used in the webhook nodes in the flow editor.
Using the OpenAI Assistant in the Flow Editor
The following sections explain how to use an assistant to answer questions or create conversations.
Handling text inputs and outputs
This section explains how to:
- Use the `filesearch-gpt` webhook function to pass a user's question to the OpenAI Assistant.
- Receive the assistant's response.
- Handle follow-up questions using conversational memory.
Step 1: Get User Question
- Create a flow where the user sends a question as text input.
- Add a `Send Message` node and receive the user's question as text. This question will be passed to the assistant for a response.
- Provide a `Result Name` for the `Wait for Response` node. In the example below, the result name is set as `question`.
A screenshot of the example flow setup is given below.
Step 2: Add a Call Webhook node. This is where we integrate the OpenAI Assistant.
- By default, `Function` will be selected. Leave this as it is.
- In the `Function` field, select the pre-defined function `filesearch-gpt` from the dropdown.
- Give the webhook a `Result Name`; you can use any name. In the screenshot example, it's named `gptresponse`.
Step 3: Click on Function Body (top right corner). You will see the following.
- In the `question` parameter, enter the flow variable containing the question asked by the user. In the given example, `question` is the result name, hence `@results.question` is provided in the question parameter.
- In `assistant_id`, enter the assistant ID obtained in Step 4 of "How to Create an OpenAI Assistant in Glific".
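Putting the two parameters together, the completed function body would look roughly like this sketch. The assistant ID shown is a placeholder; replace it with the ID copied from your own assistant, and keep the flow variable matching your result name.

```json
{
  "question": "@results.question",
  "assistant_id": "asst_xxxxxxxxxxxx"
}
```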
Step 4: Display the Assistant's response
- Once the webhook is updated, add a `Send Message` node and enter the `@results.gptresponse.message` variable to receive the AI response.
- In the given example, `gptresponse` is the result name (refer to Step 2). If `ai_response` were the result name, the variable would be `@results.ai_response.message`.
Sample Flow: Click on the Sample Flow link to import it and explore how it works.
Conversational Memory
When a user asks a follow-up question, the assistant uses the thread ID to remember the earlier conversation. This helps it give better answers by understanding the context of what was already asked.
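Based on the variables referenced in these flows, the `filesearch-gpt` webhook result can be assumed to expose at least the two fields below: the `message` shown in Step 4 and the `thread_id` used for follow-ups. The values here are illustrative placeholders, and the full payload may contain more fields.

```json
{
  "message": "The assistant's answer, generated from your uploaded files.",
  "thread_id": "thread_xxxxxxxxxxxx"
}
```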
Step 5: Add thread_id in the next Webhook call
- To answer subsequent questions based on the context of a question already asked, an additional parameter called `thread_id` needs to be passed in the next webhook call.
- This parameter should be set to the value `@results.previouswebhookname.thread_id`.
- In the example shown, the previous webhook result name is `gptresponse`, so the thread ID should be referenced as `@results.gptresponse.thread_id`.
- In the `question` parameter, enter the flow variable containing the follow-up question asked by the user. In the given example, `result_5` is the result name, hence `@results.result_5` is provided in the question parameter.
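The follow-up webhook call then carries all three parameters. Here is a sketch assuming the first webhook's result name is `gptresponse`, the follow-up question's result name is `result_5`, and a placeholder assistant ID:

```json
{
  "question": "@results.result_5",
  "assistant_id": "asst_xxxxxxxxxxxx",
  "thread_id": "@results.gptresponse.thread_id"
}
```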
Handling Voice Inputs and Responses
Some beneficiaries may find it easier to talk instead of typing. This is helpful for people who are not comfortable reading or writing. With voice input, beneficiaries can send voice notes to ask questions and get answers as both text and voice.
This section explains how to use the voice-filesearch-gpt webhook function in Glific flows to take a user’s voice note as input and return both text and voice note responses in the desired language.
Step 1: Capture end user’s voice input
- Create a `Send Message` node directing users to send their responses as audio messages, based on their preference.
- In the `Wait for Response` node, select `has audio` as the message response type. Also, give a result name. In the screenshot below, `audio_query` is used as the result name.
Step 2: Create a Call a Webhook node
- By default, `Function` will be selected. Leave this as it is.
- In the `Function` field, select the pre-defined function `voice-filesearch-gpt` from the dropdown.
- Give the webhook a result name; you can use any name. In the screenshot example, it's named `gpt_voice`.
Step 3: Click on Function Body (top right corner). You will see the following.
Pass the following parameters in the function body.
- For `contact`, keep the value as `@contact`, as shown in the screenshot.
- In `speech`, provide the flow variable holding the voice note sent by the user. In this example, the result name from Step 1 is `audio_query`, so the value is `@results.audio_query`.
- `assistant_id` is the assistant ID obtained in Step 4 of "How to Create an OpenAI Assistant in Glific".
- `source_language` is the expected language of the user.
- `target_language` is the language that the response voice note needs to be in.
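Assembled, the function body for this example might look like the sketch below. It assumes the Step 1 result name `audio_query`, a placeholder assistant ID, and English as both source and target language; adjust these to match your flow, and use whatever language-value format the flow editor screenshot shows.

```json
{
  "contact": "@contact",
  "speech": "@results.audio_query",
  "assistant_id": "asst_xxxxxxxxxxxx",
  "source_language": "English",
  "target_language": "English"
}
```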
Step 4: Display the text response
- Create a `Send Message` node.
- Use `@results.webhook_result_name.translated_text` to show the text response.
- In the given example, `gpt_voice` is the webhook result name.
Step 5: Send the voice note response
- In a new `Send Message` node, go to `Attachments`.
- Choose `Expression` from the dropdown.
- Use `@results.gpt_voice.media_url` (`gpt_voice` is the result name of the webhook node).
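Taken together, Steps 4 and 5 rely on the `voice-filesearch-gpt` result exposing at least the two fields below. The field names come from the steps above; the values are illustrative placeholders, and the full payload may contain more fields.

```json
{
  "translated_text": "The assistant's answer as text, in the target language.",
  "media_url": "https://example.com/response-voice-note.mp3"
}
```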
Sample Flow: Click on the Sample Flow link to import it and explore how it works.
Pricing
NGOs can use AI features in Glific without any additional cost for inferencing. Glific is supported by OpenAI to help more NGOs experiment, pilot, and run programs using large language models (LLMs), enabling them to scale their impact without being limited by cost. Additionally, NGOs can use up to $100 worth of credits until August 2026.