Securing API keys is crucial for any application, especially when integrating external services such as the OpenAI API within FlutterFlow. One significant reason is the financial risk associated with leaked API keys. Each request made with an API key incurs costs, and if the key is exposed, unauthorized usage can lead to unexpected charges. Unfortunately, FlutterFlow currently does not fully protect your app against such vulnerabilities, which makes it critical for developers to proactively secure their API keys.
In this article I’ll demonstrate how FlutterFlow executes standard API calls and how to improve them to be more secure.
We’ll use the standard OpenAI Completions endpoint, which takes a meal name as an input variable and asks the AI to describe that meal. Below you can find the definition of this call:
Headers
Body
As you can see, the prompt is: “Describe me meal: Input Variable“. This variable is passed in from FlutterFlow.
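For reference, the body of such a call is a JSON object along these lines (a sketch: the model and parameter values simply mirror the ones we’ll use in the Edge Function later, and the prompt string is assembled from the input variable):

{
  "model": "gpt-3.5-turbo-instruct",
  "prompt": "Describe me meal: [Input Variable]",
  "max_tokens": 256,
  "temperature": 0
}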
Let’s call the endpoint and open the Google Chrome Network tab to see how FlutterFlow executes it.
As you can see, all the information needed to extract your API key is exposed in the request headers:
1) The Request URL makes it easy to tell that you’re knocking on OpenAI’s door
2) X-Request-URL exposes the address of the API endpoint. The list of these endpoints is available on OpenAI’s website and can be used to spot “useful” requests when intercepting your app’s network traffic
3) Authorization. Your API key is placed directly in the request headers
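Put together, the sensitive part of an intercepted request looks roughly like this (the key below is a placeholder, but this is the kind of thing anyone can read in the Network tab):

X-Request-URL: https://api.openai.com/v1/completions
Authorization: Bearer sk-XXXXXXXXXXXXXXXXXXXX

Anyone who can read these two headers can copy the key and make requests against your OpenAI account at your expense.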
Moreover, if you download the generated code, the API key is also stored directly inside the application, making it vulnerable if the application is decompiled.
Conclusion: standard API requests are fine while you’re in the hypothesis-testing stage, but once you publish the app, they are far from secure.
Also, since the API key is “baked” into the app, if it gets exposed you’ll have to publish a new version of the app to replace it.
There is a feature provided by FlutterFlow called “Deploy Private API”
What it effectively does is take your API call and wrap it into a Cloud Function in Firebase. Firebase then serves as a proxy between the client application and the OpenAI endpoint (and this is the correct approach for working with such sensitive API calls).
To make this mechanism easier to understand, take a look at the diagram below.
So the idea is that you perform the API call indirectly, using your server as a proxy. All the sensitive information is supposed to be stored on the server in this case.
Pros:
– Your API key is not stored on the client’s device, so it is safe from being extracted with a decompiler
– Network interception becomes much more difficult
– Even if your API key is exposed, you can easily rotate it
Cons:
– You need to be on Firebase’s Blaze plan
– Some additional developer time is needed to configure it
– It does not actually work the way it is supposed to. Let’s take a look at how FlutterFlow executes an API call via a Cloud Function
Above you can see the result of an API request using the Private API feature.
And it actually looks much better than the first option!
1) The main request is definitely executed on the server (we do not see any trace of OpenAI’s endpoint in the Network tab)
2) The API key is not exposed in the headers
3) Only the client’s bearer token, which confirms that the request is being executed by an authorized device, is passed in the request. And this is actually a very good thing: with this token, the server knows that the request was sent by an authorized user
BUT, if we open the “Payload” of the request, we can see that besides the variable we explicitly defined when creating the API request in FlutterFlow, the apiKey is also being passed 🙁
Also, if we take a look at the code, the API key is again “baked” into the source code 🙁
Basically, all the pros you theoretically gain by using Firebase as a proxy are gone 🙁 This feature does, however, obfuscate the headers, making it harder to understand which requests are actually being executed on the server.
The updated diagram looks like this:
To secure our API calls, we’ll use the same strategy that the FlutterFlow team uses in the Private API feature: our server (Firebase or Supabase) acts as a proxy that executes the API calls. But we’ll store the API key on the server, not in the application.
This section is not covered by this article. However, I can recommend this article, which gives an example of an OpenAI cloud function in Firebase: https://medium.com/@makiex/how-to-safely-use-openai-in-your-app-with-firebase-cloud-functions-10a55ba95d11
And this one is good for understanding how to create and deploy Cloud Functions in Firebase:
https://medium.com/firebase-developers/a-practical-approach-to-cloud-functions-for-firebase-an-introduction-d3b5ae2114e7
Let’s start!
You’ll need quite a few prerequisites to run Edge Functions locally and then deploy them to your Supabase project. I advise you to take a look at this video made by the Supabase team to understand how these dependencies are used (https://www.youtube.com/watch?v=29p8kIqyU_Y&t=627s):
1) Supabase CLI
2) Node.js
3) Docker (if you’re on Windows, also install Ubuntu from the Windows Store)
4) Get your OpenAI key from the OpenAI API dashboard
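If you want to verify that the tools are installed correctly, each of them can print its version in the terminal:

supabase --version
node --version
docker --version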
1) Open the drive where Node.js is installed and create your project folder
2) Open this folder in VS Code (or any code editor of your choice)
3) Run the “supabase init” command in your terminal. It will create a “supabase” subfolder in your project with all the necessary configuration:
4) Run the “supabase functions new openai” command in your terminal. This command will create a new “functions” folder in your project with an “openai” function in it (as you can see on the screenshot above)
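In other words, the project now has roughly this structure (the function code itself lives in index.ts):

supabase/
  config.toml
  functions/
    openai/
      index.ts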
5) Create a new file in your Supabase project called “.env.local” and put your OpenAI API key in it, in the following format:
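Since the function code below reads the key with Deno.env.get('OPENAI_API_KEY'), the file should contain a single line like this (replace the placeholder with your real key):

OPENAI_API_KEY=sk-your-real-key-here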
6) Paste the code below into your function (the index.ts file):
// Follow this setup guide to integrate the Deno language server with your editor:
// https://deno.land/manual/getting_started/setup_your_environment
// This enables autocomplete, go to definition, etc.
// Setup type definitions for built-in Supabase Runtime APIs
///
import 'https://deno.land/x/xhr@0.3.0/mod.ts'
import { CreateCompletionRequest } from 'https://esm.sh/openai@3.1.0'
const corsHeaders = {
  'Access-Control-Allow-Origin': '*', // Allow requests from any origin
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type', // Allow specified headers
}

Deno.serve(async (req) => {
  // CORS handling for OPTIONS request
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  try {
    const { query } = await req.json()
    const completionConfig: CreateCompletionRequest = {
      model: 'gpt-3.5-turbo-instruct',
      prompt: query,
      max_tokens: 256,
      temperature: 0,
    }

    // Make the request to OpenAI API
    const response = await fetch('https://api.openai.com/v1/completions', {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${Deno.env.get('OPENAI_API_KEY')}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(completionConfig),
    })

    // Return the response from OpenAI API with CORS headers
    return new Response(await response.text(), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: response.status,
    })
  } catch (error) {
    // Handle errors and return a response with CORS headers
    return new Response(JSON.stringify({ error: error.message }), {
      headers: { ...corsHeaders, 'Content-Type': 'application/json' },
      status: 400,
    })
  }
})
/* To invoke locally:
1. Run `supabase start` (see: https://supabase.com/docs/reference/cli/supabase-start)
2. Make an HTTP request:
curl -i --location --request POST 'http://127.0.0.1:54321/functions/v1/openai' \
--header 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0' \
--header 'Content-Type: application/json' \
--data '{"name":"Functions"}'
*/
Let me explain the code above a little, because it differs from the one provided by the Supabase team:
const corsHeaders = {
  'Access-Control-Allow-Origin': '*', // Allow requests from any origin
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type', // Allow specified headers
}

Deno.serve(async (req) => {
  // CORS handling for OPTIONS request
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }
This part is needed for the function to work correctly in production. When running locally, CORS is not checked, but once we deploy the function and try to call it from the application, it is. And without this part of the code, the call will always return an error.
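For context, the preflight that the browser sends before the actual POST looks roughly like this (the origin below is just an example app domain), and it is exactly this request that our OPTIONS branch answers with “ok” and the CORS headers:

OPTIONS /functions/v1/openai HTTP/1.1
Origin: https://your-app.example.com
Access-Control-Request-Method: POST
Access-Control-Request-Headers: authorization, content-type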
const { query } = await req.json()
const completionConfig: CreateCompletionRequest = {
  model: 'gpt-3.5-turbo-instruct',
  prompt: query,
  max_tokens: 256,
  temperature: 0,
}
Here we define the request configuration and set up the “query” variable, which is basically the prompt that we’ll generate inside our application and pass to the Edge Function. The Edge Function will then use it as the “prompt” parameter of the actual API request to OpenAI.
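For example, if the user picks “pizza” in the app (an arbitrary example), the Edge Function receives a body like this and forwards the string to OpenAI as the prompt:

{ "query": "Describe me meal: pizza" }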
Everything else is explained in the comments inside the code and in the video provided by the Supabase team.
7) To check the function locally, run the “supabase start” command. Before running this command, make sure that the Docker application is running
8) Run the command “supabase functions serve --env-file=.env.local openai”
9) Open a new “bash” terminal in VS Code and paste the curl command below:
curl --request POST 'http://127.0.0.1:54321/functions/v1/openai' \
--header 'Authorization: Bearer SUPABASE_ANON_KEY' \
--header 'Content-Type: application/json' \
--data '{ "query": "Describe me meal: pizza" }'
To get your SUPABASE_ANON_KEY, run this command: supabase status
And you’ll see the anon key:
If there were no errors, you’ll see that your function actually works!
Here’s the official guide from Supabase: https://supabase.com/docs/guides/functions/deploy
Coming back to our project:
1) Run the command “supabase login”. It will open the browser and you’ll be able to log in
2) Run the command “supabase projects list” and grab the project ref of your Supabase project
3) Run this command to link your local project to production: “supabase link --project-ref your-project-id“
4) Run “supabase functions deploy openai” to deploy the function
5) Go to the configuration tab of your Supabase project and add your OpenAI key to the Edge Functions secrets
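If you prefer the terminal over the dashboard for this step, the same secret can also be set with the Supabase CLI, using the same variable name as in .env.local:

supabase secrets set OPENAI_API_KEY=sk-your-real-key-here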
Here’s the official documentation on how to invoke Edge Functions using Dart (Flutter):
https://supabase.com/docs/reference/dart/functions-invoke
Below is the Custom Action that invokes our Edge Function:
// Automatic FlutterFlow imports
import '/backend/backend.dart';
import '/backend/schema/structs/index.dart';
import '/backend/supabase/supabase.dart';
import '/flutter_flow/flutter_flow_theme.dart';
import '/flutter_flow/flutter_flow_util.dart';
import '/custom_code/actions/index.dart'; // Imports other custom actions
import '/flutter_flow/custom_functions.dart'; // Imports custom functions
import 'package:flutter/material.dart';
// Begin custom action code
// DO NOT REMOVE OR MODIFY THE CODE ABOVE!
import 'dart:convert';
Future<String> completionResult(String? prompt) async {
  try {
    // Invoke the 'openai' Edge Function, passing the prompt as the 'query' parameter
    final openaiResponse = await SupaFlow.client.functions
        .invoke('openai', body: {'query': prompt});
    final jsonResponse = openaiResponse.data;
    // Access the "text" field from the "choices" array
    final text = jsonResponse['choices'][0]['text'];
    return text.toString(); // Return the "text" field from choices as a string
  } catch (e) {
    // Covers both invocation failures and unexpected response shapes
    print('Error calling Edge Function: $e');
    return ''; // Return an empty string in case of an error
  }
}
And here’s how it looks in the editor:
final openaiResponse = await SupaFlow.client.functions
    .invoke('openai', body: {'query': prompt});
‘openai’ is the name of the Edge Function. ‘query’ is the Edge Function parameter that we set up above. ‘prompt’ is the argument that we’ll set up in FlutterFlow.
final jsonResponse = openaiResponse.data;
// Access the "text" field from the "choices" array
final text = jsonResponse['choices'][0]['text'];
Here we:
1) Access the JSON returned by the Edge Function
2) Parse it. “choices”, as you remember, is a list, but it contains only one element. That is why we take the first element of the list, and then access “text” inside that first item:
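For reference, a trimmed sketch of the JSON that the Completions endpoint returns (most fields omitted; the part we need is choices[0].text):

{
  "object": "text_completion",
  "choices": [
    {
      "text": " ...generated meal description... ",
      "index": 0,
      "finish_reason": "stop"
    }
  ]
}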
As you remember, we have a “prompt” argument that we need to pass to the Custom Action, which in turn is passed to the Edge Function:
This is the final prompt that we’ll send.
1) We can see that the headers do not expose anything besides the Edge Function name and our bearer token that identifies us:
2) And we see that in the payload we pass only the function argument
That’s it! I hope this article was useful and that you now have an understanding of how to secure your API secrets!