After successfully integrating the OpenAI API into my Next.js application using the langchain library, everything worked flawlessly on localhost. However, upon deploying to Vercel (Pro plan), I encountered this error:
Error: (Azure) OpenAI API key not found
    at new OpenAIChat (file:///var/task/node_modules/langchain/dist/llms/openai-chat.js:184:19)
    at new OpenAI (file:///var/task/node_modules/langchain/dist/llms/openai.js:54:20)
    at /var/task/.next/server/pages/api/tasks/ai.js:63:21
RequestId: 472c0bdb-dbbc-4cd4-95a3-1808d0b6a5ac
Error: Runtime exited with error: exit status 1
Runtime.ExitError
This error points back to the following section of code inside the langchain node module used by my application:
this.openAIApiKey =
    fields?.openAIApiKey ?? getEnvironmentVariable("OPENAI_API_KEY");
this.azureOpenAIApiKey =
    fields?.azureOpenAIApiKey ??
    getEnvironmentVariable("AZURE_OPENAI_API_KEY");
if (!this.azureOpenAIApiKey && !this.openAIApiKey) {
    throw new Error("(Azure) OpenAI API key not found");
}
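Since both branches of that check come up empty on Vercel, it looks like process.env.OPENAI_API_KEY simply isn't visible inside the serverless function. A temporary log at the top of the API route should confirm that (a throwaway diagnostic sketch, not part of my actual code; it avoids printing the secret itself):

// pages/api/tasks/ai.js — temporary diagnostic (sketch):
// logs whether the key is present in the serverless runtime
// without leaking its value into the Vercel logs.
export default async function handler(req, res) {
  console.log("OPENAI_API_KEY set:", Boolean(process.env.OPENAI_API_KEY));
  // ... rest of the handler
}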
I had already added the OPENAI_API_KEY environment variable (matching the one in my .env file) to the project's Environment Variables in the Vercel dashboard.
In my app, I defined the OPENAI_API_KEY in the .env file and accessed it in my backend as follows:
import { OpenAI } from "langchain/llms/openai";

// Read the key from the environment and pass it to the constructor explicitly.
const apiKey = process.env.OPENAI_API_KEY;

const openAIModel = new OpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
  openAIApiKey: apiKey,
});
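Further down in the same route, the model instance is then invoked, roughly like this (a simplified sketch; the prompt handling is omitted and prompt is just an illustrative variable):

// Simplified sketch of how the model instance is used in the route:
const response = await openAIModel.call(prompt); // prompt built from req.body
res.status(200).json({ result: response });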
Interestingly, I was able to send and receive API requests locally without handling the OPENAI_API_KEY in my backend at all, since langchain reads it from process.env itself (Next.js loads the .env file into process.env during local development):
const openAIModel = new OpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
});
Further experimentation included switching to the OpenAIChat model, which locally worked just as well as the OpenAI model did. But again, the same error appeared once deployed to Vercel:
import { OpenAIChat } from "langchain/llms/openai";

const openAIModel = new OpenAIChat({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
});
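For completeness, OpenAIChat accepts the key explicitly via the same openAIApiKey field (as the langchain source quoted above shows); that variant would look like this (a sketch of the pattern, which I would expect to hit the same error on Vercel):

// Explicit-key variant for OpenAIChat (same pattern as with OpenAI):
const openAIModel = new OpenAIChat({
  modelName: "gpt-3.5-turbo",
  temperature: 0,
  maxTokens: 2000,
  openAIApiKey: process.env.OPENAI_API_KEY,
});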
Despite these configurations, the error continued to surface consistently. Any insights or suggestions would be greatly appreciated!
Thank you in advance! Nasti