
Voiceflow Vapi Custom LLM integration

STEP 1: Voiceflow

Import the Agent demo file

Import the vapi_agent.vf file into your Voiceflow workspace by clicking on the import button.

Open your agent. If your VOICEFLOW_VERSION_ID is set to development, run the Agent demo at least once so the version is compiled; if it is set to production, click the Publish button instead.

Retrieve the Voiceflow API key from the Voiceflow integrations view.

Retrieve the Voiceflow project ID from the Voiceflow settings view.

STEP 2: VAPI API KEY

Retrieve your private API key from the VAPI dashboard.


STEP 3: SETUP THE INTEGRATION

Create the .env file

cp .env.template .env

Update the .env file with your Voiceflow API key and project ID.
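
As a rough sketch, the file could look like the following. The exact variable names come from .env.template, so treat these as illustrative placeholders.

# Illustrative names; follow .env.template for the actual keys
VOICEFLOW_API_KEY={ your Voiceflow API key }
VOICEFLOW_PROJECT_ID={ your Voiceflow project ID }
VOICEFLOW_VERSION_ID=development   # or production (remember to Publish)
PORT=3101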

Start ngrok on the server port (3101)

ngrok http 3101

Save the ngrok forwarding URL; you will use it in the next step as the Custom LLM endpoint for the VAPI assistant.
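
If ngrok's local agent API is enabled (it usually listens on port 4040), you can also read the forwarding URL programmatically. The endpoint below is the standard ngrok agent API, not something specific to this repo.

# Standard ngrok agent API; the public_url field of each tunnel is the forwarding URL
curl -s http://127.0.0.1:4040/api/tunnels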


Set up the integration

npm run setup

Start the client

npm run cli
Welcome to the VF | VAPI Custom LLM Endpoint
1. Launch server
2. Create an assistant
Enter your choice (1 or 2):

Select 2 to create an assistant.

Enter assistant name: { any name }
Enter endpoint: { the ngrok forwarding url copied from previous step }
Enter VAPI API key: { the VAPI API key copied from previous step }

You can now start the server

Assistant {name} created successfully
Do you want to start the server now? (y/n): y
[ ready ] http://localhost:3101

OR (if you've already created the assistant in the previous step)

Start the server

npm run start
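
Once the server is listening, you can send it a quick local request before testing through VAPI. VAPI's Custom LLM integration speaks the OpenAI chat completions format, so assuming the server exposes a /chat/completions route (check the server code for the exact path this repo uses), a minimal smoke test could look like:

# Hypothetical route and model value; confirm the actual path in the server code
curl http://localhost:3101/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "voiceflow", "messages": [{"role": "user", "content": "Hello"}]}'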

STEP 4: TESTING

In the VAPI dashboard, select the assistant you've just created and click the Talk button to test it.


