
Theta EdgeCloud Chatbot Example

This project provides a working chatbot example using Theta EdgeCloud. It is implemented with React and Vite.

Getting Started

  1. Configure your chatbot by editing the variables in the file .local.env (an example is shown after these steps). See Prerequisites if you don't have an API URL yet.
VITE_CHATBOT_API_URL : the inference endpoint generated in your Theta EdgeCloud dashboard
VITE_CHATBOT_INSTRUCTIONS : describes your chatbot's intended functionality
VITE_CHATBOT_FIRST_QUESTION : the first question to display to your users
VITE_CHATBOT_FIRST_ANSWER : the first answer to display to your users

  2. Run the following commands to install the project dependencies and start it locally:
npm install
npm run dev
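
An example .local.env is sketched below. All values are placeholders for illustration; replace them with your own endpoint and copy.

# .local.env (placeholder values, replace with your own)
VITE_CHATBOT_API_URL=https://your-inference-endpoint.example
VITE_CHATBOT_INSTRUCTIONS="You are a helpful assistant that answers questions about our product."
VITE_CHATBOT_FIRST_QUESTION="What can this chatbot help me with?"
VITE_CHATBOT_FIRST_ANSWER="I can answer questions about our product and how to use it."

Once the variables are set, npm run dev starts the Vite dev server, which serves the app at http://localhost:5173 by default.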

Prerequisites

To run this project, you'll need an inference API URL. Here is how to get one:

  1. Navigate to Hugging Face Tokens and generate a new API key. Save this key securely, as you will need it when deploying the Llama-3 model.

  2. Visit the Meta-Llama-3-8B-Instruct license page on Hugging Face and agree to the terms of use.

  3. Deploy the Llama-3 model on Theta EdgeCloud by going to the Model Explorer. Use the API key obtained in Step 1 during the deployment process. Once deployed, copy the generated inference endpoint from your dashboard into VITE_CHATBOT_API_URL.
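
For reference, below is a minimal sketch of how the React front end might call the deployed endpoint from VITE_CHATBOT_API_URL. The /v1/chat/completions path and the OpenAI-style request body are assumptions based on common Llama-3 serving setups, not confirmed details of this repository; check your Theta EdgeCloud dashboard for the exact URL and payload format your deployment expects.

// chatbot.ts -- illustrative sketch, not the repository's actual client code.
// Assumes an OpenAI-compatible chat completions route; verify the real path
// and request schema in your Theta EdgeCloud dashboard.
const API_URL = import.meta.env.VITE_CHATBOT_API_URL as string;

export async function askChatbot(userMessage: string): Promise<string> {
  const response = await fetch(`${API_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [
        { role: "system", content: import.meta.env.VITE_CHATBOT_INSTRUCTIONS },
        { role: "user", content: userMessage },
      ],
      max_tokens: 256,
    }),
  });

  if (!response.ok) {
    throw new Error(`Inference request failed with status ${response.status}`);
  }

  const data = await response.json();
  // OpenAI-style responses return the generated text in choices[0].message.content.
  return data.choices[0].message.content as string;
}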


Preview

(Screenshot of the example chatbot UI)

