Read the Intro blog!
Ducky is an AI agent operating the Twitter account Duck Unfiltered. You can see his stream of thought on the Ducky Website.
Warning
This code is not stable; it is in the midst of a refactor and heavy development, so do not rely on it for mission-critical work yet. Until it stabilizes, I would advise using it as a reference to get to know Ducky.
This code runs the Unfiltered Duck, Ducky, on Twitter.
- Character Prompt: Character
- Agents - context-injecting prompts
- Wallets - AI-controlled Turnkey TEE wallet, cosigned on a Gnosis Safe multisig (Code)
- UI - Website (stream-of-thought backend logs, community dashboard, chat v2)
- Sentiment Analysis - tracks sentiment in Telegram and Twitter replies
- Conversations - (in progress) personalized chat with Ducky
- We use Railway.app
- Create a Postgres service
- Frontend Service: Root Directory `/frontend`, Custom Start Command `bun start`
- Discord Service: `./entrypoint.sh discord`
- Telegram Service: `./entrypoint.sh telegram`
- Cleo Tweeter (from conversations with Cleo, see blog): Root Directory `twitter-server`, Custom Start Command `bun generateCleoConvos`
- Degen Tweeter (trained on his previous tweets and responses): Root Directory `twitter-server`, Custom Start Command `bun generateTweet`
- Reply: Root Directory `twitter-server`, Custom Start Command `bun generateReplies`
- `ANTHROPIC_API_KEY`
- `DATABASE_URL`
- TODO: some others not documented yet
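A fail-fast startup check for these variables can be sketched as follows (a minimal illustration, not from the repo; only the two variables documented above are assumed to be required):

```typescript
// Minimal sketch: fail fast at startup when required env vars are missing.
// Only the two documented variables are assumed here; extend as needed.
const REQUIRED_ENV = ["ANTHROPIC_API_KEY", "DATABASE_URL"];

function assertEnv(env: Record<string, string | undefined>): void {
  const missing = REQUIRED_ENV.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(", ")}`);
  }
}

// Call once at service startup, e.g.:
// assertEnv(process.env);
```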
This repo is part Python, part JavaScript, and part Rust. The Python code will be phased out in favor of TypeScript and Rust over the next few days.
├── conversations - Rust server for conversations (not in production yet)
├── db - database (being deprecated in favor of more type safety with Drizzle)
├── memory - database
├── frontend - Next.js frontend app for Ducky
├── ducky - main TypeScript brain
├── utils/sentiment_analysis/message_fetcher - Telegram messages backfill
└── wallet - Turnkey and Gnosis
`./entrypoint.sh studio`
- Good Docker image to use: Better Ollama CUDA12
- GPUs to select: 4 x A40s
- VSCode/Cursor remote: `ssh root@XX.XX.XX.XXX -p 22XXX -i ~/.ssh/<RUNPOD_SSH>`
- Install Ollama: `(curl -fsSL https://ollama.com/install.sh | sh && OLLAMA_HOST=0.0.0.0 ollama serve > ollama.log 2>&1) &`
- Keep alive: `ollama run llama3.1:70b --keepalive 1000m`
- Don't forget `OLLAMA_HOST=0.0.0.0` in the serve command above.
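With the server bound to `0.0.0.0`, the model is reachable over HTTP on Ollama's default port 11434. A hedged TypeScript sketch of calling its `/api/generate` endpoint (the host IP is a placeholder, and this client is illustrative, not part of the repo):

```typescript
// Build the URL and JSON body for Ollama's /api/generate endpoint.
// 11434 is Ollama's default port; the host is whatever your pod exposes.
function generateRequest(host: string, model: string, prompt: string) {
  return {
    url: `http://${host}:11434/api/generate`,
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

// Illustrative call against the RunPod box (placeholder IP):
async function ask(prompt: string): Promise<string> {
  const { url, body } = generateRequest("XX.XX.XX.XXX", "llama3.1:70b", prompt);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```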
| tweepy | endpoint | rate limit | Link |
|---|---|---|---|
| create_tweet | POST /2/tweets | User rate limit (User context): 200 requests per 15-minute window per each authenticated user | https://developer.x.com/en/docs/x-api/tweets/manage-tweets/api-reference/post-tweets |
| search_recent_tweets | GET /2/tweets/search/recent | App rate limit (Application-only): 450 requests per 15-minute window shared among all users of your app; User rate limit (User context): 180 requests per 15-minute window per each authenticated user | |
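These windows can be respected client-side with a simple sliding-window budget before each call. A sketch (purely illustrative, not from the repo):

```typescript
// Sliding-window budget for the limits above, e.g. 200 create_tweet
// calls per 15-minute user window. Illustrative only.
class RateWindow {
  private timestamps: number[] = [];
  constructor(
    private readonly limit: number,
    private readonly windowMs: number = 15 * 60 * 1000,
  ) {}

  // Returns true and records the call if budget remains, false otherwise.
  tryConsume(now: number = Date.now()): boolean {
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) return false;
    this.timestamps.push(now);
    return true;
  }
}

// Usage: gate create_tweet calls behind the 200-per-15-minute user limit.
const createTweetBudget = new RateWindow(200);
```

Skipping a call when `tryConsume()` returns false avoids burning requests into a 429 instead of reacting to one after the fact.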
- `agent-twitter-client` - saved us from `ERROR 429 - Elon hates you` errors
- Luna Virtuals - awesome brain and prompting