
twitter-bot-openai

Twitter bot python webapp using OpenAI, tracery

Development

Poetry is used to manage dependencies locally. If deploying to Azure App Service for Python applications, the Poetry dependencies must be exported (synchronized) to requirements.txt, because by default the service installs dependencies from requirements.txt when building the container.

To update requirements.txt, run

poetry export --without-hashes --without dev -f requirements.txt -o requirements.txt

Customizing and running

  1. Copy .env.example to .env.
  2. Gain access to the Twitter API v1 (needed to write tweets) and enter the API tokens into .env.
  3. Gain access to the OpenAI API and set the token in .env as OPENAI_API_KEY.
  4. Configure the OpenAI environment variables. See OpenAI - GPT based bot.
  5. (Optional) Create a tracery configuration for the tracery-based bot. See Tracery-based bot.
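After those steps, a filled-in .env might look like the sketch below. The OpenAI variable names come from this README; the Twitter credential names are illustrative placeholders only, so use the actual keys from .env.example:

```shell
# OpenAI settings documented in this README (values are examples):
OPENAI_API_KEY=sk-your-key-here
OPEN_AI_SYSTEM_PROMPT="You are a friendly bot that writes short tweets."
OPEN_AI_USER_PROMPT_ARRAY="['Write a tweet about birds', 'Write a tweet about tea']"
OPEN_AI_TEMPERATURE=0.9

# Twitter API v1 credentials -- names below are hypothetical placeholders;
# copy the real variable names from .env.example:
TWITTER_API_KEY=...
TWITTER_API_SECRET=...
TWITTER_ACCESS_TOKEN=...
TWITTER_ACCESS_TOKEN_SECRET=...
```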

OpenAI - GPT based bot

This app uses the Chat Completion API to send ChatGPT a user prompt and receive the response.

The .env includes:

  • OPEN_AI_SYSTEM_PROMPT: the system prompt used to initialize the conversation.
  • OPEN_AI_USER_PROMPT_ARRAY: an array of possible user prompts to open the conversation, written as a Python list literal in string form; a random choice is selected when the program runs. If only one prompt is desired, a single-item array is permitted.
  • OPEN_AI_TEMPERATURE: a value between 0 and 2. The higher the value, the more diverse and creative the results.
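As a sketch of how these variables feed the Chat Completion call (the helper names and the model are assumptions, not taken from the app's code):

```python
import ast
import random


def load_user_prompts(raw: str) -> list[str]:
    """Parse OPEN_AI_USER_PROMPT_ARRAY, which holds a Python list
    literal as a string, e.g. "['prompt one', 'prompt two']"."""
    prompts = ast.literal_eval(raw)
    if not isinstance(prompts, list) or not prompts:
        raise ValueError("OPEN_AI_USER_PROMPT_ARRAY must be a non-empty list")
    return prompts


def build_messages(system_prompt: str, user_prompts: list[str]) -> list[dict]:
    """Build Chat Completion messages: the system prompt plus one
    user prompt chosen at random."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": random.choice(user_prompts)},
    ]


# Hedged sketch of the API call itself (not executed here; model name and
# env-var wiring are assumptions):
#
#   import os
#   from openai import OpenAI
#   client = OpenAI()  # reads OPENAI_API_KEY from the environment
#   response = client.chat.completions.create(
#       model="gpt-3.5-turbo",
#       messages=build_messages(
#           os.environ["OPEN_AI_SYSTEM_PROMPT"],
#           load_user_prompts(os.environ["OPEN_AI_USER_PROMPT_ARRAY"]),
#       ),
#       temperature=float(os.environ.get("OPEN_AI_TEMPERATURE", "1.0")),
#   )
#   tweet_text = response.choices[0].message.content
```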

Tracery-based bot

This app also supports using Tracery to generate tweets. Tracery is a language for generating text based on rules and expansions. Tracery grammars are written in JSON ("JavaScript Object Notation"), a common format for exchanging data between programs written in different programming languages and running on different kinds of computers.

Create a file named tracery.json to use the tracery bot function, using tracery.example.json as a template. The result is similar to MadLibs.
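To illustrate the rule/expansion mechanic, here is a minimal stdlib-only sketch; the grammar below is invented (not taken from tracery.example.json), and the app itself would use a tracery library rather than this hand-rolled expander:

```python
import json
import random
import re

# A grammar in the same shape as a tracery file: each key maps to a list
# of expansions, and #key# references expand recursively, MadLibs-style.
GRAMMAR_JSON = """
{
  "origin": ["The #adjective# #animal# #verb# today."],
  "adjective": ["sleepy", "curious"],
  "animal": ["otter", "heron"],
  "verb": ["wandered", "sang"]
}
"""


def flatten(grammar: dict, symbol: str = "origin") -> str:
    """Expand a symbol: pick one of its rules at random, then recursively
    replace every #key# reference inside it."""
    text = random.choice(grammar[symbol])
    return re.sub(r"#(\w+)#", lambda m: flatten(grammar, m.group(1)), text)


grammar = json.loads(GRAMMAR_JSON)
print(flatten(grammar))  # e.g. "The sleepy heron sang today."
```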

Deployment

You can deploy the service and run it periodically on Google Cloud or another cloud provider. The author has found Google Cloud to be the least expensive (essentially free) and reliable at the lowest tier.

  1. Create a Cloud Function (serverless) and deploy the code there.
  2. Create a Cloud Scheduler job that hits your Cloud Function's URL at some periodicity. Authentication should be enabled and restricted to your Google Cloud project's service account.
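The two steps above can be sketched with the gcloud CLI; the function name, entry point, region, runtime, schedule, and project ID below are all placeholders to adapt to your own project:

```shell
# Step 1: deploy the code as an HTTP-triggered, authenticated Cloud Function.
gcloud functions deploy twitter-bot \
    --runtime python311 \
    --trigger-http \
    --entry-point main \
    --region us-central1 \
    --no-allow-unauthenticated

# Step 2: invoke it every 6 hours with Cloud Scheduler, authenticating
# as the project's service account via an OIDC token.
gcloud scheduler jobs create http twitter-bot-job \
    --schedule "0 */6 * * *" \
    --uri "https://us-central1-PROJECT_ID.cloudfunctions.net/twitter-bot" \
    --oidc-service-account-email "PROJECT_ID@appspot.gserviceaccount.com"
```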
