matrix-bot-openai-wrapper

What does this bot do?

This bot relays a user's prompt to the OpenAI API and sends the answer back to the user who made the request. The whole exchange happens inside a Matrix chat client.
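
At its core the flow is: listen for room messages, forward the message body to the OpenAI API, and post the reply back into the same room. Below is a minimal sketch of that loop, assuming the matrix-bot-sdk and openai packages and illustrative config field names; the real implementation lives in src and may differ.

```typescript
import { MatrixClient, SimpleFsStorageProvider, AutojoinRoomsMixin } from "matrix-bot-sdk";
import OpenAI from "openai";
// Field names below are illustrative; see config.ts.example for the real shape.
import { homeserverUrl, accessToken, openaiApiKey } from "./config";

const openai = new OpenAI({ apiKey: openaiApiKey });
const client = new MatrixClient(homeserverUrl, accessToken, new SimpleFsStorageProvider("bot.json"));
AutojoinRoomsMixin.setupOnClient(client);

client.on("room.message", async (roomId, event) => {
  // Ignore the bot's own messages and anything that is not plain text.
  if (event.sender === (await client.getUserId())) return;
  if (event.content?.msgtype !== "m.text") return;

  // Relay the prompt to the OpenAI API. The model name is a placeholder;
  // making it configurable is listed under planned features.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: event.content.body }],
  });

  // Send the answer back into the room the prompt came from.
  await client.sendText(roomId, completion.choices[0].message.content ?? "");
});

client.start().then(() => console.log("Bot started"));
```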

Why?

Routing everyone's prompts through a single shared API key limits how much per-user tracking OpenAI can do. In my opinion, chatting from a client such as Matrix is also more practical than using the official app.

Requirements

  • A local environment with Node.js, Yarn and nvm
  • An access token for the bot's Matrix account. The access token can be obtained by using this script, or directly from the homeserver as shown in the sketch after this list.
  • An OpenAI API key
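
If the linked script is unavailable, an access token can also be requested from the homeserver's standard Matrix login endpoint. A hedged sketch (requires Node 18+ for the global fetch; the homeserver URL, user and password are placeholders):

```typescript
// Request an access token via the Matrix client-server login API.
// Replace the homeserver URL, user and password with your own values.
async function fetchAccessToken(): Promise<string> {
  const response = await fetch("https://matrix.example.org/_matrix/client/v3/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      type: "m.login.password",
      identifier: { type: "m.id.user", user: "my-bot" },
      password: "my-bot-password",
    }),
  });
  const body = await response.json();
  return body.access_token;
}

fetchAccessToken().then((token) => console.log(token));
```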

Installation, setup and running the bot

  • nvm use
  • yarn install
  • cp config.ts.example config.ts
  • Add the Matrix homeserver, bot access token and OpenAI API key to config.ts (see the sketch below)
  • Run yarn start to start the bot
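
The exact fields are defined in config.ts.example; a filled-in config.ts might look roughly like this (field names and value formats are illustrative placeholders):

```typescript
// config.ts - field names are illustrative; check config.ts.example for the real shape.
export const homeserverUrl = "https://matrix.example.org"; // Matrix homeserver URL
export const accessToken = "syt_placeholder";              // access token of the bot account
export const openaiApiKey = "sk-placeholder";              // OpenAI API key
```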

Implemented Features

  • Encryption

Planned features

  • Add logic for user management (work in progress)
  • Allow setting the model in the config file
  • Add option to temporarily set/reset context