

Octogen

an open-source, locally deployable code interpreter

News

https://github.com/dbpunk-labs/octogen/assets/8623385/7445cc4d-567e-4d1a-bedc-b5b566329c41

Supported OSs · Supported Interpreters · Supported Dev Environments

Getting Started

Requirements

To deploy Octogen, your user needs permission to run Docker commands.
To use CodeLlama, your host must have at least 8 CPUs and 16 GB of RAM.
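A quick way to verify both requirements (a minimal check, assuming Docker is already installed; the CPU and memory commands are Linux-specific):

# should print engine details without a permission error
docker info
# show the number of CPUs and the total memory
nproc && free -h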

Install Octogen on your local computer

  1. Install og_up
pip install og_up
  2. Set up the Octogen service
og_up

You will be prompted to select one of the available options.

Docker is the default container engine. To use Podman instead, pass the --use_podman flag.
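For example, to run the setup with Podman instead of the default Docker engine:

og_up --use_podman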

  3. Run the og command; you will see the following output
Welcome to use octogen❤️ . To ask a programming question, simply type your question and press esc + enter
You can use /help to look for help

[1]🎧>
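For example, you can type a question at the prompt and press esc + enter to submit it (a hypothetical prompt; the model's answer will vary):

[1]🎧> write a python function that checks whether a string is a palindrome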

Development

Prepare the environment

git clone https://github.com/dbpunk-labs/octogen.git
cd octogen
python3 -m venv octogen_venv
source octogen_venv/bin/activate
pip install -r requirements.txt
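Before starting the sandbox, you can confirm the virtual environment is active (a minimal check; the reported path should point into octogen_venv):

which python3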

Run the sandbox, which includes the Agent (with a mock model) and the Kernel

$ bash start_sandbox.sh
$ og

Welcome to use octogen❤️ . To ask a programming question, simply type your question and press esc + 
enter
Use /help for help

[1]🎧>hello
╭─ 🐙Octogen ─────────────────────────────────────────────────────────────────────────────────────────╮
│                                                                                                     │
│  0 🧠 how can I help you today?                                                                     │
│                                                                                                     │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────╯
[2]🎧>

Supported API Services

| Name | Type | Status | Installation |
| --- | --- | --- | --- |
| OpenAI GPT 3.5/4 | LLM | ✅ fully supported | Run og_up, then choose OpenAI |
| Azure OpenAI GPT 3.5/4 | LLM | ✅ fully supported | Run og_up, then choose Azure OpenAI |
| LLama.cpp Server | LLM | ✔️ supported | Run og_up, then choose CodeLlama |
| Octopus Agent Service | Code Interpreter | ✅ supported | Apply for an API key from octogen.dev, then run og_up and choose Octogen |

Internals of the local deployment

(Architecture diagram: octogen-internal)

Features

If you have a feature suggestion, please open a discussion to talk about it.

Roadmap