Context-aware AI assistant for coding, chat, code explanation, and review, with support for local and online language models.

AIAssist is compatible with OpenAI and Azure AI Services through their APIs, and with Ollama models locally through the Ollama engine.
Tip

You can use Ollama with code-focused models like deepseek-v2.5 or qwen2.5-coder locally. To use local models, you need to run the Ollama process first; for running Ollama you can use the Ollama Docker container, as shown below.
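As a minimal sketch (the container name, volume, and model tag are illustrative choices, not project requirements), running Ollama in Docker and pulling a coding model could look like this:

```bash
# Start the Ollama engine in Docker (default API port 11434).
docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

# Pull a code-oriented model inside the running container.
docker exec -it ollama ollama pull qwen2.5-coder
```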
Note

Development of VSCode and JetBrains plugins is planned, and I will add them soon.
- Context-aware AI code assistant through AI embeddings, based on Retrieval Augmented Generation (RAG) or tree-sitter application summarization, to summarize the application context for the AI.
- Support for different result formats, like Unified Diff Format, Code Block Format, and Search-Replace Format.
- Code assistant for developing new features, finding bugs, and refactoring and reviewing an existing code base.
- Chat mode for chatting with different local and online AI models through the terminal.
- Support for local Ollama models as well as OpenAI and Azure AI Service models.
- Support for multiple programming languages like C#, Java, Go, ...
- Syntax highlighting for code blocks, using markdown format for AI results, with the ability to change the theme, like the Dracula theme or the VSCode light theme.
- A dedicated ignore file for AIAssist, `.aiassistignore`, for excluding files and folders from the code assist process and decreasing the final token size (see the sketch after this list).
- Customizable configuration by creating an `aiassist-config.json` in the running directory of `aiassist`, with a format like the predefined `aiassist-config.json` (see the sketch after this list).
- Showing the token usage count and the calculated price based on each model's input token and output token prices.
- Customizable model information by creating custom model information through a `ModelsInformationOptions` section in `aiassist-config.json`, with a format like the predefined models information.
- Customizable model options through a `ModelsOptions` section in `aiassist-config.json`, with a format like the predefined models options.
- Terminal main commands like `aiassist code`, `aiassist chat`, and `aiassist explain`, plus internal commands like `:clear`, `:add-file`, `:clear-history`, `:token`, ...
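The exact contents of `.aiassistignore` and `aiassist-config.json` depend on your project and on the predefined examples mentioned above; the following is only a minimal sketch, and every pattern and section body in it is an illustrative assumption rather than the tool's confirmed schema:

```bash
# Create a .aiassistignore in the project root; the patterns below are examples only.
cat > .aiassistignore <<'EOF'
bin/
obj/
node_modules/
*.min.js
EOF

# Create an aiassist-config.json skeleton in the running directory.
# Only the section names ModelsOptions and ModelsInformationOptions come from this
# README; their bodies are left empty because the real schema should be copied
# from the predefined aiassist-config.json.
cat > aiassist-config.json <<'EOF'
{
  "ModelsOptions": {},
  "ModelsInformationOptions": {}
}
EOF
```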
AIAssist uses the Azure AI Services or OpenAI APIs by default. To use the OpenAI or Azure AI APIs, we need an `ApiKey`.
- Install `aiassist` as a dotnet tool with `dotnet tool install` and the command below:
TODO: Add Nuget Soon
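Until the package is published, the exact command is still to be announced; assuming it follows the standard global dotnet tool pattern and a hypothetical package id of `AIAssist`, installation would look roughly like this:

```bash
# Hypothetical: the package id "AIAssist" is an assumption, not the confirmed NuGet id.
dotnet tool install --global AIAssist
```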
- For OpenAI: if you don't have an API key, you can sign up at OpenAI and get an API key.
- For the Azure AI service: you can sign up for an Azure account and get an AI model API key.
- After getting the API keys, we should set them for the chat and embedding models through environment variables or command options.
- Now go to the project directory with the `cd` command in the terminal to run `aiassist` and set the API key.

```bash
# Go to the project directory
cd /to/project/directory
```
- Set the API keys through environment variables:
Linux terminal:

```bash
export CHAT_MODEL_API_KEY=your-chat-api-key-here
export EMBEDDINGS_MODEL_API_KEY=your-embedding-api-key-here
```
Windows PowerShell terminal:

```powershell
$env:CHAT_MODEL_API_KEY="your-chat-api-key-here"
$env:EMBEDDINGS_MODEL_API_KEY="your-embedding-api-key-here"
```
- Or set the API keys through command options:
```bash
aiassist code --chat-api-key your-chat-api-key-here --embeddings-api-key your-embedding-api-key-here
```
- If you are using AI models that need an `ApiVersion`, `DeploymentId`, and `BaseAddress`, like Azure AI Service models, you can set them through environment variables or command options.
- Set `ApiVersion`, `DeploymentId`, and `BaseAddress` through environment variables:
Linux terminal:

```bash
export CHAT_BASE_ADDRESS=your-chat-base-address-here
export CHAT_API_VERSION=your-chat-api-version-here
export CHAT_DEPLOYMENT_ID=your-chat-deployment-id-here
export EMBEDDINGS_BASE_ADDRESS=your-embedding-base-address-here
export EMBEDDINGS_API_VERSION=your-embedding-api-version-here
export EMBEDDINGS_DEPLOYMENT_ID=your-embedding-deployment-id-here
```
Windows PowerShell terminal:

```powershell
$env:CHAT_BASE_ADDRESS="your-chat-base-address-here"
$env:CHAT_API_VERSION="your-chat-api-version-here"
$env:CHAT_DEPLOYMENT_ID="your-chat-deployment-id-here"
$env:EMBEDDINGS_BASE_ADDRESS="your-embedding-base-address-here"
$env:EMBEDDINGS_API_VERSION="your-embedding-api-version-here"
$env:EMBEDDINGS_DEPLOYMENT_ID="your-embedding-deployment-id-here"
```
- Or set `ApiVersion`, `DeploymentId`, and `BaseAddress` through command options:
```bash
aiassist code --chat-base-address your-chat-base-address-here --chat-api-version your-chat-api-version-here --chat-deployment-id your-chat-deployment-id-here --embeddings-base-address your-embeddings-base-address-here --embeddings-api-version your-embeddings-api-version-here --embeddings-deployment-id your-embeddings-deployment-id-here
```
- Now run the AI assistant with the `aiassist` command.
```bash
# Run aiassist in code assistant mode.
aiassist code
```
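The other main commands listed in the features above can be started the same way; the lines below only restate commands this README already names, and the comments are short descriptions of my own rather than official documentation:

```bash
# Chat with the configured local or online model in the terminal.
aiassist chat

# Ask the assistant to explain the code base.
aiassist explain
```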
If you like this project, feel free to star this repository, it helps out :)
Thanks a bunch for supporting me!
The application is under active development. Feel free to submit a pull request or create an issue for any bugs or suggestions.
The project is licensed under the Apache-2.0 license.