
Releases: sozercan/aikit

v0.16.0

10 Dec 07:09
b2833a2

Notable Changes

  • ✨ Update to LocalAI v2.24.1.
  • 🖖 Apple Silicon containers now use the llama.cpp kompute backend instead of Vulkan.
  • 🦙 Added Llama 3.3 70B and QwQ 32B to the pre-made models (see the example after this list).
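
For example, one of the new pre-made models can be started and queried like this; a minimal sketch, assuming the image tag and the model name in the request follow the repository's usual ghcr.io/sozercan/<model>:<size> naming (they are not taken from these notes):

# start the pre-made Llama 3.3 70B image (tag assumed from the usual naming convention)
docker run -d --rm -p 8080:8080 ghcr.io/sozercan/llama3.3:70b

# query the OpenAI-compatible chat completions endpoint (model name is an assumption)
curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama-3.3-70b-instruct",
        "messages": [{"role": "user", "content": "Hello!"}]
    }'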

Features

Continuous Integration

Chores

v0.15.0

27 Nov 07:39
9bd67fa

Notable Changes

  • 🍎 Apple Silicon GPU acceleration through Podman Desktop for macOS!

     podman run -d --rm --device /dev/dri -p 8080:8080 ghcr.io/sozercan/applesilicon/llama3.1:8b

    then visit http://localhost:8080/chat or use:

     curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
         "model": "llama-3.1-8b-instruct",
         "messages": [{"role": "user", "content": "tell me about quantum mechanics"}]
       }'

    To build a custom image with the Apple Silicon runtime, pass runtime=applesilicon as a build argument:

     docker buildx build -t registry/repo/image:tag --push \
         --build-arg="model=huggingface://MaziyarPanahi/Llama-3.2-1B-Instruct-GGUF/Llama-3.2-1B-Instruct.Q4_K_M.gguf" \
         --build-arg="runtime=applesilicon" \
         "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"
  • 🍏 Pre-made models for Llama 3.2 and 3.1, Phi 3.5, and Gemma 2 with Apple Silicon support

  • ✨ Update to LocalAI v2.23.0

  • 🦥 Update Unsloth to Sept 2024 release

Features

Bug Fixes

Documentation

Continuous Integration

Chores

v0.14.0

27 Sep 03:50
9fa9ac1

Notable Changes

Features

Documentation

Continuous Integration

Chores

v0.13.0

07 Sep 20:35
97c114e

Notable Changes

Features

Documentation

Continuous Integration

Chores

v0.12.2

03 Aug 23:18
43058b2

Notable Changes

Chores

v0.12.1

03 Aug 20:26
69284c6

Notable Changes

Chores

v0.12.0

28 Jul 05:56
767888a

Notable Changes

Deprecation


Features

Bug Fixes

Documentation

Tests

Continuous Integration

Chores

v0.11.1

13 Jun 05:44
abf1ef4

Notable Changes

  • 💪 Multi-platform images with ARM64 support! All pre-made models now ship for both AMD64 and ARM64.
  • 📦 Support for models from OCI Artifacts. For example, use models from Ollama by running:
docker buildx build -t my-model --load \
    --build-arg="model=oci://registry.ollama.ai/library/llama3:8b" \
    "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"

docker run -d --rm -p 8080:8080 my-model

curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama3",
        "messages": [
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
  • ⎈ Helm chart security hardening with restricted pod security admission (see the sketch below).
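
A minimal sketch of pairing the hardened chart with the restricted Pod Security Standard on the target namespace (the chart reference below is a placeholder, not the chart's actual location):

# enforce the Kubernetes "restricted" Pod Security Standard on the namespace
kubectl create namespace aikit
kubectl label namespace aikit pod-security.kubernetes.io/enforce=restricted

# install the chart into the hardened namespace (chart reference is illustrative)
helm install my-model <aikit-chart> --namespace aikit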

Chores

Commits

v0.11.0

13 Jun 03:49
8f80515

Notable Changes

  • 💪 Multi-platform images with ARM64 support! All pre-made models now ship for both AMD64 and ARM64.
  • 📦 Support for models from OCI Artifacts. For example, use models from Ollama by running:
docker buildx build -t my-model --load \
    --build-arg="model=oci://registry.ollama.ai/library/llama3:8b" \
    "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"

docker run -d --rm -p 8080:8080 my-model

curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama3",
        "messages": [
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
  • ⎈ Helm chart security hardening with restricted pod security admission

Features

Documentation

Continuous Integration

Chores

v0.10.0

06 Jun 11:11
0eb9597

Notable Changes

⚡️ Quick start to create custom images using models from Hugging Face 🤗 without creating an aikitfile

Example:

docker buildx build -t my-model --load \
    --build-arg="model=huggingface://TheBloke/Llama-2-7B-Chat-GGUF/llama-2-7b-chat.Q4_K_M.gguf" \
    "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"

Features

Bug Fixes

Documentation

Tests

Continuous Integration

Chores