
Releases: sozercan/aikit

v0.15.0 (27 Nov 07:39, commit 9bd67fa)


v0.14.0 (27 Sep 03:50, commit 9fa9ac1)


v0.13.0 (07 Sep 20:35, commit 97c114e)


v0.12.2 (03 Aug 23:18, commit 43058b2)


v0.12.1 (03 Aug 20:26, commit 69284c6)


v0.12.0 (28 Jul 05:56, commit 767888a)


v0.11.1 (13 Jun 05:44, commit abf1ef4)

Notable Changes

  • 💪 Multi-platform images with ARM64 support! All pre-made models include both AMD64 and ARM64 platform support.
  • 📦 Support for models from OCI Artifacts. For example, use models from ollama by simply running:
docker buildx build -t my-model --load \
    --build-arg="model=oci://registry.ollama.ai/library/llama3:8b" \
    "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"

docker run -d --rm -p 8080:8080 my-model

curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama3",
        "messages": [
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
  • ⎈ Helm chart security hardening with restricted pod security admission
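The endpoint above is OpenAI-compatible (the server is LocalAI-based), so the assistant's reply can be pulled out of the JSON response with jq. A minimal sketch, assuming the standard OpenAI chat completions response shape and the container from the example above listening on localhost:8080:

```shell
# Query the model and print only the assistant's reply text.
curl -s http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello!"}]}' \
    | jq -r '.choices[0].message.content'
```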


v0.11.0 (13 Jun 03:49, commit 8f80515)

Notable Changes

Same notable changes as v0.11.1 above: multi-platform images with ARM64 support, models from OCI Artifacts, and Helm chart security hardening with restricted pod security admission.


v0.10.0 (06 Jun 11:11, commit 0eb9597)

Notable Changes

⚡️ Quick start to create custom images using models from Hugging Face 🤗 without creating an aikitfile

Example:

docker buildx build -t my-model --load \
    --build-arg="model=huggingface://TheBloke/Llama-2-7B-Chat-GGUF/llama-2-7b-chat.Q4_K_M.gguf" \
    "https://raw.githubusercontent.com/sozercan/aikit/main/models/aikitfile.yaml"
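The resulting image serves the same OpenAI-compatible API as the other examples, so it can be run and queried the same way. A sketch under one assumption: the model name "llama-2-7b-chat" is inferred from the GGUF file name, so check the server's /v1/models endpoint for the exact identifier.

```shell
# Run the image built above and send a chat request.
# NOTE: "llama-2-7b-chat" is an assumed model name; verify via /v1/models.
docker run -d --rm -p 8080:8080 my-model

curl http://localhost:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama-2-7b-chat",
        "messages": [{"role": "user", "content": "Hello!"}]
    }'
```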


v0.9.0 (30 May 21:53, commit 36d6cd4)

Notable Changes

  • 🕵️ Auto runtime detection: AIKit now automatically checks your CPU and GPU capabilities and chooses the most optimized runtime. With this change, the avx, avx2, and avx512 runtimes are no longer differentiated; they can still be specified to skip installing the CUDA runtime libraries for slimmer images.
  • 💿 Pre-made models now include CUDA libraries by default, and CUDA-specific images will no longer be updated. Thanks to auto runtime detection, if a compatible GPU is not found, AIKit falls back to the most optimized CPU runtime.
  • ⎈ Helm chart for Kubernetes is now available!
  • 🚀 New pre-made models are now available: Gemma 1.1 2B Instruct and Codestral 22B.
  • ✨ Updated LocalAI to v2.16.0.
  • 🦥 Updated Unsloth to May 2024 release.
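To see which of these instruction sets your own host advertises (the same CPU capabilities the runtime detection inspects), you can read the flags directly. A minimal sketch for Linux; the nvidia-smi check is an illustration of GPU availability, not AIKit's internal detection code:

```shell
# List AVX-family instruction sets advertised by the CPU (Linux).
grep -o 'avx[0-9a-z_]*' /proc/cpuinfo | sort -u

# Rough GPU check: if no CUDA-capable GPU is visible, AIKit falls back to CPU.
nvidia-smi > /dev/null 2>&1 && echo "GPU available" || echo "CPU fallback"
```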
