The AI orchestration tool that lights the way

Built for complexity, designed for clarity. Kamiwaza helps your business — and your data — flow forward. No bottlenecks, no system overhauls. Just AI that keeps pace as you grow.

You don’t need to move mountains.

AI orchestration shouldn’t mean centralization. With Kamiwaza, it doesn’t. Our multi-site intelligence and distributed AI inference mean you skip costly, complicated centralization.

Kamiwaza’s AI-powered orchestration engine uses parallel processing to search your data sources for requested information — and intelligently aggregates inferences to produce one clean, accurate response. Whether you’re looking for on-premise Gen AI, a hybrid AI infrastructure, or you’re fully operating in the cloud — Kamiwaza’s locality-aware data engine goes where your data lives.
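The parallel fan-out and aggregation described above can be sketched in a few lines of Python. This is an illustrative sketch only: Kamiwaza's engine is proprietary, and the source names, canned answers, and highest-confidence aggregation rule here are hypothetical stand-ins.

```python
# Hypothetical sketch of parallel multi-source search with inference
# aggregation; not Kamiwaza's actual implementation.
from concurrent.futures import ThreadPoolExecutor

def query_source(source: str, question: str) -> dict:
    # Stand-in for a per-site inference call; returns an answer with a score.
    canned = {
        "warehouse": {"answer": "Q3 revenue was $4.2M", "confidence": 0.91},
        "crm": {"answer": "Q3 revenue was $4.2M", "confidence": 0.77},
        "lake": {"answer": "No matching records", "confidence": 0.10},
    }
    return {"source": source, **canned[source]}

def aggregate(question: str, sources: list[str]) -> str:
    # Fan out to every data source in parallel, then keep the
    # highest-confidence inference as the single clean response.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda s: query_source(s, question), sources))
    best = max(results, key=lambda r: r["confidence"])
    return best["answer"]

print(aggregate("What was Q3 revenue?", ["warehouse", "crm", "lake"]))
```

The key point the sketch illustrates: each site answers locally, and only the scored inferences travel back to be compared.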

Don’t just scale — summit.

As your business (and your data) grows, Kamiwaza grows with you. Our Docker-based, loosely coupled architecture means you scale without getting tangled in complexity. Just one API, SDK, and shared context to keep everything moving smoothly.

Whether you’re using our low-code UI or going full custom AI app — get the flexibility to move fast and stay in control.


Core requirements.

What are the core requirements?

To use Kamiwaza, you need the following:

  • Python 3.10 or later
  • Docker Engine with Compose (v2) and Docker Swarm
  • Node.js 22 (installed automatically via Node Version Manager (NVM))
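A quick preflight check for the requirements above can be scripted. The version floors (Python 3.10, Docker, Node.js) come from this page; the script itself is not part of Kamiwaza's installer.

```python
# Minimal preflight check against the requirements listed above.
import shutil
import sys

def python_ok(version=sys.version_info) -> bool:
    # Kamiwaza requires Python 3.10 or later.
    return tuple(version[:2]) >= (3, 10)

def tool_present(name: str) -> bool:
    # True if the binary (e.g. docker, node) is on PATH.
    return shutil.which(name) is not None

if __name__ == "__main__":
    print("Python >= 3.10:", python_ok())
    print("docker on PATH:", tool_present("docker"))
    print("node on PATH:", tool_present("node"))
```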

You can swap out packages with code — but by default, we use the following:

  • Milvus 
  • SentenceTransformers 
  • Datahub
  • Hugging Face
  • vLLM

For more information, review our Kamiwaza docs.

What operating systems does Kamiwaza support?

Kamiwaza can be used on Ubuntu 22.04 LTS (primary) and macOS 12.0 or later (community edition only).

What hardware and architectures can I use?

Kamiwaza is hardware-agnostic. This means you can use any hardware or architecture you’d like, including: 

  • CPU (including CPU-only environment)
  • GPU
  • Intel® Gaudi®
  • Intel® Xeon®
  • NVIDIA GPUs
  • AMD
  • Ampere
  • ARM64

What languages does Kamiwaza support?

Kamiwaza uses a REST API, so you can use any language: Python, C++, Java, PHP, TypeScript, C#, Bash, and so on.
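Because the interface is plain HTTP and JSON, any language with an HTTP client will do. Here is a sketch in Python using only the standard library; the endpoint path and JSON shape are hypothetical placeholders, so check the Kamiwaza docs for the real routes.

```python
# Building a REST request to a Kamiwaza deployment. The /api/inference
# route and the payload fields are illustrative assumptions, not the
# documented API.
import json
import urllib.request

def build_request(base_url: str, prompt: str) -> urllib.request.Request:
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/inference",  # hypothetical route
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://localhost:7777", "Summarize last quarter")
print(req.full_url, req.get_method())
```

The same request translates directly to curl, fetch in TypeScript, or HttpClient in Java.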

Integration.

How can I purchase and integrate Kamiwaza?

You can get started with Kamiwaza directly through us, or through your cloud provider’s marketplace (Azure, Google Cloud, and Amazon Web Services (AWS)). Kamiwaza can be used in our low-code UI or in your custom AI app via our REST API.

Detailed installation steps are available in our docs.

How does Kamiwaza work with LLMs?

Kamiwaza works with large language models (LLMs) by using Hugging Face to dynamically fetch, load, and manage any open-source or custom model you choose. Because Kamiwaza integrates with Hugging Face, you can discover models there just like any other source — then use standardized APIs to pull in models (including Qwen or Llama) on demand.

Once loaded into Kamiwaza’s orchestration engine, LLMs can be combined, version-controlled, and scaled across environments. So you have the flexibility to pick the fastest, most accurate, or most private model for each task — without getting locked into a single provider.
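The per-task model selection described above might look like a simple routing table. The model repo ids below are real Hugging Face identifiers (plus one hypothetical on-prem entry), but the routing criteria and table are illustrative, not Kamiwaza's actual policy.

```python
# Illustrative per-task model routing: pick the model that fits the
# task's priority without locking into a single provider.
CANDIDATES = {
    "fastest": "Qwen/Qwen2.5-0.5B-Instruct",
    "most_accurate": "meta-llama/Llama-3.1-70B-Instruct",
    "most_private": "local/finetuned-llama",  # hypothetical on-prem model
}

def pick_model(priority: str) -> str:
    # Returns the model id to load for the given priority.
    return CANDIDATES[priority]

print(pick_model("fastest"))
```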

What data sources can Kamiwaza integrate with?

Our inference mesh can search the following data sources:
  • Apache Spark
  • Snowflake
  • Databricks
  • Dell
  • SAP
  • Salesforce
  • Delta Lake
  • Datahub
  • Oracle Cloud Infrastructure 
  • Amazon S3
  • Amazon Athena
  • BigQuery

Deployment and security.

Where does Kamiwaza deploy to?

Kamiwaza supports on-premise, cloud, hybrid, and edge AI deployment.

How does Kamiwaza support secure AI data processing?

Kamiwaza has direct access to your enterprise’s data, whether that’s on-premise in a data center or in the cloud. But we never transfer or move your data across the internet — even as our AI-powered inference engine parses through databases to find information. Only inferences (answers) are exchanged and compared, keeping your actual data safe and secure.
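The "only inferences leave the site" idea can be illustrated in a few lines: the raw records stay in local scope, and only the derived answer string is returned for exchange. Function and field names here are illustrative, not part of Kamiwaza's API.

```python
# Hypothetical sketch: compute over local data, expose only the inference.
def answer_locally(records: list[dict], question: str) -> str:
    # The raw rows never leave this function's scope.
    total = sum(r["amount"] for r in records)
    # Only the derived answer string is shared across sites.
    return f"Total spend: ${total}"

rows = [{"amount": 120}, {"amount": 80}]
print(answer_locally(rows, "What is total spend?"))
```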

What security protocols does Kamiwaza follow?

Kamiwaza fits into your existing DevSecOps processes. We follow the security and single sign-on (SSO) protocols set by your organization, including SAML, OAuth, and OpenID Connect. These protocols apply to the sharing of inferences, too.

Our technology partners.

GET IN TOUCH

The horizon’s bright. Let’s talk about what’s ahead.

Reach out to schedule a demo, complete an architecture review, and get started.