
Simple to Integrate

How It Works

Add trustworthy reasoning to your AI in minutes. Choose your integration path: REST API, MCP for AI agents, or native SDKs. Deploy anywhere.

Get Started in Minutes

3 simple steps

From zero to production-ready reasoning in your application

1. Define Your Knowledge

Model your domain using our intuitive SDK or let AI auto-extract ontologies from your documents.

// Define domain concepts
reasoning.define({
  customer: {
    risk_score: "0..100",
    segment: ["retail", "pro"]
  }
})
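
The auto-extraction path works the same way; a minimal sketch, assuming a hypothetical extractOntology helper (the name and return shape are not the published API):

// Hypothetical: let AI pull concepts out of your documents
// (extractOntology and ontology.concepts are assumptions, for illustration only)
const ontology = reasoning.extractOntology({
  documents: ["./docs/credit-policy.pdf"]
})

// Review the extracted concepts, then register them like the manual define() above
reasoning.define(ontology.concepts)
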
2. Add Your Rules

Write rules in code, or describe them in natural language and let AI generate them.

// Option 1: Code

reasoning.rule({
  when: "risk_score > 80",
  then: "require_review"
})

// Option 2: Natural language

reasoning.fromText(`
  Flag if risk > 80
  or customer is new
`)
3. Query & Get Proofs

Every response includes a complete audit trail. Deterministic, explainable, compliant.

// Get auditable results
const result = reasoning.query({
  customer: { id: "C-123" }
})
// result.proof has full trace
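
What consuming that trace might look like; only result.proof comes from the snippet above, the per-step fields are assumptions:

// Hypothetical walk over the proof trace (step.rule / step.facts are assumed field names)
for (const step of result.proof) {
  console.log(step.rule)   // e.g. "risk_score > 80 => require_review"
  console.log(step.facts)  // the inputs that made the rule fire
}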

Architecture

LLMs for interface. Pure logic for inference.

The best of both worlds: use LLMs for natural language understanding while keeping reasoning deterministic, auditable, and hallucination-free.

🏢 Your App (any language) → 🧠 ReasoningLayer (deterministic) → Auditable Results (with proof traces)

Optional: add an LLM layer for natural language interface & auto-ontology extraction.
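
One way to picture that split, as a sketch rather than the published API: the LLM only translates free text into a structured query, and the deterministic engine is the only component that answers it (llm.parseQuery below is a placeholder name):

// LLM for the interface: turn language into a structured query (parseQuery is hypothetical)
const structured = await llm.parseQuery("Should we review customer C-123?")
// e.g. { customer: { id: "C-123" } }

// Pure logic for the inference: deterministic, auditable, no tokens spent here
const result = reasoning.query(structured)
// result.proof explains the decision; the LLM never produces the answer itself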

  • < 50ms response time
  • 🔒 100% deterministic: same input = same output
  • 📋 Full audit trail: every decision explained
  • 💰 No token costs: pure logic, unlimited queries
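
The determinism claim is easy to state in code: identical input yields an identical result and an identical proof. A minimal sketch reusing the query from step 3:

// Same input = same output: re-running the query gives an identical result
const a = reasoning.query({ customer: { id: "C-123" } })
const b = reasoning.query({ customer: { id: "C-123" } })

console.assert(JSON.stringify(a) === JSON.stringify(b))  // holds for the value and the proof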

Flexible Deployment

Deploy anywhere

Cloud, on-premise, edge devices, or embedded. You choose where your data lives.

Fastest Start

Managed Cloud

Start in seconds. EU-hosted, GDPR compliant, SOC 2 certified. We handle scaling and updates.

  • No infrastructure to manage
  • 99.9% SLA
  • Free tier available

Self-Hosted

Full control. Deploy on your infrastructure with Docker or Kubernetes. Air-gapped support.

  • Data never leaves your network
  • Docker & Kubernetes ready
  • Air-gapped environments

Edge & Embedded

Run on IoT devices, mobile apps, or browsers. Native Rust core runs anywhere.

  • Raspberry Pi to smartphones
  • WASM for the browser (see the sketch after this list)
  • Fully offline capable
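
A sketch of the browser path, assuming a WASM build published as an npm package; the package name, init() call, and Reasoner class are placeholders, not a published artifact:

// Hypothetical browser usage of the WASM build (all names are placeholders)
import init, { Reasoner } from "@reasoninglayer/wasm"

await init()                      // fetches and instantiates the .wasm binary
const reasoning = new Reasoner()  // same query surface as the SDK examples above
const result = reasoning.query({ customer: { id: "C-123" } })  // runs fully offline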

Integration

Choose your path

REST API for any language. MCP for AI agents. Native SDKs for the best developer experience.

SDK Integration

TypeScript, Python, Rust SDKs available

// Define a medical diagnosis query
reasoning.query({
  patient: {
    symptoms: ["fever", "cough", "fatigue"],
    history: ["diabetes"],
    vitals: { temp: 38.5, bp: "120/80" }
  },
  confidence_threshold: 0.85,
  explain: true
})

REST API

Direct integration from any language

POST
/v1/infer
curl -X POST https://api.reasoninglayer.ai/v1/infer \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "tenant": "acme-corp",
    "namespace": "medical",
    "query": {
      "type": "diagnosis",
      "patient": {
        "symptoms": ["fever", "cough"],
        "age": 45
      }
    },
    "options": {
      "explain": true,
      "max_depth": 10
    }
  }'
OpenAPI 3.0 · JSON responses · < 50ms latency · Webhook support
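
The same call from TypeScript without an SDK, mirroring the curl request above; the response shape hinted at in the last comment is an assumption:

// Direct REST call (no SDK); the body mirrors the curl example above
const res = await fetch("https://api.reasoninglayer.ai/v1/infer", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    tenant: "acme-corp",
    namespace: "medical",
    query: { type: "diagnosis", patient: { symptoms: ["fever", "cough"], age: 45 } },
    options: { explain: true, max_depth: 10 }
  })
})
const result = await res.json()  // e.g. result.proof, if the API mirrors the SDK (assumption)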

MCP Protocol

Native integration with AI agents

MCP (Model Context Protocol)
// MCP Tool auto-generated from ontology
{
  "name": "reasoning_query",
  "description": "Execute semantic reasoning
    over the knowledge base",
  "inputSchema": {
    "type": "object",
    "properties": {
      "domain": {
        "type": "string",
        "enum": ["medical", "legal", "finance"]
      },
      "query": { "type": "string" },
      "explain": { "type": "boolean" }
    }
  }
}
Claude compatible · GPT compatible · Auto-generated tools · Streaming
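
For reference, invoking that tool goes through the standard MCP tools/call message; the argument values here are illustrative only:

// JSON-RPC request an MCP client sends to call the tool above (argument values illustrative)
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "reasoning_query",
    arguments: { domain: "medical", query: "fever and cough with a history of diabetes", explain: true }
  }
}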

Ready to add reasoning?

Join the waitlist to get early access and start building trustworthy AI systems.

Explore Use Cases