Introduction

Skyflo is a self-hosted AI operations agent for Kubernetes and CI/CD. It converts natural language into typed, auditable tool execution inside your cluster. Every mutating operation requires explicit approval before execution. Post-action verification checks that the outcome matches your intent.

It is not a CLI wrapper, not autonomous, and not a GitOps control plane. Skyflo is an in-cluster execution runtime. You describe what you want; the agent plans, proposes, and executes only after you approve.

How It Works

Skyflo follows a deterministic execution model:

Plan: The agent analyzes your intent, discovers resources, and generates a concrete, replayable plan.

Approve: Every mutating tool call requires explicit approval. Read operations flow freely. The gate is enforced by the engine, not a UI toggle.

Execute: Typed tools run via MCP (Kubernetes, Helm, Argo Rollouts, Jenkins) with schema-validated inputs. Execution happens inside the MCP server container, and every call is recorded in the audit trail.

Verify: The agent validates that the outcome matches the original intent. If the observed state does not match, it flags the discrepancy and suggests remediation.
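
The four steps above can be sketched as a single control loop. This is an illustrative sketch only; the names `ToolCall` and `run_plan` are hypothetical and do not appear in Skyflo's actual codebase. The key property it demonstrates is that the approval gate sits in the loop itself, so a mutating call can never reach the executor without passing it:

```python
from dataclasses import dataclass

@dataclass
class ToolCall:
    # Hypothetical shape of one planned step; field names are illustrative.
    name: str
    args: dict
    mutating: bool

def run_plan(plan, approve, execute, verify):
    """Drive Plan -> Approve -> Execute -> Verify for each step.

    Read-only calls flow freely; every mutating call must pass the
    approval gate before it reaches the executor. After execution,
    the outcome is checked against intent and drift is flagged.
    """
    results = []
    for call in plan:
        if call.mutating and not approve(call):
            results.append((call.name, "rejected"))
            continue
        outcome = execute(call)
        status = "ok" if verify(call, outcome) else "drift"
        results.append((call.name, status))
    return results
```

Because the gate is part of the loop rather than a UI toggle, rejecting approval short-circuits the step entirely, and the resulting `(name, status)` trail is what makes the run replayable and auditable.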

Supported Tools

| Tool | Category | Count | Description |
| --- | --- | --- | --- |
| Kubernetes | Cluster | 22 tools | Discovery, logs, exec, apply, diff, scale, rollout |
| Helm | Package | 16 tools | Search, install, upgrade, rollback, list, status |
| Argo Rollouts | Progressive Delivery | 13 tools | Pause, resume, promote, abort, retry |
| Jenkins | CI/CD | 13 tools | Jobs, builds, logs, SCM, identity |
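
To make "typed, schema-validated inputs" concrete, here is a minimal sketch of what validation for one tool's input might look like. The tool name and field set are assumptions for illustration, not Skyflo's actual schema; the point is that malformed or out-of-range arguments are rejected before any execution happens:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScaleDeploymentInput:
    # Hypothetical input schema for a Kubernetes scale tool;
    # the field names here are illustrative, not Skyflo's real schema.
    namespace: str
    name: str
    replicas: int

    def __post_init__(self):
        # Validate at construction time, before the call can execute.
        for required in ("namespace", "name"):
            if not getattr(self, required):
                raise ValueError(f"{required} is required")
        if self.replicas < 0:
            raise ValueError("replicas must be non-negative")
```

Because inputs are structured values rather than shell strings, there is no command line to inject into: a bad value fails validation instead of reaching the cluster.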

Safety Properties

  • Approval gate: Every mutating tool call requires explicit approval. Approval enforcement is implemented in the Engine runtime and cannot be disabled through configuration.
  • Typed tool execution: Schema-validated inputs via MCP. No raw shell injection.
  • Persisted audit trail: Every operation logged. Who asked, what was planned, who approved, what executed.
  • Replayable control loop: Plans are deterministic and replayable.
  • Runs inside your cluster: No Skyflo telemetry or phone-home. LLM calls go only to the provider you configure.
  • LLM-agnostic: OpenAI, Anthropic, Gemini, Groq, Mistral, or self-hosted via Ollama.
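
The shape of a persisted audit entry can be sketched as follows. The field names are assumptions for illustration, not Skyflo's actual log format; what matters is that each entry captures all four facts the doc lists: who asked, what was planned, who approved, and what executed:

```python
import json
from datetime import datetime, timezone

def audit_record(requested_by, planned, approved_by, executed):
    """Build one append-only audit entry as a JSON line.

    Hypothetical format: each record ties a request to its plan,
    its approver, and the execution result, with a UTC timestamp.
    """
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "requested_by": requested_by,
        "planned": planned,
        "approved_by": approved_by,
        "executed": executed,
    })
```

Emitting one self-contained JSON line per operation keeps the trail queryable and replayable without depending on any external logging system.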

Next Steps