Every developer has a graveyard of useful commands. They’re buried in .bash_history, scattered across Notion pages, pinned in Slack threads, or, most commonly, simply forgotten. You wrote a perfect find one-liner three weeks ago. You’ll spend ten minutes reconstructing it tomorrow.

This is the problem zipet was built to solve. Not with another note-taking app or another dotfiles repo, but with a tool that treats your commands as what they actually are: reusable, composable, shareable software.

The path every command takes

There’s a natural lifecycle that useful commands follow, and no existing tool respects the full arc:

  1. You write a command. It works.
  2. You want to reuse it, so you save it somewhere.
  3. You want it to adapt, so you parameterize it.
  4. You want to combine it with others, so you chain them into a workflow.
  5. You want your team to have it, so you share it.
  6. You want your AI agent to use it, so you expose it through a machine-readable interface.

Most tools stop at step 2. Maybe step 3 if you’re lucky. zipet was designed around the full progression — from a one-liner you typed five seconds ago to a portable, version-controlled automation layer that works from the terminal, a TUI, or an AI coding agent.

Why Zig

This wasn’t an arbitrary language choice. zipet is a CLI tool, and CLI tools live or die by two things: startup time and binary portability.

Zig gives us both. The binary is statically compiled, has zero runtime dependencies, and starts instantly. There’s no interpreter warming up, no garbage collector pausing, no dynamic linker resolving symbols. You type zipet and the TUI is already on screen.

But Zig’s real advantage for this project is libvaxis, a terminal UI library built specifically for Zig that gives zipet a proper visual interface without pulling in ncurses or any system dependency. The TUI uses vim-native keybindings (j/k, gg/G, / for search, :q to quit), which means if you already live in the terminal, zipet feels like home on the first launch.

The cross-compilation story matters too. From a single codebase, zipet builds for Linux (x86_64, aarch64), macOS (Intel and Apple Silicon), and Windows — all as static binaries. curl | bash and you’re done. No package manager, no version conflicts, no “works on my machine.”

TOML as the source of truth

Everything in zipet is a TOML file. Your snippets, your workflows, your workspace configs — all human-readable, all editable in your $EDITOR, all version-controllable with git.

This is a deliberate architectural choice. When your automation layer is stored in a format you can read, diff, and review, it stops being a black box. You can PR your team’s snippet library. You can audit what an AI agent has access to. You can copy a .toml file into a new machine and be productive in seconds.

[snippets.find-large]
desc = "Find large files"
tags = ["system", "find"]
cmd = "find {{path}} -type f -size +{{size}} -exec ls -lh {} \\;"

[snippets.find-large.params]
path = { prompt = "Search path", default = "." }
size = { prompt = "Minimum size", default = "100M" }
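For reference, with both defaults filled in ({{path}} → "." and {{size}} → "100M"), the snippet above expands to a plain, read-only find invocation:

```shell
# The find-large snippet after parameter substitution.
# Read-only: it only lists matching files, nothing is modified.
find . -type f -size +100M -exec ls -lh {} \;
```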

No YAML indentation headaches. No JSON without comments. TOML is readable, writable, and boring in exactly the right way.

The MCP layer: why it actually matters

Here’s where zipet diverges from every other snippet manager.

The rise of AI coding agents (Claude Code, Cursor, Windsurf, and what’s coming next) introduces a new problem: these agents are excellent at understanding intent but unreliable at producing commands. They hallucinate flags that don’t exist, invent paths that aren’t there, and confidently generate destructive operations.

zipet’s built-in MCP server flips this dynamic. Instead of the agent inventing commands from its training data, it searches your verified, tested snippet library. Your commands become a semantic firewall: the agent can search, preview, and execute only what you’ve already approved.

Agent: "I need to clean up Docker resources"

zipet MCP: searches your snippets → finds "docker-prune"

Preview: "docker system prune -af --volumes"

Safety gate: you approve → it runs

The safety modes (confirm, allowlist, dry-run, open) give you granular control over what the agent can do. In production, you run confirm mode: the agent proposes, you approve. In a sandboxed dev environment, you might open it up. The point is: you decide.
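As a sketch of what that control might look like in config form — the key names below are illustrative assumptions extrapolated from the snippet schema shown earlier, not zipet’s documented format; only the four mode names come from the text:

```toml
# Illustrative only: [mcp], "mode", and "allowlist" are assumed keys,
# shown to make the safety-mode idea concrete.
[mcp]
mode = "confirm"   # one of: confirm, allowlist, dry-run, open

# In allowlist mode, the agent could execute only these snippets:
allowlist = ["docker-prune", "find-large"]
```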

This isn’t theoretical. As AI agents become the primary interface for development workflows, the snippet library stops being a convenience tool and becomes infrastructure. It’s the curated set of operations your agent is allowed to perform — version-controlled, reviewable, and shared across your team.

Composability as a design principle

zipet’s feature set follows a deliberate composability curve:

  • Snippets are the atoms. Single commands, parameterized, tagged.
  • Workflows are molecules. Snippets chained together with data passing ({{prev_stdout}}), error handling, and shared parameters.
  • Packs are packages. Curated collections of snippets and workflows, installable from a registry, a URL, or a local file.
  • Workspaces are contexts. Isolated snippet collections per project that auto-activate when you cd into a directory.

Each layer builds on the one below it without replacing it. You can use zipet for years and never touch workflows. Or you can build a full CI/CD pipeline out of chained snippets and share it as a pack for your team. The tool scales with your needs.
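To make the molecule layer concrete, here is a hypothetical workflow definition in the same TOML style as the snippet shown earlier. The step schema and key names are assumptions for illustration; only the {{prev_stdout}} placeholder is taken from the text:

```toml
# Hypothetical schema: [[...steps]] and its keys are assumptions,
# shown only to illustrate chaining and data passing.
[workflows.disk-report]
desc = "Find large files, then save the list"

[[workflows.disk-report.steps]]
snippet = "find-large"   # reuses the snippet defined earlier

[[workflows.disk-report.steps]]
# {{prev_stdout}} carries the previous step's output, as described above
cmd = "printf '%s\\n' '{{prev_stdout}}' > large-files.txt"
```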

Who this is for

zipet isn’t trying to be everything. It’s for developers who:

  • Live in the terminal and are tired of losing useful commands
  • Want automation that’s transparent and auditable, not hidden behind a GUI
  • Work with AI agents and want to control what those agents can execute
  • Believe that good tools should be fast, simple, and portable

If you’ve ever spent time reconstructing a command you know you’ve written before, or if you’ve watched an AI agent confidently run the wrong thing, zipet was built for those moments.

What’s next

The foundation is set: snippets, workflows, packs, workspaces, MCP. The roadmap extends this in the directions that matter most:

  • Team registries — private pack registries for organizations, so your team’s operational knowledge is installable, not tribal.
  • Richer AI integration — context-aware snippet suggestions based on your current project, git state, and running services.
  • Plugin system — extend zipet with custom parameter resolvers, output formatters, and execution backends.

The core philosophy won’t change: everything stays as readable TOML files, the binary stays static and dependency-free, and your commands stay yours.


zipet is open source and available on GitHub. Install it with one command:

curl -sSL https://raw.githubusercontent.com/Luisgarcav/zipet/main/scripts/install.sh | bash

Then:

zipet init && zipet

Your commands deserve better than .bash_history.