VEXIS-CLI 2

AI Agent CLI

Overview

VEXIS-CLI 2 is an AI agent CLI that takes a natural-language instruction and then generates, validates, executes, and verifies the optimal CLI command. It is a major evolution of the previous version, most notably with greatly expanded AI provider support.

Main Updates

16+ AI Providers Supported

Ollama (local), Groq, Google Gemini, OpenAI, Anthropic, xAI, Mistral, Azure OpenAI, AWS Bedrock, and more. Easily switch between your preferred providers.
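Switching between that many backends usually means hiding them behind a common interface. The sketch below is purely illustrative of that pattern; the class and function names are my assumptions, not VEXIS-CLI 2's actual API.

```python
# Hypothetical sketch of a provider-switching layer. Names are
# illustrative and NOT VEXIS-CLI 2's real internals.
from abc import ABC, abstractmethod


class Provider(ABC):
    """Common interface every AI backend must satisfy."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OllamaProvider(Provider):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the local Ollama server here.
        return f"[ollama] response to: {prompt}"


class OpenAIProvider(Provider):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the OpenAI API here.
        return f"[openai] response to: {prompt}"


# Registry keyed by the name a user would put in config.yaml.
PROVIDERS: dict[str, type[Provider]] = {
    "ollama": OllamaProvider,
    "openai": OpenAIProvider,
}


def get_provider(name: str) -> Provider:
    """Look up and instantiate the configured provider."""
    try:
        return PROVIDERS[name]()
    except KeyError:
        raise ValueError(f"Unknown provider: {name!r}")
```

With this shape, adding a new backend is one class plus one registry entry, and the rest of the pipeline never needs to know which provider is active.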

5-Phase Execution Engine

Natural Language Understanding -> Command Planning -> Safety Verification -> Execution & Auto-Recovery -> Result Verification

The tool does not stop at command generation; it takes end-to-end responsibility for the entire process.
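The five phases above can be sketched as a simple pipeline. This is a minimal stub to show the control flow only; every function name here is an assumption for illustration, not VEXIS-CLI 2's actual code.

```python
# Hypothetical sketch of the 5-phase flow. All names are illustrative;
# each phase is stubbed rather than calling a real AI provider.
from dataclasses import dataclass


@dataclass
class Result:
    command: str
    output: str
    verified: bool


def understand(instruction: str) -> str:
    # Phase 1: Natural Language Understanding (stubbed).
    return instruction.lower().strip()


def plan(intent: str) -> str:
    # Phase 2: Command Planning (stubbed).
    return f"echo '{intent}'"


def is_safe(command: str) -> bool:
    # Phase 3: Safety Verification (toy check).
    return "rm -rf /" not in command


def execute(command: str) -> str:
    # Phase 4: Execution & Auto-Recovery (stubbed; no retries here).
    return f"ran: {command}"


def verify(output: str) -> bool:
    # Phase 5: Result Verification (toy check).
    return output.startswith("ran:")


def run(instruction: str) -> Result:
    intent = understand(instruction)
    command = plan(intent)
    if not is_safe(command):
        raise PermissionError(f"Blocked unsafe command: {command}")
    output = execute(command)
    return Result(command, output, verify(output))
```

The point of the structure is that safety verification sits between planning and execution, so nothing the model proposes runs unchecked.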

Enhanced Safety Mechanisms

safety_mode, dry-run, command validation, blacklist/whitelist filtering, and protection of critical system paths are all standard features. When an error occurs, detailed guidance is displayed.
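A blacklist plus protected-path check of this kind can be sketched as follows. The specific patterns and paths below are assumptions for illustration, not VEXIS-CLI 2's actual rules.

```python
# Hypothetical sketch of command validation. The blacklist entries and
# protected paths are illustrative, NOT VEXIS-CLI 2's real rule set.
import shlex

BLACKLIST = {"mkfs", "dd"}          # commands rejected outright
CRITICAL_PATHS = ("/etc", "/boot")  # paths no argument may target


def validate(command: str) -> tuple[bool, str]:
    """Return (allowed, reason); reason explains any rejection."""
    tokens = shlex.split(command)
    if not tokens:
        return False, "empty command"
    if tokens[0] in BLACKLIST:
        return False, f"{tokens[0]} is blacklisted"
    for token in tokens[1:]:
        if token.startswith(CRITICAL_PATHS):
            return False, f"{token} touches a protected system path"
    return True, "ok"
```

Returning a reason string alongside the verdict is what makes the "detailed guidance on error" behavior possible: the rejection message can be shown to the user as-is.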

Yellow Selection Menu

An intuitive UI for selecting from multiple candidates.

Developer Quality

Black/isort/flake8/mypy compliant, pytest testing, and comprehensive documentation.

Quick Start

git clone https://github.com/vexis-project/VEXIS-CLI-2.git
cd VEXIS-CLI-2

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
pip install -e .

cp config.yaml.example config.yaml

Ollama (local) is the recommended way to try it out:

ollama serve
ollama pull gemma3:4b

Then set preferred_provider: "ollama" in config.yaml and run.
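For reference, the relevant fragment of config.yaml would look like this. Only the preferred_provider key is confirmed by this article; see CONFIGURATION.md in the repository for the full schema.

```yaml
# Minimal fragment: select Ollama as the active backend.
preferred_provider: "ollama"
```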

Usage Examples

python run.py "Compress the latest log file with gzip and move to backup folder"
python run.py "Safely remove unused imports from Python files in the project"

For detailed documentation, see ARCHITECTURE.md / CONFIGURATION.md / OLLAMA_INTEGRATION.md in the repository.

GitHub: https://github.com/vexis-project/VEXIS-CLI-2

Please try it out! Feedback and Pull Requests are welcome.
