Documentation
Complete guide to the DA Toolkit: 6 core and 6 DA commands for Claude Code, covering everything from data analysis to reports and dashboards.
Getting Started
Install and setup in 2 minutes.
Installation
Default Setup (Core Only)
Creates CLAUDE.md with 6 core commands. Suitable for any project.
your-project/
├── CLAUDE.md
└── .claude/commands/
DA Setup (Full Toolkit)
Recommended. DA toolkit with 6 DA commands only, data directories, and agent templates.
your-project/
├── CLAUDE.md
├── data/raw/              # Raw data files
├── data/processed/        # Cleaned data
├── analysis/              # Analysis outputs
├── reports/               # Generated reports
├── notebooks/             # Jupyter notebooks
└── .claude/commands/da/   # 6 DA commands only
Workflow
Choose the right setup: Core for general development, DA for data analysis
Knowledge Management
Commands automatically update CLAUDE.md with learned knowledge.
Continuous Learning
Each command automatically updates knowledge to CLAUDE.md when completed. This helps Claude "remember" what it learned across sessions.
Knowledge Sections in CLAUDE.md
## Project Knowledge
## Data Sources
## Current Plans
## Project Files
## Data Quality
## Documentation
How it works
1. Run a command, e.g. /analyze "sales data"
2. The command executes and delivers results
3. The command auto-updates CLAUDE.md with learned knowledge (data sources, insights, patterns...)
4. Next time you ask, Claude has full context from CLAUDE.md
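As a concrete (entirely hypothetical) illustration, an auto-updated knowledge section in CLAUDE.md might end up looking like this after a few commands have run; the file names and figures below are invented for the example:

```markdown
## Data Sources
- data/raw/sales_2024.csv — order_id, date, amount, region

## Data Quality
- sales_2024.csv: missing values in `region`; dates are ISO 8601

## Current Plans
- Q4 sales report: cleaning done, visualizations pending
```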
Core Commands
6 commands for general development. Install with tramy setup
/analyze Explore and understand data, code, or problems
Deep dive into any topic - explore structure, patterns, and provide insights before taking action. This is the first step in any workflow.
/analyze "sales_2024.csv" Analyze file structure, columns, data types, missing values
/analyze "user authentication flow" Analyze current auth flow, security concerns
/plan Create detailed implementation plan
Break complex tasks into specific steps with clear deliverables. Helps you get an overview before starting work.
/plan "quarterly revenue report" 5-step plan with data sources, metrics, visualizations
/plan "migrate database to PostgreSQL" Migration plan with rollback strategy
/build Implement solution - write code, queries, notebooks
Execute the plan - write SQL, Python, create notebooks, build dashboards. This is the main implementation step.
/build "ETL pipeline for user data" Python scripts with data validation
/build "REST API for products" API endpoints with CRUD operations
/test Validate results and verify data quality
Check data quality, validate results, test edge cases. Ensure output is correct before delivery.
/test "check for null values and outliers" Data quality report with issues found
/test "validate API responses" Test results with pass/fail status
/doc Generate documentation and reports
Create documentation, reports, methodology notes, presentations. Help stakeholders understand your results.
/doc "create analysis summary" Executive summary with key metrics
/doc "API documentation" OpenAPI spec with examples
/commit Git commit with proper message
Analyze changes and create a descriptive commit message following conventions. The message is formatted automatically.
/commit feat: add user analytics dashboard with KPI metrics
DA Commands
6 specialized commands for Data Analysts. Install with tramy setup da
DA Setup = DA Commands Only
tramy setup da only installs 6 DA commands (no core commands). If you need both, run tramy setup first, then tramy setup da.
/da:query Write SQL queries from natural language
Convert plain English to SQL. Supports complex queries with JOINs, subqueries, window functions, and CTEs.
"top 10 customers by revenue last quarter"
SELECT with ORDER BY, LIMIT, date filtering
"month-over-month growth rate"
Window functions with LAG()
"products that never sold"
LEFT JOIN with NULL check
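To illustrate the kind of SQL /da:query produces for the "month-over-month growth rate" request above, here is a runnable sketch using Python's built-in sqlite3 module and a toy revenue table; the table name, columns, and figures are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE monthly_revenue (month TEXT, revenue REAL);
INSERT INTO monthly_revenue VALUES
  ('2024-01', 100.0), ('2024-02', 120.0), ('2024-03', 90.0);
""")

# Month-over-month growth rate via the LAG() window function:
# compare each month's revenue with the previous month's.
rows = conn.execute("""
    SELECT month,
           revenue,
           ROUND((revenue - LAG(revenue) OVER (ORDER BY month))
                 * 100.0 / LAG(revenue) OVER (ORDER BY month), 1) AS growth_pct
    FROM monthly_revenue
    ORDER BY month
""").fetchall()

for month, revenue, growth in rows:
    print(month, revenue, growth)  # first month has no prior month -> NULL
```

Window functions require SQLite 3.25+, which ships with current Python builds.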
/da:analyze Automatic Exploratory Data Analysis
Auto-detect the type of analysis needed (descriptive, diagnostic, predictive) and perform comprehensive EDA.
"analyze sales_data.csv"
Full EDA with distributions, correlations, outliers
"compare Q1 vs Q2 performance"
Comparative analysis with statistical tests
"find anomalies in transaction data"
Anomaly detection with IQR/Z-score
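The IQR method mentioned above can be sketched in pure Python (no pandas); the transaction amounts are made up, with one injected anomaly:

```python
from statistics import quantiles

def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = quantiles(values, n=4)  # quartile cut points
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

amounts = [12, 15, 14, 13, 16, 15, 14, 250]  # 250 is the injected anomaly
print(iqr_outliers(amounts))
```

A Z-score variant would flag values more than ~3 standard deviations from the mean instead; IQR is the more robust choice when the anomalies themselves distort the mean.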
/da:clean Clean and transform data
Transform raw data to processed data. Handle missing values, duplicates, outliers, and standardize formats.
"remove duplicates and nulls from sales.csv"
Python script with pandas cleaning
"standardize date formats in transactions"
Data transformation script
"handle outliers in revenue data"
IQR/Z-score filtering
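A pure-Python sketch of the kind of cleaning step /da:clean scaffolds — deduplication, dropping incomplete rows, and date standardization; the rows and the set of accepted date formats are invented for the example (real output would typically use pandas):

```python
from datetime import datetime

DATE_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y")  # assumed input formats

def standardize_date(raw):
    """Parse a date written in any known format and emit ISO 8601."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None  # unparseable -> treated as missing

def clean(rows):
    """Drop duplicates and rows with missing fields; normalize dates."""
    seen, cleaned = set(), []
    for row in rows:
        date = standardize_date(row.get("date", ""))
        amount = row.get("amount")
        if date is None or amount is None:
            continue  # drop incomplete rows
        key = (date, amount)
        if key in seen:
            continue  # drop duplicates
        seen.add(key)
        cleaned.append({"date": date, "amount": amount})
    return cleaned

raw = [
    {"date": "2024-01-05", "amount": 100},
    {"date": "05/01/2024", "amount": 100},   # duplicate after normalization
    {"date": "2024-01-06", "amount": None},  # missing amount -> dropped
]
print(clean(raw))
```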
/da:report Generate professional analysis reports
Create professional reports with executive summary, methodology, findings, and recommendations.
"weekly sales summary"
Markdown report with KPIs and trends
"customer segmentation analysis"
Full report with methodology and insights
"A/B test results"
Statistical report with confidence intervals
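The confidence-interval arithmetic behind an A/B test report can be sketched with the standard normal approximation to the binomial; the conversion counts below are made up, and a real report might use a library such as scipy or statsmodels instead:

```python
from math import sqrt

def diff_ci(conv_a, n_a, conv_b, n_b, z=1.96):
    """95% CI for the difference in conversion rates (B - A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Standard error of the difference between two proportions.
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    diff = p_b - p_a
    return diff - z * se, diff + z * se

low, high = diff_ci(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"lift: [{low:.3f}, {high:.3f}]")  # an interval excluding 0 suggests a real effect
```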
/da:dashboard Design BI dashboards
Design dashboard layouts, define metrics, generate code for Tableau, Metabase, or custom solutions.
"sales performance dashboard"
Layout + metrics + SQL for each widget
"real-time monitoring dashboard"
Time-series charts with alerts
"customer health scorecard"
Scorecard with color-coded metrics
/da:notebook Create Jupyter notebooks for analysis
Generate complete Jupyter notebooks with data loading, cleaning, analysis, and visualization cells.
"cohort analysis notebook"
Python notebook with retention curves
"regression analysis"
Notebook with model training + evaluation
"data profiling template"
Reusable profiling notebook
Tutorials
Detailed guides for common use cases.
Sales Data Analysis
From raw data to executive report (DA Setup)
tramy setup da
Place file in data/raw/
/da:analyze "explore sales_2024.csv"
/da:clean "remove nulls and duplicates"
/da:query "top 10 products by revenue"
/da:report "Q4 sales performance summary"
Create KPI Dashboard
Design executive dashboard (DA Setup)
tramy setup da
/da:dashboard "revenue, users, churn metrics"
/da:query "daily active users last 30 days"
/da:query "MRR breakdown by plan"
/da:query "churn rate by cohort"
/da:report "dashboard documentation"
Data Cleaning Pipeline
Raw → Clean → Analysis (DA Setup)
tramy setup da
/da:analyze "profile raw data"
/da:clean "standardize dates, handle nulls"
/da:query "validate cleaned data"
/da:notebook "data cleaning documentation"
DA Workflow
Data analysis workflow
/da:analyze → /da:clean → /da:query → /da:report → /da:dashboard → /da:notebook
CLI Reference
All available CLI commands.
tramy setup Default setup with 6 core commands
Creates CLAUDE.md and command templates
tramy setup da DA toolkit setup
6 DA commands only + data directories
tramy setup -y Quick setup without prompts
Uses defaults and skips interactive mode
tramy list List roles and commands
Shows available roles and command counts
tramy context View project context
Shows CLAUDE.md content
tramy context update Update context
Re-scan tech stack and regenerate CLAUDE.md
tramy doctor Health check
Verify installation and diagnose issues