# Skills

The ChapterTwo CLI bundles AI skills — structured knowledge files that teach Claude Code how to use the platform. Skills are installed to ~/.claude/skills/ and automatically loaded by Claude Code when relevant.

## Installation

```bash
npm install -g @chapter-two-music/cli
c2 skill install
```

Run `c2 skill list` to see available skills and their install status. Use `c2 skill install c2` to install a specific skill.

## c2

Manage rightsholders, distributors, statements, and query royalty earnings via the Chapter Two c2 CLI or MCP tools. Use when user says "show my earnings", "list rightsholders", "upload a statement", "top tracks by revenue", "earnings by source", "apply override", or any Chapter Two API call. Do NOT use for modifying the c2 CLI source code or API internals.

### Files

| File | Description |
|------|-------------|
| SKILL.md | Main skill definition |
| references/analytics_queries.md | Reference document |
| references/upload_workflow.md | Reference document |
| scripts/c2-upload.sh | Helper script |

### SKILL.md

````markdown
---
name: c2
description: >
  Manage rightsholders, distributors, statements, and query royalty earnings
  via the Chapter Two c2 CLI or MCP tools. Use when user says "show my
  earnings", "list rightsholders", "upload a statement", "top tracks by
  revenue", "earnings by source", "apply override", or any Chapter Two API
  call. Do NOT use for modifying the c2 CLI source code or API internals.
---

# ChapterTwo Platform

## Critical Rules

- NEVER guess endpoint names or input shapes. Run `c2 describe <endpoint>` first.
- NEVER hardcode Cube dimension/measure names. Run `c2 api cube.meta --raw` first.
- ALWAYS check auth before any operation: `c2 auth status`
- ALWAYS filter analytics queries by `rightsholder_id` and set a limit.
- Organization scoping is auto-injected from JWT. Never include `organization_id` in filters.

For analytics/earnings questions, follow the procedure in [analytics_queries.md](references/analytics_queries.md).
For uploading statements, follow the procedure in [upload_workflow.md](references/upload_workflow.md).

## Authentication

```bash
c2 auth login   # OAuth via browser
c2 auth status  # Check token validity
c2 auth logout  # Clear credentials
```

Tokens: scoped to one org, ~30 min. Stored in `~/.c2/credentials-prod.json`.

## API Calls

```bash
c2 api <endpoint> [-i '<json>'] [--raw]
```

Shorthand: `c2 rightsholder.list` = `c2 api rightsholder.list`. Query/mutation auto-detected.
Destructive actions prompt for confirmation (bypass with `-y`).

### Pagination

Endpoints ending in `.list` support pagination. If the response payload is too big, page through the results:

```bash
c2 api rightsholder.list -i '{"pagination":{"limit":10, "page": 0}}'
```

### Endpoint Discovery

```bash
c2 routes --groups      # All endpoints grouped by namespace
c2 routes -g <pattern>  # Filter
c2 describe <endpoint>  # Input fields, auth, example
```

### Entity Examples

```bash
c2 api rightsholder.list
c2 api rightsholder.create -i '{"name":"Artist Name"}'
c2 api rightsholder.getById -i '{"id":"<uuid>"}'
c2 api rightsholder.updateById -i '{"id":"<uuid>", "name": "Artist Name"}'
```

## Statement Upload (Quick Reference)

Use the upload script for batch uploads:

```bash
bash scripts/c2-upload.sh <rightsholder-id> <file-or-directory>...
```

For the full step-by-step procedure (manual or programmatic), see [upload_workflow.md](references/upload_workflow.md).

### Overrides

Works on ANY status including COMPLETE (triggers reprocessing):

```bash
c2 api upload.uploadAction -i '{"action":{"type":"add-statement-overrides","upload_id":"<id>","statement_id":"<id>","overrides":{"stmtYear":{"mapping_value":"2024"}}}}' -y --raw
```

Override fields: `currency`, `country`, `stmtYear`, `stmtPeriod`, `stmtCadence`, `typeOfRight`, `source`, `salesType`. Each takes `{"mapping_value":"<value>"}`.

Other actions: `select-distributor`, `file-skip`, `redrive`, `ignore-classification`, `human-input` (WAIT_FOR_HUMAN_INPUT only). See `c2 describe upload.uploadAction`.

## MCP Tools

Baltazar MCP exposes the same actions with `c2_` prefix:

| MCP Tool | CLI Equivalent |
|----------|----------------|
| `c2_api_call` | `c2 api <endpoint>` |
| `c2_query_analytics` | `c2 query` |
| `c2_cube_meta` | `c2 api cube.meta` |
| `c2_list_endpoints` | `c2 routes` |
| `c2_describe_endpoint` | `c2 describe` |
| `c2_whoami` | `c2 whoami` |
| `c2_health` | `c2 health` |
| `c2_get_permissions` | (user capabilities) |

## Errors

| Error | Fix |
|-------|-----|
| Not authenticated / expired | `c2 auth login` |
| 403 Forbidden | User needs admin role |
| 400 Bad Request | Run `c2 describe <endpoint>` to check input schema |
| Cube timeout | Add `--year` filter, remove `tm_period`, split multi-RH queries |
| S3 upload returns non-204 | Check curl field construction -- use Python subprocess, not shell interpolation (AWS tokens have special chars) |
| Status poll returns 0 items | Ensure `organizationId` is included (required field) |
````
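The pagination convention above (zero-indexed `page`, explicit `limit`) can be wrapped in a small driver when a `.list` endpoint must be walked to exhaustion. The following is a minimal sketch, assuming a short page signals the last page (the API may instead expose a total count); `fetch_page` is a hypothetical callback that performs the actual `c2 api ... --raw` call:

```python
import json
from typing import Callable, List


def pagination_input(page: int, limit: int = 10) -> str:
    """Build the -i JSON payload for one page of a .list call."""
    return json.dumps({"pagination": {"limit": limit, "page": page}})


def fetch_all(fetch_page: Callable[[str], List], limit: int = 10) -> List:
    """Collect every item from a paginated .list endpoint.

    fetch_page receives the -i JSON string for one page and returns that
    page's items. Stops when a page comes back shorter than the limit.
    """
    items, page = [], 0
    while True:
        batch = fetch_page(pagination_input(page, limit))
        items.extend(batch)
        if len(batch) < limit:
            return items
        page += 1
```

In practice, `fetch_page` would shell out with something like `subprocess.run(['c2', 'api', 'rightsholder.list', '-i', payload, '--raw'], ...)` and return the parsed items.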

### references/analytics_queries.md

````markdown
# Analytics Queries

## Workflow

Follow these steps in order for every analytics question.

### Step 1: Verify Auth

```bash
c2 auth status
c2 whoami  # Confirm correct organization
```

If expired: `c2 auth login`

### Step 2: Get Rightsholder IDs

```bash
c2 api rightsholder.list --raw
```

If user didn't specify a rightsholder, ask them to pick one. Rightsholder filter is always required.

### Step 3: Discover Schema

```bash
c2 api cube.meta --raw
```

Use the output to select view, dimensions, and measures. Do not guess field names.

### Step 4: Build Query

Rules:

- Prefer `analytics_view` over `analytics_full_view`
- Fully qualify all fields: `analytics_view.<field>`
- Always include `rightsholder_id` filter
- Always set a limit (default 100)
- Use `_usd` suffix measures for earnings unless user asks for original currency
- Avoid `_raw` dimensions unless user explicitly asks
- For time series, prefer `stmt_year` unless user asks finer granularity

Confirm your query choices with the user before executing.

### Step 5: Execute

Prefer `c2 query` (auto-prefixes, handles polling):

```bash
c2 query -m <measures...> -d <dimensions...> -r <uuids...> [options]
```

Options:

- `-m, --measures` -- required, what to aggregate
- `-d, --dimensions` -- what to group by
- `-r, --rightsholder` -- required, rightsholder UUIDs
- `--year <year>` -- filter by statement year
- `--period <codes...>` -- TM period codes
- `--filter <json>` -- additional JSON filter array
- `--order <json>` -- sort order
- `-l, --limit <n>` -- max rows (default 50)
- `--raw` -- JSON output

For full control, use `cube.query` (mutation, blocks until ready):

```bash
c2 api cube.query -i '{"query":{"measures":[...],"dimensions":[...],"filters":[{"member":"analytics_view.rightsholder_id","operator":"equals","values":["<uuid>"]}],"order":{...},"limit":N}}' --raw
```

For long queries: `cube.queryAsync` + `cube.queryPoll`.

## Common Patterns

| Question | c2 query |
|----------|----------|
| Total earnings | `-m royalty_earnings_usd -r <uuid>` |
| By year | `-m royalty_earnings_usd -d stmt_year -r <uuid>` |
| Top platforms | `-m royalty_earnings_usd -d source_c2e_earnings_source -r <uuid> -l 10` |
| By country | `-m royalty_earnings_usd -d territory_c2e -r <uuid> -l 20` |
| By track | `-m royalty_earnings_usd -d track_title_c2c -r <uuid> -l 20` |
| Monthly 2024 | `-m royalty_earnings_usd -d stmt_period_full -r <uuid> --year 2024` |
| Stream vs download | `-m royalty_earnings_usd -d primary_sales_type -r <uuid>` |
| Compare artists | `-m royalty_earnings_usd -d rightsholder_name -r <uuid1> <uuid2>` |
| Master vs publishing | `-m royalty_earnings_usd -d type_of_right -r <uuid>` |
| By distributor | `-m royalty_earnings_usd -d distributor_name -r <uuid>` |
| Catalogue age | `-m c2c_dollar_age -r <uuid>` |

## SQL Queries

Use only if user explicitly asks for SQL:

```bash
c2 api cube.sqlQuery -i '{"sql":"SELECT ... FROM analytics_view ... LIMIT 10"}'
```

## Timeout Recovery

1. Add `--year` filter
2. Remove `tm_period` -- use `stmt_year` instead
3. Add distributor filter
4. Split multi-rightsholder queries
````
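The Step 4 rules are mechanical enough to encode when queries are built programmatically. Below is a sketch of a payload builder for `cube.query` under those rules; `build_cube_query` is a hypothetical helper, and the measure/dimension names passed to it must still come from `c2 api cube.meta --raw`, never guessed:

```python
import json


def build_cube_query(measures, dimensions, rightsholder_ids, limit=100,
                     view="analytics_view"):
    """Assemble a cube.query -i payload following the Step 4 rules:
    fully qualified field names, a mandatory rightsholder_id filter,
    and an explicit limit."""
    def qualify(field):
        # Leave already-qualified names alone; prefix bare ones with the view
        return field if "." in field else f"{view}.{field}"

    query = {
        "measures": [qualify(m) for m in measures],
        "dimensions": [qualify(d) for d in dimensions],
        "filters": [{
            "member": f"{view}.rightsholder_id",
            "operator": "equals",
            "values": list(rightsholder_ids),
        }],
        "limit": limit,
    }
    return json.dumps({"query": query})
```

The returned string is the shape shown in the `c2 api cube.query -i '...' --raw` example above.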

### references/upload_workflow.md

````markdown
# Statement Upload Workflow

## Overview

Uploading statements is a 4-step sequential process. Each step depends on the previous one. For batch uploads, use the bundled script at `scripts/c2-upload.sh`.

## Prerequisites

```bash
c2 auth status  # Must show "valid"
```

Get the rightsholder ID:

```bash
c2 api rightsholder.list --raw
# Or create one:
c2 api rightsholder.create -i '{"name":"Artist Name"}' --raw
```

Get the organization ID (needed for status polling):

```bash
c2 whoami --raw  # Look for the organizationId field
```

## Content Type Mapping

| Extension | Content Type | Direct Upload? |
|-----------|--------------|----------------|
| .csv | text/csv | Yes |
| .tsv | text/tab-separated-values | Yes |
| .txt | text/plain | Yes |
| .html | text/html | Yes |
| .xlsx, .xls | application/vnd.openxmlformats-officedocument.spreadsheetml.sheet | No (use `directUpload: false`) |
| .zip | application/zip | No (use `directUpload: false`) |

## Step 1: Compute File Hash and Size

Only needed for direct uploads (CSV/TSV/TXT/HTML):

```bash
FILE_HASH=$(shasum -a 256 "$FILE" | awk '{print $1}')
FILE_SIZE=$(wc -c < "$FILE" | tr -d ' ')
```

## Step 2: Get Presigned URL

```bash
# Direct upload (CSV/TSV/TXT/HTML):
c2 api upload.uploadFile -i "{
  \"rightsholderId\":\"<uuid>\",
  \"fileName\":\"file.csv\",
  \"directUpload\":true,
  \"fileId\":\"$FILE_HASH\",
  \"contentType\":\"text/csv\",
  \"fileSize\":$FILE_SIZE
}" --raw

# Normal upload (ZIP/Excel):
c2 api upload.uploadFile -i '{"rightsholderId":"<uuid>","fileName":"file.xlsx","directUpload":false}' --raw
```

### Response Structure

```json
{
  "uploadId": "hash-based-deterministic-id",
  "fileId": "sha256-of-file-content",
  "statementId": "same-as-fileId-for-direct",
  "directUpload": true,
  "url": "https://<bucket>.s3.eu-central-1.amazonaws.com/",
  "fields": {
    "x-amz-meta-uploadid": "...",
    "x-amz-meta-originalfilename": "...",
    "x-amz-meta-rightsholderid": "...",
    "x-amz-meta-organizationid": "...",
    "x-amz-meta-userid": "...",
    "bucket": "...",
    "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
    "X-Amz-Credential": "...",
    "X-Amz-Date": "...",
    "X-Amz-Security-Token": "...very long token with +/= chars...",
    "key": "...",
    "Policy": "...base64...",
    "X-Amz-Signature": "..."
  }
}
```

IMPORTANT: The `fields` object contains AWS Security Tokens with special characters (`+`, `/`, `=`). These MUST NOT be shell-interpolated or they will be mangled, causing S3 to return HTTP 400.

## Step 3: Upload File to S3

CRITICAL: Use Python subprocess to construct the curl command. Do NOT use shell variable expansion for the presigned fields -- the AWS Security Token contains `+`, `/`, `=` characters that get corrupted by shell interpolation.

### Recommended: Python subprocess approach

```python
import subprocess, json

text = raw_cli_output
data = json.loads(text)
url = data['url']
fields = data['fields']

# Build curl args as a list (no shell interpolation)
cmd = ['curl', '-s', '-w', '%{http_code}', '-o', '/dev/null', '-X', 'POST', url]
for key, value in fields.items():
    cmd.extend(['-F', f'{key}={value}'])
cmd.extend(['-F', f'file=@{file_path}'])  # File MUST be last

result = subprocess.run(cmd, capture_output=True, text=True)
http_code = result.stdout  # "204" = success
```

### Alternative: Write a temp script

If you must use bash, write the curl command to a temp file and execute it:

```bash
# Extract JSON and generate curl script
echo "$RAW_RESPONSE" | python3 -c "
import sys, json
text = sys.stdin.read()
d = json.loads(text)
url = d['url']
fields = d['fields']
with open('/tmp/upload_cmd.sh', 'w') as f:
    f.write('curl -s -w \"%{http_code}\" -o /dev/null -X POST \"' + url + '\"')
    for k, v in fields.items():
        f.write(f' -F \"{k}={v}\"')
    f.write(f' -F \"file=@\$1\"')
" && bash /tmp/upload_cmd.sh "$FILE"
```

### Verify

HTTP 204 = success. Any other code = failure.

## Step 4: Poll Processing Status

```bash
c2 api upload.statements -i '{
  "organizationId":"<org-uuid>",
  "rightsholderId":"<rh-uuid>",
  "pagination":{"page":1,"limit":50}
}' --raw
```

IMPORTANT: `organizationId` is REQUIRED. Get it from `c2 whoami --raw` or `c2 auth status`.

### Response Item Fields

| Field | Description |
|-------|-------------|
| `upload_id` | Upload group ID (NOT `uploadId`) |
| `statement_id` | Per-file statement ID (NOT `id` or `statementId`) |
| `originalUploadName` | Original filename as uploaded |
| `fileName` | Processed filename |
| `statementStatus` | Current processing status |
| `keyMetrics.earnings_usd` | Total earnings (available after COMPLETE) |
| `keyMetrics.stmtYear` | Detected statement year(s) |
| `keyMetrics.currency` | Detected currency(ies) |
| `distributorId` | Matched distributor UUID |
| `totalRows` | Row count |

### Terminal Statuses

- COMPLETE -- successfully processed
- PROCESSING_SUCCESS -- processed (legacy status)
- SKIP_FILE -- file was skipped
- Error statuses: DECODING_ERROR, CORRUPT_EXCEL_FILE, NO_CONTENT_FOUND, PREPARE_STATEMENT_FAILED, PROCESS_STATEMENT_FAILED

### Parsing Status Response

```python
import json

text = raw_output
data = json.loads(text)
for item in data.get('items', []):
    name = item.get('originalUploadName', '') or item.get('fileName', '')
    status = item['statementStatus']
    sid = item['statement_id']
    uid = item['upload_id']
    earnings = (item.get('keyMetrics') or {}).get('earnings_usd', '-')
    print(f'{name}: {status} (earnings: ${earnings})')
```

## Complete Single-File Example

```bash
RH_ID="<rightsholder-uuid>"
FILE="/path/to/statement.csv"
FILENAME=$(basename "$FILE")

# Step 1: Hash
FILE_HASH=$(shasum -a 256 "$FILE" | awk '{print $1}')
FILE_SIZE=$(wc -c < "$FILE" | tr -d ' ')

# Step 2: Presigned URL
RAW=$(c2 api upload.uploadFile -i "{\"rightsholderId\":\"$RH_ID\",\"fileName\":\"$FILENAME\",\"directUpload\":true,\"fileId\":\"$FILE_HASH\",\"contentType\":\"text/csv\",\"fileSize\":$FILE_SIZE}" --raw 2>&1)

# Step 3: Upload via Python subprocess (safe from shell escaping)
echo "$RAW" | python3 -c "
import sys, json, subprocess
text = sys.stdin.read()
d = json.loads(text)
cmd = ['curl', '-s', '-w', '%{http_code}', '-o', '/dev/null', '-X', 'POST', d['url']]
for k, v in d['fields'].items():
    cmd.extend(['-F', f'{k}={v}'])
cmd.extend(['-F', f'file=@{sys.argv[1]}'])
r = subprocess.run(cmd, capture_output=True, text=True)
print('SUCCESS' if r.stdout == '204' else f'FAILED: HTTP {r.stdout}')
" "$FILE"
```

## Batch Upload: Use the Script

For uploading multiple files or directories:

```bash
# Upload all files in a directory
bash .claude/skills/c2/scripts/c2-upload.sh <rightsholder-id> /path/to/folder/

# Upload specific files
bash .claude/skills/c2/scripts/c2-upload.sh <rightsholder-id> file1.csv file2.csv file3.txt

# Mix files and directories
bash .claude/skills/c2/scripts/c2-upload.sh <rightsholder-id> /path/to/folder/ extra-file.csv
```

## Applying Overrides After Upload

To find a specific statement's IDs for overrides:

```bash
# Get statement_id and upload_id by filename
c2 api upload.statements -i '{
  "organizationId":"<org-uuid>",
  "rightsholderId":"<rh-uuid>",
  "pagination":{"page":1,"limit":50}
}' --raw | python3 -c "
import sys, json
d = json.load(sys.stdin)
for item in d.get('items', []):
    name = item.get('originalUploadName', '') or item.get('fileName', '')
    print(f'{name}: upload_id={item[\"upload_id\"]} statement_id={item[\"statement_id\"]} status={item[\"statementStatus\"]}')
"
```

Then apply the override:

```bash
c2 api upload.uploadAction -i '{
  "action":{
    "type":"add-statement-overrides",
    "upload_id":"<upload_id>",
    "statement_id":"<statement_id>",
    "overrides":{"stmtYear":{"mapping_value":"2024"}}
  }
}' -y --raw
```
````
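When the surrounding automation already runs in Python, Step 1's hash and size can be computed in-process instead of shelling out to `shasum` and `wc`. The following is a minimal sketch; `file_hash_and_size` is a hypothetical helper whose hex digest matches `shasum -a 256` output:

```python
import hashlib
from pathlib import Path


def file_hash_and_size(path: str) -> tuple:
    """SHA-256 hex digest and byte size of a file, equivalent to
    `shasum -a 256 <file>` and `wc -c < <file>`."""
    p = Path(path)
    h = hashlib.sha256()
    with p.open('rb') as f:
        # Read in chunks so large statements don't load fully into memory
        for chunk in iter(lambda: f.read(1 << 16), b''):
            h.update(chunk)
    return h.hexdigest(), p.stat().st_size
```

The digest feeds `fileId` and the size feeds `fileSize` in the Step 2 `upload.uploadFile` call.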

### scripts/c2-upload.sh

```bash
#!/usr/bin/env bash
# c2-upload.sh -- Batch upload statement files to ChapterTwo
#
# Usage:
#   bash c2-upload.sh <rightsholder-id> <file-or-directory>...
#
# Examples:
#   bash c2-upload.sh abc-123 /path/to/statements/
#   bash c2-upload.sh abc-123 file1.csv file2.txt /path/to/more/
#
# Supports: CSV, TSV, TXT, HTML (direct upload), ZIP, XLSX, XLS (extraction upload)

set -euo pipefail

if [ $# -lt 2 ]; then
  echo "Usage: $0 <rightsholder-id> <file-or-directory>..."
  exit 1
fi

RH_ID="$1"
shift

# Collect all files from arguments (expand directories)
FILES=()
for arg in "$@"; do
  if [ -d "$arg" ]; then
    while IFS= read -r -d '' f; do
      FILES+=("$f")
    done < <(find "$arg" -type f ! -name '.DS_Store' ! -name '._*' -print0 | sort -z)
  elif [ -f "$arg" ]; then
    FILES+=("$arg")
  else
    echo "WARNING: Skipping '$arg' (not a file or directory)"
  fi
done

if [ ${#FILES[@]} -eq 0 ]; then
  echo "ERROR: No files found to upload."
  exit 1
fi

echo "Uploading ${#FILES[@]} file(s) for rightsholder $RH_ID"
echo ""

SUCCESS=0
FAILED=0

for FILE in "${FILES[@]}"; do
  FILENAME=$(basename "$FILE")
  EXT="${FILENAME##*.}"
  EXT_LOWER=$(echo "$EXT" | tr '[:upper:]' '[:lower:]')

  # Determine content type and upload mode
  DIRECT="true"
  case "$EXT_LOWER" in
    csv)  CONTENT_TYPE="text/csv" ;;
    tsv)  CONTENT_TYPE="text/tab-separated-values" ;;
    txt)  CONTENT_TYPE="text/plain" ;;
    html) CONTENT_TYPE="text/html" ;;
    xlsx|xls) CONTENT_TYPE="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"; DIRECT="false" ;;
    zip)  CONTENT_TYPE="application/zip"; DIRECT="false" ;;
    *)
      echo "=== SKIP: $FILENAME (unsupported extension: .$EXT_LOWER) ==="
      FAILED=$((FAILED + 1))
      continue
      ;;
  esac

  FILE_SIZE=$(wc -c < "$FILE" | tr -d ' ')
  echo "=== Uploading: $FILENAME ($FILE_SIZE bytes, $CONTENT_TYPE) ==="

  # Build upload request
  if [ "$DIRECT" = "true" ]; then
    FILE_HASH=$(shasum -a 256 "$FILE" | awk '{print $1}')
    RAW=$(c2 api upload.uploadFile -i "{\"rightsholderId\":\"$RH_ID\",\"fileName\":\"$FILENAME\",\"directUpload\":true,\"fileId\":\"$FILE_HASH\",\"contentType\":\"$CONTENT_TYPE\",\"fileSize\":$FILE_SIZE}" --raw 2>&1)
  else
    RAW=$(c2 api upload.uploadFile -i "{\"rightsholderId\":\"$RH_ID\",\"fileName\":\"$FILENAME\",\"directUpload\":false}" --raw 2>&1)
  fi

  # Upload to S3 using Python subprocess (safe from shell escaping issues)
  RESULT=$(echo "$RAW" | python3 -c "
import sys, json, subprocess
text = sys.stdin.read()
try:
    d = json.loads(text)
except json.JSONDecodeError as e:
    print(f'ERROR: Invalid JSON: {e}')
    sys.exit(0)
if 'url' not in d or 'fields' not in d:
    print(f'ERROR: Missing url/fields in response: {list(d.keys())}')
    sys.exit(0)
cmd = ['curl', '-s', '-w', '%{http_code}', '-o', '/dev/null', '-X', 'POST', d['url']]
for k, v in d['fields'].items():
    cmd.extend(['-F', f'{k}={v}'])
cmd.extend(['-F', f'file=@{sys.argv[1]}'])
r = subprocess.run(cmd, capture_output=True, text=True, timeout=300)
print(r.stdout)
" "$FILE" 2>&1)

  if [ "$RESULT" = "204" ]; then
    echo "  SUCCESS"
    SUCCESS=$((SUCCESS + 1))
  else
    echo "  FAILED ($RESULT)"
    FAILED=$((FAILED + 1))
  fi
done

echo ""
echo "=== Upload complete: $SUCCESS succeeded, $FAILED failed out of ${#FILES[@]} ==="
```
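For callers that need the same extension routing outside the script, the case statement above maps cleanly to a lookup table. The following is a sketch in Python; `classify` is a hypothetical helper mirroring the script's logic, not part of the bundled script:

```python
# Mapping mirrors the case statement in c2-upload.sh:
# extension -> (content type, direct upload?)
CONTENT_TYPES = {
    'csv':  ('text/csv', True),
    'tsv':  ('text/tab-separated-values', True),
    'txt':  ('text/plain', True),
    'html': ('text/html', True),
    'xlsx': ('application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', False),
    'xls':  ('application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', False),
    'zip':  ('application/zip', False),
}


def classify(filename: str):
    """Return (content_type, direct_upload) for a file, or None if the
    extension is unsupported (the script skips such files)."""
    ext = filename.rsplit('.', 1)[-1].lower() if '.' in filename else ''
    return CONTENT_TYPES.get(ext)
```

A `None` result corresponds to the script's SKIP branch; the boolean feeds the `directUpload` flag in the `upload.uploadFile` call.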