feat: add /ops:marketing — email, ads, analytics, SEO command center #49

Merged
auroracapital merged 2 commits into main from feature/ops-ecom
Apr 13, 2026
Conversation

auroracapital (Collaborator) commented Apr 13, 2026

Summary

  • Adds skills/ops-marketing/SKILL.md — full marketing command center with Klaviyo (email), Meta Ads, GA4, Google Search Console, social media, cross-channel campaigns, and interactive setup
  • Adds bin/ops-marketing-dash — bash pre-gather script that hits all configured channels in parallel and outputs unified JSON
  • Updates skills/ops/SKILL.md router with marketing, email, klaviyo, ads, meta, seo, campaigns routes
  • Adds 5 new userConfig fields to plugin.json: klaviyo_private_key, meta_ads_token, meta_ad_account_id, ga4_property_id, google_search_console_site

Test plan

  • /ops:marketing with no args shows unified dashboard (graceful nulls for unconfigured channels)
  • /ops:marketing setup prompts for each credential and validates with test API call
  • /ops:marketing email returns Klaviyo lists + campaigns
  • /ops:marketing ads returns Meta Ads spend/ROAS for last 7d
  • /ops:marketing analytics returns GA4 sessions/conversions (requires gcloud ADC)
  • /ops:marketing seo returns Search Console clicks/impressions
  • /ops:marketing campaigns shows cross-channel active campaigns
  • /ops router correctly dispatches marketing, email, klaviyo, ads, meta, seo, campaigns
  • Credentials resolve via userConfig → env var → Doppler order
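The resolution order in the last bullet can be sketched as a small helper. This is a hypothetical sketch, not code from the PR: `resolve_credential` is an illustrative name, and the `claude plugin config get` / `doppler secrets get` invocations mirror the lookups the skill describes.

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the documented lookup order:
# userConfig (plugin config) → environment variable → Doppler.
resolve_credential() {
  local config_key="$1" env_name="$2"
  local value
  # 1. Plugin userConfig (empty if the CLI or key is unavailable).
  value="$(claude plugin config get "$config_key" 2>/dev/null || true)"
  # 2. Environment variable, via indirect expansion.
  [ -n "$value" ] || value="${!env_name:-}"
  # 3. Doppler secret as the final fallback.
  [ -n "$value" ] || value="$(doppler secrets get "$env_name" --plain 2>/dev/null || true)"
  printf '%s' "$value"
}

KLAVIYO_KEY="$(resolve_credential klaviyo_private_key KLAVIYO_PRIVATE_KEY)"
```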

🤖 Generated with Claude Code



Note

Medium Risk
Introduces new scripts/skills that call external APIs (Shopify, Klaviyo, Meta, Google) and adds new credential configuration fields; main risk is misconfiguration or leaking/handling tokens incorrectly in logs/output.
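One mitigation for the token-handling risk above is to pass output through a redaction filter before it reaches logs. This is a hypothetical helper, not part of the PR; `redact` is an illustrative name.

```bash
#!/usr/bin/env bash
# Hypothetical redaction helper: mask known secret values in any
# text before it is logged or printed.
redact() {
  local text="$1"; shift
  local secret
  for secret in "$@"; do
    # Replace every occurrence of each non-empty secret.
    [ -n "$secret" ] && text="${text//"$secret"/[REDACTED]}"
  done
  printf '%s\n' "$text"
}

redact "curl -H 'Authorization: Bearer abc123'" "abc123"
# prints: curl -H 'Authorization: Bearer [REDACTED]'
```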

Overview
Adds new /ops:ops-marketing and /ops:ops-ecom skills to provide a marketing dashboard (Klaviyo, Meta Ads, GA4, Search Console) and a Shopify store command center (orders, inventory, fulfillment, analytics, health).

Introduces bin/ops-marketing-dash and bin/ops-ecom-health to fetch channel/store metrics and emit unified JSON, with credential resolution via plugin config/env vars and Doppler fallback.

Extends .claude-plugin/plugin.json with new sensitive API key/token fields (plus Shopify/ShipBob settings) and updates the main skills/ops/SKILL.md router to dispatch new marketing and ecom commands.

Reviewed by Cursor Bugbot for commit f784157.

Summary by CodeRabbit

  • New Features
    • Added integrated marketing operations dashboard aggregating data from Klaviyo, Meta Ads, Google Analytics 4, and Google Search Console.
    • Enabled configuration support for connecting multiple external marketing platforms and services.
    • Introduced marketing command center for email campaigns, ad performance monitoring, analytics, SEO tracking, and cross-channel reporting.
    • Implemented secure credential management with environment variable and configuration-based support.

…enter

Adds ops-marketing skill with Klaviyo, Meta Ads, GA4, and Google Search Console
integrations. Includes bin/ops-marketing-dash pre-gather script, ops router update,
and plugin.json userConfig fields for all marketing credentials.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Copilot AI review requested due to automatic review settings April 13, 2026 15:29
blocksorg Bot (Contributor) commented Apr 13, 2026

Mention Blocks like a regular teammate with your question or request:

@blocks review this pull request
@blocks make the following changes ...
@blocks create an issue from what was mentioned in the following comment ...
@blocks explain the following code ...
@blocks are there any security or performance concerns?

Run @blocks /help for more information.


coderabbitai Bot commented Apr 13, 2026

📝 Walkthrough

Walkthrough

A new marketing operations skill and dashboard integrate with Klaviyo, Meta Ads, GA4, and Google Search Console. Plugin configuration is extended with credential fields, a bash script aggregates marketing metrics from multiple sources, and command routing directs marketing-related inputs to the new skill center.

Changes

| Cohort / File(s) | Summary |
|---|---|
| Configuration & Credentials<br>`claude-ops/.claude-plugin/plugin.json` | Extended userConfig schema with five new fields: two sensitive credential fields (klaviyo_private_key, meta_ads_token) and three non-sensitive configuration fields (meta_ad_account_id, ga4_property_id, google_search_console_site), all defaulting to empty strings. |
| Marketing Dashboard Script<br>`claude-ops/bin/ops-marketing-dash` | New executable bash script that fetches and aggregates marketing metrics from four channels (Klaviyo, Meta Ads, GA4, Google Search Console). Performs credential resolution from environment variables, plugin config, and Doppler secrets; executes concurrent API requests; parses responses with jq and awk; returns consolidated JSON output with per-channel data or null for unconfigured channels. |
| Marketing Skill Definition<br>`claude-ops/skills/ops-marketing/SKILL.md` | New skill document defining an ops-marketing command center with sub-commands for dashboard, email/ads/analytics/seo/social/campaigns, and setup. Includes credential resolution flows, per-channel command implementations (API queries, token acquisition, data aggregation), and interactive setup with validation. |
| Command Routing<br>`claude-ops/skills/ops/SKILL.md` | Added routing rule to direct inputs matching marketing-related keywords (marketing, email, klaviyo, ads, meta, seo, campaigns) to the new ops-marketing skill. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant OpsMktSkill as ops-marketing<br/>Skill
    participant Dashboard as ops-marketing-dash<br/>Script
    participant CredMgmt as Config/Secrets<br/>Management
    participant MktAPIs as Marketing APIs<br/>(Klaviyo, Meta, GA4, GSC)

    User->>OpsMktSkill: Request marketing data/dashboard
    OpsMktSkill->>Dashboard: Execute dashboard command
    Dashboard->>CredMgmt: Resolve credentials<br/>(env → plugin config → doppler)
    CredMgmt-->>Dashboard: Return credentials
    Dashboard->>MktAPIs: Concurrent API requests<br/>for each channel
    MktAPIs-->>Dashboard: Channel-specific responses
    Dashboard->>Dashboard: Parse & aggregate metrics<br/>(jq, awk processing)
    Dashboard-->>OpsMktSkill: Return JSON{date, klaviyo,<br/>meta, ga4, gsc}
    OpsMktSkill->>OpsMktSkill: Render dashboard/response
    OpsMktSkill-->>User: Display marketing metrics
```

Estimated Code Review Effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🐰 Hop, skip, and a data leap—
Klaviyo metrics we now keep!
Meta ads and GA4 flow,
Marketing dashboards steal the show!
Five new fields, four APIs clear,
Better insights, never fear! 🎯

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
| Check name | Status | Explanation |
|---|---|---|
| Title check | ✅ Passed | The title accurately describes the main change: addition of a /ops:marketing command center that integrates email, ads, analytics, and SEO capabilities across multiple platforms. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |
| Description Check | ✅ Passed | Check skipped - CodeRabbit’s high-level summary is enabled. |


devin-ai-integration Bot left a comment

Devin Review found 3 potential issues.

View 4 additional findings in Devin Review.


Comment on lines +171 to +175

```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```


🔴 Background subshell variable assignments are invisible to parent shell, making all gathered data empty

The parallel execution pattern KLAVIYO_DATA=$(gather_klaviyo) & runs the command substitution and assignment inside a background subshell. Variable assignments in subshells do not propagate back to the parent shell. After wait, KLAVIYO_DATA, META_DATA, GA4_DATA, and GSC_DATA are all empty/unset in the parent. With set -u (line 5), the script will immediately fail with an "unbound variable" error at line 179 when trying to use $KLAVIYO_DATA. Compare with the working pattern in claude-ops/bin/ops-gather which correctly writes subshell output to temp files and reads them back.

Prompt for agents
The parallel execution pattern on lines 171-174 assigns command substitution results in background subshells, but those assignments never propagate to the parent shell. After wait, all four variables (KLAVIYO_DATA, META_DATA, GA4_DATA, GSC_DATA) are empty/unset, causing the script to crash at line 179 due to set -u.

The existing ops-gather script (claude-ops/bin/ops-gather) demonstrates the correct pattern: create a temp directory, run each function writing to a temp file in the background, wait, then read the files back. Apply the same pattern here:

1. Create a temp directory with mktemp -d and a trap to clean it up
2. Change each background call to: gather_klaviyo > "$TMPDIR/klaviyo.json" &
3. After wait, read the files: KLAVIYO_DATA=$(cat "$TMPDIR/klaviyo.json")
4. Then use the variables in the final jq command as before
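The steps above can be sketched as follows. This is a minimal illustration of the temp-file pattern, not the script's real code: `gather_klaviyo` and `gather_meta` are hypothetical stand-ins that print fixed JSON.

```bash
#!/usr/bin/env bash
# Minimal sketch of the temp-file pattern for parallel gathering.
# Stand-ins for the script's real channel gatherers (hypothetical).
gather_klaviyo() { echo '{"lists": 3}'; }
gather_meta()    { echo '{"spend": 120.5}'; }

# 1. Temp directory with a cleanup trap.
TMP=$(mktemp -d)
trap 'rm -rf "$TMP"' EXIT

# 2. Run each gatherer in the background, redirecting to its own file.
gather_klaviyo > "$TMP/klaviyo.json" &
gather_meta    > "$TMP/meta.json" &
wait

# 3. Read the files back in the parent shell — assignments now persist,
#    unlike VAR=$(cmd) & which assigns inside a background subshell.
KLAVIYO_DATA=$(cat "$TMP/klaviyo.json")
META_DATA=$(cat "$TMP/meta.json")

# 4. The variables hold valid JSON and are safe to hand to jq as before.
echo "klaviyo: $KLAVIYO_DATA"
echo "meta: $META_DATA"
```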

| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |

🔴 Keyword email is ambiguous — matches both comms and marketing routes

The keyword email now appears in two routing table entries: line 24 routes it to /ops-comms $ARGUMENTS and line 33 routes it to /ops:ops-marketing $ARGUMENTS. When a user types email as the argument, the router has conflicting instructions and may pick either destination unpredictably, breaking the expected behavior for one or both features.

Suggested change

```diff
-| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
+| marketing, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```

Comment on lines +143 to +144

```bash
GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```


🟡 Shell injection via GSC_SITE in python3 single-quoted string

On line 143, GSC_SITE is interpolated directly into a Python single-quoted string literal: print(urllib.parse.quote('${GSC_SITE}', safe='')). If GSC_SITE contains a single quote (e.g., a user mistakenly configures it as https://example.com'; import os; os.system('...')), the shell expansion breaks out of the Python string and executes arbitrary code. While the value is user-configured (self-inflicted), this is still a code injection vector that should be sanitized, e.g., by passing the value as a command-line argument to python3 rather than interpolating it into the source code.

Suggested change

```diff
-GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
-  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
+GSC_SITE_ENCODED=$(python3 -c "import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=''))" "${GSC_SITE}" 2>/dev/null \
+  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```

Comment on lines +171 to +175

```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```

Bug: Assigning variables from backgrounded commands like VAR=$(command) & will result in empty variables, as the assignment happens in a subshell. This causes a downstream jq failure.
Severity: CRITICAL

Suggested Fix

Follow the established pattern seen in the ops-dash script. Use temporary files to store the output of the background processes. Create a temporary directory, write the output of each backgrounded command to a file within it, and then after the wait command, read the contents of those files back into the variables.

Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent.
Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not
valid.

Location: claude-ops/bin/ops-marketing-dash#L171-L175

Potential issue: The script executes data gathering functions like `gather_klaviyo` in
the background and attempts to assign their output to variables (e.g.,
`KLAVIYO_DATA=$(gather_klaviyo) &`). Because these assignments run in a background
subshell, the variables (`KLAVIYO_DATA`, `META_DATA`, etc.) remain empty in the parent
shell. These empty strings are then passed to `jq` via the `--argjson` flag. Since an
empty string is not valid JSON, `jq` will exit with an error, causing the entire
`/ops:marketing` dashboard script to fail deterministically on every execution.


chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d6206d770b


Comment on lines +171 to +174

```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
```

P1: Capture parallel gather output in parent shell

These backgrounded assignments (KLAVIYO_DATA=$(...) & etc.) execute in subshells, so the parent shell never receives the variable values; with set -u, the later jq --argjson ... "$KLAVIYO_DATA" expansion aborts with an unbound-variable error and the dashboard fails before emitting JSON. Running bash claude-ops/bin/ops-marketing-dash currently exits on this path, so results need to be collected via temp files/FIFOs (or another parent-shell-safe pattern) instead of backgrounded assignments.


| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |

P2: Remove duplicate email alias from ops router

Adding email to this marketing route makes routing ambiguous because email is already mapped to /ops-comms earlier in the same table, so /ops email no longer has a deterministic target and can keep dispatching to comms instead of marketing. Since this file is the router instruction source, duplicate aliases make the new marketing entry unreliable; use non-overlapping keywords or remove one mapping.


coderabbitai Bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
claude-ops/skills/ops/SKILL.md (1)

24-33: ⚠️ Potential issue | 🟠 Major

Routing conflict: email keyword claimed by both ops-comms and ops-marketing.

Line 24 routes email to /ops-comms (Gmail/messaging), while line 33 also includes email routing to /ops:ops-marketing (Klaviyo). This creates ambiguous routing behavior.

Consider disambiguating by:

  • Removing email from the marketing route and keeping only klaviyo for Klaviyo-specific access
  • Or renaming the comms route to use gmail instead of email
Suggested fix (option 1 - remove email from marketing route)

```diff
-| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
+| marketing, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@claude-ops/skills/ops/SKILL.md` around lines 24 - 33, Routing currently
conflicts because the keyword "email" is listed for both ops-comms and the
ops-marketing route (/ops:ops-marketing); remove "email" from the marketing
keywords so "email" only maps to ops-comms (leave ops-comms' keywords intact) by
editing the marketing entry that currently reads "marketing, email, klaviyo,
ads, meta, seo, campaigns" to exclude "email" (keep "klaviyo" to target
Klaviyo-specific access); ensure the change is reflected in SKILL.md where the
`/ops:ops-marketing $ARGUMENTS` route is declared.
🧹 Nitpick comments (1)
claude-ops/skills/ops-marketing/SKILL.md (1)

141-141: Minor grammar nit: "Top performing" → "Top-performing".

Compound adjectives before a noun should be hyphenated.

```diff
-### Top performing ads (last 7d)
+### Top-performing ads (last 7d)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@claude-ops/skills/ops-marketing/SKILL.md` at line 141, Update the header text
"Top performing ads (last 7d)" to use a hyphenated compound adjective by
changing it to "Top-performing ads (last 7d)"; locate the header string in
SKILL.md (the line containing "Top performing ads (last 7d)") and replace it
accordingly.
ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 881d2cb7-50b0-4948-ba6c-f5be2dbfa80a

📥 Commits

Reviewing files that changed from the base of the PR and between d326544 and d6206d7.

📒 Files selected for processing (4)
  • claude-ops/.claude-plugin/plugin.json
  • claude-ops/bin/ops-marketing-dash
  • claude-ops/skills/ops-marketing/SKILL.md
  • claude-ops/skills/ops/SKILL.md

Comment on lines +143 to +144

```bash
GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```

⚠️ Potential issue | 🟡 Minor

Potential command injection in URL encoding.

The GSC_SITE variable is interpolated directly into a Python string. If it contains a single quote, it could break out of the string context.

Safer approach using environment variable

```diff
-GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
-  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
+GSC_SITE_ENCODED=$(GSC_SITE="$GSC_SITE" python3 -c "import urllib.parse, os; print(urllib.parse.quote(os.environ['GSC_SITE'], safe=''))" 2>/dev/null \
+  || printf '%s' "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@claude-ops/bin/ops-marketing-dash` around lines 143 - 144, The current
command injects GSC_SITE directly into a Python string in the python3 -c
invocation (producing GSC_SITE_ENCODED), which allows a value containing a
single quote to break out; change the invocation to pass GSC_SITE via the
environment or stdin so it is not interpolated into the source string (e.g.,
read os.environ['GSC_SITE'] or read from stdin inside the Python snippet) and
fall back to the sed encoding only if the python invocation fails; update
references to GSC_SITE_ENCODED and the python3 -c usage accordingly so input is
safely obtained from the environment/STDIN rather than string interpolation.

Comment on lines +170 to +184

```bash
# Run all in parallel (subshells)
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait

# Emit unified JSON
jq -n \
  --argjson klaviyo "$KLAVIYO_DATA" \
  --argjson meta "$META_DATA" \
  --argjson ga4 "$GA4_DATA" \
  --argjson gsc "$GSC_DATA" \
  --arg date "$TODAY" \
  '{date: $date, klaviyo: $klaviyo, meta: $meta, ga4: $ga4, gsc: $gsc}'
```

⚠️ Potential issue | 🔴 Critical

Critical: Background subshell variable assignments won't propagate to parent shell.

When you run VAR=$(command) &, the variable assignment occurs in a subshell. After wait, the parent shell's variables (KLAVIYO_DATA, META_DATA, etc.) remain unset/empty, causing the final jq command on lines 178-184 to fail.

Proposed fix using temporary files

```diff
-# Run all in parallel (subshells)
-KLAVIYO_DATA=$(gather_klaviyo) &
-META_DATA=$(gather_meta) &
-GA4_DATA=$(gather_ga4) &
-GSC_DATA=$(gather_gsc) &
-wait
-
-# Emit unified JSON
-jq -n \
-  --argjson klaviyo "$KLAVIYO_DATA" \
-  --argjson meta "$META_DATA" \
-  --argjson ga4 "$GA4_DATA" \
-  --argjson gsc "$GSC_DATA" \
-  --arg date "$TODAY" \
-  '{date: $date, klaviyo: $klaviyo, meta: $meta, ga4: $ga4, gsc: $gsc}'
+# Run all in parallel using temp files
+TMPDIR=$(mktemp -d)
+trap 'rm -rf "$TMPDIR"' EXIT
+
+gather_klaviyo > "$TMPDIR/klaviyo.json" &
+gather_meta > "$TMPDIR/meta.json" &
+gather_ga4 > "$TMPDIR/ga4.json" &
+gather_gsc > "$TMPDIR/gsc.json" &
+wait
+
+# Emit unified JSON
+jq -n \
+  --slurpfile klaviyo "$TMPDIR/klaviyo.json" \
+  --slurpfile meta "$TMPDIR/meta.json" \
+  --slurpfile ga4 "$TMPDIR/ga4.json" \
+  --slurpfile gsc "$TMPDIR/gsc.json" \
+  --arg date "$TODAY" \
+  '{date: $date, klaviyo: $klaviyo[0], meta: $meta[0], ga4: $ga4[0], gsc: $gsc[0]}'
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@claude-ops/bin/ops-marketing-dash` around lines 170 - 184, The backgrounded
assignments like KLAVIYO_DATA=$(gather_klaviyo) & run in subshells so the parent
shell never gets the results; change the pattern to have each gather_* producer
write its JSON to a unique temporary file (e.g., tmp_klaviyo, tmp_meta, tmp_ga4,
tmp_gsc) while still running in background (call gather_klaviyo > "$TMP_KLAVIYO"
& etc.), wait for all to finish, then read those files into KLAVIYO_DATA,
META_DATA, GA4_DATA, GSC_DATA (or pass them to jq via --argfile/--slurpfile) and
clean up the temp files; use the gather_klaviyo, gather_meta, gather_ga4,
gather_gsc names to locate where to redirect output.

Copilot AI left a comment

Pull request overview

Adds a new “marketing command center” skill to the ops plugin, intended to centralize email (Klaviyo), paid ads (Meta), analytics (GA4), and SEO (Search Console) into a unified /ops-routed experience.

Changes:

  • Added ops-marketing skill documentation/routing, including setup and dashboard behavior.
  • Added bin/ops-marketing-dash to pre-gather multi-channel metrics and emit unified JSON.
  • Extended /ops router and plugin userConfig schema with marketing-related credentials.

Reviewed changes

Copilot reviewed 3 out of 4 changed files in this pull request and generated 6 comments.

| File | Description |
|---|---|
| claude-ops/skills/ops/SKILL.md | Adds router entry intended to dispatch marketing-related requests to the new skill. |
| claude-ops/skills/ops-marketing/SKILL.md | Introduces the marketing skill content: credential resolution, subcommands, and dashboard/setup guidance. |
| claude-ops/bin/ops-marketing-dash | New Bash gatherer script intended to run channel fetches in parallel and output unified JSON. |
| claude-ops/.claude-plugin/plugin.json | Adds new userConfig keys for Klaviyo, Meta Ads, GA4, and Search Console configuration. |


| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```bash
GA4_PROPERTY="${GA4_PROPERTY_ID:-$(claude plugin config get ga4_property_id 2>/dev/null)}"
# GA4 uses gcloud application default credentials — check if configured:
gcloud auth application-default print-access-token 2>/dev/null
```

Comment on lines +24 to +28

```bash
KLAVIYO_KEY="${KLAVIYO_PRIVATE_KEY:-$(claude plugin config get klaviyo_private_key 2>/dev/null)}"
if [ -z "$KLAVIYO_KEY" ]; then
  KLAVIYO_KEY="$(doppler secrets get KLAVIYO_PRIVATE_KEY --plain 2>/dev/null)"
fi
```

| (empty), dashboard | Run full marketing dashboard |
| email, klaviyo | Klaviyo email metrics |
| ads, meta | Meta Ads performance |
| google-ads | Google Ads (if configured) |

Comment on lines +324 to +329

```bash
curl -s "https://graph.facebook.com/v18.0/me/accounts?fields=instagram_business_account" \
  -H "Authorization: Bearer ${META_TOKEN}" | jq '.data[].instagram_business_account.id' 2>/dev/null

# Then pull media insights
curl -s "https://graph.facebook.com/v18.0/${IG_ACCOUNT_ID}?fields=followers_count,media_count,profile_views" \
  -H "Authorization: Bearer ${META_TOKEN}" | jq '{followers: .followers_count, posts: .media_count, profile_views: .profile_views}'
```

Comment on lines +260 to +262

```bash
-d '{
  "startDate": "'$(date -v-28d +%Y-%m-%d 2>/dev/null || date -d '28 days ago' +%Y-%m-%d)'",
  "endDate": "'$(date +%Y-%m-%d)'",
```

```bash
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```

Background subshell variable assignments are silently lost

High Severity

VAR=$(func) & runs the assignment in a background subshell, so the parent shell never receives the values. After wait, KLAVIYO_DATA, META_DATA, GA4_DATA, and GSC_DATA are all empty, causing the final jq --argjson to fail on invalid (empty) JSON input. The existing ops-gather script in the same repo correctly solves this by redirecting output to temp files and reading them after wait.


Reviewed by Cursor Bugbot for commit d6206d7.

| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |

Keyword "email" routes to two conflicting skills

Medium Severity

The keyword email appears in both the comms route (line 24, dispatching to /ops-comms) and the new marketing route (line 33, dispatching to /ops:ops-marketing). This ambiguity means /ops email could be routed to either skill depending on which row the router matches first, leading to unpredictable behavior.

Additional Locations (1)

if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi

Meta account ID Doppler fallback incorrectly gated

Medium Severity

The Doppler fallback for META_ACCOUNT is nested inside the if [ -z "$META_TOKEN" ] block. If META_TOKEN is resolved via env var or plugin config but META_ACCOUNT is not, the Doppler lookup for META_AD_ACCOUNT_ID is skipped entirely. This causes gather_meta to silently return null even when the account ID exists in Doppler.


Adds ops-ecom skill with orders, inventory, fulfillment, analytics,
health, products, customers, and setup sub-commands. Includes
ops-ecom-health bin script, ops router entry, and plugin.json userConfig
fields for shopify_store_url, shopify_admin_token, shipbob_access_token.
Generic — works with any Shopify store via Admin API credentials.
@auroracapital auroracapital merged commit 2f707a8 into main Apr 13, 2026
2 of 3 checks passed
@auroracapital auroracapital deleted the feature/ops-ecom branch April 13, 2026 15:44

@devin-ai-integration devin-ai-integration Bot left a comment


Devin Review found 1 new potential issue.

View 8 additional findings in Devin Review.


Comment on lines +18 to +21
if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi

🟡 META_AD_ACCOUNT_ID Doppler fallback only runs when META_ADS_TOKEN is also missing

On lines 18-21, the Doppler lookup for META_AD_ACCOUNT_ID is nested inside the if [ -z "$META_TOKEN" ] block. This means if a user has META_ADS_TOKEN set via environment variable or plugin config but has META_AD_ACCOUNT_ID only in Doppler, the account ID will never be resolved. The gather_meta() function (line 56) will then return null because META_ACCOUNT is empty, silently skipping all Meta Ads data even though valid credentials exist.

Suggested change
if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi
if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
fi
if [ -z "$META_ACCOUNT" ]; then
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi


@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f7841576bd


Comment on lines +7 to +8
SHOPIFY_STORE="${SHOPIFY_STORE_URL:-}"
SHOPIFY_TOKEN="${SHOPIFY_ACCESS_TOKEN:-}"

P1: Resolve Shopify creds from plugin config in health script

ops-ecom-health only reads SHOPIFY_STORE_URL and SHOPIFY_ACCESS_TOKEN, so users who configure the newly added plugin settings (shopify_store_url and shopify_admin_token) still hit missing_credentials. This breaks the documented /ops:ops-ecom setup path unless users also duplicate the values into env vars or Doppler, which makes the health check fail in a common configured state.
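The documented resolution order (userConfig → env var → Doppler) could be factored into a single helper so the health script and the gather script agree. This is a hedged sketch, not the repo's code: the `CLAUDE_PLUGIN_CONFIG_*` names are hypothetical stand-ins for however the plugin actually exposes its userConfig values.

```shell
#!/usr/bin/env bash
# resolve <plugin_config_value> <env_value> <doppler_secret_name>
# Tries each source in the documented order; each credential falls back
# independently, so one source being set never masks another's lookup.
resolve() {
  local plugin="$1" env="$2" secret="$3"
  if [ -n "$plugin" ]; then
    echo "$plugin"
  elif [ -n "$env" ]; then
    echo "$env"
  else
    # Doppler is optional; a missing CLI or secret yields an empty string.
    doppler secrets get "$secret" --plain 2>/dev/null || echo ""
  fi
}

# CLAUDE_PLUGIN_CONFIG_* are assumed names for the plugin's userConfig export.
SHOPIFY_STORE="$(resolve "${CLAUDE_PLUGIN_CONFIG_shopify_store_url:-}" \
                         "${SHOPIFY_STORE_URL:-}" SHOPIFY_STORE_URL)"
SHOPIFY_TOKEN="$(resolve "${CLAUDE_PLUGIN_CONFIG_shopify_admin_token:-}" \
                         "${SHOPIFY_ACCESS_TOKEN:-}" SHOPIFY_ACCESS_TOKEN)"
```

Because each credential is resolved on its own, this shape also avoids the META_ACCOUNT gating bug flagged elsewhere in this review.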


Comment on lines +18 to +20
if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"

P2: Preserve Meta account when applying Doppler token fallback

The fallback branch keyed on missing META_TOKEN also unconditionally rewrites META_ACCOUNT from Doppler. If the account ID is already present from env/plugin config and only the token needs Doppler (or the Doppler account secret is absent), this clears META_ACCOUNT and causes gather_meta to return null despite valid configuration. Keep the existing account value unless it is also missing.



@cursor cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

There are 4 total unresolved issues (including 3 from previous reviews).


Bugbot Autofix is ON, but it could not run because the branch was deleted or merged before autofix could start.

Reviewed by Cursor Bugbot for commit f784157.

if [ -z "$META_TOKEN" ]; then
META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi

Doppler fallback unconditionally overwrites already-resolved META_ACCOUNT

Medium Severity

When META_TOKEN is empty, the Doppler fallback block unconditionally overwrites META_ACCOUNT on line 20, even if it was already successfully resolved from an environment variable or plugin config on line 10. If the account ID exists in env/config but the token only exists in Doppler, entering this block clobbers the valid META_ACCOUNT with whatever Doppler returns — which could be empty, silently breaking Meta Ads data gathering.


auroracapital added a commit that referenced this pull request Apr 13, 2026
…49)

* feat: add /ops:marketing skill — email, ads, analytics, SEO command center

Adds ops-marketing skill with Klaviyo, Meta Ads, GA4, and Google Search Console
integrations. Includes bin/ops-marketing-dash pre-gather script, ops router update,
and plugin.json userConfig fields for all marketing credentials.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* feat: add /ops:ecom — Shopify store command center

Adds ops-ecom skill with orders, inventory, fulfillment, analytics,
health, products, customers, and setup sub-commands. Includes
ops-ecom-health bin script, ops router entry, and plugin.json userConfig
fields for shopify_store_url, shopify_admin_token, shipbob_access_token.
Generic — works with any Shopify store via Admin API credentials.

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
