feat: add /ops:marketing — email, ads, analytics, SEO command center (#49)
auroracapital merged 2 commits into `main`.
Conversation
Adds ops-marketing skill with Klaviyo, Meta Ads, GA4, and Google Search Console integrations. Includes bin/ops-marketing-dash pre-gather script, ops router update, and plugin.json userConfig fields for all marketing credentials. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
📝 Walkthrough
A new marketing operations skill and dashboard integrate with Klaviyo, Meta Ads, GA4, and Google Search Console. Plugin configuration is extended with credential fields, a bash script aggregates marketing metrics from multiple sources, and command routing directs marketing-related inputs to the new skill center.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant OpsMktSkill as ops-marketing<br/>Skill
    participant Dashboard as ops-marketing-dash<br/>Script
    participant CredMgmt as Config/Secrets<br/>Management
    participant MktAPIs as Marketing APIs<br/>(Klaviyo, Meta, GA4, GSC)
    User->>OpsMktSkill: Request marketing data/dashboard
    OpsMktSkill->>Dashboard: Execute dashboard command
    Dashboard->>CredMgmt: Resolve credentials<br/>(env → plugin config → doppler)
    CredMgmt-->>Dashboard: Return credentials
    Dashboard->>MktAPIs: Concurrent API requests<br/>for each channel
    MktAPIs-->>Dashboard: Channel-specific responses
    Dashboard->>Dashboard: Parse & aggregate metrics<br/>(jq, awk processing)
    Dashboard-->>OpsMktSkill: Return JSON{date, klaviyo,<br/>meta, ga4, gsc}
    OpsMktSkill->>OpsMktSkill: Render dashboard/response
    OpsMktSkill-->>User: Display marketing metrics
```
Estimated Code Review Effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 3 passed
```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```
🔴 Background subshell variable assignments are invisible to parent shell, making all gathered data empty
The parallel execution pattern KLAVIYO_DATA=$(gather_klaviyo) & runs the command substitution and assignment inside a background subshell. Variable assignments in subshells do not propagate back to the parent shell. After wait, KLAVIYO_DATA, META_DATA, GA4_DATA, and GSC_DATA are all empty/unset in the parent. With set -u (line 5), the script will immediately fail with an "unbound variable" error at line 179 when trying to use $KLAVIYO_DATA. Compare with the working pattern in claude-ops/bin/ops-gather which correctly writes subshell output to temp files and reads them back.
Prompt for agents
The parallel execution pattern on lines 171-174 assigns command substitution results in background subshells, but those assignments never propagate to the parent shell. After wait, all four variables (KLAVIYO_DATA, META_DATA, GA4_DATA, GSC_DATA) are empty/unset, causing the script to crash at line 179 due to set -u.
The existing ops-gather script (claude-ops/bin/ops-gather) demonstrates the correct pattern: create a temp directory, run each function writing to a temp file in the background, wait, then read the files back. Apply the same pattern here:
1. Create a temp directory with mktemp -d and a trap to clean it up
2. Change each background call to: gather_klaviyo > "$TMPDIR/klaviyo.json" &
3. After wait, read the files: KLAVIYO_DATA=$(cat "$TMPDIR/klaviyo.json")
4. Then use the variables in the final jq command as before
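The steps above can be sketched in a few lines of self-contained bash. The `gather_*` functions here are stand-in stubs (assumptions), not the real Klaviyo/Meta API calls, and the final `printf` stands in for the script's `jq` merge:

```shell
# Stubs standing in for the real gather functions (assumption).
gather_klaviyo() { echo '{"campaigns":3}'; }
gather_meta()    { echo '{"spend":120.5}'; }

tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT

# Each producer redirects to its own file while running in the background.
gather_klaviyo > "$tmp/klaviyo.json" &
gather_meta    > "$tmp/meta.json" &
wait

# After wait, read the files back in the parent shell, where the assignments
# are visible (unlike VAR=$(cmd) &, which assigns inside a subshell).
KLAVIYO_DATA=$(cat "$tmp/klaviyo.json")
META_DATA=$(cat "$tmp/meta.json")
printf '{"klaviyo":%s,"meta":%s}\n' "$KLAVIYO_DATA" "$META_DATA"
```

The background jobs still run concurrently; only the reads happen in the parent shell, which is what makes the values survive `wait`.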
| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
🔴 Keyword email is ambiguous — matches both comms and marketing routes
The keyword email now appears in two routing table entries: line 24 routes it to /ops-comms $ARGUMENTS and line 33 routes it to /ops:ops-marketing $ARGUMENTS. When a user types email as the argument, the router has conflicting instructions and may pick either destination unpredictably, breaking the expected behavior for one or both features.
```diff
-| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
+| marketing, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```
```bash
GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```
🟡 Shell injection via GSC_SITE in python3 single-quoted string
On line 143, GSC_SITE is interpolated directly into a Python single-quoted string literal: print(urllib.parse.quote('${GSC_SITE}', safe='')). If GSC_SITE contains a single quote (e.g., a user mistakenly configures it as https://example.com'; import os; os.system('...')), the shell expansion breaks out of the Python string and executes arbitrary code. While the value is user-configured (self-inflicted), this is still a code injection vector that should be sanitized, e.g., by passing the value as a command-line argument to python3 rather than interpolating it into the source code.
```diff
-GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
-  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
+GSC_SITE_ENCODED=$(python3 -c "import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=''))" "${GSC_SITE}" 2>/dev/null \
+  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```
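A quick way to sanity-check the argv-based variant: a value containing a single quote is now just data, never Python source (a sketch; assumes `python3` is on PATH — the `site` value is an invented example):

```shell
# A site URL containing a single quote — the injection case from the comment.
site="https://example.com/o'brien"

# Passed as argv, the value never touches the Python source string.
encoded=$(python3 -c "import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=''))" "$site")
echo "$encoded"
```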
```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```
Bug: Assigning variables from backgrounded commands like VAR=$(command) & will result in empty variables, as the assignment happens in a subshell. This causes a downstream jq failure.
Severity: CRITICAL
Suggested Fix
Follow the established pattern seen in the ops-dash script. Use temporary files to store the output of the background processes. Create a temporary directory, write the output of each backgrounded command to a file within it, and then after the wait command, read the contents of those files back into the variables.
Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent.
Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not
valid.
Location: claude-ops/bin/ops-marketing-dash#L171-L175
Potential issue: The script executes data gathering functions like `gather_klaviyo` in
the background and attempts to assign their output to variables (e.g.,
`KLAVIYO_DATA=$(gather_klaviyo) &`). Because these assignments run in a background
subshell, the variables (`KLAVIYO_DATA`, `META_DATA`, etc.) remain empty in the parent
shell. These empty strings are then passed to `jq` via the `--argjson` flag. Since an
empty string is not valid JSON, `jq` will exit with an error, causing the entire
`/ops:marketing` dashboard script to fail deterministically on every execution.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: d6206d770b
```bash
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
```
Capture parallel gather output in parent shell
These backgrounded assignments (KLAVIYO_DATA=$(...) & etc.) execute in subshells, so the parent shell never receives the variable values; with set -u, the later jq --argjson ... "$KLAVIYO_DATA" expansion aborts with an unbound-variable error and the dashboard fails before emitting JSON. Running bash claude-ops/bin/ops-marketing-dash currently exits on this path, so results need to be collected via temp files/FIFOs (or another parent-shell-safe pattern) instead of backgrounded assignments.
| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
Remove duplicate `email` alias from ops router
Adding `email` to this marketing route makes routing ambiguous because `email` is already mapped to `/ops-comms` earlier in the same table, so `/ops email` no longer has a deterministic target and can keep dispatching to comms instead of marketing. Since this file is the router instruction source, duplicate aliases make the new marketing entry unreliable; use non-overlapping keywords or remove one mapping.
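In a deterministic first-match router the effect is easy to see in a few lines of shell (an illustrative sketch — the real `/ops` router is an instruction table for the model, so its tie-breaking is even less predictable): the later `email` pattern is simply dead.

```shell
# Illustrative stand-in for the routing table; not the real /ops router.
route() {
  case "$1" in
    email|gmail) echo "/ops-comms" ;;
    marketing|email|klaviyo|ads) echo "/ops:ops-marketing" ;;  # "email" here is unreachable
    *) echo "unknown" ;;
  esac
}

route email     # matched by the first row, never reaches marketing
route klaviyo   # unambiguous
```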
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
claude-ops/skills/ops/SKILL.md (1)
24-33: ⚠️ Potential issue | 🟠 Major
Routing conflict: `ops-comms` and `ops-marketing`.
Line 24 routes `email` to `/ops-comms` (Gmail/messaging), while line 33 also includes `email` in `/ops:ops-marketing` (Klaviyo). This creates ambiguous routing behavior. Consider disambiguating by:
- Removing `email` from the marketing route, keeping `klaviyo` for Klaviyo-specific access
- Or renaming the comms route keyword to `gmail` instead of `email`

Suggested fix (option 1 - remove email from marketing route)
```diff
-| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
+| marketing, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@claude-ops/skills/ops/SKILL.md` around lines 24-33, routing currently conflicts because the keyword "email" is listed for both ops-comms and the ops-marketing route (/ops:ops-marketing); remove "email" from the marketing keywords so "email" only maps to ops-comms (leave ops-comms' keywords intact) by editing the marketing entry that currently reads "marketing, email, klaviyo, ads, meta, seo, campaigns" to exclude "email" (keep "klaviyo" to target Klaviyo-specific access); ensure the change is reflected in SKILL.md where the `/ops:ops-marketing $ARGUMENTS` route is declared.
🧹 Nitpick comments (1)
claude-ops/skills/ops-marketing/SKILL.md (1)
141-141: Minor grammar nit: "Top performing" → "Top-performing". Compound adjectives before a noun should be hyphenated.
```diff
-### Top performing ads (last 7d)
+### Top-performing ads (last 7d)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@claude-ops/skills/ops-marketing/SKILL.md` at line 141, update the header text "Top performing ads (last 7d)" to use a hyphenated compound adjective: "Top-performing ads (last 7d)".
📒 Files selected for processing (4)
- claude-ops/.claude-plugin/plugin.json
- claude-ops/bin/ops-marketing-dash
- claude-ops/skills/ops-marketing/SKILL.md
- claude-ops/skills/ops/SKILL.md
```bash
GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```
Potential command injection in URL encoding.
The GSC_SITE variable is interpolated directly into a Python string. If it contains a single quote, it could break out of the string context.
Safer approach using environment variable
```diff
-GSC_SITE_ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('${GSC_SITE}', safe=''))" 2>/dev/null \
-  || echo "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
+GSC_SITE_ENCODED=$(GSC_SITE="$GSC_SITE" python3 -c "import urllib.parse, os; print(urllib.parse.quote(os.environ['GSC_SITE'], safe=''))" 2>/dev/null \
+  || printf '%s' "${GSC_SITE}" | sed 's|:|%3A|g; s|/|%2F|g')
```
📝 Committable suggestion
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@claude-ops/bin/ops-marketing-dash` around lines 143 - 144, The current
command injects GSC_SITE directly into a Python string in the python3 -c
invocation (producing GSC_SITE_ENCODED), which allows a value containing a
single quote to break out; change the invocation to pass GSC_SITE via the
environment or stdin so it is not interpolated into the source string (e.g.,
read os.environ['GSC_SITE'] or read from stdin inside the Python snippet) and
fall back to the sed encoding only if the python invocation fails; update
references to GSC_SITE_ENCODED and the python3 -c usage accordingly so input is
safely obtained from the environment/STDIN rather than string interpolation.
```bash
# Run all in parallel (subshells)
KLAVIYO_DATA=$(gather_klaviyo) &
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait

# Emit unified JSON
jq -n \
  --argjson klaviyo "$KLAVIYO_DATA" \
  --argjson meta "$META_DATA" \
  --argjson ga4 "$GA4_DATA" \
  --argjson gsc "$GSC_DATA" \
  --arg date "$TODAY" \
  '{date: $date, klaviyo: $klaviyo, meta: $meta, ga4: $ga4, gsc: $gsc}'
```
Critical: Background subshell variable assignments won't propagate to parent shell.
When you run VAR=$(command) &, the variable assignment occurs in a subshell. After wait, the parent shell's variables (KLAVIYO_DATA, META_DATA, etc.) remain unset/empty, causing the final jq command on lines 178-184 to fail.
Proposed fix using temporary files
```diff
-# Run all in parallel (subshells)
-KLAVIYO_DATA=$(gather_klaviyo) &
-META_DATA=$(gather_meta) &
-GA4_DATA=$(gather_ga4) &
-GSC_DATA=$(gather_gsc) &
-wait
-
-# Emit unified JSON
-jq -n \
-  --argjson klaviyo "$KLAVIYO_DATA" \
-  --argjson meta "$META_DATA" \
-  --argjson ga4 "$GA4_DATA" \
-  --argjson gsc "$GSC_DATA" \
-  --arg date "$TODAY" \
-  '{date: $date, klaviyo: $klaviyo, meta: $meta, ga4: $ga4, gsc: $gsc}'
+# Run all in parallel using temp files
+TMPDIR=$(mktemp -d)
+trap 'rm -rf "$TMPDIR"' EXIT
+
+gather_klaviyo > "$TMPDIR/klaviyo.json" &
+gather_meta > "$TMPDIR/meta.json" &
+gather_ga4 > "$TMPDIR/ga4.json" &
+gather_gsc > "$TMPDIR/gsc.json" &
+wait
+
+# Emit unified JSON
+jq -n \
+  --slurpfile klaviyo "$TMPDIR/klaviyo.json" \
+  --slurpfile meta "$TMPDIR/meta.json" \
+  --slurpfile ga4 "$TMPDIR/ga4.json" \
+  --slurpfile gsc "$TMPDIR/gsc.json" \
+  --arg date "$TODAY" \
+  '{date: $date, klaviyo: $klaviyo[0], meta: $meta[0], ga4: $ga4[0], gsc: $gsc[0]}'
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@claude-ops/bin/ops-marketing-dash` around lines 170 - 184, The backgrounded
assignments like KLAVIYO_DATA=$(gather_klaviyo) & run in subshells so the parent
shell never gets the results; change the pattern to have each gather_* producer
write its JSON to a unique temporary file (e.g., tmp_klaviyo, tmp_meta, tmp_ga4,
tmp_gsc) while still running in background (call gather_klaviyo > "$TMP_KLAVIYO"
& etc.), wait for all to finish, then read those files into KLAVIYO_DATA,
META_DATA, GA4_DATA, GSC_DATA (or pass them to jq via --argfile/--slurpfile) and
clean up the temp files; use the gather_klaviyo, gather_meta, gather_ga4,
gather_gsc names to locate where to redirect output.
Pull request overview
Adds a new “marketing command center” skill to the ops plugin, intended to centralize email (Klaviyo), paid ads (Meta), analytics (GA4), and SEO (Search Console) into a unified /ops-routed experience.
Changes:
- Added `ops-marketing` skill documentation/routing, including setup and dashboard behavior.
- Added `bin/ops-marketing-dash` to pre-gather multi-channel metrics and emit unified JSON.
- Extended the `/ops` router and plugin `userConfig` schema with marketing-related credentials.
Reviewed changes
Copilot reviewed 3 out of 4 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| claude-ops/skills/ops/SKILL.md | Adds router entry intended to dispatch marketing-related requests to the new skill. |
| claude-ops/skills/ops-marketing/SKILL.md | Introduces the marketing skill content: credential resolution, subcommands, and dashboard/setup guidance. |
| claude-ops/bin/ops-marketing-dash | New Bash gatherer script intended to run channel fetches in parallel and output unified JSON. |
| claude-ops/.claude-plugin/plugin.json | Adds new userConfig keys for Klaviyo, Meta Ads, GA4, and Search Console configuration. |
| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
```bash
GA4_PROPERTY="${GA4_PROPERTY_ID:-$(claude plugin config get ga4_property_id 2>/dev/null)}"
# GA4 uses gcloud application default credentials — check if configured:
gcloud auth application-default print-access-token 2>/dev/null
```
```bash
KLAVIYO_KEY="${KLAVIYO_PRIVATE_KEY:-$(claude plugin config get klaviyo_private_key 2>/dev/null)}"
if [ -z "$KLAVIYO_KEY" ]; then
  KLAVIYO_KEY="$(doppler secrets get KLAVIYO_PRIVATE_KEY --plain 2>/dev/null)"
fi
```
| (empty), dashboard | Run full marketing dashboard |
| email, klaviyo | Klaviyo email metrics |
| ads, meta | Meta Ads performance |
| google-ads | Google Ads (if configured) |
```bash
curl -s "https://graph.facebook.com/v18.0/me/accounts?fields=instagram_business_account" \
  -H "Authorization: Bearer ${META_TOKEN}" | jq '.data[].instagram_business_account.id' 2>/dev/null

# Then pull media insights
curl -s "https://graph.facebook.com/v18.0/${IG_ACCOUNT_ID}?fields=followers_count,media_count,profile_views" \
  -H "Authorization: Bearer ${META_TOKEN}" | jq '{followers: .followers_count, posts: .media_count, profile_views: .profile_views}'
```
```bash
-d '{
  "startDate": "'$(date -v-28d +%Y-%m-%d 2>/dev/null || date -d '28 days ago' +%Y-%m-%d)'",
  "endDate": "'$(date +%Y-%m-%d)'",
```
```bash
META_DATA=$(gather_meta) &
GA4_DATA=$(gather_ga4) &
GSC_DATA=$(gather_gsc) &
wait
```
Background subshell variable assignments are silently lost
High Severity
VAR=$(func) & runs the assignment in a background subshell, so the parent shell never receives the values. After wait, KLAVIYO_DATA, META_DATA, GA4_DATA, and GSC_DATA are all empty, causing the final jq --argjson to fail on invalid (empty) JSON input. The existing ops-gather script in the same repo correctly solves this by redirecting output to temp files and reading them after wait.
Reviewed by Cursor Bugbot for commit d6206d7.
| revenue, money, mrr, costs | `/ops-revenue` |
| deploy, ship | `/ops-deploy` |
| merge, prs, ship-prs | `/ops-merge $ARGUMENTS` |
| marketing, email, klaviyo, ads, meta, seo, campaigns | `/ops:ops-marketing $ARGUMENTS` |
Keyword "email" routes to two conflicting skills
Medium Severity
The keyword email appears in both the comms route (line 24, dispatching to /ops-comms) and the new marketing route (line 33, dispatching to /ops:ops-marketing). This ambiguity means /ops email could be routed to either skill depending on which row the router matches first, leading to unpredictable behavior.
```bash
if [ -z "$META_TOKEN" ]; then
  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi
```
Meta account ID Doppler fallback incorrectly gated
Medium Severity
The Doppler fallback for META_ACCOUNT is nested inside the if [ -z "$META_TOKEN" ] block. If META_TOKEN is resolved via env var or plugin config but META_ACCOUNT is not, the Doppler lookup for META_AD_ACCOUNT_ID is skipped entirely. This causes gather_meta to silently return null even when the account ID exists in Doppler.
Adds ops-ecom skill with orders, inventory, fulfillment, analytics, health, products, customers, and setup sub-commands. Includes ops-ecom-health bin script, ops router entry, and plugin.json userConfig fields for shopify_store_url, shopify_admin_token, shipbob_access_token. Generic — works with any Shopify store via Admin API credentials.
```bash
if [ -z "$META_TOKEN" ]; then
  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi
```
🟡 META_AD_ACCOUNT_ID Doppler fallback only runs when META_ADS_TOKEN is also missing
On lines 18-21, the Doppler lookup for META_AD_ACCOUNT_ID is nested inside the if [ -z "$META_TOKEN" ] block. This means if a user has META_ADS_TOKEN set via environment variable or plugin config but has META_AD_ACCOUNT_ID only in Doppler, the account ID will never be resolved. The gather_meta() function (line 56) will then return null because META_ACCOUNT is empty, silently skipping all Meta Ads data even though valid credentials exist.
```diff
-if [ -z "$META_TOKEN" ]; then
-  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
-  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
-fi
+if [ -z "$META_TOKEN" ]; then
+  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
+fi
+if [ -z "$META_ACCOUNT" ]; then
+  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
+fi
```
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: f7841576bd
```bash
SHOPIFY_STORE="${SHOPIFY_STORE_URL:-}"
SHOPIFY_TOKEN="${SHOPIFY_ACCESS_TOKEN:-}"
```
Resolve Shopify creds from plugin config in health script
ops-ecom-health only reads SHOPIFY_STORE_URL and SHOPIFY_ACCESS_TOKEN, so users who configure the newly added plugin settings (shopify_store_url and shopify_admin_token) still hit missing_credentials. This breaks the documented /ops:ops-ecom setup path unless users also duplicate the values into env vars or Doppler, which makes the health check fail in a common configured state.
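A sketch of the tiered resolution the comment asks for (env var, then plugin config, then Doppler). The `claude` and `doppler` commands are stubbed here so the example runs standalone; the real invocations and stub values are assumptions modeled on the other gather scripts in this PR.

```shell
unset SHOPIFY_STORE_URL SHOPIFY_ACCESS_TOKEN   # clean slate for the demo

# Stubs standing in for `claude plugin config get …` and
# `doppler secrets get … --plain` (assumptions).
claude()  { echo "mystore.myshopify.com"; }
doppler() { echo "shpat_stub_token"; }

# Tier 1: environment variables.
SHOPIFY_STORE="${SHOPIFY_STORE_URL:-}"
SHOPIFY_TOKEN="${SHOPIFY_ACCESS_TOKEN:-}"

# Tiers 2/3: fall through to plugin config, then Doppler, only when unset.
[ -n "$SHOPIFY_STORE" ] || SHOPIFY_STORE="$(claude 2>/dev/null || echo "")"
[ -n "$SHOPIFY_TOKEN" ] || SHOPIFY_TOKEN="$(doppler 2>/dev/null || echo "")"

echo "$SHOPIFY_STORE $SHOPIFY_TOKEN"
```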
```bash
if [ -z "$META_TOKEN" ]; then
  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
```
Preserve Meta account when applying Doppler token fallback
The fallback branch keyed on missing META_TOKEN also unconditionally rewrites META_ACCOUNT from Doppler. If the account ID is already present from env/plugin config and only the token needs Doppler (or the Doppler account secret is absent), this clears META_ACCOUNT and causes gather_meta to return null despite valid configuration. Keep the existing account value unless it is also missing.
Cursor Bugbot has reviewed your changes and found 1 potential issue.
There are 4 total unresolved issues (including 3 from previous reviews).
Reviewed by Cursor Bugbot for commit f784157.
```bash
if [ -z "$META_TOKEN" ]; then
  META_TOKEN="$(doppler secrets get META_ADS_TOKEN --plain 2>/dev/null || echo "")"
  META_ACCOUNT="$(doppler secrets get META_AD_ACCOUNT_ID --plain 2>/dev/null || echo "")"
fi
```
Doppler fallback unconditionally overwrites already-resolved META_ACCOUNT
Medium Severity
When META_TOKEN is empty, the Doppler fallback block unconditionally overwrites META_ACCOUNT on line 20, even if it was already successfully resolved from an environment variable or plugin config on line 10. If the account ID exists in env/config but the token only exists in Doppler, entering this block clobbers the valid META_ACCOUNT with whatever Doppler returns — which could be empty, silently breaking Meta Ads data gathering.
feat: add /ops:marketing — email, ads, analytics, SEO command center (#49)

* feat: add /ops:marketing skill — email, ads, analytics, SEO command center

  Adds ops-marketing skill with Klaviyo, Meta Ads, GA4, and Google Search Console integrations. Includes bin/ops-marketing-dash pre-gather script, ops router update, and plugin.json userConfig fields for all marketing credentials.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* feat: add /ops:ecom — Shopify store command center

  Adds ops-ecom skill with orders, inventory, fulfillment, analytics, health, products, customers, and setup sub-commands. Includes ops-ecom-health bin script, ops router entry, and plugin.json userConfig fields for shopify_store_url, shopify_admin_token, shipbob_access_token. Generic — works with any Shopify store via Admin API credentials.

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>


Summary
- `skills/ops-marketing/SKILL.md` — full marketing command center with Klaviyo (email), Meta Ads, GA4, Google Search Console, social media, cross-channel campaigns, and interactive setup
- `bin/ops-marketing-dash` — bash pre-gather script that hits all configured channels in parallel and outputs unified JSON
- `skills/ops/SKILL.md` router with `marketing, email, klaviyo, ads, meta, seo, campaigns` routes
- `userConfig` fields added to `plugin.json`: `klaviyo_private_key`, `meta_ads_token`, `meta_ad_account_id`, `ga4_property_id`, `google_search_console_site`

Test plan
- `/ops:marketing` with no args shows unified dashboard (graceful nulls for unconfigured channels)
- `/ops:marketing setup` prompts for each credential and validates with a test API call
- `/ops:marketing email` returns Klaviyo lists + campaigns
- `/ops:marketing ads` returns Meta Ads spend/ROAS for last 7d
- `/ops:marketing analytics` returns GA4 sessions/conversions (requires gcloud ADC)
- `/ops:marketing seo` returns Search Console clicks/impressions
- `/ops:marketing campaigns` shows cross-channel active campaigns
- `/ops` router correctly dispatches `marketing`, `email`, `klaviyo`, `ads`, `meta`, `seo`, `campaigns`

🤖 Generated with Claude Code
Note
Medium Risk
Introduces new scripts/skills that call external APIs (Shopify, Klaviyo, Meta, Google) and adds new credential configuration fields; main risk is misconfiguration or leaking/handling tokens incorrectly in logs/output.
Overview
Adds new `/ops:ops-marketing` and `/ops:ops-ecom` skills to provide a marketing dashboard (Klaviyo, Meta Ads, GA4, Search Console) and a Shopify store command center (orders, inventory, fulfillment, analytics, health).
Introduces `bin/ops-marketing-dash` and `bin/ops-ecom-health` to fetch channel/store metrics and emit unified JSON, with credential resolution via plugin config/env vars and Doppler fallback.
Extends `.claude-plugin/plugin.json` with new sensitive API key/token fields (plus Shopify/ShipBob settings) and updates the main `skills/ops/SKILL.md` router to dispatch new `marketing` and `ecom` commands.
Reviewed by Cursor Bugbot for commit f784157.