Product Updates

A Life of CX in a World of Claude: VoC Survey Automation via CLI

Follow a CX leader who orchestrates her entire VoC program through AI-powered CLI automation. From survey creation to custom dashboards — described in natural language, built in minutes.

Sarah Chen, VP of Customer Experience

A New Paradigm for CX Operations

The traditional CX workflow looks like this: click through 15 screens to create a survey, configure targeting in another tool, schedule delivery in a third, wait for responses, export to CSV, analyze in a spreadsheet, build a report in yet another tool, and present in a meeting.

What if you could describe what you want in one sentence and have it built in minutes?

This is the story of Sarah, a VP of Customer Experience at a mid-market SaaS company. She’s technology-forward, comfortable with AI, and tired of spending more time operating tools than thinking about customer experience. She uses ActionXM’s CLI with Claude Code — and the result is a CX operations workflow that looks nothing like the dashboard-clicking paradigm.

The UI is still there. Sarah uses it when she wants visual exploration, fine-tuning, and presentation. But the speed of her program comes from AI-powered CLI automation.

Gemini says “Build me an app.” ActionXM says “Build me a CX program.”


Scene 1: Monday Morning — “Build Me a Post-Onboarding Survey”

Sarah opens her terminal. Claude Code is connected to ActionXM’s CLI.

sarah@macbook — claude code
sarah › Create a post-onboarding NPS survey for customers who completed setup in the last 7 days. Include CES question about setup difficulty, open-text for improvement suggestions. Schedule for Tuesday 10am.
claude › I'll set this up. Creating the survey, audience, and schedule...
$ actionxm surveys create --from-prompt \
--name "Post-Onboarding NPS Q1" \
--questions '["nps", "ces:setup_difficulty", "open_text:improvements"]'
✓ Survey created: survey_2026Q1_onboard_nps (3 questions)
$ actionxm audiences create \
--name "Recent Onboarded" \
--filter "setup_completed_at > now() - interval '7 days'"
✓ Audience created: 847 customers match
$ actionxm workflows create \
--trigger "schedule:2026-03-04T10:00:00" \
--action "send_survey:survey_2026Q1_onboard_nps" \
--audience "Recent Onboarded"
✓ Workflow scheduled: Tuesday 10:00 AM → 847 recipients

Three CLI commands. One sentence from Sarah. A survey created, audience targeted, delivery scheduled.

But here’s the key: Sarah opens the ActionXM UI and sees the survey in the visual builder. She tweaks the NPS question wording, adds skip logic for promoters, and reorders the CES scale labels. The CLI built the structure. The UI lets her polish it.

CLI — Speed: 30 sec — survey created, targeted, scheduled
UI — Polish: 5 min — fine-tune wording, add skip logic

Traditional workflow: Open survey builder → click “New Survey” → select template → edit each question → configure scoring → click “Audiences” → build filter → save → click “Workflows” → create schedule → attach survey → attach audience → save. Time: 25-40 minutes, 15+ screens, dozens of clicks.


Scene 2: Wednesday — “How Are Responses Looking?”

Two days after launch. Sarah wants a quick check on the survey performance.

sarah@macbook — claude code
sarah › Show me response rates and early NPS breakdown for the onboarding survey. Flag any open-text responses mentioning API.
claude › Pulling survey analytics and running text analysis...
$ actionxm surveys stats survey_2026Q1_onboard_nps
📊 Post-Onboarding NPS Q1
Sent: 847 | Opened: 412 (48.6%) | Completed: 289 (34.1%)
NPS: +38 | Promoters: 52% | Passives: 34% | Detractors: 14%
CES avg: 3.2/5 (setup difficulty)
$ actionxm responses search --survey survey_2026Q1_onboard_nps --text "API" --with-sentiment
🔍 12 responses mention "API"
Negative (8): "API docs are outdated", "webhook setup took 3 hours",
"couldn't find the API key page"
Neutral (3): "API works fine once set up"
Positive (1): "API integration was seamless"

In 10 seconds, Sarah has response rates, NPS breakdown, CES scores, and a sentiment-analyzed view of every API-related comment. She can open the ActionXM dashboard to see the same data with charts and trend lines — but for a quick morning check, the terminal is faster.

You want UI, you have UI. The dashboard is one click away, showing the same data with interactive charts, filter dropdowns, and exportable reports. CLI for speed. UI for exploration.
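For intuition on the NPS figure in the stats above: NPS is the percentage of promoters (scores 9-10) minus the percentage of detractors (0-6). A minimal sketch with illustrative numbers — this is not the ActionXM API, which computes the score server-side:

```python
# Toy NPS calculation. The scores list is illustrative, not real survey data.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 52% promoters, 34% passives, 14% detractors -> NPS +38,
# matching the survey stats shown in the transcript.
scores = [10] * 52 + [8] * 34 + [4] * 14
print(nps(scores))  # 38
```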


Scene 3: Thursday — “Build Me a Custom Dashboard”

Sarah has a stakeholder meeting next week. She needs a dashboard that doesn’t exist yet.

sarah@macbook — claude code
sarah › Create a dashboard that shows onboarding NPS trend by customer segment, overlaid with time-to-first-value metric. Add a filter for company size. Title it "Onboarding Health Monitor".
claude › Building your dashboard with 3 panels and segment filters...
$ actionxm dashboards create \
--name "Onboarding Health Monitor" \
--panels '[
{"type": "trend", "metric": "nps", "group_by": "segment"},
{"type": "overlay", "metrics": ["nps", "time_to_first_value"]},
{"type": "table", "metric": "onboarding_completion", "columns": ["segment", "count", "avg_days"]}
]' \
--filters '["company_size"]'
✓ Dashboard created: /dashboards/onboarding-health-monitor
→ Open in browser: https://app.action-xm.com/dashboards/onboarding-health-monitor

Sarah described a dashboard in one sentence. Claude translated it into a configuration. ActionXM rendered it into a live, interactive dashboard with real-time data, clickable charts, and export capabilities.

This is the “Build me an app” moment for CX. Sarah didn’t file a BI team request. She didn’t wait 2 weeks for a Tableau developer. She described what she wanted and had it in minutes. And because it’s in the ActionXM UI, she can rearrange panels, adjust date ranges, and share the link with her team — no CLI needed.
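The panel configuration Claude emits is just structured data. Here is a hedged sketch of what client-side validation of such a config might look like — the field names mirror the CLI flags above, but the allowed panel types and the real schema are assumptions, not ActionXM's published API:

```python
import json

# Panel config as passed to `actionxm dashboards create --panels`.
panels_json = '''[
  {"type": "trend", "metric": "nps", "group_by": "segment"},
  {"type": "overlay", "metrics": ["nps", "time_to_first_value"]},
  {"type": "table", "metric": "onboarding_completion",
   "columns": ["segment", "count", "avg_days"]}
]'''

ALLOWED_TYPES = {"trend", "overlay", "table"}  # assumed panel types

def validate_panels(raw):
    """Parse the JSON config and reject unknown panel types."""
    panels = json.loads(raw)
    for p in panels:
        if p["type"] not in ALLOWED_TYPES:
            raise ValueError(f"unknown panel type: {p['type']}")
    return panels

print(len(validate_panels(panels_json)))  # 3
```

Because the config is plain JSON, it can be versioned, diffed, and regenerated from a new prompt at any time.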

Traditional vs. AI-Powered Dashboard Creation

Traditional (2-4 weeks)
  • 1. File BI team request
  • 2. Wait in queue (1-2 weeks)
  • 3. Review mockup
  • 4. Feedback round
  • 5. Build in Tableau/Looker
  • 6. Test and deploy

AI-Powered (5 minutes)
  • 1. Describe in natural language
  • 2. Claude builds config
  • 3. ActionXM renders live
  • 4. Fine-tune in UI if needed

Scene 4: Friday — “Something’s Off with Enterprise Customers”

Sarah noticed something in her Thursday dashboard. Enterprise NPS is lower than expected. She wants to dig deeper.

sarah@macbook — claude code
sarah › Compare NPS scores for enterprise vs SMB customers who onboarded this quarter. Cross-reference with session frustration data from their first week. Are enterprise customers hitting different friction points?
claude › Cross-referencing VoC responses with DXA behavioral data across segments...
📊 Segment Comparison: Enterprise vs SMB (Q1 Onboarding)
Enterprise (200+ seats): NPS +22 | Avg frustration: 64/100 | Top friction: /settings/integrations
SMB (1-50 seats): NPS +46 | Avg frustration: 28/100 | Top friction: /billing/plans
⚠ Key Finding
Enterprise customers show 3.2x more rage clicks on /settings/integrations. Session replays show enterprise customers attempting to configure SSO, SCIM, and webhook integrations — the setup flow doesn't adapt to complexity level.

This is where the power of unified data becomes clear. Sarah asked a question that spans two data domains: VoC (NPS scores) and DXA (session frustration). In a traditional multi-tool stack, answering this question requires exporting from Qualtrics, exporting from Hotjar, matching identities in a spreadsheet, and hoping the data lines up.

In ActionXM, it’s one query. The Experience Graph connects survey responses and behavioral data through unified profiles. Claude just asks the API.
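Under the hood, that cross-domain query amounts to a join on a shared customer identity. A rough sketch of the idea — the records and field names here are invented for illustration, and the real Experience Graph API will differ:

```python
from statistics import mean

# Hypothetical records: VoC responses and DXA session metrics,
# keyed by the same customer_id via unified profiles.
voc = [
    {"customer_id": 1, "segment": "enterprise", "nps_score": 6},
    {"customer_id": 2, "segment": "smb", "nps_score": 10},
    {"customer_id": 3, "segment": "enterprise", "nps_score": 9},
]
dxa = {1: {"frustration": 72}, 2: {"frustration": 20}, 3: {"frustration": 55}}

def frustration_by_segment(voc, dxa):
    """Join survey responses to session data and average frustration per segment."""
    by_seg = {}
    for r in voc:
        session = dxa.get(r["customer_id"])
        if session:
            by_seg.setdefault(r["segment"], []).append(session["frustration"])
    return {seg: mean(vals) for seg, vals in by_seg.items()}

print(frustration_by_segment(voc, dxa))
# {'enterprise': 63.5, 'smb': 20}
```

In the multi-tool stack, this join is the manual spreadsheet step; with unified profiles it is a lookup on one key.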


Scene 5: The “Build Me an App” Moment

Sarah’s week convinced her: the onboarding health metrics should be tracked continuously, not manually.

sarah@macbook — claude code
sarah › Create a weekly automated report that tracks onboarding health by segment. NPS, CES, completion rate, frustration score, time-to-value. Email it to the CX team every Monday at 9am. Also build me a live view I can pull up in our team standup.
claude › Setting up the automated report and live dashboard...
$ actionxm reports create \
--name "Weekly Onboarding Health" \
--metrics '["nps", "ces", "completion_rate", "frustration_score", "time_to_value"]' \
--group-by segment --period weekly
✓ Report template created
$ actionxm workflows create \
--trigger "schedule:monday:09:00" \
--action "send_report:weekly_onboarding_health" \
--recipients "cx-team@company.com"
✓ Weekly email scheduled: Mondays 9:00 AM
$ actionxm dashboards create \
--name "Standup: Onboarding Live" \
--template "live-monitor" \
--metrics '["nps_realtime", "frustration_live", "active_surveys", "open_cases"]'
✓ Live dashboard: /dashboards/standup-onboarding-live

Sarah described a reporting system in one sentence. Claude built it: a report template, a weekly email schedule, and a live dashboard for standups. The live dashboard updates in real time. The weekly email arrives every Monday with a formatted summary. The report looks like it took a BI team weeks to build.

It took 2 minutes.
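Conceptually, the weekly report is a group-by over the same unified data. A toy sketch under assumed record shapes — none of these field names come from ActionXM:

```python
from collections import defaultdict

# Toy weekly rollup: group onboarding metrics by segment.
# Records and metric names are illustrative only.
rows = [
    {"segment": "enterprise", "nps": 22, "completion": 0.71},
    {"segment": "smb", "nps": 46, "completion": 0.88},
    {"segment": "smb", "nps": 50, "completion": 0.92},
]

def weekly_rollup(rows):
    """Average each metric per segment for the weekly email."""
    grouped = defaultdict(list)
    for r in rows:
        grouped[r["segment"]].append(r)
    return {
        seg: {
            "avg_nps": sum(r["nps"] for r in rs) / len(rs),
            "avg_completion": sum(r["completion"] for r in rs) / len(rs),
        }
        for seg, rs in grouped.items()
    }

report = weekly_rollup(rows)
print(report["smb"]["avg_nps"])  # 48.0
```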


The Workflow Comparison

Sarah's Week: Traditional vs. AI-Powered

Task                               | Traditional | AI + CLI
Create survey + target + schedule  | 25-40 min   | 30 sec
Check response rates + sentiment   | 15 min      | 10 sec
Build custom dashboard             | 2-4 weeks   | 2 min
Cross-reference VoC + behavioral   | 2-3 hours   | 15 sec
Set up automated reporting         | 1-2 weeks   | 2 min

Not Either/Or — Both

The critical message isn’t “CLI replaces UI.” It’s “CLI accelerates, UI polishes.”

Sarah uses the CLI when she wants speed:

  • Creating surveys from natural language descriptions
  • Pulling analytics without navigating dashboards
  • Building reports and dashboards from one-sentence specifications
  • Cross-referencing data across VoC and DXA

Sarah uses the UI when she wants control:

  • Fine-tuning survey question wording and skip logic
  • Visually exploring dashboard data with filters and drill-downs
  • Presenting polished views to stakeholders
  • Configuring detailed workflow conditions

The AI doesn’t replace the human. It handles the operational overhead — the configuration, the plumbing, the repetitive setup — so the human can focus on the decisions that matter: what to measure, what to investigate, and what to change.

What This Means for CX Operations

The traditional CX leader spends 60% of their time operating tools and 40% thinking about customer experience. The AI-powered CX leader inverts that ratio.

When creating a survey is a one-sentence task instead of a 40-minute click-through, you create more surveys — targeted, contextual, timely. When building a dashboard is a 2-minute task instead of a 4-week BI request, you build dashboards for every question, not just the ones worth the wait.

The bottleneck in CX programs was never insight — it was operations. AI-powered CLI automation removes the operations bottleneck and lets CX leaders focus on what they were hired to do: understand and improve the customer experience.

Gemini says “Build me an app.” ActionXM says “Build me a CX program.”

Ready to Transform Your Experience Program?

See how ActionXM can help you capture, analyze, and act on feedback at scale.