Product Feedback

Get structured feedback on your product from teammates, stakeholders, or recruited users. Share a URL and collect voice or written observations. Custom interview guides let you focus the conversation on specific flows or features.

Design Review with Team

You want your team's honest take on a prototype or staging site. Each reviewer opens the link and records their thoughts via voice while browsing — no recruitment needed. Free for unlimited internal reviewers.

[Participant view: the AI moderator runs a voice session while the reviewer browses the product. Learn more: Voice Interviews.]
How you'd run it
$ claude "share staging.example.com with the team for design feedback"
What you get back
P0 Hero section — 3/4 found headline confusing
"I don't understand what the product does from this."
P1 Footer links — 2/4 expected social links
P2 Color contrast — 1/4 flagged light gray on white
4 reviewers · 8 findings · Free (internal team)

Checkout Flow Testing

You want to find out where users drop off in your checkout flow. A custom interview guide focuses the AI moderator on payment friction, trust signals, and form usability. You get findings ranked by severity with specific recommendations.
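The guide format itself isn't documented on this page. Purely as an illustrative sketch (the file format and every field name below are assumptions, not Candor's actual schema), a guide steering the moderator toward the areas above might look like:

```yaml
# Hypothetical interview guide (illustrative only; not Candor's documented format)
goal: Find where users drop off in the checkout flow
focus_areas:
  - payment friction   # hesitation at the payment step
  - trust signals      # security badges, guarantees, recognizable logos
  - form usability     # field labels, error handling, guest checkout
prompts:
  - "Walk me through how you'd complete this purchase."
  - "Was there any point where you hesitated? Why?"
  - "What would make you more confident entering payment details?"
```

The idea is simply that a guide pairs a goal with focus areas and open-ended prompts, so the AI moderator probes friction points consistently across all eight sessions.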

[Participant view: the AI moderator runs a voice session while the participant browses the product. Learn more: Voice Interviews.]
How you'd run it
$ candor study create --goal "test checkout flow" \
--url "https://store.example.com/cart" \
--type interview --recruit --participants 8
What you get back
P0 Trust — 6/8 hesitated at payment (no badges visible)
"I don't see any security badges, is this safe?"
P1 Shipping — 5/8 wanted cost before entering address
P2 Guest checkout — 4/8 expected option, got signup wall
P3 Promo code — 2/8 couldn't find where to enter it
8 sessions · 14 findings · Drop-off risk: payment step

Landing Page Feedback

You want written feedback on your landing page from real users. Participants review the page and type their observations at their own pace — text-only mode for when voice isn't an option. Async-friendly for distributed teams.

[Participant view: a free-text box ("Describe what you see...") where participants type their observations. Learn more: Free Text.]
How you'd run it
$ claude "get written feedback on our landing page from 10 users"
What you get back
Theme — Mentions — Sample feedback
Value prop clear — 7/10 — "Immediately understood what it does"
Pricing missing — 6/10 — "Had to scroll too far to find cost"
CTA compelling — 5/10 — "Free trial button stood out nicely"
Load time slow — 3/10 — "Hero image took a few seconds"
10 participants · 42 observations · Async over 24h

Mobile Experience Testing

You want to know how your website holds up across devices. Recruited participants browse on their own phones, tablets, and desktops while the AI moderator records their experience. You get platform-specific usability issues identified automatically.

[Participant view: the AI moderator runs a voice session while the participant browses the product on their own device. Learn more: Voice Interviews.]
How you'd run it
$ claude "test how our site works on mobile with 6 participants"
What you get back
P0 Nav menu — 5/6 couldn't close hamburger menu (iOS)
P1 Images — 4/6 reported slow loading on cellular
P1 Form inputs — 3/6 had keyboard overlap on Android
P2 Horizontal scroll — 2/6 noticed on pricing table
6 sessions · iOS: 3, Android: 2, tablet: 1
10 findings · Mobile-specific: 8