
The conversational flow makes testers open up more, so we get clearer answers faster.
Training the AI with our docs and URLs took almost no time.
The summaries are spot on and help us decide what to fix or test next.
The new 3.0 update finally added features we’d been missing, so we can run deeper tests without switching tools.
More control over how insights are grouped or tagged would help when we’re doing bigger rounds.
Flexible add-ons for extra responses would be nice, especially during busy testing weeks.
We looked at a few other tools before picking TheySaid. Most options either felt too rigid or required extra setup that slowed us down. TheySaid let us move faster and stay close to what users actually meant, not just what they typed into a form.
The integration works well for the basics. We did not need any complex setup and it stayed stable during our tests.
Yes. This was one of the reasons we kept using it. We fed in a good number of voice and chat interviews and it picked out clear themes that matched what we saw manually.
It gave us enough freedom to shape the flow around Impakt without feeling limited or stuck.
Used it for an onboarding redesign. The AI pulled clear insights from voice, chat, and video feedback and helped us fix issues fast. Slight learning curve at first, but the 2-way voice mode unlocks far more honest and detailed user feedback.

TheySaid.io positions itself not just as a feedback hub, but as an AI-powered platform for proactive and conversational user engagement. For an early-stage startup founder, its core strength lies in transforming the tedious process of user research and feedback collection from a static, low-response chore into an interactive, high-engagement dialogue. The platform’s ability to quickly generate and deploy targeted, conversational surveys directly addresses the critical need for timely, high-quality user insights.
I don’t usually write reviews, but this one truly deserves it. We used it for a small beta test, and the AI actually made sense of all the voice feedback we collected and saved our team hours of manual work.
It would be fantastic to have even more ways to customize how feedback is organized or summarized. Aside from that, the overall experience has been really smooth.
I picked TheySaid because it provides straightforward, actionable insights about customers without all the extra fluff.
Absolutely. The AI can be trained using a variety of URLs and documents. It's built to take in content from different sources, blend that information together, and apply it consistently in its responses and insights. This approach makes it super easy to keep the AI up to date as our knowledge base expands.
Absolutely, our testers had a seamless experience. The chat interface was simple, intuitive, and felt natural for gathering voice feedback. Participants didn’t require any instructions; they just dove right in and everything flowed effortlessly. This user-friendly design definitely contributed to us receiving higher-quality responses.
Absolutely. User testing recordings are part of the package, and the review process is super user-friendly. The platform lays out the recordings in a clear manner, highlights key moments with timestamps, and makes it a breeze to navigate, comment, and share insights with your team.
The customization options were pretty impressive for a beta version. We managed to tweak the prompts and questions just right to fit our test scenario without any hassle.
The platform takes data handling seriously. All our recordings and transcripts were kept safe, with access restricted to our team only. The product does a great job of explaining how data is managed, and everything is encrypted in transit. We didn’t encounter any issues during our beta test, and the team appears genuinely committed to privacy and security.
Pricing is flexible and adjusts based on how many seats we need and the volume of responses we anticipate. As we bring on more team members or ramp up our feedback collection, the plan adapts to fit those changes. This makes it simple to match costs with actual usage and keeps everything clear and predictable.
I love how human the AI sounds. It doesn’t talk like a bot at all. People actually enjoy the conversation, which makes the data we get way more accurate.
Pricing is based on the number of seats and responses, but it could really benefit from offering flexible add-ons for additional responses without needing to upgrade plans.
I picked TheySaid because it makes gathering feedback a breeze. Its conversational AI digs up deeper insights compared to the usual surveys.
It’s quite speedy: setting up and training the AI typically takes about 10 to 15 minutes, based on what TheySaid’s interface shows.
Absolutely. Most participants really appreciate the chat interface for its intuitive, user-friendly design, which is both clean and conversational.
It really excels at handling branching. You can easily set up intuitive conditional logic based on NPS score, user plan, persona, or sentiment so that respondents follow personalized paths.
A lot of features felt missing in previous versions, but 3.0 really surprised me by including almost everything we needed. I got better insights in a single day than I usually do in a week.
We decided to go with TheySaid because we were looking for a quick and reliable way to understand the voice feedback from our beta testers. The AI analysis outperformed anything we could have done on our own, and it really saved our team a ton of time.
Based on our experience, the pricing appeared quite reasonable and primarily depended on usage. When you add more seats or gather more responses, the cost goes up, but the pricing structure is clear and easy to understand. During our small beta test it felt budget-friendly, and the insights we gained from the AI analysis definitely made it worth it.


