AI-Enhanced User Research with Dovetail

By Matt Kast, Product Design Manager
6 min read | Published May 29, 2025

In today's rapidly changing tech space, AI has been making waves (big ones) across all industries, including User Experience (UX) Research. At Perpetual, we've been exploring ways to make our research process more efficient and cost-effective, particularly through the use of AI-powered tools. While we recognize that AI can't completely replace human insight, we saw an opportunity to streamline our research and analysis workflow while still maintaining high-quality results.

Our team had been using a combination of Notion databases, Airtable, and various other tools for research management, but we wanted to find a more integrated solution that could help us work smarter. With AI capabilities becoming more sophisticated, we believed the time was right to explore tools that could help us organize, analyze, and share our research findings more effectively.

If implemented thoughtfully, AI-enhanced research tools can significantly improve efficiency and reduce costs. That improved ROI means easier approval for research initiatives and, ultimately, better products.

Trying out Dovetail

For a recent project, we decided to more seriously evaluate integrating AI into our research process, specifically for exploratory qualitative research. After assessing a few different tools, we decided to try Dovetail, a purpose-built research repository that aggregates various forms of data (video, audio, text) and aims to surface insights more easily through the use of AI.

Dovetail offers a handful of AI features that we were interested in trying out for this test:

  1. Transcription
  2. Data Summaries
  3. Auto Highlighting
  4. Auto Tagging
  5. Data Querying and Summarization
  6. Insight Generation

Beyond these AI features, we were also intrigued by Dovetail's tools for organizing research more efficiently and linking it to actionable insights.

Challenges with Existing Tools

Our previous method for collecting and organizing exploratory qualitative research involved a series of connected Notion databases, including ones for Participants, Key Observations, and Insights (a simplified sketch of that structure follows the list below). This method served us fairly well, but we faced challenges in the following areas:

  1. It was time-consuming to enter data into databases (specifically observations and insights)
  2. Our setup was difficult for stakeholders to use; stakeholders didn’t want to learn a new tool or understand how the different databases were linked, so we often spent time creating decks tailored to each one
  3. It was difficult to link to actual video or audio content
  4. Multiple platforms were used to derive insights (using tools like FigJam to do affinity mapping)
  5. It wasn’t easy to query the data
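
For context, here’s a minimal sketch of how those connected databases related to one another. The classes, fields, and records below are invented for illustration (they are not our actual Notion schema), but they show why both data entry and cross-database questions came down to manual work:

```python
from dataclasses import dataclass, field

# A simplified, hypothetical model of our three linked Notion databases.
# Names and fields are illustrative, not our actual schema.

@dataclass
class Participant:
    name: str
    role: str

@dataclass
class Observation:
    text: str
    participant: Participant  # relation to the Participants database

@dataclass
class Insight:
    summary: str
    evidence: list[Observation] = field(default_factory=list)  # relations to Key Observations

# Every record was entered by hand after each session; this is where
# most of the time went (challenge 1 above).
p1 = Participant(name="P1", role="Property manager")
obs = Observation(
    text="Compared tenant applications manually across spreadsheets",
    participant=p1,
)
insight = Insight(summary="Tenant comparison is a recurring pain point", evidence=[obs])

# Answering "which participants support this insight?" meant manually
# traversing relations across databases; there was no way to just ask.
print({o.participant.name for o in insight.evidence})  # {'P1'}
```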

Our Experience

To test Dovetail, we conducted a quick research round with five participants, along with initial stakeholder interviews. We recorded all interviews through Google Meet and uploaded them to Dovetail for analysis.

While our overall experience with Dovetail was positive, its AI features needed significant human oversight. Here's a breakdown of what worked, and what didn’t, in our experience using Dovetail.

1. Transcription

Dovetail's transcription is quite good, making mistakes only occasionally with proper nouns (like company names or people's last names). Even when someone speaks quietly or has a strong accent, the tool transcribes accurately. Dovetail performs comparably to other transcription tools we've used (Gemini, Otter.ai, Zoom).

2. Data Summaries

The platform excels at identifying key discussion topics and converting them into concise, informative snippets. Each summary point includes a timestamp linking to where the discussion begins in the recording. While we found very few errors in these summaries, it's still important to verify their accuracy and make sure each point reflects the actual focus of the discussion.

A Dovetail interface showing a video interview transcript with timestamped dialogue and a right sidebar listing chronological interview segments with durations.
A look at Dovetail’s video transcription workflow in action

3. Auto Highlighting

Auto-highlighting is where the AI begins to fall short.

The highlighting functionality really serves only as a starting point. During our review of transcripts and suggested highlights, we found many important points were missed, while less relevant points were often highlighted. With a success rate of roughly 40-50%, the feature isn't reliable enough to trust completely. While it was interesting to see what the AI deemed important, we spent about the same amount of time reviewing and creating highlights as we would have without the AI.

Despite these limitations, the highlighting feature itself is valuable, enabling faster linking between observations, media content, and insights.

4. Auto Tagging

Like the highlighting feature, auto-tagging also needs improvement.

The tagging functionality itself in Dovetail is excellent—you can pull from pre-existing tag banks or create custom ones. However, in our initial test, the AI often applied generic tags incorrectly to highlighted sections. For instance, it would label something as a pain point when it clearly wasn't.

While the system's accuracy improved as we updated highlights and created custom tags, mistagging remained a frequent issue.

A helpful feature is the ability to add context to tags to guide the AI's tagging decisions, which did improve accuracy somewhat.

Adding more details to custom tag prompts for improved insight tagging
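
To make that concrete, here’s a hypothetical example of the kind of context we mean. Dovetail lets you describe each tag in plain language; the tag names and descriptions below are illustrative stand-ins rather than our actual prompts:

```python
# Hypothetical tag definitions with added context to steer the AI.
# These descriptions are illustrative, not actual Dovetail configuration.
tags = [
    {
        "name": "Pain point",
        "context": (
            "Apply only when the participant describes a concrete frustration "
            "or blocker in their current workflow. Do not apply to neutral "
            "descriptions of process or to hypothetical concerns."
        ),
    },
    {
        "name": "Decision factor",
        "context": (
            "Apply when the participant names a criterion they actually used "
            "when deciding on a tenant, e.g. financials or references."
        ),
    },
]

# Conceptually, the tool folds this context into its classification step;
# the more specific the description, the fewer generic mis-tags we saw.
for tag in tags:
    print(f"{tag['name']}: {tag['context']}")
```

The pattern mirrors prompt writing in general: the narrower the definition, the less room the model has to over-apply a generic label.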

5. Data Querying and Summarization

This is an area where Dovetail does a decent job. We leaned heavily on the AI-driven data search to get quick answers from the data. For example, asking something like “What were the key decision-making factors used when deciding on a tenant?” yields a summary of the data along with the supporting highlights. That said, we sometimes noticed gaps: data that should have supported the answer was missing.

We really like this application of AI, but the execution needs to be better before we can fully rely on its results. Ultimately, putting together insights still required a significant amount of manual work, as key highlights that should have contributed to an answer were missing in most cases.

AI-generated summary of tenant decision-making factors from Dovetail, including financial considerations, tenant fit, and impact on other tenants. The summary presents 54 results across 2 projects with detailed bullet points and citation numbers.
AI-generated summaries based on qualitative data in Dovetail
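
Dovetail doesn’t publish how its search works under the hood, but a useful mental model is a retrieve-then-summarize pipeline: first gather the highlights relevant to the question, then summarize them with citations. The naive keyword-based sketch below, using made-up highlights, also shows the failure mode we kept hitting, where relevant evidence silently drops out of the retrieved set:

```python
# A naive retrieve-then-summarize mental model of AI data search.
# Not Dovetail's actual implementation; real tools use semantic
# (embedding-based) retrieval rather than keyword overlap.
highlights = [
    {"id": 1, "text": "The biggest factors were the applicant's financials and credit history."},
    {"id": 2, "text": "Fit with the other tenants in the building mattered a lot."},
    {"id": 3, "text": "The onboarding paperwork took forever to complete."},
]

def retrieve(question: str, docs: list[dict]) -> list[dict]:
    """Return highlights sharing at least one keyword with the question."""
    terms = {w.lower().strip("?.,") for w in question.split() if len(w) > 3}
    return [
        d for d in docs
        if terms & {w.lower().strip("?.,") for w in d["text"].split()}
    ]

question = "What were the key decision-making factors used when deciding on a tenant?"
for hit in retrieve(question, highlights):
    print(f"[{hit['id']}] {hit['text']}")  # only [1] is returned

# Highlight 2 is clearly relevant but shares no keywords with the
# question, so it never reaches the summary. The answer then reads
# as complete while silently missing supporting evidence.
```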

6. Insight Generation

Like highlighting and data querying, the insight-generation tool is more of a starting point than a polished, ready-to-use feature. Despite clear patterns appearing throughout the interviews, the evidence provided for auto-generated insights was often based on a single highlight or discussion point. Occasionally the platform hit the nail on the head with an insight but didn’t connect all the data points one would want to see supporting it.

It’s another great concept, but the researcher still has to do manual work to get it right. Does it save time? In this context we would argue no: we went through a few generation attempts before deciding the outputs weren’t worth the time, and we created insights manually, as we always have.

The overall insight concept within the platform is still great, allowing you to create evidence-supported insights that can be published and delivered to stakeholders in an easily understood format.

Overall Thoughts

Ultimately, we think the AI features offered by Dovetail can be helpful but still need some time to mature. In their current state, these features serve as a good starting point for research synthesis and insights but cannot be relied upon fully.

There will always be some human element required for interpreting research, but we imagine the partnership between the researcher and AI will continuously improve as models get sharper and tools like Dovetail integrate them for their specific use cases more seamlessly.

We hope to see AI take over the more tedious parts of research (like highlighting and tagging) to allow researchers more time to interpret the findings and strategize solutions.

As a platform overall, we really enjoyed using Dovetail, and it definitely improves our research analysis workflow. After this experience, we’ll certainly be using it again for future research initiatives.