AI in UX Research: What It’s Like Using Gemini In A User Interview Study [2025 Guide]

by Swan Ling | May 15, 2025 | Artificial Intelligence, User Experience, User Research, UX Guide, UX Strategy & Planning

 

These days, AI is everywhere, singing promises of speed, efficiency, and creativity. However, as a UX researcher, I felt a disconnect: I knew the pros and cons of using AI, but where were the real stories, the nitty-gritty details of how to use AI in actual UX research? To satisfy my curiosity, I wanted to understand how to use it, and how useful it truly is.

Fortunately, I had the opportunity to test out Google’s Gemini Advanced (and AI-powered Google Meet features). I used them as my research sidekicks in an internal user interview study to create potential user personas for a digital wallet app company. (Just to clarify, this is not a sponsored post; my company uses Gemini Advanced as part of our internal tools.)

To set expectations, this article reads more like a research diary, sharing my experiences, struggles, and practical examples of using AI across the different phases of the UX research process, from preparation and planning (Phase 1) through data analysis (Phase 4). The study also excluded external factors such as client feedback and discussion, allowing me to concentrate solely on comparing AI-assisted and traditional workflows and on AI’s impact on the research process itself.

To give you a little preview: at some points, Gemini felt like a true research sidekick, an efficient and creative brainstorming partner. Yet, as you’ll discover, our human understanding and interpretation were essential in truly making sense of the findings; a real team effort, you could say.

Research phases where limitations of generative AI occur

Typical UX research phases

Phase 1: Project Planning – Laying the Groundwork with Gemini

Let’s start with Phase 1 – project planning. My first interaction with Gemini involved creating a research plan. I provided it with the specifics of the project: my role, the objectives, the target participants (internal colleagues in this case), and the expected deliverables. I also uploaded a sample of a past user persona document as a reference point.

Efficiency Boost: Initial Output Generation

As you can already guess, Gemini is capable of almost immediately generating initial project outlines and plans, accelerating the beginning stages. This turnaround was undeniably faster than your usual manual process. All I had to do was transfer the output to a new file for documentation purposes. 

If there is one thing to take note of, it is that the UX researcher must be clear about research objectives and needs for AI to be used effectively, which is what we will talk about next.

Prompt Engineering: The Art of Guiding AI

Effective AI use starts with precise and detailed prompts. Remember the CARE framework (Context, Ask, Rules, Examples) from our earlier article? I employed it to structure my prompts for optimal results. Here’s what I wrote:

Context: I am a mid-level user researcher with 3 years of experience in Malaysia, working in a digital UX agency that serves clients in various industries such as finance and insurance. I am currently in the stage of planning a research study.

 

Ask: I need to develop user personas to understand users’ financial behaviors and payment method preferences.

 

Rules: The study should focus on users in Malaysia. I plan to conduct 6 user interviews, each lasting 30 minutes, and complete the project within 5 working days. I will recruit my colleagues as my participants. 

 

Examples: Please include in the user persona documents: The demographics, key goals, we must’s, we must not’s, behaviors, and pain points. Here is an example of a user persona document for reference [attach relevant document].

With this detailed context, Gemini was able to generate a tailored and comprehensive project overview covering objectives, research methodology, target respondents, topics for discussion, deliverables, and a suggested timeline.
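For readers who prefer to script this step rather than paste prompts into a chat window, here is a minimal sketch of assembling a CARE-style prompt and sending it to Gemini through Google’s generative AI Python SDK. This is purely illustrative and not how I ran the study (I used the Gemini Advanced interface); the model name, environment variable, and shortened prompt text are assumptions.

```python
# Illustrative sketch only: assemble a CARE-style prompt and send it to Gemini.
# Assumes `pip install google-generativeai` and an API key in GEMINI_API_KEY.
import os
import google.generativeai as genai

def build_care_prompt(context: str, ask: str, rules: str, examples: str) -> str:
    """Combine the four CARE components into one structured prompt."""
    return (
        f"Context: {context}\n\n"
        f"Ask: {ask}\n\n"
        f"Rules: {rules}\n\n"
        f"Examples: {examples}"
    )

prompt = build_care_prompt(
    context="I am a mid-level user researcher at a digital UX agency in Malaysia, planning a research study.",
    ask="Develop user personas to understand users' financial behaviors and payment method preferences.",
    rules="6 interviews of 30 minutes each, colleagues as participants, completed within 5 working days.",
    examples="Include demographics, key goals, must/must-not statements, behaviors, and pain points.",
)

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")  # model name is illustrative
response = model.generate_content(prompt)
print(response.text)
```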

AI in UX Research: Research plan draft

A sample section of the research plan draft generated by Gemini, used as the starting point to shape my internal study

Personalization is Key: Refining AI’s Output

To wrap up the use of AI in this phase, I did my own review, refinement, and personalization of the project plan to make sure it aligned with my specific needs. AI-generated content is a great starting point, but the danger of fully relying on AI is that it naturally makes assumptions if you do not provide sufficient detail, or provide only ambiguous prompts and context.

For example, in my first attempt at prompting, I did not mention that I was recruiting my colleagues as participants for my internal study. Gemini naturally presumed that I was running the study with externally recruited participants and adjusted the recruitment strategy and estimated time needed accordingly.

Overall, I had no major concerns with Gemini’s Phase 1 output. However, I did need to manually prompt it to add my preferred testing setup: online interviews recorded via Google Meet.

Another thing to note is that you will see a pattern: information from AI tends to be generalized, lacking the specific nuances and details that stakeholders typically expect. Hence, I suggest you take what you need, then review the output and fill in the blanks with the information Gemini missed.

AI In UX Research: Dos and Don'ts of Using AI in Planning your research project

Phase 2: Participant Recruitment and Scheduling – AI as an Organizational Aid

Next, even though my participants for this internal study were my ever-so-willing colleagues, recruitment and scheduling still required organization. In this phase, in line with the most common uses of AI to support participant recruitment, I used Gemini to help me design the invitation messages and initial screening questions.

Aiding Communication by Designing Invitation Messages

Instead of manually crafting messages, I turned to Gemini to draft my invitation messages to my colleagues. My prompt was straightforward – I asked Gemini to help create a screener and invitation message covering what I needed to know: availability and current payment methods.

Gemini was able to personalize the message based on my current prompt and the earlier prompts from Phase 1, and it also provided several message options to choose from.

AI In UX Research: Simple and Direct prompt for invitation message

Option 1: Simple and Direct

 

AI In UX Research: Slightly more detailed option for invitation message

Option 2: Slightly More Detailed

Organizational Assistance: Streamlining Scheduling

Since I wanted to utilize Gemini as much as I could, I decided to also use Gemini in Google Sheets to assist in creating a table of my participants’ interview timeslots.

I specified details such as:

  • Start date and time
  • Session duration
  • Buffer duration between interviews
  • Lunch break window
  • End time for my sessions

AI In UX Research: Gemini generates tables in Google Sheet

Using Gemini in Google Sheets to generate my interview schedule table based on my needs and my availability

 

Gemini then generated a table that not only included columns for participants’ details (name, email address) but also incorporated the corresponding dates and times for the interview sessions. The only limitation was that Gemini could produce just a single table rather than separate ones, so I had to clean that up myself (but that’s just my personal preference!).
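If you would rather compute the slots deterministically than prompt for them, the same scheduling logic fits in a few lines of standard-library Python. The dates, times, and durations below are illustrative placeholders, not the actual schedule from this study.

```python
# Minimal sketch of the scheduling logic: fixed session length, a buffer
# between interviews, a protected lunch window, and a daily end time.
from datetime import datetime, timedelta

start = datetime(2025, 5, 5, 9, 30)          # first session start (illustrative)
session = timedelta(minutes=30)               # interview duration
buffer = timedelta(minutes=15)                # gap between interviews
lunch_start = datetime(2025, 5, 5, 12, 30)
lunch_end = datetime(2025, 5, 5, 13, 30)
day_end = datetime(2025, 5, 5, 17, 30)
n_participants = 6

slots, t = [], start
while len(slots) < n_participants and t + session <= day_end:
    # Skip any slot that would overlap the lunch window
    if t < lunch_end and t + session > lunch_start:
        t = lunch_end
        continue
    slots.append((t, t + session))
    t = t + session + buffer

for i, (s, e) in enumerate(slots, 1):
    print(f"P{i}: {s:%a %d %b, %H:%M}-{e:%H:%M}")
```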

AI In UX Research: Gemini generated table in Google Sheet

The table created by Gemini – it needed a little cleanup on my end but I’m not complaining!

 

AI In UX Research: Dos and Don'ts of Using AI in recruiting users

Phase 3: Fieldwork and Testing – A Collaborative Effort

Now, we move on to Phase 3, a crucial stage that involves taking our research plan out into the real world. This phase centers around two main activities, which I’ll discuss individually: creating the discussion guide and the fieldwork, or conducting the user interviews.

Activities in fieldwork and testing in Phase 3

Phase 3.1: Crafting the Discussion Guide with AI – A Collaborative Creation

The discussion guide, typically the roadmap for gathering user insights, demands careful development. In our manual process, we usually take around two to four working days to create, review, and finalize the discussion guide before the client walkthrough. This is where AI’s promise of speed, efficiency, and creativity could be put to the test.

In short, Gemini proved to me what it could do – it was significantly faster, more efficient, and a valuable partner for brainstorming and exploring different perspectives to prepare a thoughtful discussion guide.

Time-saving: Enhanced Efficiency and Accelerated Initial Drafts

This phase truly highlighted Gemini’s capabilities, particularly in terms of speed. I began by tasking Gemini with creating a discussion guide for my user interviews. Impressively, it promptly delivered a first draft, providing a solid foundation and the basic structure needed to get started. This rapid generation clearly demonstrates a significant time advantage over our typical three-day manual process.

However, this initial speed came with a caveat: the draft was somewhat generic, lacking the specific focus required to deeply understand financial behavior and payment methods needed for a holistic view of potential user personas.

AI In UX Research: Gemini's first draft of a user interview plan

Gemini’s first draft is a great start, but refine it with more detail to suit your needs

The Refinement Process: Iterating with Human Expertise

Hence, I put on my UX researcher hat to refine the discussion guide to fit my research needs, which involved multiple rounds of interaction with Gemini (see the sketch after this list for how those rounds chained together). This included:

  • Adding objectives:
    • Recognizing that Gemini did not automatically include them, I prompted it to define clear objectives for each question, ensuring every inquiry directly contributed to our research goals.
    • Prompt example: “Please include an objective for each question asked.”
  • Targeting the inquiry:
    • I further refined the guide by focusing Gemini on specific areas. For example, I narrowed the scope to prioritize payment method preferences and cash usage over general financial goals.
    • Prompt example: “Can you modify the 30-minute discussion guide to focus on payment methods for daily expenses and minimize the section on general financial behavior/goals? I want to include questions about payment method types, preferences and reasons, top 3 rankings, purpose of using a specified payment method (if any), influencing factors, and questions relating to cash usage.”
  • Incorporating follow-up questions:
    • To ensure comprehensive and consistent data across all interviews, I instructed Gemini to suggest potential follow-up and situational questions based on anticipated responses.
    • Prompt example: “Can you craft out potential follow-up questions accordingly, as well as situational ones such as if the user says yes or no?”
  • Estimating durations:
    • To adhere to the 30-minute interview limit, I also asked Gemini to estimate the time allocation for each section.
    • Prompt example: “Please specify estimated time spent in each section as well.”

The discussion guide improved significantly after several rounds of refinement. However, I ultimately stepped in to manually adjust the question order for a more natural conversational flow. This manual touchpoint made more sense because Gemini struggled to precisely integrate revisions to specific questions into the whole discussion guide; it became more cumbersome and time-consuming than simply tweaking the sequence of questions myself.

After I was satisfied with my manual flow adjustment, the final step was to have Gemini review the entire guide, acting as an experienced UX consultant to provide comprehensive feedback.

AI In UX Research: Gemini reviewing the discussion guide

Gemini’s review highlights its overall assessment, strengths, areas to refine, and suggestions to enhance the guide

 

By now, I was impressed by Gemini’s help and could see it as a valuable tool in my workflow. The key benefit was not less work but a shift in the kind of work: from manual creation to strategic thinking about how to refine AI’s output. Instead of starting from scratch, I could focus my expertise on reviewing and aligning Gemini’s suggestions with our standard practices and research goals.

However, this experience also highlighted a crucial point: AI’s effectiveness heavily depends on the UX researcher’s expertise in guiding and evaluating its output.

Human Judgment Remains Key

This refinement process is a good reminder that while Gemini can offer valuable starting points and suggestions, the expertise and skills of a UX researcher remain crucial for effective use of AI.

For example, despite Gemini’s estimated timings, my research experience suggested the session could potentially overrun, just based on the number of questions. So, I prompted it to further prioritize or cut questions based on research goals. While its reasoning was helpful, the final decision on what to include rested with me.

AI In UX Research: Gemini's breakdown of questions to remove

Gemini shared a clear breakdown of which questions to retain or remove and their justification

 

AI In UX Research: Dos and don'ts when crafting discussion guide with AI

Phase 3.2: Fieldwork – Gemini’s Contextual Awareness

With the discussion guide finalized, we move on to the most exciting (personal opinion!) phase – fieldwork! Or the direct execution of user interviews.

While my direct interaction with AI during the fieldwork sessions was minimal, my digital sidekick, Google Meet, subtly worked in the background. I used Google Meet’s AI-powered note-taking and transcription features to capture every detail, treating them as my project partner.

Your Silent Ally: AI-Powered Transcription and Note-Taking

Google Meet’s transcription feature was a game-changer, providing both an accurate summary and a detailed word-for-word transcript of each interview, both of which are foundational for the synthesis and analysis phase and for building detailed user personas.

This capability also points to the potential for reduced manpower, a significant benefit during this fieldwork stage where the moderator’s direct engagement is the most crucial element. While there’s a waiting period of about 10-15 minutes for the AI-powered recording, transcription, and summary to be ready—potentially longer depending on your interview’s duration—it is a small trade-off given the valuable insights you gain from directly engaging with the users. 

AI In UX Research: Google Meet's Transcript Example

Sample of transcript by Google Meet

AI In UX Research: Gemini's meeting transcription summary

Sample of the summary derived from the above transcription

AI-Powered Contextual Understanding in Post-Interview Analysis

An additional pleasant discovery was AI’s ability to understand Malaysians’ habit of mixing multiple languages within a single sentence.

As a company based in Malaysia, which is a multicultural and multilingual country where “bahasa campur” (mixed language) is a common way of speaking, we’re used to blending words from Malay, English, and sometimes other languages in everyday conversations. Curious to see how well Google Meet’s transcription could handle this, I intentionally asked a colleague to mix Malay words into their interview responses.

Surprisingly, while Google Meet’s transcription struggled and rendered some Malay words as gibberish, turning kampung (village) into “kong kong” and pisang goreng (banana fritters) into “goring pis”, Gemini was able to infer the intended meaning in its summary. For instance, it correctly picked up that my participant was referring to rural areas, despite the transcript error “kong kong”.

This discovery was indeed a nice surprise, demonstrating Gemini’s contextual understanding of mixed-language communication, a reality deeply embedded in the Malaysian experience.

AI In UX Research: Gemini's contextual understanding

“Kong Kong” to kampung: Even with transcription errors, Gemini gets the context

The Mindset Shift: Are We Ready to Move to AI-Assisted Research?

Honestly, while Google Meet’s transcription, note-taking, and summary features proved useful, the biggest hurdle in this phase was more of a personal one: trusting AI enough to truly consider it a project partner. Despite its capabilities, I still felt compelled to take my own notes – though admittedly, part of that was for the purpose of this article, to evaluate and compare against the usual manual process.

It made me wonder: are we truly ready to fully trust AI in this capacity?

On a personal note, the crucial factors are accurate data and easy retrieval of individual participant responses. If AI can deliver on these, I see no issue in making it my primary note-taker. This emergence of AI is a real opportunity for UX researchers to demonstrate their adaptability and embrace what could be a significantly improved workflow.

AI In UX Research: Dos and Don'ts of using AI during research fieldwork

Phase 4: Gemini and the Great Persona Puzzle – My Analysis Adventure

Phase 4 marks the exciting (yet slightly daunting) transition to data analysis. This is where the real puzzle-solving begins: the team deep-dives into participants’ responses, hunting for patterns, spotting key differences, and making those crucial calls on how to cluster users into meaningful personas (and when a group deserves its own spotlight!). This collaborative sprint typically takes 3-5 working days before the team is ready to present the user personas to inform design and business strategies.

Given Gemini’s strong support in Phase 3 during the discussion guide development, I was eager to see its role in this critical analytical stage.

Time-Efficiency Gains in Data Analysis

A primary benefit of using Gemini was, yet again, how much time it saved me. The typical 1-2 day process of reviewing raw data, identifying patterns, and synthesizing findings into potential personas was reduced to half a day. To illustrate this further, let’s look at my step-by-step approach to using Gemini for this analysis.

Step 1: Prompt Engineering: Guiding Persona Generation

As you already know, a lot of what really made Gemini work lies in how one “talked” to it. Yup, the CARE prompt comes into play again. After all, my prompts are the instructions that guide Gemini in its analysis and keep it within the parameters of my user persona development.

To start off Phase 4, I first formulated a detailed CARE prompt instructing Gemini to generate some recommended user personas.
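In the chat UI this simply meant attaching my transcripts and notes to the prompt. As a rough SDK equivalent (the file paths, model name, and prompt wording are hypothetical; this is a sketch, not the exact prompt I used), it might look like this:

```python
# Illustrative sketch: upload interview transcripts and ask Gemini to propose
# persona clusters. File paths and the model name are hypothetical.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")

# Upload the six interview transcripts (e.g. plain-text exports from Google Meet)
transcripts = [genai.upload_file(f"transcripts/participant_{i}.txt") for i in range(1, 7)]

response = model.generate_content(
    transcripts + [
        "Using the six interview transcripts above, identify recurring patterns in "
        "payment method preferences and financial behaviors, then recommend a set of "
        "distinct user personas. For each persona, list which participants map to it "
        "and explain your reasoning."
    ]
)
print(response.text)
```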

AI In UX Research: Notes and Prompt for Gemini

Gemini’s support for directly uploading files, images, and Drive documents makes analyzing large amounts of data for user personas easier.

 

Gemini then suggested three distinct user personas for me:

AI In UX Research: Rationalizing with Gemini

Excerpt on the rationale of creating three user personas for this study

 

On a quick read, the initial user personas generated were insightful: each one felt distinct, like a real person with their own quirks and motivations.

Step 2: Analyzing Gemini’s Reasoning: Deconstructing Persona Generation

Then, to make sure Gemini’s recommended user personas were not just AI “hallucinations” or guesswork, I put on my “skeptical researcher” hat this round and did my own analysis. I went back to the transcripts to check whether its logic held up on things like:

  • Why it proposed three user personas
  • The steps it took to arrive at that conclusion
  • Reasoning for matching my participants to those user personas

I also probed Gemini on any statements I had doubts about or found unclear. Overall, for the most part, the information was accurate, convincing even. Gemini could justify why it categorized the personas the way it did.

AI In UX Research: Challenging AI's thinking

Excerpt of Gemini’s output – its thought process and steps for coming up with three user personas

Step 3: Resolving Persona Ambiguity with Gemini’s Guidance

Let’s start with the good things about Gemini first: in most cases, it was spot-on. The user personas closely aligned with my own analysis; my final outcome also revealed three distinct user personas, shaped by their preferred payment methods and underlying attitudes.

AI In UX Research: User Personas Generated by Gemini

Meet Ben, Natalie, and Aisha! 

[Note: The visual elements of these personas were created by yours truly (with stock photos!) and were not generated by Gemini or any other AI]

 

A real turning point in my analysis was realizing Gemini’s potential as a true research partner. During the process, I was stuck categorizing a colleague who used digital payments (like a Digital Native) but was also worried about security for big purchases (like a Cautious Pragmatist).

Unsure how to proceed because I was contemplating creating a fourth persona, I decided to bounce the dilemma off Gemini. It was in that moment that I saw the analytical value I’d been missing and realized that not every interaction needed to follow a formal prompt framework. Here’s what happened when I directly asked Gemini for its opinion:

AI In UX Research: Asking gemini for opinions

Stepping away from the script and formality, and just chatting research with Gemini about whether this participant should be a single persona

 

AI In UX Research: address the overlap in user data with gemini

Unprompted, Gemini came through with its role as my research partner to suggest next steps for me

 

AI In UX Research: Gemini's recommendation for persona dilemma

Additionally, I was also getting recommendations, but I still needed to make the final decision

 

It explained why she appeared to fit both the Digital Native and Cautious Pragmatist personas, then suggested how to handle the overlap and provided recommendations for next steps. With that detailed explanation, I ultimately categorized her as a Cautious Pragmatist, focusing on her underlying attitudes. This moment crystallized for me how Gemini could serve as a supportive research companion, efficiently summarizing content and helping me see patterns more clearly so I could make confident, informed decisions.

Human Validation: The Importance of Verification

Okay, so here comes the “but”. Even with my confidence in Gemini growing, as researchers we have a responsibility to ensure the accuracy and validity of the results. My deep-dive into Gemini’s “thought” process, comparing its findings with mine, also revealed a limitation: where are the finer nuances?

While Gemini’s analytical strengths were evident, its difficulty with subtle details when trying to be concise showed why human intuition is still important:

  • Lack of rich and vivid details: I needed to ask clarifying questions about transaction specifics or motivations. For example, I wanted to know what the “wide range of transactions” Ben used his credit card for actually included. While I could manually retrieve this missing context or ask Gemini to provide it, doing so for every sentence felt counterproductive and detracted from the initial efficiency gains.

AI In UX Research: Statement to clarify

The original statement about Ben’s persona that I wanted to clarify

 

AI In UX Research: Prompt for Gemini to provide more details

I prompted Gemini to provide more details about that statement

 

AI In UX Research: Long revised description

Gemini can definitely retrieve all the information you need, even if the updated version becomes a bit much

 

AI In UX Research: Gemini's lack of balance in explanation

Gemini’s balancing act is helpful, but repeating it for every point? Manual editing might just seem more efficient at this point, hmm…

 

  • AI oversimplification: I also noticed that Gemini’s analysis displayed a tendency to oversimplify information. It described Ben as hesitant to adopt digital payments, even though my notes showed he was actually quite open to them and simply had specific reasons for his strong credit card preference. Fortunately, Gemini is more than capable of providing additional analysis and revising its statement to align more closely with Ben’s true behavior.

AI In UX Research: original statement of persona behaviour

Original statement about Ben’s behavior by Gemini

 

AI In UX Research: prompt to challenge Gemini's statement

I wanted to understand why Gemini labelled Ben as hesitant because from my session, I could tell that Ben openly accepts and uses digital payment methods

 

AI In UX Research: Gemini as a brainstorming partner

Challenging Gemini’s responses: it’s a great brainstorming partner, but be cautious of excessive agreement, as large language models (LLMs) tend to align with the preferences and framing you give them

 

AI In UX Research: using own judgment for persona instead of relying on AI

Using my own judgment, I saw that Ben does acknowledge digital payments so I updated his description to match that

 

Phase 4 showed that while Gemini can handle complex data, it still risks oversimplifying information, much like human researchers. To address this, here are some considerations that worked for me:

  • Let Gemini handle the heavy lifting (data-wise): Leverage Gemini’s speed and efficiency to navigate through large volumes of data and extract key information.
  • We decide on the details that matter: As the human researcher, determine the optimum level of detail for impactful findings, guided by context and goals.
  • Let’s circle back to the word, collaboration: Combine Gemini’s power with our understanding to ensure accurate, relevant, and impactful outcomes.

AI In UX Research: dos and don'ts of using AI during qualitative data analysis

>> Have a look at the final personas here! <<

Phase 5: Communicating and Presenting Findings

In this final phase, I did not use Gemini to prepare and present findings to stakeholders because this was an internal research study; I concluded the process by creating the user persona documentation to be presented. At most, I explored how Gemini would communicate findings by looking at the narrative it crafted for the user persona content.

Recalling the risk of oversimplification and the lack of nuance from Phase 4, I found that Gemini’s initial output often lacked compelling storytelling. While the content might seem promising at first glance, experienced researchers would likely find the AI-generated personas vague and lacking narrative depth, which raises concerns about lifting AI-generated information and using it directly.

AI In UX Research: Gemini lack narrative description

Natalie’s behavior description is clear but could benefit from more emotional depth and vivid imagery

AI In UX Research: Putting story and human touch into an AI persona

Natalie’s persona becomes more engaging when vivid scenes, emotional lows, and connector words are used to build a fuller story

 

However, I do see AI’s potential to help generate visual aids and create relevant presentation templates. For this phase, I would suggest using Gemini or other AI tools for creative inspiration on how best to display information, both in text form and visually. Then, when preparing for stakeholder presentations, provide AI with clear style guidelines, reference materials, and audience personas, and refine with your own human touch, as we have discussed in a previous article.

 

AI In UX Research: Dos and Don'ts when using AI during presentation

Final Thoughts: My Take on AI for UX Research

Reflecting on my initial questions about AI’s speed, efficiency, and creativity in UX research, this opportunity gave me the answers I needed:

  • Speed: Undeniably, AI offers impressive immediate output!
  • Efficiency: Subjective – while AI saves time in certain phases, the iterative refinement and clarification process often requires additional effort.
  • Creativity: A valuable asset for ideation and exploring diverse perspectives.

Admittedly, my understanding of how best to use AI is still a work in progress, and I’m open to better ways of doing so. But when it comes to its usefulness in UX research? I can confidently say yes. Even if additional learning is necessary, I can see myself integrating AI into various parts of my UX research workflow.

The big question for me is still about trusting AI with understanding people and their nuances. It’s powerful, no doubt, but that human intuition to empathize still feels uniquely human. So, for now, I’m thinking the best way forward is embracing AI step-by-step and finding a comfortable balance, similar to the process of adopting any new tool or software.

 

About the Author:
TSL

Swan Ling brings nearly three years of UX expertise in user testing, interviews, and research across the finance, insurance, and utilities sectors as a UX Researcher at Netizen. With a background in psychology and education, she instinctively critiques UX and CX in daily life, when not lost in the outdoors or deep in the forest.
