Reflection on Running a Focus Group-Style Study

Recently, I completed a focus group study that used Repertory Grid techniques to answer the research question: how do people experience Tangible User Interface (TUI) materials? This was the first focus group I had ever run, so there was quite a learning curve involved in preparing for it. For the most part, however, the study ran smoothly – aside from the expected last-minute participant drop-outs and technological issues with my recording equipment, I didn’t encounter any major problems. That said, my skills at running a study like this can certainly be improved upon. Here, I will discuss the areas that could use a little work, and what I might do differently next time around.

Leading the Conversation

One of the main challenges I faced, and one I was perhaps ill-prepared for, was how to lead a conversation without influencing its language or ideas. Considering that a major result I was looking for from this study was a series of personal constructs (i.e. descriptive words explaining an experience) identified by the participant group, I had to carefully monitor my word choices, tone and even body language so as not to unduly influence or even control the conversation. There were times when one participant might offer a suggestion, and my instinct might be to pounce on a phrase or idea they presented, revealing through my animated voice or hand motions that I found this idea fascinating or important. I had to constantly remind myself not to do this, as I could end up leading the conversation in a direction I wanted, and not necessarily one that the group was interested in pursuing. A skill I certainly need to develop is learning, as a facilitator, when to lead and when to just be quiet and let the conversation flow.

Getting to the Bottom of a Thought

This task was, perhaps, the most challenging of all. Because my study was focused on understanding people’s experiences handling and interacting with the materials, it was vital that I communicated to the participants that this was the information I was looking for. I wanted to understand what emotions, thoughts, reactions, etc. they had to each individual material. However, this proved to be a difficult concept to convey. Most participants found it difficult to describe their experiences – instead, they would end up describing the physical characteristics of the material (‘it feels heavy, it smells bad’). During the group discussion, I had to keep coming back to the idea of experience, asking more and more probing questions that tried to get to the bottom of how the material was being experienced. Perhaps, if I were to run a similar study again, a warm-up activity underlining this concept could be incorporated.

To Script or Not to Script?

Going into this study, I had prepared the study activities in careful detail, down to the table arrangements and camera placements. However, I hadn’t given much thought to what I would actually say. I had a broad idea of introducing the project, explaining to participants what they would be doing, pointing out the bathrooms and refreshments table, and just firing ahead with it. Which is exactly what I did, and I am still undecided whether this was the right decision. On one hand, my off-the-cuff welcome and informal interludes lent a relaxed, chatty vibe to the proceedings, which may have contributed to how quickly participants started chatting in their groups. However, I felt that my introduction and overview were very clearly unrehearsed and unprepared – which is not necessarily a good thing. I worry that while it did contribute to a chatty atmosphere, it may have reflected poorly on the professionalism of my study. Participants may have left with an impression of a less-than-perfectly organised event – something that, while perhaps true to a certain extent, is not the impression I would like to give off. It may be worth my while contacting a few of the people who took part in the study to ask for their opinions – did the study seem well-run? Did they feel that it was poorly organised? This could provide me with the basis for an answer to the question of whether scripting a study like this is a good idea.

Knowing When to Stop

The final phase of the study involved all participants joining in on a conversation about their experiences using the materials in order to arrive at a number of personal constructs. The key word in that sentence is ‘number’: when is enough enough? How do you know when your participants are still thinking, or just plain out of ideas? This was something I thought I knew the answer to going into this study – aim for 10 to 12 – but I soon realised it is not so simple. Participants would seem to be out of ideas, then one would throw out a comment, and suddenly they were off again, bouncing around ideas and filling my camera with a stream of useful data. It was difficult to determine when they were actually done, or when a prod from me was required to spur them on. While it is always worthwhile trying to elicit more thoughts and opinions from people, I did not want to draw out ideas that they did not necessarily believe in, but felt the need to adopt to satisfy me. It was important for me to ensure participants only volunteered opinions that they actually held – not ones invented to keep me happy as I asked for more and more. Knowing when to stop, therefore, was important, not just for the participants’ sake but for the quality of the data as well.
