Clicking, Learning, Telling: Audience Response System Use in Remote Locations

Do you enjoy a challenge? My assignment was to determine if specific harvest skills had been transferred from our workshop to lead farmers and then on to individual farmers in remote villages in Rwanda. (Figure 1)

 

[Photo: villagers in Rwanda demonstrating their clickers]

Figure 1. Kirehe, Rwanda: Members from two farmer cooperatives (Kimaranzara and Tuzamure Ubukungu) holding audience response devices.

 

Completing this assignment required coming up with answers to four key questions:

  • How can you determine if new skills are being applied in a remote location?

  • How can you expand the size of a focus group and still ensure everyone is heard?

  • How can you get qualitative and quantitative data to analyze training outcomes?

  • How can you create an engaging experience for the participants and the interviewer to learn together?

I discovered that, with an audience response system, it is possible to collect meaningful data in the most remote places. In this article, I’ll share my experiences with you, and some lessons learned that you can apply no matter where in the world your particular challenge is located.

The “back story”

The United States Agency for International Development (USAID) has been collaborating with the government of Rwanda to ensure adequate food supply for the citizens of this landlocked nation. One strategy has been to increase the local production of critical staple crops such as maize (corn) and beans.

To this end, our company (ACDI/VOCA) had been selected to work on a program to help 25,000 farmers reduce crop loss during and after harvest. Specifically, farmers are to apply new skills to better collect, process, dry, and store their crops.

To reach this large number of farmers, we focused on existing agriculture cooperatives that could help us to facilitate cascade training. In cascade training, cooperatives identify and send lead farmers to our harvest training. After being trained, these lead farmers return to their communities and train other members of their cooperatives.

Twenty-four cooperatives were selected to participate in the program. We trained eight cooperatives in the first season. After the training, we assessed the program and made adjustments before training the other sixteen cooperatives.

In January 2011, the program trained 240 lead farmers from eight cooperatives in Kirehe, a southeastern district near the Tanzania border. In March, I went to Rwanda to assess the results of the cascade training. My assignment was to determine if specific harvest skills had been transferred from our workshop to lead farmers and then on to individual farmers in the remote villages of Rwanda.

Clicking: giving everyone in the room a voice

I used audience response devices from Turning Technologies. I had thirty responders (affectionately nicknamed “clickers”) and one handheld receiver. Individuals respond to multiple-choice questions with a clicker, and I capture the aggregate results in my handheld receiver (Figure 2).

 

[Photo: the handheld receiver, with navigation buttons and a simple black-and-white results display, and a responder clicker with a 12-button pad]

Figure 2. Audience response devices from Turning Technologies: receiver (left) and responder/clicker (right)
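The mechanics are simple: each device sends a keypress, and the receiver keeps only aggregate tallies. Purely as an illustration (a minimal Python sketch with hypothetical names, not Turning Technologies’ actual software), the aggregation step amounts to this:

```python
from collections import Counter

def aggregate_responses(responses):
    """Tally one multiple-choice question.

    `responses` maps an anonymous device ID to the option pressed,
    e.g. {"clicker-07": "2", "clicker-12": "1", ...}. Only the
    counts are kept, so no answer can be traced back to a person.
    """
    return Counter(responses.values())

# Hypothetical poll of five clickers:
poll = {"clicker-01": "1", "clicker-02": "1", "clicker-03": "2",
        "clicker-04": "1", "clicker-05": "2"}
print(aggregate_responses(poll))  # Counter({'1': 3, '2': 2})
```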

 

I met with over 130 members from all eight cooperatives. With clickers, I was able to increase the size of my focus groups to 15-30 people. (Figure 3)

 

[Photo: Rwandan participants using their clickers during a classroom activity]

Figure 3. COACMU and COACLMA farmer cooperatives

 

Focus groups are an effective way of collecting information, but the focus group leader must make efforts to compensate for outspoken participants, response conforming, withdrawal, and atypical responses. Using audience response devices allowed me to address each of these concerns and to provide a more comprehensive report to the project staff.

Outspoken participants. An interviewer dreads prolonged silence after asking a question, but even worse is a participant who monopolizes the time. The interviewer will try to call on others to speak, but this can make others uncomfortable by thrusting them into the spotlight. With clickers, everyone has a chance to respond immediately. It also gives everyone more time to prepare their comments. From the responses, the interviewer can ask questions such as, “I’d like to hear from those who selected option #2” to prompt different people, even those in the minority, to speak. This strategy was effective when I was asking the participants about moisture testing. For farmers, testing moisture is critical in knowing when a crop is ready to sell. Many farmers are loudly adamant about using an older tasting technique. After asking which moisture testing method the farmers used, I noticed that two participants had selected a newer method. I was able to draw out their story for others to learn from their experience.

Response conforming. Participants might be inclined to adjust their initial responses to conform with perceived leaders or towards answers that appear to be more accepted. Clickers democratize responses so that the interviewer gets a truer impression of participant opinions. When I asked participants to tell me how many hours they spent training other farmers, verbal responses tended to norm around the first answer given. When I used clickers, I got a broader range of answers (and typically more accurate ones, judging by comparison with time estimates given by trainees).

Withdrawal. This can occur when there are outspoken participants (see above). However, withdrawal can occur with any focus group, as people are passive while waiting for a turn to speak. Clickers allow everyone to respond immediately after a question. Participants are also eager to hear the results. Hearing how others have responded is valuable information to individuals assessing their own behaviors and performance. I have witnessed groups with clickers buzz with energy as they await the next question and, most importantly, the results.

Atypical responses. Focus groups can provide in-depth stories and examples. But an interviewer must determine the commonality of an individual’s experience. A compelling story might be unfairly weighted as an example of the collective experience. Others in a focus group may be reluctant to provide contradictory evidence. Clickers provide a foundation of quantifiable information upon which stories can be put into context for frequency and commonality. I had one farmer tell me about the long distance he had to travel to take his crops to market. With the clickers, I was able to create ad hoc survey questions. A quick survey with clickers informed me that his situation, although a real challenge, was not a common problem.
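The arithmetic behind that ad hoc check is worth spelling out. Here is a sketch (in Python, with invented numbers rather than my field data) of turning one quick clicker question into a commonality estimate:

```python
def commonality(counts, option):
    """Return the share of respondents who picked `option`."""
    total = sum(counts.values())
    return counts.get(option, 0) / total if total else 0.0

# Hypothetical ad hoc question: "Is distance to market a major
# problem for you?" asked of a 28-person focus group.
counts = {"yes": 2, "no": 26}
print(f"{commonality(counts, 'yes'):.0%} share this problem")  # 7%
```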

Learning: are Rwandan farmers applying new skills?

Monitoring and evaluation is a critical process for reviewing activities and assessing performance. I loosely adopted Kirkpatrick’s framework for reviewing the training’s effectiveness.

My goals were to record trainee satisfaction and perceived relevance of the training (level 1), confirm learning of new skills (level 2), and identify skills being adopted in the fields (level 3). It was too soon to record whether the new skills resulted in less crop loss and more revenue for the farmer (level 4), but I could begin to collect their anticipated outcomes. (Figure 4)

 

[Photo: Rwandan farmers gathered in a rudimentary classroom with clickers in hand]

Figure 4. Remote visit to Indakemwa farmer cooperative

 

Clickers facilitated my collection of information for all levels of questions. Clickers, in the end, cannot replace the richness of observations that come from walking through fields and observing skills in action. However, clickers do provide a foundation of data by which one can conduct observation visits to the farming fields more systematically.

Level 1: trainee satisfaction and perceived relevance

I reviewed the four stages of the harvest training (collect, process, dry, and store) with the participants. For each stage, I asked if they were satisfied with the skills they were taught; specifically, did they believe the skills would make a positive impact in their farming.

With the clickers, I could see that the results were nearly unanimous that all the skills would make an impact. Although this was helpful feedback, I wanted to delve deeper. I then asked participants to click on the stages that were most helpful and least helpful.

The clickers showed me that participants were most satisfied with training in harvest collection and processing skills, and that they were least satisfied with the training in storage skills. I was then able to have a discussion as to how we could improve the storage training and make it more relevant.

Level 2: confirm learning of new skills

Typically, Kirkpatrick’s level 2 asks if the participants have acquired new knowledge or skills. As this was a cascade training program (training of trainers), I asked questions about how these lead farmers were delivering the training to other farmers.

With the clickers, I could rapidly collect information on the average number of farmers trained (16), the number of hours spent training each farmer (4), and which skills were well received (scheduling the harvest and drying crops on plastic sheets). I was also able to identify skills that were not being adopted. Assessing the moisture content of stored grain, for example, is an important skill.

Yet the clicker results showed few people adopting the new techniques for measuring moisture content. This led to in-depth discussions to identify resistance and to appropriately change our training materials.
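Because clickers capture only multiple-choice presses, numeric questions like “how many farmers have you trained?” are asked in answer bands, and averages come from mapping each band to a representative value. A sketch of that step, with hypothetical bands and invented responses:

```python
from statistics import mean

# Hypothetical answer bands for "How many farmers have you
# trained?", each mapped to a representative midpoint.
BANDS = {"1": 5, "2": 10, "3": 15, "4": 20}

answers = ["3", "4", "3", "2", "4"]  # invented clicker responses
print(f"estimated average farmers trained: "
      f"{mean(BANDS[a] for a in answers):.1f}")
```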

Level 3: adoption of new skills

As with any training program, adoption is the linchpin for success. For cascade (train-the-trainer) training, I essentially had two questions: What skills did the trainers (lead farmers) adopt? What skills had they observed their trainees (individual farmers) adopting?

Clickers, as with any self-reporting instrument, can be biased towards telling the interviewer whatever he wants to hear. First-hand observation cannot be omitted. However, having clickers gave me a clear picture of what to look for and the type of questions to ask during my observation visits to the fields.

For example, the clickers helped me to see what skills trainers said were being adopted. There was one glaring inconsistency. In level 2, trainers reported that trainees found drying grain on plastic sheets, rather than on the ground, a great improvement for enhancing grain quality. But here in level 3, trainers were reporting low adoption by farmers of actually using plastic sheets for drying grain. Follow-up discussions quickly revealed that the more remote villages did not have access to the polyurethane (plastic) material. When I later conducted my observation visits, I was prompted to ask those farmers using plastic sheets where they had purchased their materials, and to talk to cooperative leaders about including plastic sheets in their supply stores.
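That kind of inconsistency is easy to surface systematically once responses are tallied. A sketch (with invented figures, not the program’s data) that flags skills rated useful in level 2 but reported as little-adopted in level 3:

```python
# Share of trainers calling each skill useful (level 2) vs. share
# reporting trainee adoption (level 3); all values are invented.
usefulness = {"plastic-sheet drying": 0.90, "harvest scheduling": 0.85}
adoption = {"plastic-sheet drying": 0.25, "harvest scheduling": 0.80}

for skill, useful in usefulness.items():
    adopted = adoption.get(skill, 0.0)
    if useful - adopted > 0.30:  # a large gap suggests a barrier
        print(f"Investigate {skill}: rated useful by {useful:.0%}, "
              f"adopted by {adopted:.0%}")
```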

Level 4: impact at the organization (cooperative) level

Here is the purpose of the program: reduce harvest losses to provide more food and income for farmers and their families. Although it was too early to measure this, I could use the clickers to collect information from farmers about current land size, historical losses, and anticipated loss reduction with the new skills. Although the forward-looking responses are conjecture and subject to environmental factors, the clickers allowed me to rapidly collect and aggregate information from the people closest to the fields.

Telling: sharing results with external stakeholders

The most effective reports contain both quantitative and qualitative information. Observation visits to farming fields and rural focus groups are typically qualitative in nature due to the limited sample size and the limited ability to collect standardized answers from everyone. Clickers helped me collect quantitative information that was then enriched with specific qualitative examples from follow-up discussions. This avoided the overemphasis of interesting but atypical stories.

To demonstrate clicker functionality to the external stakeholders (funding donors and other organizational staff) to whom I was reporting, I had the stakeholders use the clickers to guess answers to information I had collected. For example, before presenting how many hours the average trainer trained a farmer, or which training session was most popular, I had stakeholders click an answer. (Audience response systems also allow you to project clicker responses, in real time, into a PowerPoint presentation.) Not only did this demonstrate the use of the clickers, it showed that stakeholders were learning new information. And it invited active participation and meaningful discussion throughout the entire presentation!
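The vendor’s software handles the live PowerPoint display; the underlying idea, though, is just re-rendering the tally as responses arrive. A bare-bones text version of that display (a sketch with made-up numbers, not the vendor’s API):

```python
def render_tally(counts):
    """Print a simple bar chart of the aggregate responses so far."""
    total = sum(counts.values()) or 1
    for option, n in sorted(counts.items()):
        bar = "#" * round(20 * n / total)
        print(f"{option:<10} {bar} {n} ({n / total:.0%})")

render_tally({"1 hour": 3, "2-4 hours": 14, "5+ hours": 7})
```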

Interviewer tips

No single method of data collection should be used exclusively. Researchers must consider an appropriate balance of observation, interviews, and record review. For focus groups, I found the use of an audience response system to be engaging and informative. I have a few tips to offer, so that you can learn from my mistakes and experiences.

  • Introduce the clickers to focus groups by asking simple questions. I asked, “Will it rain tomorrow?” as my first question. This helped participants get familiar with the clickers. Also, the diverse results led to a discussion that differing opinions were acceptable and encouraged.

  • Show how results are aggregated and anonymous. After the first question, I walked around and showed everyone the results on my screen. They could see what information I had and that there was no way to identify them individually.

  • Share results after each question. Participants are extremely curious to see the aggregate results. It is the least we can do in providing information back to focus group participants who are volunteering their time to meet with us.

  • Avoid abstract scales. When I asked people to rate their satisfaction on a scale of 1-9 with 9 being high, I got blank stares. Several minutes of explanation didn’t help. It was more effective to say “Press 1 if you didn’t like it, press 2 if you did like it, press 3 if it was your favorite.” Each number needed to represent a specific answer. Scales can be too abstract for some people. (A sketch of this kind of encoding follows this list.)

  • Explore several vendors. There are several vendors offering audience response systems. Be sure to research several options before selecting a system that meets your organizational needs.
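To make the “concrete labels” tip tangible, here is how a question might be encoded so that every button corresponds to one specific statement (a hypothetical structure, not any vendor’s format):

```python
# Every button maps to a concrete statement, never an abstract scale.
question = {
    "prompt": "How did you find the storage training?",
    "options": {
        "1": "I didn't like it",
        "2": "I liked it",
        "3": "It was my favorite",
    },
}

for key, label in question["options"].items():
    print(f"Press {key} if: {label}")
```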

But I don’t work in Rwanda, so how does this apply to me?

Not everyone works in rural Rwanda. However, if the definition of “remote location” includes any place that isn’t within a standard meeting room, then there are many more places that an audience response system can be effective in collecting information.

At work, trainers can collect information from employees wherever they might gather – from the break room to the factory floor to the call center. In community workshops, facilitators can query participants in any indoor or outdoor setting. Even for assessing low-intensity behavioral change efforts such as public messaging, evaluators can interview groups of random citizens rapidly and thoroughly through clicker use.

Remote settings can be challenging, but an audience response system empowers trainers and evaluators to rapidly gather extensive information with a minimal amount of equipment. There are so many voices waiting to be heard, and you can be just a click away from listening. Good luck and happy clicking!
