
Evaluation Surveys

A blog about creating, conducting and reporting on Evaluation Surveys


Communicating Reflections cont... Reporting Survey Responses (Specific)

2/6/2022

Heads up – this post is also about content rather than data viz

1. Rating scales 
Report the responses by category (e.g., Excellent, Very Good, Fair and Poor) – especially if this is a standalone question and each category contains enough responses to be worth reporting.

Report the responses of only the “top two” (positive responses) or “bottom two” (negative responses) for each question – especially if this is one of a series of questions as this can help to highlight what the differences are between the questions.

Report the means (averages) for each question if this is a series of questions – as this can also highlight differences and similarities between them.
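The three reporting options above can be sketched in Python. The responses below are hypothetical answers to a single 4-point rating question:

```python
from statistics import mean

# Hypothetical responses to one 4-point rating question
# (4 = Excellent, 3 = Very Good, 2 = Fair, 1 = Poor).
labels = {4: "Excellent", 3: "Very Good", 2: "Fair", 1: "Poor"}
responses = [4, 4, 3, 3, 3, 2, 2, 1, 4, 3]

# 1) Report by category
counts = {label: sum(1 for r in responses if r == v) for v, label in labels.items()}

# 2) Report the "top two" (positive) share
top_two = sum(1 for r in responses if r >= 3) / len(responses)

# 3) Report the mean
avg = mean(responses)

print(counts)            # {'Excellent': 3, 'Very Good': 4, 'Fair': 2, 'Poor': 1}
print(f"{top_two:.0%}")  # 70%
print(avg)               # 2.9
```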
 
2. Multiple responses
Choose to report the responses either 1) as a percent of all responses or 2) as a percent of all individuals (cases). For example, if a participant responds that they like both apples and oranges, the apple response can count as one of two (responses) or as one of one (individuals).
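A minimal sketch of the two denominators, using hypothetical multiple-response data:

```python
# Hypothetical multiple-response data: each inner list is one individual's choices.
cases = [
    ["apples", "oranges"],
    ["apples"],
    ["oranges"],
]

all_responses = [choice for case in cases for choice in case]  # 4 responses in total
apple_count = all_responses.count("apples")                    # 2

pct_of_responses = apple_count / len(all_responses)  # 2 of 4 responses = 50%
pct_of_cases = apple_count / len(cases)              # 2 of 3 individuals ≈ 67%
```

Note that percents of cases can sum to more than 100% across categories, since one individual can appear in several categories.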

3. Other specifies
Report “other specify” comments in existing categories if they match. For example, if a participant responds in other specify comments that they like “granny smiths” but did not check the existing category of “apples”, then include their response in the apples responses when reporting. Report only other specify responses that cannot be included in existing categories as “other”.
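This recoding can be sketched with a hypothetical recode map from "other specify" text to the existing categories:

```python
# Hypothetical recode map from "other specify" comments to existing categories.
recode = {"granny smiths": "apples", "navel": "oranges"}

other_specify = ["granny smiths", "kiwi"]
checked = {"apples": 3, "oranges": 2, "other": 0}  # counts from the checkbox question

for comment in other_specify:
    # Fold matching comments into an existing category; the rest stay as "other".
    category = recode.get(comment.lower(), "other")
    checked[category] = checked.get(category, 0) + 1

print(checked)  # {'apples': 4, 'oranges': 2, 'other': 1}
```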

4. Open ends 
Report the themes in open-ended responses. First, identify the theme categories, then look back at the responses to see which themes are present in each response.
 
If you have many open-ended responses to a question, you may want to assign a code to the themes in each response. Then tally the numbers by code to see what percent of your responses mention that theme.
 
If you have lots of themes, use NETs. Identify an overall code (NET) for a theme e.g., fruit and then sub-codes for sub-themes mentioned in the responses e.g., apples, oranges, pears, banana. This way you’ll be able to report on how many survey respondents talked about fruit, as well as how many specifically mentioned each type of fruit.
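The NET/sub-code tally might look like this, using a hypothetical code frame and hypothetical coded responses:

```python
# Hypothetical code frame: NET code -> sub-codes.
net_frame = {"FRUIT": ["apples", "oranges", "pears", "bananas"]}

# Themes coded in each open-ended response (one set per respondent).
coded = [{"apples"}, {"apples", "oranges"}, {"pears"}, {"price"}]

sub_counts = {}
net_counts = {}
for net, subs in net_frame.items():
    for sub in subs:
        sub_counts[sub] = sum(1 for themes in coded if sub in themes)
    # A respondent counts once toward the NET no matter how many sub-themes they mention.
    net_counts[net] = sum(1 for themes in coded if themes & set(subs))

print(net_counts)  # {'FRUIT': 3}
print(sub_counts)  # {'apples': 2, 'oranges': 1, 'pears': 1, 'bananas': 0}
```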
 
5. Final note on reporting survey responses 
  • When adding percentages across surveys, be sure not to include the same individuals in the total count.


Communicating Reflections - Reporting Survey Responses (General)

1/10/2022

Heads up – this post is about content rather than data viz

1. Describe your sample – they are not the population

Describe who answered your survey – as well as how they may be different from those who didn’t answer it. For example, if a survey about hot lunches is answered by 100 of 400 parents at a school, report both numbers and note that parents who buy lunches are more likely to have answered the survey. Do NOT write “Our hot lunch survey shows that 70% of the parents at our school are satisfied with the lunches being served.”
 
2. Report frequencies by response category rather than number counted
USE:
Oranges   4
Apples    3
Berries   2
Grapes    2
Bananas   1
Plums     1

NOT:
Plums, Bananas    1
Grapes, Berries   2
Apples            3
Oranges           4
 
3. Always let your reader know how many people your % is based on – and who they are
In text – In this region, % of 112 participants used…
Under chart/table – Base = 112 participants
In methodology para – All percentages calculated on responses from 112 participants.

Do not use percents when the number of responses is less than 20. Report numbers rather than percents. For example, “four participants responded they liked oranges.”
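A small helper illustrating both conventions; the n < 20 cutoff follows the guideline above, and the example numbers are hypothetical:

```python
def report_share(count, base, label):
    """Report a percent with its base, or plain numbers when the base is small.

    Illustrative helper; the n < 20 cutoff follows the guideline above.
    """
    if base < 20:
        return f"{count} of {base} participants {label}"
    return f"{count / base:.0%} of {base} participants {label}"

print(report_share(4, 12, "liked oranges"))
# 4 of 12 participants liked oranges
print(report_share(78, 112, "used the service"))
# 70% of 112 participants used the service
```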

4. For questions your reader may want to compare - use consistent denominators - for example, calculate percents for each question on a) all responses or b) a sub-group of responses filtered on the same question  

5. Report on all those who were asked the question, not just those who answered it  
Non-responses are “responses” as well. They let you know how relevant the question is to those answering your survey. They also let you know when you’re not getting all the information you want with the response categories you’ve included. If a large proportion of your respondents didn’t answer a question:
  • report the total percentage and some of the possible reasons why so many did not answer it – was it perhaps not applicable to them, did their likely response not fit with the categories listed (based on the themes in responses to other questions). For example, “65% were satisfied with this service, though 20% didn’t rate it – possibly because they didn’t access it.”
  • report the number who answered the question, as well as recalculated percentages and some of the possible reasons why others didn’t answer it. For example, “80 clients rated this service, likely because they accessed it. Of these 80 clients, 81% were satisfied with the service.” 
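Both reporting options, using the illustrative numbers from the examples above:

```python
asked = 100      # everyone who was asked the question
answered = 80    # those who rated the service
satisfied = 65   # those who chose a positive rating

# Option 1: percentages calculated on everyone asked, with non-response reported.
pct_satisfied_all = satisfied / asked        # 65%
pct_no_answer = (asked - answered) / asked   # 20%

# Option 2: percentages recalculated on those who answered.
pct_satisfied_answered = satisfied / answered  # ~81%

print(f"{pct_satisfied_all:.0%} satisfied; {pct_no_answer:.0%} did not rate it")
print(f"Of {answered} who rated it, {pct_satisfied_answered:.0%} were satisfied")
```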


Jumping In - Inviting Evaluation Survey Responses

12/6/2021

1. Create a Contact List

- Who will you talk to? Make a table containing their…
  - Names and addresses (email addresses, regular mailing addresses or phone numbers – only the ones you’re going to need)
  - Consent to contact information – their agreement to be contacted
  - Other information for each person that you will use a) as filters in the survey and/or b) to ensure you speak to all groups of interest or analyse their responses e.g., demographics, position/organization (stakeholders), services received (participants), etc.

2. Decide on Formats and Processes

- How will you…
  - Send out your questionnaires – think about how your program usually communicates with who you’re surveying – is it by email, by phone, online or in person (drop-in/appointments/meetings/in-class)? This is also likely the best way to invite these individuals to your survey.
  - Have respondents fill out surveys – through online questionnaires, e-forms, printed booklets, over the phone, etc.
  - Receive the surveys back from respondents – online, via a (secure) mailbox, to a specific email address, etc.

3. Create an Invitation and a Reminder

- Write text for the format you’re using (letter, email, in-person script) and include:
  - A personalized salutation
  - Purpose of the evaluation – and the current survey
  - Response deadline and process(es) for responding
  - Likely length of time it will take them to answer the questions
  - Assurance of confidentiality of responses/privacy
  - Appreciation for their time and response
  - Link to survey (or mention of attachment/enclosure)
  - What will be done with the responses – especially if they will be receiving a copy of the totalled responses

4. Send Out Invitations, Reminders and Track Returns

- Check everyone has been sent a survey invitation or reminder. Any bouncebacks? Look for new addresses or back up contacts.

- Track the number who were unreachable – as well as those who you know were reached but:
  - didn’t respond,
  - partly responded (answered some questions) and
  - completely responded (answered most questions)

- Calculate your response rate – this is the number who completely responded divided by the number sent an invitation (and reached) – reported as a percentage
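The calculation, with hypothetical counts:

```python
invited = 250      # invitations sent
unreachable = 10   # bouncebacks with no alternate contact found
complete = 96      # answered most questions

# Response rate = complete responses / invitations that actually reached someone.
response_rate = complete / (invited - unreachable)
print(f"Response rate: {response_rate:.0%}")  # Response rate: 40%
```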


Final Checks - Is your evaluation survey asking what you need it to?

11/1/2021

Your questionnaire’s been drafted, discussed, revised, discussed, tweaked, programmed, pre-tested and reprogrammed. Or at least some of the above.
 
Wondering if it’s still asking what you need it to?
 
Here are three ways to take your program evaluation questionnaire and final-check it:
 
1) To your logic model – code each survey question back to the boxes or the linkages between them on your logic model. For example, maybe Q7 asks about an activity and Q12 about a short-term outcome. Code Q7 to activities and Q12 to outcomes.
 
Then ask yourself - which elements of the program logic will you be able to evaluate based on the survey responses? Are all the activities covered or at least the main ones? All the key short-term outcomes? What key linkages are you able to test with this survey? Any repeats or gaps?
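The coding-and-gap check can be sketched in code, with hypothetical question codes and logic model elements:

```python
# Hypothetical coding of survey questions to logic model elements.
question_codes = {
    "Q3": "activities",
    "Q7": "activities",
    "Q12": "outcomes",
}

logic_model = {"inputs", "activities", "outputs", "outcomes"}

covered = set(question_codes.values())
gaps = logic_model - covered
repeats = {code for code in covered
           if sum(1 for c in question_codes.values() if c == code) > 1}

print(sorted(gaps))     # ['inputs', 'outputs']
print(sorted(repeats))  # ['activities']
```

The same pattern works for check 2 (evaluation issues): swap the logic model elements for issue labels such as implementation and effectiveness.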
 
2) To your evaluation issues – code each survey question back to an evaluation issue. For example, maybe Q7 asks about implementation and Q12 about effectiveness. Code Q7 to implementation and Q12 to effectiveness.
 
Then ask yourself – which evaluation issues will you be able to (partly) address using these survey responses? Are all the evaluation issues you planned to (partly) answer using the survey covered? Any repeats or gaps?
 
3) To your report outline – create an evaluation report outline and enter each of your survey questions into it e.g., as bullets. Better yet, enter each question in a format which refers to the findings you’ll be reporting. For example, maybe for Q7 - % were satisfied with an activity and for Q12 - % reported a change in behaviour (as a result of the program).
 
Then ask yourself – how well does your report flow? Will you be able to speak to the key findings you plan to? Any repeats or gaps?
 
Not all questions need to be directly connected to a topic – some are filter or flow questions that are needed for you to be able to ask others.
 
Sometimes repeated questions are intentional. But if they’re not, they are a good place to do a final prune. If there are question gaps – and they are important – you may need to make a last-minute addition to your questionnaire.
 
The changes made to the questionnaire can be improvements to a logic model or to evaluation issues developed earlier. As a questionnaire is revised, it better reflects reality. For example, it is more likely to use the words that clients or staff or stakeholders use to describe the program, especially after a pre-test. As well, questions may have been refined when the questionnaire moved from being a paper version to being an online version – as the programming led to more clarity. Once you’ve done your final check of the questionnaire, it may be a good time to go back and tweak your logic model or evaluation matrix.


Following the Path - Drafting an Evaluation Survey Questionnaire

10/4/2021

Step Four – Create Questions
- Take your outline (see Steps One to Three in Getting a Clear Start to create one if you haven’t already!) and turn your bullets into questions
  - One topic per question (watch out for the words “and” & “or”)

Step Five – Create Response Categories
- Decide on the type of response you’re interested in for each question e.g., single choice, multiple response, open-ended comments
- Create wording for choices e.g., always/often/ occasionally/never
- Group together questions with similar response choices into question sets or grids
- Use the same scale length throughout the questionnaire:  choose 4-point scales or 5-point scales – not both (exceptions only if you’re going to compare a question’s responses to other research e.g., benchmarks)
- Give positive responses the higher values (then you won’t have to reverse the values during the analysis)
- Provide appropriate “don’t know” and “not applicable” categories (almost always give respondents an “out” so they don’t get frustrated)

Step Six – Create Add-ons
- Add an instruction for each question in the survey e.g., “choose one” or “choose all that apply”
- Add skips if they are needed – use IF and THEN wording to focus on who continues and who doesn’t (and so where they should go)
- Add Introduction and Thank You texts
- Add different formats to different questionnaire components e.g., bold questions, italicize instructions, use ALL CAPS for response categories

QUESTIONNAIRE DRAFTED!


Getting a Clear Start - Creating an Evaluation Survey Outline

7/12/2021

Here are three steps for getting started on an evaluation survey.
Step One – Create a List of Indicators
- Copy and paste indicators from the evaluation matrix into a document. Remove any duplicates.

Step Two – Create Sections
- Organize the indicators into sections with bullets
  - By evaluation issue or topic e.g., relevance, needs, effectiveness
  - In chronological order e.g., awareness, then participation, then outcomes
  - Taking into account sensitivity – with less sensitive questions near the beginning and demographic questions near the end
- Add headings to each section
- Add an introduction section – welcoming respondents and briefly describing the survey purpose/use, likely length and privacy/confidentiality/consent 

Step Three – Create Flow
- Add placeholders for questions needed as filters/for flow
- Check flow
  - Working forwards
  - Working backwards
  - As a respondent
  - As a survey results user

OUTLINE DONE!

