Survey Data Interpretation


Data Interpretation Response Suggestions

Consider Anthony's administration method and response rate. Is there anything he might do differently in the future to increase response rates?

Follow up with non-respondents over the phone.

Administer it as an e-mail or phone survey.
Are there any rating-scale items that could be re-worded or eliminated?

The satisfaction-with-the-program items may be redundant.

Include more specific items, such as whether the provider was on time and kept appointments and deadlines, and the family's confidence in the provider's advocacy at meetings.

The responses across items are very high. Anthony might consider changing his scale labels to increase the instrument's sensitivity.
Consider how respondents replied to the open-ended item. Should Anthony keep this item or re-word it?

He might re-word it to increase item response rates. For example, "Please identify one thing your service provider could do that might improve services for your child and family."
Any other comments about these data?

This administration included clients of supervisees who were at different stages of supervision. Rather than administering the surveys all at once, Anthony could administer them as post-surveys when supervisees finish their supervision terms.

This administration likely served more to help Anthony refine his measure and survey methods than it did to reliably assess supervision services. Anthony might keep these data as a pilot of the survey tool but not make any serious supervision program modifications.