Devon Whetstone Evaluation Report

Evaluator Participants

The participants involved in the evaluation of the pilot website are a sample of end users, each with at least three years of experience in student learning assessment. All are Stephens College faculty, subject matter experts in their respective fields, and credentialed with a terminal degree. Each was provided with a summary of the use cases and asked to attempt each task and report on the experience using the evaluation form.

Participant Bios

Tina O. is a 50-year-old Caucasian female, graduate faculty member, and program director for the Strategic Leadership program. She holds an M.Ed. and an M.B.A. and teaches exclusively online. Her interest in an assessment website is in locating resources that will help her with her assessment plan. Although T.O. primarily uses Chrome as her browser, I asked her to test the website using Safari, as she was the only user with a Mac and another participant had already tested Chrome. She tested the site at home using her own cable internet connection.

Gina S. is a 47-year-old Caucasian female, graduate faculty member, and program director for the Counseling program. She holds a Psy.D. and teaches primarily in-seat students. Her interest in an assessment website is in using the rubric repository for soft-skills rubrics and rubrics related to her program content. G.S. tested the website using Firefox on a Stephens College PC with institutional internet access.

Carrie W. is a 45-year-old Caucasian female and undergraduate faculty member who teaches general education math and science courses. She holds a Ph.D. in Mathematics and primarily teaches in-seat students, though she has taught online courses in the past. Her interest in an assessment website is in using the rubric repository. She does not like "tutorials" or "how-to videos" and does not believe she would use that component of the website. C.W. tested the site using Chrome on a Stephens College PC with institutional internet access.

Summary of Evaluators' Comments

Summary of Evaluator Ratings
Item T.O. G.S. C.W. Average by Item Average by Subscale
Design 1 5 5 5 5.0 Design = 3.4
Design 2 5 3 5 4.3
Design 3 5 4 5 4.7
Design 4 5 5 5 5.0
Design 5 1 5 5 3.7
Design 6 1 5 3 3.0
Design 7 1 1 1 1.0
Design 8 2 4 4 3.3
Design 9 1 1 1 1.0
Content 1 5 5 5 5.0 Content = 4.6
Content 2 5 5 5 5.0
Content 3 5 5 5 5.0
Content 4 5 5 5 5.0
Content 5 1 3 5 3.0
Content 6 3 5 5 4.3
Content 7 5 4 5 4.7
Credibility 1 5 5 5 5.0 Credibility = 4.1
Credibility 2 5 5 5 5.0
Credibility 3 1 5 1 2.3
Credibility 4 1 5 1 2.3
Credibility 5 5 5 5 5.0
Credibility 6 5 5 5 5.0

Total Average by Evaluator
T.O. G.S. C.W.
3.5 4.3 4.1

Qualitative Feedback Summary

Design Component

Overall, the evaluators were in general agreement regarding what was done well and what improvements are needed. The general consensus is that the website is logically divided into its subpages and the navigation is intuitive. All were able to access the homepage from every subsequent page. All three commented on the difficulty of reading the green text used in the navigation menu when hovering over a link, and each suggested changing the green to a different color. The use of the institution's color palette was well received.

To improve this area, it was suggested that I make content more accessible and include alt, height, and width attributes for multimedia elements. Tina also suggested I work with our ADA coordinator to make content more accessible.
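As a sketch of what that recommendation might look like in practice, an image element can carry descriptive alt text and explicit dimensions (the file name, alt text, and sizes below are hypothetical placeholders, not markup from the actual site):

```html
<!-- Hypothetical example: descriptive alt text lets screen readers
     describe the image, and explicit width/height let the browser
     reserve layout space before the file loads. -->
<img src="sample-rubric.png"
     alt="Sample analytic rubric showing four performance levels"
     width="600" height="400">
```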

Content Component

The evaluators determined that the use cases were successful and the content appropriate. All of the links were functional, and no content was more than two layers deep. They trust, on the basis of my subject matter expertise, that the information is current. All three mentioned that the site is not yet in a place to stimulate thinking or deep reflection, but suggested it may become so with further development.

Credibility Component

This area arguably needs the most improvement. The evaluators did not find a bias statement regarding the content and suggested it would be a good idea to include one. While my credentials are easily located on the home page, they suggested I still need to cite my sources, even if the sources are my own. All agreed that no spelling or punctuation issues are present.

Raw Data

This PDF contains copies of the raw evaluation data. I have changed the comment text to red to improve the legibility of the feedback.

Improvement Plan

Based on the feedback I received from the evaluators, I intend to make the following changes to improve the design, content, and credibility of the site:

Reflection

This process was useful as well as difficult. In many ways I felt exposed at having my work evaluated by what was essentially a jury of experts, especially since it was unfinished. The concept of engaging in formative feedback throughout a project, rather than at the end, is a new process for me and difficult to wrap my head around. I am used to creating a final product and then going back and revising the whole thing. However, I can certainly see the utility of conducting small evaluations throughout development, as it makes more sense to set up a solid foundation before moving on.

I was very fortunate to find three end users to pilot my site. They were precisely the population I am targeting, so their feedback was invaluable. However, I do wish I had had access to someone who understands HTML coding and could evaluate my code. I was excited to hear that all three evaluators were able to successfully complete the three use cases.

I am enthusiastic about the next steps of development. I intend to go through another review process once the tutorials are developed and uploaded. By that time I will have more sophisticated use cases to pilot, and perhaps will have found an HTML expert to evaluate my work. I would also like to try a think-aloud method, in which I sit with the user and have them articulate their thought processes and procedural steps so I can observe how they approach the site.