Evaluation II

Yesterday I tested the mockup shown in the previous post with a person with a mental disability. I adapted the mockup a little more, so the text in the tab bar is now replaced by icons. The person I tested the mockup with is 20 years old.

Scenarios for persons with a mental disability:

  1. Who was the last person to post something on your wall?
  2. Have something read out loud (for example, a wall post).
  3. Update your status. (This is set up so that the person is not allowed to do this, so a popup will appear asking for a password. What is the reaction?)
  4. Whose birthday is it today?
  5. When is event X?
  6. Can you read out loud what person X said to person Y?
  7. Start a new Facebook chat.
  8. Can you see who your friends are?
  9. Of which groups are you a member?

Afterwards I asked them some questions:

Persons with a disability:

  1. Have you ever used Facebook?
  2. Do you have a Facebook account yourself? If not, would you be interested in one?
  3. Any experience with a touchscreen? It could be a smartphone,…
  4. Any experience with the iPad?
  5. Do you think it would be handy to have a button that reads all your friends' statuses aloud, or would you prefer to choose yourself which statuses you want to hear?
  6. Any remarks?

The goal of the evaluation was again to see whether the application is usable by people with a mental disability, who might think in a completely different way and look at the world differently. Things that seem logical to me might not be logical to them, and the other way around! This time the evaluation uses a mockup, so it should go better than with the paper prototype.

I told the test users that this was a think-aloud test, so they had to say out loud what they were thinking. That way I could understand why they were doubting about something, what suggestions they had, and so on.


Person 1:

I handed over the iPad with the application open, and the first thing the person did was try to scroll through the news feed, which, because it is a mockup, is not possible yet. Still, it was nice to see that the person with the mental disability knows how to work with iOS devices.

Then I asked him to do the different scenarios:

  1. The first step was to get to his profile, so the person tried to tap his picture, which is quite logical, but it was not implemented that way. I asked if there might be another way to do this. The person then immediately tapped the profile tab, pointed to the correct person who had posted something on his wall, and even pressed the speaker icon so it would be read out loud.
  2. For the reading out loud I hadn't explained anything, so when I asked him to do this for something without a small speaker icon, the person tried to select some text because he thought something might show up then. I told him there might be another way and hinted at the box icon. He tried to select the text again and then pressed the speaker icon. I then explained how it actually works, and it seemed logical to him too; he was just thinking in a different order.
  3. The person immediately found the status update bar, but when he tapped on it and the password prompt appeared, he immediately tapped somewhere else to dismiss it because he thought he had done something wrong. I then explained that he had been right, but that the application might be set up to offer some protection for certain actions.
  4. He had some trouble finding this. He saw a number next to the chat icon, focused on that being the day, and at first thought the pictures from the chat were people with a birthday today. Then he saw the gift icon and associated it with a birthday, but he didn't understand the text. He pressed the text because he thought it should show the birthdays; I said that was indeed also a way to see the birthdays, but that he shouldn't look so far. Then he saw the pictures, but said that those persons had already had their birthday, so it couldn't be them… which is actually correct, since he knows the persons in the pictures. This is an example of how difficult it sometimes is to test certain things (as with the paper prototype), because these users have trouble imagining something that is not real. They can "only" think very logically: these persons don't have their birthday today, so it cannot be them…
  5. He immediately saw the calendar icon and used the speaker to have it read out loud.
  6. For this the person again selected the large speaker icon and then selected the text. I asked if there would have been another way, and the person immediately responded by selecting the small speaker icon.
  7. He first clicked on the friends icon, then groups, and then messages. He did not find the solution because the concept of chatting was new to him.
  8. He first selected the profile, but then went directly to the friends icon and immediately tried to scroll through the pictures with the cover-flow mechanism. He said it was like his music on the iPod.
  9. He tapped the correct icon, but once on the page he wasn't sure whether it was the right one.
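The protection idea from scenario 3 (certain actions are locked and prompt for a caregiver password before they run) can be sketched in a few lines. This is a hypothetical illustration, not the mockup's actual implementation; the class and action names are mine.

```python
# Hypothetical sketch of the "protected action" concept from scenario 3:
# some actions (e.g. updating a status) are locked behind a caregiver
# password and refuse to run until the correct password is entered.

class ProtectedAction:
    def __init__(self, name, locked=False, password=None):
        self.name = name
        self.locked = locked
        self._password = password  # set by the caregiver, not the user

    def perform(self, entered_password=None):
        """Run the action, or refuse when the lock check fails."""
        if self.locked and entered_password != self._password:
            return f"'{self.name}' is locked: caregiver password required"
        return f"'{self.name}' performed"

# The status update is locked, mirroring the test setup in scenario 3.
update_status = ProtectedAction("Update status", locked=True, password="1234")
print(update_status.perform())        # refused: no password entered
print(update_status.perform("1234"))  # allowed: correct password
```

Dismissing the prompt (as the test user did) simply means the action never runs, so nothing is lost by tapping away.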

Then I asked the questions:

  1. Yes, he has always used it with his mother.
  2. Yes
  3. Yes, he has an iPod Touch, as already mentioned.
  4. No
  5. No idea
  6. Quote: "I think you deserve a 10/10 for what you have made." Hopefully everyone thinks this!

What I can conclude from the evaluation with this person is that, apart from some small interface changes, the application is usable by at least one person with a mental disability. I hope to be able to test the mockup with more persons with a mental disability very soon.
