My first app in the App Store!
This application is a handy tool for optometrists, opticians and students. It contains the full setup of the devices used in the OEP 21-point system, as well as the Efron Grading Scales for contact lens complications.
The app was created as a replacement for the paper notes we were asked to make at HUBrussel, where I study for the bachelor's degree in Optics and Optometry. I often forgot those notes, so I started wondering whether there would be a way to get them on my iPhone, since I never forget my iPhone…
I hope it can help you as much as it does me!
Download it in the App Store.
A lot of people are negative about Facebook, Twitter and the like. They should all just read this quote and think about the truth in it…
“Social media is bringing back humanity to all digital life. We are no longer users, consumers, shoppers. We are all people again.”
ReadWriteWeb, January 2009
Of course I understand the concerns about privacy, but my philosophy is: if you don't want anyone to know something, don't put it on the web!
The final presentation of Learning 2.0 with Special Needs
The poster of my thesis can be found here.
The presentation I gave today can be found below.
The first is a demo of the mockup and the second of the partly implemented application. Turn up your volume to hear the text-to-speech in the second demo.
You can download the paper “Filling the Digital Divide Gaps in Learning 2.0 with Special Needs” here as a PDF.
Since I was only able to test the mockup with one person, I contacted an expert to give his opinion on the application and to check whether it can be used by most persons with a mental disability. When I showed him the application, his immediate reaction was really positive. He thinks this application can really help close the social gap some more for a lot of people with a mental disability.
On the dilemma of selecting text first and then reading it aloud, or the other way around, the expert responded that it might be better to select first and then read aloud, because that is consistent with other reading applications such as Sprint and Kurzweil, which some of them already use. For the buttons, a solution might be to read the label aloud when tapped and still navigate to the page anyway. Facebook does not really have a deep hierarchy of pages, so going back to the previous page is not that difficult in this application.
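The expert's button suggestion, "speak the label and still navigate", could be sketched as follows. The real app was built against the iOS SDK of the time, so this is only a language-agnostic sketch in TypeScript; every name here (`SpeechEngine`, `PageNavigator`, …) is made up for illustration.

```typescript
// A minimal sketch (hypothetical names) of the expert's suggestion:
// tapping a navigation button reads its label aloud *and* still navigates.

interface SpeechEngine {
  speak(text: string): void;
}

interface NavButton {
  label: string;
  destination: string;
}

class PageNavigator {
  currentPage = "News Feed";

  constructor(private speech: SpeechEngine) {}

  // Read the label first, then open the page, so a tap never "only" speaks.
  tap(button: NavButton): void {
    this.speech.speak(button.label);
    this.currentPage = button.destination;
  }
}
```

Because the page hierarchy is shallow, the cost of navigating on every tap stays low: going back is always one step.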
Other suggestions were to use Sclera pictograms for the statuses, with a sub-hierarchy: for example “I feel…”, which when selected leads to happy, sad,… The same could be done for some standard texts to put on friends' walls, like “I would like to…” or “Let's go to…”. Another nice addition would be the possibility to change the font and font size, because the preference for how a person reads best really differs from person to person.
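That pictogram sub-hierarchy is essentially a small tree of choices. A hedged sketch, with all names (`PictogramNode`, `composeStatus`) invented for illustration:

```typescript
// Hypothetical sketch of the suggested two-level status menu built from
// Sclera-style pictograms: "I feel…" opens sub-choices such as happy or sad.

interface PictogramNode {
  text: string;              // caption shown under the pictogram
  children?: PictogramNode[];
}

const statusMenu: PictogramNode = {
  text: "Status",
  children: [
    { text: "I feel…", children: [{ text: "happy" }, { text: "sad" }] },
    { text: "I would like to…" },
    { text: "Let's go to…" },
  ],
};

// Join the chosen path of pictograms into the status text to post.
function composeStatus(path: PictogramNode[]): string {
  return path.map((node) => node.text).join(" ");
}
```

The same structure would work for the standard wall texts: each top-level phrase just gets its own list of completions.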
Yesterday I tested the mockup shown in the previous post with a person with a mental disability. I adapted the mockup a little more, so the text in the tab bar is now replaced by icons. The person I tested the mockup with is 20 years old.
Scenarios for persons with a mental disability:
- Who was the last one that posted something on your wall?
- Let something be read out loud (for example a wall post)
- Update your status (the app is set up so that the person is not allowed to do this, so a popup will ask for a password => what is the reaction?)
- Whose birthday is it today?
- When is event X?
- Can you read out loud what person X said to person Y?
- Start a new Facebook chat
- Can you see who are your friends?
- Of which groups are you a member?
Afterwards I asked them some questions:
Persons with a disability:
- Have you ever used Facebook?
- Do you have a Facebook account yourself? If not, interested in one?
- Any experience with a touchscreen? Could be a smartphone,…
- Any with the iPad?
- Do you think it would be handy to have a button that reads all the statuses of your friends aloud, or do you prefer choosing yourself which statuses you want to hear?
- Any remarks?
The goal of the evaluation was again to see whether the application is usable by people with a mental disability, who might think and look at the world in a totally different way. Things that seem logical to me might not be logical to them, and the other way around! This time the evaluation used a mockup, so it should go better than with the paper prototype.
I told the test users that this was a think-aloud test, so they had to say what they were thinking, which helps me understand why they were doubting about something, their suggestions,…
I handed over the iPad with the application open, and the first thing the person did was try to scroll through the news feed, which, because it is a mockup, is not possible yet. Still, it was nice to see that the person with the mental disability knows how to work with iOS devices.
Then I asked him to do the different scenarios:
- The first step was to get to his profile, so the person tried to tap his picture, which is quite logical, but it was not implemented that way, and I asked if there might be another way to do this. The person then immediately tapped the profile tab, pointed out the correct person that had posted something on his wall and even pressed the speaker so it would be read out loud.
- For the reading out loud I hadn't explained anything, so when I asked to do this for something without a small speaker icon, the person tried to select some text because he thought something might appear then. I told him there might be another way and hinted at the box icon. He then tried to select the text again and pressed the speaker icon. When I explained how it actually works, it also seemed logical to him; he was just thinking in another order.
- The person immediately found the update status bar, but when he tapped it and the password prompt appeared, he immediately tapped somewhere else to dismiss it, because he thought he had done something wrong. I then explained that he had done it right, but that the application can be set up to offer some protection for certain actions.
- He had some trouble finding this. He saw a number next to the chat icon and was focused on that being the day, so he first thought the pictures from the chat were people with a birthday today. Then he saw the gift icon and associated it with a birthday, but didn't understand the text. He pressed the text because he thought that should give the birthdays; I said that would also be a way to see the birthdays, but that he should search less far. Then he saw the pictures but said that those persons had already had their birthday, so it couldn't be them… which is kind of correct, since he knows the persons in the pictures. This is an example of how difficult it sometimes is to test certain things (as with the paper prototype), because they have trouble imagining something that is not real. They can “only” think really logically: these persons don't have their birthday today, so it cannot be that…
- He immediately saw the calendar icon and used the speaker to read it out loud.
- For this the person again selected the large speaker icon and selected the text. When I asked whether there would have been another way, the person immediately responded by selecting the small speaker icon.
- He first clicked on the friends icon, then groups and then messages. He did not find the solution, because the concept of chatting was new to him.
- He first selected the profile, but then went directly to the friends icon and immediately tried to scroll through the pictures with the Cover Flow mechanism. He said it was like his music on the iPod.
- He tapped the correct icon, but once on the page he wasn't sure whether it was the right one.
Then I asked the questions:
- Yes, always used it with his mother.
- Yes, he has an iPod touch, as already mentioned.
- No idea
- Quote: “I think you deserve a 10/10 for what you have made.” Hopefully everyone thinks so!
What I can conclude from this evaluation is that, apart from some small interface changes, the application is usable by at least one person with a mental disability. I hope to be able to test the mockup with some more persons with a mental disability very soon.
It has been a while since I posted an update, because of a lot of project deadlines and exams, but the last couple of days I have been working on the mockup of my application. I made it with Interface Builder in Xcode, using some screenshots from other applications and from the Facebook website. The plan is to have the mockup finished by tomorrow and then test it with some persons.
The segmented control at the top still needs to be changed, because it should show pictograms instead of text.
To create this I had to make controllers and views to get some interaction, so I designed the following class diagram. The next step, after checking whether the mockup works, will be designing a model and further implementing the controllers.
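Since the class diagram itself is not reproduced in this post, here is a rough sketch of what such a controller/view split could look like, kept language-agnostic in TypeScript. The actual classes were built for iOS, so `WallPost`, `FeedView` and `FeedController` are purely illustrative names.

```typescript
// Hypothetical sketch of an MVC-style split: a model type, a view
// interface the controller talks to, and a controller wiring them up.

interface WallPost {
  author: string;
  text: string;
}

// The controller only depends on this interface, keeping views swappable
// (a real screen, a mockup, or a test double).
interface FeedView {
  show(posts: WallPost[]): void;
}

class FeedController {
  private posts: WallPost[] = [];

  constructor(private view: FeedView) {}

  // Model data flows through the controller into the view.
  load(posts: WallPost[]): void {
    this.posts = posts;
    this.view.show(posts);
  }

  // Supports scenarios like "who posted last on your wall?"
  get latestPost(): WallPost | undefined {
    return this.posts[0];
  }
}
```

Keeping the model behind the controller like this is what makes the planned next step, designing a real model, possible without touching the views.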