
8 Software usability study

8.1 Introduction

The same participants from the first usability study completed the second study. The purpose of the second study was to assess how well ClickIt had been implemented and to gather information about what could still be implemented.

8.2 Purpose of the study

The second usability study was used to test the system, identify issues and bugs within ClickIt, trial ClickIt on several different platforms and obtain valuable feedback from potential users of the final version. The main measurable goal was to determine whether or not the participants preferred using ClickIt to writing HTML code.

8.3 Hypotheses

By the end of this second study, I expect to discover:

  • Participants have a better knowledge of how HTML works, particularly nesting and the indentation required, than they did before the study.
  • Participants have a good understanding of how to use ClickIt.
  • Participants find ClickIt easier than the text editor they are used to using in school (Adobe Brackets).

8.4 Equipment used

The computer systems that the participants used were all HP EliteDesk G1 systems which had the following specifications:

  • Intel Core i5 2.1GHz 4 cores (mixed models, some with Intel HyperThreading technology)
  • 8GB DDR3 RAM
  • Windows 7 and the Google Chrome browser

The second class also tested the system on Acer C730 Google Chromebooks with the following specifications:

  • Intel Celeron 2840 2.16GHz
  • 4GB DDR3 RAM
  • Chrome OS with the Google Chrome browser

8.5 Procedure

The procedure for this usability test was as follows. First of all, the statement of consent (this can be found in Appendix A) was read to the participants and they were asked if they still wished to continue with the study.

Participants were asked to open a web browser of their choice (they could choose between Google Chrome and Internet Explorer 11, both of which support ClickIt).

Once the web browser was open, they were asked to open the online survey, type in its passcode, and then open the ClickIt Creator.

They were then given two minutes and thirty seconds to get used to the ClickIt application. After this, a sample of valid HTML code was put up on the projector and the participants were asked to reproduce the same code in ClickIt. This code can be found in Appendix C. Participants were asked to raise their hand on completion. After five minutes the code was taken down; everyone who had finished kept their hand up and waited for their code to be checked.

The second task was a more complex piece of code which involved HTML tables. For this task, the participants were given ten minutes.

In the second class, a second experiment also took place, in which the participants used Chromebooks to test the software. Almost all of the participants said they felt it performed better on the Chromebooks than it did on the Windows 7 desktop systems.

Finally, participants completed the same questionnaire as they did for the prototype. This can be found in Appendix D.

8.6 Results

Several issues were encountered with ClickIt on the school network, since required parts of the system were delayed or blocked by the school’s proxy server. One of the most critical parts affected was the first three blocks in the toolbox (the HTML, Comment and Text blocks). This was worked around by reloading the page several times.
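Since only individual downloads were being delayed, one possible mitigation would be to retry the failed requests rather than relying on the user reloading the whole page. The sketch below is hypothetical and is not taken from the ClickIt source; the function and parameter names are illustrative:

```javascript
// Hypothetical sketch: retry a block-definition download a few times
// before giving up, instead of requiring a full page reload.
// 'load' would be something like:
//   () => fetch('blocks/html.json').then(r => r.json())
async function loadWithRetry(load, attempts = 3, delayMs = 200) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await load(); // success: return the parsed block definition
    } catch (err) {
      lastError = err; // e.g. the request was blocked by the proxy
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
  throw lastError; // every attempt failed
}
```

On a slow or filtering proxy this trades a short delay for not losing any in-progress block layout to a page reload.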

Qualitative findings

The results reveal that the drag and drop feature was widely praised as being very helpful; however, several participants also complained that ClickIt had some issues when they were dragging and dropping elements, and that it was difficult to use compared with just writing the code.

The application interface was also described as being user-friendly and easy to use, and many praised the simplicity of ClickIt. Other points made included the fact that ClickIt itself was produced in HTML and that closing blocks are added automatically when opening blocks are added.
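The auto-closing behaviour can be sketched as below. This is a hypothetical model, not ClickIt's actual implementation; the flat array of {tag, kind} entries is an assumption made for illustration:

```javascript
// Hypothetical sketch of automatic closing blocks: when an opening
// block is dropped, its matching closing block is inserted immediately
// after it, so the pair can never become unbalanced.
function addOpeningBlock(blocks, tag, index) {
  blocks.splice(index, 0,
    { tag, kind: 'open' },
    { tag, kind: 'close' });
  return blocks;
}

const page = [];
addOpeningBlock(page, 'html', 0);
addOpeningBlock(page, 'head', 1); // lands between the html open/close pair
```

Inserting both halves of the pair in a single operation is what spares beginners the unclosed-tag errors they would hit in a plain text editor.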

Many of the participants suggested that when a user drops an element on a closing block, the element should be added after that block rather than inside it. It was clear from the study that many participants assumed this behaviour, and it was considered a major issue with ClickIt. There were a few complaints stating: "things weren't going in the right place". For instance:

Figure 19 - a sample of some code blocks which caused issues with participants

This issue occurs if a user has the blocks shown in Figure 19 in their ClickIt window and drops a 'Body' block on the 'Head closing tag' block. The result is a warning telling them that placing the body block inside the head is not possible. The message is shown in Figure 20.

The participants who had problems believed that dropping the 'Body' block here would result in the block being placed after the closing tag of the head block. This issue was reflected in the results.

Figure 20 - the dialogue that appears when a user drops a 'Body' block on a 'Head closing block'
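The validation step behind this dialogue can be sketched as follows. This is a hypothetical model; the containment table is an illustrative subset of the HTML content model, not ClickIt's actual rule set:

```javascript
// Hypothetical sketch of nesting validation on drop. The allowed-children
// table below is an illustrative subset of HTML's content model.
const allowedChildren = {
  html: ['head', 'body'],
  head: ['title', 'meta', 'link', 'comment'],
  body: ['p', 'table', 'comment', 'text'],
};

function canDrop(parentTag, childTag) {
  return (allowedChildren[parentTag] || []).includes(childTag);
}

function handleDrop(parentTag, childTag) {
  if (!canDrop(parentTag, childTag)) {
    // This is where a warning dialogue like Figure 20 would be raised.
    return `A '${childTag}' block cannot be placed inside a '${parentTag}' block.`;
  }
  return 'ok';
}
```

Given the participants' expectations, an alternative design would treat a drop on a closing block as an "insert after this element" action, falling back to the warning only when no valid position exists.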

Many also criticised ClickIt for making things almost too easy, and for therefore making it too easy to avoid learning the actual code that they need to learn. Some praised the fact that it was easy to use and claimed it "makes everything easier".

The step-by-step introductory tutorial added to ClickIt was praised overall but criticised for the lack of information given. Many participants used this feature repeatedly, but many felt that it did not give enough information and assistance.

The table builder was highlighted by a total of 12 participants as being the most useful feature of ClickIt and was overall considered the biggest reason to use ClickIt.

The results show that the most important factor in why participants would use ClickIt is the speed at which pages can be built, particularly with tables.

When the study was being conducted, I observed many of the participants were struggling with the drag and drop system, with many of them dropping the blocks far too early and before their targets. There were also some in the classroom whose browsers did not allow them to drag and drop, leaving them frustrated and blaming ClickIt. This was also reflected in the questionnaires afterward.
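Browsers with incomplete HTML5 drag and drop support could be detected up front, so the application could warn the user instead of failing silently. The following is a hypothetical sketch, not part of ClickIt; it takes the document object as a parameter so the check stays testable outside a browser:

```javascript
// Hypothetical sketch: feature-detect HTML5 drag and drop before
// enabling the block editor. HTML5 drag and drop exposes a 'draggable'
// property plus dragstart/drop event handler slots on elements.
function supportsDragAndDrop(doc) {
  const el = doc.createElement('div');
  return 'draggable' in el && 'ondragstart' in el && 'ondrop' in el;
}

// In a browser: if (!supportsDragAndDrop(document)) { /* show warning */ }
```

A check like this would have separated "ClickIt is broken" complaints from "this browser lacks drag and drop" cases in the feedback.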

In contrast with the prototype study, in which most participants preferred using the ClickIt block system to writing code, the opposite was true for the software version of the ClickIt block system.

Quantitative findings

Twelve participants preferred using ClickIt to writing code; however, sixteen were in favour of writing code. This may be down to the fact that many of the participants experienced issues with drag and drop, since the feedback in the survey was very critical of this particular feature of ClickIt. Another major criticism of ClickIt was that some blocks did not appear when it loaded. This issue has since been fixed, but it was perhaps down to the school network being inadequate to download all of the JSON files required.

The time on task for each participant was also recorded as the participants raised their hands. This was timed on a stopwatch and then added to the database afterward.

The range of times for the first task, without the ClickIt assistance features enabled (Appendix C, HTML Sample 1), was between 33 seconds and 130 seconds. The average time for this task was approximately 47 seconds.

The range of times for the second task, with the ClickIt assistance features enabled (Appendix C, HTML Sample 2), was between 42 seconds for the shortest time and 95 seconds for the longest. The average time spent on the second task was 55 seconds.

Whilst most of the subjects completed the first task significantly faster, the second task was considered easier, and feedback received from individuals made it clear that the table building tool made a more complicated task much easier than expected.

All participants asked stated that ClickIt was easier than writing HTML.

8.7 Conclusions drawn from the software usability study

Whilst the participants praised the concept of ClickIt and credited it as being useful for younger children, the general consensus within the three classes (aged 13, 14 and 16) was that it was not so useful for their own use; that is, it would not be as useful for students who are in their second year of school or above, or who have at least some HTML experience from previous years at school.

This can however be seen as a positive remark about how easy ClickIt is to use and that it could be useful for younger children.

The time spent working on each of the two tasks was also an indicator of how easy ClickIt is to use, since five minutes of time was allocated but all participants completed each of the tasks in less than two minutes.

It is clear that there are some issues with the way in which the drag and drop feature in ClickIt has been designed. This usability study has made these problems clear.

It was also clear from the results that the HTML5 drag and drop standard is still not implemented fully enough in some browsers for this application to be released as completely standards-compliant.
