Usability Session

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The experts below are selected from a list of 3,819 experts worldwide, ranked by the ideXlab platform

Daniel Guldenring - One of the best experts on this subject based on the ideXlab platform.

  • a usability evaluation of medical software at an expert conference setting
    Computer Methods and Programs in Biomedicine, 2014
    Co-Authors: Raymond Bond, Dewar D Finlay, Chris D Nugent, George Moore, Daniel Guldenring
    Abstract:

    Introduction: A usability test was employed to evaluate two medical software applications at an expert conference setting. One application is a medical diagnostic tool (an electrocardiogram [ECG] viewer) and the other is a medical research tool (an electrode misplacement simulator [EMS]). These novel applications have yet to be adopted in the healthcare domain; thus, we wanted to determine (1) the potential user acceptance of these applications and (2) the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting.

    Methods: The medical diagnostic tool (ECG viewer) was evaluated by seven delegates and the medical research tool (EMS) by 17 delegates, all recruited at the 2010 International Conference on Computing in Cardiology. Each delegate was asked to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were captured using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from participants while they attempted the predefined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire, respectively.

    Results: The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, since 10 min of each session was dedicated to the pre-test and post-test questionnaires, the average time dedicated to interacting with the medical software was 24.69 min (SD=10.28). Having shown that usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, because delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts, which would otherwise be very expensive to arrange. A conference-based approach also allows data to be collected over a few days as opposed to months, by avoiding the administrative duties normally involved in a laboratory-based approach (e.g. mailing invitation letters as part of a recruitment campaign). Following analysis of the user video recordings, 41 previously unknown use errors were identified in the advanced ECG viewer and 29 in the EMS application. Each use error was given a consensus severity rating by two independent usability experts. On a 4-point scale (where 1=cosmetic and 4=critical), the average severity rating was 2.24 (SD=1.09) for the ECG viewer and 2.34 (SD=0.97) for the EMS application. We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the applications; for example, six out of seven tasks were completed by all participants when using both applications, which alone suggests both applications already have a high degree of usability. In addition to the video recordings, we extracted data from the questionnaires: on a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications.

    Conclusion: This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain. It has also shown that both medical diagnostic and medical research software can be evaluated for usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based one is the high concentration of experts at one location, which is convenient, less time-consuming and less expensive.
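The session-time and severity figures reported above reduce to simple descriptive statistics. The sketch below reproduces that arithmetic in Python; the `example_ratings` list is purely hypothetical, since the abstract reports only aggregate means and SDs, not the raw per-error ratings:

```python
from statistics import mean, pstdev

# Figures reported in the abstract (minutes).
MEAN_SESSION_MIN = 34.69   # mean total session length
QUESTIONNAIRE_MIN = 10.0   # fixed time spent on pre/post questionnaires

def mean_interaction_time(mean_session: float, questionnaire: float) -> float:
    """Subtracting a constant shifts the mean but leaves the SD unchanged,
    which is why both figures share SD = 10.28 in the abstract."""
    return mean_session - questionnaire

def summarize_severity(ratings: list[int]) -> tuple[float, float]:
    """Mean and population SD of consensus severity ratings on the
    1 (cosmetic) to 4 (critical) scale used in the study."""
    return mean(ratings), pstdev(ratings)

# Hypothetical ratings for illustration only.
example_ratings = [1, 2, 2, 3, 4]
print(mean_interaction_time(MEAN_SESSION_MIN, QUESTIONNAIRE_MIN))
print(summarize_severity(example_ratings))
```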

Allison Gates - One of the best experts on this subject based on the ideXlab platform.

  • evaluating user perceptions of mobile medication management applications with older adults: a usability study
    JMIR mHealth and uHealth, 2014
    Co-Authors: Kelly A Grindrod, Allison Gates
    Abstract:

    Background: Medication nonadherence has a significant impact on the health and wellbeing of individuals with chronic disease. Several mobile medication management applications are available to help users track, remember, and read about their medication therapy. Objective: The objective of this study was to explore the usability and usefulness of existing medication management applications for older adults. Methods: We recruited 35 participants aged 50 and over to take part in a 2-hour usability session. Ages ranged from 52 to 78 years (mean 67 years), and 71% (25/35) of participants were female. Each participant was provided with an iPad loaded with four medication management applications: MyMedRec, DrugHub, Pillboxie, and PocketPharmacist. These applications were evaluated using the 10-item System Usability Scale (SUS) and a visual analog scale. An investigator-moderated 30-minute discussion followed and was recorded. We used a grounded theory (GT) approach to analyze the qualitative data. Results: When assessing the mobile medication management applications, participants struggled to think of a need for them in their own lives. Many were satisfied with their current management system and proposed future use only if their cognition and health declined. Most participants felt capable of using the applications after a period of time and training, but were frustrated by their initial experiences. These early experiences highlighted the benefits of linear navigation and clear wording (eg, "undo" vs "cancel") when designing for older users. Although there was no order effect, participants attributed their poor performance to the order in which they tried the applications. They also described being part of a technology generation that did not encounter the computer until adulthood.
    Of the four applications, PocketPharmacist was found to be the least usable, with a score of 42/100 (P<.0001), though it offered a drug interaction feature that was among participants' favorites. The usability scores for MyMedRec (56/100), DrugHub (57/100), and Pillboxie (52/100) were not significantly different, and participants preferred MyMedRec and DrugHub for their simple, linear interfaces. Conclusions: With training, adults aged 50 and over can be capable of and interested in using mHealth applications for their medication management. However, in order to adopt such technology, they must find a need that their current medication management system cannot fill. Interface diversity and multimodal reminder methods should be considered to increase usability for older adults. Lastly, regulation or the involvement of older adults in development may help to alleviate generation bias and mistrust of applications. [JMIR Mhealth Uhealth 2014;2(1):e11]
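The 0-100 scores quoted above come from the conventional published SUS scoring rule, which is mechanical enough to sketch. This is the standard procedure, not code from the study itself, and the sample input is illustrative rather than study data:

```python
def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale scoring: ten items rated 1-5.
    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating).
    The summed contributions are scaled by 2.5 onto a 0-100 range."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten ratings on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based
                for i, r in enumerate(responses))
    return total * 2.5

# A uniformly neutral respondent (all 3s) scores 50.
print(sus_score([3] * 10))  # 50.0
```

The per-application figures reported above (e.g. 42/100 for PocketPharmacist) would be aggregates of such per-participant scores.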

Raymond Bond - One of the best experts on this subject based on the ideXlab platform.

  • a usability evaluation of medical software at an expert conference setting
    Computer Methods and Programs in Biomedicine, 2014
    Co-Authors: Raymond Bond, Dewar D Finlay, Chris D Nugent, George Moore, Daniel Guldenring

George Moore - One of the best experts on this subject based on the ideXlab platform.

  • a usability evaluation of medical software at an expert conference setting
    Computer Methods and Programs in Biomedicine, 2014
    Co-Authors: Raymond Bond, Dewar D Finlay, Chris D Nugent, George Moore, Daniel Guldenring

Chris D Nugent - One of the best experts on this subject based on the ideXlab platform.

  • a usability evaluation of medical software at an expert conference setting
    Computer Methods and Programs in Biomedicine, 2014
    Co-Authors: Raymond Bond, Dewar D Finlay, Chris D Nugent, George Moore, Daniel Guldenring