Published on 19.10.18 in Vol 6, No 10 (2018): October
Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/11076, first published May 25, 2018.
Teleconsultation Using Mobile Phones for Diagnosis and Acute Care of Burn Injuries Among Emergency Physicians: Mixed-Methods Study
Background: The referral process in acute care remains challenging in many areas including burn care. Mobile phone apps designed explicitly for medical referrals and consultations could streamline the referral process by using structured templates and integrating features specific to different specialties. However, as these apps are competing with commercial chat services, usability becomes a crucial factor for successful uptake.
Objective: The aim of this study was to assess the usability of a mobile phone app for remote consultations and referrals of burn injuries.
Methods: A total of 24 emergency doctors and 4 burns consultants were recruited for the study. A mixed-methods approach was used including a usability questionnaire and a think-aloud interview. Think-aloud sessions were video-recorded, and content analysis was undertaken with predefined codes relating to the following 3 themes: ease of use, usefulness of content, and technology-induced errors.
Results: The users perceived the app to be easy to use and useful, but some problems were identified. Issues relating to usability were associated with navigation, such as scrolling and zooming. Users also had problems in understanding the meaning of some icons and terminologies. Sometimes, some users felt limited by predefined options, and they wanted to be able to freely express their clinical findings.
Conclusions: We found that users faced problems mainly with navigation when the app did not work in the same way as the other apps that were frequently used. Our study also resonates with previous findings that when using standardized templates, the systems should also allow the user to express their clinical findings in their own words.
JMIR Mhealth Uhealth 2018;6(10):e11076
The referral process between primary health care and specialized services remains challenging for several reasons, particularly in resource-constrained settings where specialists are in short supply. Inappropriate or delayed referrals result in inefficient use of resources, both financial and human; more importantly, they may result in suboptimal care for patients.
Electronic referral and consultation systems have been suggested as a promising replacement for paper-based referrals. Better medical decisions can be made when all relevant patient and clinical information is available to both the referring clinician and the specialist. Moreover, with electronic systems, communication and referrals are faster, and the quality of the information exchange is improved by using standardized templates rather than referral letters with illegible handwriting.
One area where over- and under-referrals are common is burn injury care. Two core components in the referral process of burn injuries are accurate diagnosis by the initial provider and the ability to effectively communicate these findings to a specialist for referral and management advice. Due to the visual nature of burn wounds, burn care has a tradition of utilizing image- and video-based telemedicine to enhance clinical practice and improve patient outcomes. In the past, telemedicine systems have relied on expensive and bulky infrastructure, which makes implementation difficult in resource-limited settings. Furthermore, burn injuries remain a significant problem, especially in low- and middle-income countries. For the best possible outcomes in this group of patients, decisions on management and facility destination (whether or not to transfer) must be made in a timely manner. As burns are often difficult for inexperienced doctors to assess, remote assistance can be crucial for further management and final disposition. Inaccurate estimation of the size and depth of the burn can lead to over- or under-resuscitation with fluids, with adverse effects. Therefore, effective consultation is paramount.
In recent years, as an ad hoc solution to the shortcomings in the referral and consultation process, communication via instant messaging on mobile phones has increased among health care workers. This is exemplified by the increasing number of reports on the use of chat services such as WhatsApp (WhatsApp Inc) for clinical consultations both within and between hospitals. The use of such chat services has often evolved spontaneously from a need for simpler and faster channels of communication. Reported benefits include shorter response times, flattening of hierarchies, and the ability to break down geographical barriers. Areas in which the use of these chat services is particularly appealing are those with a prominent visual component, including diagnosis and management of acute burns. However, there are some issues with using services such as WhatsApp in medical practice. One drawback is that the information sent is less structured and often lacks patient identifiers, making it hard to keep track of which patient is being discussed. In addition, the information, including images, is not documented within the hospital information system. Furthermore, although some authors emphasize the security of WhatsApp due to its end-to-end encryption, patient-related information, including images, is nonetheless stored on users’ phones, and it is ultimately up to the user to delete messages when they are no longer needed. Another problem is that, by default, WhatsApp saves images and videos to the user’s photo gallery, and depending on the settings, images may be uploaded to other third-party cloud services. On the other hand, deleting messages makes it impossible to audit the information in the future. There have been attempts to resolve these issues by developing apps intended explicitly for medical consultations.
Apps such as these can be tailored to contain a premade form for adding demographic and clinical information, a chat function, and other features. This approach could allow for smoother consultations, as the referring clinician is prompted to provide the information that the consultant requires.
Regardless of the app being used, besides proper implementation, it is essential to assess user needs and perceptions. The most important aspect of technology acceptance and uptake among users is perceived usefulness—“the degree to which a person believes that using a particular system would enhance his/her job performance.” Another critical factor, especially for continued use, is ease of use—“the degree to which a person believes that using a particular system would be free from effort.” The International Organization for Standardization (ISO 9241-11) defines usability as the “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”
From our experience in implementing a mobile phone-based referral and consultation system for acute burn injuries in South Africa, uptake has been rather slow, with physicians still choosing to call the burns consultant or send a message via general text messaging apps (primarily WhatsApp). For example, in a recent report from Cape Town, South Africa, WhatsApp was the preferred method for communicating clinical findings in pediatric burn care. Although there are several reasons for the low uptake of new systems, usability is one important aspect to consider. Therefore, the aim of this study was to assess the usability of a mobile phone app for remote consultations and referrals of burn injuries.
The Vula App
The Vula app (Vula Mobile, Mafami Pty Ltd) is a mobile phone app for remote consultation and referrals between emergency doctors at the point of care and specialists. The app runs on both iPhone and Android operating systems and can be downloaded free of charge. It was introduced in the Western Cape, South Africa, in 2014 and now allows for referrals to 15 specialties such as ophthalmology, orthopedics, dermatology, and burns. The app currently handles over 5000 referrals per month, with 62 specialist teams actively taking referrals on Vula. The ability to refer patients with burns was introduced in April 2016, and the app handled around 250 burns referrals during 2017. The app provides a template for each specialty, including patient and clinical data (screenshot 1). There are specific features related to each specialty, the ability to take photographs, and a chat function. The burns section of the app includes a feature to draw the burn on a depiction of a body, which calculates burn size and fluid requirements (screenshots 2, 3, and 4). This feature also allows the burns consultant to see the location, size, and depth of the burn. All the documented information, including photos, is saved only within the app and not elsewhere on the phone. The form and the features were established in collaboration with burns experts and emergency specialists before the development of the burns section of the app. When a referring clinician has completed the form, the information is uploaded to a secure server and shared with the burns specialist on call, who is notified. The specialist reviews the information on their mobile phone or computer and sends back treatment and referral advice via either predefined medical advice options or an instant messaging function. After the consultation is complete, the specialist can archive the referral, which removes it from both the referring doctor’s and the specialist’s devices.
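The app's fluid calculation can be illustrated with the widely used Parkland formula; whether Vula implements this exact protocol is not stated in the paper, so the sketch below is an assumption for illustration only.

```python
def parkland_fluids(weight_kg: float, tbsa_percent: float) -> dict:
    """Estimate 24-hour crystalloid resuscitation volume for a burn patient.

    Illustrative sketch using the Parkland formula:
    4 mL x body weight (kg) x %TBSA, with half of the volume given in the
    first 8 hours post-injury and the rest over the following 16 hours.
    The Vula app's actual protocol may differ (assumption).
    """
    total_ml = 4 * weight_kg * tbsa_percent
    return {
        "total_24h_ml": total_ml,
        "first_8h_ml": total_ml / 2,
        "next_16h_ml": total_ml / 2,
    }

# Example: a 70 kg patient with two 3% burns (6% TBSA in total,
# as in this study's test scenario)
print(parkland_fluids(70, 6))
```

For small burns such as the one in the test scenario, intravenous resuscitation would typically not be indicated, which matches the participants' comments reported in the Results section.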
Users are mandated to take the necessary precautions in accordance with the “South African Protection of Personal Information Act, 2013” and to act in accordance with national and local legislation. The user is prompted by the app to make sure that the patient has consented to data being stored electronically. As with any consultation, whether it is face-to-face or telephonic, both the referring doctor and the specialist have to make the appropriate medical documentation as mandated by the health care organization where they operate. In this study, participants assessed the app on their own device.
Participants and Setting
Cape Town has experienced rapid urbanization, with a large part of its population living in suboptimal housing characterized by crowded living conditions, lack of running water, lack of electricity, and houses built out of flammable materials. It has a high burden of burn injuries, with an estimated mortality rate of 7.9 per 100,000 person-years. In Cape Town, there are 2 hospitals with specialized burns units serving the Western Cape province. Depending on the burn severity and day-to-day capacity, patients can be treated at local clinics or referring hospitals. Furthermore, to improve referral of patients in the region, referral criteria have been implemented. However, studies indicate that inappropriate or delayed referrals are still commonplace in this region.
This study was conducted in the emergency centers at 2 health facilities in Cape Town: 1 district hospital and 1 community health clinic. Participants were doctors sampled by convenience at the emergency centers of Khayelitsha Hospital and Gugulethu Community Health Centre. The primary focus of this study was on emergency doctors; however, an additional 4 burns consultants who were registered with the app were also selected by convenience sampling and interviewed to explore their view of the app. Although training is available upon request, no training is required to use the app. All participants had used the app for other specialties, and some had used it for burns consultations. Among the participants, 6 had never used the app for burns, 14 had used it fewer than 5 times, and 4 had used it more than 5 times. The burns consultants had all used the app for consultations at least 10 times before the interview.
This study was a mixed-methods study including a qualitative approach using the think-aloud method and a questionnaire measuring different usability metrics. As the app is already in use, the goal of this usability study was to identify usability problems related to the Vula app to improve the existing app.
A think-aloud test was conducted with each participant to assess the usability of the app. The purpose of think-aloud protocols is to have users think aloud while performing a set of tasks; the users are asked to verbalize whatever they are looking at, thinking, doing, and feeling. The think-aloud method is suitable for generating data on the cognitive processes involved in performing a set of tasks. Participants were given a case description of a patient with 2 burn injuries on the right forearm: 1 on the back (posterior side) and 1 on the front (anterior side). The burns were equal in size, each covering about 3% of the body surface; 1 was full thickness and 1 partial thickness. The interviewer, who acted as the patient, had drawn on the arm with a marker to indicate location, size, and depth to facilitate an examination. The participants were asked to use the app as they would in a real situation. Task descriptions for the emergency doctors and for the burns specialists are shown in the tables below. A camera (GoPro Hero4, GoPro, Inc, San Mateo, California) was mounted to each participant’s chest to record their hands holding the mobile phone. The interviews took place during working hours in either an empty examination room or a break room and lasted 10 to 20 min. Data collection took place in December 2016 and August 2017. During the second period of data collection, we extended the interview by going through the app once again, with the interviewer asking about the participant’s thoughts on the different parts and further questions about specific problems encountered.
At the end of the interview, the participants completed the previously validated Health Information Technology Usability Evaluation Scale (Health-ITUES). The Health-ITUES is a customizable questionnaire that subjectively measures the usability of eHealth tools. Each question can be customized to address the specific type of eHealth tool, the type of user, and the specific tasks that users are expected to perform using the system in a specified context. In this study, user=emergency staff, tool=app, task=management of burns, and context=emergency care services. The questionnaire covers 4 different domains: quality of work life, perceived usefulness, perceived ease of use, and user control. In total, the questionnaire contains 20 questions measured on a 5-point Likert scale, ranging from strongly disagree (1) to strongly agree (5), with a non-applicable option.
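The descriptive analysis of such Likert-scale data amounts to computing per-item means while excluding non-applicable responses. A minimal sketch (the response data and item grouping below are hypothetical, not the study's actual data):

```python
from statistics import mean

# Hypothetical 5-point Likert responses: one list per participant,
# None = "non-applicable" (excluded from the descriptive statistics).
responses = [
    [5, 4, None, 5],
    [4, 4, 3, 5],
    [5, 5, 4, 4],
]

def item_means(rows):
    """Mean score per questionnaire item, ignoring non-applicable answers."""
    n_items = len(rows[0])
    return [
        round(mean(r[i] for r in rows if r[i] is not None), 2)
        for i in range(n_items)
    ]

print(item_means(responses))
```

Construct-level scores, as reported in the results table, would then be means over the items belonging to each construct.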
| Task identification | Tasks associated with the Vula app for referring doctors |
| --- | --- |
| Task 1 | Launch app from home screen |
| Task 2 | Choose new referral |
| Task 3 | Choose “Burns” in the list of specialties |
| Task 4a | Choose doctor on call at the referral facility |
| Task 4b | For first-time users: press the button saying “Yes, I am allowed to refer” |
| Task 5 | Fill out patient information |
| Task 6 | Choose cause of burn from list |
| Task 7 | Select how long ago the burn injury happened |
| Task 8 | Click on “Calculate burn area” |
| Task 9 | On the “Calculate burn area” page, draw the burn on the picture and click save |
| Task 10 | A box will appear with the fluid calculation; the percentage of the burn area will be displayed as well as burn depth |
| Task 11 | If applicable, select any of the conditions in the list under medical history |
| Task 12 | Take one or more photos of the burn injury |
| Task 13 | Add comments |
| Task 14 | Click “Refer” or “Save and refer later” |
| Task identification | Tasks associated with the Vula app for the specialists |
| --- | --- |
| Task 1 | Launch app from home screen |
| Task 2 | Select the new referral from the list of referrals |
| Task 3 | Review the information about the referral |
| Task 4 | Go into the “Send advice” page |
| Task 5 | Select where the patient should be treated from the drop-down list |
| Task 6 | Choose fluid resuscitation protocol |
| Task 7 | Choose drugs to be given |
| Task 8 | Choose recommended dressings |
| Task 9 | Write further instructions, if any |
| Task 10 | Write your assessment of the total burn surface area |
| Task 11 | Write your assessment of the burn depth |
| Task 12 | Click the “Send advice” button (the user will be taken to the chat window, where the selected information has been compiled into a message) |
| Task 13 | If necessary, chat with the referring doctor |
| Theme | Description |
| --- | --- |
| Usability-related aspects | These codes are used to describe usability problems and issues identified when analyzing video usability data. The codes focus on aspects of the user interface and the user system |
| Usefulness of content codes | These codes are used to describe issues regarding the usefulness of the user interface or system being evaluated from analyzing the data |
| Safety- and technology-induced error codes | These codes are used to identify and tag errors made by users when analyzing data |
Video recordings were analyzed with MAXQDA (Version 12, VERBI Software, Berlin, Germany), software for the analysis of qualitative data that supports both video and text. First, the audio recordings from the video were transcribed verbatim. In MAXQDA, this can be done using timestamps, giving easy access to each audio and video segment that correlates with the text segment in the transcript. The videos can be analyzed by coding directly into the timeline when viewing the video. Content analysis with predefined codes developed by Kushniruk and Borycki was used for coding the data. The codes cover 3 themes: usability, usefulness, and safety- and technology-induced errors (see the coding scheme above). The coding scheme was adapted to suit this study: codes were added if findings did not fit the predefined codes, and some codes were omitted if they did not apply to the material. AK and SCF independently coded all interviews and discussed the labeling of each coded event. PYY worked in the final stage with AK to ensure the codes were accurately assigned to each scenario. The data from the questionnaire were analyzed using descriptive statistics with SPSS (Version 23, IBM Corp, Armonk, NY).
All participants were presented with both verbal and written information about the study before the interview, and written consent was obtained. Consent forms with participant identifiers were separated from the collected data. Data are presented on an aggregated level. The study was approved by the University of Cape Town, Faculty of Health Sciences Human Research Ethics Committee.
Characteristics of Study Participants
Overall, 24 emergency doctors were included in the study, all of whom were working in the emergency centers at the 2 health facilities. Demographics of the study participants are presented below. Most participants were rotating, that is, they were on a short-term allocation to the emergency center. Experience in emergency care varied from 3 months to 7 years, with a median of 1 year. Participants rated their experience with burn care as minimal (n=11), moderate (n=10), extensive (n=1), or none (n=2). All participants used mobile phones for both private and work purposes, including sending images, using apps, and browsing the internet for reference on medical conditions, referral criteria, and drug dosage calculations. The mobile phones were also used to discuss cases with seniors via either instant messaging or voice calls.
Findings From Think-Aloud Sessions
All participants were able to complete all tasks without major difficulties. However, several usability issues were identified. Usability issues were classified as usability problems, usefulness of content, or safety- and technology-induced errors, as described in the Methods section. Codes and the frequency of problems encountered in each domain are presented below.
The video analysis revealed 149 problems related to the usability of the app. Most of these problems were related to navigation, consistency, meaning of icons and terminology, and a lack of user instructions. The majority of the problems occurred in the section where the user is prompted to draw the burn on a depiction of a body (Task 9). In this part of the app, the most prevalent problems were related to navigation, specifically to zooming and moving the picture.
| Characteristic | Value |
| --- | --- |
| Gender, n (%) | — |
| Age (in years) | — |
| Experience in emergency care (in months), median (SD) | 12 (18) |
| Experience in burn care, n (%) | — |
| Code | Definition of code | Times usability problem occurred, n | Users experiencing problems (N=24), n (%) |
| --- | --- | --- | --- |
| Consistency | Relates to aspects of consistency in the user interface | 23 | 10 (42) |
| Font | Relates to aspects of font size or text readability | 1 | 1 (4) |
| Graphics | Relates to aspects of graphics of the system | 4 | 3 (13) |
| Lack of user instructions^a | Relates to aspects of lack of user instructions | 17 | 10 (42) |
| Layout | Relates to aspects of the layout of screens or information on those screens | 6 | 5 (21) |
| Meaning of icons/terminology | Relates to aspects of understanding language or labels used in the interface | 20 | 12 (50) |
| Navigation | Relates to aspects of moving through a system or user interface | 30 | 14 (58) |
| Overall ease of use | Coded when the user makes comments on the overall ease of use of the system | 27 | 9 (38) |
| Speed/response time | Relates to aspects of system speed or response time | 1 | 1 (4) |
| Understanding instructions | Relates to aspects of understanding user instructions | 5 | 5 (21) |
| Visibility of system status | Relates to aspects of understanding what the system is doing | 15 | 11 (46) |
| **Usefulness of content codes** | | | |
| Accuracy/correctness | Relates to aspects of the accuracy or correctness of information or advice provided by the system | 19 | 14 (58) |
| Overall usefulness^a | Coded when a user makes comments on the overall usefulness of the system | 5 | 3 (13) |
| Relevance | Relates to aspects of the relevance of information and features to the user carrying out their task | 30 | 13 (54) |
| **Safety- and technology-induced error codes** | | | |
| Mistake | Coded when a review of the data indicates the user has made a mistake that is not corrected | 7 | 4 (17) |
| Slip | Coded when a review of the video data indicates the user has made a mistake but corrects the mistake | 34 | 20 (83) |
| Work-around | Coded when the user is not using the approach to carrying out work that is recommended by the health care organization or computer system | 8 | 8 (33) |

^a New code added to the original coding scheme.
The users often tried to move the picture using one finger instead of two, which resulted in accidental marks that had to be erased (point 6). The users expressed varying degrees of frustration when this happened. A few users mentioned that they did not see the textbox saying “Pinch to zoom, use two fingers to scroll” (point 2), which relates to the codes lack of user instructions and understanding user instructions. This textbox is only visible when entering the page and disappears when the user starts drawing. Even users who said they saw it and knew they had to use two fingers to scroll would still use one finger out of habit:
So initially it took me a while. Just navigating was a bit hard, because I want to go there, want to move up, and then I kind of you know want to move. I know that you need to put two fingers I’m just not used to it.
Other problems in this section were related to the layout and the meaning of icons and terminology. Some users did not see the erase button and, as a work-around, canceled and re-entered (point 4). A few users also did not see the button for turning the body around to display the back of the patient, resulting in both burns being drawn on the same side (point 5):
And then also didn’t initially see that there is a turn body to do the other side.
Similarly, a few users did not know that it was possible to change the color to indicate different burn depths, indicating a lack of user instructions. There was also a problem with the visibility of system status in this part: users tried to press the buttons for burn depth even though they were already selected, which is indicated by a line right beneath the button (point 1). Some users also thought the percentage indicators at the bottom of the screen were buttons to select burn depth (point 3):
I didn’t see that you could change the color of the paint, I just used to write it in the description.
Some users were also confused about the calculation of the burn surface, and some users did not agree with the calculated percentage in relation to their perception of how big the burn was:
It’s quite difficult to mark with like a finger, because I think with what I’ve drawn and what you have [on your arm] it’s quite a difference, in that it looks like you have a bigger burn on the diagram than what you have.
Overall, most users found the drawing function to be a useful way to convey the information about burn surface and depth. However, a few users did not find it easy to use and suggested other ways of conveying the burn surface area:
This is too difficult in my opinion because when you try to zoom and then suddenly it draws I’m not a big fan of this I’m not going to lie, I rather just work it out by myself [laughing] than use this.
Usefulness of Content
Many users had comments or experienced issues relating to the content of the app, concerning either relevance or accuracy and correctness. One recurring theme was that they often could not provide information asked for in the app, such as the patient’s phone number, weight, or time of injury:
...time of injury is obviously very very important, which I think also gets guesstimated quite a bit here in this area.
The users did not indicate that any information asked for in the app was redundant. Instead, they suggested more fields or options to provide information in some areas that they thought would be relevant. For example, a section to provide the overall condition of the patient, such as other trauma, or a section where they could describe what management had already been undertaken:
I would rather like this part again to rather to be, type in because this is like limited choice.
I don’t think there was a section that says what your management has been so far. I think that is quite helpful just to kind of let the other doctor know what you have done.
Some issues were related to the accuracy and correctness of some of the information. For example, some users had problems selecting the cause of the burn and were debating whether hot coffee should be classified as a hot liquid or as hot water. Although most users did not seem to struggle when choosing the cause, some discrepancy remains; while 8 users chose hot water, 16 chose hot liquid:
...ok cause of burn, I would choose hot water burn, coffee is basically hot water, but I find that a bit ambiguous
In addition, some users thought that the list of comorbidities was too limited. Many users suggested that there should be a free text field below each of these sections where they could provide more information.
After the user has finished and saved the drawing of the burn, a box with the fluid calculation is displayed. The users thought this was a useful feature, but some users noted that this was not relevant information, as they would not give intravenous fluid to a patient with a small burn like in the test scenario:
So, the one thing it doesn’t speak about is whether you can use, so a minor burn like that I wouldn’t necessarily go with IV-fluids.
Many users also talked about how the ability to send pictures was one of the most relevant parts of the app. Most users did not encounter any problems when taking the photos. However, one user thought that the app would automatically save photos to the phone library and was confused when he could not find them. Another user chose to take photos before starting the app and then import the pictures:
I’m going to add the photos from the gallery, because I have the photos already, ehm, so that’s the, where is the picture now? Ok, seems I lost the images.
At the bottom of the form, there is a field for additional comments, which many users used to write a message to the consultant with information they thought would be relevant. These messages often included information that was already filled in, such as age and gender of the patient or the size of the burn. Many users also used this field to specify that the burn was the result of hot coffee.
Safety- and Technology-Induced Error Codes
This theme included slips, mistakes, and work-arounds. One of the most common issues was users making a slip while using the drawing section. Although this did not cause any significant problems, the users found it frustrating. Only a few users made mistakes that they seemed unaware of making. One user selected an expert who was not on call from the list; in a real scenario, this could have resulted in no response. When using the drawing tool, one user only circled the burn without coloring it in, which resulted in a 0% calculation; the user then manually typed in a percentage estimated at 9%. As some users did not know the body could be flipped around, they drew both burns on the same side of the body, also resulting in a smaller calculated burn area. Slips, mistakes, and work-arounds were most often the result of an underlying usability problem such as understanding user instructions, the meaning of icons or terminology, or visibility of system status.
During the think-aloud sessions, some users made comments about the app that were not specifically related to the design or functions of the app itself. For example, some users said they would not use the app on a small burn such as the one in the test scenario, that is, they did not find the app useful for smaller burns:
Since it’s not a circumferential burn, I would consider admitting this patient to our surgical team and for them to debride the patient rather than referring the patient. So, I wouldn’t refer the patient via the app.
Furthermore, when talking about their previous experience using the app, some users said that the specialists were slow to respond, and sometimes they would call them if they had not heard back from them in a while.
Questions and scores relating to each construct in the Health-ITUES questionnaire are presented below. In general, the app scored high on most of the constructs, with the construct “ease of use” scoring the highest. Usefulness also scored high; however, the questions relating to the ability to receive management advice in a timely manner tended to score lower (items 5, 10, and 12). The construct “user control” scored the lowest, especially the items relating to error prevention (items 18 and 19).
Interviews With Burns Consultants
A total of 4 interviews were conducted with burns consultants who used the app to give diagnostic support to emergency doctors. The first screen the consultants see displays all the information about the patient sent by the referring doctor. From this page, they can go into “Send advice,” go into “Chat,” or review photos. Some consultants did not use the app in the way it was intended, mostly finding it easier to initiate a chat with the referring doctor instead. In one case, the consultant was unaware of the send advice function. One participant said that predefined options are good in general, but that the options in this app were not very useful:
It irritates me that I need to tell them the dose. So, if I choose morphine, I have to write the dose. That’s standard protocol, the nurse or doctor should know this, or use another app, or it should be in this app.
They all found the pictures to be very beneficial as they could then more accurately assess the burn depth and size. However, 1 consultant noted that many doctors do not clean the burns or remove blisters before taking the pictures, which makes it harder for the consultant to assess the burn. The consultant suggested that the app should instruct the user about this before a picture is taken. One consultant seemed unaware that the referring doctor could take pictures within the app and would usually ask them to send pictures through WhatsApp instead.
| Item | Concept | Score (1-5) | Cronbach alpha |
| --- | --- | --- | --- |
| Quality of work life | | 4.42 | .76 |
| 1. I think the app has improved the emergency staff’s ability to care for burns | System impact – career mission | 4.67 | —a |
| 2. I think the app has been a positive addition to burn care at the hospital | System impact – organizational level | 4.46 | — |
| 3. The app is an important part in the acute management of burns | System impact – personal level | 4.13 | — |
| 4. Using the app makes it easier to receive expert advice on management of burns | Productiveness | 4.5 | — |
| 5. Using the app enables me to receive burn management advice more quickly | Productiveness | 3.70 | — |
| 6. Using the app makes it more likely that I have sufficient knowledge on how to manage acute burns | Productiveness | 4.00 | — |
| 7. Using the app is useful for receiving information about burn management | General usefulness | 4.13 | — |
| 8. I think that the app presents a more equitable process for burn management | General usefulness | 4.38 | — |
| 9. I am satisfied with the app for receiving information on burn management | General satisfaction | 4.13 | — |
| 10. I can receive information on burn management in a timely manner because of the app | Performance speed | 3.88 | — |
| 11. Using the app increases receiving information about burn management | Productiveness | 4.33 | — |
| 12. I am able to receive advice on burn management whenever I use the app | Information needs | 3.88 | — |
| Perceived ease of use | | 4.64 | .74 |
| 13. I am comfortable with my ability to use the app | Competency | 4.71 | — |
| 14. Learning to operate the app is easy for me | Learnability | 4.63 | — |
| 15. It is easy for me to become skillful at using the app | Competency | 4.67 | — |
| 16. I find the app easy to use | Ease of use | 4.54 | — |
| 17. I can always remember how to log on and use the app | Memorability | 4.67 | — |
| 18. The app gives me error messages that clearly tell me how to fix problems | Error prevention | 2.67 | — |
| 19. Whenever I make a mistake using the app, I recover easily and quickly | Error prevention | 3.87 | — |
| 20. The information (such as on-screen messages and other documentation) provided with the app is clear | Information needs | 4.33 | — |
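As a side note on the reliability figures reported for each construct, Cronbach alpha can be reproduced from raw item responses with the standard formula. A minimal sketch in Python, using hypothetical Likert responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach alpha for internal consistency.

    items: one list per questionnaire item, each holding one score per
    respondent (respondents in the same order across all items).
    """
    k = len(items)

    def variance(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_variances = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return (k / (k - 1)) * (1 - item_variances / variance(totals))

# Hypothetical 1-5 Likert responses: 3 items, 4 respondents
responses = [[4, 5, 3, 5], [4, 5, 3, 4], [5, 5, 4, 5]]
print(round(cronbach_alpha(responses), 2))  # → 0.91
```

Values above roughly .7, as for both constructs in the table, are conventionally taken to indicate acceptable internal consistency.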
Although the Health-ITUES questionnaire showed satisfactory usability, the think-aloud evaluation revealed several important usability problems that should be considered when improving this particular app and when designing other apps for remote consultations.
Most issues occurred in the drawing feature, which is a central feature of the app that allows the users to describe the size, depth, and location of the burn. For the burns consultant, this information is of great importance when assessing a burn. Some users expressed frustration, and a few even said they thought this feature was too difficult to use. Conversely, many users said that this was one of the strengths of the app, suggesting it may be a central feature that is not present in other communication apps. The consultants also mentioned that being able to see the location of the burn visually is very important for them to do their assessment. This resonates with the findings by Blom et al in their study on expectations of burn specialists about image-based teleconsultation .
Given that the drawing function is one of the central features but also where most users struggled, it calls for some reflection. Any added feature needs to be well justified, that is, it must increase the usefulness of the app, but at the same time, it needs to be easy to use, that is, free of effort. Rust et al argue that adding extra features may be attractive but can lead to a product that is overly complex with decreased usability, resulting in feature fatigue .
Other usability problems were related to the meaning of icons and their functionalities. Although design changes can improve these, it also demonstrates the importance of user testing with the intended users. Ehrler et al, who found similar issues assessing a mobile app to support nurses, suggest both design changes and also training to mitigate some of these problems .
Another finding related to the drawing feature was that some users did not agree with the calculated burn surface area and said that they would have assessed the burn to be larger had they not used the app. It is well documented in the literature that nonburns specialists tend to overestimate burn surface area when assessing the burn visually [, ]. One way to make this more apparent in the app would be to indicate the percentage on each body part as a reminder of the burn surface area.
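A per-body-part percentage display of this kind presupposes a mapping from body regions to surface-area percentages. A minimal sketch of the idea using the adult rule of nines; the region names and the calculation here are illustrative assumptions, not the app's actual implementation:

```python
# Adult "rule of nines" percentages per body region (a common clinical
# heuristic; the app's actual body map and calculation may differ).
RULE_OF_NINES = {
    "head_and_neck": 9.0,
    "anterior_trunk": 18.0,
    "posterior_trunk": 18.0,
    "right_arm": 9.0,
    "left_arm": 9.0,
    "right_leg": 18.0,
    "left_leg": 18.0,
    "perineum": 1.0,
}

def tbsa(selected):
    """Total body surface area (%) burned, given regions marked in a drawing.

    selected maps region name -> fraction of that region burned (0.0-1.0),
    so partial involvement of a body part can be expressed.
    """
    return sum(RULE_OF_NINES[region] * fraction
               for region, fraction in selected.items())

# Example: half of one arm and the whole anterior trunk burned
print(tbsa({"right_arm": 0.5, "anterior_trunk": 1.0}))  # → 22.5
```

Displaying each region's contribution (4.5% and 18% here) alongside the drawing would make the running total transparent to the referring doctor.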
One critical part of usability testing of medical apps is to identify problems that could result in harm to the patient. For example, some users made mistakes that led to either a smaller or larger calculated burn percentage, which in turn could have adverse effects if the discrepancy is significant. This is one reason why supplementing with photos is important, so that the burns consultant can make their own assessment of the burn.
Usefulness of Content
Although it is important that new technologies are easy to use, the usefulness of the content of a system is equally or even more important to end users . Although the participants found the content of the app to be useful, many users thought that the app lacked some information or options. First, many users felt limited by the predefined choices and often wanted to describe more about the injury. For example, the medical conditions that are asked for in the app are limited to a few that are of interest to the burn specialist, but some users thought this list could have been extended. In a similar vein, many users were also unsure whether hot coffee should be classified as “hot liquid, not water” or “hot water.” Both these options are meant to indicate that the burn was a scald. “Hot liquids, not water” is meant to represent what can be referred to as a dense liquid burn, caused by liquids such as milk and oil [ ]. In contrast to less dense liquids such as water, dense liquids retain more heat and adhere to the skin longer because of their higher viscosity; therefore, such burn injuries may result in deeper burns. Although coffee or tea, for example, is not just water, it has largely the same properties, as many users also mentioned. However, as one participant pointed out, this changes if milk is added. In terms of the usability of the app, this relates to the meaning of icons and terminology, as well as to information needs. At best, this may only cause confusion and irritation, but there could be instances where such misunderstandings have implications for the patient.
Despite a field for additional comments at the bottom of the form, some users suggested that there should be a possibility to specify with free text under each subsection when the options were too limiting. We did not further explore why some participants felt like this, but one explanation could be that at this point in the user test, they were unaware of the comment box that is located at the end of the form. A study by Hysong et al of e-referral systems reported similar findings where the primary health practitioners felt constrained by the use of templates and that they were not able to communicate findings clearly . Similarly, in the tests with the burn consultants, some of them said they did not like the predefined options in their present form and would rather use the chat function instead. Hysong et al, on the other hand, found that specialists thought that more rigid templates with mandatory fields would enhance the quality of the referrals [ ]. Other studies have found that feedback from consultants is more consistent and timely when referral templates are used [ , ]. One interesting finding regarding the comment box was that many participants used this to write a message to the specialist repeating much of the information already filled out. These messages would often be of a friendly nature including phrases such as “dear Dr” or “kind regards.” This highlights the fact that a consultation is not merely an exchange of information but also a collaboration that requires personal communication [ ].
Our findings indicate that both emergency doctors and consultants wanted more flexibility within the system. Flexible systems can make it easier to transfer information about special cases, reducing the need for additional phone and text communication. However, when designing a system, the need for flexibility must be balanced against usability. Although it might be tempting to design a system for every scenario to maximize flexibility, one needs to take into account that as the flexibility of a system increases, so does its complexity, and consequently its usability decreases. This is often termed the flexibility-usability trade-off , and when making this trade-off, it is important to understand the users’ needs, both present and future. This is, however, not always possible, especially with newer technologies. In addition, as acute burn injuries can manifest in different ways, it is difficult to design a template that will fit the clinical presentation of every single patient. Esquivel et al outlined 10 recommendations for improving the effectiveness of electronic referrals [ ]. One recommendation that our findings support is to design and use standardized electronic referral templates that include both structured and free text fields. They argue that when designing electronic referral templates, there must be both structured fields to capture required information and free text fields for providers to freely expand on their clinical findings.
The usability tests with the consultants revealed that the 2 more experienced users found the “Send advice” function cumbersome and would rather just chat with the referring doctor. The 2 less experienced consultants seemed to be unaware of some of the functionalities in the app, such as the “Send advice” function or how to access the photos, suggesting that the user interface may not be clear for new users.
Significance of Findings
Although the participants in our study perceived the burns section of the app to be useful and easy to use, it is still not used on a regular basis. However, there are other reasons for low uptake that are not related to the usability of the app. These include, but are not limited to, awareness of the app, resistance to new technologies, and attitudes among colleagues . Even though traditional means of discussing and referring patients with burns by telephone consultations and paper-based referrals are still prevalent, recent reports from South Africa have described the use of WhatsApp as a mode of communication for clinical decision support [ , ]. The fact that WhatsApp is very prevalent may be a reason why a specific app for medical referrals is not widely adopted. For the referring doctor seeking advice, the most important aspect will inevitably be to receive advice back from the consultant in a timely manner. Therefore, the consultants have an important role to play as pioneers for apps such as the Vula app. Nikolic et al note in their recent study that the widespread use of WhatsApp may impede the introduction of other communication apps designed for medical consultations [ ]. Future research should focus on identifying the barriers to using mobile phone-based referral and consultation systems.
The methods and equipment for data collection used in this study were simple, low cost, and allowed for quick data collection. None of the participants objected or showed any signs of being uncomfortable wearing the recording equipment. The video coding scheme was useful and covered most aspects. However, we changed the original definitions to cover both positive and negative aspects of the events and not only focus on problems. By doing this, the data generated more findings concerning things that the users liked about the app. We also added some codes that we thought were not covered in the original coding scheme. For example, “Lack of user instructions” was added as an extension to the code “Not understanding user instructions.”
During the think-aloud session, the interviewer tried not to interfere, except for encouraging the interviewee to keep talking. However, we found that the participants would mostly verbalize what they were doing but not what they were thinking and feeling. Consequently, during the second period of data collection, we extended the interview by going through the app once again where the interviewer asked about their thoughts on the different parts of the app.
Although the users rated the app to be easy to use, the think-aloud interviews found several problems related to the ease of use. This is not surprising as the purpose of a usability questionnaire is to assess the users’ overall perception of the app and not identify specific problems. It is, however, useful for comparing different user groups, different designs of the app, or how user perceptions change over time.
As this was a mixed-methods study spanning both emergency burn care and health informatics, we think it was a strength that the authors come from different disciplines, including public health, political science, sociology, emergency medicine, nursing, and health informatics. Another strength of the study was the large number of participants in the user tests. A limitation was the relatively homogenous sample, where all users were young junior doctors. Sonderegger et al found that age affects several usability measures such as speed and accuracy . Cillessen et al studied the user satisfaction of a clinical notes app and found differences only by medical specialty but no differences by sex, age, professional experience, or training hours [ ]. Nevertheless, our sample does represent the typical user from our study population.
Mobile referral and communication systems are a relatively new concept that could simplify the consultation and referral process within burn care and other specialties. However, when a new system is competing with other technologies such as instant messaging apps or traditional phone calls, usefulness and ease of use become highly important. Although the Health-ITUES questionnaire showed high satisfaction with the system, the think-aloud interviews provided additional insights for further improvements. The users liked the ability to describe the burns through both the built-in drawing function as well as with photographs. For the burn consultants, the drawings and the photos were considered the most valuable information for their assessment. We also found that users had problems with navigation in the drawing section when the app did not act in the same way as other apps that were frequently used. There was also some confusion regarding terminology and the meaning of buttons and icons. Our study also resonates with previous findings that when using standardized electronic consultation and referral templates, the system should also allow the users to freely express their clinical findings in their own words.
The authors are grateful to the doctors who gave their valuable time and feedback. They especially thank Sa’ad Lahri and Jennie Morgan for organizing the interviews at each hospital. The study was funded by the Swedish Research Council.
AK, MH, PYY, and LAW contributed to the design of the study; AK, MH, PYY, and SCF contributed to the development of measurement tools; AK collected data and LAW assisted with logistics in the field. AK and SCF contributed to data analysis, and PYY assisted with unresolved codes. AK, SCF, and MH contributed to interpretation of the results. AK drafted the manuscript. All authors read, commented on, and approved the final manuscript.
Conflicts of Interest
- Mehrotra A, Forrest CB, Lin CY. Dropping the baton: specialty referrals in the United States. Milbank Q 2011 Mar;89(1):39-68 [FREE Full text] [CrossRef] [Medline]
- Naseriasl M, Adham D, Janati A. E-referral solutions: successful experiences, key features and challenges-a systematic review. Mater Sociomed 2015 Jun;27(3):195-199 [FREE Full text] [CrossRef] [Medline]
- Esquivel A, Sittig DF, Murphy DR, Singh H. Improving the effectiveness of electronic health record-based referral processes. BMC Med Inform Decis Mak 2012 Sep 13;12:107 [FREE Full text] [CrossRef] [Medline]
- Chipp E, Walton J, Gorman D, Moiemen NS. Adherence to referral criteria for burns in the emergency department. Eplasty 2008 May 09;8:e26 [FREE Full text] [Medline]
- Doud AN, Swanson JM, Ladd MR, Neff LP, Carter JE, Holmes JH. Referral patterns in pediatric burn patients. Am Surg 2014 Sep;80(9):836-840. [Medline]
- Klingberg A, Wallis L, Rode H, Stenberg T, Laflamme L, Hasselberg M. Assessing guidelines for burn referrals in a resource-constrained setting: demographic and clinical factors associated with inter-facility transfer. Burns 2017 Aug;43(5):1070-1077. [CrossRef] [Medline]
- den Hollander D, Albert M, Strand A, Hardcastle TC. Epidemiology and referral patterns of burns admitted to the Burns Centre at Inkosi Albert Luthuli Central Hospital, Durban. Burns 2014 Sep;40(6):1201-1208. [CrossRef] [Medline]
- Carter JE, Neff LP, Holmes JH. Adherence to burn center referral criteria: are patients appropriately being referred? J Burn Care Res 2010;31(1):26-30. [CrossRef] [Medline]
- Reiband HK, Lundin K, Alsbjørn B, Sørensen AM, Rasmussen LS. Optimization of burn referrals. Burns 2014 May;40(3):397-401. [CrossRef] [Medline]
- Wallace DL, Smith RW, Pickford MA. A cohort study of acute plastic surgery trauma and burn referrals using telemedicine. J Telemed Telecare 2007;13(6):282-287. [CrossRef] [Medline]
- Atiyeh B, Masellis A, Conte C. Optimizing burn treatment in developing low- and middle-income countries with limited health care resources (part 1). Ann Burns Fire Disasters 2009 Sep 30;22(3):121-125 [FREE Full text] [Medline]
- Saffle JR, Edelman L, Theurer L, Morris SE, Cochran A. Telemedicine evaluation of acute burns is accurate and cost-effective. J Trauma 2009 Aug;67(2):358-365. [CrossRef] [Medline]
- Turk E, Karagulle E, Aydogan C, Oguz H, Tarim A, Karakayali H, et al. Use of telemedicine and telephone consultation in decision-making and follow-up of burn patients: Initial experience from two burn units. Burns 2011 May;37(3):415-419. [CrossRef] [Medline]
- Holt B, Faraklas I, Theurer L, Cochran A, Saffle JR. Telemedicine use among burn centers in the United States: a survey. J Burn Care Res 2012;33(1):157-162. [CrossRef] [Medline]
- Syed-Abdul S, Scholl J, Chen CC, Santos MD, Jian WS, Liou DM, et al. Telemedicine utilization to support the management of the burns treatment involving patient pathways in both developed and developing countries: a case study. J Burn Care Res 2012;33(4):e207-e212. [CrossRef] [Medline]
- Atiyeh B, Dibo SA, Janom HH. Telemedicine and burns: an overview. Ann Burns Fire Disasters 2014 Jun 30;27(2):87-93 [FREE Full text] [Medline]
- Boccara D, Bekara F, Soussi S, Legrand M, Chaouat M, Mimoun M, et al. Ongoing development and evaluation of a method of telemedicine: burn care management with a smartphone. J Burn Care Res 2018 Jun 13;39(4):580-584. [CrossRef] [Medline]
- Combi C, Pozzani G, Pozzi G. Telemedicine for developing countries. A survey and some design issues. Appl Clin Inform 2016 Nov 02;7(4):1025-1050 [FREE Full text] [CrossRef] [Medline]
- Rybarczyk MM, Schafer JM, Elm CM, Sarvepalli S, Vaswani PA, Balhara KS, et al. A systematic review of burn injuries in low- and middle-income countries: epidemiology in the WHO-defined African Region. Afr J Emerg Med 2017 Mar;7(1):30-37. [CrossRef]
- Harish V, Raymond AP, Issler AC, Lajevardi SS, Chang LY, Maitz PK, et al. Accuracy of burn size estimation in patients transferred to adult Burn Units in Sydney, Australia: an audit of 698 patients. Burns 2015 Feb;41(1):91-99. [CrossRef] [Medline]
- Baartmans MG, van Baar ME, Boxma H, Dokter J, Tibboel D, Nieuwenhuis MK. Accuracy of burn size assessment prior to arrival in Dutch burn centres and its consequences in children: a nationwide evaluation. Injury 2012 Sep;43(9):1451-1456. [CrossRef] [Medline]
- Thom D. Appraising current methods for preclinical calculation of burn size - a pre-hospital perspective. Burns 2017 Feb;43(1):127-136. [CrossRef] [Medline]
- Goverman J, Bittner EA, Friedstat JS, Moore M, Nozari A, Ibrahim AE, et al. Discrepancy in initial pediatric burn estimates and its impact on fluid resuscitation. J Burn Care Res 2015;36(5):574-579. [CrossRef] [Medline]
- Chan QE, Barzi F, Cheney L, Harvey JG, Holland AJ. Burn size estimation in children: still a problem. Emerg Med Australas 2012 Apr;24(2):181-186. [CrossRef] [Medline]
- Parvizi D, Kamolz LP, Giretzlehner M, Haller HL, Trop M, Selig H, et al. The potential impact of wrong TBSA estimations on fluid resuscitation in patients suffering from burns: things to keep in mind. Burns 2014 Mar;40(2):241-245. [CrossRef] [Medline]
- Dulhunty JM, Boots RJ, Rudd MJ, Muller MJ, Lipman J. Increased fluid resuscitation can lead to adverse outcomes in major-burn injured patients, but low mortality is achievable. Burns 2008 Dec;34(8):1090-1097. [CrossRef] [Medline]
- Stahl I, Katsman A, Zaidman M, Keshet D, Sigal A, Eidelman M. Reliability of smartphone-based instant messaging application for diagnosis, classification, and decision-making in pediatric orthopedic trauma. Pediatr Emerg Care 2017 Jul 11. [CrossRef] [Medline]
- Giordano V, Koch HA, Mendes CH, Bergamin A, de Souza FS, do Amaral NP. WhatsApp Messenger is useful and reproducible in the assessment of tibial plateau fractures: inter- and intra-observer agreement study. Int J Med Inform 2015 Feb;84(2):141-148. [CrossRef] [Medline]
- Gulacti U, Lok U, Hatipoglu S, Polat H. An analysis of WhatsApp usage for communication between consulting and emergency physicians. J Med Syst 2016 Jun;40(6):130. [CrossRef] [Medline]
- Johnston M, Mobasheri M, King D, Darzi A. The Imperial Clarify, Design and Evaluate (CDE) approach to mHealth app development. BMJ Innov 2015 Mar 31;1(2):39-42. [CrossRef]
- Mars M, Scott RE. WhatsApp in clinical practice: a literature review. Stud Health Technol Inform 2016;231:82-90. [Medline]
- den Hollander D, Mars M. Smart phones make smart referrals: the use of mobile phone technology in burn care - a retrospective case series. Burns 2017 Feb;43(1):190-194. [CrossRef] [Medline]
- Mars M, Scott RE. Being Spontaneous: the future of telehealth implementation? Telemed J E Health 2017 Sep;23(9):766-772. [CrossRef] [Medline]
- Eksert S, Aşık MB, Akay S, Keklikçi K, Aydın FN, Çoban M, et al. Efficiency of instant messaging applications in coordination of emergency calls for combat injuries: a pilot study. Ulus Travma Acil Cerrahi Derg 2017 May;23(3):207-211 [FREE Full text] [Medline]
- Kamel Boulos MN, Giustini DM, Wheeler S. Instagram and WhatsApp in health and healthcare: an overview. Future Internet 2016 Jul 26;8(3):1-14. [CrossRef]
- Martinez R, Rogers AD, Numanoglu A, Rode H. The value of WhatsApp communication in paediatric burn care. Burns 2018 Jun;44(4):947-955. [CrossRef] [Medline]
- Thomas K. Wanted: a WhatsApp alternative for clinicians. BMJ 2018 Dec 26;360:k928. [Medline]
- Pecina JL, Wyatt KD, Comfere NI, Bernard ME, North F. Uses of mobile device digital photography of dermatologic conditions in primary care. JMIR Mhealth Uhealth 2017 Nov 08;5(11):e165 [FREE Full text] [CrossRef] [Medline]
- Wyatt KD, Willaert BN, Pallagi PJ, Uribe RA, Yiannias JA, Hellmich TR. PhotoExam: adoption of an iOS-based clinical image capture application at Mayo Clinic. Int J Dermatol 2017 May 11;56(12):1359-1365. [CrossRef] [Medline]
- Börve A, Dahlén Gyllencreutz J, Terstappen K, Johansson Backman E, Aldenbratt A, Danielsson M, et al. Smartphone teledermoscopy referrals: a novel process for improved triage of skin cancer patients. Acta Derm Venereol 2015 Feb;95(2):186-190 [FREE Full text] [CrossRef] [Medline]
- Paik AM, Granick MS, Scott S. Plastic surgery telehealth consultation expedites Emergency Department treatment. J Telemed Telecare 2017 Feb;23(2):321-327. [CrossRef] [Medline]
- Cinaglia P, Tradigo G, Guzzi PH, Veltri P. Design and implementation of a telecardiology system for mobile devices. Interdiscip Sci 2015 Sep;7(3):266-274. [CrossRef] [Medline]
- Gagnon MP, Ngangue P, Payne-Gagnon J, Desmartis M. m-Health adoption by healthcare professionals: a systematic review. J Am Med Inform Assoc 2016 Jan;23(1):212-220. [CrossRef] [Medline]
- Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 1989 Sep;13(3):319. [CrossRef]
- Iso.org. Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts URL: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en [accessed 2018-09-17] [WebCite Cache]
- Maritz D, Wallis L, Van Der Merwe E, Nel D. The aetiology of adult burns in the Western Cape, South Africa. Burns 2012 Feb;38(1):120-127. [CrossRef] [Medline]
- Van Niekerk A, Laubscher R, Laflamme L. Demographic and circumstantial accounts of burn mortality in Cape Town, South Africa, 2001-2004: an observational register based study. BMC Public Health 2009 Oct 06;9:374 [FREE Full text] [CrossRef] [Medline]
- Stander M, Wallis LA. The emergency management and treatment of severe burns. Emerg Med Int 2011;2011:161375 [FREE Full text] [CrossRef] [Medline]
- Jaspers MW, Steen T, van den Bos C, Geenen M. The think aloud method: a guide to user interface design. Int J Med Inform 2004 Nov;73(11-12):781-795. [CrossRef] [Medline]
- Yen PY, Wantland D, Bakken S. Development of a customizable health IT usability evaluation scale. AMIA Annu Symp Proc 2010;2010:917-921 [FREE Full text] [Medline]
- Kushniruk AW, Borycki EM. Development of a video coding scheme for analyzing the usability and usefulness of health information systems. Stud Health Technol Inform 2015;218:40599. [Medline]
- Blom L, Laflamme L, Mölsted Alvesson H. Expectations of medical specialists about image-based teleconsultation - a qualitative study on acute burns in South Africa. PLoS One 2018;13(3):e0194278 [FREE Full text] [CrossRef] [Medline]
- Rust RT, Thompson DV, Hamilton RW. Defeating feature fatigue. Harv Bus Rev 2006 Feb;84(2):98-107, 165. [Medline]
- Ehrler F, Weinhold T, Joe J, Lovis C, Blondon K. A mobile app (BEDSide Mobility) to support nurses' tasks at the patient's bedside: usability study. JMIR Mhealth Uhealth 2018 Mar 21;6(3):e57 [FREE Full text] [CrossRef] [Medline]
- Keil M, Beranek PM, Konsynski BR. Usefulness and ease of use: field study evidence regarding task considerations. Decis Support Syst 1995 Jan;13(1):75-91. [CrossRef]
- Bayramoglu A, Sener MT, Cakir Z, Aslan S, Emet M, Akoz A. Characteristics of patients who admitted to the emergency department because of burns due to dens liquids such as hot milk/oil. Eurasian J Med 2016 Feb;48(1):20-23 [FREE Full text] [CrossRef] [Medline]
- Hysong SJ, Esquivel A, Sittig DF, Paul LA, Espadas D, Singh S, et al. Towards successful coordination of electronic health record based-referrals: a qualitative analysis. Implement Sci 2011 Jul 27;6:84 [FREE Full text] [CrossRef] [Medline]
- Jenkins S, Arroll B, Hawken S, Nicholson R. Referral letters: are form letters better? Br J Gen Pract 1997 Feb;47(415):107-108 [FREE Full text] [Medline]
- Tan GB, Cohen H, Taylor FC, Gabbay J. Referral of patients to an anticoagulant clinic: implications for better management. Qual Health Care 1993 Jun;2(2):96-99 [FREE Full text] [Medline]
- Zuchowski JL, Rose DE, Hamilton AB, Stockdale SE, Meredith LS, Yano EM, et al. Challenges in referral communication between VHA primary care and specialty care. J Gen Intern Med 2015 Mar;30(3):305-311 [FREE Full text] [CrossRef] [Medline]
- Dvořák O, Pergl R, Kroha P. Tackling the flexibility-usability trade-off in component-based software development. In: Rocha Á, Correia AM, Adeli H, Luis Paulo R, Costanzo S, editors. Recent Advances in Information Systems and Technologies. New York City: Springer International Publishing; 2017:861-871.
- Nikolic A, Wickramasinghe N, Claydon-Platt D, Balakrishnan V, Smart P. The use of communication apps by medical staff in the Australian health care system: survey study on prevalence and use. JMIR Med Inform 2018 Feb 09;6(1):e9 [FREE Full text] [CrossRef] [Medline]
- Sonderegger A, Schmutz S, Sauer J. The influence of age in usability testing. Appl Ergon 2016 Jan;52:291-300. [CrossRef] [Medline]
- Cillessen FH, de Vries Robbé PF, Biermans MC. A hospital-wide transition from paper to digital problem-oriented clinical notes. A descriptive history and cross-sectional survey of use, usability, and satisfaction. Appl Clin Inform 2017 Dec 17;8(2):502-514. [CrossRef] [Medline]
Health-ITUES: Health Information Technology Usability Evaluation Scale
Edited by G Eysenbach; submitted 25.05.18; peer-reviewed by M Mars, J Pecina, K Blondon; comments to author 04.07.18; revised version received 18.07.18; accepted 18.07.18; published 19.10.18
©Anders Klingberg, Lee Alan Wallis, Marie Hasselberg, Po-Yin Yen, Sara Caroline Fritzell. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 19.10.2018.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.