Published on 27.07.2020 in Vol 3, No 1 (2020): Jan-Dec

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/16714.
Delivering Clinical Skin Examination Education to Nurse Practitioners Using an Internet-Based, Microlearning Approach: Development and Feasibility of a Video Intervention


Original Paper

1College of Nursing, The University of Arizona, Tucson, AZ, United States

2College of Nursing, Oregon Health & Science University, Portland, OR, United States

Corresponding Author:

Delaney B Stratton, PhD, DNP, FNP-BC

College of Nursing

The University of Arizona

1305 N Martin Ave

Tucson, AZ, 85721

United States

Phone: 1 4802136556

Email: dstratton@email.arizona.edu


Background: Skin cancer is the most common cancer; survival from the most serious skin cancer, malignant melanoma, depends on early detection. Early detection relies on accessibility to clinical skin examination (CSE). Primary care nurse practitioners (PCNPs) are well positioned to conduct CSEs; however, they require further education on CSE and face time constraints for continuing education. A digitally delivered intervention grounded in microlearning is a promising approach to delivering new information over a brief period.

Objective: Our objective was to develop and explore the feasibility of implementing a 1-week digital video intervention with content on CSE skills, defined as melanoma risk assessment, head-to-toe skin examination, and pigmented lesion assessment, for PCNPs. Specific aims were as follows: (1) Aim 1: to develop three microlearning-based melanoma videos with content on CSE that are suitable for digital delivery to PCNPs in various formats and (2) Aim 2: to assess the feasibility of the video intervention, including enrollment and retention rates, adherence, and acceptability and usability of the video intervention.

Methods: For Aim 1, the research team created storyboards for videos that addressed each CSE skill. An expert panel of three dermatologists reviewed the storyboards and videos for relevance, comprehension, and clarity using the content validity index (CVI). The panel evaluated the usability of the video intervention delivery by Research Electronic Data Capture (REDCap) and Vimeo using the System Usability Scale (SUS) and technical video production using Beaudin and Quick’s Quality Evaluation of Video (QEV). Aim 2 evaluated enrollment and retention rates of PCNPs, based on metrics from previous studies of CSE in the literature, and video intervention adherence. SUS and the Attitudes toward Web-based Continuing Learning Survey (AWCL) assessed usability and acceptability.

Results: CVI scores indicated relevance and clarity for each video: mean scores ranged from 3.79 to 4, where 4 indicated the video was highly relevant and very clear. The integration of REDCap and Vimeo was usable: the SUS score was 96, where 0 was the worst and 100 was the best. The digital delivery of the videos was rated as exceptional on all five technical items: the mean score was 5, where scores ranged from 1 (poor) to 5 (exceptional). Of the 32 PCNPs who were sent emails, 12 enrolled (38%) and, out of these 12, 10 (83%) completed the intervention and the surveys. Video intervention adherence was ≤50%. Participants rated the usability as better (mean 85.8, SD 10.6; better=70-90) and favorably ranked the acceptability of the AWCL’s constructs of perceived usefulness (mean 5.26, SD 0.08), perceived ease of use (mean 5.40, SD 0.41), behavior (mean 5.53, SD 0.12), and affection (mean 5.77, SD 0.04), where scores ranged from 1 (strongly disagree) to 7 (strongly agree).

Conclusions: The video intervention was feasible to deliver to PCNPs using a digital, microlearning approach. The findings provide support for using the videos as an intervention in a future pilot randomized trial targeting behavioral CSE outcomes among PCNPs and other primary care providers.

JMIR Dermatol 2020;3(1):e16714

doi:10.2196/16714


Background

The incidence of the deadliest skin cancer, melanoma, has doubled in the United States over the past 20 years [1]. An estimated 96,480 new cases of melanoma were diagnosed in 2019, and 7320 deaths resulted from melanoma [2]. Early detection leads to a greater proportion of removal of thin melanomas (<2 mm in thickness), which is associated with better outcomes [3-5]. Early detection is best accomplished by clinical skin examination (CSE), most often performed by a dermatologist. CSE is defined as melanoma risk assessment, head-to-toe skin examination, and pigmented lesion assessment. Currently, there is a shortage of dermatologists—the United States averages just 3.6 dermatologists per 100,000 people [6,7], which has not changed in the last 30 years even as skin cancer incidence continues to rise [8-10]. Patients could benefit from the availability of more practitioners, such as nurse practitioners (NPs), to conduct quality CSEs.

Previous research indicates that NPs may lack the confidence and skills to perform CSE or assess skin lesions [11-14]. NPs have demonstrated mixed ability to distinguish lesions that are suspicious for melanoma from nonsuspicious lesions [11,12]; in one study, a majority of NPs stated that they would rather refer any skin lesion to a specialist [15], which is problematic given the shortage of dermatologists. Although NPs’ confidence in CSE is low, they believe that primary care providers, in general, help to detect skin cancer early [13]. In a pilot study, NPs showed promise, achieving good sensitivity (ranging from 50% to 100%, n=4) and excellent specificity (ranging from 99% to 100%, n=4) when asked to identify suspicious lesions [16].

NPs desire more training and resources about skin cancer [17]. However, a recent systematic review [12] concluded that there are minimal CSE activities for NPs, and the activities that exist are not well explicated. The majority of NP CSE-focused intervention studies had small sample sizes (eg, ranging from 1 to 30) [18-21], were lengthy (eg, 14 weeks to 6 months) [18], or did not describe NPs’ dermatological training [18-21]. Fewer interventions had modules lasting under an hour (eg, 15-45 minutes) or were self-directed (eg, reviewing pamphlets, posters, and two presentations) [22-24]. This manuscript describes the development and feasibility testing of a brief CSE educational video intervention for primary care nurse practitioners (PCNPs).

Purpose Statement

The purpose of this study was to develop and explore the feasibility of implementing a 1-week, digitally delivered video intervention with content on CSE for PCNPs. The videos covered three CSE skills, defined as melanoma risk assessment, head-to-toe skin examination, and pigmented lesion assessment. The specific aims were as follows: (1) Aim 1: to develop three microlearning-based melanoma videos with content on CSE that are suitable for digital delivery to PCNPs in various formats and (2) Aim 2: to assess the feasibility of the video intervention, including enrollment and retention rates, video intervention adherence, and acceptability and usability postcompletion. Table 1 presents the details of the aims, hypotheses, measures, scoring [25-30], and outcomes.

Table 1. Aims, measures and tools, scoring, and outcomes of the clinical skin examination (CSE) educational video intervention for primary care nurse practitioners (PCNPs).
Aim 1: to develop, over 3 months, three theory-based, short skin cancer videos with content on comprehensive CSE skills that are suitable for digital delivery to PCNPs in various formats (eg, mobile phone, tablet, and computer)

Aim 1a: to assess content validity of the video intervention using an established method
Measures or tools: content validity index (CVI)
Scoring: relevance, 1 (not relevant) to 4 (highly relevant); clarity, 1 (not clear) to 4 (very clear) [25-27]
Outcome: dermatology experts score content relevance and clarity highly

Aim 1b: to assess the integration of the videos and surveys into Research Electronic Data Capture (REDCap)
Measures or tools: System Usability Scale (SUS)
Scoring: 1 (strongly disagree) to 5 (strongly agree); the sum of item scores is multiplied by 2.5; scores are interpreted as concerning (<50), passable (50-69), better (70-90), and truly superior (>90) [28]
Outcome: dermatology experts score usability as better to truly superior

Aim 1c: to assess the digital delivery of the videos
Measures or tools: Beaudin and Quick’s Quality Evaluation of Video (QEV)
Scoring: 1 (poor) to 5 (exceptional); an option for open-ended comments follows each item [29]
Outcome: dermatology experts give high scores for technical production and navigation (ie, video design, intended content, visual quality, audio quality, and audio-visual relationship)

Aim 2: to determine enrollment and retention rates, video intervention adherence, and acceptability and usability postcompletion

Hypothesis 2.1: enrollment rates will be equal to or better than 60%
Measures or tools: calculate the number of participants recruited and enrolled compared to those recruited who chose not to consent or enroll
Scoring: not applicable
Outcome: enrollment rates

Hypothesis 2.2: retention rates will be greater than 50%
Measures or tools: calculate completion of videos and surveys per number of participants enrolled
Scoring: not applicable
Outcome: retention rates

Hypothesis 2.3: video intervention adherence will be greater than or equal to 50%
Measures or tools: Vimeo “finishes,” counted from start-to-play to viewing of the very last video frame
Scoring: not applicable
Outcome: participant completion rates of the videos

Hypothesis 2.4: usability scores will be equal to or higher than 70
Measures or tools: System Usability Scale (SUS)
Scoring: 1 (strongly disagree) to 5 (strongly agree); the sum of item scores is multiplied by 2.5; scores are interpreted as concerning (<50), passable (50-69), better (70-90), and truly superior (>90) [28]
Outcome: participants score usability highly

Hypothesis 2.5: acceptability scores will be equal to or higher than 5
Measures or tools: Attitudes toward Web-based Continuing Learning Survey (AWCL); acceptability comprises perceived usefulness (5 items), perceived ease of use (4 items), behavior (3 items), and affection (3 items)
Scoring: 1 (strongly disagree) to 7 (strongly agree) [30]
Outcome: participants score acceptability highly


Methods

The study used a one-group, posttest, cross-sectional design. This section describes the methods for each study aim. The University of Arizona Institutional Review Board approved the study.

Aim 1: CSE Video Development

Video development was guided by a microlearning conceptual framework and by operational transparency. The informational content of the videos was adapted from previous studies [31-33]; adaptation focused on key concepts that could be addressed in a short amount of time [34]. Development also included creation of the video storyboards.

Microlearning allows short, meaningful units of knowledge to be disseminated, which benefits practitioners with busy schedules. Microlearning is defined as “special moments or episodes of learning while dealing with specific tasks or content and engaging in small but conscious steps” [35]. Research findings document that the use of short content may increase information retention by 20% [36]. Microlearning suits users who have difficulty finding the time to engage in long stretches of learning activities outside of dedicated study times and institutional programs [34].

For operational transparency, we conducted a systematic review to ascertain the rigor of previous CSE interventions [37] using Sidani and Braden’s clarifying elements (see Multimedia Appendix 1) [38]. The goal of the intervention was to inform the participants about CSE, enhance their CSE skills, and motivate them to perform CSE. Each of the three videos had a specific learning objective. The specific strategies, respective components, and immediate goals for each video are in Multimedia Appendix 2. The media for intervention delivery were written (ie, reading) and verbal (ie, audio). The format consisted of a video of skills instruction along with a PowerPoint presentation with voiceover. Each video was 5-10 minutes in length (ie, amount). The videos were hosted on Vimeo, an online platform and community for creating, uploading, and sharing videos [39]. Participants viewed each video once and spaced the viewings within a 1-week period (ie, frequency); therefore, the duration of the intervention was 1 week.

The research team created and reviewed storyboards for each video. The videos were produced in collaboration with video technology experts at the institution and were uploaded to Vimeo. The web application Research Electronic Data Capture (REDCap) maintained the surveys and the separate fields for each video link from Vimeo. REDCap is a secure workflow methodology and software application designed for the development and deployment of digital data capture tools to assist with clinical and translational research [40]. REDCap allowed placement of the Vimeo video link into the survey, enabling viewing of the video within the survey, without having to open a new browser window. The expert panel, which consisted of three dermatologists, evaluated the integration of the videos in REDCap.
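As an illustration of this kind of integration, the sketch below shows how a Vimeo player could be embedded in a REDCap descriptive field through REDCap’s metadata (data dictionary) import API. This is a minimal sketch rather than the project’s actual configuration: the API URL, token, form name, and video ID are placeholders, and whether inline iframes render depends on the REDCap version and instance settings.

```python
# Minimal sketch (not the authors' implementation): embedding a Vimeo player
# in a REDCap descriptive field via the REDCap metadata import API.
import json
import requests

REDCAP_API_URL = "https://redcap.example.edu/api/"  # placeholder instance URL
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"            # placeholder project token
VIMEO_VIDEO_ID = "000000000"                        # placeholder video ID

# A descriptive field whose label carries the Vimeo player embed.
video_field = {
    "field_name": "video_1_embed",
    "form_name": "video_1_survey",  # hypothetical survey instrument name
    "field_type": "descriptive",
    "field_label": (
        f'<iframe src="https://player.vimeo.com/video/{VIMEO_VIDEO_ID}" '
        'width="640" height="360" frameborder="0" allowfullscreen></iframe>'
    ),
}

# Note: REDCap's metadata import replaces the project's entire data dictionary,
# so in practice the full dictionary would be exported, amended, and re-imported.
response = requests.post(
    REDCAP_API_URL,
    data={
        "token": API_TOKEN,
        "content": "metadata",
        "format": "json",
        "data": json.dumps([video_field]),
        "returnFormat": "json",
    },
    timeout=30,
)
print(response.status_code, response.text)
```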

The components of each module were assessed using Sidani and Braden’s content validity assessment [41] and the content validity index (CVI). The first content validity survey asked the dermatology experts to evaluate the relevance (ie, the degree to which the content has an appropriate sample of activities for the component being measured) and clarity (ie, the extent to which the storyboard is concise, accurate, and direct) of the storyboard content [42]. The dermatology experts could add comments with each item to provide further clarification. Based on the CVI scores and recommendations, video content and activities were refined. The dermatology expert panel reviewed the content validity for a second time while viewing the actual videos 1 month after the first review. They accessed the videos in REDCap and completed the System Usability Scale (SUS).

The expert panel used the five technical production items from Beaudin and Quick’s Quality Evaluation of Video (QEV) to evaluate the integration of the videos into Vimeo. The following steps helped to confirm that the REDCap and Vimeo platform was functioning appropriately for the delivery of the surveys and videos:

  1. Assemble all surveys and videos into REDCap.
  2. Set the timeline for the delivery of each video intervention within REDCap.
  3. Conduct an initial test of REDCap to ensure that surveys and videos displayed appropriately and when prompted by the scheduled timeline on both the mobile device and computer.
  4. Finalize survey and video delivery schedule.

Aim 2: Feasibility

Feasibility was assessed via enrollment and retention rates as well as video intervention adherence, usability, and acceptability (see Table 1). Video adherence was monitored by Vimeo, which calculated the number of plays, the number of finishes (ie, the participant viewed the video to the very last frame), and the average percentage of the video watched per module [43]. Participants completed a short satisfaction survey at the end of the study, which consisted of free-text and scaled items.
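To make the adherence metrics concrete, the following is a minimal sketch, using invented viewing records rather than the study’s data, of how plays, finishes, and the average percentage watched could be summarized for one video module.

```python
# Minimal sketch (hypothetical data): summarizing the three adherence metrics
# described above (plays, finishes, average percentage watched) for one video.
from dataclasses import dataclass

@dataclass
class View:
    participant_id: str
    percent_watched: float  # 0-100, as reported by the analytics export
    finished: bool          # True if played to the very last frame

# Hypothetical records for one video module (participants with no play
# simply have no record here).
views = [
    View("P01", 100.0, True), View("P02", 95.0, False),
    View("P03", 100.0, True), View("P04", 88.0, False),
    View("P05", 100.0, True), View("P06", 100.0, True),
]

N_ENROLLED = 10
plays = len(views)
finishes = sum(v.finished for v in views)
avg_watched = sum(v.percent_watched for v in views) / plays if plays else 0.0

print(f"Plays: {plays}/{N_ENROLLED} ({100 * plays / N_ENROLLED:.0f}%)")
print(f"Finishes: {finishes}/{N_ENROLLED} ({100 * finishes / N_ENROLLED:.0f}%)")
print(f"Average amount of the video watched: {avg_watched:.0f}%")
```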

Sample

A purposive sample of 12 PCNPs was enrolled and 10 PCNPs completed the videos. A sample size of 10 is sufficient for a feasibility study because even a few cases are likely to be very informative with respect to the difficulty of recruitment, the acceptability of the intervention, costs, and logistics [44,45]. Eligibility criteria were as follows:

  1. Had a Master’s NP Certification or a Doctorate of Nursing Practice with a clinical specialty.
  2. Had Family, Adult, or Geriatric NP board certification from either the American Nurses Credentialing Center (ANCC) or the American Association of Nurse Practitioners (AANP).
  3. Worked in an outpatient setting at least 16 hours/month or 192 hours/year.
  4. Had a minimum of one year of experience.
  5. Had access to the internet through a computer or a mobile phone.
  6. Had English-language proficiency.

Participants were not excluded from the study based on gender, age, or race. The exclusion criterion was any previous continuing education or training in CSE. Participants were recruited during a statewide meeting (ie, Arizona) and a local NP meeting (see Table 2 for demographic information). Interested NPs received an email that summarized the feasibility study along with a link to the consent form, surveys, and first video. Each email contained a unique link to the intervention. REDCap creates closed surveys, where each unique link is assigned a study ID number. The consent document was a disclosure form that listed the intervention length, investigator identity, and the purpose of the study (see Multimedia Appendix 3).

Data Collection

All data instruments, automated data capture, videos, and contact information were managed using REDCap. The schedule for delivering each video and the posttest survey was automated with REDCap. Human involvement was limited to in-person recruitment and sending the initial email to start the intervention. The videos were incorporated into the questionnaire. Participants completed the posttest survey, along with the usability and acceptability items, after the third video. One week after the third video, participants self-reported personal use of CSE in their practice. All outcomes were self-reported through these online surveys, and all questions had forced-choice answers. Participants did not have an option to review or change their answers after submission. Prompts were sent daily for up to 2 days if the participant did not view the video within the first 24 hours of enrollment. Participants who failed to submit the survey were not compensated, and their surveys were ineligible for analysis. Each participant who completed the intervention received a US $50 Amazon gift card. Data collection occurred from March to April 2019.
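The delivery timeline described above can also be expressed programmatically. The sketch below is illustrative only: the 2-day spacing between videos and the example enrollment date are assumptions, since the study specifies only that the three viewings were spaced within a 1-week period, with reminder prompts for up to 2 days and a follow-up survey 1 week after the third video.

```python
# Minimal sketch (not the REDCap configuration itself): computing the kind of
# delivery schedule described above from an enrollment date.
from datetime import date, timedelta

def build_schedule(enrollment: date, video_spacing_days: int = 2) -> dict:
    schedule = {}
    for i in range(3):  # three video modules
        send = enrollment + timedelta(days=i * video_spacing_days)
        schedule[f"video_{i + 1}_sent"] = send
        # Reminder prompts: daily for up to 2 days if not viewed in the first 24 h
        schedule[f"video_{i + 1}_reminders"] = [
            send + timedelta(days=1),
            send + timedelta(days=2),
        ]
    schedule["posttest_survey"] = schedule["video_3_sent"]  # completed after video 3
    schedule["cse_follow_up"] = schedule["video_3_sent"] + timedelta(weeks=1)
    return schedule

for event, when in build_schedule(date(2019, 3, 4)).items():
    print(event, when)
```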

Data Protection

REDCap was developed specifically around the Health Insurance Portability and Accountability Act (HIPAA) Security Rule guidelines. The REDCap electronic data management system at the University of Arizona is housed on two virtual servers: one supporting database services and the other supporting web services. Hardware is located at the University of Arizona’s Information Technology Services Center (UITS). The space is physically secured within a keyless entry area. Hardware management and support are provided by UITS. The database server is located behind a firewall and the web server is in a Data Management Zone. REDCap software support is provided by the University of Arizona Center for Biomedical Informatics and Biostatistics. All web-based information transmission is password protected and encrypted in transit. Administration of REDCap is managed through virtual servers located at the University of Arizona College of Medicine [46].

Statistical Analysis

Data from REDCap were exported into SPSS Statistics for Windows, version 26.0 (IBM Corp) [47], for data analysis. Analysis of sample demographic data consisted of descriptive statistics (ie, frequencies and measures of central tendency). For Aim 1a, the CVI for each item was determined by counting the number of dermatology experts who gave the item a score of 3 or 4 and dividing that count by the total number of experts (ie, 3) [26]. Scores for the SUS and the QEV were analyzed using descriptive statistics (Aims 1b and 2). For Aim 2, the Attitudes toward Web-based Continuing Learning Survey (AWCL) was used, and data analysis consisted of item mean scores, mean construct scores, and correlations between each construct.
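As a worked example of the scoring described above, the sketch below computes an item-level CVI and an SUS score from illustrative ratings (not the panel’s or participants’ actual responses). The SUS function applies the standard item adjustment, subtracting 1 from odd-numbered items and subtracting even-numbered items from 5, before multiplying the sum by 2.5 to obtain the 0-100 score interpreted in Table 1.

```python
# Minimal sketch (illustrative data): item-level CVI and SUS scoring.
def item_cvi(expert_ratings):
    """Proportion of experts rating the item 3 or 4 on the 4-point scale."""
    return sum(1 for r in expert_ratings if r >= 3) / len(expert_ratings)

def sus_score(item_responses):
    """Convert ten 1-5 SUS responses into a single 0-100 usability score."""
    adjusted = [
        (r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered items; even-numbered items
        for i, r in enumerate(item_responses)
    ]
    return sum(adjusted) * 2.5

# Example: three experts rate one storyboard component for relevance.
print(item_cvi([4, 3, 4]))                        # -> 1.0 (all rated 3 or 4)

# Example: one participant's ten SUS responses (hypothetical).
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 5, 2]))  # -> 92.5
```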


Results

Overview

Characteristics of the sample are listed in Table 2.

Table 2. Demographic and practice characteristics of the sample.
Characteristic: value (N=10), n (%)

Gender
  Women: 9 (90)
  Men: 1 (10)

Age in years
  30-39: 3 (30)
  40-49: 2 (20)
  50-59: 4 (40)
  >60: 1 (10)

Nurse practitioner (NP) certification
  Family nurse practitioner (FNP): 8 (80)
  Adult nurse practitioner (ANP): 0 (0)
  Geriatric nurse practitioner (GNP): 0 (0)
  FNP + ANP: 1 (10)
  FNP + GNP: 1 (10)

Type of NP practice
  Group: 9 (90)
  Individual: 1 (10)

Highest degree obtained
  Master’s NP Certification: 8 (80)
  Doctorate of Nursing Practice: 2 (20)

Years in clinical practice
  1-5: 4 (40)
  6-10: 4 (40)
  11-20: 0 (0)
  21-30: 1 (10)
  31-40: 1 (10)

Which electronic device are you using for this intervention?
  Computer: 6 (60)
  Mobile device: 4 (40)

Aim 1a: Content Validity

The dermatology expert panel conducted two reviews to assess the content validity of the intervention. Content validity scores primarily increased or were consistent during the second round of reviews after the storyboards were adjusted to address the reviewers’ recommendations from the first round. The scores of the following components decreased during the second round: right arm nevus count (relevance) and discuss systematic approach (relevance) (see Table 3).

Table 3. Content validity index (CVI).
Strategies, with mean score (SD) for relevance and clarity at the first and second expert reviews

Video 1 strategies
  What is skin cancer?: first review, relevance 3.33 (1.52), clarity 2.33 (1.53); second review, relevance 4.00 (0), clarity 3.67 (0.58)
  Malignant melanoma prevalence: first review, relevance 3.67 (0.58), clarity 3.67 (0.58); second review, relevance 3.67 (0.58), clarity 4.00 (0)
  Malignant melanoma risk factors: first review, relevance 3.67 (0.58), clarity 3.00 (1.00); second review, relevance 4.00 (0), clarity 4.00 (0)
  Right arm nevus count: first review, relevance 4.00 (0), clarity 3.00 (1.73); second review, relevance 3.67 (0.58), clarity 3.67 (0.58)
  Grand mean: first review, relevance 3.67 (0.27), clarity 3.00 (0.54); second review, relevance 3.84 (0.19), clarity 3.83 (0.19)

Video 2 strategies
  Discuss systematic approach: first review, relevance 4.00 (0), clarity 3.67 (0.58); second review, relevance 3.67 (0.58), clarity 4.00 (0)
  Discuss hard-to-see areas: first review, relevance 3.33 (0.58), clarity 3.00 (1.73); second review, relevance 3.67 (0.58), clarity 4.00 (0)
  Strategies for incorporating clinical skin examination: first review, relevance 3.00 (1.00), clarity 2.67 (1.15); second review, relevance 4.00 (0), clarity 4.00 (0)
  Grand mean: first review, relevance 3.44 (0.50), clarity 3.11 (0.58); second review, relevance 3.78 (0.19), clarity 4.00 (0)

Video 3 strategies
  ABCDE rule (Asymmetry, Border, Color, Diameter, Evolution): first review, relevance 4.00 (0), clarity 4.00 (0); second review, relevance 4.00 (0), clarity 4.00 (0)
  Ugly duckling sign: first review, relevance 4.00 (0), clarity 3.33 (1.15); second review, relevance 4.00 (0), clarity 4.00 (0)
  Images of nonsuspicious vs suspicious pigmented lesions: first review, relevance 3.67 (0.58), clarity 2.00 (1.73); second review, relevance 3.67 (0.58), clarity 4.00 (0)
  Grand mean: first review, relevance 3.89 (0.33), clarity 4.00 (0.88); second review, relevance 3.89 (0.33), clarity 4.00 (0)

Relevance scale: 1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, and 4 = highly relevant.

Clarity scale: 1 = not clear, 2 = somewhat clear, 3 = quite clear, and 4 = very clear.

Aims 1b and 1c: Vimeo and REDCap Integration and Digital Delivery

The dermatology expert panel rated the integration of REDCap and Vimeo and the system’s usability as truly superior (mean 95.8, SD 7.2), with a range of scores from 87.5 (better) to 100 (truly superior). All expert panel members scored each of the five technical concepts—video design, intended content, visual quality, audio quality, and audio-visual relationship—as exceptional (mean 5, SD 0) (see Multimedia Appendix 4).

Aim 2: Enrollment and Retention Rate and Intervention Adherence

A total of 12 PCNPs consented from a list of 32 emails (38% enrollment rate). Out of 12 participants, 10 completed the intervention and the surveys (83% retention rate). Completion rate of the surveys was 100% (10/10). A total of 50% (5/10) of participants watched the videos in their entirety. Vimeo recorded 6 plays for video 1. Of the 6 participants who played video 1, only 1 (17%) failed to finish the video, ending it 30 seconds before the content was complete. Therefore, the lowest percentage of possible participants who completed all of the videos is 50% (5/10). See Table 4 for the video adherence information for each module.

Table 4. Vimeo report regarding video adherence.
Video: plays (N=10), n (%); finishes (N=10), n (%); average amount of the video watched, %
  Video 1: plays 6 (60); finishes 4 (40); average watched 92%
  Video 2: plays 7 (70); finishes 4 (40); average watched 97%
  Video 3: plays 8 (80); finishes 2 (20); average watched 98%

Aim 2: Usability, Acceptability, and Satisfaction

The mean usability score was 85.8 (SD 10.6), in the better range, with scores ranging from 72.5 (better) to 100 (truly superior). Acceptability of the intervention was assessed using the AWCL. The means for the constructs—perceived usefulness, perceived ease of use, behavior, and affection—all fell between somewhat agree and mostly agree (see Table 5).

Table 5. Attitudes toward Web-based Continuing Learning Survey (AWCL) item scores and grand mean scores of the constructs.
Item: score, mean (SD)

Perceived usefulness 1 (web-based continuing learning helps my work become more interesting): 5.2 (1.4)
Perceived usefulness 2 (web-based continuing learning helps to increase my creativity for work): 5.3 (1.3)
Perceived usefulness 3 (web-based continuing learning facilitates the development of my work): 5.4 (1.2)
Perceived usefulness 4 (web-based continuing learning effectively enhances my learning): 5.2 (1.0)
Perceived usefulness 5 (web-based continuing learning helps me attain better learning outcomes): 5.2 (1.4)
Perceived ease of use 1 (it is convenient to receive training on the job using web-based continuing learning): 5.7 (1.2)
Perceived ease of use 2 (it is easy to get web-based continuing learning to do what I want it to): 5.1 (1.5)
Perceived ease of use 3 (it is easy for me to solve problems at work when I participate in web-based continuing learning): 4.9 (1.6)
Perceived ease of use 4 (the flexibility of web-based continuing learning makes me learn in an easier way): 5.9 (1.1)
Behavior 1 (I hope to spend more time using web-based continuing learning): 5.7 (1.5)
Behavior 2 (I hope to use web-based continuing learning more often): 5.5 (1.4)
Behavior 3 (I want to increase my use of web-based continuing learning in the future): 5.4 (1.4)
Affection 1 (I think it is interesting to use web-based continuing learning): 5.8 (1.3)
Affection 2 (web-based continuing learning provides an interesting and attractive environment): 5.7 (1.5)
Affection 3 (using web-based continuing learning can improve my working ability): 5.8 (1.3)

Constructs, grand mean (SD)
  Perceived usefulness: 5.26 (1.03)
  Perceived ease of use: 5.40 (0.85)
  Behavior: 5.53 (1.25)
  Affection: 5.77 (1.37)

The scores are based on the following scale: 1 = strongly disagree, 2 = mostly disagree, 3 = somewhat disagree, 4 = neither agree nor disagree, 5 = somewhat agree, 6 = mostly agree, and 7 = strongly agree.

The overall mean satisfaction with the study and the intervention was 99% (SD 1.87, N=10; 100% = best). All participants (N=10) reported that they would watch the videos even without compensation and that both the length and the content of the videos were “just right.” Out of 10 participants, 8 (80%) preferred that the videos be accessible for 1 month. When participants were asked what they learned, they cited specific videos (eg, “scaling suspicious vs nonsuspicious lesions”) (5/10, 50%) and increased motivation (eg, “11 plus nevus right arm increased likelihood to have 100 plus. So quick and easy to check!”) (1/10, 10%), and some noted having already performed skin assessments (eg, “was a great refresher”) (2/10, 20%).


Discussion

A strength of this study is the rigorous, transparent development of the videos and the use of an expert panel of dermatologists to ensure valid content. Each specific strategy recommended by Sidani and Braden for operational transparency of the video intervention was outlined (see Multimedia Appendix 2) and graded by the dermatology expert panel using a CVI. This process is absent from many studies describing intervention development [37]. The CVI indicated that the videos were relevant, except for two index scores. Scores for both right arm nevus count and discuss systematic approach decreased from 4 (highly relevant) to 3.67 (between highly relevant and quite relevant). According to the scale, both components were ranked as at least quite relevant and were kept in the videos. One explanation for this change is that a different third expert reviewer participated in the second round. Otherwise, clarity increased overall during the second round of reviews. The expert panel’s comments primarily focused on promoting clarity, such as adding the definition of skin cancer to the first video and the definition of a pigmented lesion to the third video. Wording was also adjusted (eg, “get melanoma” to “develop melanoma” and “11 nevi tool” to “right arm nevus count”). The expert panel also offered relevant CSE tips, such as ensuring that the patient removes glasses or hearing aids during the head-to-toe skin examination to better visualize the conchal bowl.

Prior to the dissemination of videos to the PCNPs, the dermatology expert panel evaluated the videos’ technical production on Vimeo and their delivery through REDCap. Visual quality, audio quality, and audio-visual relationship were considered exceptional, suggesting that it was feasible to use Vimeo and REDCap to deliver the video intervention.

The hypothesis that the enrollment rate would be equal to or better than 60% was not supported. During the study, 22 NPs were recruited at a conference in November 2018. However, enrollment did not start until February 2019, and just 5 NPs consented after the initial study invitations were sent. An additional 10 participants were recruited at another NP meeting, and 8 more participants were enrolled. This highlights the challenges of recruiting NPs [48], as well as the importance of timely follow-up after recruitment.

The hypothesis that the retention rate would be greater than or equal to 50% was supported. Out of 12 recruited participants, 10 (83%) completed the surveys. This is comparable to the retention rate of a prior skin cancer study involving NPs, which reported a retention of 10 out of 14 participants (71%) [13].

The comments from the 2 participants who did not progress past the first video add further insight. The first participant stated that she was unable to complete the other videos because of work requirements. She rated the first video favorably, stating that it was very educational. The second participant accessed the first video on her mobile phone without problems but had difficulty accessing the second video, receiving a message that she had already reviewed it. By the time she contacted the investigator, she was outside the 1-week time limit for video completion and was withdrawn from the study. In future studies, the REDCap email containing the video link will ask participants to check their Wi-Fi or cellular connection before opening the REDCap link and starting the video.

The hypothesis that video intervention adherence would be greater than or equal to 50% was not supported. Vimeo defines a finish as the participant viewing a video to the very last frame. However, the average percentage of each video watched was high: video 1 was watched to 92% on average and played 6 out of 10 times (60%), video 2 was watched to 97% on average and played 7 out of 10 times (70%), and video 3 was watched to 98% on average and played 8 out of 10 times (80%). Participants likely viewed all the content but dropped off at the references and acknowledgements portions of the videos. We were unable to retrospectively connect the times the videos were played according to the Vimeo report with the times the participants finished the REDCap module. However, at least 5 out of 10 PCNPs (50%) fully viewed the content before exiting the video and met the “finishing” criteria. Future studies will require active monitoring of the reports to connect the participants’ REDCap and Vimeo information or to obtain a time stamp in REDCap. One way to obtain a time stamp would be to keep a daily record of which participants completed the REDCap survey and correlate this record with the time the Vimeo video was viewed. To increase the number of finishes as defined by Vimeo, the references will be sent with the email invitation and the acknowledgements will be moved to the beginning of the video.
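One way to approximate this linkage is sketched below with hypothetical exports: each participant’s REDCap survey-completion timestamp is matched to the nearest preceding Vimeo play time within a tolerance window. The column names, timestamps, and 2-hour tolerance are assumptions for illustration, not the study’s data.

```python
# Minimal sketch (hypothetical exports): pairing REDCap survey-completion
# timestamps with the nearest preceding Vimeo play time.
import pandas as pd

redcap = pd.DataFrame({
    "study_id": ["P01", "P02", "P03"],
    "video_1_survey_completed": pd.to_datetime(
        ["2019-03-05 18:40", "2019-03-05 21:10", "2019-03-06 07:55"]),
})
vimeo = pd.DataFrame({
    "play_time": pd.to_datetime(
        ["2019-03-05 18:25", "2019-03-05 20:58", "2019-03-06 07:30"]),
    "percent_watched": [100, 97, 92],
}).sort_values("play_time")

# Match each survey completion to the closest earlier play within 2 hours.
matched = pd.merge_asof(
    redcap.sort_values("video_1_survey_completed"),
    vimeo,
    left_on="video_1_survey_completed",
    right_on="play_time",
    direction="backward",
    tolerance=pd.Timedelta("2h"),
)
print(matched)
```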

The hypothesis that usability scores would be equal to or higher than 70 was supported. The PCNPs’ mean SUS score was 85.8, in the better range, indicating that the participants viewed the usability of REDCap and Vimeo positively. Similar SUS scores have been recorded for other interventions, such as a web-based simulation in psychiatry residency training (n=16, score 86.5) [49] and a web-based multimedia application called Electrolyte Workshop, which has e-learning tutorials called WalkThru (n=18, score 87.9) and HandsOn (n=27, score 81.5) [50].

Another strength of this study was the use of the microlearning framework to guide intervention development. The hypothesis that acceptability scores for microlearning would be equal to or higher than 5 was supported. Mean scores for the scales of perceived usefulness, perceived ease of use, behavior, and affection were all higher than those reported by Liang et al [30]. These findings suggest that microlearning was an acceptable framework for this intervention. However, further testing is needed to demonstrate that microlearning is an optimal framework for teaching complex concepts to busy practitioners in a short amount of time.

There were also some limitations to the study. Purposive sampling decreases the generalizability of the results and adds to selection bias [51]. Only the participants who completed the intervention were able to complete the usability, acceptability, and satisfaction surveys. Another limitation to this study was the posttest-only design. This design does not allow for a comparison between the participants before and after the video interventions. However, this design is appropriate for a feasibility study, where the focus is on the practicality of the study [51].

In conclusion, three theory-based, short videos with content on CSE for malignant melanoma were developed that are suitable for internet delivery to PCNPs in various formats. Findings from this feasibility study provide a foundation for the use of microlearning as a guide for delivering brief CSE training to PCNPs. The findings also provide support for using the videos as part of an intervention in future trials targeting behavioral CSE outcomes in PCNPs or other practitioners. This feasibility study provided valuable lessons to inform components of the next research phase, such as the timeliness of enrollment and redesign of the modules to support better measurements of intervention adherence. The long-term goal is to promote the early detection of skin cancer by providing CSE education to PCNPs and ultimately improving skin cancer prognoses.

Acknowledgments

This study was funded by the University of Arizona Cancer Center, Skin Cancer Institute, Seed Grant Program and the College of Nursing Fall 2018 Frederick Lange Memorial Endowment Award. The authors acknowledge Stephen Purcell for video production assistance.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Definitions of Sidani and Braden’s clarifying elements of an intervention.

DOCX File , 14 KB

Multimedia Appendix 2

The components and activities of a clinical skin examination (CSE) intervention.

DOCX File , 23 KB

Multimedia Appendix 3

Consent form.

PDF File (Adobe PDF File), 107 KB

Multimedia Appendix 4

Descriptions of the different items from Beaudin and Quick’s Quality Evaluation of Video (QEV).

DOCX File , 21 KB

  1. Guy GP, Thomas CC, Thompson T, Watson M, Massetti GM, Richardson LC, Centers for Disease Control and Prevention (CDC). Vital signs: Melanoma incidence and mortality trends and projections - United States, 1982-2030. MMWR Morb Mortal Wkly Rep 2015 Jun 05;64(21):591-596 [FREE Full text] [Medline]
  2. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2019. CA Cancer J Clin 2019 Jan;69(1):7-34 [FREE Full text] [CrossRef] [Medline]
  3. Bartlett EK, Karakousis GC. Current staging and prognostic factors in melanoma. Surg Oncol Clin N Am 2015 Apr;24(2):215-227. [CrossRef] [Medline]
  4. Glazer AM, Rigel DS, Winkelmann RR, Farberg AS. Clinical diagnosis of skin cancer: Enhancing inspection and early recognition. Dermatol Clin 2017 Oct;35(4):409-416. [CrossRef] [Medline]
  5. Gershenwald JE, Scolyer RA, Hess KR, Sondak VK, Long GV, Ross MI, for members of the American Joint Committee on Cancer Melanoma Expert Panel and the International Melanoma Database and Discovery Platform. Melanoma staging: Evidence-based changes in the American Joint Committee on Cancer eighth edition cancer staging manual. CA Cancer J Clin 2017 Nov;67(6):472-492 [FREE Full text] [CrossRef] [Medline]
  6. Uhlenhake E, Brodell R, Mostow E. The dermatology work force: A focus on urban versus rural wait times. J Am Acad Dermatol 2009 Jul;61(1):17-22. [CrossRef] [Medline]
  7. Slade K, Lazenby M, Grant-Kels JM. Ethics of utilizing nurse practitioners and physician's assistants in the dermatology setting. Clin Dermatol 2012;30(5):516-521. [CrossRef] [Medline]
  8. Kimball AB, Resneck JS. The US dermatology workforce: A specialty remains in shortage. J Am Acad Dermatol 2008 Nov;59(5):741-745. [CrossRef] [Medline]
  9. Resneck J, Kimball AB. The dermatology workforce shortage. J Am Acad Dermatol 2004 Jan;50(1):50-54. [CrossRef] [Medline]
  10. Aquino LL, Wen G, Wu JJ. Factors affecting the pursuit of academic careers among dermatology residents. Cutis 2015 Apr;95(4):231-236. [Medline]
  11. Loescher LJ, Harris JM, Curiel-Lewandrowski C. A systematic review of advanced practice nurses' skin cancer assessment barriers, skin lesion recognition skills, and skin cancer training activities. J Am Acad Nurse Pract 2011 Dec;23(12):667-673. [CrossRef] [Medline]
  12. Loescher LJ, Stratton D, Slebodnik M, Goodman H. Systematic review of advanced practice nurses' skin cancer detection knowledge and attitudes, clinical skin examination, lesion detection, and training. J Am Assoc Nurse Pract 2018 Jan;30(1):43-58. [CrossRef] [Medline]
  13. Blake J, Malone L. Current behaviors, attitudes, and knowledge of nurse practitioners in primary care toward skin cancer screening/prevention. J Dermatol Nurses Assoc 2014 Sep 1;6(2):65-69. [CrossRef]
  14. Azima S. Expert views, opinions, and recommendations. J Dermatol Nurses Assoc 2016 Sep;8(5):312-317. [CrossRef]
  15. Shelby DM. Knowledge, Attitudes and Practice of Primary Care Nurse Practitioners Regarding Skin Cancer Assessments: Validity and Reliability of a New Instrument [doctoral dissertation]. Tampa, FL: University of South Florida; 2014 Feb 27.   URL: https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=6320&context=etd [accessed 2020-07-07]
  16. Oliveria SA, Nehal KS, Christos PJ, Sharma N, Tromberg JS, Halpern AC. Using nurse practitioners for skin cancer screening: A pilot study. Am J Prev Med 2001 Oct;21(3):214-217. [CrossRef] [Medline]
  17. Roebuck H, Moran K, MacDonald DA, Shumer S, McCune RL. Assessing skin cancer prevention and detection educational needs: An andragogical approach. J Nurse Pract 2015 Apr;11(4):409-416 [FREE Full text] [CrossRef]
  18. Ali FR, Samarasinghe V, Russell SA, Lear JT. Increasing capacity for skin surveillance in a transplant review clinic. Transplantation 2014 Apr 27;97(8):e48-e50. [CrossRef] [Medline]
  19. Armstrong A. Dermoscopy: An Evidence-Based Approach for the Early Detection of Melanoma [doctoral dissertation]. Jacksonville, FL: University of North Florida; 2011.   URL: https://digitalcommons.unf.edu/cgi/viewcontent.cgi?article=1324&context=etd [accessed 2020-07-07]
  20. Wray EV, Brant B, Hussian F, Muller FM. A new model of teledermoscopy combining service and education. Br J Dermatol 2013;169:138-139.
  21. DeKoninck B, Christenbery T. Skin cancer screening in the medically underserved population: A unique opportunity for APNs to make a difference. J Am Assoc Nurse Pract 2015 Sep;27(9):501-506. [CrossRef] [Medline]
  22. Bradley HB. Implementation of a skin cancer screening tool in a primary care setting: A pilot study. J Am Acad Nurse Pract 2012 Feb;24(2):82-88. [CrossRef] [Medline]
  23. Chen C, Woyansky S, Zundell C. The effects of education on compliance with skin cancer risk reduction guidelines. J Dermatol Nurses Assoc 2015;7(2):97-100. [CrossRef]
  24. Hartnett PD, O’Keefe C. Improving skin cancer knowledge among nurse practitioners. J Dermatol Nurses Assoc 2016 Sep;8(2):123-128. [CrossRef]
  25. Polit DF, Tatano Beck C. Developing and testing self-report scales. In: Nursing Research: Generating and Assessing Evidence for Nursing Practice. 9th edition. Hagerstown, MA: Lippincott Williams & Wilkins; 2012:351-378.
  26. Polit DF, Beck CT, Owen SV. Is the CVI an acceptable indicator of content validity? Appraisal and recommendations. Res Nurs Health 2007 Aug;30(4):459-467. [CrossRef] [Medline]
  27. Yaghmaei F. Content validity and its estimation. J Med Educ 2003;3(1):25-27. [CrossRef]
  28. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact 2008 Jul 30;24(6):574-594 [FREE Full text] [CrossRef]
  29. Beaudin BP, Quick D. Instructional video evaluation instrument. J Ext 1996 Jun;34(3) [FREE Full text]
  30. Liang J, Wu S, Tsai C. Nurses' internet self-efficacy and attitudes toward web-based continuing learning. Nurse Educ Today 2011 Nov;31(8):768-773. [CrossRef] [Medline]
  31. Goldsmith LA, Bolognia JL, Callen JP, Chen SC, Feldman SR, Lim HW, American Academy of Dermatology. American Academy of Dermatology Consensus Conference on the safe and optimal use of isotretinoin: Summary and recommendations. J Am Acad Dermatol 2004 Jun;50(6):900-906. [CrossRef] [Medline]
  32. Harris JM, Salasche SJ, Harris RB. Can internet-based continuing medical education improve physicians' skin cancer knowledge and skills? J Gen Intern Med 2001 Jan;16(1):50-56 [FREE Full text] [Medline]
  33. Learning module: The skin exam. American Academy of Dermatology. 2018.   URL: https://www.aad.org/education/basic-derm-curriculum/suggested-order-of-modules/the-skin-exam [accessed 2018-04-02]
  34. Bruck PA, Motiwalla L, Foerster F. Mobile learning with micro-content: A framework and evaluation. In: Proceedings of the 25th Bled eConference. eDependability: Reliable and Trustworthy eStructures, eProcesses, eOperations and eServices for the Future. 2012 Presented at: 25th Bled eConference. eDependability: Reliable and Trustworthy eStructures, eProcesses, eOperations and eServices for the Future; June 17-20, 2012; Bled, Slovenia p. 527-543   URL: https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1041&context=bled2012
  35. Hug T, Friesen N. Outline of a microlearning agenda. In: Hug T, editor. Didactics of Microlearning: Concepts, Discourses and Examples. Münster, Germany: Waxmann; 2007:15-31.
  36. Giurgiu L. Microlearning an evolving elearning trend. Sci Bull 2017;22(1):18-23 [FREE Full text] [CrossRef]
  37. Stratton DB, Loescher LJ. Educational interventions for primary care providers to improve clinical skin examination for skin cancer. J Am Assoc Nurse Pract 2020 May;32(5):369-379. [CrossRef] [Medline]
  38. Sidani S, Braden CJ. Clarifying elements of the intervention. In: Design, Evaluation, and Translation of Nursing Interventions. Ames, IA: John Wiley & Sons; 2011:43-60.
  39. Vimeo.   URL: https://vimeo.com/features [accessed 2018-04-02]
  40. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap): A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009 Apr;42(2):377-381 [FREE Full text] [CrossRef] [Medline]
  41. Sidani S, Braden CJ. Selecting, training, and addressing the influence of interventionists. In: Design, Evaluation, and Translation of Nursing Interventions. Ames, IA: John Wiley & Sons; 2011:111-124.
  42. Polit DF, Beck CT. The content validity index: Are you sure you know what's being reported? Critique and recommendations. Res Nurs Health 2006 Oct;29(5):489-497. [CrossRef] [Medline]
  43. What does Vimeo consider a play? Vimeo. 2019.   URL: https://vimeo.zendesk.com/hc/en-us/articles/360046057111-User-Level-Analytics-in-Showcases [accessed 2019-05-20]
  44. Nieswiadomy RM. Foundations of Nursing Research. 4th edition. Upper Saddle River, NJ: Pearson Education; 2002.
  45. Hertzog MA. Considerations in determining sample size for pilot studies. Res Nurs Health 2008 Apr;31(2):180-191. [CrossRef] [Medline]
  46. REDCap boilerplate text. The University of Arizona Health Sciences Center for Biomedical Informatics and Biostatistics.   URL: https://redcap.cats.med.arizona.edu/redcap/redcap_boilerplate_text.doc [accessed 2020-05-20]
  47. IBM Corp. IBM SPSS Statistics for Windows 26.0. Armonk, NY: IBM Corp; 2019.   URL: https://www.ibm.com/support/pages/node/876320 [accessed 2020-07-06]
  48. Schoenthaler A, Albright G, Hibbard J, Goldman R. Simulated conversations with virtual humans to improve patient-provider communication and reduce unnecessary prescriptions for antibiotics: A repeated measure pilot study. JMIR Med Educ 2017 Apr 19;3(1):e7 [FREE Full text] [CrossRef] [Medline]
  49. Gorrindo T, Baer L, Sanders KM, Birnbaum RJ, Fromson JA, Sutton-Skinner KM, et al. Web-based simulation in psychiatry residency training: A pilot study. Acad Psychiatry 2011;35(4):232-237. [CrossRef] [Medline]
  50. Davids MR, Chikte UM, Halperin ML. Effect of improving the usability of an e-learning resource: A randomized trial. Adv Physiol Educ 2014 Jun;38(2):155-160 [FREE Full text] [CrossRef] [Medline]
  51. Shadish WR, Cook TD, Campbell DT. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston, MA: Houghton Mifflin; 2002.


AANP: American Association of Nurse Practitioners
ANCC: American Nurses Credentialing Center
AWCL: Attitudes toward Web-based Continuing Learning Survey
CSE: clinical skin examination
CVI: content validity index
HIPAA: Health Insurance Portability and Accountability Act
NP: nurse practitioner
PCNP: primary care nurse practitioner
QEV: Quality Evaluation of Video
REDCap: Research Electronic Data Capture
SUS: System Usability Scale
UITS: University of Arizona’s Information Technology Services Center


Edited by G Eysenbach; submitted 16.10.19; peer-reviewed by E Hacker, M Janda; comments to author 02.04.20; revised version received 12.06.20; accepted 15.06.20; published 27.07.20

Copyright

©Delaney B Stratton, Kimberly D Shea, Elizabeth P Knight, Lois J Loescher. Originally published in JMIR Dermatology (http://derma.jmir.org), 27.07.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Dermatology Research, is properly cited. The complete bibliographic information, a link to the original publication on http://derma.jmir.org, as well as this copyright and license information must be included.