

IRB. Author manuscript; available in PMC 2020 Aug 5.
Published in final edited form as:
IRB. 2012 Nov-Dec; 34(6): 1–15.
PMCID: PMC7405731
NIHMSID: NIHMS1005657
PMID: 23342734

Effectiveness of Multimedia Aids to Enhance Comprehension During Research Consent: A Systematic Review

Barton W. Palmer, Ph.D., Professor,1 Nicole M. Lanouette, M.D., Assistant Clinical Professor,2 and Dilip V. Jeste, M.D., Distinguished Professor3

Abstract

We conducted a systematic review of empirical studies of the effectiveness of multimedia tools to enhance the research consent process. Relative to routine consent procedures, multimedia-aided consent resulted in significantly better participant comprehension in 10 of the 20 reviewed studies, and in six of the remaining studies multimedia-aided consent resulted in superior comprehension or retention for some subgroups or for at least some key aspects of the disclosed material. The overall pattern of findings suggests that multimedia tools can be effective aids to the research consent process under some circumstances. However, further research is needed with multimedia tools that are more firmly grounded in conceptual models of human information processing in the consent process. Such conceptual-model-driven research is critical to determining which multimedia tools are useful in which specific contexts and for which specific participants.

Keywords: multimedia, informed consent, decision making, competence, research ethics

INTRODUCTION

Clinical researchers, as well as those charged with human subjects protection, regularly face an ethical dilemma in balancing respect for individuals’ autonomy with the duty to protect those with diminished decision-making capacity.1 One means of simultaneously promoting both aspirations is to improve the consent process itself.2 In the 1990s, concerned that clinical trials consent forms were getting longer and more complex, several organizations under the U.S. Department of Health and Human Services formed an Informed Consent Workgroup. Among the Workgroup’s recommendations was that investigators consider using interactive computer programs, video, and other multimedia tools to complement printed consent documents.3 Yet, more than a decade after those recommendations were published, the average research consent form has continued to increase in length and complexity, and multimedia tools are rarely used in the research consent process.4–7

The ongoing emphasis on printed consent documents in research enrollment may at least partially reflect lingering uncertainty about whether multimedia tools effectively improve participant comprehension of information disclosed during research consent. Earlier reviews of the relevant literature reached mixed or indeterminate conclusions, in part because of the relative paucity of well-controlled published trials available at the time of those reviews.8–11 Even the most recent of these prior reviews included only studies available through January 1, 2007.8 Thus, the primary objective of the present report is to provide an up-to-date, comprehensive, and critical review of empirical studies on the efficacy of multimedia tools as a means to enhance participant comprehension in the research consent process. As we have previously argued,12 there is no logical reason to expect multimedia tools to be universally superior to standard consent procedures. Thus, a further intended contribution of this review is to consider the degree to which multimedia consent tools have been grounded in a specific conceptual model or theory regarding the conditions under which multimedia methods might reasonably be expected to effectively aid the consent process.

METHODS

Data Sources and Search Strategy

The literature search through May 8, 2012 was conducted with the PubMed and PsycINFO Cambridge Scientific Abstracts (CSA) Illumina™ databases. [The term multimedia specifically refers to the integration of two or more forms or channels of information, such as auditory (voice and other sound), visual (still and motion pictures, animation, graphs), and/or text.13 However, as computer presentation is often intermixed with multimedia methods, in the context of the present review our use of the term “multimedia” includes computer-based consent procedures.] PubMed search terms were “(informed consent OR consent forms) AND (computer-assisted instruction OR audiovisual aids OR computerized OR multimedia OR video)”. The PsycINFO CSA database search phrase was: “(de=computer mediated communication OR de=audiovisual communications media OR de=computer applications OR de=technology OR de=computers OR de=human computer interaction OR de=videotapes OR de=videotape instruction) AND de=informed consent”. For both databases, the search was further limited to English-language journal articles tagged as involving human subjects. No restrictions were placed on year of publication.
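For illustration, a search like the one above can be scripted against NCBI’s E-utilities `esearch` endpoint. The sketch below is a hypothetical reconstruction (the report does not describe the authors’ tooling); the `english[lang]` and `humans[mh]` filter tags and the date bounds are assumptions standing in for the stated English-language/human-subjects/date limits.

```python
from urllib.parse import urlencode

# The PubMed search phrase exactly as reported in the review.
PUBMED_QUERY = (
    "(informed consent OR consent forms) AND "
    "(computer-assisted instruction OR audiovisual aids OR "
    "computerized OR multimedia OR video)"
)

def build_esearch_url(term, mindate="1900/01/01", maxdate="2012/05/08"):
    """Assemble an E-utilities esearch URL, restricted (as in the review)
    to English-language human studies published through May 8, 2012.
    The filter tags here are assumptions, not the authors' actual code."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = {
        "db": "pubmed",
        "term": f"({term}) AND english[lang] AND humans[mh]",
        "datetype": "pdat",   # filter on publication date
        "mindate": mindate,
        "maxdate": maxdate,
        "retmax": 1000,       # return up to 1000 record IDs
    }
    return base + "?" + urlencode(params)

url = build_esearch_url(PUBMED_QUERY)
```

Fetching `url` would return the matching PMIDs as XML; the PsycINFO query would need that vendor’s own interface.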

Inclusion criteria/study selection

To be included in this review, studies had to: (a) be an empirical report of original data published in a peer-reviewed English-language journal; (b) focus on the efficacy or effectiveness of multimedia tools to enhance participant comprehension in the research consent process; and (c) evaluate the utility of multimedia consent compared to routine or other control consent conditions. We excluded reports focused on comprehension of a single methodologic component of research (e.g., placebo control14), those focused only on outcomes other than participant comprehension, such as participant satisfaction or agreement to enroll,12,15,16 as well as those focused on multimedia decision aids for clinical rather than research purposes (for a review of clinical multimedia decision aids, see Jeste et al.17).

Reports identified

Applying the above search criteria yielded 761 records (712 in the PubMed database and 49 in the PsycINFO CSA database); after identifying and removing 15 duplicate records (those appearing in both databases), there were 746 unique records. Through review of the titles, abstracts, and, where necessary, full text, we identified 16 reports from the electronic database search meeting the above-stated inclusion/exclusion criteria.18–33 In addition, through cross-references from other articles, we identified four additional reports that had not been found with the electronic search, yielding a total of 20 reports for this review.34–37 The 20 reports were published between December 1988 and January 2012.
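The record-pooling arithmetic above (761 hits, 15 cross-database duplicates, 746 unique records) amounts to a set union; a minimal sketch, with synthetic record IDs standing in for the real PMIDs and accession numbers:

```python
def merge_records(pubmed_ids, psycinfo_ids):
    """Pool the hits from both databases and drop records found in both.
    Returns the unique record set and the number of duplicates removed."""
    combined = list(pubmed_ids) + list(psycinfo_ids)
    unique = set(combined)
    return unique, len(combined) - len(unique)

# Synthetic IDs mirroring the reported counts: 712 PubMed hits and
# 49 PsycINFO hits, 15 of which appear in both databases.
pubmed = {f"rec{i}" for i in range(712)}
psycinfo = {f"rec{i}" for i in range(15)} | {f"psy{i}" for i in range(34)}
unique, n_dups = merge_records(pubmed, psycinfo)
```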

Review/information extraction

We carefully read each of the 20 reports and recorded information on the setting, sample, type of protocol, comparison group, conceptual model or theory guiding the intervention design and implementation (if any), details of the multimedia consent, and the key findings, including whether the multimedia consent was more efficacious than the comparison condition (yes, partial, or no). To further standardize and structure the review, we also evaluated each included report using a modified version of the Scale to Assess Scientific Quality of Investigations (mSASQI) that had been developed and employed in a prior review of multimedia aids to educate patients and aid treatment decisions.17 As modified for use in the present review, the mSASQI consisted of 15 items, each referring to a specific aspect of study design, methods, analyses, or interpretation, and each rated by the first (BWP) and second (NML) authors as 0 (absent or inadequate) or 1 (present and adequate), such that the mSASQI total score had a potential range of 0 to 15. Although BWP and NML completed their mSASQI ratings independently, they met after rating five of the articles24,26,30,35,36 to identify any discrepancies or ambiguities in scoring rules and then independently scored the remaining reports (intraclass correlation coefficient for mSASQI total score = 0.921). BWP and NML then discussed any discrepancies; their final consensus scores were used in subsequent analyses.
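The inter-rater reliability statistic quoted above can be computed directly from the two raters’ score columns. The sketch below implements a standard two-way random-effects, absolute-agreement, single-measures intraclass correlation, ICC(2,1), in plain Python; it illustrates the statistic generically rather than reproducing the authors’ analysis, and the sample ratings are invented.

```python
def icc_2_1(rater_a, rater_b):
    """Two-way random-effects, absolute-agreement, single-measures ICC
    (ICC(2,1)) for two raters, via the usual ANOVA mean squares."""
    n, k = len(rater_a), 2
    rows = list(zip(rater_a, rater_b))
    grand = sum(rater_a + rater_b) / (n * k)
    row_means = [sum(r) / k for r in rows]
    col_means = [sum(rater_a) / n, sum(rater_b) / n]
    ss_total = sum((x - grand) ** 2 for r in rows for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # raters
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented illustrative ratings for five reports (0-15 mSASQI totals).
bwp = [8, 12, 11, 11, 15]
nml = [8, 11, 11, 12, 15]
icc = icc_2_1(bwp, nml)
```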

Statistical Analysis

Given the small number of reports using overlapping interventions or outcome measures, the primary focus of the present review is qualitative rather than meta-analytic or otherwise quantitative. However, as a tentative exploration of the degree to which efficacy findings may have differed by the overall methodological quality of the empirical reports, we trichotomized the efficacy findings (coded as: “Yes” = 1, “Partial” = 0, “No” = −1) and calculated the bivariate correlation between this trichotomized variable and the mSASQI total score using Spearman’s rho. Significance was defined as p < .05 (two-tailed).
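Spearman’s rho on a trichotomized variable involves heavy ties, which are handled by assigning tied observations the average of their ranks and then taking the Pearson correlation of the ranks. A self-contained sketch (the data at the end are invented, not the study’s):

```python
def _avg_ranks(xs):
    """Ranks (1-based), with ties given the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the average ranks."""
    rx, ry = _avg_ranks(x), _avg_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Invented example: trichotomized efficacy codes vs. quality scores.
efficacy = [1, 1, 0, 0, -1]
quality = [8, 12, 11, 15, 10]
rho = spearman_rho(efficacy, quality)
```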

RESULTS

Details of the 20 studies are summarized in Table 1.

Table 1.

Research on multimedia or computer aids for the research consent process.

(Columns: authors; setting or sample, with type of protocol [real vs. simulated]; comparison; theory, model, or rationale for design of the enhancement; whether multimedia consent resulted in better comprehension; quality [mSASQI]; comments.)

Norris & Phillips (1990).34 Sample: Duodenal ulcer patients (N=200); real protocol. Comparison: Routine consent versus routine consent + informational videotape. Theory/model: Atheoretical (although the Discussion alluded to multimedia principles in explaining the results). Better comprehension? Yes. Quality (mSASQI): 8. Comments: 100% of those in the videotape condition correctly answered ≥ 8 of 10 post-test questions, versus 30% of those in the routine consent condition.

Dunn et al. (2002).21 Sample: Older patients with psychosis (n=100) and NC subjects (n=19); real protocol. Comparison: Routine consent versus PowerPoint-aided consent (both conditions received corrective feedback). Theory/model: Enhanced consent based on a review of the empirical literature, but no specific theory or model. Better comprehension? Yes. Quality (mSASQI): 12. Comments: PowerPoint led to better understanding than routine consent; corrective feedback also aided understanding.

Wirshing et al. (2005).23 Sample: Schizophrenia patients (n=83), medical patients (n unstated), and undergraduate students (n unstated); real, but not protocol specific. Comparison: Instructional videotape versus control videotape. Theory/model: Atheoretical intervention. Better comprehension? Yes. Quality (mSASQI): 11. Comments: The instructional videotape resulted in improved understanding.

Moser et al. (2006).37 Sample: Schizophrenia patients (n=30) and NC subjects (n=30); simulated protocol. Comparison: Routine text-based consent followed by a PowerPoint presentation. Theory/model: Enhanced consent based on a review of the empirical literature, but no specific theory or model was specified. Better comprehension? Yes. Quality (mSASQI): 11. Comments: PowerPoint improved understanding and eliminated group differences in appreciation and reasoning.

Hutchinson et al. (2007).24 Sample: 173 oncology patients; real clinical trials. Comparison: Consultation plus an audiovisual presentation about oncology clinical trials (customized to tumor type but not protocol specific) versus consultation alone (refusal/acceptance of enrollment was the primary outcome; effects on comprehension were tested only as a secondary outcome). Theory/model: The audiovisual materials were prepared through a thorough multicomponent development process, described in a companion paper,68 but it remained unclear from the report whether this included grounding in a particular conceptual model or theory of multimedia learning. Better comprehension? Yes. Quality (mSASQI): 12. Comments: Pre-post improvement in knowledge about clinical trials was higher in the audiovisual presentation arm than in the routine consultation arm.

Strevel et al. (2007).26 Sample: Oncology patients qualifying for participation in phase I clinical trials (N=49); real, but not protocol specific. Comparison: Instructional DVD (general information on phase I trials) versus placebo DVD (describing accomplishments of the researchers and cancer institute). Theory/model: The content of the DVD was “based upon knowledge deficits described in the literature in the phase I population; script content was reviewed and modified by medical oncologists involved in drug development …” Better comprehension? Yes. Quality (mSASQI): 11. Comments: Relative to the placebo DVD group, those receiving the educational DVD were less likely to believe that phase I drugs have proven efficacy against cancers in humans or that the goal of phase I clinical trials is to establish effectiveness, and were more likely to know that the study drug had not been thoroughly tested in humans.

Hultgren et al. (2009).28 Sample: Undergraduate college students (N=41) consenting for a study employing transcranial direct current stimulation (tDCS); simulated protocol. Comparison: Standard text consent form (read by subjects over the internet) versus the standard text consent form (over the internet) plus a 5-minute video (presented over the internet). Theory/model: The principles guiding the development and implementation of the video were not described. Better comprehension? Yes. Quality (mSASQI): 8. Comments: Relative to those receiving text consent alone, those receiving the video plus text scored significantly higher on a post-consent test of the nature and risks associated with tDCS and participant rights.

Kass et al. (2009).30 Sample: Oncology patients (N=130); real, but not protocol specific. Comparison: Computerized multimedia program versus text-based pamphlet. Theory/model: Atheoretical intervention; on the other hand, modification was based on feedback from relevant stakeholder representatives. Better comprehension? Yes. Quality (mSASQI): 11. Comments: The computer condition led to better understanding than the pamphlet for most key informational components.

Karunaratne et al. (2010).31 Sample: Diabetes patients (N=30); simulated protocol. Comparison: Computerized multimedia program versus printed consent form. Theory/model: Atheoretical intervention; although the computer presentation was described in detail, the principles guiding its development were not specified. Better comprehension? Yes. Quality (mSASQI): 12. Comments: The average percentage of correct answers was significantly higher with the computer consent than with the paper-based consent.

O’Lonergan & Forester-Harwood.32 Sample: 170 parent-adolescent dyads (N=340); consent/assent for a simulated study of general pediatric research. Comparison: PowerPoint with video hyperlinks versus printed consent form. Theory/model: The materials were pretested and then refined in accord with responses from an independent sample of parent-child dyads, but no theory or model was presented. Better comprehension? Yes. Quality (mSASQI): 12. Comments: Multimedia led to better overall comprehension than routine consent.

Fureman et al. (1997).19 Sample: Intravenous drug users (N=186); real, but not protocol specific. Comparison: Pamphlet versus pamphlet plus a videotape of a “TV talk show” format discussion. Theory/model: The informational videotape was developed with input from a community advisory board and clinical researchers. No theoretical or empirical rationale was given for the choice of this format; however, as the focus was not only on comprehension but also on trust and willingness to participate, one might expect such a format to be useful in those two domains based on participant modeling/social learning theory (see Bandura69). Better comprehension? Partial. Quality (mSASQI): 11. Comments: Baseline knowledge increased in both conditions; the videotape group had better 1-month retention.

Weston et al. (1997).20 Sample: Pregnant women (N=90); simulated protocol. Comparison: Text-based consent versus text-based consent plus video. Theory/model: No explicit rationale or theory stated. Better comprehension? Partial. Quality (mSASQI): 11. Comments: No difference was observed at the initial post-test; however, the video group showed greater retention over 2 to 4 weeks.

Agre & Rapkin (2003).36 Sample: Oncology patients (n=204), family/friends (n=109), and non-patients (n=128); real protocol. Comparison: Printed text-based materials (standard consent or booklet) versus multimedia (computer-assisted instructional program or video). Theory/model: The only rationale provided was that some previous studies of clinical or research consent had shown “success” with videotape, computer, and booklet format consent materials. Better comprehension? Partial. Quality (mSASQI): 11. Comments: Comparisons of group means were not significant; however, relative to those receiving text-based consent, participants receiving multimedia consent were more likely to be in the tail clusters (excellent or poorest understanding) and less likely to be in the intermediate understanding clusters. There was no clear interaction between consent type and participant type (patient, family, or non-patient); however, participants with lower education did worse with the multimedia/computer tools than with standard consent.

Bickmore et al. (2009).27 Sample: Community sample, ages 28–91 (N=29); simulated protocol. Comparison: Computerized agent versus explanation by a human versus self-study. Theory/model: The enhanced consent tool was developed in reference to theory and prior data on the use of “animated agents” to duplicate the communication benefits of face-to-face interaction, particularly for persons with low health literacy, together with consideration of the limitations of sole reliance on face-to-face disclosure by the actual clinician or researcher. Better comprehension? Partial. Quality (mSASQI): 9. Comments: There were significant main effects of consent condition on comprehension; however, health literacy may have moderated effectiveness: comprehension scores for the computer and human presentations were significantly better than for self-study among the 16 participants with “adequate” health literacy, but there were no significant effects of condition among the 13 participants with “inadequate” health literacy. Subgroup sample sizes were unclear.

Jeste et al. (2009).29 Sample: Middle-aged or older patients with schizophrenia (n=128) and NC subjects (n=20); simulated protocol. Comparison: DVD versus routine consent procedure. Theory/model: Development and implementation of the DVD consent aid were guided by several key principles from multimedia learning theory, including the multiple representation, contiguity, coherence, personalization, signaling, and interactivity principles (Mayer13). Better comprehension? Partial. Quality (mSASQI): 15. Comments: Among patients (but not NCs), DVD-aided consent resulted in better understanding and a greater likelihood of being categorized as “capable to consent” (per several previously established criteria) than routine consent.

McGraw et al.33 Sample: Oncology biobank (N=43); real protocol. Comparison: Video versus printed consent form. Theory/model: The video was modified based on feedback from a small pilot study; no multimedia theory or model was specified. Better comprehension? Partial. Quality (mSASQI): 10. Comments: Descriptive only (no inferential statistics presented); coded transcripts of qualitative interviews used a series of content queries (“What would you tell a friend if you were explaining ___?” regarding purpose, risks, benefits, etc.). The specific purpose of the biobank appeared less salient in the printed consent condition, but there were no notable differences between the two groups in the salience of study risks; results regarding the salience of procedures were equivocal.

Benson et al. (1988).18 Sample: Different consent methods were examined sequentially within each of four psychiatric studies: (1) antidepressant clinical drug trial (n=24); (2) schizophrenia clinical drug trial (n=24); (3) social skills training for schizophrenia (n=20); (4) borderline personality clinical drug trial (n=20); real protocols. Comparison: Four different disclosure techniques utilized sequentially within each of the four studies: (a) routine “usual care” consent; (b) routine consent plus instructional video; (c) assisted disclosure with an “improved” video (designed in light of results from the first two conditions); (d) a neutral educator provided information to prospective participants. Theory/model: The video intervention was atheoretical (modified in response to input from bioethics experts, but the theoretical or empirical basis of those modifications was not specified). Better comprehension? No. Quality (mSASQI): 10. Comments: The enhanced consent methods (particularly the improved video and the neutral educator) tended to result in slightly better comprehension than routine consent, but the differences were not statistically significant, and the effect sizes were small and varied by patient characteristics. Notably, subjects’ mean comprehension scores were abysmal under all four conditions, ranging from 15 of 30 points (50%) in routine consent to 20 of 30 points (67%) with the neutral educators (each of whom was an author of the paper and an expert on bioethics and consent issues).

Llewellyn-Thomas et al. (1995).35 Sample: Oncology patients (N=100); simulated protocol. Comparison: Audiotape plus printed consent form versus text presented on a computer organized by menus and submenus. Theory/model: Largely atheoretical; there was no explanation of how the text tree on the computer would foster comprehension relative to a printed consent form, which itself can be scanned and revisited in non-sequential order. Better comprehension? No. Quality (mSASQI): 11. Comments: Presentation format did not affect levels of understanding.

Campbell et al. (2004).22 Sample: Parents of children in Head Start (N=233); simulated protocol – not relevant study. Comparison: Standard print, enhanced print, video, or computer presentation. Theory/model: The enhanced consent was guided by prior surveys indicating high rates of illiteracy among adults, combined with dual processing and multimedia principles. Better comprehension? No. Quality (mSASQI): 13. Comments: There were no effects of disclosure format. Research staff were instructed not to answer participants’ questions. The investigators speculated that “it may be easier for attention to wander when one is passively watching a video. In support of this possibility, we found that, for poorer readers, the enhanced print version and the laptop computer version, both of which required the active involvement of the participant, led to more information being recalled than was true for either the original written form or the video” (p. 213).

Mittal et al. (2007).25 Sample: Patients with Alzheimer’s disease (n=19) or mild cognitive impairment (n=13); simulated protocol. Comparison: PowerPoint slideshow presentation (SSP) versus enhanced written consent procedure (EWCP). Theory/model: The design and implementation of the intervention were largely based on prior empirical reports in other populations, although there was reference to multimedia learning in support of one hypothesis. Better comprehension? No. Quality (mSASQI): 12. Comments: Corrective feedback improved understanding in both conditions, but there were no effects of SSP versus EWCP.

Populations sampled

The two most commonly sampled populations were people with cancer24,26,30,33,35,36 and people with schizophrenia or other psychoses.18,21,23,29,37 However, a variety of other patient populations were also studied, including people with drug abuse,19 depression,18 borderline personality disorder,18 Alzheimer’s disease or mild cognitive impairment,25 diabetes,31 duodenal ulcers,34 or other unspecified medical conditions.23 One additional study focused on enhancing consent for perinatal research with pregnant women.20 Several of the studies also included non-patient samples, either as the primary study sample with which to test the effects of multimedia consent22,27,28,32 or as a basis for comparison with the results from the patient group.21,23,29,36,37

Type of multimedia aid or platform

The most common multimedia intervention was videotape, studied either alone18–20,23,28,33,34 or in comparison to a computer-based intervention.22,36 Three studies used DVDs,24,26,29 two used bulleted text in a computerized PowerPoint presentation,21,37 and two used bulleted text via PowerPoint plus supplementary embedded videos.25,32 Six studies used other forms of computer presentation, with text only35,36 or with embedded video and graphics.22,27,30,31

Efficacy of multimedia consent aids

Ten of the 20 reports (50%) found that multimedia-aided consent was associated with significantly better understanding (either overall comprehension or understanding of key informational components) of disclosed information than was achieved without multimedia aids.21,23,24,26,28,30–32,34,37 Six additional studies (30%) reported partial benefits of multimedia consent; i.e., the multimedia-aided consent was more effective than the control consent for at least one study subgroup, at initial or follow-up assessment, or in other subanalyses.19,20,27,29,33,36 Negative results, i.e., no significant differences between multimedia and comparison consent procedures, prevailed in only four studies (20%).18,22,25,35
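The tallies above reduce to a simple frequency count over the per-study outcome codes; a minimal sketch using the counts reported in the text:

```python
from collections import Counter

# Per-study efficacy codes as tabulated in the review:
# 10 "Yes", 6 "Partial", and 4 "No" out of 20 reports.
outcomes = ["Yes"] * 10 + ["Partial"] * 6 + ["No"] * 4
counts = Counter(outcomes)
shares = {k: v / len(outcomes) for k, v in counts.items()}
```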

Overall quality

The mSASQI quality ratings across the 20 reports are summarized in Table 2. mSASQI total scores ranged from 8 to 15 (mean = 11.0 [SD = 1.6]). There was no significant correlation between the mSASQI total score and overall outcome (trichotomized in terms of the demonstrated superior efficacy of the multimedia consent over the comparison condition: “yes” = +1, “partial” = 0, “no” = −1), rs = −0.121, p = 0.611.

Table 2.

Results from the modified Scale to Assess Scientific Quality of Investigations (mSASQI)

Criterion | Proportion of Reports Meeting Criterion

Was the key dependent variable operationalized via a standardized scale or other appropriately established method? | 100.0%
Were the conclusions justified by the data/findings? | 100.0%
Was (were) the sampled population(s) appropriate to the study aims/hypotheses (e.g., patient groups justified; presence or absence of a non-patient comparison group appropriate to study aims)? | 90.0%
Were the inclusion and exclusion criteria clearly described and appropriate? | 90.0%
Were effects of enhanced consent tested relative to an appropriate control condition (e.g., routine consent rather than another experimental condition)? | 90.0%
Was the consent strategy tested in an ecologically valid context (e.g., either actual research consent or, if simulated, functionally equivalent)? | 90.0%
Were statistical analyses appropriate to aims/hypotheses? | 90.0%
Were the key limitations of the study appropriately addressed in the Discussion/conclusions? | 90.0%
Are there any concerns about power (sample size)? | 85.0%
Was assignment to experimental conditions done with an appropriate randomized assignment method? | 85.0%
Were demographic or other confounds between compared groups appropriately addressed via analyses and/or interpretation? | 85.0%
Was the risk of type I and/or type II errors appropriately addressed? | 65.0%
Was one or more falsifiable a priori hypotheses specified/tested? | 20.0%
Were ratings of key dependent variable(s) done by blinded interviewers? | 20.0%
Were the design and implementation of the enhanced consent appropriately grounded in a specified theory or model? | 15.0%

Note: Items in the above table are presented in order of decreasing frequency; modified from the Scale to Assess Scientific Quality of Investigations (mSASQI).17

Common critical limitations of published studies

Use of conceptual models or theory

Of the 20 reports included in this review, only three22,27,29 described a conceptual model or theoretical rationale guiding the development and/or implementation of multimedia tools to enhance participant comprehension. The computer-based tool devised by Campbell et al.22 and the DVD-based consent tool from our research group29 were each partially guided by what are known as the multiple representation and contiguity principles of multimedia theory. Prevailing models of human information processing posit separate channels for the initial storage and manipulation of verbal versus visual-spatial information;38 according to the multiple representation and contiguity principles, learning is facilitated when information is provided simultaneously through both the auditory and visual-spatial channels.13 Other considerations were also given in each of these two studies (Campbell et al. focused specifically on modifications to reduce the influence of literacy levels; we considered additional multimedia learning principles). The third conceptually guided study was not focused on multimedia learning principles per se; rather, the investigators’ goal was to use animated computer presentations to duplicate the non-verbal behaviors (such as hand gestures) that would be exhibited by an expert explaining consent material to a potential participant.27

Some of the other published reports (including two from our research group) cited findings from prior studies of enhancing clinical or research consent as a basis for one or more components of their multimedia tools (e.g., use of bulleted text), but no specific theory or model was specified as to why, or under what conditions, the enhancement components should be expected to facilitate comprehension.21,25,37 The multimedia aids described in some of the other reports were developed or refined in response to suggestions or feedback from clinicians or clinician researchers,26 bioethicists,18 participants,32,33 or a mixture of representatives from these relevant stakeholder groups,19,24,30 but there was no clear indication that such input was obtained from experts in multimedia learning.

Exploratory versus hypotheses driven analyses

Four of the 20 reports explicitly stated one or more a priori hypotheses about the effects of multimedia consent on participant comprehension.21,22,25,29 The implicit, unstated hypothesis in the other 16 reports was presumably that the multimedia-enhanced consent process would lead to superior participant comprehension relative to the routine or other non-multimedia comparison consent procedure; however, given the multiple analyses in those reports, the expected outcomes could not generally be inferred as representing implicit a priori hypotheses.

Other key methodological issues

Only four reports clearly described the use of independent interviewers blind to consent condition.22,25,29,32 Several other studies employed self-administered questionnaires,19,20,24,28,31,34–36 with which there is less opportunity to ask follow-up questions for clarification. In the remaining studies, either the interviewer was not kept blind to consent condition, or the description in the Methods section of the associated report was not sufficiently detailed to discern whether the interviewer was kept blind to consent condition.18,21,23,26,27,30,33,37

DISCUSSION

We identified 20 empirical reports testing the efficacy of multimedia aids, relative to routine or other comparison conditions, in fostering comprehension of information disclosed in the research consent process. The studies varied widely in the populations targeted, the form and content of the multimedia interventions, the measures employed to assess comprehension, and their overall conceptual and methodological character. Based on the reviewed findings, it appears that multimedia consent tools can be effective aids to the consent process under some circumstances and/or with some study populations, but their effectiveness is not uniform across study populations, contexts, or types of multimedia intervention. The three most common methodologic limitations were (a) the lack of a specified theory or model guiding the structure, design, content, and/or implementation of the multimedia consent (provided in only 3 of 20 [15%] reports),22,27,29 (b) the lack of specific a priori hypotheses (provided in only 4 of 20 [20%] reports),21,22,25,29 and (c) the lack of a structured interview-based assessment of participant comprehension by an interviewer blind to consent condition (provided in only 4 of 20 [20%] studies).22,25,29,32

Due to the diversity of methods and populations in the existing literature, it is difficult to identify clear trends that would indicate the degree to which the various factors influenced the key outcomes. Although the existing studies represent an excellent foundation, there is clearly a need for a “second generation” of conceptually grounded empirical research on multimedia-aided consent. This second generation of studies will be critical to identifying which types of multimedia tools are useful in which specific contexts and for which specific clinical research participants.

A potential objection to our call for more theory-grounded research is that positive findings within the reviewed studies did not appear dependent on whether or not a study was hypothesis driven, firmly grounded in theory, or even associated with overall methodologic quality as indexed by the mSASQI ratings. However, the role of a conceptual model or theory in science is not to guarantee positive results, but rather to enable investigators to approach experimental manipulations, and plan follow-up studies, in an organized manner to reduce ambiguity when interpreting and comparing (positive or negative) results.39 Theory grounded research informs not only what does and does not work, but also gives insight into why an intervention is or is not effective which then helps guide further refinements or application to the consent process for new studies.40

Information processing models from cognitive psychology, as well as multimedia learning theory from educational psychology, provide a useful framework for developing reasoned, specific, and falsifiable a priori hypotheses in future studies of multimedia aids for consent, as well as for understanding many of the results in the existing empirical literature.13,38 The working memory system is thought to be a core component of information acquisition (learning) and of the use, short-term storage, and manipulation of information required in decision making and problem solving.38,41 It includes separate auditory and visual channels for representing new information.38,41 Two of the reviewed studies made reference to a component of multimedia learning theory which suggests that learning is facilitated by simultaneous presentation of information to the auditory and visual channels.22,29 But information processing models also predict that under some conditions simultaneous audio and visual presentation may hinder, rather than facilitate, learning.42–45 For example, if a participant is simultaneously presented with important but distinct (non-redundant) information in the auditory and visual channels, this can create what has been called a “split-attention effect,” hindering rather than facilitating learning and comprehension.46 As discussed below, the key concept for understanding such differential effects is that of “cognitive load.”46

A firmly established and critical aspect of the auditory and visual-spatial components of working memory is that they have limited capacity (resources) in the number of units (or “chunks”) that can be simultaneously held and processed.47,48 The concept of “cognitive load,” essentially referring to how much of the limited working memory resources are taken up by a cognitive task, is key to developing theory-grounded predictions about which forms of multimedia presentation should facilitate comprehension of consent-relevant information, and under which conditions.46 Graphic presentation is more effective than text when the figures or images reduce the need to rely on limited working memory resources. Empirical data outside the context of studies of the research consent process have shown that graphic presentation fosters more efficient comprehension than text or speech when the images permit the recipient to simultaneously see or grasp key relationships among components.43–45 A very basic example is that it is easier to communicate and comprehend the relative positions of the 50 U.S. states with a map than with words or text alone. In contrast, there is no reason to expect that a video of an investigator describing a study would be any more effective than the same information provided in person. Indeed, a video might be less effective than an in-person presentation, because the former tends to be a more passive situation and makes it harder to adapt the rate of information to the processing needs of individual recipients.

As we noted previously,12 there is also no reason to expect that presenting text on a computer screen, in itself, would facilitate more efficient processing of information than presenting it as printed text. However, with hypertext, computers can present adjunctive information in a way that keeps the standard text relatively succinct while making the additional information readily available to those participants to whom it may apply.49–51 One of the studies included in the present review did employ hypertext presented on a computer screen so that the information could be organized under menus and submenus.35 No significant benefits of such presentation were found relative to presenting the information in a fixed serial format (via audiotape accompanied by a printed consent form). However, printed consent forms can also be scanned and read in a non-serial order. From an information processing perspective, the best use of hypertext may be to link to supplemental material so that the core (essential) material remains uncluttered. Computers also foster relatively seamless and efficient integration of text with audio/video components, and potentially allow for a more interactive consent process, which can lead to better attention and therefore better retention of information.

Similar considerations of the demands on limited working memory resources also explain the value and potential limits of bulleted text as a consent aid. Specifically, bullet points may facilitate comprehension because the relevant information is made salient, reducing the need to search through and process non-essential details to identify the relevant components. Three of the four studies employing PowerPoint reported positive effects.21,32,37 In the fourth study there were no differences between the PowerPoint and the comparison condition, but the latter was itself an enhanced consent procedure, albeit without multimedia, designed to make critical information more salient.25 Given the ubiquity of PowerPoint and similar computer slideshow software, as well as the ease and low cost of producing such presentations, these methods could be readily incorporated into standard consent procedures with little added cost or burden. On the other hand, such tools may be best employed as an adjunct to printed consent forms, as there is a balance between providing too much and too little detail: supporting text can provide contextual information that activates relevant prior knowledge or conceptual schemas in the reader’s working memory, which, as discussed further below, also facilitates efficient information processing.52,53

Even when inclusion of visual presentation is clearly preferable, however, the information processing demands of specific types of information may affect which form of visual presentation is most effective. There is strong evidence from studies of medical decision making that comprehension of risk and benefit probabilities is facilitated when they are communicated graphically rather than through spoken or printed words alone,54 but the type of graphic presentation is also important. Specifically, a number of studies of hypothetical health-care decisions indicate that understanding of risk ratios and other probabilistic information may be better achieved with icon arrays (pictographs) than with bar graphs.54–59 Pictographs appear to be superior in such contexts because they foster processing of key information about the relationship between the numerator and denominator, which people otherwise tend to process in suboptimal form (a.k.a. “denominator neglect”).60,61 [The reports in the present review generally did not provide sufficient detail to discern which specific forms of graphics (e.g., bar graphs, pictographs, or icon arrays) were employed.]
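The logic of countering denominator neglect is concrete enough to sketch: an icon array makes every member of the denominator visible, so a ratio such as “8 in 100” is perceived as eight marked figures among one hundred rather than as an abstract fraction. As a purely hypothetical illustration (not a tool from any of the reviewed studies), a minimal plain-text icon array generator might look like this:

```python
def icon_array(affected, total, per_row=10, hit="X", miss="o"):
    """Render a text icon array showing `affected` out of `total` cases.

    Drawing every member of the denominator explicitly is what
    distinguishes an icon array from a bare ratio, and is the feature
    thought to counter "denominator neglect."
    """
    if not 0 <= affected <= total:
        raise ValueError("affected must be between 0 and total")
    icons = [hit] * affected + [miss] * (total - affected)
    rows = [" ".join(icons[i:i + per_row]) for i in range(0, total, per_row)]
    return "\n".join(rows)

# E.g., communicating "8 of 100 participants experienced nausea":
print(icon_array(8, 100))
```

The 10-per-row layout and X/o symbols are arbitrary choices for this sketch; published decision aids typically use human-figure pictographs, but the numerator-within-denominator structure is the same.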

Another consideration in incorporating multimedia tools into the consent process is what specifically to communicate. Fifteen of the 20 reviewed studies employed multimedia tools to convey protocol-specific information, i.e., as an alternative way of communicating the information that would appear in a protocol-specific printed consent form. However, in five of the studies the multimedia presentation was used as a primer to teach potential participants about research concepts, such as randomized assignment, placebo control, the distinction between early- and later-phase trials, and/or the consent process itself.19,23,24,26,30 Four of these five studies found positive effects for the multimedia tool,23,24,26,30 and in the fifth study, in which subjects were given general information about HIV vaccine trials via video or an informational pamphlet, baseline knowledge increased in both conditions, and the videotape group had better 1-month retention.19 None of these studies specified a theoretical rationale for this intervention, but such findings make conceptual sense in relation to limited working memory/processing resources, particularly from the perspective of schema theories.62–66 “Schemas” (or schemata) are conceived of as mental structures, or organized bundles of knowledge and expectations, about specific types of objects or situations; these schemas guide and foster efficient information processing and response. In the context of research consent, having relevant knowledge and expectations about research concepts, methods, and terms, and about the consent process itself, should enable individuals to more rapidly discriminate essential from non-essential information and reduce the need to devote limited working memory resources to active processing. The increased efficiency should foster better comprehension and retention of the information.

Beyond the lack of theoretical grounding and a priori hypotheses, another difficulty in comparing outcomes across studies is the lack of a standard method for assessing the effectiveness of multimedia consent tools. Three studies25,29,37 used the MacArthur Competence Assessment Tool for Clinical Research (MacCAT-CR),67 but by far the most common outcome measures were self-administered questionnaires developed idiosyncratically for each specific enhanced consent study.19,20,23,24,26–28,31,34–36 The remaining studies used other semi-structured interviews18,30,32 or a questionnaire read aloud by the research staff.21,22 One study employed qualitative interviewing (which was consistent with the primary goals of that study but less well suited to drawing definitive conclusions about the effectiveness of multimedia tools).33 Common evaluation approaches would facilitate comparing observed effect sizes across independent studies.

One caveat should also be noted in regard to our quality ratings. The focus of the present review was on the effectiveness of multimedia consent tools in enhancing participant comprehension and, as reflected in the specific mSASQI item content (provided in Table 2 in Results), our assessments of methodology emphasized criteria deemed relevant to that particular focus. But many of the reviewed studies had multiple aims, and the methods of some studies may have been selected for the investigators’ other, perhaps more primary, aims. Thus, our ratings of quality should be read solely in the context of the goals of this review, rather than as a statement about the merits of individual studies in their own right.

These caveats notwithstanding, what stands out from the present review is that at least partial benefits in terms of improved comprehension were seen from multimedia presentation in 16 of the 20 reviewed studies. Thus, it appears multimedia consent tools often have at least partial utility in the consent process. This conclusion contrasts with that of a 2004 review by Flory and Emanuel,10 who noted at the time that multimedia tools “often failed to improve research participants’ understanding” (p. 1559), and with the 2007 review by Ryan et al.,11 who concluded that “The value of audio-visual interventions for people considering participating in clinical trials remains unclear” (p. 2). And yet, we agree with the spirit of both prior reviews in recommending further conceptually grounded and methodologically rigorous research to definitively identify the conditions under which multimedia has sufficient added value to warrant the production costs and burden. As described above, an information processing perspective, including the concept of “cognitive load,” offers a clear framework in which to ground this future work and make substantive progress in the design and evaluation of multimedia aids for the consent process. In the interim, and as noted above, use of bulleted summaries presented via PowerPoint or similar slideshow programs, along with corrective feedback, appears to be at least one low-cost, minimal-burden method that is readily available to enhance the consent process. There also seems to be clear value not only in teaching subjects about protocol specifics but also, in at least some cases, in priming that discussion with a brief overview of clinical research concepts.

Acknowledgments

This work was supported, in part, through grants from the National Institutes of Health (NIA AG028827 and NIMH MH064722 and MH097274).

REFERENCES

1. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research 1979; http://ohsr.od.nih.gov/guidelines/belmont.html. Accessed April 1, 2011. [Abstract]
2. Palmer BW. Informed consent for schizophrenia research: What is an investigator (or IRB) to do? Behavioral Sciences & the Law. 2006;24(4):447–452. [Abstract] [Google Scholar]
3. National Cancer Institute. Simplification of informed consent documents 1999; http://www.cancer.gov/clinicaltrials/understanding/simplification-of-informed-consent-docs/allpages. Accessed July 15, 2010.
4. Beardsley E, Jefford M, Mileshkin L. Longer consent forms for clinical trials compromise patient understanding: So why are they lengthening? J. Clin. Oncol. March 20 2007;25(9):e13–14. [Abstract] [Google Scholar]
5. LoVerde ME, Prochazka AV, Byyny RL. Research consent forms: continued unreadability and increasing length. J. Gen. Intern. Med. Sep-Oct 1989;4(5):410–412. [Abstract] [Google Scholar]
6. Berger O, Gronberg BH, Sand K, Kaasa S, Loge JH. The length of consent documents in oncological trials is doubled in twenty years. Ann. Oncol. February 2009;20(2):379–385. [Abstract] [Google Scholar]
7. Prentice KJ, Appelbaum PS, Conley RR, Carpenter WT. Maintaining informed consent validity during lengthy research protocols. IRB. Nov-Dec 2007;29(6):1–6. [Abstract] [Google Scholar]
8. Cohn E, Larson E. Improving participant comprehension in the informed consent process. J. Nurs. Scholarsh. 2007;39(3):273–280. [Abstract] [Google Scholar]
9. Dunn LB, Jeste DV. Enhancing informed consent for research and treatment. Neuropsychopharmacology. July 2001;24(6):595–607. [Abstract] [Google Scholar]
10. Flory J, Emanuel E. Interventions to improve research participants’ understanding in informed consent for research: a systematic review. JAMA. October 6 2004;292(13):1593–1601. [Abstract] [Google Scholar]
11. Ryan RE, Prictor MJ, McLaughlin KJ, Hill SJ. Audio-visual presentation of information for informed consent for participation in clinical trials. Cochrane Database Syst. Rev. 2008(1). [Abstract] [Google Scholar]
12. Henry J, Palmer BW, Palinkas L, Glorioso DK, Caligiuri MP, Jeste DV. Reformed consent: Adapting to new media and research participant preferences. IRB. Mar-Apr 2009;31(2):1–8. [Europe PMC free article] [Abstract] [Google Scholar]
13. Mayer RE. Multimedia learning. New York, NY: Cambridge University Press; 2001. [Google Scholar]
14. Dunn LB, Palmer BW, Keehan MK. Understanding of placebo controls among older people with schizophrenia. Schizophr. Bull. 2006;32(1):137–146. [Europe PMC free article] [Abstract] [Google Scholar]
15. Jimison HB, Sher PP, Appleyard R, LeVernois Y. The use of multimedia in the informed consent process. J. Am. Med. Inform. Assoc. May-Jun 1998;5(3):245–256. [Europe PMC free article] [Abstract] [Google Scholar]
16. Dunlop AL, Leroy ZC, Logue KM, Glanz K, Dunlop BW. Preconsent education about research processes improved African Americans’ willingness to participate in clinical research. J. Clin. Epidemiol. August 2011;64(8):872–877. [Europe PMC free article] [Abstract] [Google Scholar]
17. Jeste DV, Dunn LB, Folsom DP, Zisook D. Multimedia educational aids for improving consumer knowledge about illness management and treatment decisions: A review of randomized controlled trials. J. Psychiatr. Res. January 2008;42(1):1–21. [Abstract] [Google Scholar]
18. Benson PR, Roth LH, Appelbaum PS, Lidz CW, Winslade WJ. Information disclosure, subject understanding, and informed consent in psychiatric research. Law Hum. Behav. 1988:455–475. [Abstract] [Google Scholar]
19. Fureman I, Meyers K, McLellan AT, Metzger D, Woody G. Evaluation of a video-supplement to informed consent: injection drug users and preventive HIV vaccine efficacy trials. AIDS Educ. Prev. August 1997;9(4):330–341. [Abstract] [Google Scholar]
20. Weston J, Hannah M, Downes J. Evaluating the benefits of a patient information video during the informed consent process. Patient Educ. Couns. March 1997;30(3):239–245. [Abstract] [Google Scholar]
21. Dunn LB, Lindamer LA, Palmer BW, Golshan S, Schneiderman LJ, Jeste DV. Improving understanding of research consent in middle-aged and elderly patients with psychotic disorders. Am. J. Geriatr. Psychiatry. Mar-Apr 2002;10(2):142–150. [Abstract] [Google Scholar]
22. Campbell FA, Goldman BD, Boccia ML, Skinner M. The effect of format modifications and reading comprehension on recall of informed consent information by low-income parents: a comparison of print, video, and computer-based presentations. Patient Educ. Couns. May 2004;53(2):205–216. [Abstract] [Google Scholar]
23. Wirshing DA, Sergi MJ, Mintz J. A videotape intervention to enhance the informed consent process for medical and psychiatric treatment research. Am. J. Psychiatry. January 2005;162(1):186–188. [Abstract] [Google Scholar]
24. Hutchison C, Cowan C, McMahon T, Paul J. A randomised controlled study of an audiovisual patient information intervention on informed consent and recruitment to cancer clinical trials. Br. J. Cancer. 2007;97(6):705–711. [Europe PMC free article] [Abstract] [Google Scholar]
25. Mittal D, Palmer BW, Dunn LB, et al. Comparison of two enhanced consent procedures for patients with mild Alzheimer disease or mild cognitive impairment. Am. J. Geriatr. Psychiatry. February 2007;15(2):163–167. [Abstract] [Google Scholar]
26. Strevel EL, Newman C, Pond GR, MacLean M, Siu LL. The impact of an educational DVD on cancer patients considering participation in a phase I clinical trial. Support. Care Cancer. July 2007;15(7):829–840. [Abstract] [Google Scholar]
27. Bickmore TW, Pfeifer LM, Paasche-Orlow MK. Using computer agents to explain medical documents to patients with low health literacy. Patient Educ. Couns. June 2009;75(3):315–320. [Europe PMC free article] [Abstract] [Google Scholar]
28. Hultgren B, Zaghi S, Carvas M, Nascimento B, Kwiatkowski J, Fregni F. Challenges in consenting subjects for studies with brain stimulation: feasibility of multimedia video use during the informed consent process. Brain Stimulation. 2009;2(3):174–178. [Abstract] [Google Scholar]
29. Jeste DV, Palmer BW, Golshan S, et al. Multimedia consent for research in people with schizophrenia and normal subjects: A randomized controlled trial. Schizophr. Bull. July 2009;35(4):719–729. [Europe PMC free article] [Abstract] [Google Scholar]
30. Kass NE, Sugarman J, Medley AM, et al. An intervention to improve cancer patients’ understanding of early-phase clinical trials. IRB: Ethics & Human Research. 2009;31(3):1–10. [Europe PMC free article] [Abstract] [Google Scholar]
31. Karunaratne AS, Korenman SG, Thomas SL, Myles PS, Komesaroff PA. Improving communication when seeking informed consent: a randomised controlled study of a computer-based method for providing information to prospective clinical trial participants. Med. J. Aust. April 5 2010;192(7):388–392. [Abstract] [Google Scholar]
32. O’Lonergan TA, Forster-Harwood JE. Novel approach to parental permission and child assent for research: improving comprehension. Pediatrics. May 2011;127(5):917–924. [Europe PMC free article] [Abstract] [Google Scholar]
33. McGraw SA, Wood-Nutter CA, Solomon MZ, Maschke KJ, Benson JT, Irwin DE. Clarity and appeal of a multimedia informed consent tool for biobanking. IRB. Jan-Feb 2012;34(1):9–19. [Abstract] [Google Scholar]
34. Norris DR, Phillips MR. Using instructive videotapes to increase patient comprehension of informed consent. Journal of Clinical Research and Pharmacoepidemiology. 1990;4(4):263–268. [Google Scholar]
35. Llewellyn-Thomas HA, Thiel EC, Sem FW, Woermke DE. Presenting clinical trial information: a comparison of methods. Patient Educ. Couns. May 1995;25(2):97–107. [Abstract] [Google Scholar]
36. Agre P, Rapkin B. Improving informed consent: a comparison of four consent tools. IRB. Nov-Dec 2003;25(6):1–7. [Abstract] [Google Scholar]
37. Moser DJ, Reese RL, Hey CT, et al. Using a brief intervention to improve decisional capacity in schizophrenia research. Schizophr. Bull. 2006;32(1):116–120. [Europe PMC free article] [Abstract] [Google Scholar]
38. Baddeley AD. Working memory, thought, and action. New York: Oxford University Press; 2007. [Google Scholar]
39. Rosenblueth A, Wiener N. The Role of Models in Science. Philosophy of Science. 1945;12(4):316–321. [Google Scholar]
40. Mayer RE. Applying the science of learning: evidence-based principles for the design of multimedia instruction. Am. Psychol. November 2008;63(8):760–769. [Abstract] [Google Scholar]
41. Baddeley AD, Hitch G. Working memory In: Bower GA, ed. The psychology of learning and motivation. Vol 8 New York: Academic Press; 1974:47–89. [Google Scholar]
42. Larkin JH, Simon HA. Why a diagram is (sometimes) worth ten thousand words. Cognitive Science. 1987;11(1):65–100. [Google Scholar]
43. Wallace DS, West SC, Ware A, Dansereau DF. The effect of knowledge maps that incorporate gestalt principles on learning. Journal of Experimental Education. Fal 1998;67(1):5–16. [Google Scholar]
44. Tergan S-O, Gräber W, Neumann A. Mapping and managing knowledge and information in resource-based learning. Innovations in Education and Teaching International. November 2006;43(4):327–336. [Google Scholar]
45. Anglin GJ, Vaez H, Cunningham KL. Visual representations and learning: The role of static and animated graphics. In: Jonassen DH, ed. Handbook of Research on Educational Communications and Technology. 2nd ed. Mahwah, NJ: Lawrence Erlbaum Associates; 2004:865–916. [Google Scholar]
46. Mayer RE, Moreno R. Nine Ways to Reduce Cognitive Load in Multimedia Learning. Educational Psychologist. 2003;38(1):43–52. [Google Scholar]
47. Baddeley AD. The magical number seven: Still magic after all these years? Psychological Review. Special Issue: The centennial issue of the Psychological Review. April 1994;101(2):353–356. [Abstract] [Google Scholar]
48. Miller GA. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychol. Rev. March 1956;63(2):81–97. [Abstract] [Google Scholar]
49. Brusilovsky P. Methods and techniques of adaptive hypermedia. In: Brusilovsky P, Kobsa A, Vassileva J, eds. Adaptive hypertext and hypermedia. Boston, MA: Kluwer Academic; 1998:1–43. [Google Scholar]
50. Brusilovsky P, Kobsa A, Vassileva J, eds. Adaptive hypertext and hypermedia. Boston, MA: Kluwer Academic; 1998. [Google Scholar]
51. Spallek H. Adaptive hypermedia: A new paradigm for educational software. Adv. Dent. Res. December 2003;17(1):38–42. [Abstract] [Google Scholar]
52. Bradshaw GL. Multimedia textbooks and student learning. MERLOT Journal of Online Learning and Teaching. 2005;1(2):1–10. [Google Scholar]
53. Bransford JD, Johnson MK. Contextual prerequisites for understanding: Some investigations of comprehension and recall. Journal of Verbal Learning and Verbal Behavior. 1972;11(6):717–726. [Google Scholar]
54. Fagerlin A, Ubel PA, Smith DM, Zikmund-Fisher BJ. Making numbers matter: present and future research in risk communication. Am. J. Health Behav. Sep-Oct 2007;31 Suppl 1:S47–56. [Abstract] [Google Scholar]
55. Hawley ST, Zikmund-Fisher B, Ubel P, Jancovic A, Lucas T, Fagerlin A. The impact of the format of graphical presentation on health-related knowledge and treatment choices. Patient Educ. Couns. December 2008;73(3):448–455. [Abstract] [Google Scholar]
56. Zikmund-Fisher BJ, Fagerlin A, Ubel PA. Improving understanding of adjuvant therapy options by using simpler risk graphics. Cancer. 2008;113(12):3382–3390. [Europe PMC free article] [Abstract] [Google Scholar]
57. Fagerlin A, Wang C, Ubel PA. Reducing the influence of anecdotal reasoning on people’s health care decisions: is a picture worth a thousand statistics? Med. Decis. Making. Jul-Aug 2005;25(4):398–405. [Abstract] [Google Scholar]
58. Zikmund-Fisher BJ, Ubel PA, Smith DM, et al. Communicating side effect risks in a tamoxifen prophylaxis decision aid: The debiasing influence of pictographs. Patient Educ. Couns. 2008;73(2):209–214. [Europe PMC free article] [Abstract] [Google Scholar]
59. Tait AR, Voepel-Lewis T, Zikmund-Fisher BJ, Fagerlin A. The effect of format on parents’ understanding of the risks and benefits of clinical research: a comparison between text, tables, and graphics. J. Health Commun. July 2010;15(5):487–501. [Europe PMC free article] [Abstract] [Google Scholar]
60. Reyna VF, Brainerd CJ. Numeracy, ratio bias, and denominator neglect in judgments of risk and probability. Learning and Individual Differences. 2008;18(1):89–107. [Google Scholar]
61. Garcia-Retamero R, Galesic M, Gigerenzer G. Do Icon Arrays Help Reduce Denominator Neglect? Med. Decis. Making. May 18 2010. [Abstract] [Google Scholar]
62. Marshall SP. Schemas in problem solving. New York, NY: Cambridge University Press; 1995. [Google Scholar]
63. Minsky M. A framework for representing knowledge. In: Winston PH, ed. The Psychology of Computer Vision. New York: McGraw-Hill; 1975:211–277. [Google Scholar]
64. Norman DA, Shallice T. Attention to Action: Willed and automatic control of behavior. La Jolla, CA: Center for Human Information Processing, University of California, San Diego; 1980. Report No. 8006. [Google Scholar]
65. Rumelhart DE, Ortony A. The representation of knowledge in memory. In: Anderson RC, Spiro RJ, Montague WE, eds. Schooling and the acquisition of knowledge. New York: Lawrence Erlbaum Associates, Publishers; 1977:99–135. [Google Scholar]
66. Schank RC. What’s a Schema Anyway? Contemporary psychology: APA Review of Books. 1980;25(10):814–816. [Google Scholar]
67. Appelbaum PS, Grisso T. MacCAT-CR: MacArthur Competence Assessment Tool for Clinical Research. Sarasota, FL: Professional Resource Press; 2001. [Google Scholar]
68. Hutchison C, McCreaddie M. The process of developing audiovisual patient information: challenges and opportunities. J. Clin. Nurs. 2007;16(11):2047–2055. [Abstract] [Google Scholar]
69. Bandura A. Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall, Inc.; 1986. [Google Scholar]
