Oil spill stakeholders, including decision makers and other groups, have expressed concerns about and questioned the use of dispersants and other non-mechanical response options for years. Concerns in past decades were primarily ecological, but during the Deepwater Horizon oil spill some individuals and communities in the Gulf of Mexico states also articulated perceptions of public health risks associated with the use of dispersants. Effective risk communication is essential to manage the potential risks associated with oil spills. Stakeholders concerned about risks want or need information in the form of communications products, such as guides or briefs. Because people process new information within the context of their existing beliefs, such communication products are likely to be more effective and useful for their intended audiences if they are designed to: (1) take into account the communication recipients' existing beliefs; and (2) directly address the decisions/judgments faced by recipients by providing them with the information they want and need to make those decisions. Stakeholder engagement is essential to learn about risk perceptions, to learn about what information stakeholders want and need to make decisions effectively, and to develop communication products to that end.

This paper builds upon a mental models approach to dispersant risk communications research from the 1990s. It describes and presents results from an industry-government collaborative project to develop risk-based tools for communicating about dispersants to local-level stakeholders, among other audiences. The project includes an expanded science-informed dispersant decision model, two stakeholder open houses, and two surveys (interactive and online) to gather data-driven insights about local stakeholder knowledge and understanding of dispersants, as well as their perceptions of the risks and benefits of dispersant use during a spill relative to other response options. The surveys were distributed at two open houses for local stakeholders on the Eastern Shore of Virginia (Wallops Island) and in the Pacific Northwest (Port Townsend, WA). Both open houses were co-sponsored with USCG-led Area Committees. The Virginia workshop was also co-sponsored by The Nature Conservancy and the Virginia Shorekeeper. It is expected that the surveys may be incorporated into future meetings and open houses involving stakeholders at any level, e.g., local, state, regional or national. The data from the surveys can guide the improvement of future communication efforts about dispersants, as well as support enhanced stakeholder engagement during preparedness and response.

One approach to assessing oil spill response decisions is to ask those who have been making such decisions for over a decade, long enough to have developed true expertise in the topic (Ericsson 2009; Ericsson et al 1993a and b). Both who to consult and how to consult them can shape the results (see, e.g., Pidgeon and Fischhoff 2011; Wood et al 2012). Because oil spill response is by nature a multidisciplinary problem, this project convened a diverse group of scientists and practitioners experienced with the behavior and effects of oil and response technologies in oil spills, as described below.

To constrain the results to information that would inform oil spill response choices, scientists and senior spill response experts were consulted in a workshop to revise a qualitative decision model developed in a similar workshop 15 years earlier. The focus of that initial model was assessing the ecological effects of dispersant use on marine oil spills. For this project the decision problem was expanded to assess a broader range of response options, e.g., controlled in-situ burning (ISB), and to include other potential harms from oil spills and oil spill response, including social, economic, and human health risks as well as ecological risks.

The decision modeling problem can be described as characterizing the structure (attributes) of the decision problem, including the choices that exist (options), the factors that inform those choices, any additional factors that influence choice outcomes, and the potential outcomes. In this project, the strategy was to develop an influence diagram (Howard 1989; Morgan et al 2002). A quantified influence diagram would include an assessment of the relative desirability and contingent likelihood of the possible outcomes, depending on choices and other influences. Risk assessment is an implicit element of risk decision modeling.
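To make this structure concrete, the following minimal Python sketch represents an influence diagram as decision, chance, and value nodes linked by their influences. The node names are illustrative placeholders loosely echoing the model categories discussed later, not the project's actual model; a quantified version would attach probabilities and desirabilities as the text describes.

```python
# A minimal sketch of an influence diagram for a response decision.
# Node names are hypothetical stand-ins, not the project's actual model.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str  # "decision", "chance", or "value"
    parents: list = field(default_factory=list)  # influences on this node

# Decision node: the response options under consideration.
response = Node("response_option", "decision")

# Chance nodes: factors informing the choice and influencing outcomes.
oil_type = Node("initial_oil_characteristics", "chance")
sea_state = Node("physical_environmental_conditions", "chance")
fate = Node("fate_and_transport", "chance", parents=[oil_type, sea_state, response])

# Value node: the potential outcomes whose relative desirability is assessed.
impacts = Node("ecological_and_health_impacts", "value", parents=[fate, response])

# A quantified diagram would add conditional probability tables to chance
# nodes and (relative) desirabilities to value nodes.
for node in (response, oil_type, sea_state, fate, impacts):
    print(node.name, "<-", [p.name for p in node.parents])
```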

Risk assessment includes characterizing the potential harm resulting from exposures to hazards and estimating the probability that the harm will be realized (i.e., risk), along with the uncertainties in measurements, models, and methods of risk analysis (NRC 1989). While risk assessment may be regarded by some as a purely technical exercise because it entails evaluating harm, the outcomes of this often qualitative and technical process are determined in part by the perceptions of those assessing the risk, and must be communicated with a wide variety of decision makers and potentially affected parties. Perceptions of harm and risk differ from person to person due to differences in individual values and circumstances as well as differences in exposure and vulnerability. It follows that risk management depends on both effective risk assessment and effective communication of that assessment among all stakeholders (e.g., the risk managers, assessors, and analysts as well as those at risk in the community).

Effective risk communication includes, but is not limited to, the exchange of information about the nature of risk and risk management (NRC 1989). The potential recipients of communications about oil spills and dispersants, as well as other response options, include at least five key stakeholder groups: (1) Decision-makers for response options/dispersants: USCG Federal On-scene Coordinator (FOSC), Responsible Party Incident Commander, State OSC and Regional Response Team (RRT) representatives; Federally-recognized tribal resource trustees; and US EPA, DOI, and DOC/NOAA RRT and resource trustee representatives; (2) Elected officials/politicians and their environmental staff; (3) Resource users: commercial and recreational fishermen, shellfish growers, businesses dependent upon beach tourism, including eco-tours; (4) Local authorities: county officials representing emergency management, planning, health departments, harbor and waterways management, among others; and (5) Non-governmental organizations/advocacy groups: The Nature Conservancy, Waterkeeper Alliance, Pew Environmental Trust (representatives attended the Wallops Island workshop), and others.

Hence it is important not only to inform the oil spill response decision model with the best science, but also to understand the perceptions, knowledge and needs of these stakeholders. It is apparent from the list of stakeholders above that they likely face decisions ranging from whether or not to use dispersants, to close fisheries, or to support pre-spill authorizations for non-mechanical options, to resource allocation and planning investment decisions. Development of the survey tools described here included surveys of local-level oil spill response stakeholders at two open houses in Virginia and Washington. These local stakeholders typically operate outside the traditional oil spill preparedness activities, e.g., Area Committees and Regional Response Teams, because their normal activities and missions exclude oil pollution.

To describe and predict stakeholder beliefs, knowledge and needs for oil spill response information in a way that is both representative and generalizable is challenging. As Figure 1 illustrates, both measurement (how perceptions, knowledge and decision needs are measured) and sampling errors (which stakeholders are asked about their perceptions, knowledge and needs) can affect the results.

Figure 1.

Lavrakas' schematic for the Total Error Framework, from Lavrakas (2013).

Mental models of hazardous processes include ideas people have about identifying a risk, exposure to the risk, the effects of exposure, how to mitigate the risk, and how the risk unfolds in time. Mental models research strategies of the type pursued here follow a few related but methodologically distinct steps (Bostrom 1992; Bostrom et al 1995; de Bruin and Bostrom 2013; Morgan et al 2002). These generally include: (1) developing a state-of-the-art, science-informed decision model to address the target decision (also sometimes called an "expert decision model"); (2) eliciting causal beliefs and decision strategies from the parties for whom any communications are intended (i.e., the target audience), commonly through semi-structured individual mental models interviews, then content-analyzing the interview results with a coding scheme based on, but not limited to, the expert decision model; (3) testing the reliability of findings regarding decision-support opportunities (that is, the overlaps, gaps, and differences identified in step two) by developing and implementing a follow-on survey with a representative sample of the target audience; (4) designing communications to address the results of step three; and (5) evaluating the effectiveness of the resulting communications, preferably experimentally with a probability sample of the target audience, and revising the communications as warranted.
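As a minimal illustration of the comparison underlying steps two and three, the following Python sketch tallies overlaps, gaps, and lay-only concepts between an expert model and coded interview mentions; all concept labels here are hypothetical, not actual coding results from this project.

```python
# A minimal sketch of the expert-vs-lay gap analysis in step (2);
# all concept labels are hypothetical.
expert_concepts = {"dispersion", "dissolution", "evaporation", "toxicity"}

# Coded concepts mentioned in each (hypothetical) lay interview.
interviews = [
    {"evaporation", "toxicity", "oil sinks"},
    {"dispersion", "toxicity"},
]

mentioned = set().union(*interviews)
overlap = expert_concepts & mentioned   # shared with the expert model
gaps = expert_concepts - mentioned      # expert concepts never raised
extras = mentioned - expert_concepts    # lay beliefs outside the model

print("overlap:", overlap)
print("gaps:", gaps)
print("lay-only (candidate misconceptions):", extras)
```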

This mental models research method is designed to identify what content would be useful in risk communications by comparing differences in expert and lay person mental models. Designing communications—and science communications more generally—involves many other decisions, including wording, format, framing, and media decisions, among others (e.g., Fischhoff 2007, 2013). While mental models research can provide insights to inform some of these, empirical evaluation of any resulting communications products or processes is important to verify that communications were effective in meeting lay person information needs (see, e.g., de Bruin and Bostrom 2013; Morgan et al, 2002). The survey items developed in steps one through three above comprise one tool with which to evaluate communications products and processes, such as the Area Committee (AC) and RRT meetings that are held nationwide.

As noted above, three interactive meetings informed the development of the survey toolkit: the August 2012 workshop organized by SEA at NOAA in Seattle, and two open houses to engage local partners in oil spill response, the Wallops Island workshop and the Port Townsend workshop. For each of the two local oil spill response open houses, participants who registered in advance received an online questionnaire about a week before the workshop (pre-workshop). All workshop participants were sent a link to the same online questionnaire after the workshop (post-workshop). These activities were funded by API under the scope of the Joint Industry Oil Spill Preparedness and Response Task Force (JITF) D1 Workgroup (Dispersant Communications).

At the beginning of each of the open houses, several questions were asked interactively using audience response ("clicker") systems. Participants each received a "clicker" device on which they could respond to questions; these systems tabulated responses and displayed the aggregated responses immediately after the 30-second response period, so that the audience saw in real time how responses were distributed, and how their personal responses compared to other participants' responses. Audience response systems tend to increase participants' engagement, improve peer discussions, and may facilitate learning (Nelson et al 2012; Smith et al 2012). The questions for the interactive survey were selected such that some had also been asked previously; all were central to the oil spill response and dispersant topics being addressed at the workshop, and were derived from and represented key elements of the oil spill response decision model.
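For readers unfamiliar with such systems, the tabulate-and-display step can be sketched in a few lines of Python; the answer options and responses below are invented for illustration.

```python
# A minimal sketch of what an audience response system does with one question:
# tabulate individual "clicker" responses and display the aggregate
# distribution after the response window closes. Options are invented.
from collections import Counter

options = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]
responses = ["Agree", "Agree", "Disagree", "Strongly agree", "Agree"]

counts = Counter(responses)
total = len(responses)
for option in options:
    share = 100 * counts[option] / total
    print(f"{option:<20} {share:5.1f}%  {'#' * counts[option]}")
```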

The pre-, post- and interactive surveys assess participating stakeholders' beliefs with regard to oil spills and oil spill response decisions. Analysis of the post-workshop survey in comparison to the pre- and interactive surveys can provide a measure of how the workshop might have affected these beliefs. The pre- and post-surveys were conducted online. For the Wallops Island workshop surveys, the lead authors on this report shortened, modified and split the survey instrument developed in an earlier SEA mental models study (Bostrom et al 1995, 1996) into online and interactive sections, assisted by several oil spill scientists and experts, some of whom had worked on the expert decision model in the earlier Marine Spill Response Corporation-funded study.1 For the Port Townsend workshop, these survey instruments were further refined with direct input from oil spill scientists and experts at the SEA oil spill expert decision modeling workshop in August 2012, with the additional assistance of a survey research methodologist.2 While it would have been preferable to conduct another full mental models study to determine how best to revise the questionnaire, this was not an option at the time, due to resource and time constraints.
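One simple way to analyze such a pre/post comparison for a single survey item is a chi-square test of the two answer distributions, sketched below with invented counts; this is an illustration, not the analysis reported here.

```python
# A minimal sketch of comparing pre- and post-workshop answer distributions
# for one survey item with a chi-square test; counts are invented.
from scipy.stats import chi2_contingency

# Rows: pre- and post-workshop; columns: answer categories for one item.
table = [
    [12, 20, 8],   # pre-workshop counts
    [25, 10, 5],   # post-workshop counts
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```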

Decision Model development in the SEA workshop in Seattle, August 2012

The decision-focused expert model from Bostrom et al (1995, 1996) included ten categories of factors affecting decisions regarding the use of dispersants on oil spills and the ecological effects thereof: Time, initial oil (characteristics), physical and environmental conditions, logistics, response options, restoration and rehabilitation options, trajectory, monitoring options, fate and transport, and impacts. The model was hierarchical, with considerably more detail in some categories (e.g., impacts) than others. An extensive national survey of over 160 oil spill response decision makers and stakeholders, developed from the expert model and mental models interviews, was conducted in the 1990s and demonstrated diverse understandings of fate and transport processes, and in particular of dispersion and dissolution (Bostrom et al 1996).

The SEA August 2012 workshop included 16 oil spill scientists and experts, with on average over 20 years of experience in the field, ranging from 5 to 48 years. Ten of the 16 held doctorates in pertinent sciences such as ecotoxicology. Participants engaged in extensive discussion of what information might improve oil spill response decisions and of the effects of possible response choices, and revised and expanded the decision model. Following a presentation from the survey methodologist2 on considerations for survey research (illustrated in Figure 1), with a focus on survey design issues including question wording problems and the effects of common response scale designs, participants also responded to, assessed, and commented on how to improve the online and interactive survey instruments used in the Wallops Island workshop earlier that year.

Wallops Island Open House – February 16, 2012

The oil spill dispersants open house on the Eastern Shore (DELMARVA Peninsula) was held at the Marine Science Consortium, Wallops Island, VA. The Nature Conservancy (TNC) owns approximately 80 percent of the ocean shorelines of Virginia's Eastern Shore. This workshop was an AC preparedness activity sponsored by the US Coast Guard Sector Commander of Hampton Roads, CAPT Mark Ogle, in response to requests from TNC and other local partners to learn more about dispersants. The objective was to directly engage the kind of local stakeholders who had expressed great concern over dispersant use in the Gulf. The overall goal was to exchange information with local stakeholders in this ecologically sensitive area, develop relationships with them, and strengthen relationships among oil spill technical specialists.

Over 100 individuals participated in this workshop from: local government (including emergency management, mayor, planning department, emergency medical services and rural health department, harbormaster, public works); environmental groups (The Nature Conservancy, Pew Environmental Trust, Sierra Club, Maryland Waterkeeper and Virginia Shorekeeper, Citizens for a Better Eastern Shore); the fishing community (sport and recreational fishermen, clam and oyster aquaculture, finfish and offshore clams); academic researchers and extension agents (College of William & Mary/Virginia Institute of Marine Science, University of Virginia); state agency representatives from Virginia, Maryland and Delaware (both RRT representatives and others, e.g., Virginia Dept. of Health Shellfish Sanitation); federal agency representatives from the US Coast Guard (Sector Hampton Roads, Atlantic Strike Team, and HQ Coordination and Outreach Division), US EPA (RRT co-chair and research ecologist from the National Risk Management Research Laboratory, Office of Research and Development), NOAA (SSCs and DWH Operational Science Advisory Team eco-toxicologist) and the Department of the Interior (pollution coordinators from Fish and Wildlife Service and National Park Service and National Refuge managers); industry (ExxonMobil, BP, and their consultants); Tri-state Bird Rescue; and Congress (staff from Virginia Congressman Scott Rigell's office and the Government Accountability Office Assistant Director, Natural Resources and Environment, who was writing a report for Congress on dispersants).

The meeting opened with remarks by CAPT Ogle, Steve Parker of TNC, and David Burden, the Virginia Shorekeeper, followed by an interactive survey using local USCG audience response system equipment. This was followed by four 20-minute presentations related to dispersants, then two hours of "world café" style engagement (Brown and Isaacs 2005; Fullarton and Palermo 2008), in which participants engage in small group discussions at each of several topically focused information stations, in turn. The world café format here adopted elements of a science café, allowing participants one-on-one Q&A (an open house format of risk communication) with technical specialists in small groups and supporting deliberative risk communication, both of which are central to effective risk communication and management practices (NRC 1996; NRC 1989). Approximately 30 government and industry response technical specialists, augmented by scientists from the FDA Division of Seafood Science and Technology/Chemical Hazards Branch and the CDC/Agency for Toxic Substances and Disease Registry, staffed nine world café information stations. Following the open house wrap-up, participants were given a written form to evaluate the logistics and organization of the open house, which they completed before leaving. Participants were subsequently asked via email to complete the online evaluation survey a second time (post-open house).

Port Townsend Open House – November 1, 2012

SEA organized this open house by working with a Coordinating Committee of agency representatives from the Region X RRT and NW Area Committee. The Coordinating Committee was chaired by Heather Parker, the USCG RRT Coordinator and District 13 Scientific Advisor, and comprised the D1 JITF representative, USCG, NOAA, EPA, Washington DoE, DOI, and ATSDR. The Coordinating Committee suggested the location, helped identify the participants, and provided input on the agenda and speakers to assure the event would be mutually beneficial. The purpose of the Port Townsend open house was to facilitate and improve knowledge-based communications (engagement and education) with key local stakeholders (e.g., NGOs, elected officials, fishermen, academics, health leaders) about oil spill response in sensitive coastal environments, response options, and surface applications of dispersants. The Coordinating Committee specifically requested expanding the focus beyond dispersants to include mechanical recovery and ISB, and holding an abbreviated session in the evening for those who could only attend after work. Similar to the Wallops Island open house, this open house combined an interactive survey, speakers, and ample opportunity for participants to have one-on-one discussions with subject matter experts at information stations, in a world café type arrangement.

A pre-event survey was emailed to registered participants in advance of the event. The 79 local stakeholders and 29 oil spill response practitioners who attended the open house represented elected and appointed officials from multiple counties and commissions, several NGOs, several tribes, local government agencies, public health (state and local level), and fisheries (tribal, commercial/aquaculture, recreational). Two sessions were held on November 1, 2012: from 0900-1500 and from 1800-2100. The post-event survey was emailed to participants on November 9, 2012.

Key products from the development process are the oil spill science-informed decision model, the online and interactive surveys for the open houses, and the analysis of the survey results. While space constraints preclude including the entire detailed decision model or survey instruments, the top level view of the decision model and illustrative questions from the surveys are presented in this section along with descriptive analyses of the survey data.

Expert model and survey responses.

As illustrated in Figure 2, independent responses from experts at the August 2012 workshop did not always agree. Given the diverse scientific specialties of the experts, this is not surprising. However, high levels of agreement are evident on most of the survey items.

Figure 2.

Expert workshop participant responses (independent individual responses, paper survey copies), August 2012

The revised oil spill response decision model is hierarchical (see Figure 3 below) and includes seven categories: Time (oil age, time to shoreline/impact); Initial Oil (dispersibility); Physical and Environmental Conditions; Fate and Transport Processes; Logistics; Response Options (best management practices); and Impacts (of spill and response). As noted above, this model includes a full range of response options, from doing nothing to mechanical cleanup, dispersant use, and controlled burning, as well as a broader set of potential impacts, including public health effects.

Figure 3.

Oil spill response decision model, top-level. Detailed model available on request.

Interactive “clicker” surveys

Figures 4 and 5 illustrate clicker survey questions and responses. The clicker surveys appeared to effectively generate engagement, achieving very high participation rates by attendees (76 from Wallops Island and 67 from Port Townsend, or about a 70 percent response rate).

Figure 4.

Clicker survey sample results from Port Townsend open house, November 2012. Responses are shown in real time, after participants are given 30 seconds to select their responses.

Figure 5.

Oil budget estimation task included in the interactive survey.

By including questions from across the decision model, the interactive survey defines a common, coherent framework for a technical discussion of oil spill response decisions at the open houses or other meetings.

Key findings from pre- and post-workshop surveys

Response rates were lower for the pre- and post-open house surveys than for the on-site interactive survey, and stakeholder characteristics differed between pre- and post-workshop survey respondents (i.e., apparently most attendees answered one or the other online survey, not both; confidentiality considerations precluded unambiguous matching of pre-interactive-post responses for both open houses). The differences in answers between pre- and post-surveys illustrated in Figure 6 suggest that the workshop resulted in shifts toward the areas of agreement indicated by the decision modeling workshop participants (i.e., greater agreement with subject matter experts). However, these results do not control for individual respondent characteristics, such as professional identity (i.e., whether the respondent was from a federal agency or from industry), which our previous mental models research indicated was correlated with knowledge about oil spills and response. The overlap between those who responded to the pre- and post-open house surveys is very small in this study, and pre- and post-respondents were not similarly representative of the mix of stakeholders attending. Hence the data do not unambiguously support reliable conclusions about the effects of the open houses on attendees' beliefs and judgments about oil spills and oil spill response decisions.
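Were matched responses available, controlling for a characteristic such as professional identity could take the form of a logistic regression, as in the following sketch with invented data (assuming the pandas and statsmodels libraries; this is an illustration, not an analysis performed in this project).

```python
# A minimal sketch, with invented data, of controlling for a respondent
# characteristic (federal agency vs. industry affiliation) when testing
# for a pre/post shift in agreement with subject matter experts.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "agrees_with_experts": [1, 0, 1, 1, 0, 1, 1, 0, 1, 1],
    "post_workshop":       [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
    "federal_agency":      [1, 0, 1, 0, 1, 1, 0, 1, 0, 1],
})

# Does the post-workshop indicator still predict agreement once
# affiliation is included in the model?
model = smf.logit("agrees_with_experts ~ post_workshop + federal_agency", data=df)
print(model.fit(disp=False).params)
```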

Figure 6.

Pre- and post-open house survey results, Wallops Island

Evaluating risk communication efforts is essential (de Bruin and Bostrom 2013; Fischhoff 2013; Morgan et al 2002; NRC 1989) to assess the degree to which the efforts address gaps and inconsistencies in understanding of risk assessment and decision processes, and evaluations should inform improvements in such efforts. The updated decision model is important for establishing a basis to compare expert and lay mental models about response options, including dispersants, to inform oil spill risk communications going forward. The online and interactive survey results showed differences in expert and lay person understanding, providing data to guide future risk communication efforts, both written products and engagement opportunities such as open houses or other interactive meetings. The online surveys before and after the two open houses described here did not provide clear evidence of the open houses significantly influencing participants' knowledge or attitudes, probably due to the following implementation factors:

  • The pre- and post- online surveys were deployed following some standard survey research practices, but not all recommended practices, such as issuing repeated personalized invitations to participate in the survey (e.g., Dillman's Tailored Design Method, which addresses sources of error identified in Figure 1; Dillman 2000), in part due to the novelty of this evaluation strategy for oil spill response-related open houses, and in part due to resource constraints.

  • Respondents filled out a paper evaluation of open house logistics at the end of each open house, which they may have construed as meaning that they had completed their open house evaluation and so did not need to complete the online post-open house survey. Further, the post-open house survey was sent after a delay, which is recommended to allow new information to be processed and retained, but the delay was longer than ideal, which may have further depressed response.

  • Pre- and post- online surveys were noticeably longer than the interactive survey conducted at the open houses. Survey length can reduce response rates.

These observations suggest two recommendations:

  • (1)

    Short interactive “clicker” survey activities at the outset and conclusion of open houses appear more likely than pre- and post-open house surveys conducted online to provide useful data evaluating what participants are taking away from the open house at the end of the day.

  • (2)

    A small set of questions selected strategically from the survey toolkit developed in this project can provide insight on relevant causal beliefs and trigger productive discussions of oil spill response science and decisions.

Online pre-workshop surveys can provide information to design risk communications for both traditional written products and engagement opportunities, such as meetings with stakeholders. Pre- and post- surveys might have better response rates if they are implemented according to survey research practices described in Dillman (2000) and discussed in Lavrakas (2013). Post-workshop surveys may be useful, but the experiences here suggest that they require very careful implementation, with incentives and reminders such as those described in Dillman (2000) to encourage a representative sample of workshop participants to respond. One improvement would be having the FOSCs, recognized leaders in oil spill preparedness and response, send the reminders rather than consultant staff.

The set of survey tools discussed here has the potential to provide useful information for oil spill response risk communications, but the use of online survey tools to generate evaluative data about the success of communications would need to be modified going forward. Nevertheless, the set of survey tools successfully provides data to identify differences between the information experts deem important for response decisions and the information in lay mental models. These can be used to guide future written products and interactive engagement with oil spill stakeholders.

Last, comments from participants about the interactive survey and open houses were positive, even enthusiastic. One sea clam fisherman said, "Every fisherman should come to one of these!" TNC commented, "It was a terrific workshop with a level of local participation and focus that is very rare here. There is obviously a lot of interest and concern. Everyone I spoke to called it a real eye opener and said they learned a great deal." At the end of the Virginia open house, the FOSC asked participants to tell him if they would stand behind his decision to use dispersants in the scenario discussed; they indicated with a show of hands that they would. In Port Townsend, WA, some participants indicated they would have preferred a scenario considering the use of dispersants inside Puget Sound. This excerpt from the Virginia Shorekeeper blog (Schultz 2012) exemplifies constructive thinking about oil spills and response options from one local-level stakeholder following the open house: "Minimizing negatives requires difficult but necessary choices… The Eastern Shore has been lucky in this regard, but an oil spill is within the realm of possibility. Federal and state agencies and their partners are doing an impressive job of preparing for such an occurrence locally."

References

Bostrom, A., B. Fischhoff, and M.G. Morgan. 1992. Characterizing mental models of hazardous processes: a methodology and an application to radon. Journal of Social Issues 48(4): 85–100.

Bostrom, A., C.J. Atman, B. Fischhoff, and M.G. Morgan. 1994. Evaluating risk communications: completing and correcting mental models of hazardous processes, Part II. Risk Analysis 14(5): 789–798.

Bostrom, A., P. Fischbeck, J.H. Kucklick, and A.H. Walker. 1995. A Mental Models Approach for Preparing Summary Reports on Ecological Issues Related to Dispersant Use. Marine Spill Response Corporation, Washington, DC. MSRC Technical Report Series 95-019, 28 pp.

Bostrom, A., P. Fischbeck, J.H. Kucklick, R. Pond, and A.H. Walker. 1996. Ecological Issues in Dispersant Use: Decision-Makers' Perceptions and Information Needs. Scientific and Environmental Assoc. Inc., for the Marine Preservation Association, Washington, DC.

Brown, J.B., and D. Isaacs. 2005. The World Café: Shaping Our Futures Through Conversations That Matter. Berrett-Koehler, San Francisco.

de Bruin, W.B., and A. Bostrom. 2013. Assessing what to address in science communication. Proceedings of the National Academy of Sciences 110 (Supplement 3): 14062–14068.

Dillman, D.A. 2000. Mail and Internet Surveys: The Tailored Design Method. Wiley, New York.

Ericsson, K.A. (Ed.). 2009. The Development of Professional Expertise: Toward Measurement of Expert Performance and Design of Optimal Learning Environments. Cambridge University Press, New York.

Ericsson, K.A., R.T. Krampe, and C. Tesch-Römer. 1993a. The role of deliberate practice in the acquisition of expert performance. Psychological Review 100(3): 363–406.

Ericsson, K.A., and H.A. Simon. 1993b. Protocol Analysis: Verbal Reports as Data. Revised edition. MIT Press, Cambridge, MA.

Fischhoff, B. 2007. Nonpersuasive communication about matters of greatest urgency: climate change. Environmental Science & Technology, November 2007: 7205–7208.

Fischhoff, B. 2013. The sciences of science communication. Proceedings of the National Academy of Sciences 110 (Supplement 3): 14033–14039.

Fullarton, C., and J. Palermo. 2008. Evaluation of a large group method in an educational institution: the World Café versus large group facilitation. Journal of Institutional Research 14(1): 109–117.

Howard, R.A. 1989. Knowledge maps. Management Science 35(8): 903–922.

Lavrakas, P.E. 2013. Presidential address: applying a total error perspective for improving research quality in the social, behavioral, and marketing sciences. Public Opinion Quarterly 77(3): 831–850.

Morgan, M.G., B. Fischhoff, A. Bostrom, and C. Atman. 2002. Risk Communication: A Mental Models Approach. Cambridge University Press, New York.

National Research Council (NRC), Committee on Risk Perception and Communication. 1989. Improving Risk Communication. National Academy Press, Washington, DC.

National Research Council (NRC). 1996. Understanding Risk. National Academy Press, Washington, DC.

Nelson, C., L. Hartling, S. Campbell, and A.E. Oswald. 2012. The effects of audience response systems on learning outcomes in health professions education. A BEME systematic review: BEME Guide No. 21. Medical Teacher 34(6): e386–e405.

Pidgeon, N., and B. Fischhoff. 2011. The role of social and decision sciences in communicating uncertain climate risks. Nature Climate Change 1(1): 35–41.

Schultz, K. 2012. In the Event of an Oil Spill off the Virginia Coast. Virginia Shorekeeper blog.

Smith, M.K., S.L. Annis, J.J. Kaplan, and F. Drummond. 2012. Using peer discussion facilitated by clicker questions in an informal education setting: enhancing farmer learning of science. PLoS ONE 7(10): e47564.

Wood, M., A. Bostrom, T. Bridges, and I. Linkov. 2012. Cognitive mapping tools: review and risk management needs. Risk Analysis 32(8): 1333–1348.

1SEA (Ann Hayward Walker, Debbie Scholz, Melinda McPeek) and an advisory group including Tom Coolbaugh, Ann Bostrom, Ed Levine, Bob Pond, Bob Pavia, Susan Shelnutt, and Steve Lewis.

2Professor Fred Conrad, Director of Graduate Programs for the Institute for Social Research, Program in Survey Methodology, University of Michigan. Dr. Conrad provided valuable input to the final survey design decisions, but bears no responsibility for them.