Few standardised frameworks are designed to assess the full range of oil spill preparedness activities, spanning plan development, implementation, equipment, training, exercises, and response sustainability.
This paper analyses international practice in oil spill preparedness measures and compares it to Swedish practice. Friedman’s test and Dunn’s post-test have been used to compare the RETOS™ evaluation scores of Finland, Russia, Latvia, Lithuania, Poland, Germany, Denmark, and Norway to Sweden. The United States is examined as an external reference. The RETOS™ programme is an Excel tool developed for the International Oil Spill Conference 2008. It is a guide for industry and governments to assess their level of oil spill response, planning, and preparedness management against established criteria, and is intended to reflect international best management practices.
Swedish oil spill preparedness is shown to be comparable to the Baltic Sea regional practice. The Swedish RETOS™ evaluation score is 69%, compared to the average of 73.1% for the examined countries. A statistically significant difference exists between Sweden and both Norway and the United States.
Swedish oil spill preparedness is comparable to that of the Baltic Sea Region countries despite Sweden not having a National Contingency Plan, not using the Tiered Preparedness and Response concept, and not having adopted an Incident Management System. This suggests that these concepts are not essential for a functioning preparedness regime, as Sweden instead has a system serving the same function. However, it also raises the question of what effect implementing these concepts would have on Swedish preparedness.
This paper is based on a study of the Swedish oil spill preparedness regime, comparing it to neighbouring countries in the Baltic Sea Region and Norway. The United States was additionally examined as a reference country with well-developed oil spill preparedness. Sweden is located in the north of Europe and has a land border with Norway to the west and Finland to the east. Across the Baltic Sea, it borders Russia, Estonia, Latvia, Lithuania, Poland, Germany, and Denmark (see Figure 1).
Sweden shares many agreements concerning oil spill response cooperation with its neighbouring countries, for example the 2013 Agreement on Cooperation on Marine Oil Pollution Preparedness and Response in the Arctic (Arctic Council, 2013), the 1969 Agreement for cooperation in dealing with pollution of the North Sea by oil and other harmful substances (Bonn Agreement, 1983), the 1971/93 Copenhagen Agreement (Køpenhavnsavtalet, 2004), and the 1971/92 Convention on the Protection of the Marine Environment of the Baltic Sea Area (Helsinki Convention, administered by the Helsinki Commission, HELCOM) (HELCOM, 2008). Sweden has historically been an active player in environmental protection and oil spill preparedness (Hassler, 2008). This history of environmental protection and close collaboration with neighbouring countries made Sweden particularly interesting for evaluating oil spill preparedness. Oil spill preparedness in Sweden was highlighted in the wake of the 2011 Golden Trader oil spill in Denmark, which impacted the Swedish island of Tjörn and led to the largest Swedish oil spill response since Fu Shan Hai in 2003 (MSBHaV, 2014; Pålsson & Wåhlander, 2013).
Several indicators can be used to compare the level of oil spill preparedness among countries. However, these indicators must be standardised as much as possible, making objectively quantifiable indicators preferable. There are few formal frameworks utilising indicators that are designed to assess the full range of oil spill preparedness activities, spanning plan development, implementation, response equipment, training, and preparedness sustainability, and against which assessments can be compared (Taylor, Steen, Meza, Couzigou, Hodges, Miranda, Ramos, & Moyano, 2008b).
As the interest in assessing performance of oil spill preparedness has grown, many expert organisations have published guidelines. For example, the International Organization for Standardization (ISO) has published general guidelines for exercises and testing (ISO, 2011), the International Maritime Organization (IMO) has published manuals on oil pollution concerning prevention (IMO, 2011), contingency planning (IMO, 1995), oil combating (IMO, 2005), and administration (IMO, 2009), and the International Oil Spill Conference (IOSC) has developed RETOS™, an international tool to assess oil spill response planning and readiness (Taylor & Lamarche, 2014; Taylor, Moyano, & Steen, 2014). From the industry side, organisations such as the American Petroleum Institute (API), the International Tanker Owners Pollution Federation (ITOPF), and the International Petroleum Industry Environmental Conservation Association (IPIECA) have published numerous guides and reports on oil spill preparedness issues (API, 2013a; 2013b; 2013c; IPIECA, 1994; 2004a; 2004b; ITOPF, 2011a; 2011b; 2011c). Several projects have been set up to disseminate these guidelines and train national authorities and companies, for example the Global Initiative (GI), the Regional Maritime Pollution Emergency Information and Training Center for the Wider Caribbean Region (RAC/REMPEITC-Caribe), and the Project on Oil Spill Preparedness and Response in the ASEAN Seas Area (ASEAN-OSPAR Project). Many of these guides mention the use of a National Contingency Plan (NCP) to guide oil spill preparedness on a national level, an Incident Management System (IMS) to manage the oil response, and the Tiered Preparedness and Response concept when discussing the scale of an oil spill.
The “best international practice” for oil spill preparedness is thus not a single manual, but a compilation of guidelines and recommendations for certain aspects of the oil spill response management system or country-specific systems (Cashman, Stephens, & Boyles, 2003; Taylor, Steen, Meza, Couzigou, Hodges, Miranda, Ramos, & Moyano, 2008a).
Data for the RETOS™ analysis has been gathered from the respective country representatives. The analysis is limited to the Baltic Sea countries and Norway, as they represent the neighbouring countries to Sweden. The United States is included as a reference country. Expert opinions have been collected from interviews with the members of the Swedish National Cooperation Group for Oil Combating (NSO) during January 2015. NSO is the national expert group on oil spill preparedness and consists of representatives from the Swedish Coast Guard (SCG), the Swedish Civil Contingencies Agency (MSB), the Swedish Agency for Marine and Water Management (SwAM), the Oil Spill Advisory Service (OSAS), the Swedish Maritime Administration (SMA), the Swedish Transport Agency (STA), the County Administrative Boards (CABs), and the Swedish Association of Local Authorities and Regions (SALAR).
The RETOS™ programme has been used for the standardised evaluation comparison. It is a Microsoft Excel tool based on original work developed for the ARPEL Governance Project (AGP) and refined for the International Oil Spill Conference between 2008 and 2014 (Taylor et al., 2014; Taylor & Lamarche, 2014; Taylor, Steen, Meza, Couzigou, Hodges, Miranda, Ramos, & Moyano, 2008a; 2008b). RETOS™ provides a general guide for industry and governments to assess their level of oil spill response, planning, and readiness management against established criteria, and is intended to reflect international best management practices.
The chosen countries have all been analysed using the National RETOS™ Level A part of the RETOS™ Programme, which evaluates the basic aspects of national preparedness. Levels B and C examine the preparedness in greater detail and assume that Level A has been passed. They have not been used for the comparison, as most countries did not pass the Level A evaluation.
The statistical analyses (Friedman’s test and Dunn’s post-test) and graphs were created using the GraphPad Prism statistical software. Friedman’s test is a non-parametric statistical test that compares three or more matched or paired groups by ranking each row separately (GraphPad Software, 2015). The ranks in each column are then summed, and the value of the Friedman statistic is calculated from the sums of ranks and the sample sizes. If the p value is small, the differences between columns are unlikely to be random, and at least one of the columns differs from the rest. A post-test (usually Dunn’s post-test) is then needed to determine which columns differ from which others. Dunn’s post-test is performed after Friedman’s test. It calculates the expected average difference in rank sums between two columns and compares it to the observed value (GraphPad Software, 2015). The p value, adjusted for the number of comparisons, is calculated for each pair of columns, identifying which pairs differ significantly.
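The ranking-and-summing procedure described above can be sketched in code. The following is a minimal illustration, not the GraphPad Prism implementation: it computes Friedman’s statistic without tie correction and a Dunn-style pairwise comparison using a normal approximation with Bonferroni adjustment, and the toy data are invented for demonstration, not the RETOS™ scores.

```python
import math

def friedman_statistic(rows):
    """Friedman chi-square: rank each row separately, sum ranks per column.

    rows: list of equal-length lists; each row is one matched block
    (e.g. one evaluation indicator scored across k countries).
    Ties receive average ranks; no tie correction is applied.
    """
    n, k = len(rows), len(rows[0])
    rank_sums = [0.0] * k
    for row in rows:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            # extend over a run of tied values
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for col in range(k):
            rank_sums[col] += ranks[col]
    # Friedman statistic from rank sums and sample sizes
    chi2 = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)
    return chi2, rank_sums

def dunn_pairwise(rank_sums, n):
    """Dunn-style post-test: compare observed rank-sum differences pairwise.

    Returns Bonferroni-adjusted two-sided p values per column pair,
    using the normal approximation for the rank-sum difference.
    """
    k = len(rank_sums)
    se = math.sqrt(n * k * (k + 1) / 6.0)  # SE of a rank-sum difference
    m = k * (k - 1) // 2                   # number of pairwise comparisons
    pvals = {}
    for a in range(k):
        for b in range(a + 1, k):
            z = abs(rank_sums[a] - rank_sums[b]) / se
            p = math.erfc(z / math.sqrt(2))  # two-sided normal p value
            pvals[(a, b)] = min(1.0, p * m)  # Bonferroni adjustment
    return pvals

# Toy data: 3 matched rows, 3 columns (columns would be countries).
data = [[1, 2, 3], [1, 2, 3], [1, 2, 3]]
chi2, sums = friedman_statistic(data)
print(chi2, sums)  # 6.0 [3.0, 6.0, 9.0]
print(dunn_pairwise(sums, len(data)))
```

Because the statistic is built from rank sums, only the ordering within each row matters, which is also why the test tends to single out the most extreme column, a limitation discussed later in the paper.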
Sweden, its neighbouring countries, and the United States were evaluated using the RETOS™ Programme at the National Level A evaluation. The United States represents a country with a highly developed oil spill preparedness and has been included as a reference. No data could be obtained from Estonia. The total scores of the neighbouring countries ranged from 56% to 98%, with an average of 73.1% (see Table 1).
The examined scores of the RETOS™ evaluation were added together to analyse topical trends (see Figure 2).
The examined countries collectively score highest in Legislation, Regulations, Agreements; Response Coordination; and Tracking, Assessment & Information Management. The scores of the individual countries were mapped on a radar chart in Figure 3.
A statistically significant difference was found between the examined countries, Friedman’s test (N = 10), p < .001. Comparing Sweden to the other countries, Sweden was significantly different from the United States and Norway, Dunn’s post-test (N = 7), p < .05.
The evaluation results from the RETOS™ Programme show a large variation between the evaluated countries. It is unsurprising that the United States has a high score in all categories of the test, as the RETOS™ Programme is, to a large extent, modelled after the preparedness system of the United States. This model is the most developed system in North and Central America, and the United States has great influence in the region. All the issues listed in the National RETOS™ Level A evaluation have been addressed in the United States. It is interesting to note that all of the Baltic Sea countries are developed to a broadly similar degree, with evaluation scores between 56% and 74%. Sweden received an evaluation score of 69%, corresponding to the RETOS™ status “In Development”, and is statistically indistinguishable from the remaining Baltic Sea countries. As Norway and Finland have used a larger share of their oil revenue to develop oil spill preparedness than the other Baltic Sea Region countries, their higher scores are expected.
Results from the NSO interviews (January 2015) show a divided opinion among the Swedish experts on the statement “Swedish oil spill preparedness is equivalent to international standard.” Two informants disagreed with the statement, two were in agreement, and two did not have an opinion. The informants argue that lessons learned from other countries have been taken into account when developing the Swedish oil spill preparedness regime. As Sweden is active in several regional agreements and forums on oil spill preparedness, practices comparable to those of the regional countries have been developed. Informants of both opinions have worked with oil spills or exercises abroad, although primarily in the Baltic and North Sea contexts. The results of the RETOS™ evaluation show that Swedish oil spill preparedness is indeed equivalent to international practice, at least the regional practice.
Sweden scores well in the RETOS™ evaluation in comparison to its neighbouring countries. This is in spite of the country not having a National Contingency Plan, not using an Incident Management System (IMS), and not applying the Tiered Preparedness and Response concept. It has thus been possible for Sweden to build a functional preparedness system on a level comparable to other Baltic Sea countries that utilise any or all of these concepts. This suggests that those measures are not needed to maintain a good preparedness level, although familiarity with them will likely aid communication during international cooperation and operations. Sweden instead has an organisational system for oil spill preparedness that serves the same function. Responsibilities for the involved organisations are mandated (although sometimes vaguely) and divided between municipal, county, and national levels, without being formalised in a plan.
There are three main limitations to this study. Firstly, the evaluation results have not been approved as official evaluations by the respective governments. However, the data were supplied by highly qualified individuals, generally the person responsible for the national contingency planning. Secondly, as many of the RETOS™ evaluation indicators, and most of the work in the Baltic Sea Region and Norway, focus on the response at sea, the evaluation may not adequately reflect onshore preparedness and response. Thirdly, the statistical method used for comparing the countries, Friedman’s test, uses a ranking sum. This means that differences tend to be found only against the most extreme cases when comparing between the countries. Since there is quite a large gap between most countries and the United States and Norway, this was not an issue. However, comparing only the remaining countries revealed a difference that was not detected when the United States and Norway were included.
This paper introduces a novel method to statistically compare the results of the RETOS™ country evaluations using Friedman’s test and Dunn’s post-test. However, the method requires some tweaking to make it easier to distinguish between outliers in the data. In this study, the United States and Norway are two external controls that the Baltic Sea countries were compared to for the RETOS™ evaluation. However, such a comparison is somewhat crude, as several indicators may or may not be relevant for the respective countries. Respondent feedback indicates that the RETOS™ evaluation might be too focussed on the Incident Command System used in the United States, but not in the Baltic Sea region.
Despite not having a National Contingency Plan or using the Tiered Preparedness and Response concept, Sweden is shown to be at a similar oil spill preparedness level to its neighbouring countries. Sweden instead has an organisational system for oil spill preparedness that serves the same function, with defined responsibilities for the involved organisations at municipal, county, and national levels. This suggests that a National Contingency Plan and the Tiered Preparedness and Response concept are not critical if corresponding measures exist. However, this also raises the question of how implementing these concepts would benefit the national preparedness. The main advantage for Sweden in adhering to international practice, such as developing a National Contingency Plan, would be to harmonise the various regional and municipal plans into a national system and to use terminology in line with global practice. This would utilise the best practices from abroad and improve communication and understanding, and thereby cooperation, during international operations.
*World Maritime University, P.O. Box 500, 201 24 Malmö, Sweden