How MarineMap ‘Changed the Game’
Amanda E. Cravens, Stanford University
Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision making. Discussions about DSTs, however, rarely recognize the range of ways software can influence users’ negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of policy processes that have used such technology.
Conducted with the input and support of the McClintock Lab (University of California, Santa Barbara), this study examined how one geospatial DST, MarineMap, influenced participants’ experiences during the multi-year Marine Life Protection Act (MLPA) Initiative process in California.
The study—which draws on approximately 60 interviews, a participant survey, and log analysis—identified specific ways that MarineMap added value to the marine planning process and linked each mechanism to particular tool features. The study also highlights the importance of the social context in which software is implemented, suggesting the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. These results suggest considerations to inform the future use of DSTs in environmental decision-making processes.
Background
MLPA stakeholders faced a challenging task in negotiating where to site marine protected areas (MPAs). “Science guidelines” developed by the advisory science panel provided constraints and structured stakeholders’ problem solving, but satisfying all of the guidelines at once meant making decisions subject to many simultaneous constraints. A web-based decision support tool called MarineMap allowed users to test possible MPA solutions against the scientific criteria by which proposals would be officially evaluated and to visualize attributes of proposed protected areas.
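To make this iterate-and-check interaction concrete, the following is a minimal, hypothetical Python sketch of a guideline evaluator. The ProposedMPA class, the evaluate function, and the threshold values are illustrative assumptions only; they do not reflect MarineMap’s implementation or the actual MLPA science guideline values.

    from dataclasses import dataclass

    @dataclass
    class ProposedMPA:
        name: str
        alongshore_km: float  # stretch of coastline the MPA spans
        center_km: float      # midpoint position along the coast

    # Placeholder thresholds for illustration, not the real guidelines.
    MIN_SPAN_KM = 10.0   # minimum alongshore span per MPA
    MAX_GAP_KM = 100.0   # maximum spacing between adjacent MPAs

    def evaluate(proposal: list[ProposedMPA]) -> list[str]:
        """Return human-readable guideline violations for a draft proposal."""
        issues: list[str] = []
        for mpa in proposal:
            if mpa.alongshore_km < MIN_SPAN_KM:
                issues.append(
                    f"{mpa.name}: span of {mpa.alongshore_km} km is below "
                    f"the {MIN_SPAN_KM} km minimum"
                )
        # Check spacing between neighboring MPAs along the coast.
        ordered = sorted(proposal, key=lambda m: m.center_km)
        for a, b in zip(ordered, ordered[1:]):
            gap = b.center_km - a.center_km
            if gap > MAX_GAP_KM:
                issues.append(
                    f"gap of {gap} km between {a.name} and {b.name} "
                    f"exceeds the {MAX_GAP_KM} km maximum"
                )
        return issues

    # Iterate on a draft and get immediate feedback, analogous to the
    # tool's report-as-you-draw workflow.
    draft = [ProposedMPA("North Reef", 8.0, 0.0),
             ProposedMPA("South Cove", 12.0, 150.0)]
    for issue in evaluate(draft):
        print(issue)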
Major Findings
What it did: Five ways MarineMap added value
• Helped users understand geography holistically, at scales and with combinations of data beyond most people’s personal experience. Notably, given that this was a marine planning exercise, it allowed users to visualize relationships between habitat and substrate under the ocean in a way that few except divers or spearfishers had done before.
• Helped users understand the science criteria by which their proposals would be evaluated. By letting users iteratively see how the science guidelines evaluated areas they knew intimately, MarineMap helped participants understand how their ideas would be assessed by the scientists.
• Facilitated communication among participants by creating a common language. The way participants used MarineMap as a shorthand vocabulary suggests that many users’ understanding of the information was closely tied to MarineMap’s depiction of it. The tool was thus not just a medium for communication but a medium for shaping common understanding.
• Helped users identify shared or diverging interests and thus gave them a better understanding of what was causing conflict in a given area. One interviewee described the tool as a way to “not concede ground, but look at it.”
• Facilitated joint problem solving by helping users identify mutually acceptable options among a large set of possibilities and by making the implications of tradeoffs clear.
How it did it: Key MarineMap features
Certain MarineMap features were instrumental to the tool’s ability to add value as described above. These include its status as the sole authoritative source of geospatial information, its spatial and visual interface, its dynamic interface that allowed users to select only the data in which they were interested at a given time, its accessibility online between meetings, and its real-time reporting capabilities.
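As a rough illustration of two of these features, the sketch below pairs a single shared data catalog (one authoritative source of geospatial information) with per-user views that toggle layer visibility. All names here (LayerCatalog, MapView) are hypothetical and do not reflect MarineMap’s actual code or architecture.

    class LayerCatalog:
        """Single authoritative store of named geospatial layers."""
        def __init__(self, layers: dict[str, object]) -> None:
            self._layers = dict(layers)

        def get(self, name: str) -> object:
            return self._layers[name]

    class MapView:
        """One user's view: toggles layer visibility without copying data."""
        def __init__(self, catalog: LayerCatalog) -> None:
            self.catalog = catalog
            self.visible: set[str] = set()

        def toggle(self, name: str) -> None:
            # Show the layer if hidden, hide it if shown.
            self.visible ^= {name}

        def render(self) -> list[object]:
            # Compose only the layers this user selected, always reading
            # from the shared catalog so everyone sees the same data.
            return [self.catalog.get(n) for n in sorted(self.visible)]

    catalog = LayerCatalog({"kelp": "<kelp data>", "depth": "<bathymetry>"})
    view = MapView(catalog)
    view.toggle("kelp")
    print(view.render())  # ['<kelp data>']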
Sources of value can also create challenges
As users described these features, however, it became clear that the same feature that added value in one circumstance could create challenges in another. Because most interview subjects viewed their overall experience with the tool very positively, the challenges they nonetheless reported are especially important to consider.
Take the example of the spatial user interface. While there were many reports of how that feature added value (e.g., by providing a visual aid and thus supporting communication), in other cases it became a limitation. For instance, data that did not appear within the tool essentially did not exist within the process. For one city employee concerned that MPA boundaries line up with city jurisdictional lines to make enforcement easier, the fact that MarineMap (at that time) did not contain city boundaries meant she was unable to convey her concerns to others. She described trying to draw the boundaries on scraps of paper, getting nowhere, and, as a result, years later still trying to get a boundary moved 100 feet south.
A technology team with a service orientation
From the beginning, the MarineMap technology team (which included members with software as well as marine science backgrounds) was given the freedom to build a tool suited to the MLPA Initiative and provided with sufficient resources to meet its mandate to create a geospatial DST. Throughout the process, the team was highly integrated (“embedded”) into the planning process. Interviewees viewed the technology professionals as approachable, neutral, and helpful. Stakeholders also commented on the team’s responsiveness; many users could not recall tool challenges because the challenges had been resolved as they arose. Interviewees attributed the success of the tool in great part to the personality of the development team and to its embedded position.
Frustrated expectations and ‘undesigned’ use
Having access to MarineMap during the years of the MLPA Initiative seems to have created expectations of the tool’s continued availability, and many users came to rely on it for purposes not directly related to MPA siting. In the online survey, 69% of users (including state agency staff and members of the public) reported using MarineMap after concluding their participation in the MLPA Initiative, and thus for purposes outside the tool’s original planning scope. Specifically, 24% used it “to access data about state waters in one place,” 32% used it “to review locations or attributes of proposed MPAs as part of the process of implementation,” and 15% used it for other reasons (e.g., “analyzing potential sites for a new National Marine Sanctuary” and trying to “generate [the next] US Nationals Spearfishing competition tournament zone”). Given this extensive unplanned use, interviewees were extremely frustrated that the tool (which many inaccurately viewed as having been built with public money) was no longer available.
Implications
This empirical study of a decision support tool suggests we need a more nuanced view of “pro” and “con” when thinking about how tools work in decision-making processes. The same features that add value in one circumstance may create challenges in others. Calls to remove “bias” are misguided, as any DST has some bias built in: these applications work by simplifying the world into a representation. What managers and facilitators can do is pay attention to the specifics of how a tool influences problem solving, negotiation, and decision making for individuals in particular processes. These results also point to the importance of incorporating the impact of a DST into evaluations of policy processes that use such software.
For More Information
More information about this research can be found on the author’s website. She can be reached at acravens@stanford.edu.