Managing Nuclear Information in the Cloud

Source: Locus Technologies

The Solution for Corporate Control of Environmental Impact Data

Nuclear Decommissioning Report, July 21st, 2011 - The burden of data management for the nuclear power industry is second only to that of managing nuclear material itself. For utilities operating nuclear power plants, deploying a centralized environmental management system in the “Cloud” and housing all of their data in it offers unprecedented opportunities for better monitoring, aggregation and reporting of data. Aggregating data in the Cloud allows utilities and all interested parties to know where samples of various media have been taken, which parties collected them, how the samples were analyzed, what the levels of radionuclides in those samples are, and what the legal limits and long-term effects of each isotope are.

Software systems already accomplish this enormous task at some U.S. nuclear utility fleets and U.S. DOE nuclear weapons sites, including Los Alamos National Laboratory and the Stanford Linear Accelerator Center. These systems have served these organizations well by consolidating millions of analytical and other records in a single web-based database and eliminating many legacy systems built on incompatible software platforms over the last several decades. As a result, these facilities now benefit from a more efficient information management system; a social-networking style of program management and collaboration that makes it easy to share data and information with project participants; streamlined reporting and operations; and greater transparency, which is welcomed by watchdog groups. The Japanese government and TEPCO, the utility that owns the Fukushima Daiichi nuclear plant, should take a lesson from these stateside organizations and put a similar system in place as soon as possible, because the requirements for long-term monitoring will generate an enormous quantity of data. Without a proper information management system, this “data tsunami,” already on its way, may prove as challenging to manage as the actual tsunami that hit the Fukushima Daiichi plant.

Since mid-March, the public’s fear and uncertainty concerning releases from the Fukushima Daiichi nuclear plant have been exacerbated by a lack of information on the nature and extent of the radionuclide contamination emanating from this facility. To date, the focus of the crisis, rightfully so, has been on dealing with the primary issues of taming the reactors and containing the leaks. After some initial setbacks, the situation at the plant has come closer to being under control. Very soon attention will shift to characterizing the impact of the disaster on human health and the environment, and on long-term monitoring and stewardship.

Anyone who has attempted to follow the levels of contamination reported by Japanese authorities and TEPCO in the vicinity of the Fukushima site has to be struck by both the magnitudes and the variability of the reported concentrations. Samples of groundwater taken beneath the No. 1 reactor’s turbine building on April 1, for example, contained radioactive iodine at 10,000 times the legal threshold. On March 30, Japan’s Nuclear and Industrial Safety Agency found that levels of iodine-131 in the seawater near the plant were 4,385 times the maximum level permitted under Japanese law. Several days later, after contaminated water from the plant had begun flowing into the sea, this number soared to 7.5 million times the legal limit. And this month, another 3,000 tons of water contaminated with radioactive cobalt-58, cobalt-60, iodine-131 and manganese-54 were released from the Fukushima Daini nuclear complex, the sister plant of the stricken Fukushima Daiichi complex. Although most measured levels were below the legal limits set in Japan for discharge water, the concentrations of cesium-134 and cesium-137 in the water—at two and three becquerels per cubic centimeter, respectively—exceeded the legal limits of 0.06 and 0.09 becquerels per cubic centimeter.

New data released by the U.S. EPA show the spread of radioactive iodine from the Fukushima nuclear accident into the drinking water supplies of Philadelphia and other U.S. cities. Tokyo and other Japanese cities have also reported iodine at various concentrations in their drinking water supply systems.

In the coming months and years, authorities, scientists, and TEPCO, the plant’s owner, will collect an immense amount of data on the nature and extent of the radiological contamination. They will take samples of air, soil, groundwater, seawater and various biota, including crops and fish, then measure these samples for various radionuclides, all of which have different half-lives. The resulting information will need to be evaluated for possible short- and long-term impacts on humans and the environment. The most effective way to accomplish a comprehensive analysis is to aggregate all of the relevant data and store them in a centralized information management system that is accessible to all stakeholders. The Web makes this type of data aggregation possible and more cost-effective than any other technology solution at our disposal.
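As a rough illustration of the kind of processing such a system must support, the sketch below decay-corrects a measured activity concentration to a common reference date so that samples of short- and long-lived radionuclides collected on different days can be compared. The half-life table, function name and example values are illustrative only and are not taken from any particular system.

    from datetime import datetime

    # Half-lives in days (I-131 ~8.02 d; Cs-134 ~2.07 y; Cs-137 ~30.2 y) -- illustrative table.
    HALF_LIFE_DAYS = {"I-131": 8.02, "Cs-134": 2.07 * 365.25, "Cs-137": 30.2 * 365.25}

    def decay_correct(activity, nuclide, sampled, reference):
        """Project a measured activity concentration to a reference date
        using first-order decay: A(t) = A0 * 0.5 ** (t / half_life)."""
        elapsed_days = (reference - sampled).days
        return activity * 0.5 ** (elapsed_days / HALF_LIFE_DAYS[nuclide])

    # Example: an I-131 reading taken April 5, 2011 falls to about 0.5% of its
    # value by June 5, 2011, while a Cs-137 reading is essentially unchanged.
    print(decay_correct(7500.0, "I-131", datetime(2011, 4, 5), datetime(2011, 6, 5)))
    print(decay_correct(3.0, "Cs-137", datetime(2011, 4, 5), datetime(2011, 6, 5)))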

Monitoring in the U.S. in the Face of Ongoing Risk

Radionuclide information management is not only necessary after a Fukushima-scale disaster. In the U.S., it is required for ongoing compliance and reporting to regulatory agencies (the EPA and the U.S. Nuclear Regulatory Commission [NRC]) at all nuclear facilities. Over the last several years, and long before Fukushima refocused the spotlight on the nuclear industry as a whole, a number of issues have arisen regarding leaks and potentially dangerous levels of carbon-14 (C-14) and tritium, both radioactive byproducts of reactor operation. The NRC, which oversees the inspection and licensing of nuclear facilities, says roughly 30 of the nation’s 104 reactor units have experienced tritium leaks. According to the NRC, none of the leaks has affected public health or safety, but the unmonitored and unexpected releases have raised concerns within the industry and among watchdog groups.

Similar concerns regarding leaks at several plants in the mid-2000s prompted the members of the Nuclear Energy Institute (NEI) to put forth a Ground Water Protection Initiative (NEI 07-07) in 2007. This initiative identifies actions that utilities can take to improve their management of, and response to, instances where the inadvertent release of radioactive substances results in low but detectable levels of plant-related materials in subsurface soils and water, even when these levels are well below the NRC limits pertaining to the protection of public health and safety.

One of the key actions that adopters of the initiative are expected to undertake is the establishment of a monitoring program involving on-site monitoring or regular sampling and analysis to ensure the timely detection of inadvertent radiological releases. According to the NRC, we will almost certainly see additional requirements pertaining to the monitoring of airborne releases of tritium and C-14 in the future. To meet these requirements (e.g., 10 CFR Part 51 and 10 CFR Part 52), facilities will need to install new monitoring instruments and data management tools, since many facilities are still using stack monitors that are more than 30 years old, and few, if any, currently have H-3 or C-14 stack monitors in place. For that reason, some plants in the U.S. are already installing new, state-of-the-art stack monitors that read, analyze and record beta-gamma particulates, iodine, noble gases, C-14, tritium and, optionally, alpha particulates. These data need to feed in real time into an intelligent, Web-based database so that utilities can instantly interpret and act upon the information in the case of non-permissible releases. The same type of database would be used if a disaster such as the one at Fukushima struck a U.S. facility.
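A minimal sketch of what such a real-time feed might look like appears below. The endpoint URL, field names and token are hypothetical placeholders for illustration, not any vendor’s actual API.

    import requests
    from datetime import datetime, timezone

    # Hypothetical ingestion endpoint and credentials -- placeholders, not a real API.
    ENDPOINT = "https://example.com/api/stack-readings"
    TOKEN = "replace-with-site-token"

    reading = {
        "station_id": "STACK-01",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "parameter": "H-3",          # tritium channel on the stack monitor
        "value": 0.42,               # measured concentration
        "units": "Bq/m3",
    }

    # Push the reading to the Web-based database as soon as it is recorded.
    response = requests.post(ENDPOINT, json=reading,
                             headers={"Authorization": "Bearer " + TOKEN}, timeout=10)
    response.raise_for_status()      # surface failures so operators can react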

Tracking Radionuclide Levels in Water

The key to properly managing radionuclides in the environment lies in managing water quality, because water is the first environmental system affected by any leaks. Plus, the nuclear industry uses large quantities of water for cooling the reactor core.

One of the industry’s most vexing impediments to responding to environmental problems has been the difficulty of properly centralizing and managing captured water quality data—the individual facts, statistics and items of information that represent the results of testing and analysis. The critical information that nuclear power plants require in order to evaluate their overall water-quality footprint and environmental impact is too often largely inaccessible to key decision makers and risk managers because it resides, in untold numbers of formats, in the scattered offices of their environmental consultants.

The sheer complexity of managing radionuclides in water demands sophisticated technological solutions for monitoring and measuring potential radionuclide releases at nuclear power plant sites. Utilities have an acute need for innovative approaches to persistent monitoring, site investigation, data evaluation, remediation and containment, which means using tools that support scientifically defensible risk assessments, improve the decision-making ability of C-level executives and risk managers, and facilitate successful efforts to remove pollutants and contaminants where required.

As the regulatory environment threatens to impose ever more onerous compliance costs, many utilities have begun to realize that they need to take ownership of this information, streamline their processes to control costs, and improve their ability to make decisions and mitigate risks. It is now essential that utilities adopt a different approach from the heretofore standard (and now outdated) “consultant-centric,” spreadsheet-based environmental information management system, with its typical project delays and increased costs. The new approach relies on leading-edge Web-based technologies that give environmental professionals Google-like abilities to search complex water data sets and growing piles of seemingly unrelated water, soil and air quality information. Or, to put it simply, the solution lies in Cloud computing.

Naturally, some would argue that, given the recent outages at Amazon and Sony, Cloud computing might not be ready for prime time. However, critics who cite those outages forget that the Fukushima plant had nothing comparable in place at the time of the tsunami. In any case, even if the plant had had some type of environmental information management system installed in-house, the last place such a system should run is the plant itself. Both Fukushima and the BP Gulf spill have shown that no critical software should run at the facility itself; the software and data could be destroyed along with the facility, and the data that would have helped determine the cause of the disaster would be lost for all practical purposes. For similar reasons, the aviation industry is considering a Cloud-based alternative to the so-called “black boxes” that provide flight data for downed aircraft.

Such a system would also facilitate ongoing study, as any party would have access to all of the results collected by others. To this date, Russian authorities have not done this sort of aggregation for the Chernobyl site, and the general public still does not know the exact extent and impact of that disaster on human health and the environment. Other utilities should not make the same mistake. Nuclear contamination is like no other modern threat to the environment: it cannot be hidden, it is easily detectable even from a distance, and no facility should try to conceal it. Its insidious nature makes transparency essential.

Measuring, Monitoring and Managing Radionuclides in the Environment

Nearly all of the activities associated with water, air and soil protection at nuclear power plants and other nuclear facilities, including the assessment of site characteristics, the ongoing monitoring of site conditions, and the remediation of adverse environmental impacts, involve the collection and/or analysis of data. The tools and systems used to manage and store this information must satisfy strict security and QA/QC requirements to ensure that only the appropriate people can access the data and that the data quality adheres to the highest NRC standards. It is also critical that these applications let engineers and scientists work cost-effectively, spending less of their time finding the data they need and formatting outputs, and more on evaluating and analyzing those data.

The Cloud-based software is specifically designed for managing subsurface and other data at nuclear facilities, including commercial reactor sites, research laboratories, and nuclear materials production and storage complexes. The system provides an unmatched level of data security and enforces an extensive set of QA/QC requirements on all uploaded data. At the same time, it provides a variety of easy-to-use options to upload, validate, flag, examine, map, plot, download and report data. The system can store such radioanalytical parameters as uncertainty, uncertainty type (standard, combined standard and expanded), and required method uncertainty. It can also convert weight-based concentrations to activity concentrations, calculate sums of ratios, and evaluate action limits that pertain to either single parameters or groups of parameters. The system helps reporting entities enforce data quality in accordance with NRC requirements or other standards, such as NQA-1 and ANSI/ISO/ASQ Q9001-2000, and validate incoming analytical data.
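As an illustration of the sum-of-ratios evaluation mentioned above, the sketch below applies the generic calculation to the cesium concentrations and limits cited earlier for the Fukushima Daini discharge; it is a simplified example, not the system’s actual implementation.

    # Sum-of-ratios check: a group of radionuclides complies with a combined action
    # limit only if the sum of (measured value / individual limit) is at most 1.
    # Values are those cited above for the Daini discharge (Bq per cubic centimeter).
    limits       = {"Cs-134": 0.06, "Cs-137": 0.09}
    measurements = {"Cs-134": 2.0,  "Cs-137": 3.0}

    sum_of_ratios = sum(measurements[n] / limits[n] for n in measurements)
    status = "exceeds" if sum_of_ratios > 1.0 else "within"
    print(f"sum of ratios = {sum_of_ratios:.1f} ({status} the combined limit)")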

After the Fukushima disaster, we can safely assume that regulators around the world will impose stricter monitoring and data-management standards both for operating facilities and for the decommissioning of older-generation power plants and weapons complexes. What the market needs is an off-the-shelf tool for managing radiological data, which are subject to a different set of regulatory standards than the conventional chemistry data managed under U.S. EPA and other guidance documents. The radioanalytical functionality provided in the Cloud-based system gives any nuclear facility that needs data management and reporting—and almost all do—a tool to get the job done at minimal cost and in a completely transparent way.

The software was introduced to the market at the Electric Power Research Institute (EPRI) Groundwater Protection Workshop (held in collaboration with NEI) in Charleston, South Carolina, in 2009. Since that time, several large U.S. utilities have successfully deployed the system, as has the nation’s largest nuclear weapons complex, Los Alamos National Laboratory.

This Cloud-based system is called EIM. It completely replaces existing stand-alone data systems and reporting tools to provide a comprehensive, integrated solution to one of the environmental industry’s most vexing problems—the centralization and management of complex data pertaining to contaminated water, groundwater, soil and/or air. EIM provides for the complete electronic processing of analytical data, beginning with the upload of electronic data deliverables (EDDs) from labs and ending with state-mandated regulatory exports and reports. EIM is deployed through a Software as a Service (SaaS) model, which eliminates most of the difficulties associated with adopting a new technology while offering the opportunity for more rapid customization to meet the ever-changing needs of its users.
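A simplified sketch of the first step in such a pipeline, loading a lab electronic data deliverable and screening out malformed records, might look like the following; the column names and file name are illustrative and do not reflect EIM’s actual EDD format.

    import csv

    # Illustrative required columns; real EDD formats are lab- and system-specific.
    REQUIRED = {"sample_id", "analyte", "result", "units", "detection_limit"}

    def load_edd(path):
        """Read a lab EDD and separate rows with usable numeric results
        from rows that fail this basic screening."""
        accepted, rejected = [], []
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            missing = REQUIRED - set(reader.fieldnames or [])
            if missing:
                raise ValueError("EDD is missing required columns: " + ", ".join(sorted(missing)))
            for row in reader:
                try:
                    row["result"] = float(row["result"])
                    accepted.append(row)
                except (TypeError, ValueError):
                    rejected.append(row)
        return accepted, rejected

    # accepted, rejected = load_edd("lab_edd_2011_06.csv")  # hypothetical file name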

SaaS via Cloud Computing

In the SaaS delivery model, the software vendor provides access to its software and functions remotely, as a Web-based service. SaaS allows organizations to access business functionality at a cost that is typically lower than that of licensed applications, since SaaS pricing is based on a monthly rental fee. Instead of buying software and paying for periodic upgrades, the utility subscribes to the application, and all (rolling) upgrades are provided during the term of the subscription. When the subscription period expires, all a client needs to do is renew.

This on-demand service provides measurable economies of scale and cost advantages because the more customers a SaaS vendor has, the less each customer pays for a subscription. This process continuously drives down costs while improving software quality as a SaaS application benefits from the “wisdom of the crowd,” i.e., its many users.

When a large network effect is present, as is the case with SaaS-based software, the value of a product or service increases as more people use it. This “network effect,” originally used to describe the rapid spread of telephones, holds that the value of a communications network to its users grows much faster than the number of people connected to it, roughly with the square of that number.
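In its usual formulation, often attributed to Robert Metcalfe, the value of a network scales with the number of possible pairwise connections among its n users:

    V(n) ∝ n(n − 1) / 2 ≈ n² / 2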

SaaS applications are maintained in the service provider’s datacenter, and every time users launch their browsers and log on, they get the latest version of the software as well as access to the most current data, which are also stored in the service provider’s datacenter. Because the software is hosted remotely, users don’t need to invest in additional hardware. SaaS removes the need for organizations to handle installation, set-up and often daily upkeep and maintenance.

“Cloud computing” is a general term for anything that involves delivering hosted services over the Internet; it describes data processing activity that occurs outside of the organization’s own networked computer systems. The Cloud provides the computing power required to run SaaS and other types of applications; since SaaS is a subset of Cloud computing, every SaaS application runs in the Cloud.

In environmental information management, Cloud computing puts utilities back in charge of their own data while at the same time offering individuals with the appropriate logon privileges unfettered access not only to relevant data, but also to the tools needed to analyze those data. Perhaps most important of all, the Cloud computing model can be launched almost instantly, or at worst within days, eliminating the need for long, self-defeating software procurement processes.

Providing Transparency

Scarcity usually encourages better management of resources—or forces extinction due to inadequate or inappropriate responses (like Germany’s decision to shut down its nuclear program by 2022). Consequently, external demands on utilities that operate and plan to continue operating nuclear reactors around the world will require them to demonstrate that they have systems in place to manage any potential data overflow. We are starting to see these new practices in the U.S., and we can only expect adoption to increase. Cloud computing-based environmental solutions represent the best way for companies to take ownership of their data, satisfy looming reporting requirements, and tackle their radionuclide levels and other environmental management challenges.

Nuclear power plant managers who have not yet moved to a Cloud-based data management model would find themselves able to make quicker, more confident decisions at lower cost if they managed the data associated with radionuclides (and other environment-related risks) using robust Web-based information management systems. These systems are similar to the common enterprise resource planning (ERP) systems used to manage and coordinate all of the “back office” resources, information and functions of a business; now it is time for utilities to put the same approach to work on their environmental data.

Today, the mainstream media, as well as the business and trade press and the public at large, seem to be focused almost exclusively on issues related to greenhouse gas emissions and climate change. But after the Fukushima disaster, awareness is beginning to grow that equal attention must be given to managing information about radionuclides at operating nuclear power plants and nuclear weapons sites. In the coming years, the nuclear industry will be pressed to demonstrate not only that it can prevent another Fukushima-type disaster, but also that it can manage radionuclide fallout more effectively.

The crews managing the 1979 Three Mile Island accident, the 1986 Chernobyl accident, the 2010 BP Gulf oil spill and the 2011 Fukushima meltdown were not able to organize the data and information needed to characterize the impact of those disasters quickly. In 1979 and 1986, the Web did not exist. But in 2011, in a world where we can use Google to find information instantly about almost anything, we should be able to use similar technology to manage environmental information that may be of existential value for this planet.

Additionally, there is one unassailable benefit to the industry making the information about environmental monitoring and impacts as public as possible: Transparency. The global community needs to understand how our nuclear activities are affecting the planet—and not just in times of crisis—and sharing that information through a centralized resource will help all countries manage environmental threats in concert. Japan is not the only country that was (or will be) affected by the fallout at Fukushima, and that will be the case for any nuclear incident that may ever occur. Without a shared understanding of the risks and damages, we cannot begin to address them.

Listening to members of the Japanese public interviewed as the Fukushima incident unfolded, one could sense a decreasing level of trust in the authorities and a growing sense of frustration and powerlessness. As former U.S. Supreme Court Justice Louis Brandeis famously observed, sunlight is the best of disinfectants. Had the Japanese authorities shed more light on what they were finding at the plant, perhaps they could have staved off much of the criticism, from a public-opinion standpoint, of how they handled the crisis.

With a public-facing database, accessible even to its critics, the nuclear industry can greatly reduce accusations that it is withholding information. Soviet and Russian authorities never shared the complete picture their data revealed about the Chernobyl site, and the general public still does not know the exact extent of that disaster’s impact on human health and the environment. BP has not done so for the Gulf oil spill. The biggest benefit of such a system is that it would facilitate ongoing study, because any party would have access to all of the results collected by others.

About the Author:

Neno Duplan is the founder, president and CEO of Silicon Valley-based Locus Technologies. The company organizes environmental, energy and radionuclide information in the Cloud for U.S. nuclear utilities, DOE nuclear weapons sites and other industries.
