Tessella Support Services plc
Delivering the Next Generation of IT Platforms for Subsurface Engineering Brochure
About the Author

Ray Millward PhD

At Tessella, Ray has worked in the Upstream sector of large oil and gas companies. His work has involved the implementation of several bespoke analysis IT platforms. A large part of this has been helping companies use the latest research and innovation to gain commercial advantage.

Before working at Tessella, Ray completed his PhD at the University of Bath, studying a new adaptive multiscale finite element method with applications to high-contrast interface problems. One such application was fluid flow through heterogeneous media where the permeability varied by several orders of magnitude. Standard methods converge slowly in this case, but his research showed that his multiscale method converged optimally.

Introduction

Founded in 1980, Tessella is an international provider of science-powered technology and consulting services to the oil industry. Our blend of science, engineering and sector expertise helps deliver innovative and cost-effective solutions to complex real-world commercial and technical challenges. As such, we are uniquely positioned to explore these challenges.

This white paper identifies and investigates the areas of expertise needed to deliver IT Platforms for digital subsurface engineering. It explores the hard questions that teams building these platforms must address.

IT Platforms

The goal for IT Platforms is to provide the insight needed to optimize hydrocarbon production while minimising operational risk. Insight is created when people interact with the information presented to them through their software tools, and this feeds directly into a decision process which can determine the success or failure of a project.
To stay competitive, the geophysical and reservoir modelling sector of Upstream constantly needs to innovate so as to incorporate smarter and faster methods for reservoir evaluation and production planning. Innovations in this area should:

1. Deliver New Insight to enable safer, faster and more profitable decisions to be made within a collaborative, globally-distributed working environment.
2. Meet Strategic Knowledge Management Goals by ensuring that the IT Platforms act as a framework for capturing and distributing knowledge globally across the business.
3. Increase Productivity by freeing up qualified and experienced staff to focus on the key engineering, scientific and strategic tasks.

This paper will explore how to engage with each of these requirements, and how it is also possible to Minimise the Total Cost of Ownership through amalgamation of existing disparate legacy systems into single, more easily-maintained toolkit collections.

Essential Skills

The development of IT platforms involves harnessing a variety of advanced technical computing techniques in order to provide clearer visualisations and faster simulations. New innovations in this area require the industrialisation of academic work or the rapid deployment of intellectual property, capturing new concepts and making these accessible to the wider business. To be successful, the development team needs to fully understand the requirements of the project, and how it will complement the functionality of existing commercially-available tools and legacy systems. This means developers need both a deep understanding of the underlying science and outstanding proficiency in rigorous software engineering methods, and ideally they will also have experience of developing solutions which interact with a number of commercially-available products.

Delivering this kind of solution requires multi-disciplinary teams that:
- Can solve the technical computing challenges that underpin innovations in digital engineering.
- Are fluent in the complex scientific language used by the client Subject Matter Experts to define their requirements.
- Understand organisational knowledge management.
- Are experienced in delivering systems to an organisation's project management and IT standards.
- Have domain expertise working with a variety of commercially-available solutions.

A particular advantage of this blend of skills is that edge cases can be tested, ensuring delivery of a robust, domain-focused solution. In addition, it ensures that Subject Matter Experts can have their recommendations implemented quickly and correctly, so that they can direct more attention to their core responsibilities.

Opportunities to Develop New Insights from IT Platforms

a) Inverting seismic data using new techniques and computer technologies

Consider the need for better interpretation of geological survey data in order to more accurately determine whether a field might contain profitable oil [1]. There is an increasing trend towards inverting the full elastic wave equation for seismic and electromagnetic data [2], which is computationally difficult to do both accurately and quickly. However, with advances in solvers and algorithms from earthquake and tectonic modelling [10], techniques are being developed to solve these problems. They are still computationally demanding, but can be tackled with a careful analysis of the method and the available technology (computing clusters, GPUs, domain transformations such as wavelets, and computing hardware designed for signal processing).

b) Data reduction of seismic measurements, or the resulting permeability field

The data gathered from seismic and electromagnetic surveys is vast thanks to modern sensor technology, yet still sparse for the purposes of inverting the data back into the corresponding rock structure.
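One reason naive reduction of a permeability field is dangerous can be shown with a minimal Python sketch (all values below are hypothetical, chosen only to illustrate the point). For a layered medium, the effective permeability for flow parallel to the layers is the arithmetic mean of the layer values, while for flow perpendicular to the layers it is the harmonic mean; for high-contrast media the two differ by orders of magnitude, so no single averaged value can represent the field for both flow directions.

```python
import numpy as np

# Synthetic layered permeability field (hypothetical values, in millidarcy):
# alternating high-permeability sand and low-permeability shale layers,
# giving a contrast of five orders of magnitude.
k = np.array([1000.0, 0.01] * 50)

# Effective permeability for flow parallel to the layers (arithmetic mean)
# versus perpendicular to the layers (harmonic mean).
k_parallel = k.mean()
k_perpendicular = len(k) / np.sum(1.0 / k)

print(f"arithmetic mean (parallel flow):    {k_parallel:.4g} mD")
print(f"harmonic mean (perpendicular flow): {k_perpendicular:.4g} mD")
# The two averages differ by orders of magnitude, so a single averaged
# value cannot represent this field for both flow directions.
```

This is the kind of feature that upscaling and multiscale techniques are designed to preserve, where simple averaging discards it.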
The accuracy and viability of the solution must be given due consideration, specifically: the presence of system errors and how these are tracked through each calculation; the possibility of scaling the speed of an algorithm; and how the data can be incorporated into models. Simple averaging to reduce the field data can prevent a simulation converging to a solution. Homogenisation works well for extremely fine regular granularity but not for highly heterogeneous systems. Averaging and homogenisation do not reliably pull out all the important features from your data.

Increasingly, the full elastic wave equation is being used to model the underlying physical systems more accurately, rather than relying on approximations and linearisations. This is still at the leading edge of research and computational capability, but the payoff from a more accurate model is better forecasts for the future.

When using large datasets in models it is often necessary to reduce the amount of data, where possible, for the purpose of speed, but not in a manner that compromises accuracy [11, 12]. Upscaling and multiscale techniques preserve the important features and also provide a better visualisation of these features, leading to more accurate results [3]. Multiscale methods used in the academic world have been improving for nearly a decade, and continue to do so: they are now ripe for industrialisation.

Some of the key areas where innovative technical computing techniques can improve recovery are:

- Inverting seismic data using new techniques and computer technologies [1, 2]
- Data reduction of seismic measurements, or the resulting permeability field [3]
- Analysing geological data for potential petroleum reserves
- Modelling fluid dynamics through porous media (the core function of reservoir simulation) [4, 5, 6]
- Well position/production planning (optimization for both placement and production) [7, 8, 9]
- Immersive environments and 3D visualisation

c) Analysing geological data for potential petroleum reserves

The primary challenge in analysing data for potential reserves is to accurately predict production yields with only a limited amount of information. This is typically approached by answering two questions: could oil have been produced, and has it been trapped in a reservoir? When trying to determine if oil has been produced, a number of factors need to be considered, notably: location; the organic composition of the oil; thermal profiles of the source rocks through time; and how the pressures and temperatures affect the composition and flow of the hydrocarbons.

d) Modelling fluid dynamics through porous media (the core function of reservoir simulation)

The numerical methods used to solve the partial differential equations of fluid flow are constantly evolving. Whether it is linearizing a non-linear model, or trying to solve the full non-linear system (e.g. the Navier-Stokes equations), the challenge is to keep up with the most recent developments. Consider how efficient the time step solver is and whether it requires many small time steps to be accurate. Ask whether it converges. Assess how these solvers perform when they transition from 2D to 3D [5]. Confirm that the steady state solver incorporates advanced methods such as multiscale techniques and homogenization [6].

These challenges can be met by understanding the science and abstracting the software architecture to allow new modules, features and algorithms to be integrated. For instance, integrating software with a computing cluster provides the computational power needed for large complex simulations; a careful analysis of the algorithms and their parallel complexity could then show how to boost performance (perhaps by performing certain tasks in serial when the communication overhead is deemed too great).
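The time-step concern raised above, whether a solver needs many small steps to stay accurate, can be made concrete with a minimal sketch. The example below, with an illustrative grid and diffusion coefficient not taken from this paper, uses a forward-Euler (explicit) finite-difference scheme for a 1D diffusion equation, which is only stable when dt <= dx^2/(2*d); even a slightly larger step makes the solution blow up, which is exactly why explicit solvers can force very many small time steps.

```python
import numpy as np

def explicit_diffusion_step(u, d, dx, dt):
    """One forward-Euler step of u_t = d * u_xx on a 1D grid
    with fixed (zero Dirichlet) boundary values."""
    u_new = u.copy()
    u_new[1:-1] = u[1:-1] + d * dt / dx**2 * (u[2:] - 2*u[1:-1] + u[:-2])
    return u_new

d = 1.0                        # diffusion coefficient (illustrative)
dx = 0.01                      # grid spacing (illustrative)
dt_stable = 0.5 * dx**2 / d    # classical stability limit for this scheme

u = np.zeros(101)
u[50] = 1.0                    # initial spike in the middle of the domain

# At the stability limit the solution stays bounded and diffuses away...
v = u.copy()
for _ in range(1000):
    v = explicit_diffusion_step(v, d, dx, dt_stable)
print("stable run,   max |u| =", v.max())

# ...but only 1% above the limit, the same scheme diverges.
w = u.copy()
for _ in range(1000):
    w = explicit_diffusion_step(w, d, dx, 1.01 * dt_stable)
print("unstable run, max |u| =", w.max())
```

Implicit or adaptive time-stepping schemes avoid this constraint at the cost of solving a system per step; which trade-off wins is precisely the kind of question a solver review should answer.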
The smarter use and scheduling of parallel tasks can boost productivity for an entire business group.

The visualisation of reservoir data is inherently complex. The process can involve trawling through a huge amount of data for a few crucial elements, determining related clusters of data, or perhaps looking at spatially and temporally distributed data with animations. Graph theory, clustering and image analysis techniques can be applied to display data for analysis.

In an increasingly visual world, it is important to be able to see and interact more intuitively with data. In addition to visualising data with standard plotting functions, users want to be able to examine a reservoir in 3D from any angle, view cross-sectional slices through the data and watch replays of a simulation in order to analyse the evolution of the physical properties of interest. Developing such features presents a formidable challenge.

e) Well position/production planning (optimization for both placement and production) [7, 8, 9]

Optimisation problems occur with varying degrees of complexity and predictability: some are simple, while others can be more complicated, or indeed chaotic in nature. The operation of a production rig in an oil reservoir is a complex system; it is not necessarily possible to capture and analyse all of the significant data correctly, making it difficult to fully predict and fully optimize in such circumstances. To make the best of a complex situation, it is essential that analysts have a clear understanding of the relevant physics and mathematics and can select or develop an optimizer that best captures the complex dynamics of the system [13, 14].
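As a minimal sketch of the optimizer-selection point above: when the objective is a full reservoir simulation, derivative-free methods such as the genetic algorithms cited, or the simple random search below, are often used because gradients are unavailable. Everything here is hypothetical; the proxy production function stands in for a simulation and its "sweet spots" are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def proxy_production(x, y):
    """Hypothetical proxy for expected production at well location (x, y):
    two Gaussian 'sweet spots' of differing quality. In practice this would
    be a full reservoir simulation, which is why optimizer efficiency and
    the number of evaluations matter so much."""
    return (8.0 * np.exp(-((x - 2)**2 + (y - 3)**2))
            + 5.0 * np.exp(-((x - 7)**2 + (y - 6)**2) / 2.0))

# Derivative-free random search over a 10 x 10 field: trivially simple,
# easy to parallelise, and robust when the objective is noisy or non-smooth.
best_xy, best_val = None, -np.inf
for _ in range(5000):
    x, y = rng.uniform(0.0, 10.0, size=2)
    val = proxy_production(x, y)
    if val > best_val:
        best_xy, best_val = (x, y), val

print(f"best location ~ ({best_xy[0]:.2f}, {best_xy[1]:.2f}), "
      f"proxy value {best_val:.2f}")
```

A random search makes the cost explicit: each candidate is one simulation, so the evaluation budget, not the optimizer's bookkeeping, dominates the run time.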
Statistical methods can provide greater insight but may require millions of simulations, so the software needs to be as efficient as possible to meet this demand.

f) Immersive environments and 3D visualisation [15]

Examining data in its raw form (or even in tables and simple graphs) can inhibit user understanding, sometimes because of its complexity, and sometimes because only a limited number of experts can actually understand the meaning of the data. An intuitive presentation of the data increases the chance of obtaining an accurate analysis and more effective reviews.

It can often be useful to transform data, whether mathematically or from a visual point of view. For example, it might be better to create a 3D view for evaluating possible well locations rather than rely upon entering very specific numerical values. Another important aspect of immersive environments is the incorporation of new technologies such as mobile apps, tablets, touch screens, multi-monitor support and video walls.

Meeting Strategic Knowledge Management Goals

Subject Matter Experts (SMEs) play an essential role in creating and distributing knowledge within the upstream business. They can use their extensive research and experience to identify new methods for processing data, they can develop more effective ways to analyse information, and they can design new engineering techniques and workflows that will deliver more profit to the business. Ideally, SMEs should be able to share their knowledge with colleagues through seminars, training courses and collaborative workflows, so that the knowledge can be incorporated on a global scale and greater insight can be extracted from the available data and information.

To accelerate this process, IT Platforms need to be developed in tandem with the distribution of new knowledge, so that engineers can immediately practise their new analytical skills when they return to their updated applications.
It has been observed that when engineers receive training but are unable to immediately start using their new analytical skills, they will continue to follow their standard practices [16]. It is also vital to produce a robust and intuitive solution first time around, so that users can easily adapt to their new tools.

To ensure that the upstream business gets the most value out of new developments, software development teams must capture SMEs' requirements correctly first time around. SMEs often have huge demands on their time, so asking the right kind of questions is key to eliciting SME knowledge:

- What are the important aspects of your data? Which are the key quantities to visualise and pull out? Do you need to view this data on the move?
- What type of equation are you solving for inversion? Is it appropriate, or could it be more realistic? Can it be made faster with GPUs?
- When modelling rock structures, how are the errors controlled if a highly discontinuous field is used? Is it through mesh refinement, upscaling or multiscale techniques? [3]
- When analysing geological data, which methods do you use to collaborate with other parts of the business? How do you intend to distribute a new idea for others to use or advance?
- Which optimization techniques are used for well placement? Is the algorithmic complexity appropriate for the business timeframe? [7]

By understanding the same concepts, and speaking the same language as SMEs, it becomes possible to create applications that complement the new knowledge being distributed across the business. By understanding both the underlying science and the underlying business needs, it is also possible to capture experts' tacit knowledge, so it can be abstracted and recast as critical business logic.
This can then be represented in software tools and made available for younger engineers to understand and to inform their decisions, thereby mitigating the risks associated with experienced staff or SMEs leaving the business.

Increasing Productivity

IT Platforms commonly consist mainly of commercially-available software tools that have been customized and then integrated, together with various smaller innovations developed by SMEs to provide the business with a competitive edge. When there are so many technologies, and multiple agents interacting together, there are many opportunities to improve the overall efficiency of the system. To be successful, development teams need to understand how the upstream business is already capitalizing on its investments in both the commercially-available software tools and its own proprietary software.

Good business analysis makes it possible to identify the core business logic and the rules associated with engineering tasks. Building this into the software provides validation and reduces both the risk of errors and the associated costs and time for rework, freeing up time and boosting productivity. Simple examples of productivity improvements include:

- Pre-processing geological data to preferred formats.
- Displaying seismic data in ways that make it easier to interpret, manipulate and transform.
- Providing clear workflow systems that allow common tasks to be tracked and audited, and that also provide guidance to new users, particularly for software that is rarely used.

Simple enhancements such as these can reduce the time spent relearning how to perform a task, and minimise the frustration and stress associated with technical faults or a cumbersome user interface.
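The first productivity example above, pre-processing geological data to a preferred format, can be sketched as a small converter. The raw column names, the feet-to-metres conversion and the target layout below are all assumptions made for illustration, not an actual Tessella or client format.

```python
import csv
import io

# Hypothetical raw export: column names and units vary between tools.
raw = """WELL,DEPTH_FT,PERM_MD
A-1,3280.84,120.5
A-1,3281.84,95.2
"""

FT_TO_M = 0.3048  # exact international foot

def to_preferred_format(raw_csv: str) -> str:
    """Convert a raw well-log CSV to an assumed in-house layout:
    lower-case column names, depth in metres rounded to centimetres."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["well", "depth_m", "perm_md"])
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "well": row["WELL"],
            "depth_m": f"{float(row['DEPTH_FT']) * FT_TO_M:.2f}",
            "perm_md": row["PERM_MD"],
        })
    return out.getvalue()

print(to_preferred_format(raw))
```

Trivial as it is, automating this step once removes a recurring manual conversion, and with it a recurring source of unit errors, from every engineer's workflow.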
Conclusion

Companies wishing to get the most value from their IT Platforms need to bridge the gap between their Subject Matter Experts and their software development teams: finding a partner that can build that bridge is a crucial first step in maximizing oil and gas production through the use of intelligent tools and analysis.

Tessella provides a range of services spanning all areas of a project cycle, from consultancy through business analysis to system design, development and enhancement. The expertise of Tessella's staff within these areas is underpinned by a combination of strong academic scientific backgrounds (well over 50% of our staff hold PhDs from leading universities) and experience gained in delivering a wide range of challenging projects.

Our scientific background enables us to understand and discuss the relevant technical issues, which removes the burden on the SMEs to simplify their descriptions (potentially losing critical detail), and aids them with critical design work. This greatly speeds up the process and avoids the frustration of constantly repeating and explaining issues. Our diverse project experience means we understand the practice of delivering innovative and cost-effective business solutions that need to incorporate advanced and complex engineering, technology and computing techniques. We recognize the role that good design and rigorous documentation play in minimizing the cost of maintenance, support and user training for these systems.

Oil and gas companies aiming to deliver the vision of Intelligent Energy should look to partner with organizations like Tessella.

Bibliography

[1] H. Løseth, L. Wensaas, M. Gading, K. Duffaut and M. Springer, "Can hydrocarbon source rocks be identified on seismic data?," Geology, vol. 39, no. 12, pp. 1167-1170, 2011.
[2] J. Xiong, A. Abubakar, Y. Lin and T. M. Habashy (Schlumberger-Doll Research), "2.5D Forward and Inverse Modeling of Elastic Full-waveform Seismic Data," in SEG Annual Meeting, San Antonio, Texas, 2011.
[3] R. Millward, "A new adaptive multiscale finite element method with applications to high contrast interface problems," PhD Thesis, University of Bath, 2011.
[4] R. W. Lewis and Y. Sukirman, "Finite element modelling of three-phase flow in deforming saturated oil reservoirs," International Journal for Numerical and Analytical Methods in Geomechanics, vol. 17, no. 8, pp. 577-598, 1993.
[5] M. F. Wheeler, "Adaptive Discontinuous Galerkin and Mixed Finite Element Methods for Single and Two Phase Flow and Reactive Transport in Porous Media," in The Second International Conference on High Performance Computing and Applications, Shanghai, China, 2009.
[6] T. Arbogast, "Homogenization-Based Multiscale Finite Elements for Heterogeneous Porous Media," in SIAM Conference on Mathematical and Computational Issues in the Geosciences, Leipzig, 2009.
[7] A. A. Emerick (Petrobras S.A.), E. Silva, B. Messer, L. F. Almeida, D. Szwarcman, M. A. C. Pacheco and M. M. B. R. Vellasco (PUC-Rio), "Well Placement Optimization Using a Genetic Algorithm With Nonlinear Constraints," in SPE Reservoir Simulation Symposium, The Woodlands, Texas, USA, 2009.
[8] W. Bangerth, H. Klie, M. F. Wheeler, P. L. Stoffa and M. K. Sen, "On optimization algorithms for the reservoir oil well placement problem," Computational Geosciences, vol. 10, pp. 303-319, 2006.
[9] D. Bourdet, Well Test Analysis: The Use of Advanced Interpretation Models (Handbook of Petroleum Exploration & Production, vol. 3), Elsevier, 2003.
[10] SIAM Conference on Mathematical and Computational Issues in the Geosciences, Long Beach, California, 2011.
[11] M. Allen and D. Smith, "Enhancing Production, Reservoir Monitoring and Joint Venture Development Decisions With a Production Data Centre Solution," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.
[12] R. Borjas, L. Martinez, C. Perez and R. Rodriguez, "Real-Time Drilling Engineering: Hydraulics and T&D Modeling for Predictive Interpretation While Drilling," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.
[13] E. Udofia, M. Akporuno, F. Vandenberg, V. Beijer, G. Oguntimehen and O. Oni, "Advances in Production Allocation: Bonga Field Experience," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.
[14] D. Echeverría Ciaurri, A. R. Conn, U. T. Mello and J. E. Onwunalu, "Integrating Mathematical Optimization and Decision Making in Intelligent Fields," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.
[15] R. A. Morneau, W. G. Van Herreweghe, J. W. H. Little and D. B. Lefebvre, "Energy Company Perspective on Virtual Worlds / 3-D Immersive Environments," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.
[16] F. E. Dupriest, P. Pastusek and M. T. Prim, "The Critical Role of Digital Data in a Drilling Performance Workflow," in SPE Intelligent Energy International, Jaarbeurs, Utrecht, 2012.

Copyright © May 2013 Tessella, all rights reserved. Tessella, the trefoil knot device, and Preservica are trademarks of Tessella.

Tessella - www.tessella.com