Speaker(s): Paul Baville

Date: Thursday 21st November 2019 - 1:00 pm

Location: room G201, ENSG, Nancy

Abstract:

My main PhD topic is computer-assisted well correlation. I will present two correlation rules, based on (1) facies interpretations and (2) time surface geometry, applied to a North Sea data set provided by Equinor ASA.

  1. Facies correlation. Wells are cored and these cores are interpreted into several facies, categorized by their paleo-depth (from distal to continental deposits). Wells are also categorized by their distality in the sedimentary basin and by the main sediment transport direction in the basin. Knowing the well distality and the facies paleo-depth, it is possible to compute a correlation cost for each pair of markers (between two wells).
  2. Time surface geometry. Dipmeter data are acquired along well paths during drilling. Using these 3D data, it is possible to generate a 2D curve (or a 3D surface) between two markers (or three or more markers) using Bézier curves (or surfaces). In this seminar I will focus on 2D curves and 3D triangles (correlations between two and three wells, respectively). Knowing the paleo-environment, the Bézier curve or triangle is generated; comparing its geometry to the top and base reservoir surfaces then leads to accepting or rejecting the correlation (see the sketch after this list).
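As a rough geometric illustration of item 2 (an assumption-laden sketch, not the seminar's implementation), a cubic Bézier curve between two correlated markers can be oriented by the dip measured at each well; the tangent-length heuristic and all variable names below are illustrative choices.

```python
import numpy as np

def bezier_between_markers(p0, p1, dip0_deg, dip1_deg, n=50):
    """Cubic Bezier curve in a 2D (distance, depth) section between two well
    markers p0 and p1, with end tangents oriented by the local dips.
    Illustrative sketch: the tangent length (one third of the inter-well
    distance) is an arbitrary heuristic."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = np.linalg.norm(p1 - p0) / 3.0                       # heuristic tangent length
    t0 = np.array([np.cos(np.radians(dip0_deg)), -np.sin(np.radians(dip0_deg))])
    t1 = np.array([np.cos(np.radians(dip1_deg)), -np.sin(np.radians(dip1_deg))])
    c0, c1 = p0 + d * t0, p1 - d * t1                       # interior control points
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t)**3 * p0 + 3 * (1 - t)**2 * t * c0
            + 3 * (1 - t) * t**2 * c1 + t**3 * p1)

# Hypothetical example: markers 1 km apart, dipping 5 degrees at well A and 2 degrees at well B.
curve = bezier_between_markers((0.0, -1500.0), (1000.0, -1520.0), 5.0, 2.0)
```

The resulting polyline could then be compared to the top and base reservoir surfaces to accept or reject the correlation, as described above.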

Speaker(s): Nicolas Clausolles

Date: Thursday 17th of October 2019

Location: room G201, ENSG, Nancy

Abstract:

Seismic imaging of salt environments is prone to large uncertainties due to the peculiar physical properties of salt. Many studies over the last decades have focused on developing automatic interpretation methods. These methods aim at determining the best possible estimate of the salt boundary, but do not allow for the characterization of the uncertainties underlying the generation of seismic images.
During my three years of PhD, I have been working on the development of a method for stochastically generating different structural interpretations of salt bodies from a given seismic image. During this seminar, I will present the method I have developed and early results on its application to characterizing and sampling the uncertainties underlying seismic images.

Speaker(s): Corentin Gouache

Date: Tuesday 8th of October 2019

Location: room G201, ENSG, Nancy

Abstract:

Probabilistic seismic risk approaches are sensitive to a key input parameter: the seismicity catalog. It conditions the spatial distribution as well as the frequency and severity of the possible modeled events. In order to generate a multitude of series of independent earthquakes for France (mainland and the French Antilles), an approach based on the inter-event times between earthquakes (Hainzl et al., 2006) is presented here. It allows us to compute the percentage of independent earthquakes contained in a seismicity catalog. Multiplying this percentage by the annual frequency of all earthquakes per magnitude bin (Gutenberg-Richter, GR) yields a temporal distribution of independent earthquakes as a function of magnitude. However, seismicity catalogs contain uncertainties and gaps: low magnitudes are poorly represented owing to the resolution and geometry of the seismometer network, and the main catalog parameters (3D location and magnitude) carry uncertainties due to the velocity models used and to the methods employed to compute the magnitude.
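As a minimal illustration of the scaling step described above, the sketch below multiplies the fraction of independent events by the Gutenberg-Richter rate per magnitude bin; the a- and b-values and the independence fraction are placeholder numbers, not the actual catalog values.

```python
import numpy as np

def independent_event_rates(a, b, p_independent, mag_bins):
    """Annual rate of independent earthquakes per magnitude bin, obtained by
    scaling a Gutenberg-Richter law, log10 N(>=M) = a - b*M, by the fraction
    of independent events estimated from inter-event times.
    Illustrative sketch; a, b and p_independent are placeholder inputs."""
    m_lo, m_hi = mag_bins[:-1], mag_bins[1:]
    n_all = 10.0**(a - b * m_lo) - 10.0**(a - b * m_hi)   # all events per bin
    return p_independent * n_all                          # independent events per bin

# Hypothetical values: a = 4.0, b = 1.0, 60% of events independent,
# magnitude bins of width 0.5 between 3.0 and 6.0.
rates = independent_event_rates(4.0, 1.0, 0.6, np.arange(3.0, 6.5, 0.5))
```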

In order to work only with catalogs that are complete in terms of event frequency and magnitude over different historical time spans, an optimization method based on minimizing a residual between a theoretical GR law and an observed GR law (Wiemer and Wyss, 2000) is applied. From the initial catalog, it is thus possible to keep only the information that satisfies these criteria and that feeds the stochastic generation of earthquakes.
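The sketch below illustrates this kind of residual minimization, in the spirit of the goodness-of-fit approach of Wiemer and Wyss (2000): for each trial completeness magnitude, a GR law is fitted above the cutoff (here with the classical Aki maximum-likelihood b-value) and compared to the observed cumulative counts; the thresholds and variable names are illustrative assumptions, not the implementation presented in the seminar.

```python
import numpy as np

def completeness_magnitude(mags, trial_mc, dm=0.1):
    """Pick the completeness magnitude by minimizing the residual between the
    observed cumulative frequency-magnitude distribution and a Gutenberg-Richter
    law fitted above each trial cutoff (sketch inspired by Wiemer & Wyss, 2000)."""
    residuals = []
    for mc in trial_mc:
        m = mags[mags >= mc]
        if m.size < 50:                                     # arbitrary minimum sample size
            residuals.append(np.inf)
            continue
        b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))   # Aki (1965) estimate
        a = np.log10(m.size) + b * mc                       # so that N(>=mc) matches
        bins = np.arange(mc, m.max() + dm, dm)
        observed = np.array([(m >= x).sum() for x in bins])
        modeled = 10.0**(a - b * bins)
        residuals.append(np.abs(observed - modeled).sum() / observed.sum())
    return trial_mc[int(np.argmin(residuals))]
```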

Finally, in order to propagate the uncertainties on the different parameters, and in particular on earthquake magnitude, a Monte Carlo approach is used. From the base seismicity catalog, an unlimited number of initial catalogs can be generated. Each of these catalogs contains the same number of earthquakes, whose magnitudes are drawn from a normal distribution centered on the initial magnitude value with a spread given by its uncertainty.
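A minimal sketch of this Monte Carlo perturbation, with hypothetical magnitudes and uncertainties:

```python
import numpy as np

def perturbed_catalogs(magnitudes, sigmas, n_catalogs, seed=0):
    """Generate Monte Carlo realizations of the initial catalog: each realization
    keeps the same number of earthquakes, with every magnitude redrawn from a
    normal law centered on its catalog value with its own uncertainty.
    Sketch only; sigmas would come from the catalog's reported uncertainties."""
    rng = np.random.default_rng(seed)
    return rng.normal(loc=magnitudes, scale=sigmas,
                      size=(n_catalogs, len(magnitudes)))

# Hypothetical example: 3 perturbed versions of a 5-event catalog.
cats = perturbed_catalogs(np.array([3.2, 4.1, 3.8, 5.0, 4.4]),
                          np.array([0.2, 0.1, 0.3, 0.2, 0.1]), 3)
```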

Speaker(s): Modeste Irakarama

Date: Thursday 18th of July 2019

Location: room G201, ENSG, Nancy

Abstract:

Potential-field methods, such as gravity and magnetics, are extensively used for subsurface exploration of mineral deposits and geothermal reservoirs. In the last five decades, a number of efficient schemes for forward modeling of potential fields have been proposed. In this seminar, I will present a basic review of these forward modeling methods, focusing mainly on their numerical aspects. These methods rely on an efficient implementation of an integral solution to the Poisson equation. Because this integral solution does not require matrix inversion, and because the Poisson equation appears in a number of physical phenomena, I will conclude the talk by contemplating the possibility of solving physical problems on implicit subsurface structural models without matrix inversion. The physical problems of interest obviously include gravity and magnetic problems, which are the subject of this talk, but also other problems such as fluid flow in the subsurface.
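As a toy example of such an integral, matrix-free solution, the sketch below evaluates the vertical gravity effect of a discretized density model by direct summation of point-mass contributions; this is an assumption-laden simplification (practical codes typically use analytic prism formulas), not one of the schemes reviewed in the seminar.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_gz(obs, cell_centers, cell_volumes, densities):
    """Vertical gravity at observation points 'obs' (n, 3) produced by a density
    model discretized into cells, using the point-mass approximation of the
    integral solution to the Poisson equation: a direct sum, no matrix inversion.
    Assumes observation points lie outside the cells (z axis pointing down)."""
    gz = np.zeros(len(obs))
    for i, p in enumerate(obs):
        r = cell_centers - p                  # vectors from obs point to cell centers
        dist = np.linalg.norm(r, axis=1)
        gz[i] = G * np.sum(densities * cell_volumes * r[:, 2] / dist**3)
    return gz
```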

Speaker(s): Anais Ibourichene

Date: Thursday 11th of July 2019

Location: room G201, ENSG, Nancy

Abstract:

The presence of fractures or cracks in the crust of the Earth is a source of the anisotropy detected by seismic waves.
Previous papers have built analytical solutions or numerical simulations to determine how such features can affect the properties of crustal rocks. They showed that the impact of fractures and cracks on a background medium depends on characteristics such as their orientation, density or size. These parameters are key to understanding the effective properties of a fractured medium.
However, seismic waves are not sensitive to structures much smaller than their wavelength. The homogenization method will therefore be used to provide the effective properties of a given 2D/3D model and to represent what seismic waves are able to “see” when probing fractures or cracks. In particular, the application of this tool to a fractured medium will allow us to determine how the different characteristics of fractures impact a background medium.
During this seminar, the concept of a fractured medium in rock mechanics will first be presented. The theory behind homogenization will then be briefly introduced, before presenting its applications and advantages for fractured media. Finally, the perspectives of this work will be discussed.

Speaker(s): Guillaume Caumon

Date: Thursday 20th of June 2019

Location: room G201, ENSG, Nancy

Abstract:

In this bibliographic seminar, I will present some recent work on structural uncertainty done in Nice (Thea Ragon's PhD) and at UWA (Evren Pakyuz-Charrier's paper). Ragon proposes a way to include fault geometric uncertainty when inverting seismic rupture models from interferometric (SAR) data. Pakyuz-Charrier proposes a topological distance to cluster stochastic structural models obtained by data perturbation and implicit modeling.
In the seminar, I will summarize and discuss these two approaches.
Docs: https://www.solid-earth-discuss.net/se-2019-78/#discussion
and https://academic-oup-com.insu.bib.cnrs.fr/gji/article/214/2/1174/4996353

Speaker(s): Paul Baville

Date: Thursday 23rd of May 2019

Location: room G201, ENSG, Nancy

Abstract:

As a rehearsal of my EAGE oral presentation, I will present some results obtained during my graduation internship at OMV. These results have been summarized in an EAGE extended abstract whose authors are Paul Baville, Jörg Peisker (OMV), and Guillaume Caumon.

This paper addresses stratigraphic uncertainty and its impact on subsurface forecasts. For this, we introduce a new assisted, automatic method which detects possible sequence boundaries from well log data. This method uses multi-scale signal analysis (the discrete wavelet transform) to compute the probability density of finding maximum flooding surfaces and maximum regressive surfaces as a function of depth. It then recursively decomposes the studied stratigraphic section into sub-intervals where the analysis is repeated. We applied this method to a shallow-marine, wave-dominated, siliciclastic reservoir located in the Vienna Basin. We observe that several reservoir models with different stratigraphic layerings (keeping all other parameters constant) exhibit different reservoir behaviors. This allowed us to locally resolve the mismatch between measured and simulated tracer tests. This illustrates the significance of stratigraphic uncertainties in reservoir modeling and the role of automatic methods in helping assess and reduce these uncertainties.
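As a toy illustration of multi-scale well-log analysis with the discrete wavelet transform (not the probability-density workflow of the abstract; the wavelet choice and decomposition level below are arbitrary assumptions):

```python
import numpy as np
import pywt

def coarse_scale_trend(log_values, wavelet='db4', level=4):
    """Multi-scale decomposition of a well log with the discrete wavelet
    transform: keep only the coarsest approximation, whose extrema can serve
    as candidate sequence boundaries. Toy illustration of multi-scale signal
    analysis, not the published probability-density method."""
    coeffs = pywt.wavedec(log_values, wavelet, level=level)
    # Zero out all detail coefficients, keep the approximation only.
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(coeffs, wavelet)
    return trend[:len(log_values)]   # waverec may pad by one sample
```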

Speaker(s): Nicolas Clausolles

Date: Thursday 16th of May 2019

Location: room G201, ENSG, Nancy

Abstract:

Salt welds are surfaces or zones resulting from the (nearly) complete removal of salt from an initial layer or diapir, bringing originally separated sedimentary layers into contact. They are still poorly known and few studies and data are available in the scientific literature, although their impact on reservoir sealing is widely recognized.

This seminar is an occasion to present my recent work on the modeling of salt welds, starting from an initial implicit structural model in which salt geobodies have already been interpreted. The first step consists in detecting the isolated salt volumes in the model that need to be connected. The weld surface is then extracted from the scalar field defining the structural model using an image segmentation method, watershed segmentation. This segmentation defines a semi-infinite plane through the model, which is finally truncated using visibility criteria to obtain the final weld surface.
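A toy sketch of the first two steps on a regular grid, using connected-component labeling and watershed segmentation from SciPy and scikit-image; the inputs, names and boundary extraction below are assumptions for illustration, not the actual implementation.

```python
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed

def weld_candidate_boundary(salt_mask, scalar_field):
    """(1) Label isolated salt volumes with a connected-component analysis,
    (2) run a watershed on the structural scalar field seeded by those volumes;
    cells where adjacent catchment basins meet form a candidate weld zone."""
    labels, n_bodies = ndimage.label(salt_mask)              # step 1: isolated salt bodies
    segmented = watershed(scalar_field, markers=labels)      # step 2: watershed segmentation
    # Cells whose neighbor along the first axis belongs to another basin.
    boundary = np.zeros_like(segmented, dtype=bool)
    boundary[:-1] = segmented[:-1] != segmented[1:]
    return n_bodies, boundary
```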

Speaker(s): Melchior Schuh-Senlis

Date: Tuesday 11th of April 2019

Location: room G201, ENSG, Nancy

Abstract:

The Finite Element Method (FEM) is widely used to solve partial differential equations. It relies on performing computations on a reference element and linking the results to the rest of the model through shape functions. Since multigrids have elements that are very similar to each other (rectangles of different sizes in 2D, hexahedra in 3D), applying the FEM on them simplifies the associated shape functions while reducing the computational cost compared to very fine regular grids. The aim of this seminar is to show how the deal.II library deals with this problem, especially the handling of hanging nodes, and the progress I have made in using it to implement mechanical simulations of the subsurface.
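To make the hanging-node issue concrete, here is a small linear-algebra sketch in plain NumPy (not deal.II code) of how a hanging degree of freedom can be eliminated through a constraint matrix; deal.II's AffineConstraints class automates this kind of bookkeeping for adaptively refined meshes.

```python
import numpy as np

def condense_hanging_node(K, f, hanging, parents):
    """Conceptual sketch of hanging-node handling in the FEM: the value at a
    hanging node is constrained to the average of its two parent vertices,
    u_h = 0.5 * (u_a + u_b). Building the constraint matrix C and forming
    C^T K C eliminates the hanging degree of freedom from the system K u = f."""
    n = K.shape[0]
    keep = [i for i in range(n) if i != hanging]
    C = np.zeros((n, n - 1))                     # maps reduced dofs to full dofs
    for new, old in enumerate(keep):
        C[old, new] = 1.0
    C[hanging, keep.index(parents[0])] = 0.5     # constrain hanging dof to the
    C[hanging, keep.index(parents[1])] = 0.5     # average of its parent vertices
    return C.T @ K @ C, C.T @ f                  # condensed system
```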