As a group of EAGE members and volunteers, the EAGE A.I. Committee is dedicated to helping you navigate the digital world and find the content that is most relevant to geoscientists.
This month’s newsletter provides an overview of the activities we have prepared for the EAGE Annual Conference, to be held in Oslo next June. We hope to count on your active participation!
You are welcome to join EAGE or renew your membership to support the work of the EAGE A.I. Community and access all the benefits offered by the Association.
EAGE Annual Hackathon – ‘Coding to Net-Zero: AI for Energy-Efficient Future’
Summary by: Oleg Ovcharenko and Ashley Russell
As part of our commitment to driving innovation in the field of geosciences, engineering and AI, the EAGE AI Committee, in collaboration with NVIDIA, is thrilled to announce the upcoming hackathon event, “Coding to Net-Zero: AI for Energy-Efficient Future,” taking place on 9-10 June in Oslo, during the EAGE Annual Conference.
What?
This two-day event promises to be a deep dive into the latest AI technologies and methodologies, focusing on their application towards achieving a sustainable, energy-efficient future. Participants will have the opportunity to tackle real-world challenges, collaborate with like-minded professionals, and push the boundaries of what’s possible with artificial intelligence.
A few possible ideas include:
* LLM Tuning or Optimization for Domain Data: Tasks include fine-tuning LLMs on geophysical research papers, building evaluation datasets for niche domains, creating RAG assistants, optimizing data collection, and developing faster AI for seismic processing.
* Physics-ML Integration: Projects range from reservoir and wind simulations using ResSim and Modulus, to optimization of wind turbine positioning and predicting CCS in data-poor areas.
* Open Subject: Participants are encouraged to bring their unique problems, algorithms, code or tools or to try optimizing open-source projects, showcasing their innovations.
Workshops:
To maximize your hackathon experience, we are hosting two online workshops prior to the event. Registered participants will receive all the information a week before the session. The first will cover data curation and LLM fine-tuning with the NeMo Framework, while the second will focus on end-to-end reservoir simulation using PINN/FNO within the Modulus framework.
Join us! This event is not just a competition; it’s an opportunity to contribute to significant advancements in energy sustainability, network with industry and academic leaders, and showcase your skills on a global platform.
For detailed instructions and to register, visit our hackathon instructions page. Make sure to select the Hackathon option on the EAGE Annual registration page to confirm your participation.
The winners of the hackathon will be presenting their ideas and solutions in the Energy Transition Theatre during the EAGE Annual Conference.
Image credit: DALL-E
Workshop on generative AI – deep dive into theory and practical subsurface use cases
Summary by: Lukas Mosser and George Ghon
What?
Following the recent success of generative models such as ChatGPT, released by OpenAI, we will organize a focused event covering the topic through a subsurface lens during the upcoming EAGE Annual Conference in Oslo. Leading industry figures will provide an introduction to the theory behind modern natural language processing architectures and introduce use cases that have been developed by operators on the Norwegian Continental Shelf.
In addition, we are offering a practical session with hands-on labs provided by Microsoft, which will allow participants to gain experience with the tools needed to put these machine learning models into production and serve them to users in an enterprise environment. To round off this full-day workshop, selected participants will join a panel discussion that further outlines the opportunities and risks of generative models in the geoscience space.
With this workshop, the EAGE committee hopes to deliver an up-to-date event in the rapidly evolving space of artificial intelligence.
Eager to know more?
For detailed information and instructions to register, please visit the dedicated site on the EAGE Annual homepage. Make sure to select Workshop 17 on the EAGE Annual registration page to confirm your participation.
Curious about real applications of ChatGPT for your daily work?
Summary by: Ashley Russell
What?
Ever wondered how to use ChatGPT or similar tools for technical work tasks in geoscience and engineering? Join the members of the AI committee at the Digital Transformation Theatre during the EAGE Annual in Oslo to learn some of our favorite technical tips and tricks on how ChatGPT can help you with specific tasks around data preparation and data analysis.
This session is meant to be informal: you are welcome to bring your phone or laptop and try prompts together with us. There will also be time for questions about security, prompt engineering, and how Large Language Models work.
Image credit: DALL-E
Transformer Networks
Summary by: Jan H. van de Mortel
What?
Two common architectures for capturing internal structure in data are Convolutional Neural Networks (CNN; although ‘correlational’ would, strictly speaking, be the more accurate term) and Recurrent Neural Networks (RNN, including LSTM).
Without going into the exact details of either approach: the CNN design excels at capturing the immediate vicinity around any point in the data, hence ‘short distance / wavelength’, but typically does not capture longer-distance / longer-wavelength relations.
RNN-type networks analyze data sequentially, capturing only the adjacent data points at each step. This means they cannot, in principle, be parallelized and are difficult to scale up. Moreover, influence from more distant data points is by its nature indirect at best, hard to trace, and usually fades quickly.
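As a toy illustration of the ‘short distance’ limitation of convolutions, the NumPy sketch below (a minimal example, not tied to any particular deep learning framework; the signal and filter values are arbitrary) shows that a single 1-D convolution mixes each sample only with its immediate neighbours, so two distant events never influence each other within one layer:

```python
import numpy as np

# Two unit "events", 9 samples apart, in a 16-sample signal.
signal = np.zeros(16)
signal[3], signal[12] = 1.0, 1.0

# A 3-point smoothing filter: each output sample depends on
# at most 3 neighbouring input samples.
kernel = np.array([0.25, 0.5, 0.25])

out = np.convolve(signal, kernel, mode="same")

# Energy spreads only one sample to each side of every event;
# the two events remain completely independent after one layer.
print(out.nonzero()[0])  # -> [ 2  3  4 11 12 13]
```

Stacking many such layers (or dilating the kernels) gradually widens the receptive field, which is exactly the workaround the Transformer architecture below makes unnecessary.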
The Transformer architecture is designed to process the entire data item (sequence, wavelet, image, etc.) simultaneously, capturing the whole internal structure all at once regardless of internal distance / wavelength.
The fundamental building blocks are self-attention (a mechanism for weighing the relative importance of all the individual elements in the data) and positional encoding (which retains the relative positions of those elements within the data).
In practical terms, this means that both the small-scale details and the overall context are effectively analyzed. Other advantages include that the architecture is structurally well suited to upscaling and parallelization (using GPUs). A good introduction to the details is given in this presentation by Jay Alammar.
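The two building blocks described above can be sketched in a few lines of NumPy. This is a toy, single-head illustration (the dimensions and random weights are arbitrary assumptions for demonstration), not a production implementation:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding, as in 'Attention Is All You Need'."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model)[None, :]          # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    # Even feature indices get sine, odd indices get cosine.
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # all pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ V                       # weighted mix over ALL positions

# Toy data: 5 positions (e.g. samples along a trace), 8 features each.
rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model)) + positional_encoding(seq_len, d_model)
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))

out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Note that the `scores` matrix relates every position to every other position in a single step, regardless of their separation; this is the key contrast with the local convolution and sequential RNN discussed above.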
The technique was first published in the 2017 paper ‘Attention Is All You Need’. The first applications were in natural language processing; indeed, online tools such as ChatGPT are based on it.
This was followed by applications in image processing; see this link for an overview of the adoption of the technology in the medical field.
In practice, training a Transformer alone requires a large training dataset and many epochs (expensive in both time and hardware). Hybrid approaches that combine a CNN with a Transformer have proven very effective and far more efficient.
Image credit: A. Vaswani et al. (2017), ‘Attention Is All You Need’.
Why is this useful?
In conclusion, besides forming the basis for large language models such as ChatGPT, this technique has the potential to significantly benefit seismic, sonic, and borehole-imaging applications, among others. Current experiments are already demonstrating this.
Curious to know what EAGE is doing for the digital transformation?
This newsletter is edited by the EAGE A.I. Committee.
| Name | Company / Institution | Country |
|---|---|---|
| Jan H. van de Mortel | Independent | Netherlands |
| Julio Cárdenas | Sorbonne Université | France |
| George Ghon | Capgemini | Norway |
| Lukas Mosser | Aker BP | Norway |
| Oleg Ovcharenko | NVIDIA | United Arab Emirates |
| Nicole Grobys | DGMK | Germany |
| Roderick Perez | OMV | Austria |
| Surender Manral | Schlumberger | Norway |