
Can My Seismic Data Get Any Better?

The debut of the Interview Series for the GSH Journal features one-on-one conversations with scientists to help connect individuals in the fast-paced, global business of geoscience. We will learn about the people who drive science, the science that inspires people, and the constant machine of progress. These interviews allow readers to enrich their own careers with examples from others' experiences. Each conversation will offer a deeper understanding of the applications of geoscience, different perspectives to enhance an individual's own work, or new information about traditional or inventive methods. We hope you enjoy the interviews to come.


Industry Expert Interview Series: Ron Kerr

Intro: Seismic processing is an integral part of what makes seismic data so valuable for managing subsurface resources, whether drilling and producing oil and gas, managing disposal wells, identifying subsurface hazards, or setting up shallow surveys for construction. Wanting the best seismic image to interpret is not much different from a doctor wanting the highest-resolution brain imaging equipment to make the best decision for a patient. The processing workflow is amazing: it involves taking noisy, raw data; filtering it; computing with correct velocities; and taking care at every step to ensure that the image is the best the geology and technology can provide. Let's delve into the details of processing with an industry expert, Ron Kerr.


Rene’: What interested you in studying geophysics and beginning a career in seismic processing?


Ron: I was somewhat of a math nerd in high school, so I was drawn to something like geophysics. As far as seismic processing, I have always been amazed that you can push a couple of buttons in processing to make the data so much better. Processing to me is a mix of art and science.


Rene’: What is the goal of quality control (QC) in seismic processing?


Ron: There are a lot of individual steps in processing a dataset from start to finish. I believe that if each of these steps is done a bit better than default, then at the end you’ll have a wonderful result. The role of QC is to ensure that each of these incremental steps is, in fact, done a bit better than default.


Rene’: How has unconventional drilling changed seismic processing from conventional drilling projects?


Ron: It seems like unconventional development helped usher in 5D interpolation as a standard practice, along with the sorting of data into unique offset and source/receiver-azimuth traces. Interpolation has always been an optional processing step, but the sorting into offset-vector tiles for 5D creates unique offset/azimuth outputs. One reason this is important is that vertical fracturing can sometimes be determined by looking at migrated gathers and their affiliated products; Vfast, Vslow, and the azimuth of Vfast can give insight into the fracture character of the subsurface.
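Editor's note: for readers curious what that sorting looks like mechanically, below is a minimal Python sketch that groups traces by offset and azimuth. It is an illustration only, not Ron's workflow and not a full offset-vector-tile scheme (true OVTs bin the inline and crossline offset components); the geometry values and bin sizes are invented.

```python
# Minimal sketch (illustration only): grouping traces into offset/azimuth
# bins, the idea behind sorting data into unique offset/azimuth outputs.
import math
from collections import defaultdict

def offset_azimuth(sx, sy, rx, ry):
    """Offset (m) and source-to-receiver azimuth (degrees from north)."""
    dx, dy = rx - sx, ry - sy
    return math.hypot(dx, dy), math.degrees(math.atan2(dx, dy)) % 360.0

def bin_traces(traces, offset_bin=200.0, azimuth_bin=45.0):
    """Group traces into (offset, azimuth) tiles for azimuthal analysis."""
    tiles = defaultdict(list)
    for t in traces:
        off, az = offset_azimuth(t["sx"], t["sy"], t["rx"], t["ry"])
        tiles[(int(off // offset_bin), int(az // azimuth_bin))].append(t)
    return tiles

# Two hypothetical traces landing in different tiles.
traces = [{"sx": 0, "sy": 0, "rx": 300, "ry": 400},
          {"sx": 0, "sy": 0, "rx": -900, "ry": 100}]
for (obin, abin), group in sorted(bin_traces(traces).items()):
    print(f"offset bin {obin}, azimuth bin {abin}: {len(group)} trace(s)")
```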


Detailed noise reduction seems to have increased in importance as well, perhaps in part due to unconventional projects. Land data can be quite noisy. Some of the denoise techniques of years past seem insufficient compared to current techniques. Thankfully, we’ve advanced quite a bit, and better noise reduction is a large part of the advancements.


Rene’: Since the time-processed data is the input to depth imaging and many companies are doing prestack depth migration (PSDM), what is a critical step in time migration?


Ron: To me, every processing step is critical, whether for time migration or depth migration. For land data, the denoise and statics are certainly extremely critical. On land datasets, the denoise seems continuous throughout the project and never stops; you're always reducing the noise. Velocities, velocities, and more velocities are critical for both time migration and especially depth migration. The better the velocity estimation, the better the output: subtle changes in velocities can create not-so-subtle changes in a depth migration.
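Editor's note: a back-of-the-envelope illustration of that last point, using the simple vertical conversion depth = velocity × two-way time / 2. The velocity and travel-time values below are assumed for illustration.

```python
# Editor's illustration (values assumed): how a subtle velocity error
# becomes a not-so-subtle depth error under depth = v * t / 2.
v_true = 3000.0   # assumed average velocity, m/s
twt = 3.0         # assumed two-way travel time, s
for err_pct in (1.0, 2.0, 5.0):
    v_wrong = v_true * (1.0 + err_pct / 100.0)
    depth_shift = (v_wrong - v_true) * twt / 2.0
    print(f"{err_pct:.0f}% velocity error -> {depth_shift:.0f} m depth shift")
# A mere 2% velocity error at 3 s moves a reflector by 90 m in depth.
```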


Rene’: Is quality control worth the additional expense on a project?


Ron: Yes, QC is worth the cost! I spent over 10 years as a hands-on processor at a contractor company, and I had clients that came in for weekly QC visits. Just being a continuous QC presence throughout the project can go a long way toward ensuring the client gets better results.


It is interesting how I’ve had different types of clients, some more hands-on and others not. I remember one client who wanted to come to a QC meeting at the processing center. The meeting was typical: they showed some procedures, I asked a few questions, and the project moved forward. As we were leaving the meeting, my client said to me in the parking lot, “I’m glad you’re on my team and you are up to date on processing, because I'm not!”


I’ve also been privileged to QC higher-profile projects, including surveys over the major gas fields in the Eastern Mediterranean, numerous deepwater Gulf of Mexico surveys (US and Mexico), a tremendous number of Permian 3D volumes, and perhaps even the first seismic survey in South Dakota! Lately I’ve been working with a geothermal company, helping them with an upcoming seismic project.


Rene’: It is interesting that you are involved with a geothermal company. Are you seeing a trend develop away from traditional oil and gas clients?


Ron: I think there have always been clients from outside oil and gas, and I’ve been meeting several. I have some colleagues and clients who have left oil and gas and are now working in the carbon capture industry, for example. One in particular invited me to make a presentation to their company covering land seismic processing.


Rene’: Is there a set rule of thumb for QC, or is QC more organic and problem dependent?


Ron: I have a few set rules. First, everybody is on the same team. We’re all professionals, and we all want good results. Obviously, the client wants good results, but so does the processor. Forming good relationships and trust is quite important. I’ve seen projects go off track when the client and/or processor mistrust each other, which can lead to sub-optimal results. There can be project issues: key people might be sick, have personal problems at home, or be pulled away on other projects. Sometimes you need upfront discussions and understanding to get everything back on track.


When it comes to technical problems: no problem. Every project will have challenges. You work through them, discuss options, make a decision with all involved, document the issues, and move on.


A recent project had a challenge where we thought there might be a problem with the refraction statics. Everybody voiced their opinions on the potential issue, we looked a little deeper and it turned out there was no problem. The original results were fine. Everyone was professional and the “problem” was resolved.


On another project, some steep dips became aliased after migration. I asked a few questions and the processing company realized they made an error putting in the migration parameters. They re-ran the migration with the corrected procedures and the aliasing was gone. Again, not a problem if handled properly.


My favorite QC meetings are when the processor starts off by showing me a problem that they’ve discovered and are proactively working to fix. Without a standard QC meeting, these issues might never be addressed, much less resolved. A lot of seemingly minor mistakes can be buried inside a 3D volume, leaving you with a less-than-desirable SEG-Y product.


Rene’: How often do you collaborate with the client/team during the process?


Ron: I communicate frequently. I always send a full status report to the client after every QC meeting with the processor. The report covers the processing steps completed, with comments about the parameters, what each step accomplishes, any changes to the procedures based on the QC observations, and the next steps forward. There’s also a lot of communication with the processing team. Technical ability is only half of the skill set required for good QC; communication is the other half. Besides written reports, there are emails, texts, PowerPoints, phone calls, and any other client requirements.


Rene’: Can you share an example of a very interesting/challenging/odd processing problem and its solution?


Ron: I actually have a few.


On one project, a cycle skip was found that put a false structural fault into the data after residual statics. After some questions, it turned out the processor had recently added adjacent velocity picks that differed; this caused the events to misalign laterally, leading to the cycle skips and a false fault. The velocities were fixed and the residual statics re-run, with better results.
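Editor's note: residual statics programs commonly estimate trace time shifts by cross-correlation, and a cycle skip happens when the correlation peak locks onto a neighboring cycle of the wavelet instead of the true one. Below is a small, self-contained Python illustration, not from the project described; the wavelet, shift, and search window are invented to force the skip.

```python
# Toy cycle-skip demo: a trace delayed by ~one wavelet period, picked by
# cross-correlation inside a search window smaller than the true shift.
import numpy as np

dt = 0.002                                       # sample interval, s
t = np.arange(-0.1, 0.1, dt)
f = 30.0                                         # dominant frequency, Hz
wavelet = np.cos(2 * np.pi * f * t) * np.exp(-(t / 0.03) ** 2)

true_shift = 0.034                               # s, about one 30 Hz period
shifted = np.interp(t - true_shift, t, wavelet, left=0.0, right=0.0)

xcorr = np.correlate(shifted, wavelet, mode="full")
lags = (np.arange(xcorr.size) - (wavelet.size - 1)) * dt
window = np.abs(lags) <= 0.025                   # search window too small
est = lags[window][np.argmax(xcorr[window])]
# The picker grabs the correlation peak one period away (~0 ms).
print(f"true shift {true_shift*1000:.0f} ms, estimated {est*1000:.0f} ms")
```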


On another project, I noticed that a processor had inserted artificially fast migration velocities at the surface. There was no evidence for them, other than that the client had previously made an off-hand comment that there might be thin salts locally in the near surface. The processor took that to mean fast velocities should go in everywhere in the shallow section. The communication was cleared up, and the velocities were improved.


It’s common for processors to show a modestly changing color bar for velocity displays; however, the gradual transitions in such color bars can often hide issues. I’ve had to convince processors to use a more detailed color bar, revealing non-geological bulls-eyes in the velocities. Velocity bulls-eyes aren’t necessarily wrong, but they might indicate something to discuss. This is even more critical in depth migrations.
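Editor's note: a quick Python/matplotlib sketch of the color-bar effect Ron describes. The velocity field and the small bulls-eye anomaly here are synthetic; the same data is plotted with a coarse color scale that hides the anomaly and a detailed one that reveals it.

```python
# Same synthetic velocity field, coarse vs. detailed color bar.
import numpy as np
import matplotlib.pyplot as plt

x, z = np.meshgrid(np.linspace(0, 10_000, 200), np.linspace(0, 4_000, 120))
v = 1800.0 + 0.6 * z                            # smooth background gradient
# Subtle 80 m/s non-geological bulls-eye, tiny next to the 2400 m/s trend.
v += 80.0 * np.exp(-(((x - 5_000) / 400) ** 2 + ((z - 2_000) / 200) ** 2))

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, levels, title in [(axes[0], 8, "coarse color bar"),
                          (axes[1], 64, "detailed color bar")]:
    im = ax.contourf(x, z, v, levels=levels, cmap="viridis")
    ax.set_title(title)
    ax.set_xlabel("x (m)")
    fig.colorbar(im, ax=ax, label="velocity (m/s)")
axes[0].set_ylabel("depth (m)")
axes[0].invert_yaxis()                          # shared y-axis: invert once
plt.tight_layout()
plt.show()
```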


Staying on the topic of velocities: on another project, a processor showed velocities along a depth slice at the gas flat spot, with a detailed color bar as mentioned above. The display showed velocities artificially aligning with the marine streamer acquisition direction, and the processor removed these velocity artifacts generated by the acquisition footprint.


Rene’: What are some innovative or revolutionary processing techniques you have experienced since the 1980s?


Ron: Depth migration would be number one. I was actually a depth migration manager at a processing company for over ten years and saw the growth and benefits of the technology over those years.


Another advancement would be 5D interpolation, which enhances the signal-to-noise uplift, along with the related sorting of data into unique offsets and azimuths.


Similar in impact are the improvements in noise reduction for land data. Some older land seismic surveys had the vibroseis sweep starting at 10 Hz, 12 Hz, or even higher; one of the motivations for the high starting frequency was to reduce the amount of ground roll in the data. Current denoise algorithms are much better at removing noise at all frequencies without harming the underlying signal, so modern surveys are acquired with much lower starting frequencies. The lower signal frequencies can now be retained, and they help shape the wavelets for better interpretation.
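Editor's note: a small Python illustration of why the retained lows help shape the wavelet. Comparing zero-phase wavelets built from flat 4-80 Hz versus 10-80 Hz bands (band limits assumed for illustration) shows the low-cut wavelet carries noticeably stronger side lobes, i.e., more ringing for the interpreter.

```python
# Side-lobe strength of zero-phase wavelets with and without the lows.
import numpy as np

def zero_phase_wavelet(f_lo, f_hi, n=512, dt=0.002):
    """Zero-phase wavelet with a flat (boxcar) spectrum from f_lo to f_hi."""
    freqs = np.fft.rfftfreq(n, dt)
    spec = ((freqs >= f_lo) & (freqs <= f_hi)).astype(float)
    return np.fft.fftshift(np.fft.irfft(spec, n))

dt = 0.002
for f_lo in (4.0, 10.0):
    w = zero_phase_wavelet(f_lo, 80.0, dt=dt)
    c = int(np.argmax(np.abs(w)))            # center of the main lobe
    guard = int(0.006 / dt)                  # exclude +/- 6 ms around it
    side = np.abs(np.concatenate([w[:c - guard], w[c + guard + 1:]])).max()
    print(f"{f_lo:>4.0f}-80 Hz band: side-lobe/main-lobe = {side/abs(w[c]):.2f}")
```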


Currently, the advances of full waveform inversion (FWI) for depth migration are stunning. Some examples are on my LinkedIn pages, Ron Kerr and Houston Seismic.


Rene’: Thank you for sharing your experiences and for emphasizing that quality control is important to the workflow and communications of a successful project. Some of the stories you have shared show that small issues left unresolved can lead to errors in the interpretation of the data later on. Advances over the years in workflows, computing power, and algorithms have delivered step-function improvements to the end user. Seismic processing is not a stagnant science, as there are ongoing advancements in processes: FWI, 5D interpolation, migration, denoise, and others. Industries like geothermal and carbon capture are using seismic data to reduce risk.


Ron: Thank you, Rene’, for the opportunity to answer your questions and for all you do with GSH.
