Workflow Tectonics

Why Your Core Log and Your Seismic Section Are Telling Two Different Stories: A Process Comparison for Subsurface Teams

This guide explores the persistent challenge subsurface teams face when core log data and seismic sections appear to contradict each other. Working at a conceptual, workflow-comparison level, we explain why these discrepancies occur, rooted in differences in scale, resolution, measurement physics, and interpretation workflows. The article provides a step-by-step framework for reconciling the two datasets and compares three common integration approaches: direct calibration with rock physics templates, statistical inversion, and machine learning.

Introduction: When the Data Disagrees, the Workflow Is Usually the Culprit

Every subsurface team has faced the moment: the core log from a newly drilled well shows a clear, blocky sand with excellent porosity, yet the seismic section crossing the same location displays a dim, chaotic amplitude with no obvious reflection continuity. The geologist insists the core is ground truth. The geophysicist argues the seismic is the only continuous constraint. The reservoir engineer wonders which dataset to trust for the static model. This tension is not a failure of people but a mismatch of processes. The core log and the seismic section record fundamentally different properties at vastly different scales, using different physics, and are interpreted through different workflows. As of May 2026, this guide offers a structured process comparison to help teams dissect why the stories diverge and how to reconcile them without sacrificing intellectual honesty.

The scenarios discussed here are composites rather than specific case studies; the focus is on the conceptual frameworks that drive interpretation. The goal is not to declare one dataset superior but to equip teams with a decision tree: when to trust the core, when to trust the seismic, and when to suspect both are telling partial truths. This overview reflects widely shared professional practice and should be verified against current official guidance where applicable.

The Fundamental Scale Gap: Why Core and Seismic Cannot Agree on Detail

The most common root cause of apparent disagreement between core logs and seismic sections is scale. A core sample is typically a few inches in diameter and represents a few feet of vertical section. A seismic trace, depending on frequency content and processing, integrates over tens to hundreds of feet vertically and hundreds of feet laterally. Asking these two datasets to tell the same story without a scale-bridging process is like comparing a high-resolution photograph to a satellite image: both are accurate, but they describe different levels of detail.

The Vertical Resolution Trap

Many teams assume that if a core shows a 2-foot sand bed, the seismic should show a corresponding reflection. In practice, seismic resolution is limited by the dominant wavelength. For typical exploration frequencies (20-50 Hz), vertical resolution is roughly 25 to 100 feet. A 2-foot bed is below tuning thickness and will not produce a distinct reflection event. Instead, it may appear as a composite amplitude variation or be completely invisible. The core log is not wrong—it is simply revealing details the seismic cannot see.
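The quarter-wavelength rule behind these numbers can be sketched as a quick check. The velocity and frequency inputs below are illustrative assumptions, not values from any specific survey.

```python
# Quick sketch of the quarter-wavelength (tuning) resolution rule.
# Velocity and frequency inputs are illustrative assumptions.

def tuning_thickness_ft(velocity_ft_s, dominant_freq_hz):
    """Approximate vertical resolution: one quarter of the dominant wavelength."""
    wavelength_ft = velocity_ft_s / dominant_freq_hz
    return wavelength_ft / 4.0

# 10,000 ft/s interval velocity at 25 Hz dominant frequency:
print(tuning_thickness_ft(10_000, 25))  # 100.0 -> a 2-ft bed is far below tuning
```

At 50 Hz the same velocity gives 50 feet, which brackets the 25-to-100-foot range quoted above.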

In one composite scenario, a team mapped a reservoir based on seismic amplitude anomalies and drilled a well targeting a bright spot. The core showed thin, high-porosity sands interbedded with shales. The seismic amplitude matched the gross package but not the individual beds. The team initially thought the core and seismic disagreed on net pay. After applying a Backus average to the core-derived elastic properties, they found that the effective medium response matched the seismic amplitude within uncertainty. The mismatch was not a data conflict—it was a scale artifact.

Lateral Resolution and the Smearing Effect

Seismic data also suffers from limited lateral resolution due to the Fresnel zone and migration aperture. A core sample represents a point measurement. A seismic bin may average over hundreds of feet, including multiple lithologies and fluid contacts. When the core shows a sharp fluid contact but the seismic shows a gradual amplitude change, the discrepancy is often due to spatial averaging.

Teams should routinely compute the effective resolution limits of their seismic data (via amplitude spectra and migration tests) and compare them against core descriptions before diagnosing a mismatch. A decision rule of thumb: if the feature of interest is smaller than one quarter of the dominant wavelength in thickness or smaller than the bin spacing in lateral extent, expect the seismic to smear or miss it entirely.
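The decision rule above can be written as a simple screening function. The thresholds follow the text's rule of thumb; the example inputs are hypothetical.

```python
# Screening rule sketched from the text: expect the seismic to smear or miss
# a feature thinner than lambda/4 or narrower than the bin spacing.
# All numeric inputs below are hypothetical.

def seismic_can_resolve(thickness_ft, lateral_extent_ft,
                        velocity_ft_s, dominant_freq_hz, bin_spacing_ft):
    wavelength_ft = velocity_ft_s / dominant_freq_hz
    vertical_ok = thickness_ft >= wavelength_ft / 4.0
    lateral_ok = lateral_extent_ft >= bin_spacing_ft
    return vertical_ok and lateral_ok

# A 2-ft bed, 300 ft across, imaged at 25 Hz with 110-ft bins:
print(seismic_can_resolve(2, 300, 10_000, 25, 110))    # False -> expect a miss
print(seismic_can_resolve(150, 500, 10_000, 25, 110))  # True  -> should image
```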

Measurement Physics: Porosity, Impedance, and the Translation Problem

Core logs measure physical properties directly: porosity, grain density, permeability, and sometimes sonic velocity. Seismic sections measure acoustic impedance contrasts—the product of density and compressional velocity. Translating between these domains requires a rock physics model. When the model is inappropriate or absent, the stories diverge.

Porosity vs. Impedance: A Nonlinear Relationship

Porosity and acoustic impedance are not linearly correlated except in simple, clean lithologies. In shaly sands, carbonate vugs, or fractured reservoirs, the relationship breaks down. A core may show 20% porosity, but if the rock contains clay or microporosity, the impedance may be higher than expected, producing a dim seismic amplitude that suggests lower quality rock. Conversely, low-porosity, fractured carbonates can have low impedance due to fluid-filled fractures, causing the seismic to overpredict reservoir quality.

In a typical deepwater turbidite project, a team calibrated a linear porosity-impedance transform from core data and applied it to the seismic inversion volume. The resultant porosity map showed a high-quality fairway that contradicted the core-derived porosity from a later well. The mismatch was traced to variable clay content. The team then used a multi-mineral model (e.g., combining quartz, clay, and fluid components) and found that the seismic-inverted porosity aligned with core data within error bars. The process failure was not the data but the assumption of a single rock type.

Fluid Effects: The Core Is Dry, the Seismic Is Wet

Another common physics mismatch: core plugs are typically cleaned and dried before measurement. Seismic data responds to in-situ fluids. A core measured under dry conditions may show high velocity and density, while the same formation in situ with oil or gas may have lower impedance. Teams often forget to apply fluid substitution (using Gassmann's equations) to core measurements before comparing to seismic.

A practical workflow: always compute the saturated elastic properties from core data using the measured dry frame moduli, the in-situ fluid properties, and the porosity. Only then compare core-derived impedance to seismic impedance. Without this step, the two datasets will systematically disagree, leading to false confidence or unnecessary worry.
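As a hedged sketch of that workflow, Gassmann's equation can be applied to dry-frame moduli before comparing impedances. All numeric inputs below (moduli, density, porosity) are illustrative, not measurements from any real dataset.

```python
import math

# Minimal sketch of Gassmann fluid substitution (dry frame -> saturated),
# followed by a P-impedance estimate. Moduli in GPa, density in g/cc.
# All inputs are illustrative assumptions.

def gassmann_ksat(k_dry, k_min, k_fluid, phi):
    """Saturated bulk modulus from Gassmann's equation."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fluid + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def p_impedance(k_sat, mu, rho):
    """Acoustic impedance rho * Vp; with GPa and g/cc, Vp comes out in km/s."""
    vp = math.sqrt((k_sat + 4.0 / 3.0 * mu) / rho)
    return rho * vp

# Illustrative clean sand: 12 GPa dry frame, quartz mineral (37 GPa),
# brine (2.4 GPa), 25% porosity. Shear modulus is unchanged by the fluid.
k_sat = gassmann_ksat(k_dry=12.0, k_min=37.0, k_fluid=2.4, phi=0.25)
rho_sat = 2.65 * 0.75 + 1.0 * 0.25   # matrix + brine density, g/cc
print(round(k_sat, 2))               # saturated K exceeds the 12 GPa dry frame
print(round(p_impedance(k_sat, 10.0, rho_sat), 2))
```

The point of the sketch is the ordering of steps: substitute fluids first, then compare the saturated impedance to the seismic-derived impedance.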

Processing Artifacts: How Seismic Workflows Create Phantom Stories

Seismic data undergoes extensive processing: deconvolution, stacking, migration, and various filtering steps. Each step introduces assumptions and potential artifacts. Core logs undergo their own processing (depth shifting, environmental corrections, normalization). When these two processing sequences are not aligned, the stories they tell diverge.

The Migration Smear and Positioning Errors

One common artifact is mispositioning of seismic events due to inaccurate velocity models. A core log at a given well location may show a sand at 10,000 feet TVD, but the seismic section may place the corresponding reflector at 10,050 feet due to velocity push-down beneath a shallow gas cloud (slow, gas-charged sediments lengthen travel times, shifting deeper events downward). The team sees an apparent depth mismatch and concludes the correlation is wrong. In reality, the seismic depth conversion is the source of error.

Teams should always compute a synthetic seismogram from the core-derived sonic and density logs (even if approximate) and tie it to the seismic section at the well location. If the synthetic tie shows a systematic depth shift, the issue is processing, not geology. A rule of thumb: if the synthetic tie is off by more than half the dominant period of the seismic wavelet, revisit the velocity model before concluding a fundamental data conflict.

The Null and the Negative: Polarity Conventions

Another subtle but frequent mismatch arises from seismic polarity conventions. A hard kick (increase in impedance) produces a positive amplitude in normal polarity, but many processing workflows use reverse polarity or display with a zero-phase wavelet. Core logs show impedance directly, but teams may forget to check the polarity convention of the seismic volume. A sand-to-shale boundary that should produce a hard kick may appear as a trough instead of a peak, leading to a false correlation.

Process fix: always verify the polarity of the seismic volume by examining the well tie with a known marker (e.g., a regional shale above a clean sand). If the synthetic tie requires flipping polarity to match, document that and ensure all interpreters use the same convention.

Three Integration Approaches: A Comparative Framework

Teams have developed several strategies to reconcile core and seismic data. Each approach has strengths, weaknesses, and appropriate use cases. The choice depends on data quality, rock complexity, and project phase.

| Approach | Method | Strengths | Weaknesses | Best For |
|---|---|---|---|---|
| Direct Calibration (Rock Physics Templates) | Use core-measured elastic properties to build templates that relate porosity, lithology, and fluid to impedance; apply the template to seismic inversion. | Physically grounded; handles complex lithologies if core data is representative. | Requires high-quality core data; templates may be non-unique if scatter is large. | Mature fields with abundant core; simple lithologies. |
| Statistical Inversion (Bayesian or Geostatistical) | Use core data as prior information in a probabilistic inversion that produces multiple realizations of reservoir properties from seismic. | Quantifies uncertainty; can incorporate scale mismatch through variograms. | Computationally intensive; results sensitive to prior assumptions. | Exploration or appraisal with limited core; high-risk decisions. |
| Machine Learning (Neural Networks or Random Forests) | Train a model on core-derived properties and seismic attributes (amplitudes, AVO, spectral decomposition) at well locations; apply to the full volume. | Captures nonlinear relationships; fast to apply after training. | Black-box nature; risks overfitting; requires careful validation and feature selection. | Large 3D surveys with many wells; complex reservoirs with no simple rock physics model. |

In a composite deep carbonate project, a team used direct calibration initially but found the template scatter too high due to variable vuggy porosity. They switched to a statistical inversion that incorporated a variogram model from core data and produced multiple realizations. The resulting porosity map showed a range of possibilities, and the team drilled a successful well targeting the P50 scenario. The process choice directly affected the outcome.

Another team in a tight gas sand used machine learning with 30 wells and 40 seismic attributes. The model predicted high porosity in a zone where core data was sparse. A subsequent well found the prediction accurate within 1 porosity unit. However, the team noted that the model failed in a compartment with different clay mineralogy, highlighting the need for geological constraints in the feature set.

Step-by-Step Workflow for Reconciling Core and Seismic

To systematically address discrepancies, teams should follow a structured workflow that isolates the source of mismatch before jumping to conclusions.

Step 1: Quality Control Both Datasets Independently

Before any comparison, verify that each dataset is internally consistent. Check core depth shifting against wireline logs (gamma ray, density, neutron). Identify core gaps, sample bias (e.g., only competent rock sampled), and measurement artifacts (e.g., stress relaxation). For seismic, review processing reports for migration algorithms, velocity model accuracy, and multiple attenuation. Compute the seismic wavelet (via well tie or statistical extraction) and assess bandwidth. If either dataset has unresolved quality issues, stop and fix them first.

Step 2: Compute a Synthetic Seismogram from Core Data

Using the core-measured sonic and density logs (or derive sonic from core porosity and lithology using empirical transforms), compute a reflection coefficient series. Convolve with the seismic wavelet extracted at the well location. Compare the synthetic to the seismic trace at the well. Identify depth shifts, polarity mismatches, and amplitude scaling issues. Document the tie quality (correlation coefficient is common but not sufficient; also compare event shapes).
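A minimal sketch of this step, assuming a zero-phase Ricker wavelet and a toy blocky impedance log (both illustrative stand-ins for a real extracted wavelet and real logs):

```python
import math

# Sketch of Step 2: reflection coefficients from an impedance series,
# convolved with a zero-phase Ricker wavelet. The 25 Hz / 2 ms sampling
# and the toy log values are illustrative assumptions.

def ricker(freq_hz, dt_s, half_len_s=0.064):
    """Zero-phase Ricker wavelet sampled at dt_s."""
    n = int(half_len_s / dt_s)
    ts = [i * dt_s for i in range(-n, n + 1)]
    return [(1 - 2 * (math.pi * freq_hz * t) ** 2)
            * math.exp(-(math.pi * freq_hz * t) ** 2) for t in ts]

def reflectivity(impedance):
    """Normal-incidence reflection coefficients between adjacent samples."""
    return [(b - a) / (b + a) for a, b in zip(impedance, impedance[1:])]

def convolve(signal, kernel):
    """Full discrete convolution, pure Python."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

# Toy blocky log (shale - sand - shale) sampled in time at 2 ms; 25 Hz wavelet:
imp = [6.0] * 50 + [8.0] * 20 + [6.0] * 50
trace = convolve(reflectivity(imp), ricker(25.0, 0.002))
print(max(trace) > 0 and min(trace) < 0)  # top and base give opposite-signed kicks
```

In practice the wavelet comes from the well tie or statistical extraction, and the comparison is made against the observed trace at the well, but the convolution logic is the same.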

Step 3: Apply Fluid and Pressure Corrections

If the core was measured under dry conditions, use Gassmann fluid substitution to compute saturated elastic properties using the estimated in-situ fluid (oil, gas, or brine) and pressure conditions. If core measurements were taken at surface conditions, apply pressure corrections using empirical relationships or laboratory-derived stress sensitivity. Compare the corrected core impedance to the seismic impedance at the well.

Step 4: Upscale Core Properties to Seismic Scale

Apply a Backus average or a running median filter to the core-derived impedance log to match the seismic resolution (typically 20-40 feet vertical window). Compare the upscaled log to the seismic trace. If they now agree, the original mismatch was purely a scale issue. If they still disagree, proceed to Step 5.
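The running-filter version of this step can be sketched in a few lines. The 0.5-ft sampling and 80-sample window below are illustrative assumptions chosen to approximate a 40-ft seismic-scale average.

```python
import statistics

# Sketch of Step 4: smooth a fine-sampled impedance log toward seismic
# resolution with a centered running median. Sampling and window length
# are illustrative assumptions.

def running_median(log, window_samples):
    half = window_samples // 2
    return [statistics.median(log[max(0, i - half): i + half + 1])
            for i in range(len(log))]

# Fine impedance log at 0.5 ft/sample containing a 2-ft high-impedance bed;
# an 80-sample window approximates a 40-ft seismic-scale average.
fine = [6.0] * 98 + [9.0] * 4 + [6.0] * 98
coarse = running_median(fine, window_samples=80)
print(max(fine), max(coarse))  # 9.0 6.0 -> the thin bed vanishes at seismic scale
```

A Backus average (harmonic averaging of the elastic moduli rather than a median of impedance) is the more physically correct choice when elastic logs are available; the median filter shown here is the quick-look alternative the text allows.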

Step 5: Investigate Rock Physics and Geological Variability

If scale and fluid corrections do not resolve the mismatch, examine rock physics. Plot core porosity vs. core impedance and compare to the seismic-derived impedance at the well. If the core points fall off the trend, consider additional mineralogy (e.g., clay, pyrite, or carbonate cement). Use thin sections or XRD data to identify the cause. If the seismic impedance at the well differs from core-derived impedance, check for processing artifacts like residual multiples or migration swings.

Step 6: Document and Communicate Uncertainty

After identifying the source of mismatch, document the findings in a brief memo. Include the scale, physics, or processing cause, and the estimated uncertainty in each dataset. Communicate to the team that no single dataset is truth—both are models with limitations. Recommend a probabilistic approach (e.g., multiple realizations) for reservoir modeling.

Common Questions and Practical Pitfalls

Teams frequently ask: “How much mismatch is acceptable?” The answer depends on the decision being made. For a drilling target, a mismatch of 5% in porosity may be acceptable if the net pay is thick. For a fluid contact determination, a mismatch of 50 feet in depth may be critical. Define a decision threshold before starting the reconciliation.

Another common question: “Should we always trust core over seismic?” Not necessarily. Core can be biased (preferential sampling of competent rock, missing fractures, or altered by drilling fluids). Seismic, while lower resolution, samples the entire volume. In one composite scenario, a team trusted core data showing high permeability and drilled an injector well. The well underperformed because the core missed a fault zone that the seismic had indicated as a potential barrier. The seismic story was correct about compartmentalization, even though the core story was correct about rock quality.

Finally, teams often ask about the role of AVO (amplitude versus offset) in reconciliation. AVO analysis can help distinguish fluid effects from lithology effects. If the core and seismic disagree on fluid content, AVO attributes (intercept and gradient) can provide independent evidence. However, AVO is also sensitive to processing and should be used as a cross-check, not a standalone answer.

Conclusion: Embrace the Conflict as a Learning Tool

The core log and the seismic section will never tell exactly the same story—they are not designed to. The core provides local truth at high resolution. The seismic provides regional context at lower resolution. The apparent conflict is not a problem to be eliminated but a signal to be interpreted. When teams approach the discrepancy with a structured process—scale assessment, physics translation, and quality control—they uncover deeper insights about the reservoir.

Our process comparison suggests that the most successful teams are those that treat mismatch as diagnostic. A systematic reconciliation workflow reduces costly surprises, builds confidence in both datasets, and ultimately improves drilling success rates. As subsurface work becomes increasingly integrated, the ability to translate between core and seismic languages is not just a technical skill—it is a business advantage.


About the Author

This article was prepared by the editorial team for this publication. We focus on practical explanations and update articles when major practices change.

Last reviewed: May 2026
