Accurate Greeks for Non-smooth Payoffs

WBS Conference 2025 https://matlogica.com/WBS2025

Dmitri Goloubentsev, dmitri@matlogica.com

The Challenge

Traditional finite differencing for Greeks calculation:

  • High computational cost (bump-and-revalue for each risk factor)
  • Poor accuracy for discontinuous payoffs (digitals, barriers, autocalls)
  • Especially problematic for complex derivatives with multiple barriers
  • Real impact: out-of-the-money options and correlation risk are particularly challenging
  • Finely granular model parametrization → each bump changes the payoff only slightly → more Monte Carlo noise in the estimate (see the sketch below)

Bottom line: Traditional methods struggle with modern structured products
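One way to make the noise point precise (a standard textbook decomposition, not specific to this benchmark): a central-difference Greek built from two independent Monte Carlo runs of N paths with per-path standard deviation σ behaves as

\[\widehat{\Delta} = \frac{\hat{V}(\theta+h) - \hat{V}(\theta-h)}{2h}, \qquad \operatorname{Var}\big[\widehat{\Delta}\big] \approx \frac{\sigma^2}{2Nh^2}, \qquad \text{bias} = O(h^2)\]

so halving the bump h quadruples the Monte Carlo variance unless the path count grows to match.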

Why AAD Fails for Non-smooth Payoffs

Pathwise AAD Limitation: Only works for continuous payoffs

  • Digital payoffs → zero pathwise gradients almost everywhere
  • Example: vanilla European payoff ✓, digitals ✗
// This gives zero gradients with AAD:
Real digitalPayoff = (S > K) ? 1.0 : 0.0;

Need for smoothing to make payoffs differentiable
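A minimal sketch of the failure mode (Real taken as double; function names are hypothetical): the digital is piecewise constant, so the pathwise chain rule returns zero on every path, and the Monte Carlo average of those zeros is a zero delta even though the true delta is not.

using Real = double;

// Hard digital: flat on both sides of the strike, with a jump at S == K.
Real digitalPayoff(Real S, Real K) {
    return (S > K) ? 1.0 : 0.0;
}

// Pathwise delta via the chain rule: the payoff has zero slope away from the
// jump, so the product with the model-supplied dS/dSpot is identically zero.
Real digitalPathwiseDelta(Real dS_dSpot) {
    const Real dPayoff_dS = 0.0;     // holds for every path with S != K
    return dPayoff_dS * dS_dSpot;    // -> the Monte Carlo average is exactly 0
}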

Less Trivial Example: sqrt(|X_t|)

Watch out for less obvious problems:

  • The pathwise MC method can fail at random for sqrt(abs(X_t))
  • Finite differencing gives noisy but reasonable derivatives
  • AAD can fail catastrophically!
// Problematic for AAD when X_t crosses zero:
Real payoff = sqrt(abs(underlying_path[t]));
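The payoff here is continuous, but its derivative is unbounded at the kink:

\[\frac{d}{dx}\sqrt{|x|} = \frac{\operatorname{sign}(x)}{2\sqrt{|x|}} \longrightarrow \pm\infty \quad \text{as } x \to 0\]

so a single path with X_t very close to zero can dominate the pathwise/AAD estimate, while a finite-difference bump of ordinary size averages across the kink and only adds noise.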

Case Study: Phoenix Autocallable Notes

Two-asset autocallable structure:

  • Contingent coupons (2.55% quarterly if worst performer > 70%)
  • Early redemption if both assets > 100% initial
  • Principal protection with barrier at 70%

Multiple discontinuities:

  • Autocall triggers
  • Barrier events
  • Worst-of conditions

Perfect test case for smoothing + AAD approach
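A sketch of one observation date with hard digitals, using only the terms in the bullets above (function and variable names are hypothetical, and contract details beyond those bullets are deliberately omitted); the smoothed counterpart appears two slides later.

#include <algorithm>
using Real = double;

// One observation date of the two-asset Phoenix note, hard-digital version.
// perf1, perf2 are performances relative to initial (1.0 == 100%).
Real observationCashflow(Real perf1, Real perf2, Real notional, bool& autocalled) {
    Real worst  = std::min(perf1, perf2);                    // worst-of condition
    Real coupon = (worst > 0.70) ? 0.0255 * notional : 0.0;  // contingent quarterly coupon
    autocalled  = (perf1 > 1.0) && (perf2 > 1.0);            // both assets above initial
    Real redemption = autocalled ? notional : 0.0;           // early redemption
    return coupon + redemption;                              // every comparison is a Heaviside step
}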

Smoothing Methodology

Mathematical Foundation:

Replace Heaviside functions H(x) with smooth sigmoids:

\[\text{contLess}(a, b, h) = \begin{cases} \mathbf{1}_{a < b} & \text{if } h = 0 \\ \frac{1}{2}\left(\frac{(b-a)/(0.02h)}{\sqrt{1 + ((b-a)/(0.02h))^2}} + 1\right) & \text{if } h > 0 \end{cases}\]

Key Properties:

  • Parameter selection: Balance accuracy vs. differentiability
  • Economic preservation: Area under smoothed curve ≈ original digital
  • Smooth transitions: Enable AAD computation
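A direct C++ transcription of contLess as defined above; contNot is shown as 1 − x, an assumption based on how it is used on the next slide.

#include <cmath>
using Real = double;

// Smoothed indicator of {a < b}: exact Heaviside for h == 0, algebraic sigmoid for h > 0.
Real contLess(Real a, Real b, Real h) {
    if (h == 0.0)
        return (a < b) ? 1.0 : 0.0;
    Real x = (b - a) / (0.02 * h);
    return 0.5 * (x / std::sqrt(1.0 + x * x) + 1.0);
}

// Complement of a smoothed indicator (assumed definition).
Real contNot(Real p) { return 1.0 - p; }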

Smoothing Implementation Details

How smoothing works with the not_autocalled_so_far state variable:

  • Pricing changes from discrete digital events to a "proportion" of the contract being called
  • A single path can enter multiple smoothed call regions
  • At maturity T, the total contract value is the weighted sum of all possible digital events
Real not_autocalled_so_far = 1.0;   // smoothed "survival" weight: proportion of the contract still alive
// ... for each observation date:
Real autocalled_on_this_coupon = contLess(barrier, price, h);   // ~1 when price is above the barrier
total_amount += not_autocalled_so_far * (
    autocall_amount + coupon_amount     // cashflows paid only on the still-alive proportion
);
not_autocalled_so_far *= contNot(autocalled_on_this_coupon);    // reduce the surviving proportion

Implementation Approach

Three calculation methods compared:

  1. Base: No smoothing, finite differencing (bump-and-revalue)
  2. Smoothed: Smoothing + finite differencing
  3. AADC: Smoothing + Automatic Adjoint Differentiation

Technical details:

  • Central differencing (not forward!)
  • Optimized bump sizes (1-5%)
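For reference, a minimal central-difference bump-and-revalue sketch (the pricer callable is a hypothetical stand-in for a full Monte Carlo revaluation):

#include <functional>
using Real = double;

// Central difference: O(bump^2) bias, versus O(bump) for forward differencing.
Real centralDiffGreek(const std::function<Real(Real)>& price, Real theta, Real relBump) {
    Real bump = theta * relBump;     // e.g. relBump in the 0.01-0.05 range quoted above
    return (price(theta + bump) - price(theta - bump)) / (2.0 * bump);
}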

Source code available: github.com/matlogica/QuantBench

Benchmark Results: Key Findings

Performance improvements:

  • Smoothed payoffs converge faster than non-smoothed
  • AADC with smoothing: orders of magnitude improvement
  • Correlation Greeks: Challenging even at 10M paths without smoothing

CPU Time vs. Path Count:

  • Base case (non-smoothed): 10M paths for reasonable stability
  • Smoothed AAD: comparable accuracy with <1M paths
  • Orders of magnitude improvement in computational efficiency

Live demo: Interactive results viewer

Practical Considerations

Smoothing parameter selection:

  • Trade-off between stability and accuracy
  • Similar concept to finite difference bump size
  • Applied at the return level → comparable magnitude across all assets (see the sketch below)
  • Assets with different units or volatilities would otherwise need individual scaling
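One way to read the last two bullets, as a sketch rather than the benchmark's actual code: evaluate the smoothed barrier test on normalized returns, so a single smoothing width h has a comparable effect across assets; on raw prices, assets in different units or with very different volatilities would each need their own h.

using Real = double;
Real contLess(Real a, Real b, Real h);   // as defined earlier

// Barrier test on the return scale: S/S0 - 1 and the barrier return are of
// similar magnitude for every asset, so one h serves the whole basket.
Real smoothedAboveBarrier(Real S, Real S0, Real barrierPct, Real h) {
    Real assetReturn   = S / S0 - 1.0;          // e.g. -0.12 for a 12% drop
    Real barrierReturn = barrierPct - 1.0;      // e.g. 0.70 - 1.0 = -0.30
    return contLess(barrierReturn, assetReturn, h);   // ~1 when the asset is above the barrier
}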

Advanced Features & Extensions

Based on Peter Jaeckel’s feedback - Future work on this benchmark:

  • Quasi-Monte Carlo: Sobol sequences + Brownian Bridge
  • Low-discrepancy numbers: Not “quasi-random” - they’re deterministic!
  • Open source - contributions welcome

Generalization with PayOff Language:

AADC-xVA-prototype

  • Compositional approach to financial contracts
  • Longstaff-Schwartz regression for callable products
  • XVA calculations via regression and AAD
  • Framework ready for AAD integration

Q&A

Resources:

  • Slides: matlogica.com/WBS2025
  • Benchmark code: github.com/matlogica/QuantBench

Thank You

Contact:

Dmitri Goloubentsev
Head of Automatic Adjoint Differentiation
Matlogica LTD

  • Email: dmitri@matlogica.com
  • Web: matlogica.com
  • Phone: +447378414528