Testing Guide

AADC provides comprehensive testing infrastructure to validate function compilation, execution, and automatic differentiation accuracy. This guide covers the testing framework, best practices, and common testing patterns.

Testing Infrastructure Overview

AADC’s testing framework consists of several key components:

  • TestDriver: Main test execution engine with automatic validation
  • Function validation: Comparing AADC results against reference implementations
  • Gradient validation: Verifying automatic differentiation using finite differences

Core Testing Framework

TestDriver Template

The TestDriver template provides automated testing for AADC functions:

#include <aadc/tests/TestDriver.h>
#include <gtest/gtest.h>
#include <cmath>

template<typename mdouble>
class MyFunction : public aadc::test::FunctionBase<mdouble> {
public:
    virtual int dimX() { return 3; }  // Number of inputs
    virtual int dimY() { return 2; }  // Number of outputs
    
    void CodeGenBigFunc(const std::vector<mdouble>& x, std::vector<mdouble>& y) {
        // Your mathematical function implementation
        y[0] = std::exp(x[0] * x[1] + x[2]);
        y[1] = std::sin(x[0]) + std::cos(x[1] * x[2]);
    }
};

TEST(MyTests, BasicFunction) {
    // Template arguments: active (idouble) and passive (double) implementations.
    // Call arguments: test name, number of random tests, checks per test.
    TestDriver<MyFunction<idouble>, MyFunction<double>>("MyFunction", 25, 100);
}

The mdouble template parameter can be instantiated as either idouble (the active type) or double (the passive type).

The test driver automatically:

  1. Generates random input scenarios
  2. Compiles the AADC kernel for the active function
  3. Evaluates both active and passive functions
  4. Compares results and gradients with configurable tolerances

Test Parameters

The TestDriver accepts three parameters:

  • name: Test identifier for output messages
  • num_tests: Number of different random tests (default: 25)
  • num_checks: Number of validation points per test (default: 100)

Each test generates random input combinations and validates both function values and gradients.

Validation Methods

Function Value Validation

The testing framework automatically:

  1. Generates random inputs across specified ranges
  2. Compiles AADC kernels for active type functions
  3. Evaluates reference functions using passive (native) types
  4. Compares results with configurable tolerance

Validation is automatic; no additional test code is needed.

Default tolerance: 1e-3, minimum: 1e-5

Gradient Validation

Automatic differentiation accuracy is verified using high-order finite differences:

  • 6th-order finite differences for accurate gradient approximation
  • Central, left, and right difference schemes for robustness
  • Adaptive step sizes based on input magnitudes
  • Multiple test points to catch edge cases

The framework automatically:

  • Sets unit adjoint seeds for each output
  • Executes reverse pass to compute gradients
  • Compares AADC gradients with finite difference approximations
  • Reports mismatches with detailed error information

The TestDriver tries central, left, and right finite differences and accepts the best match. If none matches within tolerance, the test fails.

Variable Type Testing

The framework tests different variable marking strategies:

Variable Types

  • Input+Diff (Type 0): Variables marked with markAsInput() - vary between executions and support derivatives
  • Diff-only (Type 1): Variables marked with markAsDiff() - constant during execution but support derivatives
  • Input-only (Type 2): Variables marked with markAsInputNoDiff() - vary between executions but no derivatives needed

Automatic Type Assignment

Tests automatically assign random variable types to validate different scenarios:

// Test 0: All variables as input+diff (comprehensive baseline)
// Test 1: All variables as diff-only (parameter sensitivity)
// Tests 2+: Random assignment (30% input+diff, 40% diff-only, 30% input-only)

This ensures robust testing across different use cases and variable configurations.