Deriving AD Endpoints from Each Other

This guide shows how to derive missing autodiff endpoints from ones you have already implemented, using the experimental AD fallback helpers.

Warning

This feature is experimental and available in tesseract_core.runtime.experimental. The API may change in future releases.

Warning

All four helpers materialise the full Jacobian matrix, which can be expensive for high-dimensional inputs or outputs and often defeats the purpose of using JVPs/VJPs. Use with caution.

Overview

Tesseracts expose up to three AD endpoints — jacobian, jacobian_vector_product (JVP), and vector_jacobian_product (VJP). The helpers jvp_from_jacobian, vjp_from_jacobian, jacobian_from_jvp, and jacobian_from_vjp let you derive any one of these endpoints from another you have already implemented, without writing additional gradient code.
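
The identities these helpers rely on can be sketched in plain JAX (illustrative only; this is not the Tesseract runtime API): for a function with Jacobian J, a JVP computes J @ v for a tangent vector v, and a VJP computes w @ J for a cotangent vector w.

```python
import jax
import jax.numpy as jnp


def f(x):
    return jnp.array([x[0] * x[1], jnp.sin(x[0])])


x = jnp.array([1.0, 2.0])
v = jnp.array([0.5, -1.0])  # tangent vector (input space)
w = jnp.array([1.0, 3.0])   # cotangent vector (output space)

J = jax.jacobian(f)(x)               # full 2x2 Jacobian matrix
_, jvp_out = jax.jvp(f, (x,), (v,))  # J @ v, without materialising J
_, vjp_fn = jax.vjp(f, x)
(vjp_out,) = vjp_fn(w)               # w @ J, without materialising J

assert jnp.allclose(jvp_out, J @ v)
assert jnp.allclose(vjp_out, w @ J)
```

The helpers below exploit these identities in both directions: contracting an existing Jacobian with (co)tangent vectors, or probing an existing JVP/VJP with one-hot vectors to reconstruct the Jacobian.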

Deriving JVP and VJP from the Jacobian

If you have a jacobian implementation (for example, from an analytical derivation or a simulation adjoint), you can derive JVP and VJP from it:

from tesseract_core.runtime.experimental import jvp_from_jacobian, vjp_from_jacobian

def jacobian(inputs, jac_inputs, jac_outputs):
    # Your existing Jacobian implementation
    ...

def jacobian_vector_product(inputs, jvp_inputs, jvp_outputs, tangent_vector):
    return jvp_from_jacobian(
        jacobian, inputs, jvp_inputs, jvp_outputs, tangent_vector
    )

def vector_jacobian_product(inputs, vjp_inputs, vjp_outputs, cotangent_vector):
    return vjp_from_jacobian(
        jacobian, inputs, vjp_inputs, vjp_outputs, cotangent_vector
    )
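
Conceptually, deriving a JVP from a Jacobian is a contraction: each Jacobian block is multiplied with the matching tangent entry, and the results are summed over inputs. A minimal sketch using plain NumPy dicts (the runtime's actual data structures may differ):

```python
import numpy as np


def jvp_from_jac_sketch(jac, tangents):
    # jac: {out_name: {in_name: jacobian_block}}, tangents: {in_name: vector}
    return {
        out: sum(
            np.asarray(block) @ np.asarray(tangents[name])
            for name, block in blocks.items()
        )
        for out, blocks in jac.items()
    }


jac = {"result": {"x": np.array([[1.0, 2.0]]), "y": np.array([[3.0, 0.0]])}}
tangents = {"x": np.array([1.0, 0.0]), "y": np.array([0.0, 1.0])}
out = jvp_from_jac_sketch(jac, tangents)
# out["result"] is [[1, 2]] @ [1, 0] + [[3, 0]] @ [0, 1] = [1.0]
```

A VJP works the same way, except the cotangent is contracted against the output axis of each block.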

Deriving the Jacobian from JVP or VJP

If you have a JVP or VJP but no explicit Jacobian, you can materialise the full Jacobian matrix from either:

  • jacobian_from_jvp — sweeps one-hot tangent vectors over each input element. Costs N JVP calls (N = total input elements). Prefer this when outputs are high-dimensional.

  • jacobian_from_vjp — sweeps one-hot cotangent vectors over each output element. Costs M VJP calls (M = total output elements). Prefer this when inputs are high-dimensional.
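
Both strategies amount to probing a black-box linear map with one-hot vectors. A minimal sketch (plain NumPy, not the Tesseract runtime API), assuming a hidden M x N Jacobian behind jvp and vjp closures:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((3, 4))  # hidden M x N Jacobian (M=3, N=4)


def jvp(v):
    # forward-mode primitive: J @ v
    return J @ v


def vjp(w):
    # reverse-mode primitive: w @ J
    return w @ J


# jacobian_from_jvp-style: N JVP calls, one per input element (columns)
J_from_jvp = np.stack([jvp(np.eye(4)[i]) for i in range(4)], axis=1)

# jacobian_from_vjp-style: M VJP calls, one per output element (rows)
J_from_vjp = np.stack([vjp(np.eye(3)[j]) for j in range(3)], axis=0)

assert np.allclose(J_from_jvp, J)
assert np.allclose(J_from_vjp, J)
```

The cost asymmetry follows directly: the JVP sweep scales with the number of input elements, the VJP sweep with the number of output elements.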

From JVP

from tesseract_core.runtime.experimental import jacobian_from_jvp

def jacobian_vector_product(inputs, jvp_inputs, jvp_outputs, tangent_vector):
    # Your existing JVP implementation
    ...

def jacobian(inputs, jac_inputs, jac_outputs):
    return jacobian_from_jvp(
        jacobian_vector_product, inputs, jac_inputs, jac_outputs
    )

From VJP

jacobian_from_vjp needs to know the output shapes before it can probe with cotangent vectors, so it takes an eval_fn argument: either apply or abstract_eval. Prefer abstract_eval, since it determines output shapes without running the full forward computation.

from tesseract_core.runtime.experimental import jacobian_from_vjp

def vector_jacobian_product(inputs, vjp_inputs, vjp_outputs, cotangent_vector):
    # Your existing VJP implementation
    ...

def abstract_eval(inputs):
    # Your existing abstract_eval implementation (preferred)
    ...

def jacobian(inputs, jac_inputs, jac_outputs):
    return jacobian_from_vjp(
        vector_jacobian_product, abstract_eval, inputs, jac_inputs, jac_outputs
    )

Example code

A concrete example is the univariate_adfallbacks Tesseract — a variant of univariate (Rosenbrock function) where the Jacobian is computed via JAX and JVP/VJP are derived automatically using jvp_from_jacobian and vjp_from_jacobian.

tesseract_api.py
# Copyright 2025 Pasteur Labs. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

"""Example Tesseract demonstrating AD endpoint derivation fallbacks.

This example shows how to use jvp_from_jacobian and vjp_from_jacobian to
automatically derive the JVP and VJP endpoints from an existing Jacobian
implementation, without writing additional gradient code.

This is a variant of the univariate example (Rosenbrock function) where
the JVP and VJP endpoints are derived from the Jacobian instead of being
implemented manually.
"""

import jax
from pydantic import BaseModel, Field

from tesseract_core.runtime import Differentiable, Float32, ShapeDType
from tesseract_core.runtime.experimental import jvp_from_jacobian, vjp_from_jacobian


def rosenbrock(x: float, y: float, a: float = 1.0, b: float = 100.0) -> float:
    return (a - x) ** 2 + b * (y - x**2) ** 2


#
# Schemas
#


class InputSchema(BaseModel):
    x: Differentiable[Float32] = Field(description="Scalar value x.", default=0.0)
    y: Differentiable[Float32] = Field(description="Scalar value y.", default=0.0)
    a: Float32 = Field(description="Scalar parameter a.", default=1.0)
    b: Float32 = Field(description="Scalar parameter b.", default=100.0)


class OutputSchema(BaseModel):
    result: Differentiable[Float32] = Field(
        description="Result of Rosenbrock function evaluation."
    )


#
# Required endpoints
#


def apply(inputs: InputSchema) -> OutputSchema:
    """Evaluates the Rosenbrock function given input values and parameters."""
    result = rosenbrock(inputs.x, inputs.y, a=inputs.a, b=inputs.b)
    return OutputSchema(result=result)


#
# Optional endpoints
#


def jacobian(
    inputs: InputSchema,
    jac_inputs: set[str],
    jac_outputs: set[str],
):
    """Compute the requested partial derivatives of apply via JAX reverse-mode AD."""
    rosenbrock_signature = ["x", "y", "a", "b"]

    jac_result = {dy: {} for dy in jac_outputs}
    for dx in jac_inputs:
        grad_func = jax.jacrev(rosenbrock, argnums=rosenbrock_signature.index(dx))
        for dy in jac_outputs:
            jac_result[dy][dx] = grad_func(inputs.x, inputs.y, inputs.a, inputs.b)

    return jac_result


def jacobian_vector_product(
    inputs: InputSchema,
    jvp_inputs: set[str],
    jvp_outputs: set[str],
    tangent_vector,
):
    return jvp_from_jacobian(jacobian, inputs, jvp_inputs, jvp_outputs, tangent_vector)


def vector_jacobian_product(
    inputs: InputSchema,
    vjp_inputs: set[str],
    vjp_outputs: set[str],
    cotangent_vector,
):
    return vjp_from_jacobian(
        jacobian, inputs, vjp_inputs, vjp_outputs, cotangent_vector
    )


def abstract_eval(abstract_inputs):
    """Calculate output shape of apply from the shape of its inputs."""
    return {"result": ShapeDType(shape=(), dtype="Float32")}
tesseract_config.yaml
name: univariate_adfallbacks
version: "0.1.0"
description: |
  Variant of the univariate (Rosenbrock) example demonstrating AD endpoint derivation
  fallbacks. The Jacobian is implemented analytically via JAX; JVP and VJP are derived
  automatically using jvp_from_jacobian and vjp_from_jacobian.

See also