Inverse Problem: Finding Unknown Coefficients of a PDE


In this tutorial, you will use Modulus to solve an inverse problem by assimilating data from observations. The flow field computed by OpenFOAM is given as an input to the PINNs, whose job is to predict the parameters characterizing the flow, e.g., viscosity (\(\nu\)) and thermal diffusivity (\(\alpha\)). In this tutorial you will learn:

  1. How to assimilate analytical/experimental/simulation data in Modulus.

  2. How to use the PointwiseConstraint class in Modulus to create constraints from data that can be loaded in the form of .csv files/numpy arrays.

  3. How to use the assimilated data to make predictions of unknown quantities.


This tutorial assumes that you have completed tutorial Lid Driven Cavity Background and have familiarized yourself with the basics of the Modulus APIs.

Problem Description

In this tutorial, you will predict the fluid’s viscosity and thermal diffusivity by providing the flow field information obtained from OpenFOAM simulations as an input to the PINNs. You will use the same 2D slice from a 3-fin flow field that was used as validation data in tutorial Scalar Transport: 2D Advection Diffusion.

To summarize, the training data for this problem is (\(u_i\), \(v_i\), \(p_i\), \(T_i\)) from OpenFOAM simulation and the model is trained to predict (\(\nu\), \(\alpha\)) with the constraints of satisfying the governing equations of continuity, Navier-Stokes and advection-diffusion.

The \(\nu\) and \(\alpha\) used for the OpenFOAM simulation are \(0.01\) and \(0.002\) respectively.

\(T\) is scaled to define a new transport variable \(c\) for the advection-diffusion equation as shown in equation (161).

\[\begin{split} c &= \frac{T_{actual}}{T_{base}} - 1.0 \\ T_{base} &= 293.498\,K \end{split} \tag{161}\]
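The scaling in equation (161) can be checked with a few standalone lines (an illustrative sketch, not part of the tutorial script):

```python
# Illustrative check of equation (161): the temperature is normalized by the
# base temperature and shifted so that c = 0 exactly at T_base.
T_base = 293.498  # K


def scale_temperature(T_actual):
    """Map an absolute temperature (K) to the transport variable c."""
    return T_actual / T_base - 1.0


print(scale_temperature(293.498))  # 0.0 at the base temperature
print(scale_temperature(350.0))    # positive for temperatures above T_base
```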

Because most of the temperature diffusion occurs in the wake of the heat sink (Fig. 58), you will sample points only in the wake for training the PINNs. You will also discard points close to the boundary, since the network is trained to minimize the loss from the conservation laws alone.


Fig. 58 Batch of training points sampled from OpenFOAM data

Case Setup

In this tutorial you will use the PointwiseConstraint class to create training data from a .csv file. You will make three networks. The first network will memorize the flow field by developing a mapping between (\(x\), \(y\)) and (\(u\), \(v\), \(p\)). The second network will memorize the temperature field by developing a mapping between (\(x\), \(y\)) and (\(c\)). The third network will be trained to invert out the desired quantities, namely (\(\nu\), \(\alpha\)). For this problem, you will be using the NavierStokes and AdvectionDiffusion equations from the PDES module.


The Python script for this problem can be found at examples/three_fin_2d/

Importing the required packages

The list of packages/modules to be imported are shown below.

import torch
import numpy as np
from sympy import Symbol, Eq

import modulus
from modulus.hydra import to_absolute_path, to_yaml, instantiate_arch
from modulus.hydra.config import ModulusConfig
from modulus.continuous.solvers.solver import Solver
from modulus.continuous.domain.domain import Domain
from modulus.geometry.csg.csg_2d import Rectangle, Line, Channel2D
from modulus.sympy_utils.functions import parabola
from modulus.csv_utils.csv_rw import csv_to_dict
from modulus.PDES.navier_stokes import NavierStokes, GradNormal
from modulus.PDES.basic import NormalDotVec
from modulus.PDES.turbulence_zero_eq import ZeroEquation
from modulus.PDES.advection_diffusion import AdvectionDiffusion
from modulus.continuous.constraints.constraint import (
    PointwiseConstraint,
)
from modulus.continuous.monitor.monitor import PointwiseMonitor
from modulus.continuous.validator.validator import PointwiseValidator
from modulus.key import Key
from modulus.node import Node

Defining the Equations, Networks and Nodes for an Inverse Problem

The process of creating a neural network for an inverse problem is similar to most of the problems you have seen in previous tutorials. However, the information for the flow variables, and in turn their gradients, is already available (from the OpenFOAM data) for the network to memorize. Hence, this example detaches these variables in the computation graph of their respective equations. This means that only the networks predicting 'nu' and 'D' are optimized to minimize the equation residuals, while the velocity, pressure and their gradients are treated as ground truth data.
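The effect of detaching can be illustrated with a small PyTorch sketch (a toy residual, not the actual Modulus equations): after `detach()`, the flow variable behaves as fixed data, so the loss gradient flows only into the unknown coefficient.

```python
import torch

# Toy residual of the form r = u*u - nu*u, where u stands in for a flow
# quantity that is already known from data, and nu is the unknown coefficient.
u = torch.tensor(2.0, requires_grad=True)
nu = torch.tensor(0.5, requires_grad=True)

u_detached = u.detach()          # treat u as ground-truth data
residual = u_detached * u_detached - nu * u_detached
loss = residual ** 2
loss.backward()

print(u.grad)   # None: no gradient reaches the detached variable
print(nu.grad)  # nonzero: only nu is optimized to reduce the residual
```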

Also, note that the viscosity and diffusivity are passed in as symbolic variables ('nu' and 'D' respectively) to the equations, as they are unknowns in this problem.

Similar to the tutorial Scalar Transport: 2D Advection Diffusion, you will train two separate networks to memorize the flow variables (\(u\), \(v\), \(p\)) and the scalar transport variable (\(c\)). Also, because 'nu' and 'D' are custom variables defined for this problem, you will have separate networks invert_net_nu and invert_net_D that produce \(\nu\) and \(\alpha\) at their output nodes.

The code to generate the nodes for the problem is shown here:

@modulus.main(config_path="conf_inverse", config_name="config")
def run(cfg: ModulusConfig) -> None:

    nu, D = Symbol("nu"), Symbol("D")

    # make list of nodes to unroll graph on
    ns = NavierStokes(nu=nu, rho=1.0, dim=2, time=False)
    ade = AdvectionDiffusion(T="c", rho=1.0, D=D, dim=2, time=False)
    flow_net = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("u"), Key("v"), Key("p")],
        cfg=cfg.arch.fully_connected,
    )
    heat_net = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("c")],
        cfg=cfg.arch.fully_connected,
    )
    invert_net_nu = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("nu")],
        cfg=cfg.arch.fully_connected,
    )
    invert_net_D = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("D")],
        cfg=cfg.arch.fully_connected,
    )

    nodes = (
        ns.make_nodes(
            # detach the flow variables and their derivatives appearing in the
            # Navier-Stokes residuals so only 'nu' is optimized
            detach_names=[
                "u", "u__x", "u__y", "u__x__x", "u__y__y",
                "v", "v__x", "v__y", "v__x__x", "v__y__y",
                "p", "p__x", "p__y",
            ]
        )
        + ade.make_nodes(
            detach_names=["u", "v", "c", "c__x", "c__y", "c__x__x", "c__y__y"]
        )
        + [flow_net.make_node(name="flow_network", jit=cfg.jit)]
        + [heat_net.make_node(name="heat_network", jit=cfg.jit)]
        + [invert_net_nu.make_node(name="invert_nu_network", jit=cfg.jit)]
        + [invert_net_D.make_node(name="invert_D_network", jit=cfg.jit)]
    )

Assimilating data from CSV files/point clouds to create Training data

PointwiseConstraint: This tutorial does not solve the problem over the full domain, so it does not use the geometry module to create a geometry for sampling points. Instead, it uses point cloud data in the form of a .csv file. The PointwiseConstraint class handles such input data. This class takes in separate dictionaries for the input and output variables; each dictionary has a key for every variable and a numpy array of values associated with that key. A batch size for sampling this .csv point cloud is specified through the batch_size argument.

Since part of the problem involves memorizing the given flow field, you will have ['x', 'y'] as input keys and ['u', 'v', 'p', 'c', 'continuity', 'momentum_x', 'momentum_y', 'advection_diffusion'] as the output keys. Setting ['u', 'v', 'p', 'c'] to the values from the OpenFOAM data, you make the network assimilate the OpenFOAM distribution of these variables in the selected domain. Setting ['continuity', 'momentum_x', 'momentum_y', 'advection_diffusion'] equal to \(0\), you also instruct the network to satisfy the PDE residuals at those sampled points. Apart from \(\nu\) and \(\alpha\), all the variables in these PDEs are now known, so the network can use this information to invert out the unknowns.
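The idea of inverting a coefficient by driving a PDE residual to zero can be demonstrated in miniature (synthetic data and a plain gradient-descent loop, not the Modulus training pipeline): observations satisfy a relation with a known true coefficient, and minimizing the squared residual recovers it.

```python
import torch

# Synthetic "observed" derivatives satisfying grad_u = nu_true * lap_u.
torch.manual_seed(0)
nu_true = 0.01
lap_u = torch.randn(100)
grad_u = nu_true * lap_u

# Unknown coefficient, recovered by minimizing the mean squared residual.
nu = torch.tensor(0.0, requires_grad=True)
opt = torch.optim.Adam([nu], lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    residual = grad_u - nu * lap_u
    loss = (residual ** 2).mean()
    loss.backward()
    opt.step()

print(nu.item())  # converges close to nu_true = 0.01
```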

The code to generate this data constraint is shown here:

    base_temp = 293.498

    # OpenFOAM data
    mapping = {
        "Points:0": "x",
        "Points:1": "y",
        "U:0": "u",
        "U:1": "v",
        "p": "p",
        "T": "c",
    }
    openfoam_var = csv_to_dict(
        to_absolute_path("openfoam/heat_sink_Pr5_clipped2.csv"), mapping
    )
    openfoam_var["c"] = openfoam_var["c"] / base_temp - 1.0
    openfoam_invar_numpy = {
        key: value for key, value in openfoam_var.items() if key in ["x", "y"]
    }
    openfoam_outvar_numpy = {
        key: value for key, value in openfoam_var.items() if key in ["u", "v", "p", "c"]
    }
    openfoam_outvar_numpy["continuity"] = np.zeros_like(openfoam_outvar_numpy["u"])
    openfoam_outvar_numpy["momentum_x"] = np.zeros_like(openfoam_outvar_numpy["u"])
    openfoam_outvar_numpy["momentum_y"] = np.zeros_like(openfoam_outvar_numpy["u"])
    openfoam_outvar_numpy["advection_diffusion"] = np.zeros_like(
        openfoam_outvar_numpy["u"]
    )

    # make domain
    domain = Domain()

    # interior
    data = PointwiseConstraint.from_numpy(
        nodes=nodes,
        invar=openfoam_invar_numpy,
        outvar=openfoam_outvar_numpy,
        batch_size=cfg.batch_size.data,
    )
    domain.add_constraint(data, "interior_data")

Adding Monitors

In this tutorial, you will create monitors for the convergence of the average 'nu' and 'D' inside the domain as the solution progresses. Once the average value of these quantities reaches a steady value, you can end the simulation. The code to generate the PointwiseMonitor is shown here:

    # add monitors
    monitor = PointwiseMonitor(
        openfoam_invar_numpy,
        output_names=["nu"],
        metrics={"mean_nu": lambda var: torch.mean(var["nu"])},
        nodes=nodes,
    )
    domain.add_monitor(monitor)

    monitor = PointwiseMonitor(
        openfoam_invar_numpy,
        output_names=["D"],
        metrics={"mean_D": lambda var: torch.mean(var["D"])},
        nodes=nodes,
    )
    domain.add_monitor(monitor)

    # make solver
    slv = Solver(cfg, domain)

    # start solver
    slv.solve()


if __name__ == "__main__":
    run()

Training the model

Once the Python file is set up, training can be started simply by executing the script.


Results and Post-processing

You can monitor the Tensorboard plots to see the convergence of the simulation. The Tensorboard graphs should look similar to the ones shown in Fig. 59.

Table 3 Comparison of the inverted coefficients with the actual values

| Coefficient | OpenFOAM (True) | Modulus (Predicted) |
| --- | --- | --- |
| Kinematic Viscosity \((m^2/s)\) | \(1.00 \times 10^{-2}\) | \(9.87 \times 10^{-3}\) |
| Thermal Diffusivity \((m^2/s)\) | \(2.00 \times 10^{-3}\) | \(2.53 \times 10^{-3}\) |


Fig. 59 Tensorboard plots for \(\alpha\) and \(\nu\)