NVIDIA Modulus Sym v1.1.0

Modulus Sym Constraints

Continuous type constraints

class modulus.sym.domain.constraint.continuous.DeepONetConstraint(nodes: List[Node], dataset: Union[Dataset, IterableDataset], loss: Loss, batch_size: int, shuffle: bool, drop_last: bool, num_workers: int)[source]

Bases: <a href="#modulus.sym.domain.constraint.continuous.PointwiseConstraint">PointwiseConstraint</a>

Base DeepONet Constraint class for all DeepONets

classmethod from_numpy(nodes: List[Node], invar: Dict[str, ndarray], outvar: Dict[str, ndarray], batch_size: int, lambda_weighting: Optional[Dict[str, ndarray]] = None, loss: Loss = PointwiseLossNorm(), shuffle: bool = True, drop_last: bool = True, num_workers: int = 0)[source]

Create custom DeepONet constraint from numpy arrays.

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • invar (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays as input.

  • outvar (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays to enforce constraint on.

  • batch_size (int) – Batch size used in training.

  • lambda_weighting (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays to pointwise weight losses. Default is ones.

  • loss (Loss) – Modulus Loss module that defines the loss type, (e.g. L2, L1, …).

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True

  • drop_last (bool, optional) – Drop last mini-batch if dataset is not fully divisible by batch_size, by default True

  • num_workers (int) – Number of workers used in fetching data.
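The parameter list above can be made concrete with plain numpy (no Modulus required): from_numpy expects dictionaries of arrays, each shaped (N, 1). The variable names "a", "x", and "u" below are illustrative, not part of the API.

```python
import numpy as np

# Hypothetical data: N samples of a branch input "a", a trunk coordinate "x",
# and a target field "u". Each array must be shaped (N, 1).
N = 128
rng = np.random.default_rng(0)

invar = {
    "a": rng.standard_normal((N, 1)),
    "x": rng.uniform(0.0, 1.0, size=(N, 1)),
}
outvar = {"u": np.sin(np.pi * invar["x"]) * invar["a"]}

# Optional pointwise weighting, same (N, 1) shape; defaults to ones if omitted.
lambda_weighting = {"u": np.ones((N, 1))}

# These dictionaries would then be passed to, e.g.,
# DeepONetConstraint.from_numpy(nodes=nodes, invar=invar, outvar=outvar,
#                               batch_size=32, lambda_weighting=lambda_weighting)
```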

class modulus.sym.domain.constraint.continuous.IntegralBoundaryConstraint(nodes: List[Node], geometry: Geometry, outvar: Dict[str, Union[int, float, Basic]], batch_size: int, integral_batch_size: int, criteria: Optional[Union[Basic, Callable]] = None, lambda_weighting: Optional[Dict[str, Union[int, float, Basic]]] = None, parameterization: Optional[Parameterization] = None, fixed_dataset: bool = True, batch_per_epoch: int = 100, quasirandom: bool = False, num_workers: int = 0, loss: Loss = IntegralLossNorm(), shuffle: bool = True)[source]

Bases: <a href="#modulus.sym.domain.constraint.continuous.IntegralConstraint">IntegralConstraint</a>

Integral Constraint applied to boundary/perimeter/surface of geometry. For example, in 3D this will create a constraint on the surface of the given geometry.

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • geometry (Geometry) – Modulus Geometry to apply the constraint with.

  • outvar (Dict[str, Union[int, float, sp.Basic]]) – A dictionary of SymPy Symbols/Expr, floats or ints used to describe the constraint. For example, outvar={'u': 0} would specify that the integral of 'u' be zero.

  • batch_size (int) – Number of integrals to apply.

  • integral_batch_size (int) – Batch size used in the Monte Carlo integration to compute the integral.

  • criteria (Union[sp.Basic, Callable, None]) – SymPy criteria function specifying that only areas satisfying this criteria are integrated. For example, if criteria=sympy.Symbol('x')>0 then only areas with positive 'x' values will be integrated.

  • lambda_weighting (Dict[str, Union[int, float, sp.Basic]] = None) – The weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0} would weight the integral constraint by 2.0.

  • parameterization (Union[Parameterization, None]) – This allows adding parameterization or additional inputs.

  • fixed_dataset (bool = True) – If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

  • batch_per_epoch (int = 100) – If fixed_dataset=True then the total number of integrals generated to apply the constraint on is total_nr_integrals=batch_per_epoch*batch_size.

  • quasirandom (bool = False) – If True then sample the points using the Halton sequence.

  • num_workers (int) – Number of workers used in fetching data.

  • loss (Loss) – Modulus Loss module that defines the loss type, (e.g. L2, L1, …).

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True
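The role of integral_batch_size can be sketched without Modulus: each integral is approximated by Monte Carlo over that many sampled points. The example below (plain numpy, purely illustrative) estimates the integral of u(x) = x² over [0, 1], whose exact value is 1/3.

```python
import numpy as np

# Monte Carlo estimate of integral_0^1 x^2 dx using `integral_batch_size` samples,
# mimicking how an integral constraint evaluates its target integral.
rng = np.random.default_rng(42)
integral_batch_size = 100_000

x = rng.uniform(0.0, 1.0, size=integral_batch_size)  # sampled points in the domain
domain_measure = 1.0                                  # length of [0, 1]
estimate = domain_measure * np.mean(x**2)             # MC estimate, ~1/3
```

A larger integral_batch_size reduces the Monte Carlo variance of each integral estimate at the cost of more evaluations per step.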

class modulus.sym.domain.constraint.continuous.IntegralConstraint(nodes: List[Node], dataset: Union[Dataset, IterableDataset], loss: Loss, batch_size: int, shuffle: bool, drop_last: bool, num_workers: int)[source]

Bases: <a href="#modulus.sym.domain.constraint.constraint.Constraint">Constraint</a>

Base class for all Integral Constraints

class modulus.sym.domain.constraint.continuous.PointwiseBoundaryConstraint(nodes: List[Node], geometry: Geometry, outvar: Dict[str, Union[int, float, Basic]], batch_size: int, criteria: Optional[Union[Basic, Callable]] = None, lambda_weighting: Optional[Dict[str, Union[int, float, Basic]]] = None, parameterization: Optional[Parameterization] = None, fixed_dataset: bool = True, importance_measure: Optional[Callable] = None, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: Loss = PointwiseLossNorm(), shuffle: bool = True)[source]

Bases: <a href="#modulus.sym.domain.constraint.continuous.PointwiseConstraint">PointwiseConstraint</a>

Pointwise Constraint applied to boundary/perimeter/surface of geometry. For example, in 3D this will create a constraint on the surface of the given geometry.

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • geometry (Geometry) – Modulus Geometry to apply the constraint with.

  • outvar (Dict[str, Union[int, float, sp.Basic]]) – A dictionary of SymPy Symbols/Expr, floats or ints used to describe the constraint. For example, outvar={'u': 0} would specify 'u' to be zero everywhere on the constraint.

  • batch_size (int) – Batch size used in training.

  • criteria (Union[sp.Basic, Callable, None]) – SymPy criteria function specifying that the constraint is applied only to areas satisfying this criteria. For example, if criteria=sympy.Symbol('x')>0 then the constraint is applied only to areas with positive 'x' values.

  • lambda_weighting (Dict[str, Union[int, float, sp.Basic]] = None) – The spatial pointwise weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0*sympy.Symbol('x')} would apply a pointwise weighting to the loss of 2.0 * x.

  • parameterization (Union[Parameterization, None], optional) – This allows adding parameterization or additional inputs.

  • fixed_dataset (bool = True) – If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

  • compute_sdf_derivatives (bool, optional) – Compute SDF derivatives when sampling geometry.

  • importance_measure (Union[Callable, None] = None) – A callable function that computes a scalar importance measure. This importance measure is then used in the constraint when sampling points. Areas with higher importance are sampled more frequently according to Monte Carlo importance sampling, https://en.wikipedia.org/wiki/Monte_Carlo_integration.

  • batch_per_epoch (int = 1000) – If fixed_dataset=True then the total number of points generated to apply constraint on is total_nr_points=batch_per_epoch*batch_size.

  • quasirandom (bool = False) – If True then sample the points using the Halton sequence.

  • num_workers (int) – Number of workers used in fetching data.

  • loss (Loss) – Modulus Loss module that defines the loss type, (e.g. L2, L1, …).

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True
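The effect of a spatial lambda_weighting such as 2.0*sympy.Symbol('x') can be illustrated with plain numpy (not Modulus code; all arrays here are made up): each point's squared residual is scaled by the weight evaluated at that point before the loss is reduced.

```python
import numpy as np

# Five illustrative boundary points and predictions against a zero target.
x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
u_pred = np.array([[0.1], [0.0], [0.2], [0.0], [0.1]])
u_target = np.zeros_like(u_pred)

# lambda_weighting={'lambda_u': 2.0*sympy.Symbol('x')} corresponds to weights 2*x,
# so residuals at larger x contribute more to the loss.
weights = 2.0 * x
loss = np.sum(weights * (u_pred - u_target) ** 2)
```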

class modulus.sym.domain.constraint.continuous.PointwiseConstraint(nodes: List[Node], dataset: Union[Dataset, IterableDataset], loss: Loss, batch_size: int, shuffle: bool, drop_last: bool, num_workers: int)[source]

Bases: <a href="#modulus.sym.domain.constraint.constraint.Constraint">Constraint</a>

Base class for all Pointwise Constraints

classmethod from_numpy(nodes: List[Node], invar: Dict[str, ndarray], outvar: Dict[str, ndarray], batch_size: int, lambda_weighting: Optional[Dict[str, ndarray]] = None, loss: Loss = PointwiseLossNorm(), shuffle: bool = True, drop_last: bool = True, num_workers: int = 0)[source]

Create custom pointwise constraint from numpy arrays.

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • invar (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays as input.

  • outvar (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays to enforce constraint on.

  • batch_size (int) – Batch size used in training.

  • lambda_weighting (Dict[str, np.ndarray (N, 1)]) – Dictionary of numpy arrays to pointwise weight losses. Default is ones.

  • loss (Loss) – Modulus Loss module that defines the loss type, (e.g. L2, L1, …).

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True

  • drop_last (bool, optional) – Drop last mini-batch if dataset is not fully divisible by batch_size, by default True

  • num_workers (int) – Number of workers used in fetching data.
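A quick arithmetic sketch of the batch_size / drop_last interaction described above (pure Python, independent of Modulus): with drop_last the trailing partial batch is discarded, otherwise it is kept.

```python
# 10 examples split into batches of 4.
N, batch_size = 10, 4

n_batches_drop = N // batch_size        # drop_last=True: only full batches
n_batches_keep = -(-N // batch_size)    # drop_last=False: ceil division keeps the partial batch
```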

class modulus.sym.domain.constraint.continuous.PointwiseInteriorConstraint(nodes: List[Node], geometry: Geometry, outvar: Dict[str, Union[int, float, Basic]], batch_size: int, bounds: Optional[Dict[Basic, Tuple[float, float]]] = None, criteria: Optional[Union[Basic, Callable]] = None, lambda_weighting: Optional[Dict[str, Union[int, float, Basic]]] = None, parameterization: Optional[Parameterization] = None, fixed_dataset: bool = True, compute_sdf_derivatives: bool = False, importance_measure: Optional[Callable] = None, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: Loss = PointwiseLossNorm(), shuffle: bool = True)[source]

Bases: <a href="#modulus.sym.domain.constraint.continuous.PointwiseConstraint">PointwiseConstraint</a>

Pointwise Constraint applied to interior of geometry. For example, in 3D this will create a constraint on the interior volume of the given geometry.

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • geometry (Geometry) – Modulus Geometry to apply the constraint with.

  • outvar (Dict[str, Union[int, float, sp.Basic]]) – A dictionary of SymPy Symbols/Expr, floats or ints used to describe the constraint. For example, outvar={'u': 0} would specify 'u' to be zero everywhere in the constraint.

  • batch_size (int) – Batch size used in training.

  • bounds (Dict[sp.Basic, Tuple[float, float]] = None) – Bounds of the given geometry (e.g. bounds={sympy.Symbol('x'): (0, 1), sympy.Symbol('y'): (0, 1)}).

  • criteria (Union[sp.Basic, Callable, None]) – SymPy criteria function specifying that the constraint is applied only to areas satisfying this criteria. For example, if criteria=sympy.Symbol('x')>0 then the constraint is applied only to areas with positive 'x' values.

  • lambda_weighting (Dict[str, Union[int, float, sp.Basic]] = None) – The spatial pointwise weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0*sympy.Symbol('x')} would apply a pointwise weighting to the loss of 2.0 * x.

  • parameterization (Union[Parameterization, None] = None) – This allows adding parameterization or additional inputs.

  • fixed_dataset (bool = True) – If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

  • importance_measure (Union[Callable, None] = None) – A callable function that computes a scalar importance measure. This importance measure is then used in the constraint when sampling points. Areas with higher importance are sampled more frequently according to Monte Carlo importance sampling, https://en.wikipedia.org/wiki/Monte_Carlo_integration.

  • batch_per_epoch (int = 1000) – If fixed_dataset=True then the total number of points generated to apply constraint on is total_nr_points=batch_per_epoch*batch_size.

  • quasirandom (bool = False) – If True then sample the points using the Halton sequence.

  • num_workers (int) – Number of workers used in fetching data.

  • loss (Loss) – Modulus Loss module that defines the loss type, (e.g. L2, L1, …).

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True
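The Halton sequence used when quasirandom=True can be sketched in a few lines of pure Python (Modulus handles this internally; this is only to show why such points cover the domain more evenly than pseudorandom draws):

```python
def halton(index: int, base: int) -> float:
    """Return element `index` (1-based) of the Halton sequence for `base`."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# The base-2 Halton sequence repeatedly bisects [0, 1), filling gaps evenly:
points = [halton(i, 2) for i in range(1, 8)]
# points begins 0.5, 0.25, 0.75, 0.125, 0.625, 0.375, 0.875
```

Multi-dimensional sampling typically pairs one base per coordinate (e.g. bases 2 and 3 for x and y), which is the usual low-discrepancy construction.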

class modulus.sym.domain.constraint.continuous.VariationalConstraint(nodes: List[Node], datasets: Dict[str, DictVariationalDataset], batch_sizes: Dict[str, int], loss: Loss = PointwiseLossNorm(), shuffle: bool = True, drop_last: bool = True, num_workers: int = 0)[source]

Bases: <a href="#modulus.sym.domain.constraint.constraint.Constraint">Constraint</a>

Base class for all Variational Constraints.

B(u, v, g, dom) = ∫_{dom} (F(u, v) - g·v) dx = 0, where F is an operator, g is a given function/data, and v is the test function. The total variational loss is B1(u1, v1, g1, dom1) + B2(u2, v2, g2, dom2) + …
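As a numeric sanity check of this weak form, take the 1D Poisson problem -u'' = g with the illustrative choice F(u, v) = u'·v' (integration by parts with vanishing boundary terms). For u = sin(πx), g = π²·sin(πx), and test function v = sin(πx) on [0, 1], B(u, v, g, dom) should vanish. This is plain numpy, not Modulus code:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 2001)
u_x = np.pi * np.cos(np.pi * x)      # u' for u = sin(pi x)
v = np.sin(np.pi * x)                # test function
v_x = np.pi * np.cos(np.pi * x)      # v'
g = np.pi**2 * np.sin(np.pi * x)     # forcing, g = -u''

# Trapezoidal approximation of B = int_0^1 (u'·v' - g·v) dx, expected ~ 0.
integrand = u_x * v_x - g * v
B = np.sum((integrand[:-1] + integrand[1:]) / 2 * np.diff(x))
```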

class modulus.sym.domain.constraint.continuous.VariationalDomainConstraint(nodes: List[Node], geometry: Geometry, outvar_names: List[str], boundary_batch_size: int, interior_batch_size: int, interior_bounds: Optional[Dict[Basic, Tuple[float, float]]] = None, boundary_criteria: Optional[Union[Basic, Callable]] = None, interior_criteria: Optional[Union[Basic, Callable]] = None, parameterization: Optional[Parameterization] = None, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: Loss = PointwiseLossNorm(), shuffle: bool = True)[source]

Bases: <a href="#modulus.sym.domain.constraint.continuous.VariationalConstraint">VariationalConstraint</a>

Simple Variational Domain Constraint with a single geometry that represents the domain.

TODO add comprehensive doc string after refactor

Discrete type constraints

class modulus.sym.domain.constraint.discrete.DeepONetConstraint_Data(nodes: List[Node], invar_branch: Dict[str, array], invar_trunk: Dict[str, array], outvar: Dict[str, array], batch_size: int, lambda_weighting: Optional[Dict[str, array]] = None, loss: Loss = PointwiseLossNorm(), shuffle: bool = True, drop_last: bool = True, num_workers: int = 0)[source]

Bases: _DeepONetConstraint

class modulus.sym.domain.constraint.discrete.DeepONetConstraint_Physics(nodes: List[Node], invar_branch: Dict[str, array], invar_trunk: Dict[str, array], outvar: Dict[str, array], batch_size: int, lambda_weighting: Optional[Dict[str, array]] = None, loss: Loss = PointwiseLossNorm(), shuffle: bool = True, drop_last: bool = True, num_workers: int = 0, tile_trunk_input: bool = True)[source]

Bases: _DeepONetConstraint

class modulus.sym.domain.constraint.discrete.SupervisedGridConstraint(nodes: List[Node], dataset: Union[Dataset, IterableDataset], loss: Loss = PointwiseLossNorm(), batch_size: Optional[int] = None, shuffle: bool = True, drop_last: bool = True, num_workers: int = 0)[source]

Bases: <a href="#modulus.sym.domain.constraint.constraint.Constraint">Constraint</a>

Data-driven grid field constraint

Parameters
  • nodes (List[Node]) – List of Modulus Nodes to unroll graph with.

  • dataset (Union[Dataset, IterableDataset]) – Dataset which supplies invar and outvar examples. Must be a subclass of Dataset or IterableDataset.

  • loss (Loss, optional) – Modulus Loss function, by default PointwiseLossNorm()

  • batch_size (int, optional) – Batch size used when running the constraint. Must be specified if a Dataset is used; not used with an IterableDataset.

  • shuffle (bool, optional) – Randomly shuffle examples in dataset every epoch, by default True. Not used with an IterableDataset.

  • drop_last (bool, optional) – Drop last mini-batch if dataset is not fully divisible by batch_size, by default True. Not used with an IterableDataset.

  • num_workers (int, optional) – Number of dataloader workers, by default 0

class modulus.sym.domain.constraint.constraint.Constraint(nodes: List[Node], dataset: Union[Dataset, IterableDataset], loss: Loss, batch_size: int, shuffle: bool, drop_last: bool, num_workers: int)[source]

Bases: object

Base class for constraints

static get_dataloader(dataset: Union[Dataset, IterableDataset], batch_size: int, shuffle: bool, drop_last: bool, num_workers: int, distributed: Optional[bool] = None, infinite: bool = True)[source]

Return an appropriate dataloader given a dataset

class modulus.sym.domain.constraint.constraint.InfiniteDataLoader(dataloader)[source]

Bases: object

An infinite dataloader, for use with map-style datasets to avoid StopIteration after each epoch
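The idea behind this wrapper can be sketched in a few lines of pure Python (a simplified stand-in, not the Modulus implementation): iteration over the wrapped loader restarts from the beginning instead of raising StopIteration at the end of an epoch.

```python
class InfiniteLoader:
    """Minimal sketch: cycle a finite, map-style dataloader forever."""

    def __init__(self, dataloader):
        self.dataloader = dataloader

    def __iter__(self):
        while True:                      # restart instead of raising StopIteration
            for batch in self.dataloader:
                yield batch

# A plain list of batches stands in for a real dataloader here.
batches = [[0, 1], [2, 3]]
stream = iter(InfiniteLoader(batches))
first_epoch = [next(stream) for _ in range(2)]
wrapped = next(stream)                   # begins epoch 2 instead of stopping
```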

© Copyright 2023, NVIDIA Modulus Team. Last updated on Oct 17, 2023.