modulus.continuous

Continuous type constraints

class modulus.continuous.constraints.constraint.IntegralBoundaryConstraint(nodes: typing.List[modulus.node.Node], geometry: modulus.geometry.geometry.Geometry, outvar: typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]], batch_size: int, integral_batch_size: int, criteria=True, lambda_weighting: typing.Optional[typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]]] = None, param_ranges: typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]] = {}, fixed_dataset: bool = True, batch_per_epoch: int = 100, quasirandom: bool = False, num_workers: int = 0, loss: modulus.loss.Loss = IntegralLossNorm())

Bases: modulus.continuous.constraints.constraint.IntegralConstraint

Integral Constraint applied to boundary/perimeter/surface of geometry. For example, in 3D this will create a constraint on the surface of the given geometry.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

geometry : Geometry

Modulus Geometry to apply the constraint with.

outvar : Dict[str, Union[int, float, sp.Basic]]

A dictionary of SymPy Symbols/Exprs, floats, or ints describing the constraint. For example, outvar={'u': 0} specifies that the integral of 'u' should be zero.

batch_size : int

Number of integrals to apply.

integral_batch_size : int

Batch size used in the Monte Carlo integration that computes each integral.

criteria : Union[sp.Basic, True]

SymPy criteria function that restricts integration to areas satisfying it. For example, if criteria=sympy.Symbol('x')>0 then only areas with positive 'x' values are integrated.

lambda_weighting : Dict[str, Union[int, float, sp.Basic]] = None

The weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0} weights the integral constraint by 2.0.

param_ranges : Dict[sp.Basic, Tuple[float, float]] = {}

Allows adding parameterization or additional inputs.

fixed_dataset : bool = True

If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

batch_per_epoch : int = 100

If fixed_dataset=True, the total number of integrals generated to apply the constraint on is total_nr_integrals = batch_per_epoch * batch_size.

quasirandom : bool = False

If True, sample the points using the Halton sequence.

num_workers : int

Number of workers used in fetching data.

loss : Loss

Modulus Loss module that defines the loss type (e.g. L2, L1, ...).
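
A minimal usage sketch follows. It assumes `nodes` (a list of Modulus Nodes) and `geo` (a Modulus Geometry) have been built elsewhere, and `normal_dot_vel` is a hypothetical output of the unrolled graph; the parameters are exactly those documented above.

    from sympy import Symbol

    from modulus.continuous.constraints.constraint import IntegralBoundaryConstraint

    # `nodes` (List[Node]) and `geo` (Geometry) are assumed to exist already;
    # 'normal_dot_vel' is a hypothetical output variable of the unrolled graph.
    mass_balance = IntegralBoundaryConstraint(
        nodes=nodes,
        geometry=geo,
        outvar={"normal_dot_vel": 0},  # enforce zero net flux through the boundary
        batch_size=1,                  # number of integrals per batch
        integral_batch_size=512,       # Monte Carlo samples per integral
        criteria=Symbol("x") > 0,      # only integrate the x > 0 part of the boundary
    )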

class modulus.continuous.constraints.constraint.IntegralConstraint(dataset, nodes, num_workers=0, loss=IntegralLossNorm())

Bases: modulus.constraint.Constraint

Base class for all Integral Constraints

class modulus.continuous.constraints.constraint.PointwiseBoundaryConstraint(nodes: typing.List[modulus.node.Node], geometry: modulus.geometry.geometry.Geometry, outvar: typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]], batch_size: int, criteria=True, lambda_weighting: typing.Optional[typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]]] = None, param_ranges: typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]] = {}, fixed_dataset: bool = True, importance_measure: typing.Optional[typing.Callable] = None, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: modulus.loss.Loss = PointwiseLossNorm())

Bases: modulus.continuous.constraints.constraint.PointwiseConstraint

Pointwise Constraint applied to boundary/perimeter/surface of geometry. For example, in 3D this will create a constraint on the surface of the given geometry.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

geometry : Geometry

Modulus Geometry to apply the constraint with.

outvar : Dict[str, Union[int, float, sp.Basic]]

A dictionary of SymPy Symbols/Exprs, floats, or ints describing the constraint. For example, outvar={'u': 0} specifies that 'u' should be zero everywhere on the constraint.

batch_size : int

Batch size used in training.

criteria : Union[sp.Basic, True]

SymPy criteria function that restricts the constraint to areas satisfying it. For example, if criteria=sympy.Symbol('x')>0 then the constraint is applied only to areas with positive 'x' values.

lambda_weighting : Dict[str, Union[int, float, sp.Basic]] = None

The spatial pointwise weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0*sympy.Symbol('x')} applies a pointwise weighting of 2.0 * x to the loss.

param_ranges : Dict[sp.Basic, Tuple[float, float]] = {}

Allows adding parameterization or additional inputs.

fixed_dataset : bool = True

If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

importance_measure : Union[Callable, None] = None

A callable that computes a scalar importance measure. This measure is used when sampling points for the constraint: areas with higher importance are sampled more frequently, following Monte Carlo importance sampling (https://en.wikipedia.org/wiki/Monte_Carlo_integration).

batch_per_epoch : int = 1000

If fixed_dataset=True, the total number of points generated to apply the constraint on is total_nr_points = batch_per_epoch * batch_size.

quasirandom : bool = False

If True, sample the points using the Halton sequence.

num_workers : int

Number of workers used in fetching data.

loss : Loss

Modulus Loss module that defines the loss type (e.g. L2, L1, ...).
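
A minimal usage sketch, again assuming `nodes` and `geo` exist and that 'u' and 'v' are hypothetical outputs of the unrolled graph; the lambda_weighting keys follow the docstring's own 'lambda_u' naming.

    from sympy import Symbol

    from modulus.continuous.constraints.constraint import PointwiseBoundaryConstraint

    # `nodes` (List[Node]) and `geo` (Geometry) are assumed to exist already.
    no_slip = PointwiseBoundaryConstraint(
        nodes=nodes,
        geometry=geo,
        outvar={"u": 0, "v": 0},     # enforce u = v = 0 on the boundary
        batch_size=1000,
        criteria=Symbol("y") < 1.0,  # apply only where y < 1
        lambda_weighting={"lambda_u": 2.0 * Symbol("x"), "lambda_v": 1.0},
    )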

class modulus.continuous.constraints.constraint.PointwiseConstraint(dataset, nodes, num_workers=0, loss=PointwiseLossNorm())

Bases: modulus.constraint.Constraint

Base class for all Pointwise Constraints

classmethod from_numpy(nodes: List[modulus.node.Node], invar: Dict[str, numpy.ndarray], outvar: Dict[str, numpy.ndarray], batch_size: int, lambda_weighting: Optional[Dict[str, numpy.ndarray]] = None, num_workers=0, loss=PointwiseLossNorm())

Create custom pointwise constraint from numpy arrays.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

invar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays as input.

outvar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays to enforce constraint on.

batch_size : int

Batch size used in training.

lambda_weighting : Union[Dict[str, np.ndarray (N, 1)], None]

Dictionary of numpy arrays to pointwise weight losses. Default is ones.

num_workers : int

Number of workers used in fetching data.

loss : Loss

Modulus Loss module that defines the loss type (e.g. L2, L1, ...).
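
A short sketch of building a data-driven constraint from arrays, using only the parameters documented above; `nodes` is assumed and the (N, 1) arrays are synthetic placeholders.

    import numpy as np

    from modulus.continuous.constraints.constraint import PointwiseConstraint

    # `nodes` (List[Node]) is assumed; synthetic (N, 1) arrays stand in for real data.
    N = 1000
    invar = {"x": np.random.uniform(0, 1, (N, 1)), "y": np.random.uniform(0, 1, (N, 1))}
    outvar = {"u": np.zeros((N, 1))}

    data_constraint = PointwiseConstraint.from_numpy(
        nodes=nodes,
        invar=invar,
        outvar=outvar,
        batch_size=128,
    )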

class modulus.continuous.constraints.constraint.PointwiseInteriorConstraint(nodes: typing.List[modulus.node.Node], geometry: modulus.geometry.geometry.Geometry, outvar: typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]], batch_size: int, bounds: typing.Optional[typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]]] = None, criteria=True, lambda_weighting: typing.Optional[typing.Dict[str, typing.Union[int, float, sympy.core.basic.Basic]]] = None, param_ranges: typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]] = {}, fixed_dataset: bool = True, importance_measure: typing.Optional[typing.Callable] = None, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: modulus.loss.Loss = PointwiseLossNorm())

Bases: modulus.continuous.constraints.constraint.PointwiseConstraint

Pointwise Constraint applied to interior of geometry. For example, in 3D this will create a constraint on the interior volume of the given geometry.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

geometry : Geometry

Modulus Geometry to apply the constraint with.

outvar : Dict[str, Union[int, float, sp.Basic]]

A dictionary of SymPy Symbols/Exprs, floats, or ints describing the constraint. For example, outvar={'u': 0} specifies that 'u' should be zero everywhere in the constraint.

batch_size : int

Batch size used in training.

bounds : Dict[sp.Basic, Tuple[float, float]] = None

Bounds of the given geometry (e.g. bounds={sympy.Symbol('x'): (0, 1), sympy.Symbol('y'): (0, 1)}).

criteria : Union[sp.Basic, True]

SymPy criteria function that restricts the constraint to areas satisfying it. For example, if criteria=sympy.Symbol('x')>0 then the constraint is applied only to areas with positive 'x' values.

lambda_weighting : Dict[str, Union[int, float, sp.Basic]] = None

The spatial pointwise weighting of the constraint. For example, lambda_weighting={'lambda_u': 2.0*sympy.Symbol('x')} applies a pointwise weighting of 2.0 * x to the loss.

param_ranges : Dict[sp.Basic, Tuple[float, float]] = {}

Allows adding parameterization or additional inputs.

fixed_dataset : bool = True

If True, the points for this constraint are sampled once at initialization and then fixed. If False, the points are continually resampled.

importance_measure : Union[Callable, None] = None

A callable that computes a scalar importance measure. This measure is used when sampling points for the constraint: areas with higher importance are sampled more frequently, following Monte Carlo importance sampling (https://en.wikipedia.org/wiki/Monte_Carlo_integration).

batch_per_epoch : int = 1000

If fixed_dataset=True, the total number of points generated to apply the constraint on is total_nr_points = batch_per_epoch * batch_size.

quasirandom : bool = False

If True, sample the points using the Halton sequence.

num_workers : int

Number of workers used in fetching data.

loss : Loss

Modulus Loss module that defines the loss type (e.g. L2, L1, ...).
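
A minimal usage sketch; `nodes` and `geo` are assumed, and 'poisson_u' is a hypothetical PDE residual output of the graph that should vanish on the interior.

    from sympy import Symbol

    from modulus.continuous.constraints.constraint import PointwiseInteriorConstraint

    x, y = Symbol("x"), Symbol("y")

    # `nodes` (List[Node]) and `geo` (Geometry) are assumed to exist already.
    interior = PointwiseInteriorConstraint(
        nodes=nodes,
        geometry=geo,
        outvar={"poisson_u": 0},        # drive the PDE residual to zero
        batch_size=4000,
        bounds={x: (0, 1), y: (0, 1)},  # bounds of the geometry
    )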

class modulus.continuous.constraints.constraint.VariationalConstraint(datasets: Dict[str, modulus.continuous.dataset.dataset.VariationalDataset], nodes: List[modulus.node.Node], num_workers: int = 0, loss=PointwiseLossNorm())

Bases: modulus.constraint.Constraint

Base class for all Variational Constraints.

B(u, v, g, dom) = int_{dom} (F(u, v) - g*v) dx = 0, where F is an operator, g is a given function/data, and v is the test function. The variational loss is the sum B_1(u_1, v_1, g_1, dom_1) + B_2(u_2, v_2, g_2, dom_2) + ...

class modulus.continuous.constraints.constraint.VariationalDomainConstraint(nodes: typing.List[modulus.node.Node], geometry: modulus.geometry.geometry.Geometry, outvar_names: typing.List[str], boundary_batch_size: int, interior_batch_size: int, interior_bounds: typing.Optional[typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]]] = None, boundary_criteria=True, interior_criteria=True, param_ranges: typing.Dict[sympy.core.basic.Basic, typing.Tuple[float, float]] = {}, batch_per_epoch: int = 1000, quasirandom: bool = False, num_workers: int = 0, loss: modulus.loss.Loss = PointwiseLossNorm())

Bases: modulus.continuous.constraints.constraint.VariationalConstraint

Simple Variational Domain Constraint with a single geometry that represents the domain.

TODO add comprehensive doc string after refactor

Modulus Dataset constructors for continuous type data

class modulus.continuous.dataset.dataset.ContinuousIntegralDataset(batch_size, invar_fn, outvar_fn, lambda_weighting_fn=None, param_ranges_fn=None)

Bases: torch.utils.data.dataset.IterableDataset

An infinitely iterable dataset for a continuous set of integral training examples. This will resample training examples (create new ones) every iteration.

class modulus.continuous.dataset.dataset.ContinuousPointwiseDataset(invar_fn, outvar_fn, lambda_weighting_fn=None)

Bases: torch.utils.data.dataset.IterableDataset, modulus.continuous.dataset.dataset.PointwiseDataset

An infinitely iterable dataset for a continuous set of pointwise training examples. This will resample training examples (create new ones) every iteration.

class modulus.continuous.dataset.dataset.FixedIntegralDataset(batch_size, list_invar, list_outvar, shuffle: bool = True, drop_last: bool = False, list_lambda_weighting=None)

Bases: torch.utils.data.dataset.IterableDataset

An infinitely iterable dataset for a finite set of integral training examples.

class modulus.continuous.dataset.dataset.FixedPointwiseDataset(batch_size, invar, outvar, lambda_weighting=None, shuffle: bool = True, drop_last: bool = False)

Bases: torch.utils.data.dataset.IterableDataset, modulus.continuous.dataset.dataset.PointwiseDataset

An infinitely iterable dataset for a finite set of pointwise training examples.

class modulus.continuous.dataset.dataset.ImportanceSampledPointwiseDataset(batch_size, invar, outvar, importance_measure, lambda_weighting=None, shuffle: bool = True, drop_last: bool = False, resample_freq: int = 1000)

Bases: modulus.continuous.dataset.dataset.FixedPointwiseDataset

An infinitely iterable dataset that applies importance sampling for faster, more accurate Monte Carlo integration.

class modulus.continuous.dataset.dataset.PointwiseInferenceDataset(batch_size, invar, shuffle: bool = False, drop_last: bool = False)

Bases: torch.utils.data.dataset.IterableDataset, modulus.continuous.dataset.dataset.PointwiseDataset

A finitely iterable dataset for running inference with the model; it contains only inputs.

class modulus.continuous.dataset.dataset.PointwiseValidationDataset(batch_size, invar, outvar, lambda_weighting=None, shuffle: bool = False, drop_last: bool = False)

Bases: torch.utils.data.dataset.IterableDataset, modulus.continuous.dataset.dataset.PointwiseDataset

A finitely iterable dataset for evaluating the model.

class modulus.continuous.dataset.dataset.VariationalDataset(batch_size: int, invar: Dict[str, numpy.array], outvar_names: List[str], shuffle: bool = True, drop_last: bool = False)

Bases: torch.utils.data.dataset.IterableDataset, modulus.continuous.dataset.dataset.PointwiseDataset

Domain

class modulus.continuous.domain.domain.Domain(name: str = 'domain', encoding=None)

Bases: object

Domain object that contains all needed information about constraints, validators, inferencers, and monitors.

name : str

Unique name for domain.

encoding : Union[np.ndarray, None]

Possible encoding vector for domain. Currently not in use.

add_constraint(constraint, name: Optional[str] = None)

Method to add a constraint to domain.

constraint : Constraint

Constraint to be added to domain.

name : str

Unique name of constraint. If a duplicate is found, the name is iterated to avoid duplication.

add_inferencer(inferencer: modulus.continuous.inferencer.inferencer.Inferencer, name: Optional[str] = None)

Method to add an inferencer to domain.

inferencer : Inferencer

Inferencer to be added to domain.

name : str

Unique name of inferencer. If a duplicate is found, the name is iterated to avoid duplication.

add_monitor(monitor: modulus.continuous.monitor.monitor.Monitor, name: Optional[str] = None)

Method to add a monitor to domain.

monitor : Monitor

Monitor to be added to domain.

name : str

Unique name of monitor. If a duplicate is found, the name is iterated to avoid duplication.

add_validator(validator: modulus.continuous.validator.validator.Validator, name: Optional[str] = None)

Method to add a validator to domain.

validator : Validator

Validator to be added to domain.

name : str

Unique name of validator. If a duplicate is found, the name is iterated to avoid duplication.

rec_inferencers(base_dir: str, writer: torch.utils.tensorboard.writer.SummaryWriter, save_filetypes: str, step: int)

Run and save results of inferencer nodes

rec_monitors(base_dir: str, writer: torch.utils.tensorboard.writer.SummaryWriter, step: int)

Run and save results of monitor nodes

rec_validators(base_dir: str, writer: torch.utils.tensorboard.writer.SummaryWriter, save_filetypes: str, step: int)

Run and save results of validator nodes
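
A sketch of assembling a domain with the add_* methods above; the constraint, validator, and monitor objects are assumed to have been built as in the surrounding examples.

    from modulus.continuous.domain.domain import Domain

    # `interior`, `no_slip`, `validator`, and `monitor` are assumed objects of
    # the corresponding Constraint/Validator/Monitor types built elsewhere.
    domain = Domain(name="ldc")
    domain.add_constraint(interior, name="interior")
    domain.add_constraint(no_slip, name="no_slip")
    domain.add_validator(validator, name="val")
    domain.add_monitor(monitor, name="force")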

Inferencer for Solver class

class modulus.continuous.inferencer.inferencer.Inferencer

Bases: object

Inferencer base class

class modulus.continuous.inferencer.inferencer.PointVTKInferencer(vtk_obj: modulus.plot_utils.vtk.VTKBase, nodes: List[modulus.node.Node], input_vtk_map: Dict[str, List[str]], output_names: List[str], invar: Dict[str, numpy.array] = {}, batch_size: int = 1024, mask_fn: Optional[Callable] = None, mask_value: float = nan, plotter=None, requires_grad: bool = True, log_iter: bool = False)

Bases: modulus.continuous.inferencer.inferencer.PointwiseInferencer

Pointwise inferencer using mesh points of VTK object

vtk_obj : VTKBase

Modulus VTK object to use point locations from.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

input_vtk_map : Dict[str, List[str]]

Dictionary mapping from Modulus input variables to VTK variable names, {"modulus name": ["vtk name"]}. Use colons to denote components of multi-dimensional VTK arrays ("name":#).

output_names : List[str]

List of desired outputs.

invar : Dict[str, np.array], optional

Dictionary of additional numpy arrays as input, by default {}.

batch_size : int

Batch size used when running inference.

mask_fn : Union[Callable, None], optional

Masking function to remove points from inferencing, by default None.

mask_value : float, optional

Value to assign masked points, by default NaN.

plotter : Plotter, optional

Modulus Plotter for showing results in TensorBoard, by default None.

requires_grad : bool, optional

If automatic differentiation is needed for computing results, by default True.

log_iter : bool, optional

Save results to a different file each call, by default False.

class modulus.continuous.inferencer.inferencer.PointwiseInferencer(invar, output_names, nodes, batch_size: int = 1024, plotter=None, requires_grad: bool = True)

Bases: modulus.continuous.inferencer.inferencer.Inferencer

Pointwise Inferencer that allows inferencing on pointwise data

invar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays as input.

output_names : List[str]

List of desired outputs.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

batch_size : int = 1024

Batch size used when running inference.

plotter : Union[Plotter, None]

Modulus Plotter for showing results in TensorBoard.

requires_grad : bool = True

If automatic differentiation is needed for computing results.
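
A sketch of inferencing on a uniform grid of points; `nodes` and `domain` are assumed from earlier examples, and 'u', 'v', 'p' are hypothetical graph outputs.

    import numpy as np

    from modulus.continuous.inferencer.inferencer import PointwiseInferencer

    # `nodes` (List[Node]) and `domain` (Domain) are assumed to exist already.
    xs, ys = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
    inferencer = PointwiseInferencer(
        invar={"x": xs.reshape(-1, 1), "y": ys.reshape(-1, 1)},
        output_names=["u", "v", "p"],
        nodes=nodes,
        batch_size=1024,
    )
    domain.add_inferencer(inferencer, name="inf")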

class modulus.continuous.inferencer.inferencer.VoxelInferencer(bounds: List[List[int]], npoints: List[int], nodes: List[modulus.node.Node], output_names: List[str], export_map: Union[None, Dict[str, List[str]]] = None, invar: Dict[str, numpy.array] = {}, batch_size: int = 1024, mask_fn: Optional[Callable] = None, mask_value: float = nan, plotter=None, requires_grad: bool = True, log_iter: bool = False)

Bases: modulus.continuous.inferencer.inferencer.PointVTKInferencer

Inferencer for creating voxel representations. This inferencer works by creating a uniform mesh of voxels and masking out the ones defined by a callable function. The result is a voxel-based representation of any complex geometry at any resolution.

bounds : List[List[int]]

List of domain bounds to form uniform rectangular domain.

npoints : List[int]

Resolution of voxels in each domain.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

output_names : List[str]

List of desired outputs.

export_map : Dict[str, List[str]], optional

Export map dictionary with keys that are VTK variable names and values that are lists of output variables. Will use a 1:1 mapping if none is provided, by default None.

invar : Dict[str, np.array], optional

Dictionary of additional numpy arrays as input, by default {}.

mask_fn : Union[Callable, None], optional

Masking function to remove points from inferencing, by default None.

mask_value : float, optional

Value to assign masked points, by default NaN.

plotter : Plotter, optional

Modulus Plotter for showing results in TensorBoard, by default None.

requires_grad : bool, optional

If automatic differentiation is needed for computing results, by default True.

log_iter : bool, optional

Save results to a different file each call, by default False.

Monitor for Solver class

class modulus.continuous.monitor.monitor.Monitor

Bases: object

Monitor base class

class modulus.continuous.monitor.monitor.PointwiseMonitor(invar, output_names, metrics, nodes)

Bases: object

Pointwise Monitor that allows computing metrics on pointwise data.

invar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays as input.

output_names : List[str]

List of outputs needed for metric.

metrics : Dict[str, Callable]

Dictionary of PyTorch functions whose input is a dictionary of torch tensors whose keys are the output_names. The keys of metrics are used to label the metrics in TensorBoard/CSV outputs.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.
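
A sketch of monitoring the mean of one output over a set of probe points; `nodes` and `domain` are assumed, and 'p' is a hypothetical output of the unrolled graph.

    import numpy as np
    import torch

    from modulus.continuous.monitor.monitor import PointwiseMonitor

    # `nodes` (List[Node]) and `domain` (Domain) are assumed to exist already.
    probe = {
        "x": np.random.uniform(0, 1, (256, 1)),
        "y": np.random.uniform(0, 1, (256, 1)),
    }
    monitor = PointwiseMonitor(
        invar=probe,
        output_names=["p"],
        metrics={"mean_p": lambda var: torch.mean(var["p"])},  # labeled 'mean_p' in TensorBoard
        nodes=nodes,
    )
    domain.add_monitor(monitor, name="mean_p")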

Modulus Neural Differential Equation Solver

class modulus.continuous.solvers.solver.MultiDomainSolver(cfg: omegaconf.dictconfig.DictConfig, domains: List[modulus.continuous.domain.domain.Domain])

Bases: modulus.continuous.solvers.solver.Solver

Solver class for solving multiple domains. NOTE this Solver is currently experimental and not fully supported.

class modulus.continuous.solvers.solver.SequentialSolver(cfg: omegaconf.dictconfig.DictConfig, domains: List[Tuple[int, modulus.continuous.domain.domain.Domain]], custom_update_operation: Optional[Callable] = None)

Bases: modulus.continuous.solvers.solver.Solver

Solver class for solving a sequence of domains. This solver can be used to set up iterative methods like the hFTB conjugate heat transfer method or the moving time window method for transient problems.

cfg : DictConfig

Hydra dictionary of configs.

domains : List[Tuple[int, Domain]]

List of Domains to sequentially solve. Each domain is given as a tuple where the first element is an int for how many times to solve the domain and the second element is the domain. For example, domains=[(1, domain_a), (4, domain_b)] would solve domain_a once and then solve domain_b 4 times in a row.

custom_update_operation : Union[Callable, None] = None

A callable function to update any weights in models. This function will be called at the end of every iteration.
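
A sketch following the docstring's own example; `cfg`, `domain_a`, and `domain_b` are assumed, and the solve() entry point is assumed to be inherited from modulus.trainer.Trainer.

    from modulus.continuous.solvers.solver import SequentialSolver

    # `cfg` (Hydra DictConfig), `domain_a`, and `domain_b` (Domain) are assumed.
    solver = SequentialSolver(cfg, domains=[(1, domain_a), (4, domain_b)])
    solver.solve()  # solve() is assumed to come from modulus.trainer.Trainer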

class modulus.continuous.solvers.solver.Solver(cfg: omegaconf.dictconfig.DictConfig, domain: modulus.continuous.domain.domain.Domain)

Bases: modulus.trainer.Trainer

Base solver class for solving a single domain.

cfg : DictConfig

Hydra dictionary of configs.

domain : Domain

Domain to solve for.
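
A sketch of the typical single-domain entry point; `cfg` and `domain` are assumed from earlier examples, and solve() is assumed to be inherited from modulus.trainer.Trainer.

    from modulus.continuous.solvers.solver import Solver

    # `cfg` (Hydra DictConfig) and `domain` (Domain) are assumed to exist.
    slv = Solver(cfg, domain)
    slv.solve()  # solve() is assumed to come from modulus.trainer.Trainer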

Validator for Solver class

class modulus.continuous.validator.validator.PointVTKValidator(vtk_obj: modulus.plot_utils.vtk.VTKBase, nodes: List[modulus.node.Node], input_vtk_map: Dict[str, List[str]], true_vtk_map: Dict[str, List[str]], invar: Dict[str, numpy.array] = {}, true_outvar: Dict[str, numpy.array] = {}, batch_size: int = 1000, plotter=None, requires_grad: bool = True, log_iter: bool = False)

Bases: modulus.continuous.validator.validator.PointwiseValidator

Pointwise validator using mesh points of VTK object

vtk_obj : VTKBase

Modulus VTK object to use point locations from.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

input_vtk_map : Dict[str, List[str]]

Dictionary mapping from Modulus input variables to VTK variable names, {"modulus name": ["vtk name"]}. Use colons to denote components of multi-dimensional VTK arrays ("name":#).

true_vtk_map : Dict[str, List[str]]

Dictionary mapping from Modulus target variables to VTK variable names, {"modulus name": ["vtk name"]}.

invar : Dict[str, np.array], optional

Dictionary of additional numpy arrays as input, by default {}.

true_outvar : Dict[str, np.array], optional

Dictionary of additional numpy arrays to validate against, by default {}.

batch_size : int

Batch size used when running validation.

plotter : Plotter, optional

Modulus Plotter for showing results in TensorBoard, by default None.

requires_grad : bool, optional

If automatic differentiation is needed for computing results, by default True.

log_iter : bool, optional

Save results to a different file each call, by default False.

class modulus.continuous.validator.validator.PointwiseValidator(invar, true_outvar, nodes, batch_size: int = 1024, plotter=None, requires_grad: bool = True)

Bases: modulus.continuous.validator.validator.Validator

Pointwise Validator that allows validating on pointwise data.

invar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays as input.

true_outvar : Dict[str, np.ndarray (N, 1)]

Dictionary of numpy arrays to validate against.

nodes : List[Node]

List of Modulus Nodes to unroll graph with.

batch_size : int = 1024

Batch size used when running validation.

plotter : Union[Plotter, None]

Modulus Plotter for showing results in TensorBoard.

requires_grad : bool = True

If automatic differentiation is needed for computing results.
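
A sketch of validating against reference data; `nodes` and `domain` are assumed from earlier examples, and `x` / `u_true` are hypothetical (N, 1) reference arrays.

    import numpy as np

    from modulus.continuous.validator.validator import PointwiseValidator

    # `nodes` (List[Node]) and `domain` (Domain) are assumed to exist already.
    x = np.linspace(0, 1, 512).reshape(-1, 1)
    u_true = np.sin(np.pi * x)  # stand-in reference solution
    validator = PointwiseValidator(
        invar={"x": x},
        true_outvar={"u": u_true},
        nodes=nodes,
        batch_size=1024,
    )
    domain.add_validator(validator, name="u_val")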

class modulus.continuous.validator.validator.Validator

Bases: object

Validator base class
