modulus.discrete¶
discrete.constraints.constraint¶
Discrete type constraints
- class modulus.discrete.constraints.constraint.DeepONetConstraint_Data(nodes: List[modulus.node.Node], invar_branch: Dict[str, numpy.array], invar_trunk: Dict[str, numpy.array], outvar: Dict[str, numpy.array], batch_size: int, cell_volumes: Optional[numpy.array] = None, lambda_weighting: Optional[Dict[str, Union[numpy.array, sympy.Basic]]] = None, lazy_loading: bool = False, num_workers: int = 0, loss=PointwiseLossNorm())¶
- class modulus.discrete.constraints.constraint.DeepONetConstraint_Physics(nodes: List[modulus.node.Node], invar_branch: Dict[str, numpy.array], invar_trunk: Dict[str, numpy.array], outvar: Dict[str, numpy.array], batch_size: int, tile_trunk_input: bool = True, cell_volumes: Optional[numpy.array] = None, lambda_weighting: Optional[Dict[str, Union[numpy.array, sympy.Basic]]] = None, lazy_loading: bool = False, num_workers: int = 0, loss=PointwiseLossNorm())¶
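The two DeepONet constraints above take separate input dictionaries for the branch net (sampled input functions) and the trunk net (query coordinates). A minimal sketch of preparing such inputs, where the key names (`"a"`, `"x"`, `"u"`) and shapes are illustrative assumptions, not Modulus requirements:

```python
import numpy as np

# Hypothetical DeepONet data layout: the branch net sees each sampled input
# function, the trunk net sees the query coordinates for the outputs.
n_examples, n_sensors, n_points = 32, 100, 50

invar_branch = {"a": np.random.rand(n_examples, n_sensors)}    # input functions
invar_trunk = {"x": np.random.rand(n_examples * n_points, 1)}  # query coordinates
outvar = {"u": np.random.rand(n_examples * n_points, 1)}       # target outputs

# Sketch of constructing the constraint (assumes `nodes` is a list of
# Modulus Nodes defined elsewhere):
# constraint = DeepONetConstraint_Data(
#     nodes=nodes,
#     invar_branch=invar_branch,
#     invar_trunk=invar_trunk,
#     outvar=outvar,
#     batch_size=16,
# )
```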
- class modulus.discrete.constraints.constraint.GridConstraint(dataset, nodes, num_workers, loss=PointwiseLossNorm())¶
Bases:
modulus.constraint.Constraint
Grid Constraint
- class modulus.discrete.constraints.constraint.SupervisedGridConstraint(nodes: List[modulus.node.Node], invar: Union[Dict[str, numpy.array], Dict[str, modulus.discrete.dataset.datafile.DataFile]], outvar: Union[Dict[str, numpy.array], Dict[str, modulus.discrete.dataset.datafile.DataFile]], batch_size: int, cell_volumes: Optional[numpy.array] = None, lambda_weighting: Optional[Dict[str, Union[numpy.array, sympy.Basic]]] = None, lazy_loading: bool = False, num_workers: int = 0, loss: modulus.loss.Loss = PointwiseLossNorm())¶
Bases:
modulus.discrete.constraints.constraint.GridConstraint
Data-driven grid field constraint
- nodesList[Node]
List of Modulus Nodes to unroll graph with.
- invarUnion[Dict[str, np.array], Dict[str, DataFile]]
Dictionary of numpy arrays as input. Input arrays should be of the form [B, cin, xdim, …]. If lazy loading is used, this should be a dictionary of DataFile objects.
- outvarUnion[Dict[str, np.array], Dict[str, DataFile]]
Dictionary of numpy arrays as target outputs. Target arrays should be of the form [B, cin, xdim, …]. If lazy loading is used, this should be a dictionary of DataFile objects.
- batch_sizeint
Batch size used when running constraint.
- lambda_weightingDict[str, Union[np.array, sp.Basic]], optional
The weighting of the constraint, by default None
- lazy_loadingbool, optional
Use lazy-loading of dataset, by default False
- num_workersint, optional
Number of dataloader workers, by default 0
- lossLoss, optional
Modulus Loss function, by default PointwiseLossNorm()
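A sketch of preparing data in the documented [B, cin, xdim, …] layout (batch, channels, then spatial dimensions) for a SupervisedGridConstraint. The key names (`"coeff"`, `"sol"`) are placeholders:

```python
import numpy as np

# Eager-loading case: plain numpy arrays in [B, cin, xdim, ...] layout,
# here a 2D grid with one channel.
B, cin, H, W = 8, 1, 64, 64

invar = {"coeff": np.random.rand(B, cin, H, W).astype(np.float32)}
outvar = {"sol": np.random.rand(B, cin, H, W).astype(np.float32)}

# Sketch of constructing the constraint (assumes `nodes` is a list of
# Modulus Nodes defined elsewhere):
# constraint = SupervisedGridConstraint(
#     nodes=nodes,
#     invar=invar,
#     outvar=outvar,
#     batch_size=4,
# )
```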
discrete.dataset.datafile¶
- class modulus.discrete.dataset.datafile.DataFile(cache_size: Optional[int] = None)¶
Bases:
object
Base class for storing datafile information needed for lazy loading of data. Should typically be used with large datasets that consist of multiple files or can be partially loaded (e.g. HDF5).
- class modulus.discrete.dataset.datafile.HDF5DataFile(file_name_or_path: Union[str, pathlib.Path], variable_name: str, cache_size: Optional[int] = None)¶
Bases:
modulus.discrete.dataset.datafile.DataFile
Datafile class for an HDF5 file
discrete.dataset.dataset¶
Modulus Dataset constructors for discrete type data
- class modulus.discrete.dataset.dataset.GridDataset(*args, **kwds)¶
Bases:
torch.utils.data.dataset.IterableDataset, modulus.discrete.dataset.dataset.DiscreteDataset
Base class for data-driven learning of data on a structured grid
- class modulus.discrete.dataset.dataset.SupervisedGridDataset(batch_size: int, invar: Union[Dict[str, numpy.array], Dict[str, modulus.discrete.dataset.datafile.DataFile]], outvar: Union[Dict[str, numpy.array], Dict[str, modulus.discrete.dataset.datafile.DataFile]], lambda_weighting: Optional[Dict[str, Union[numpy.array, sympy.Basic]]] = None, shuffle: bool = True, drop_last: bool = False, lazy_loading: bool = True)¶
Bases:
modulus.discrete.dataset.dataset.GridDataset
An infinitely iterable dataset for data-driven learning of data on a structured grid
- batch_sizeint
Batch size of dataset
- invarUnion[Dict[str, np.array], Dict[str, DataFile]]
Dictionary of numpy arrays as input. Input arrays should be of the form [B, cin, xdim, …]. If lazy loading is used, this should be a dictionary of DataFile objects.
- outvarUnion[Dict[str, np.array], Dict[str, DataFile]]
Dictionary of numpy arrays as target outputs. Target arrays should be of the form [B, cin, xdim, …]. If lazy loading is used, this should be a dictionary of DataFile objects.
- lambda_weightingDict[str, Union[np.array, sp.Basic]], optional
The weighting of each example, by default None
- shufflebool, optional
Randomly shuffle examples in dataset every epoch, by default True
- drop_lastbool, optional
Drop the last mini-batch if the dataset is not fully divisible by batch_size, by default False
- lazy_loadingbool, optional
Use lazy-loading, by default True
- worker_init_fn(i)¶
Called by each worker process when it initializes in a torch DataLoader
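A sketch of constructing a SupervisedGridDataset eagerly, including a per-example lambda weighting that matches the output key. The key names (`"inp"`, `"out"`) are placeholders:

```python
import numpy as np

# Hypothetical 1D grid data: 16 examples, 3 channels, 128 grid points.
B, cin, nx = 16, 3, 128

invar = {"inp": np.random.rand(B, cin, nx)}
outvar = {"out": np.random.rand(B, cin, nx)}
lambda_weighting = {"out": np.ones((B, cin, nx))}  # uniform weighting

# Sketch of constructing the dataset (eager arrays, so lazy_loading=False):
# dataset = SupervisedGridDataset(
#     batch_size=4,
#     invar=invar,
#     outvar=outvar,
#     lambda_weighting=lambda_weighting,
#     lazy_loading=False,
# )
```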
- class modulus.discrete.dataset.dataset.ValidationGridDataset(batch_size: int, invar: Dict[str, numpy.array], outvar: Dict[str, numpy.array], lambda_weighting: Optional[Dict[str, numpy.array]] = None, shuffle: bool = False, drop_last: bool = False)¶
Bases:
modulus.discrete.dataset.dataset.GridDataset
A finitely iterable dataset for data-driven learning of data on a structured grid. Designed to be used with validators.
- batch_sizeint
Batch size of dataset
- invarDict[str, np.array]
Dictionary of numpy arrays as input. Input arrays should be of the form [B, cin, xdim, …].
- outvarDict[str, np.array]
Dictionary of numpy arrays as target outputs. Target arrays should be of the form [B, cin, xdim, …].
- lambda_weightingDict[str, np.array], optional
The weighting of each example, by default None
- shufflebool, optional
Randomly shuffle examples in dataset every epoch, by default True
- drop_lastbool, optional
Drop the last mini-batch if the dataset is not fully divisible by batch_size, by default False
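The drop_last flag on both grid datasets trims a trailing partial mini-batch. A pure-Python illustration of this semantics (not Modulus code): with 10 examples and batch_size=4, drop_last=False yields three batches and drop_last=True yields two full batches.

```python
# Illustration of drop_last semantics on example indices.
def batch_indices(n_examples, batch_size, drop_last):
    batches = [list(range(i, min(i + batch_size, n_examples)))
               for i in range(0, n_examples, batch_size)]
    # Trim a trailing partial batch when requested.
    if drop_last and batches and len(batches[-1]) < batch_size:
        batches = batches[:-1]
    return batches

full = batch_indices(10, 4, drop_last=False)    # 3 batches, last has 2 examples
trimmed = batch_indices(10, 4, drop_last=True)  # 2 full batches
```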
discrete.solvers.solver¶
Modulus Neural Differential Equation Solver
- class modulus.discrete.solvers.solver.Solver(cfg: omegaconf.dictconfig.DictConfig, domain: modulus.continuous.domain.domain.Domain)¶
Bases:
modulus.continuous.solvers.solver.Solver
Trains and evaluates the solver.
discrete.validator.validator¶
Validator for Solver class
- class modulus.discrete.validator.validator.DeepONet_Data_Validator(invar_branch: Dict[str, numpy.array], invar_trunk: Dict[str, numpy.array], true_outvar: Dict[str, numpy.array], nodes, batch_size: int = 100, plotter=None, requires_grad: bool = True)¶
Bases:
modulus.discrete.validator.validator.Validator
DeepONet Validator
- forward_grad(invar)¶
- forward_nograd(invar)¶
- save_results(name, results_dir, writer, save_filetypes, step)¶
- class modulus.discrete.validator.validator.DeepONet_Physics_Validator(invar_branch: Dict[str, numpy.array], invar_trunk: Dict[str, numpy.array], true_outvar: Dict[str, numpy.array], nodes, batch_size: int = 100, plotter=None, requires_grad: bool = True, tile_trunk_input: bool = True)¶
Bases:
modulus.discrete.validator.validator.Validator
DeepONet Validator
- forward_grad(invar)¶
- forward_nograd(invar)¶
- save_results(name, results_dir, writer, save_filetypes, step)¶
- class modulus.discrete.validator.validator.GridValidator(invar: Dict[str, numpy.array], true_outvar: Dict[str, numpy.array], nodes: List[modulus.node.Node], batch_size: int = 100, plotter=None, requires_grad: bool = True)¶
Bases:
modulus.discrete.validator.validator.Validator
Data-driven grid field validator
- invarDict[str, np.array]
Dictionary of numpy arrays as input. Input arrays should be of the form [B, cin, xdim, …].
- true_outvarDict[str, np.array]
Dictionary of numpy arrays as target outputs. Target arrays should be of the form [B, cin, xdim, …].
- nodesList[Node]
List of Modulus Nodes to unroll graph with.
- batch_sizeint, optional
Batch size used when running validation, by default 100
- plotterUnion[Plotter, None]
Modulus Plotter for showing results in TensorBoard.
- requires_gradbool, optional
If automatic differentiation is needed for computing results, by default True
- forward_grad(invar)¶
- forward_nograd(invar)¶
- save_results(name, results_dir, writer, save_filetypes, step)¶
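A sketch of preparing validation data for a GridValidator in the same [B, cin, xdim, …] layout used by the constraints. The key names (`"coeff"`, `"sol"`) and the `domain.add_validator` step are illustrative assumptions:

```python
import numpy as np

# Hypothetical held-out validation set on a 2D grid.
B, cin, H, W = 4, 1, 32, 32

invar = {"coeff": np.random.rand(B, cin, H, W)}
true_outvar = {"sol": np.random.rand(B, cin, H, W)}

# Sketch of attaching the validator (assumes `nodes` and `domain` are
# defined elsewhere):
# validator = GridValidator(
#     invar=invar,
#     true_outvar=true_outvar,
#     nodes=nodes,
#     batch_size=4,
#     requires_grad=False,
# )
# domain.add_validator(validator)
```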