vipr_reflectometry.flow_models.load_model.models.networks package¶
Submodules¶
vipr_reflectometry.flow_models.load_model.models.networks.flow_networks module¶
Flow network models for VIPR reflectometry plugin.
This module contains NSF-based flow models used in VIPR:
- NSFWithConvEmb: used by the mc1_nsf config
- NSFWithStemConvEmb: used by the mc2_nsf config
- class vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.NSF(features: int = 2, context: int = 8, transforms: int = 512, hidden_features: tuple = (64, 64))¶
Bases: Module
Creates a neural spline flow (NSF) with monotonic rational-quadratic spline transformations.
By default, transformations are fully autoregressive. Coupling transformations can be obtained by setting passes=2.
Warning
Spline transformations are defined over the domain \([-5, 5]\). Any feature outside of this domain is not transformed. It is recommended to standardize features (zero mean, unit variance) before training.
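The standardization recommended above takes only a few lines. This is an illustrative numpy sketch, not part of the VIPR API; the helper name `standardize` and the parameter scales are hypothetical:

```python
import numpy as np

def standardize(features, eps=1e-8):
    """Shift and scale each feature column to zero mean, unit variance."""
    mean = features.mean(axis=0)
    std = features.std(axis=0)
    return (features - mean) / (std + eps)

rng = np.random.default_rng(0)
# Hypothetical raw parameters on very different scales
raw = np.column_stack([
    rng.normal(50.0, 10.0, size=1000),    # e.g. a thickness-like parameter
    rng.normal(0.01, 0.002, size=1000),   # e.g. a roughness-like parameter
])
z = standardize(raw)  # columns now fall safely inside the spline domain [-5, 5]
```

Remember to apply the same mean and standard deviation at inference time, and to invert the transform on the flow's outputs.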
See also
zuko.transforms.MonotonicRQSTransform
References
Neural Spline Flows (Durkan et al., 2019)
- Parameters:
features – The number of features.
context – The number of context features.
transforms – The number of autoregressive transformations.
hidden_features – The numbers of hidden features.
- build_nsf()¶
Builds the neural spline flow (NSF).
- Returns:
The constructed flow.
- forward(x, l)¶
Forward pass for the NSF.
- Parameters:
x (torch.Tensor) – Input tensor.
l (torch.Tensor) – Conditional input tensor.
- Returns:
Loss value.
- Return type:
torch.Tensor
- rsample(N_samples, l)¶
Differentiable sampling of N points for each example in the batch.
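The name `rsample` follows the PyTorch convention for reparameterized (differentiable) sampling. As an illustration of the underlying idea only, not the VIPR API, here is a minimal numpy sketch of the reparameterization trick for a Gaussian base distribution:

```python
import numpy as np

def rsample_gaussian(mu, log_sigma, n_samples, rng):
    """Reparameterized sampling: draw eps ~ N(0, I), then transform it
    deterministically, so gradients could flow through mu and sigma."""
    eps = rng.standard_normal((n_samples,) + mu.shape)
    return mu + np.exp(log_sigma) * eps

rng = np.random.default_rng(42)
mu = np.zeros((4, 2))         # batch of 4 examples, 2 features each
log_sigma = np.zeros((4, 2))
samples = rsample_gaussian(mu, log_sigma, n_samples=8, rng=rng)
# one set of 8 samples per batch example: shape (8, 4, 2)
```

In the flow setting, the base samples are additionally pushed through the spline transforms, which are themselves differentiable.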
- class vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.NSFWithConvEmb(in_channels: int = 1, hidden_channels: tuple | list = (32, 64, 128, 256, 512), dim_embedding: int = 128, dim_avpool: int = 1, embedding_net_activation: str = 'gelu', use_batch_norm: bool = False, dim_out: int = 8, hidden_features: tuple = (64, 64), transforms: int = 4, use_selu_init: bool = False, pretrained_embedding_net: str | None = None, use_embedding_net: bool = True)¶
Bases: Module
NSF with a convolutional embedding network.
Used by mc1_nsf configuration.
- forward(y, x)¶
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- rsample(curves, q_values=None, num_samples=1)¶
Differentiable sampling for training with a forward loss.
- sample(curves, q_values=None, num_samples=1)¶
- class vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.NSFWithStemConvEmb(in_channels: int = 1, hidden_channels: tuple | list = (32, 64, 128, 256), dim_embedding: int = 24, dim_avpool: int = 32, embedding_net_activation: str = 'gelu', use_batch_norm: bool = False, dim_out: int = 8, hidden_features: tuple = (128, 128), transforms: int = 8, kernel_size: int = 3, use_se: bool = False, use_selu_init: bool = False, pretrained_embedding_net: str | None = None, use_embedding_net: bool = True, **kwargs)¶
Bases: Module
NSF conditioned on a pre-trained, deterministic StemConvEncoderVAE. Uses the encoder's mean (mu) as context.
This implementation follows the architecture from reflectorch but is self-contained within the VIPR plugin.
- forward(y, x)¶
Forward pass; x is the curve input.
- rsample(curves, q_values=None, num_samples=1)¶
Differentiable sampling.
- sample(curves, q_values=None, num_samples=1)¶
Sample from the flow.
- class vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.SEBlock(in_channels, reduction=16)¶
Bases: Module
Squeeze-and-Excitation block (https://arxiv.org/abs/1709.01507).
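As a reference for what the block computes, here is a minimal numpy sketch of squeeze-and-excitation on a `(batch, channels, length)` array. The weights and shapes are illustrative, not the module's actual parameters:

```python
import numpy as np

def se_block_1d(x, w1, w2):
    """Squeeze-and-Excitation for a (batch, channels, length) array:
    squeeze via global average pooling, excite with a two-layer
    bottleneck MLP, then rescale each channel by a sigmoid gate."""
    s = x.mean(axis=-1)                      # squeeze: (batch, channels)
    h = np.maximum(s @ w1, 0.0)              # ReLU bottleneck: (batch, channels // r)
    gate = 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid gate: (batch, channels)
    return x * gate[..., None]               # rescale channels

rng = np.random.default_rng(0)
channels, reduction = 32, 16
x = rng.standard_normal((2, channels, 100))
w1 = rng.standard_normal((channels, channels // reduction)) * 0.1
w2 = rng.standard_normal((channels // reduction, channels)) * 0.1
y = se_block_1d(x, w1, w2)
```

Because the gate lies in (0, 1), the block can only attenuate channels, letting the network learn a cheap channel-attention mechanism.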
- forward(x)¶
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- class vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.StemConvEncoderVAE(in_channels: int = 1, hidden_channels: tuple = (32, 64, 128, 256, 512), kernel_size: int = 3, dim_embedding: int = 64, dim_avpool: int = 1, use_batch_norm: bool = True, use_se: bool = False, activation: str = 'relu')¶
Bases: Module
A 1D CNN encoder for a VAE, modified with an initial 'stem' layer. The stem layer uses stride=1 to process the input at full resolution before subsequent layers begin downsampling with stride=2. This helps preserve high-frequency or broad, low-amplitude features.
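The effect of the stem follows from the standard 1D convolution length formula. The sketch below traces a hypothetical input of length 256 through a stride-1 stem and five stride-2 stages, assuming kernel size 3 and padding 1 (the actual layer hyperparameters may differ):

```python
def conv_out_len(length, kernel_size=3, stride=1, padding=1):
    """Output length of a 1D convolution (floor formula)."""
    return (length + 2 * padding - kernel_size) // stride + 1

length = 256
length = conv_out_len(length, stride=1)   # stem: full resolution preserved
lengths = [length]
for _ in (32, 64, 128, 256, 512):         # hypothetical downsampling stages
    length = conv_out_len(length, stride=2)
    lengths.append(length)
# lengths == [256, 128, 64, 32, 16, 8]
```

The stem sees the signal before any resolution is lost, which is why it helps retain fine structure in the embedding.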
- forward(x: Tensor, return_features: bool = False)¶
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- vipr_reflectometry.flow_models.load_model.models.networks.flow_networks.selu_init(m)¶
SELU weight initialization for stable training with SELU activation.
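SELU networks are conventionally paired with LeCun-normal weight initialization (std = 1/sqrt(fan_in)), which keeps activations self-normalizing; this is presumably the scheme the helper applies. A hedged numpy sketch of that initialization:

```python
import numpy as np

def lecun_normal(fan_in, fan_out, rng):
    """LeCun-normal initialization: weights ~ N(0, 1/fan_in), the
    scheme that preserves self-normalization under SELU activations."""
    return rng.standard_normal((fan_in, fan_out)) / np.sqrt(fan_in)

rng = np.random.default_rng(0)
w = lecun_normal(fan_in=4096, fan_out=256, rng=rng)
# empirical std of w is close to 1/sqrt(4096) = 0.015625
```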
Module contents¶
- class vipr_reflectometry.flow_models.load_model.models.networks.NSF(features: int = 2, context: int = 8, transforms: int = 512, hidden_features: tuple = (64, 64))¶
Bases: Module
Creates a neural spline flow (NSF) with monotonic rational-quadratic spline transformations.
By default, transformations are fully autoregressive. Coupling transformations can be obtained by setting passes=2.
Warning
Spline transformations are defined over the domain \([-5, 5]\). Any feature outside of this domain is not transformed. It is recommended to standardize features (zero mean, unit variance) before training.
See also
zuko.transforms.MonotonicRQSTransform
References
Neural Spline Flows (Durkan et al., 2019)
- Parameters:
features – The number of features.
context – The number of context features.
transforms – The number of autoregressive transformations.
hidden_features – The numbers of hidden features.
- build_nsf()¶
Builds the neural spline flow (NSF).
- Returns:
The constructed flow.
- forward(x, l)¶
Forward pass for the NSF.
- Parameters:
x (torch.Tensor) – Input tensor.
l (torch.Tensor) – Conditional input tensor.
- Returns:
Loss value.
- Return type:
torch.Tensor
- rsample(N_samples, l)¶
Differentiable sampling of N points for each example in the batch.
- class vipr_reflectometry.flow_models.load_model.models.networks.NSFWithConvEmb(in_channels: int = 1, hidden_channels: tuple | list = (32, 64, 128, 256, 512), dim_embedding: int = 128, dim_avpool: int = 1, embedding_net_activation: str = 'gelu', use_batch_norm: bool = False, dim_out: int = 8, hidden_features: tuple = (64, 64), transforms: int = 4, use_selu_init: bool = False, pretrained_embedding_net: str | None = None, use_embedding_net: bool = True)¶
Bases: Module
NSF with a convolutional embedding network.
Used by mc1_nsf configuration.
- forward(y, x)¶
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- rsample(curves, q_values=None, num_samples=1)¶
Differentiable sampling for training with a forward loss.
- sample(curves, q_values=None, num_samples=1)¶
- class vipr_reflectometry.flow_models.load_model.models.networks.NSFWithStemConvEmb(in_channels: int = 1, hidden_channels: tuple | list = (32, 64, 128, 256), dim_embedding: int = 24, dim_avpool: int = 32, embedding_net_activation: str = 'gelu', use_batch_norm: bool = False, dim_out: int = 8, hidden_features: tuple = (128, 128), transforms: int = 8, kernel_size: int = 3, use_se: bool = False, use_selu_init: bool = False, pretrained_embedding_net: str | None = None, use_embedding_net: bool = True, **kwargs)¶
Bases: Module
NSF conditioned on a pre-trained, deterministic StemConvEncoderVAE. Uses the encoder's mean (mu) as context.
This implementation follows the architecture from reflectorch but is self-contained within the VIPR plugin.
- forward(y, x)¶
Forward pass; x is the curve input.
- rsample(curves, q_values=None, num_samples=1)¶
Differentiable sampling.
- sample(curves, q_values=None, num_samples=1)¶
Sample from the flow.