minerva.models.ssl.simclr
Classes
SimCLR | Base class for all neural network modules.
Module Contents
- class minerva.models.ssl.simclr.SimCLR(backbone, projection_head, flatten=True, temperature=0.5, lr=0.001)[source]
Bases: lightning.LightningModule
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and will also have their parameters converted when you call to(), etc.

Note
As per the example above, an __init__() call to the parent class must be made before assignment on the child.

- Variables:
training (bool) – Boolean representing whether this module is in training or evaluation mode.
Initializes the SimCLR model; a construction sketch follows the parameter list below.
Parameters
- backbone : torch.nn.Module
Backbone model for feature extraction.
- projection_head : torch.nn.Module
Projection head model.
- flatten : bool, optional, default=True
Whether to flatten the output of the backbone model before passing it to the projection head.
- temperature : float, optional, default=0.5
Temperature for the NT-Xent loss.
- lr : float, optional, default=1e-3
Learning rate for the optimizer.
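A minimal construction sketch, shown for illustration only: the ResNet-18 backbone, the 512→128 MLP projection head, and the hyperparameter values below are assumptions, not choices prescribed by the class.

    import torch.nn as nn
    from torchvision.models import resnet18
    from minerva.models.ssl.simclr import SimCLR

    # Backbone: ResNet-18 with its classification head removed (illustrative choice).
    resnet = resnet18()
    backbone = nn.Sequential(*list(resnet.children())[:-1])  # outputs (batch, 512, 1, 1)

    # Projection head: small MLP mapping 512-d features to a 128-d embedding
    # (the dimensions are assumptions, not fixed by the class).
    projection_head = nn.Sequential(
        nn.Linear(512, 512),
        nn.ReLU(),
        nn.Linear(512, 128),
    )

    model = SimCLR(backbone, projection_head, flatten=True, temperature=0.5, lr=1e-3)

With flatten=True, the backbone output is expected to be flattened (e.g. (batch, 512, 1, 1) → (batch, 512)) before it reaches the projection head.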
- _single_step(batch)[source]
Performs a single forward and loss computation step.
Parameters
- batch : Tuple[Tuple[torch.Tensor, torch.Tensor], Any]
Input batch containing images and optional labels.
Returns
- torch.Tensor
Computed loss for the batch.
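For context on the temperature parameter, here is a self-contained sketch of the standard NT-Xent (normalized temperature-scaled cross-entropy) loss used by SimCLR; the exact implementation behind _single_step may differ in detail. A higher temperature softens the similarity distribution; 0.5 is the class default.

    import torch
    import torch.nn.functional as F

    def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
        """Standard NT-Xent loss over two batches of projected views (sketch)."""
        n = z1.size(0)
        z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit-norm rows
        sim = (z @ z.t()) / temperature                       # (2N, 2N) scaled cosine similarities
        # Remove self-similarity so each row only competes against the other 2N - 1 samples.
        mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(mask, float("-inf"))
        # The positive for row i is the other augmented view of the same image.
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
        return F.cross_entropy(sim, targets)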
- backbone
- configure_optimizers()[source]
Configures the optimizer for training.
Returns
- torch.optim.Optimizer
Optimizer instance.
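The reference above only specifies the return type; a minimal sketch of what such a method typically looks like, assuming a plain Adam optimizer over all parameters (the optimizer actually used by the class is an implementation detail):

    import torch

    def configure_optimizers(self):
        # One optimizer over backbone + projection head, using the stored learning rate.
        return torch.optim.Adam(self.parameters(), lr=self.lr)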
- flatten = True
- forward(x)[source]
Forward pass through the SimCLR model.
Parameters
- x : Tuple[torch.Tensor, torch.Tensor]
Pair of input tensors (the two augmented views), each with shape (batch_size, input_dim).
Returns
- Tensor
Output tensor of projected features with shape (batch_size, output_dim).
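A call sketch, assuming the tuple holds the two augmented views of a batch; the input shapes are illustrative and model is the instance built in the constructor example above.

    import torch

    view_1 = torch.randn(32, 3, 224, 224)   # first augmented view of the batch
    view_2 = torch.randn(32, 3, 224, 224)   # second augmented view of the same batch
    projections = model((view_1, view_2))   # projected features used by the NT-Xent loss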
- loss
- lr = 0.001
- predict_step(batch, batch_idx, dataloader_idx=None)[source]
Predict step.
Parameters
- batch : Tuple[Tuple[torch.Tensor, torch.Tensor], Any]
Input batch containing images and optional labels.
- batch_idx : int
Index of the current batch.
- dataloader_idx : Optional[int], optional
Index of the dataloader, by default None.
Returns
- Tensor
Computed loss for the batch.
- projector
- training_step(batch, batch_idx)[source]
Training step.
Parameters
- batch : Tuple[Tuple[torch.Tensor, torch.Tensor], Any]
Input batch containing images and optional labels.
- batch_idx : int
Index of the current batch.
Returns
- torch.Tensor
Computed loss for the batch.
- validation_step(batch, batch_idx)[source]
Validation step.
Parameters
- batch : Tuple[Tuple[torch.Tensor, torch.Tensor], Any]
Input batch containing images and optional labels.
- batch_idx : int
Index of the current batch.
Returns
- torch.Tensor
Computed loss for the batch.
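Since SimCLR is a lightning.LightningModule, training runs through the standard Lightning loop. A hedged end-to-end sketch: the toy dataset below only illustrates the ((view_1, view_2), label) batch structure documented above, and model is the instance from the constructor example.

    import torch
    import lightning as L
    from torch.utils.data import DataLoader, Dataset

    class PairDataset(Dataset):
        """Toy dataset yielding ((view_1, view_2), label) items (random tensors for illustration)."""
        def __len__(self):
            return 512

        def __getitem__(self, idx):
            # In practice the two views are random augmentations of the same image.
            return (torch.randn(3, 224, 224), torch.randn(3, 224, 224)), 0

    train_loader = DataLoader(PairDataset(), batch_size=64, shuffle=True)

    trainer = L.Trainer(max_epochs=1, accelerator="auto")
    trainer.fit(model, train_dataloaders=train_loader)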