minerva.losses.batchwise_barlowtwins_loss
=========================================

.. py:module:: minerva.losses.batchwise_barlowtwins_loss


Classes
-------

.. autoapisummary::

   minerva.losses.batchwise_barlowtwins_loss.BatchWiseBarlowTwinLoss


Module Contents
---------------

.. py:class:: BatchWiseBarlowTwinLoss(diag_lambda=0.01, normalize=False)

   Bases: :py:obj:`torch.nn.modules.loss._Loss`

   Implementation of the batch-wise Barlow Twins loss function
   (https://arxiv.org/abs/2310.07756).

   Initialize the BatchWiseBarlowTwinLoss class.

   Parameters
   ----------
   diag_lambda : float
       The value of the diagonal lambda parameter. By default, 0.01.
   normalize : bool
       Whether to normalize the loss or not. By default, False.


   .. py:method:: bt_loss_bs(p, z, lambd=0.01, normalize=False)

      Compute the batch-wise Barlow Twins loss between the two input
      tensors ``p`` and ``z``.


   .. py:attribute:: diag_lambda
      :value: 0.01


   .. py:method:: forward(prediction_data, projection_data)

      Calculate the loss between the prediction and projection data. This
      implementation uses a batch-wise version of the Barlow Twins loss
      function.

      Parameters
      ----------
      prediction_data : torch.Tensor
          The prediction data.
      projection_data : torch.Tensor
          The projection data.


   .. py:attribute:: normalize
      :value: False


   .. py:method:: off_diagonal(x)

      Return the off-diagonal elements of a square matrix ``x``.
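The method names ``bt_loss_bs`` and ``off_diagonal`` come from the class above; the bodies below are a minimal sketch of how a batch-wise Barlow Twins loss can be computed, assuming the cross-correlation matrix is built over the batch dimension (an ``N x N`` matrix) rather than over the feature dimension as in the original Barlow Twins. The exact normalization used by Minerva may differ.

.. code-block:: python

   import torch


   def off_diagonal(x: torch.Tensor) -> torch.Tensor:
       # Return a flattened view of the off-diagonal elements of a square matrix.
       n, m = x.shape
       assert n == m, "off_diagonal expects a square matrix"
       return x.flatten()[:-1].view(n - 1, n + 1)[:, 1:].flatten()


   def bt_loss_bs(p: torch.Tensor, z: torch.Tensor,
                  lambd: float = 0.01, normalize: bool = False) -> torch.Tensor:
       # Hypothetical sketch: standardize each sample's features, then build a
       # batch-wise (N x N) cross-correlation matrix between the two views.
       p = (p - p.mean(dim=1, keepdim=True)) / (p.std(dim=1, keepdim=True) + 1e-9)
       z = (z - z.mean(dim=1, keepdim=True)) / (z.std(dim=1, keepdim=True) + 1e-9)
       n, d = p.shape
       c = (p @ z.T) / d  # (N, N) cross-correlation over the batch dimension

       # Push diagonal entries toward 1 (matched samples agree) and
       # off-diagonal entries toward 0, weighted by lambd.
       on_diag = (torch.diagonal(c) - 1).pow(2).sum()
       off_diag = off_diagonal(c).pow(2).sum()
       loss = on_diag + lambd * off_diag
       if normalize:
           loss = loss / n  # optional scaling by batch size
       return loss


   # Usage: two batches of projections, shape (batch, features).
   p = torch.randn(8, 16)
   z = torch.randn(8, 16)
   print(bt_loss_bs(p, z, lambd=0.01).item())

In ``forward``, ``prediction_data`` and ``projection_data`` would play the roles of ``p`` and ``z``, with ``diag_lambda`` and ``normalize`` forwarded from the constructor.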