dasf.ml.dl.models
Submodules
Classes
TorchPatchDeConvNet | Base class for all neural network modules.
TorchPatchDeConvNetSkip | Base class for all neural network modules.
TorchSectionDeConvNet | Base class for all neural network modules.
TorchSectionDeConvNetSkip | Base class for all neural network modules.
Package Contents
- class dasf.ml.dl.models.TorchPatchDeConvNet(n_classes=4, learned_billinear=False, clip=0.1, class_weights=None)[source]
Bases: NNModule
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.

Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables:
    training (bool) – Boolean representing whether this module is in training or evaluation mode.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

A short usage sketch follows the attribute list below.
- unpool
- conv_block1
- conv_block2
- conv_block3
- conv_block4
- conv_block5
- conv_block6
- conv_block7
- deconv_block8
- unpool_block9
- deconv_block10
- unpool_block11
- deconv_block12
- unpool_block13
- deconv_block14
- unpool_block15
- deconv_block16
- unpool_block17
- deconv_block18
- seg_score19
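The class name and constructor arguments in this sketch come straight from the signature above; everything else relies only on the inherited nn.Module behaviour described in the docstring (registered submodules, the training flag). It instantiates the model with its documented defaults and lists the registered child blocks; it makes no assumption about input shapes or the output of forward().

    # Minimal sketch: instantiate with the documented defaults and inspect the
    # registered child modules listed above.
    from dasf.ml.dl.models import TorchPatchDeConvNet

    model = TorchPatchDeConvNet(n_classes=4, learned_billinear=False, clip=0.1)

    # named_children() is inherited nn.Module API; each attribute listed above
    # (unpool, conv_block1, ..., seg_score19) should appear here as a
    # registered child, assuming each is assigned as a module.
    for name, child in model.named_children():
        print(name, type(child).__name__)

    print(model.training)  # True by default; flipped by model.eval()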
- class dasf.ml.dl.models.TorchPatchDeConvNetSkip(n_classes=4, learned_billinear=False, clip=0.1, class_weights=None)[source]
Bases: NNModule
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.

Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables:
    training (bool) – Boolean representing whether this module is in training or evaluation mode.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

A short usage sketch follows the attribute list below.
- unpool
- conv_block1
- conv_block2
- conv_block3
- conv_block4
- conv_block5
- conv_block6
- conv_block7
- deconv_block8
- unpool_block9
- deconv_block10
- unpool_block11
- deconv_block12
- unpool_block13
- deconv_block14
- unpool_block15
- deconv_block16
- unpool_block17
- deconv_block18
- seg_score19
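As with the non-skip variant, this is only a sketch built on the inherited nn.Module API; the class name and constructor signature are taken from above, and no assumption is made about input or output shapes. It counts the trainable parameters of the skip-connection variant.

    # Minimal sketch: count trainable parameters via the inherited nn.Module API.
    from dasf.ml.dl.models import TorchPatchDeConvNetSkip

    model = TorchPatchDeConvNetSkip(n_classes=4)

    n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable parameters: {n_params:,}")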
- class dasf.ml.dl.models.TorchSectionDeConvNet(n_classes=4, learned_billinear=False, clip=0.1, class_weights=False)[source]
Bases: NNModule
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.

Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables:
    training (bool) – Boolean representing whether this module is in training or evaluation mode.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

A short usage sketch follows the attribute list below.
- unpool
- conv_block1
- conv_block2
- conv_block3
- conv_block4
- conv_block5
- conv_block6
- conv_block7
- deconv_block8
- unpool_block9
- deconv_block10
- unpool_block11
- deconv_block12
- unpool_block13
- deconv_block14
- unpool_block15
- deconv_block16
- unpool_block17
- deconv_block18
- seg_score19
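A sketch of the to() behaviour mentioned in the docstring above: moving the parent module also converts the parameters of every registered submodule. The only extra assumption is that conv_block1 contains parameterised layers (convolutions), which its name suggests but this page does not state.

    # Minimal sketch: to() converts the parameters of nested submodules as well.
    import torch

    from dasf.ml.dl.models import TorchSectionDeConvNet

    model = TorchSectionDeConvNet(n_classes=4)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)

    # Parameters of nested blocks now live on `device` (assumes conv_block1
    # holds parameterised layers such as convolutions).
    print(next(model.conv_block1.parameters()).device)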
- class dasf.ml.dl.models.TorchSectionDeConvNetSkip(n_classes=4, learned_billinear=False, clip=0.1, class_weights=None)[source]
Bases: NNModule
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.

Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.

Variables:
    training (bool) – Boolean representing whether this module is in training or evaluation mode.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

A short usage sketch follows the attribute list below.
- unpool
- conv_block1
- conv_block2
- conv_block3
- conv_block4
- conv_block5
- conv_block6
- conv_block7
- deconv_block8
- unpool_block9
- deconv_block10
- unpool_block11
- deconv_block12
- unpool_block13
- deconv_block14
- unpool_block15
- deconv_block16
- unpool_block17
- deconv_block18
- seg_score19
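Finally, a sketch of the training variable documented above, again relying only on inherited nn.Module behaviour: train() and eval() flip the flag on the module and, recursively, on its registered submodules.

    # Minimal sketch: the `training` flag documented above, toggled via
    # train()/eval() and propagated to child modules.
    from dasf.ml.dl.models import TorchSectionDeConvNetSkip

    model = TorchSectionDeConvNetSkip(n_classes=4, learned_billinear=False)

    model.train()
    print(model.training)              # True
    print(model.conv_block1.training)  # True (propagated to children)

    model.eval()
    print(model.training)              # False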