| Safe Haskell | Safe-Inferred |
| --- | --- |
| Language | Haskell2010 |
Synopsis
- data ForNonLinearity
- data FanMode
- errorPrefix :: String
- calculateGain :: ForNonLinearity -> Float
- calculateFan :: [Dim String Integer] -> (Integer, Integer)
- sXavierUniform :: forall gradient layout device dataType shape gain generatorDevice m. (Num gain, Floating gain, Scalar gain, MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> gain -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice))
- sXavierNormal :: forall gradient layout device dataType shape gain generatorDevice m. (Num gain, Floating gain, Scalar gain, MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> gain -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice))
- getter :: forall a. FanMode -> (a, a) -> a
- sKaimingUniform :: forall gradient layout device dataType shape generatorDevice m. (MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> FanMode -> ForNonLinearity -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice))
- sKaimingNormal :: forall gradient layout device dataType shape generatorDevice m. (MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> FanMode -> ForNonLinearity -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice))
Documentation
data ForNonLinearity Source #
Note: the identity nonlinearity corresponds to a linear layer without an activation.
Instances
data FanMode Source #

Instances
- Generic FanMode Source #
- Show FanMode Source #
- Eq FanMode Source #
- Ord FanMode Source # (Defined in Torch.GraduallyTyped.NN.Initialization)
- type Rep FanMode Source # (Defined in Torch.GraduallyTyped.NN.Initialization)
errorPrefix :: String Source #
calculateGain :: ForNonLinearity -> Float Source #
Gain scaling value for He (Kaiming) initialization, determined by the chosen nonlinearity.
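A minimal sketch of the gains usually associated with He initialization, assuming calculateGain follows the standard convention; the helpers below are illustrative and not part of this module:

```haskell
-- Sketch only: standard He-initialization gains, assuming calculateGain
-- follows the usual convention. These helpers are illustrative, not the
-- module's API.
reluGain :: Float
reluGain = sqrt 2  -- gain for ReLU

leakyReluGain :: Float -> Float
leakyReluGain negativeSlope =
  sqrt (2 / (1 + negativeSlope * negativeSlope))  -- gain for leaky ReLU
```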
calculateFan :: [Dim String Integer] -> (Integer, Integer) Source #
Fan-in / fan-out calculation from the tensor's dimensions, used for initialization scaling
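A minimal sketch of the common fan-in / fan-out convention, assuming calculateFan follows the usual rule for weights shaped [outputFeatures, inputFeatures, kernelDims...]; this is an illustration, not the module's implementation:

```haskell
-- Sketch only: the common fan-in/fan-out rule. For a weight of shape
-- [out, in, k1, k2, ...], fanIn = in * k1 * k2 * ... and
-- fanOut = out * k1 * k2 * ...
fanSketch :: [Integer] -> (Integer, Integer)
fanSketch (outFeatures : inFeatures : kernelDims) =
  let receptiveFieldSize = product kernelDims
   in (inFeatures * receptiveFieldSize, outFeatures * receptiveFieldSize)
fanSketch _ = error "fanSketch: expected at least two dimensions"
```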
sXavierUniform :: forall gradient layout device dataType shape gain generatorDevice m. (Num gain, Floating gain, Scalar gain, MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> gain -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice)) Source #
Xavier uniform initialization
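Assuming the standard Xavier (Glorot) uniform scheme, values are drawn uniformly from [-bound, bound], with the bound computed from the gain and the fans; a sketch of that bound (illustrative helper, not the module's API):

```haskell
-- Sketch only: the standard Xavier uniform bound,
-- bound = gain * sqrt (6 / (fanIn + fanOut)).
xavierUniformBound :: Float -> (Integer, Integer) -> Float
xavierUniformBound gain (fanIn, fanOut) =
  gain * sqrt (6 / fromIntegral (fanIn + fanOut))
```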
sXavierNormal :: forall gradient layout device dataType shape gain generatorDevice m. (Num gain, Floating gain, Scalar gain, MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> gain -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice)) Source #
Xavier normal initialization
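Assuming the standard Xavier (Glorot) normal scheme, values are drawn from a zero-mean normal distribution whose standard deviation depends on the gain and the fans; a sketch (illustrative helper, not the module's API):

```haskell
-- Sketch only: the standard Xavier normal standard deviation,
-- std = gain * sqrt (2 / (fanIn + fanOut)).
xavierNormalStd :: Float -> (Integer, Integer) -> Float
xavierNormalStd gain (fanIn, fanOut) =
  gain * sqrt (2 / fromIntegral (fanIn + fanOut))
```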
getter :: forall a. FanMode -> (a, a) -> a Source #
Get the fan-in or fan-out value depending on the selected fan mode; used by the Kaiming initializations
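A plausible reading of getter, sketched with a local stand-in type (assumption: FanMode selects between the fan-in and fan-out components of the pair returned by calculateFan):

```haskell
-- Sketch only: a stand-in for FanMode and the selection it performs.
data FanModeSketch = FanInSketch | FanOutSketch

getterSketch :: FanModeSketch -> (a, a) -> a
getterSketch FanInSketch  = fst  -- pick the fan-in component
getterSketch FanOutSketch = snd  -- pick the fan-out component
```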
sKaimingUniform :: forall gradient layout device dataType shape generatorDevice m. (MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> FanMode -> ForNonLinearity -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice)) Source #
Kaiming uniform initialization
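Assuming the standard Kaiming (He) uniform scheme, values are drawn uniformly from [-bound, bound], where the fan is chosen by the FanMode and the gain by calculateGain; a sketch of the bound (illustrative helper, not the module's API):

```haskell
-- Sketch only: the standard Kaiming uniform bound,
-- bound = gain * sqrt (3 / fan).
kaimingUniformBound :: Float -> Integer -> Float
kaimingUniformBound gain fan = gain * sqrt (3 / fromIntegral fan)
```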
sKaimingNormal :: forall gradient layout device dataType shape generatorDevice m. (MonadThrow m, SGetGeneratorDevice generatorDevice) => TensorSpec gradient layout device dataType shape -> FanMode -> ForNonLinearity -> Generator generatorDevice -> m (Tensor gradient layout (device <+> generatorDevice) dataType shape, Generator (device <+> generatorDevice)) Source #
Kaiming normal initialization
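Assuming the standard Kaiming (He) normal scheme, values are drawn from a zero-mean normal distribution with a standard deviation derived from the gain and the selected fan; a sketch (illustrative helper, not the module's API):

```haskell
-- Sketch only: the standard Kaiming normal standard deviation,
-- std = gain / sqrt fan.
kaimingNormalStd :: Float -> Integer -> Float
kaimingNormalStd gain fan = gain / sqrt (fromIntegral fan)
```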