hasktorch-gradually-typed-0.2.0.0: experimental project for hasktorch
Safe Haskell: Safe-Inferred
Language: Haskell2010

Torch.GraduallyTyped.Tensor.MathOperations.Pointwise

Synopsis
  • abs :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • absolute :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • acos :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • acosh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • add :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • addScalar :: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) => Tensor gradient layout device dataType shape -> other -> m (Tensor gradient layout device dataType shape)
  • addcdiv :: forall value gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' m. (Scalar value, MonadThrow m) => value -> Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> Tensor gradient'' layout'' device'' dataType'' shape'' -> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) (shape <+> (shape' <+> shape'')))
  • addcmul :: forall scalar gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' m. (Scalar scalar, MonadThrow m) => scalar -> Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> Tensor gradient'' layout'' device'' dataType'' shape'' -> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) (shape <+> (shape' <+> shape'')))
  • asin :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • asinh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • atan :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • atanh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • atan2 :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • bitwiseNot :: forall gradient layout device dataType shape m. MonadThrow m => Tensor gradient layout device dataType shape -> m (Tensor gradient layout device ('DataType 'Bool) shape)
  • bitwiseAnd :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))
  • bitwiseAndScalar :: forall other gradient layout device dataType shape. Scalar other => Tensor gradient layout device dataType shape -> other -> Tensor gradient layout device dataType shape
  • bitwiseOr :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))
  • bitwiseOrScalar :: forall other gradient layout device dataType shape. Scalar other => Tensor gradient layout device dataType shape -> other -> Tensor gradient layout device dataType shape
  • bitwiseXor :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))
  • bitwiseXorScalar :: forall other gradient layout device dataType shape. Scalar other => Tensor gradient layout device dataType shape -> other -> Tensor gradient layout device dataType shape
  • ceil :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • clamp :: forall min max gradient layout device dataType shape. (Scalar min, Scalar max) => min -> max -> Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • cos :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • cosh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • deg2rad :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • div :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • divScalar :: forall divisor gradient layout device dataType shape. Scalar divisor => Tensor gradient layout device dataType shape -> divisor -> Tensor gradient layout device dataType shape
  • digamma :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • erf :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • erfc :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • erfinv :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • exp :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • expm1 :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • floor :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • floorDivide :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • floorDivideScalar :: forall divisor gradient layout device dataType shape. Scalar divisor => Tensor gradient layout device dataType shape -> divisor -> Tensor gradient layout device dataType shape
  • fmod :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • fmodScalar :: forall divisor gradient layout device dataType shape. Scalar divisor => divisor -> Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • frac :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • lerp :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' shape''' m. (MonadThrow m, shape''' ~ BroadcastShapesF shape (BroadcastShapesF shape' shape''), Catch shape''') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> Tensor gradient'' layout'' device'' dataType'' shape'' -> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) shape''')
  • lerpScalar :: forall weight gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (Scalar weight, MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => weight -> Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • lgamma :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • log :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • log10 :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • log1p :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • log2 :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • logaddexp :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • logaddexp2 :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • logicalAnd :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')
  • logicalNot :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor ('Gradient 'WithoutGradient) layout device ('DataType 'Bool) shape
  • logicalOr :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')
  • logicalXor :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')
  • mul :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • mulScalar :: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) => Tensor gradient layout device dataType shape -> other -> m (Tensor gradient layout device dataType shape)
  • mvlgamma :: forall gradient layout device dataType shape m. MonadThrow m => Int -> Tensor gradient layout device dataType shape -> m (Tensor gradient layout device dataType shape)
  • neg :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • polygamma :: forall gradient layout device dataType shape. Int -> Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • pow :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient' layout' device' dataType' shape' -> Tensor gradient layout device dataType shape -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • powScalar :: forall exponent gradient layout device dataType shape m. (Scalar exponent, MonadThrow m) => Tensor gradient layout device dataType shape -> exponent -> m (Tensor gradient layout device dataType shape)
  • powTensor :: forall input gradient layout device dataType shape. Scalar input => input -> Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • rad2deg :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • reciprocal :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • remainder :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • round :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • rsqrt :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • sigmoid :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • sign :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • sin :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • sinh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • sub :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • subScalar :: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) => Tensor gradient layout device dataType shape -> other -> m (Tensor gradient layout device dataType shape)
  • sqrt :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • square :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • tan :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • tanh :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape
  • trueDivide :: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') => Tensor gradient layout device dataType shape -> Tensor gradient' layout' device' dataType' shape' -> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')
  • trueDivideScalar :: forall other gradient layout device dataType shape. Scalar other => Tensor gradient layout device dataType shape -> other -> Tensor gradient layout device dataType shape
  • trunc :: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape -> Tensor gradient layout device dataType shape

Documentation

>>> import Torch.GraduallyTyped.Prelude.List (SList (..))
>>> import Torch.GraduallyTyped

abs Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the element-wise absolute value of the given input tensor: \[ \mathrm{output}_i = \left|\mathrm{input}_i\right|. \] The result is returned as a new tensor.

absolute Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Alias for abs.

acos Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the arccosine of the elements of input: \[ \mathrm{output}_i = \cos^{-1} \left(\mathrm{input}_i\right). \]

acosh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the inverse hyperbolic cosine of the elements of input: \[ \mathrm{output}_i = \cosh^{-1} \left(\mathrm{input}_i\right). \]

Note that the domain of the inverse hyperbolic cosine is \([1, \infty)\), and values outside this range will be mapped to \(\mathrm{NaN}\), except for \(+\infty\) for which the output is mapped to \(+\infty\).

add Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Element-wise addition of one tensor and another: \[ \mathrm{output}_i = \mathrm{input}_i + \mathrm{other}_i. \] The result is returned as a new tensor.

The shape of other must be broadcastable with the shape of input. See addScalar for a version of this function where the other input is a scalar.

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (b, _) <- sRandn' (SShape $ SName @"*" :&: SSize @4 :|: SName @"*" :&: SSize @1 :|: SNil) g'
>>> result <- a `add` b
>>> :type result
result
  :: Tensor
       ('Gradient 'WithGradient)
       ('Layout 'Dense)
       ('Device 'CPU)
       ('DataType 'Float)
       ('Shape
          '[ 'Dim ('Name "*") ('Size 4), 'Dim ('Name "feature") ('Size 4)])

addScalar Source #

Arguments

:: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) 
=> Tensor gradient layout device dataType shape

input tensor

-> other

input scalar

-> m (Tensor gradient layout device dataType shape)

output

Adds a scalar other to a tensor input: \[ \mathrm{output}_i = \mathrm{input}_i + \mathrm{other}. \] The result is returned as a new tensor. See add for a version of this function where the second argument is a tensor.

TODO: add data type unification of other and dataType.
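
For illustration, a minimal usage sketch in the style of the add example above (it is not one of the library's doctests); it reuses the sMkGenerator, sRandn, and TensorSpec helpers shown there and assumes that Float has a Scalar instance:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> result <- a `addScalar` (1 :: Float)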

addcdiv Source #

Arguments

:: forall value gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' m. (Scalar value, MonadThrow m) 
=> value

input scalar

-> Tensor gradient layout device dataType shape

first other tensor

-> Tensor gradient' layout' device' dataType' shape'

second other tensor

-> Tensor gradient'' layout'' device'' dataType'' shape''

input tensor

-> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) (shape <+> (shape' <+> shape'')))

output tensor

Performs the element-wise division of tensor1 by tensor2, multiplies the result by the scalar value, and adds it to input: \[ \mathrm{output}_i = \mathrm{input}_i + \mathrm{value} \times \frac{\mathrm{tensor1}_i}{\mathrm{tensor2}_i}. \]

See addcmul for a version of this function where tensor1 and tensor2 are multiplied rather than divided.

Note further that for inputs with a floating-point data type (such as Float or Double), value must be a real number; otherwise it must be an integer.

addcmul Source #

Arguments

:: forall scalar gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' m. (Scalar scalar, MonadThrow m) 
=> scalar

input scalar

-> Tensor gradient layout device dataType shape

first other tensor

-> Tensor gradient' layout' device' dataType' shape'

second other tensor

-> Tensor gradient'' layout'' device'' dataType'' shape''

input tensor

-> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) (shape <+> (shape' <+> shape'')))

output

Performs the element-wise multiplication of tensor1 by tensor2, multiplies the result by the scalar value, and adds it to input: \[ \mathrm{output}_i = \mathrm{input}_i + \mathrm{value} \times \mathrm{tensor1}_i \times \mathrm{tensor2}_i. \]

See addcdiv for a version of this function where tensor1 and tensor2 are divided rather than multiplied.

Note further that for inputs with a floating-point data type (such as Float or Double), value must be a real number; otherwise it must be an integer.

asin Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the arcsine of the elements of input: \[ \mathrm{output}_i = \sin^{-1} \left(\mathrm{input}_i\right). \]

asinh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the inverse hyperbolic sine of the elements of input: \[ \mathrm{output}_i = \sinh^{-1} \left(\mathrm{input}_i\right). \]

atan Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the arctangent of the elements of input: \[ \mathrm{output}_i = \tan^{-1} \left(\mathrm{input}_i\right). \]

atanh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the inverse hyperbolic tangent of the elements of input: \[ \mathrm{output}_i = \tanh^{-1} \left(\mathrm{input}_i\right). \]

Note that the domain of the inverse hyperbolic tangent is \((-1, 1)\), and values outside this range will be mapped to \(\mathrm{NaN}\), except for the values \(1\) and \(-1\) for which the output is mapped to \(\pm \infty\) respectively.

atan2 Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other input tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Element-wise arctangent of input and other with consideration of the quadrant. Returns a new tensor where each element is the signed angle in radians between the vector \((\mathrm{other}_i, \mathrm{input}_i)\) and the vector \((1, 0)\). Here \(\mathrm{other}_i\), the \(i\)-th element of the second argument of this function, is the x coordinate, while \(\mathrm{input}_i\), the \(i\)-th element of the first argument, is the y coordinate.

Note that the shapes of input and other must be broadcastable.

bitwiseNot Source #

Arguments

:: forall gradient layout device dataType shape m. MonadThrow m 
=> Tensor gradient layout device dataType shape

input

-> m (Tensor gradient layout device ('DataType 'Bool) shape)

output

Computes the bitwise NOT of the given input tensor. The data type of the input tensor must be Bool or an integral data type. For Bool tensors, the function computes the logical NOT.

bitwiseAnd Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))

output tensor

Computes the bitwise AND of the input and the other tensor. The data type of the tensors must be Bool or an integral data type. For Bool tensors, the function computes the logical AND.

See bitwiseAndScalar for a version of this function where other is a scalar.

bitwiseAndScalar Source #

Arguments

:: forall other gradient layout device dataType shape. Scalar other 
=> Tensor gradient layout device dataType shape

input tensor

-> other

other scalar

-> Tensor gradient layout device dataType shape

output

Computes the bitwise AND of the tensor input and the scalar other. The data type of the inputs must be Bool or an integral data type. If the data type is Bool, then the function computes the logical AND.

See bitwiseAnd for a version of this function where other is a tensor.

bitwiseOr Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))

output tensor

Computes the bitwise OR of the input and the other tensor. The data type of the tensors must be Bool or an integral data type. For Bool tensors, the function computes the logical OR.

See bitwiseOrScalar for a version of this function where other is a scalar.

bitwiseOrScalar Source #

Arguments

:: forall other gradient layout device dataType shape. Scalar other 
=> Tensor gradient layout device dataType shape

input tensor

-> other

other scalar

-> Tensor gradient layout device dataType shape

output tensor

Computes the bitwise OR of the tensor input and the scalar other. The data type of the inputs must be Bool or an integral data type. If the data type is Bool, then the function computes the logical OR.

See bitwiseOr for a version of this function where other is a tensor.

bitwiseXor Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' m. MonadThrow m 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') (shape <+> shape'))

output tensor

Computes the bitwise XOR of the input and the other tensor. The data type of the tensors must be Bool or an integral data type. For Bool tensors, the function computes the logical XOR.

See bitwiseXorScalar for a version of this function where other is a scalar.

bitwiseXorScalar Source #

Arguments

:: forall other gradient layout device dataType shape. Scalar other 
=> Tensor gradient layout device dataType shape

input tensor

-> other

other scalar

-> Tensor gradient layout device dataType shape

output tensor

Computes the bitwise XOR of the tensor input and the scalar other. The data type of the inputs must be Bool or an integral data type. If the data type is Bool, then the function computes the logical XOR.

See bitwiseXor for a version of this function where other is a tensor.

ceil Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the ceil of the elements of input, that is, the smallest integer greater than or equal to each element: \[ \mathrm{output}_i = \lceil\mathrm{input}_i\rceil, \] which, for non-integral elements, equals \(\lfloor\mathrm{input}_i\rfloor + 1\), where \(\lfloor\mathrm{input}_i\rfloor\) is the floor of the \(i\)-th element of input and can be computed with floor.

clamp Source #

Arguments

:: forall min max gradient layout device dataType shape. (Scalar min, Scalar max) 
=> min

min

-> max

max

-> Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Clamps all elements of input into the range \([\mathrm{min}, \mathrm{max}]\) and returns the result as a new tensor.
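
As an illustrative sketch (not a library doctest), reusing the random-tensor setup from the add example above and assuming Float has a Scalar instance for both bounds:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> result = clamp (0 :: Float) (1 :: Float) a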

cos Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output
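
Returns a new tensor with the cosine of the elements of input: \[ \mathrm{output}_i = \cos \left(\mathrm{input}_i\right). \]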

cosh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output
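
Returns a new tensor with the hyperbolic cosine of the elements of input: \[ \mathrm{output}_i = \cosh \left(\mathrm{input}_i\right). \]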

deg2rad Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output
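
Returns a new tensor with each of the elements of input converted from angles in degrees to radians.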

div Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

tensor dividend

-> Tensor gradient' layout' device' dataType' shape'

tensor divisor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

tensor output

Element-wise division of the first input tensor, the dividend, by the second input tensor, the divisor: \[ \mathrm{output}_i = \frac{\mathrm{dividend}_i}{\mathrm{divisor}_i}. \] The result is returned as a new tensor.

See divScalar for a version of this function where the divisor is a scalar.

Note further that "true divisions" can be computed with trueDivide or trueDivideScalar which can come in handy when both the dividend and the divisor have DType or integer data types.

divScalar Source #

Arguments

:: forall divisor gradient layout device dataType shape. Scalar divisor 
=> Tensor gradient layout device dataType shape

tensor dividend

-> divisor

scalar divisor

-> Tensor gradient layout device dataType shape

tensor output

Element-wise division of the first input, the dividend tensor, by the second input, the divisor scalar: \[ \mathrm{output}_i = \frac{\mathrm{dividend}_i}{\mathrm{divisor}}. \] The result is returned as a new tensor.

See div for a version of this function where the divisor is a tensor.

Note further that "true divisions" can be computed with trueDivide or trueDivideScalar which can come in handy when both the dividend and the divisor have DType or integer data types.

digamma Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the logarithmic derivative of the gamma function on input: \[ \mathrm{output}_i = \psi\left(\mathrm{input}_i\right) = \frac{d}{d\mathrm{input}_i} \ln\left(\Gamma\left(\mathrm{input}_i\right)\right). \]

erf Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes and returns the error function of each element of input: \[ \mathrm{output}_i = \mathop{erf}\left(\mathrm{input}_i\right) = \frac{2}{\sqrt{\pi}} \int_0^{\mathrm{input}_i} \exp\left(- t^2\right) dt. \]

See also erfc that computes the complementary error function to high numerical accuracy and erfinv that computes the inverse of the error function.

erfc Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the complementary error function of each element of input: \[ \mathrm{output}_i = 1 - \mathop{erf}\left(\mathrm{input}_i\right) = 1 - \frac{2}{\sqrt{\pi}} \int_0^{\mathrm{input}_i} \exp\left(- t^2\right) dt. \]

See also erf that computes the error function and erfinv that computes the inverse of the error function.

erfinv Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the inverse error function of each element of input: \[ \mathrm{output}_i = \mathop{erfinv}\left(\mathrm{input}_i\right) \] where \(\mathop{erfinv}\left(\mathop{erf}\left(x\right)\right) = x\) for \(x \in (-1, 1)\). erfinv is not defined outside this interval.

See also erf that computes the error function and erfc that computes the complementary error function.

exp Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the exponential of the elements of the input tensor input: \[ \mathrm{output}_i = \exp\left(\mathrm{input}_i\right). \]

See also expm1 for a high-accuracy calculation of the exponential of the elements of input minus 1.

expm1 Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the exponential of the elements of input minus 1: \[ \mathrm{output}_i = \exp\left(\mathrm{input}_i\right) - 1. \]

See also exp for the exponential function.

floor Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the floor of the elements of input, that is, the largest integer less than or equal to each element: \[ \mathrm{output}_i = \lfloor\mathrm{input}_i\rfloor, \] which, for non-integral elements, equals \(\lceil\mathrm{input}_i\rceil - 1\), where \(\lceil\mathrm{input}_i\rceil\) is the ceil of the \(i\)-th element of input and can be computed with ceil.

floorDivide Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

dividend tensor

-> Tensor gradient' layout' device' dataType' shape'

divisor tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Return the element-wise division of the tensor dividend by the tensor divisor rounded down to the nearest integer: \[ \mathrm{output}_i = \left\lfloor\frac{\mathrm{dividend}_i}{\mathrm{divisor}_i}\right\rfloor. \]

See floorDivideScalar for a version of this function where divisor is a scalar.
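
An illustrative sketch in the style of the add example above (not one of the library's doctests); the shapes are broadcast just as there:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (dividend, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (divisor, _) <- sRandn' (SShape $ SName @"*" :&: SSize @4 :|: SName @"*" :&: SSize @1 :|: SNil) g'
>>> result <- dividend `floorDivide` divisor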

floorDivideScalar Source #

Arguments

:: forall divisor gradient layout device dataType shape. Scalar divisor 
=> Tensor gradient layout device dataType shape

dividend tensor

-> divisor

divisor scalar

-> Tensor gradient layout device dataType shape

output tensor

Return the division of the tensor dividend by the scalar divisor rounded down to the nearest integer: \[ \mathrm{output}_i = \left\lfloor\frac{\mathrm{dividend}_i}{\mathrm{divisor}}\right\rfloor. \]

See floorDivide for a version of this function where divisor is a tensor.

fmod Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

dividend tensor

-> Tensor gradient' layout' device' dataType' shape'

divisor tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Computes the element-wise remainder of the division of the tensor dividend by the tensor divisor. The dividend and divisor may contain both integer and floating-point numbers. The remainder has the same sign as the dividend.

See fmodScalar for a version of this function where divisor is a scalar.

fmodScalar Source #

Arguments

:: forall divisor gradient layout device dataType shape. Scalar divisor 
=> divisor

divisor scalar

-> Tensor gradient layout device dataType shape

dividend tensor

-> Tensor gradient layout device dataType shape

output tensor

Computes the element-wise remainder of the division of the tensor dividend by the scalar divisor. The dividend and divisor may contain both integer and floating-point numbers. The remainder has the same sign as the dividend.

See fmod for a version of this function where divisor is a tensor.

frac Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the fractional portion of each element in input: \[ \mathrm{output}_i = \mathrm{input}_i - \left\lfloor\left|\mathrm{input}_i\right|\right\rfloor \times \mathrm{sgn}\left(\mathrm{input}_i\right). \]

lerp Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' gradient'' layout'' device'' dataType'' shape'' shape''' m. (MonadThrow m, shape''' ~ BroadcastShapesF shape (BroadcastShapesF shape' shape''), Catch shape''') 
=> Tensor gradient layout device dataType shape

weight

-> Tensor gradient' layout' device' dataType' shape'

start

-> Tensor gradient'' layout'' device'' dataType'' shape''

end

-> m (Tensor (gradient <|> (gradient' <|> gradient'')) (layout <+> (layout' <+> layout'')) (device <+> (device' <+> device'')) (dataType <+> (dataType' <+> dataType'')) shape''')

output

Linear interpolation of two tensors, start and end, based on a tensor weight. For linear interpolations based on a scalar see lerpScalar.

Returned is the result of the following computation as a tensor: \[ \mathrm{output}_i = \mathrm{start}_i + \mathrm{weight}_i \times \left(\mathrm{end}_i - \mathrm{start}_i\right). \]

Note that the shapes of start, end, and weight must be broadcastable.

lerpScalar Source #

Arguments

:: forall weight gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (Scalar weight, MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> weight

weight

-> Tensor gradient layout device dataType shape

start

-> Tensor gradient' layout' device' dataType' shape'

end

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output

Linear interpolation of two tensors, start and end, based on a scalar weight. For linear interpolations based on a tensor see lerp.

Returned is the result of the following computation as a tensor: \[ \mathrm{output}_i = \mathrm{start}_i + \mathrm{weight} \times \left(\mathrm{end}_i - \mathrm{start}_i\right). \]

Note that the shapes of start and end must be broadcastable.
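
A minimal sketch (not a library doctest), reusing the setup from the add example above and assuming Float has a Scalar instance for the weight:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (x, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (y, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g'
>>> result <- lerpScalar (0.5 :: Float) x y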

lgamma Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Computes the logarithm of the gamma function on input: \[ \mathrm{output}_i = \log \Gamma\left(\mathrm{input}_i\right). \]

log Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the natural logarithm of the elements of input: \[ \mathrm{output}_i = \ln \left(\mathrm{input}_i\right) = \log_{\mathrm{e}} \left(\mathrm{input}_i\right). \]

log10 Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the decimal logarithm of the elements of input: \[ \mathrm{output}_i = \mathop{lg} \left(\mathrm{input}_i\right) = \log_{10} \left(\mathrm{input}_i\right). \]

log1p Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the natural logarithm of \(1 + \mathrm{input}\): \[ \mathrm{output}_i = \ln \left(1 + \mathrm{input}_i\right). \]

Consider using this function over a literal implementation using log. log1p is much more accurate for small values of input.

log2 Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the logarithm to the base 2 of the elements of input: \[ \mathrm{output}_i = \log_2 \left(\mathrm{input}_i\right). \]

logaddexp Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

other

-> Tensor gradient' layout' device' dataType' shape'

input

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output

Logarithm of the sum of exponentiations of the inputs. Calculates pointwise the function \(\log \left(\exp x + \exp y\right)\).

This function is useful in statistics where the calculated probabilities of events may be so small as to exceed the range of normal floating point numbers. In such cases the logarithm of the calculated probability is stored. This function allows adding probabilities stored in such a fashion.

logaddexp must not be confused with logsumexp which performs a reduction on a single tensor.

logaddexp2 Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

other

-> Tensor gradient' layout' device' dataType' shape'

input

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output

Logarithm of the sum of exponentiations of the inputs in base-2. Calculates pointwise the function \(\log_2 \left(2^x + 2^y\right)\).

See logaddexp for further details.

logicalAnd Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

the input tensor

-> Tensor gradient' layout' device' dataType' shape'

the tensor to compute AND with

-> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')

the output tensor

Computes the element-wise logical AND of the given input tensors. The output tensor will have the Bool data type. If the input tensors are not bool tensors, then zeros are treated as False and nonzeros are treated as True.
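
For illustration, a sketch in the style of the add example above (not one of the library's doctests); the floating-point inputs are interpreted as described, and the result carries the Bool data type and no gradient:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (b, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g'
>>> result <- a `logicalAnd` b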

logicalNot Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

the input tensor

-> Tensor ('Gradient 'WithoutGradient) layout device ('DataType 'Bool) shape

the output tensor

Computes the element-wise logical NOT of the given input tensor. The output tensor will have the Bool data type. If the input tensor is not a bool tensor, zeros are treated as False and non-zeros are treated as True.

logicalOr Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

the input tensor

-> Tensor gradient' layout' device' dataType' shape'

the tensor to compute OR with

-> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')

the output tensor

Computes the element-wise logical OR of the given input tensors. The output tensor will have the Bool data type. If the input tensors are not bool tensors, then zeros are treated as False and nonzeros are treated as True.

logicalXor Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

the input tensor

-> Tensor gradient' layout' device' dataType' shape'

the tensor to compute XOR with

-> m (Tensor ('Gradient 'WithoutGradient) (layout <+> layout') (device <+> device') ('DataType 'Bool) shape'')

the output tensor

Computes the element-wise logical XOR of the given input tensors. The output tensor will have the Bool data type. If the input tensors are not bool tensors, then zeros are treated as False and nonzeros are treated as True.

mul Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Element-wise multiplication of two tensors: \[ \mathrm{output}_i = \mathrm{input}_i \times \mathrm{other}_i. \] The result is returned as a new tensor.

The shape of other must be broadcastable with the shape of input. See mulScalar for a version of this function where the other input is a scalar.
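
The following sketch mirrors the add example above (it is not one of the library's doctests); since mul unifies gradients, layouts, devices, data types, and shapes in the same way as add, the inferred type is the same:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (b, _) <- sRandn' (SShape $ SName @"*" :&: SSize @4 :|: SName @"*" :&: SSize @1 :|: SNil) g'
>>> result <- a `mul` b
>>> :type result
result
  :: Tensor
       ('Gradient 'WithGradient)
       ('Layout 'Dense)
       ('Device 'CPU)
       ('DataType 'Float)
       ('Shape
          '[ 'Dim ('Name "*") ('Size 4), 'Dim ('Name "feature") ('Size 4)])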

mulScalar Source #

Arguments

:: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) 
=> Tensor gradient layout device dataType shape

tensor input

-> other

scalar other input

-> m (Tensor gradient layout device dataType shape)

tensor output
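
Multiplies the tensor input by the scalar other: \[ \mathrm{output}_i = \mathrm{input}_i \times \mathrm{other}. \] The result is returned as a new tensor. See mul for a version of this function where the other input is a tensor.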

mvlgamma Source #

Arguments

:: forall gradient layout device dataType shape m. MonadThrow m 
=> Int

the number of dimensions p

-> Tensor gradient layout device dataType shape

the input tensor to compute the multivariate log-gamma function for

-> m (Tensor gradient layout device dataType shape)

the output tensor

Computes the multivariate log-gamma function with dimension p element-wise, given by \[ \log(\Gamma_p(\mathrm{input})) = C + \sum_{i=1}^{p} \log \left(\Gamma\left(\mathrm{input} - \frac{i-1}{2}\right)\right), \] where \(C = \log(\pi) \times \frac{p(p-1)}{4}\) and \(\Gamma(\cdot)\) is the gamma function.

All elements of the input tensor must be greater than \(\frac{p-1}{2}\). Otherwise, the computation is halted and an exception is thrown.

neg Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the negative of the elements of input: \[ \mathrm{output}_i = - \mathrm{input}_i. \]

polygamma Source #

Arguments

:: forall gradient layout device dataType shape. Int

the order of the polygamma function

-> Tensor gradient layout device dataType shape

the input tensor

-> Tensor gradient layout device dataType shape

the output tensor

Computes the \(n\)-th derivative of the digamma function \(\psi\) on the input, where \(n \ge 0\). \(n\) is called the order of the polygamma function \(\psi^{(n)}\), which is defined as \[ \psi^{(n)}(\mathrm{input}) = \frac{d^{(n)}}{d\mathrm{input}^{(n)}} \psi(\mathrm{input}), \] where \(\psi(\mathrm{input}) = \frac{d}{d\mathrm{input}} \log\left(\Gamma\left(\mathrm{input}\right)\right)\) is the digamma function.

pow Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient' layout' device' dataType' shape'

tensor input

-> Tensor gradient layout device dataType shape

tensor exponent

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

tensor output

Takes the power of each element in the tensor input with the corresponding element in the tensor exponent and returns a tensor with the result.

Note that the exponent and the input must be tensors with broadcastable shapes. See powScalar for a version that takes a scalar exponent as argument and powTensor for a version where the input is a scalar and the exponent a tensor.

The following operation is applied: \[ \mathrm{output}_i = \mathrm{input}_i^{\mathrm{exponent}_i}. \]
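
An illustrative sketch in the style of the add example above (not one of the library's doctests):

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (x, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (y, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g'
>>> result <- x `pow` y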

powScalar Source #

Arguments

:: forall exponent gradient layout device dataType shape m. (Scalar exponent, MonadThrow m) 
=> Tensor gradient layout device dataType shape

tensor input

-> exponent

scalar exponent

-> m (Tensor gradient layout device dataType shape)

tensor output

Takes the power of each element in the tensor input with the scalar exponent and returns a tensor with the result.

Note that the exponent is a scalar. See pow for a version that takes a tensor exponent as argument and powTensor for a version where the input is a scalar and the exponent a tensor.

The following operation is applied: \[ \mathrm{output}_i = \mathrm{input}_i^{\mathrm{exponent}}. \]
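
An illustrative sketch (not a library doctest), assuming Float has a Scalar instance for the exponent:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (x, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> result <- x `powScalar` (2 :: Float)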

powTensor Source #

Arguments

:: forall input gradient layout device dataType shape. Scalar input 
=> input

scalar input

-> Tensor gradient layout device dataType shape

tensor exponent

-> Tensor gradient layout device dataType shape

tensor output

Takes the power of the scalar input with each element in the tensor exponent and returns a tensor with the result.

Note that the exponent is a tensor while the input is a scalar. See pow for a version that takes a tensor input as argument and powScalar for a version where the input is a tensor and the exponent a scalar.

The following operation is applied: \[ \mathrm{output}_i = \mathrm{input}^{\mathrm{exponent}_i}. \]

rad2deg Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with each of the elements of input converted from angles in radians to degrees.

reciprocal Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the reciprocal of the elements of input: \[ \mathrm{output}_i = \frac{1}{\mathrm{input}_i} \]

remainder Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

dividend

-> Tensor gradient' layout' device' dataType' shape'

divisor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output

Computes the element-wise remainder of division.

The dividend and divisor may contain both integer and floating-point numbers. The remainder has the same sign as the divisor.

The shapes of dividend and divisor must be broadcastable.

round Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with each of the elements of input rounded to the closest integer. Note that the data type is unchanged.

rsqrt Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the reciprocal of the square-root of each of the elements of input: \[ \mathrm{output}_i = \frac{1}{\sqrt{\mathrm{input}_i}}. \]

sigmoid Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the sigmoid of the elements of input: \[ \mathrm{output}_i = \frac{1}{1 + \exp \left(-\mathrm{input}_i\right)} \]

sign Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the signs of the elements of input: \[ \mathrm{output}_i = \begin{cases} -1 & \text{if } \mathrm{input}_i < 0 \\ 0 & \text{if } \mathrm{input}_i = 0 \\ 1 & \text{if } \mathrm{input}_i > 0. \end{cases} \]

sin Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the sine of the elements of input: \[ \mathrm{output}_i = \sin \left(\mathrm{input}_i\right). \]

sinh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the hyperbolic sine of the elements of input: \[ \mathrm{output}_i = \sinh \left(\mathrm{input}_i\right). \]

sub Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

input tensor

-> Tensor gradient' layout' device' dataType' shape'

other tensor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

output tensor

Element-wise subtraction of one tensor from another: \[ \mathrm{output}_i = \mathrm{input}_i - \mathrm{other}_i. \] The result is returned as a new tensor.

The shape of other must be broadcastable with the shape of input. See subScalar for a version of this function where the other input is a scalar.
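
The following sketch mirrors the add example above (it is not one of the library's doctests); the broadcasting behaviour and the inferred type are the same as there:

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (a, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (b, _) <- sRandn' (SShape $ SName @"*" :&: SSize @4 :|: SName @"*" :&: SSize @1 :|: SNil) g'
>>> result <- a `sub` b
>>> :type result
result
  :: Tensor
       ('Gradient 'WithGradient)
       ('Layout 'Dense)
       ('Device 'CPU)
       ('DataType 'Float)
       ('Shape
          '[ 'Dim ('Name "*") ('Size 4), 'Dim ('Name "feature") ('Size 4)])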

subScalar Source #

Arguments

:: forall other gradient layout device dataType shape m. (Scalar other, MonadThrow m) 
=> Tensor gradient layout device dataType shape

input tensor

-> other

input scalar

-> m (Tensor gradient layout device dataType shape)

output tensor

Subtracts a scalar other from a tensor input: \[ \mathrm{output}_i = \mathrm{input}_i - \mathrm{other}. \] The result is returned as a new tensor. See sub for a version of this function where the second argument is a tensor.

sqrt Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the square-root of the elements of input: \[ \mathrm{output}_i = \sqrt{\mathrm{input}_i}. \]

square Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the square of the elements of input: \[ \mathrm{output}_i = \mathrm{input}_i^2. \]

See pow, powScalar, or powTensor for exponentiation with respect to arbitrary exponents.

tan Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the tangent of the elements of input: \[ \mathrm{output}_i = \tan \left(\mathrm{input}_i\right). \]

tanh Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output

Returns a new tensor with the hyperbolic tangent of the elements of input: \[ \mathrm{output}_i = \tanh \left(\mathrm{input}_i\right). \]

trueDivide Source #

Arguments

:: forall gradient layout device dataType shape gradient' layout' device' dataType' shape' shape'' m. (MonadThrow m, shape'' ~ BroadcastShapesF shape shape', Catch shape'') 
=> Tensor gradient layout device dataType shape

tensor dividend

-> Tensor gradient' layout' device' dataType' shape'

tensor divisor

-> m (Tensor (gradient <|> gradient') (layout <+> layout') (device <+> device') (dataType <+> dataType') shape'')

tensor output

Performs “true division” that always computes the division in floating point: \[ \mathrm{output}_i = \frac{\mathrm{dividend}_i}{\mathrm{divisor}_i}. \]

trueDivide is completely equivalent to division using div except when both inputs have Bool or integer data types, in which case the inputs are converted to floating-point data types before performing the division.

See trueDivideScalar for a version of this function where the divisor is a scalar.
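
An illustrative sketch in the style of the add example above (not one of the library's doctests):

>>> g <- sMkGenerator (SDevice SCPU) 0
>>> sRandn' = sRandn . TensorSpec (SGradient SWithGradient) (SLayout SDense) (SDevice SCPU) (SDataType SFloat)
>>> (dividend, g') <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g
>>> (divisor, _) <- sRandn' (SShape $ SName @"feature" :&: SSize @4 :|: SNil) g'
>>> result <- dividend `trueDivide` divisor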

trueDivideScalar Source #

Arguments

:: forall other gradient layout device dataType shape. Scalar other 
=> Tensor gradient layout device dataType shape

tensor dividend

-> other

scalar divisor

-> Tensor gradient layout device dataType shape

tensor output

Performs “true division” that always computes the division in floating point: \[ \mathrm{output}_i = \frac{\mathrm{dividend}_i}{\mathrm{divisor}}. \]

trueDivideScalar is completely equivalent to division using divScalar except when both inputs have Bool or integer data types, in which case the inputs are converted to floating-point data types before performing the division.

See trueDivide for a version of this function where the divisor is a tensor.

trunc Source #

Arguments

:: forall gradient layout device dataType shape. Tensor gradient layout device dataType shape

input

-> Tensor gradient layout device dataType shape

output
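
Returns a new tensor with the truncated integer values of the elements of input, that is, with the fractional portion of each element removed.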