SoftmaxCentered
Inherits From: Bijector
Defined in tensorflow/contrib/distributions/python/ops/bijectors/softmax_centered.py.
See the guide: Random variable transformations (contrib) > Bijectors
Bijector which computes Y = g(X) = exp([X 0]) / sum(exp([X 0])).

To implement softmax as a bijection, the forward transformation appends a value to the input and the inverse removes this coordinate. The appended coordinate represents a pivot, e.g., softmax(x) = exp(x - c) / sum(exp(x - c)), where c is the implicit last coordinate.
Example Use:
bijector.SoftmaxCentered().forward(tf.log([2, 3, 4]))
# Result: [0.2, 0.3, 0.4, 0.1]
# Extra result: 0.1

bijector.SoftmaxCentered().inverse([0.2, 0.3, 0.4, 0.1])
# Result: tf.log([2, 3, 4])
# Extra coordinate removed.
At first blush it may seem like the Invariance of domain theorem implies this implementation is not a bijection. However, the appended dimension makes the (forward) image non-open and the theorem does not directly apply.
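The pivot construction can be sketched directly in NumPy. This is a minimal illustration of the math described above, not the library implementation:

# Minimal NumPy sketch of the pivot construction (illustration only).
import numpy as np

def forward(x):
    # Append the implicit pivot coordinate (0) and apply softmax.
    z = np.concatenate([x, [0.]])
    e = np.exp(z - z.max())  # Subtract the max for numerical stability.
    return e / e.sum()

def inverse(y):
    # Undo the softmax: take logs relative to the appended pivot, then drop it.
    return (np.log(y) - np.log(y[-1]))[:-1]

x = np.log([2., 3., 4.])
y = forward(x)                          # [0.2, 0.3, 0.4, 0.1]
np.testing.assert_allclose(inverse(y), x)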
Properties

dtype
dtype of Tensors transformable by this bijector.

event_ndims
Returns the number of event dimensions this bijector operates on.

graph_parents
Returns this Bijector's graph_parents as a Python list.

is_constant_jacobian
Returns true iff the Jacobian is not a function of x.
Note: Jacobian is either constant for both forward and inverse or neither.
Returns:
is_constant_jacobian: Python bool.

name
Returns the string name of this Bijector.

validate_args
Returns True if Tensor arguments will be validated.
Methods

__init__
__init__( validate_args=False, name='softmax_centered' )
Constructs Bijector. A Bijector transforms random variables into new random variables.
Examples:
# Create the Y = g(X) = X transform which operates on vector events.
identity = Identity(event_ndims=1)

# Create the Y = g(X) = exp(X) transform which operates on matrices.
exp = Exp(event_ndims=2)
See Bijector subclass docstring for more details and specific examples.
Args:
event_ndims: number of dimensions associated with event coordinates.
graph_parents: Python list of graph prerequisites of this Bijector.
is_constant_jacobian: Python bool indicating that the Jacobian is not a function of the input.
validate_args: Python bool, default False. Whether to validate input with asserts. If validate_args is False and the inputs are invalid, correct behavior is not guaranteed.
dtype: tf.dtype supported by this Bijector. None means dtype is not enforced.
name: The name to give Ops created by the initializer.

Raises:
ValueError: If a member of graph_parents is not a Tensor.
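As a concrete construction of this class, a minimal sketch (assuming a TF 1.x runtime and the tf.contrib.distributions.bijectors path documented on this page):

# Sketch: construct the bijector and apply the forward transform.
import tensorflow as tf

softmax = tf.contrib.distributions.bijectors.SoftmaxCentered(validate_args=True)
y = softmax.forward(tf.log([2., 3., 4.]))  # Shape [4]; components sum to 1.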
forward
forward( x, name='forward' )
Returns the forward Bijector evaluation, i.e., Y = g(X).
Args:
x: Tensor. The input to the "forward" evaluation.
name: The name to give this op.

Returns:
Tensor.

Raises:
TypeError: if self.dtype is specified and x.dtype is not self.dtype.
NotImplementedError: if _forward is not implemented.

forward_event_shape
forward_event_shape(input_shape)
Shape of a single sample from a single batch as a TensorShape.

Same meaning as forward_event_shape_tensor. May be only partially defined.
Args:
input_shape: TensorShape indicating event-portion shape passed into forward function.

Returns:
forward_event_shape_tensor: TensorShape indicating event-portion shape after applying forward. Possibly unknown.

forward_event_shape_tensor
forward_event_shape_tensor( input_shape, name='forward_event_shape_tensor' )
Shape of a single sample from a single batch as an int32 1D Tensor.
Args:
input_shape: Tensor, int32 vector indicating event-portion shape passed into forward function.
name: name to give to the op.

Returns:
forward_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying forward.
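For this bijector the forward event shape gains one coordinate. A small sketch (assuming a TF 1.x session is available to evaluate the tensor result):

# Sketch: static and dynamic forward event shapes.
import tensorflow as tf

softmax = tf.contrib.distributions.bijectors.SoftmaxCentered()
print(softmax.forward_event_shape(tf.TensorShape([3])))  # (4,): forward appends the pivot coordinate.
shape_t = softmax.forward_event_shape_tensor([3])
with tf.Session() as sess:
    print(sess.run(shape_t))  # [4]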
forward_log_det_jacobian
forward_log_det_jacobian( x, name='forward_log_det_jacobian' )
Returns the forward_log_det_jacobian, i.e., log(det(dY/dX)) evaluated at x.
Args:
x: Tensor. The input to the "forward" Jacobian evaluation.
name: The name to give this op.

Returns:
Tensor, if this bijector is injective. If not injective, this is not implemented.

Raises:
TypeError: if self.dtype is specified and x.dtype is not self.dtype.
NotImplementedError: if neither _forward_log_det_jacobian nor {_inverse, _inverse_log_det_jacobian} are implemented, or this is a non-injective bijector.

inverse
inverse( y, name='inverse' )
Returns the inverse Bijector evaluation, i.e., X = g^{-1}(Y).
Args:
y: Tensor. The input to the "inverse" evaluation.
name: The name to give this op.

Returns:
Tensor, if this bijector is injective. If not injective, returns the k-tuple containing the unique k points (x1, ..., xk) such that g(xi) = y.

Raises:
TypeError: if self.dtype is specified and y.dtype is not self.dtype.
NotImplementedError: if _inverse is not implemented.

inverse_event_shape
inverse_event_shape(output_shape)
Shape of a single sample from a single batch as a TensorShape.

Same meaning as inverse_event_shape_tensor. May be only partially defined.
Args:
output_shape: TensorShape indicating event-portion shape passed into inverse function.

Returns:
inverse_event_shape_tensor: TensorShape indicating event-portion shape after applying inverse. Possibly unknown.

inverse_event_shape_tensor
inverse_event_shape_tensor( output_shape, name='inverse_event_shape_tensor' )
Shape of a single sample from a single batch as an int32 1D Tensor.
Args:
output_shape: Tensor, int32 vector indicating event-portion shape passed into inverse function.
name: name to give to the op.

Returns:
inverse_event_shape_tensor: Tensor, int32 vector indicating event-portion shape after applying inverse.
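Correspondingly, the inverse event shape drops the appended coordinate; a brief sketch under the same TF 1.x assumptions as the forward-shape example above:

# Sketch: the inverse removes the pivot coordinate from the event shape.
import tensorflow as tf

softmax = tf.contrib.distributions.bijectors.SoftmaxCentered()
print(softmax.inverse_event_shape(tf.TensorShape([4])))  # (3,)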
inverse_log_det_jacobian
inverse_log_det_jacobian( y, name='inverse_log_det_jacobian' )
Returns the (log o det o Jacobian o inverse)(y).

Mathematically, returns: log(det(dX/dY))(Y). (Recall that: X = g^{-1}(Y).)

Note that forward_log_det_jacobian is the negative of this function, evaluated at g^{-1}(y).
Args:
y: Tensor. The input to the "inverse" Jacobian evaluation.
name: The name to give this op.

Returns:
Tensor, if this bijector is injective. If not injective, returns the tuple of local log det Jacobians, log(det(Dg_i^{-1}(y))), where g_i is the restriction of g to the ith partition Di.

Raises:
TypeError: if self.dtype is specified and y.dtype is not self.dtype.
NotImplementedError: if _inverse_log_det_jacobian is not implemented.
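A numerical check of the negation relationship noted above, as a sketch (assuming a TF 1.x session and the signatures documented on this page):

# Sketch: inverse_log_det_jacobian(y) should equal -forward_log_det_jacobian(x) when y = forward(x).
import tensorflow as tf

softmax = tf.contrib.distributions.bijectors.SoftmaxCentered()
x = tf.log([2., 3., 4.])
y = softmax.forward(x)
fldj = softmax.forward_log_det_jacobian(x)
ildj = softmax.inverse_log_det_jacobian(y)
with tf.Session() as sess:
    f, i = sess.run([fldj, ildj])
    print(f, i)  # Expect i == -f, up to floating-point error.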
© 2018 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/distributions/bijectors/SoftmaxCentered