Returns an optimizer which clips gradients before applying them.
tf.contrib.estimator.clip_gradients_by_norm(
    optimizer, clip_norm
)
Example:
optimizer = tf.train.ProximalAdagradOptimizer(
    learning_rate=0.1,
    l1_regularization_strength=0.001)
optimizer = tf.contrib.estimator.clip_gradients_by_norm(
    optimizer, clip_norm)
estimator = tf.estimator.DNNClassifier(
    feature_columns=[...],
    hidden_units=[1024, 512, 256],
    optimizer=optimizer)
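
Conceptually, the returned object wraps the given optimizer and clips each gradient's norm before the update is applied. The sketch below is an illustrative approximation only, not the contrib implementation: the class name, the use of per-tensor tf.clip_by_norm, and the dense-gradient-only handling are assumptions.

import tensorflow as tf  # TF 1.x API

class _ClippingOptimizer(tf.train.Optimizer):
    """Hypothetical sketch: delegate to `optimizer`, clipping each dense
    gradient to at most `clip_norm` (L2 norm) before it is applied."""

    def __init__(self, optimizer, clip_norm, name="ClippingOptimizer"):
        super(_ClippingOptimizer, self).__init__(use_locking=False, name=name)
        self._optimizer = optimizer
        self._clip_norm = clip_norm

    def compute_gradients(self, *args, **kwargs):
        grads_and_vars = self._optimizer.compute_gradients(*args, **kwargs)
        # Rescale any gradient whose L2 norm exceeds clip_norm.
        return [(g if g is None else tf.clip_by_norm(g, self._clip_norm), v)
                for g, v in grads_and_vars]

    def apply_gradients(self, *args, **kwargs):
        return self._optimizer.apply_gradients(*args, **kwargs)

Only the gradient computation changes; applying gradients and all optimizer state stay with the wrapped optimizer, so learning-rate and regularization settings behave as before.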
Args
  optimizer: A tf.Optimizer object used to apply gradients.
  clip_norm: A 0-D (scalar) Tensor > 0. The clipping ratio.
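
To make the clip_norm argument concrete: with per-tensor norm clipping (as sketched above), a gradient whose L2 norm exceeds clip_norm is rescaled so its norm equals clip_norm, and smaller gradients pass through unchanged. A minimal check of that rule, assuming tf.clip_by_norm semantics:

import tensorflow as tf  # TF 1.x API

grad = tf.constant([3.0, 4.0])                  # L2 norm is 5.0
clipped = tf.clip_by_norm(grad, clip_norm=2.5)  # rescaled by 2.5 / 5.0

with tf.Session() as sess:
    print(sess.run(clipped))  # [1.5 2.0]: norm is now exactly 2.5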