Partitioner to specify a fixed number of shards along given axis.
```python
tf.compat.v1.fixed_size_partitioner(
    num_shards, axis=0
)
```
Migrate to TF2

This API is deprecated in TF2. In TF2, a partitioner is no longer part of the variable declaration via `tf.Variable`; instead, ParameterServer training handles the partitioning of variables. The corresponding TF2 partitioner class for `fixed_size_partitioner` is `tf.distribute.experimental.partitioners.FixedShardsPartitioner`.

Check the migration guide on the differences in treatment of variables and losses between TF1 and TF2.
Before:
```python
x = tf.compat.v1.get_variable(
    "x", shape=(2,), partitioner=tf.compat.v1.fixed_size_partitioner(2)
)
```
After:
```python
partitioner = (
    tf.distribute.experimental.partitioners.FixedShardsPartitioner(
        num_shards=2))
strategy = tf.distribute.experimental.ParameterServerStrategy(
    cluster_resolver=cluster_resolver,
    variable_partitioner=partitioner)
with strategy.scope():
  x = tf.Variable([1.0, 2.0])
```
Description
| Args | |
|---|---|
| `num_shards` | `int`, number of shards to partition the variable into. |
| `axis` | `int`, axis to partition on. |
| Returns |
|---|
| A partition function usable as the `partitioner` argument to `variable_scope` and `get_variable`. |
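The returned partition function maps a variable's shape to a per-axis shard count. As a rough illustration, here is a pure-Python sketch of that behavior (a hypothetical stand-in, not the TensorFlow implementation): it requests `num_shards` partitions along the chosen axis, capped by that axis's size so no shard is empty, and one partition along every other axis.

```python
def fixed_size_partitioner_sketch(num_shards, axis=0):
    """Hypothetical stand-in mirroring tf.compat.v1.fixed_size_partitioner.

    Returns a partition function that, given a variable shape (a list of
    ints), reports how many shards to create along each axis.
    """
    def _partitioner(shape, dtype=None):
        # One shard per axis by default ...
        partitions = [1] * len(shape)
        # ... except the requested axis, which gets num_shards partitions,
        # capped by the axis size so no shard is empty.
        partitions[axis] = min(num_shards, shape[axis])
        return partitions
    return _partitioner

part_fn = fixed_size_partitioner_sketch(2)
print(part_fn([10, 4]))  # 2 shards along axis 0 -> [2, 1]
```

Note how the cap applies: asking for 20 shards of a variable whose axis 0 has size 10 yields only 10 shards.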