Use DenseVariational from tensorflow_probability's layers module instead of a standard Keras Dense layer, so the weights are learned as distributions rather than point estimates.
import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow_probability import layers as tfpl

model = tf.keras.Sequential()
model.add(tfpl.DenseVariational(
    units=128,
    make_prior_fn=prior_fn,       # defined in the example below
    make_posterior_fn=posterior,  # mean-field sketch below
    kl_weight=1 / num_train_samples,  # scale the KL term by the training-set size
    activation='relu'
))
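The posterior callable passed above is not shown in the original snippet. A minimal mean-field Gaussian posterior, following the common TFP pattern (DenseVariational calls it with kernel_size, bias_size, and dtype), is sketched here; the variable names are illustrative:

import numpy as np

def posterior(kernel_size, bias_size=0, dtype=None):
    n = kernel_size + bias_size
    c = np.log(np.expm1(1.0))  # softplus inverse of 1, so the stddev starts near 1
    return tf.keras.Sequential([
        tfpl.VariableLayer(2 * n, dtype=dtype),  # trainable mean and raw stddev per weight
        tfpl.DistributionLambda(lambda t: tfp.distributions.Independent(
            tfp.distributions.Normal(loc=t[..., :n],
                                     scale=1e-5 + tf.nn.softplus(c + t[..., n:])),
            reinterpreted_batch_ndims=1)),
    ])

Starting the stddev near 1 via the softplus shift keeps early KL values moderate, and the 1e-5 floor avoids a degenerate zero scale.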
Avoid overly narrow priors, such as a zero-mean Normal with a tiny standard deviation, unless you have strong prior knowledge; a unit-variance Normal is a safer default.
Example:
def prior_fn(kernel_size, bias_size=0, dtype=None):
    # DenseVariational passes (kernel_size, bias_size, dtype); return a
    # callable that yields the prior distribution over all n weights.
    n = kernel_size + bias_size
    return lambda _: tfp.distributions.Independent(
        tfp.distributions.Normal(loc=tf.zeros(n, dtype=dtype), scale=1.0),
        reinterpreted_batch_ndims=1)
Monitor the negative log-likelihood (NLL) and the KL divergence separately; the total training loss is their sum:
loss = nll + kl_divergence
If the KL term is much larger than the NLL early in training, your kl_weight is likely too high.
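One way to see the two terms separately is to read the KL penalty that DenseVariational registers via add_loss. This callback is a sketch; sample_batch is any representative input batch and is an assumption, not part of the original:

class KLLogger(tf.keras.callbacks.Callback):
    """Print the KL penalty next to the total loss at the end of each epoch."""

    def __init__(self, sample_batch):
        super().__init__()
        self.sample_batch = sample_batch  # any representative input batch

    def on_epoch_end(self, epoch, logs=None):
        # A forward pass repopulates model.losses with the layer's KL term.
        _ = self.model(self.sample_batch, training=True)
        kl = float(tf.add_n(self.model.losses))
        total = logs.get('loss', float('nan'))
        print(f"epoch {epoch}: total={total:.4f}  kl={kl:.4f}  nll~={total - kl:.4f}")

Pass it to fit, e.g. model.fit(x_train, y_train, callbacks=[KLLogger(x_train[:32])]). The NLL estimate is approximate, since logs['loss'] is an epoch average while the KL is evaluated at the current weights.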