# tfp.substrates.numpy.vi.modified_gan

The Modified-GAN Csiszar-function in log-space.

[View source on GitHub](https://github.com/tensorflow/probability/blob/v0.23.0/tensorflow_probability/substrates/numpy/vi/csiszar_divergence.py#L720-L761)

    tfp.substrates.numpy.vi.modified_gan(
        logu, self_normalized=False, name=None
    )

A Csiszar-function is a member of

    F = { f:R_+ to R : f convex }.

When `self_normalized = True` the modified-GAN (Generative Adversarial
Network) Csiszar-function is:

    f(u) = log(1 + u) - log(u) + 0.5 (u - 1)

When `self_normalized = False` the `0.5 (u - 1)` term is omitted.

The unmodified GAN Csiszar-function is identical to Jensen-Shannon (with
`self_normalized = False`).

**Warning:** this function makes non-log-space calculations and may therefore
be numerically unstable for `|logu| >> 0`.

#### Args

| Argument | Description |
|-------------------|------------------------------------------------------------|
| `logu` | `float`-like `Tensor` representing `log(u)` from above. |
| `self_normalized` | Python `bool` indicating whether `f'(u=1)=0`. When `f'(u=1)=0`, the implied Csiszar f-Divergence remains non-negative even when `p, q` are unnormalized measures. |
| `name` | Python `str` name prefixed to Ops created by this function. |

#### Returns

| Name | Description |
|-------------------|------------------------------------------------------------|
| `chi_square_of_u` | `float`-like `Tensor` of the Csiszar-function evaluated at `u = exp(logu)`. |