```python
import numpy as np
import pymc3 as pm

# Simulate data with two missing entries
x = np.random.randn(10) + 10.
x = np.concatenate([x, [np.nan], [np.nan]])
x = np.ma.masked_array(x, np.isnan(x))

with pm.Model() as m:
    a = pm.Normal('a')
    pm.Normal('x', a, 1., observed=x)

pm.model_to_graphviz(m)
```
which is technically correct, as we define `x_missing` as a `NoDistribution`: https://github.com/pymc-devs/pymc3/blob/ee57bed9aa5f73dc48081dd3c82267dd4f0f2834/pymc3/model.py#L1672-L1677

The reason for doing so is that we fill the missing values back into the original data: https://github.com/pymc-devs/pymc3/blob/ee57bed9aa5f73dc48081dd3c82267dd4f0f2834/pymc3/model.py#L1683-L1684

To avoid double-counting the log_prob, the data tensor needs to be a FreeRV (so that we sample from it) with no log_prob of its own, since its log_prob is already accounted for in `x` in this example.
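To make the double-counting concern concrete, here is a minimal NumPy-only sketch (not PyMC3 internals; the names `normal_logp` and `x_missing` are illustrative placeholders): the missing slots of the masked array are filled with the current value of the imputed free variable, so the likelihood over the filled vector already includes the log_prob of the imputed entries exactly once. Adding a separate distribution log_prob for `x_missing` on top of this would count those terms twice.

```python
import numpy as np

def normal_logp(value, mu, sigma):
    """Elementwise log-density of a Normal(mu, sigma) distribution."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((value - mu) / sigma) ** 2

# Toy data: two observed values and one missing entry
x = np.ma.masked_array([1.0, 2.0, np.nan], mask=[False, False, True])

# x_missing stands in for the imputed FreeRV (a hypothetical current draw)
x_missing = np.array([0.5])

# Fill the masked slot with the free variable, mimicking the model.py lines above
filled = x.filled(0.)
filled[x.mask] = x_missing

# The likelihood over the filled vector counts x_missing's logp exactly once;
# giving x_missing its own distribution logp as well would double-count it.
total_logp = normal_logp(filled, mu=0.0, sigma=1.0).sum()
```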
I think the easiest way to fix this is to overwrite the `_repr_` here with the `_repr_` from the parent class.
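A minimal sketch of that fix, using generic `Parent`/`Child` placeholder classes rather than the actual PyMC3 distribution classes (and keeping the issue's `_repr_` spelling rather than guessing at the exact method name):

```python
class Parent:
    def _repr_(self):
        return "Parent(<params>)"

class Child(Parent):
    def _repr_(self):
        return "Child(???)"  # the unhelpful repr we want to replace

# Overwrite the subclass method with the parent's implementation,
# so instances render the way the parent class would.
Child._repr_ = Parent._repr_

print(Child()._repr_())  # -> Parent(<params>)
```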