Description
import aesara
import pymc as pm

with pm.Model() as model:
    a = pm.HalfNormal("a")
    c = pm.Uniform("c", lower=a - 1, upper=1)

print("before")
print("------")
aesara.dprint(c)
print()

# accessing the model log-probability builds the logp graph
model.logpt

print("after")
print("-----")
aesara.dprint(c)
before
------
uniform_rv{0, (0, 0), floatX, False}.1 [id A] 'c'
|RandomStateSharedVariable(<RandomState(MT19937) at 0x7F649E351540>) [id B]
|TensorConstant{[]} [id C]
|TensorConstant{11} [id D]
|Elemwise{sub,no_inplace} [id E] ''
| |halfnormal_rv{0, (0, 0), floatX, False}.1 [id F] 'a'
| | |RandomStateSharedVariable(<RandomState(MT19937) at 0x7F649D753040>) [id G]
| | |TensorConstant{[]} [id H]
| | |TensorConstant{11} [id I]
| | |TensorConstant{0.0} [id J]
| | |TensorConstant{1.0} [id K]
| |TensorConstant{1} [id L]
|TensorConstant{1.0} [id M]
after
-----
uniform_rv{0, (0, 0), floatX, False}.1 [id A] 'c'
|RandomStateSharedVariable(<RandomState(MT19937) at 0x7F649E351540>) [id B]
|TensorConstant{[]} [id C]
|TensorConstant{11} [id D]
|Elemwise{sub,no_inplace} [id E] ''
| |Elemwise{exp,no_inplace} [id F] ''
| | |a_log__ [id G]
| |TensorConstant{1} [id H]
|TensorConstant{1.0} [id I]
Maybe some graph.replace_all is changing the variables in place when it should have made a copy of the graph first?
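For comparison, here is a minimal sketch of the non-mutating behaviour one would expect, assuming aesara.graph.basic.clone_replace is the right tool for rebuilding a graph with substitutions (the variables x, y, y_new are illustrative, not from the issue):

import aesara
import aesara.tensor as at
from aesara.graph.basic import clone_replace

x = at.scalar("x")
y = x + 1  # original graph: y = x + 1

# Rebuild the graph on a copy: x is swapped for exp(x) only in the clone.
y_new = clone_replace(y, replace={x: at.exp(x)})

aesara.dprint(y)      # still shows x + 1: the original graph is untouched
aesara.dprint(y_new)  # shows exp(x) + 1

If the logp machinery performed its replacements on a cloned graph like this (instead of calling replace_all on a FunctionGraph that wraps the original variables), dprint(c) would print the same graph before and after accessing model.logpt.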