If I replace a tensor with shape `(None,)` by one with static shape `(1,)`, LLVM aborts during codegen for me. Maybe some rewrite doesn't behave properly, and we produce invalid LLVM code somewhere?
```python
import pytensor.tensor as pt
import pytensor

x = pt.vector("x")
logp = -(x**2).sum()
pytensor.dprint(logp, print_type=True)

# Works if I set the shape to x.shape
# x_known = pt.vector("x_known", shape=x.type.shape)
x_known = pt.vector("x_known", shape=(1,))

logp = pytensor.graph_replace(logp, {x: x_known})
print("after")
pytensor.dprint(logp, print_type=True)

pytensor.function((x_known,), logp, mode=pytensor.compile.NUMBA)
```
Maybe we should just check that we don't change the type in a `graph_replace`?
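The proposed check could look roughly like this. This is only a sketch with a minimal stand-in `TensorType` class (the real check would compare `Variable.type` on each replacement pair inside PyTensor's `graph_replace`); names here are hypothetical, not PyTensor's actual API.

```python
# Hypothetical sketch of a type check for graph_replace substitutions.
# TensorType is a stand-in for PyTensor's type objects: equal types
# must have identical static shapes (None marks an unknown dimension).

class TensorType:
    def __init__(self, shape):
        self.shape = shape  # tuple of ints/None, e.g. (None,) or (1,)

    def __eq__(self, other):
        return isinstance(other, TensorType) and self.shape == other.shape


def check_replacements(replace):
    """Raise if any (old, new) replacement pair would change the type."""
    for old_type, new_type in replace:
        if old_type != new_type:
            raise TypeError(
                f"replacement changes type {old_type.shape} -> {new_type.shape}"
            )


# (None,) -> (1,) is exactly the substitution that crashes LLVM codegen:
try:
    check_replacements([(TensorType((None,)), TensorType((1,)))])
except TypeError as e:
    print("rejected:", e)
```

With such a check, the repro above would fail early with a clear error instead of producing a graph that only blows up later in the Numba/LLVM backend.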