
Function involving broadcasting doesn't work on Julia v1.11+ #2454

@giordano

Description

julia> using Enzyme

julia> f(x, y) = sum(sin.(x) .^ 2 + cos.(y) .^ 2)
f (generic function with 1 method)

julia> gradient(Forward, f, rand(3), rand(3))
┌ Warning: TODO forward zero-set of memorycopy used memset rather than runtime type 
│ Caused by:
│ Stacktrace:
[...several times....]
ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.
Mismatched activity for:   %97 = phi {} addrspace(10)* [ %88, %L304 ], [ %207, %guard_exit35 ] const val:   %207 = load {} addrspace(10)*, {} addrspace(10)* addrspace(11)* %206, align 8, !dbg !301, !tbaa !128, !alias.scope !91, !noalias !94, !dereferenceable_or_null !263, !align !264, !enzyme_type !119, !enzymejl_source_type_Memory\7BFloat64\7D !0, !enzymejl_byref_MUT_REF !0
 value=Unknown object of type Memory{Float64}
 llvalue=  %207 = load {} addrspace(10)*, {} addrspace(10)* addrspace(11)* %206, align 8, !dbg !301, !tbaa !128, !alias.scope !91, !noalias !94, !dereferenceable_or_null !263, !align !264, !enzyme_type !119, !enzymejl_source_type_Memory\7BFloat64\7D !0, !enzymejl_byref_MUT_REF !0

Stacktrace:
  [1] ==
    @ ./promotion.jl:639
  [2] !=
    @ ./operators.jl:277
  [3] _newindexer
    @ ./broadcast.jl:604
  [4] shapeindexer
    @ ./broadcast.jl:599
  [5] newindexer
    @ ./broadcast.jl:598
  [6] extrude
    @ ./broadcast.jl:645
  [7] preprocess
    @ ./broadcast.jl:953
  [8] preprocess_args
    @ ./broadcast.jl:956
  [9] preprocess
    @ ./broadcast.jl:952
 [10] preprocess_args (repeats 2 times)
    @ ./broadcast.jl:955
 [11] preprocess
    @ ./broadcast.jl:952
 [12] override_bc_copyto!
    @ ~/.julia/packages/Enzyme/VsOlr/src/compiler/interpreter.jl:798
 [13] copyto!
    @ ./broadcast.jl:925
 [14] copy
    @ ./broadcast.jl:897
 [15] materialize
    @ ./broadcast.jl:872
 [16] f
    @ ./REPL[43]:1

Stacktrace:
  [1] unalias
    @ ./abstractarray.jl:1500 [inlined]
  [2] broadcast_unalias
    @ ./broadcast.jl:946 [inlined]
  [3] preprocess
    @ ./broadcast.jl:953 [inlined]
  [4] preprocess_args
    @ ./broadcast.jl:956 [inlined]
  [5] preprocess
    @ ./broadcast.jl:952 [inlined]
  [6] preprocess_args (repeats 2 times)
    @ ./broadcast.jl:955 [inlined]
  [7] preprocess
    @ ./broadcast.jl:952 [inlined]
  [8] override_bc_copyto!
    @ ~/.julia/packages/Enzyme/VsOlr/src/compiler/interpreter.jl:798 [inlined]
  [9] copyto!
    @ ./broadcast.jl:925 [inlined]
 [10] copy
    @ ./broadcast.jl:897 [inlined]
 [11] materialize
    @ ./broadcast.jl:872 [inlined]
 [12] f
    @ ./REPL[43]:1 [inlined]
 [13] fwddiffe3julia_f_75521wrap
    @ ./REPL[43]:0
 [14] macro expansion
    @ ~/.julia/packages/Enzyme/VsOlr/src/compiler.jl:5516 [inlined]
 [15] enzyme_call
    @ ~/.julia/packages/Enzyme/VsOlr/src/compiler.jl:5050 [inlined]
 [16] ForwardModeThunk
    @ ~/.julia/packages/Enzyme/VsOlr/src/compiler.jl:4941 [inlined]
 [17] autodiff
    @ ~/.julia/packages/Enzyme/VsOlr/src/Enzyme.jl:669 [inlined]
 [18] autodiff
    @ ~/.julia/packages/Enzyme/VsOlr/src/Enzyme.jl:538 [inlined]
 [19] macro expansion
    @ ~/.julia/packages/Enzyme/VsOlr/src/sugar.jl:680 [inlined]
 [20] gradient(fm::ForwardMode{false, FFIABI, false, false, false}, f::typeof(f), x::Vector{Float64}, args::Vector{Float64}; chunk::Nothing, shadows::Tuple{Tuple{Vector{…}, Vector{…}, Vector{…}}, Tuple{Vector{…}, Vector{…}, Vector{…}}})
    @ Enzyme ~/.julia/packages/Enzyme/VsOlr/src/sugar.jl:582
 [21] gradient(fm::ForwardMode{false, FFIABI, false, false, false}, f::typeof(f), x::Vector{Float64}, args::Vector{Float64})
    @ Enzyme ~/.julia/packages/Enzyme/VsOlr/src/sugar.jl:582
 [22] top-level scope
    @ REPL[44]:1
Some type information was truncated. Use `show(err)` to see complete types.
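For reference, the workaround (b) suggested by the error message would look roughly like the following. This is an untested sketch; whether it actually avoids the error here is exactly what's at issue, since the broadcast path is what fails.

```julia
using Enzyme

f(x, y) = sum(sin.(x) .^ 2 + cos.(y) .^ 2)

# Workaround (b) from the error message: wrap the mode with
# `set_runtime_activity` so activity is tracked at runtime instead of
# being proven statically. This keeps correctness but may cost a little
# performance.
gradient(set_runtime_activity(Forward), f, rand(3), rand(3))
```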

A similar (but not identical) error occurs with FixedSizeArrays, minus all the warnings:

julia> using Enzyme, FixedSizeArrays

julia> gradient(Forward, f, FixedSizeVector(rand(3)), FixedSizeVector(rand(3)))
ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.
Mismatched activity for:   %89 = phi {} addrspace(10)** [ %.pre169, %L324.i ], [ %155, %guard_exit72.i ] const val:   %155 = load {} addrspace(10)**, {} addrspace(10)** addrspace(11)* %154, align 8, !dbg !173, !tbaa !183, !alias.scope !186, !noalias !187, !nonnull !0
Type tree: {[-1]:Pointer, [-1,-1]:Float@double}
 llvalue=  %.innerparm1 = alloca { {} addrspace(10)*, [1 x i64] }, align 8, !dbg !69, !enzyme_type !62, !enzyme_inactive !0

There are no problems when defining an equivalent function with an explicit `for` loop instead of broadcasting:

julia> function g(x::AbstractVector{T}, y::AbstractVector{T}) where {T}
           sum = zero(float(T))
           for idx in eachindex(x, y)
               sum += sin(x[idx]) ^ 2 + cos(y[idx]) ^ 2
           end
           return sum
       end
g (generic function with 1 method)

julia> gradient(Forward, g, FixedSizeVector(rand(3)), FixedSizeVector(rand(3)))
([0.378558649401388, 0.9670680072027527, 0.8480688358728419], [-0.6078135603115832, -0.8777293867274361, -0.9998971762837213])

julia> gradient(Forward, g, rand(3), rand(3))
([0.4865266254439435, 0.3500116590634633, 0.39677804546057494], [-0.8026804953064194, -0.9809016723312391, -0.9202634946202176])
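Besides the explicit loop, a generator-based variant stays close to the original one-liner while sidestepping the broadcast `copyto!` machinery that appears in the stacktrace. Untested sketch (`h` is a hypothetical name, not from the original report):

```julia
# Summing a generator avoids materializing the broadcast temporary
# (`materialize`/`copyto!` in the stacktrace above) entirely.
h(x, y) = sum(sin(x[i])^2 + cos(y[i])^2 for i in eachindex(x, y))

gradient(Forward, h, rand(3), rand(3))
```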
