DynamicPPL 0.36 #2535

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open · wants to merge 7 commits into base: main
20 changes: 20 additions & 0 deletions HISTORY.md
@@ -1,3 +1,23 @@
# Release 0.38.0

DynamicPPL compatibility has been bumped to 0.36.
This brings with it a number of changes: the ones most likely to affect you are submodel prefixing and conditioning.
Variables in submodels are now represented correctly with field accessors.
For example:

```julia
using Turing
@model inner() = x ~ Normal()
@model outer() = a ~ to_submodel(inner())
```

`keys(VarInfo(outer()))` now returns `[@varname(a.x)]` instead of `[@varname(var"a.x")]`.

Furthermore, you can now condition either on the outer model, as in `outer() | (@varname(a.x) => 1.0)`, or on the inner model, as in `inner() | (@varname(x) => 1.0)`.
If you use the conditioned inner model as a submodel, the conditioning will still apply correctly.
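As a sketch of the two options (the models mirror the example above; the conditioning value `1.0` is illustrative):

```julia
using Turing

@model inner() = x ~ Normal()

# Option 1: condition the inner model on its own (un-prefixed) variable...
conditioned_inner = inner() | (@varname(x) => 1.0)

# ...and use it as a submodel. The conditioning still applies: `a.x` is
# fixed to 1.0 when `outer2` is sampled.
@model outer2() = a ~ to_submodel(conditioned_inner)

# Option 2: condition the outer model using the prefixed variable name.
@model outer() = a ~ to_submodel(inner())
conditioned_outer = outer() | (@varname(a.x) => 1.0)
```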

Please see [the DynamicPPL release notes](https://github.com/TuringLang/DynamicPPL.jl/releases/tag/v0.36.0) for full details.

# Release 0.37.1

`maximum_a_posteriori` and `maximum_likelihood` now perform sanity checks on the model before running the optimisation.
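For reference, a minimal sketch of how these functions are typically called (the model and data here are illustrative, not from this PR):

```julia
using Turing

@model function gdemo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

# The model is now sanity-checked before the optimisation starts, so a
# malformed model fails early with an informative error rather than
# partway through the optimisation.
map_estimate = maximum_a_posteriori(gdemo(1.5))
mle_estimate = maximum_likelihood(gdemo(1.5))
```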
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Turing"
uuid = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
version = "0.37.1"
version = "0.38.0"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -62,7 +62,7 @@ Distributions = "0.25.77"
DistributionsAD = "0.6"
DocStringExtensions = "0.8, 0.9"
DynamicHMC = "3.4"
DynamicPPL = "0.35"
DynamicPPL = "0.36"
EllipticalSliceSampling = "0.5, 1, 2"
ForwardDiff = "0.10.3"
Libtask = "0.8.8"
2 changes: 1 addition & 1 deletion docs/src/api.md
@@ -40,7 +40,7 @@ even though [`Prior()`](@ref) is actually defined in the `Turing.Inference` modu
| `@model` | [`DynamicPPL.@model`](@extref) | Define a probabilistic model |
| `@varname` | [`AbstractPPL.@varname`](@extref) | Generate a `VarName` from a Julia expression |
| `to_submodel` | [`DynamicPPL.to_submodel`](@extref) | Define a submodel |
| `prefix` | [`DynamicPPL.prefix`](@extref) | Prefix all variable names in a model with a given symbol |
| `prefix` | [`DynamicPPL.prefix`](@extref) | Prefix all variable names in a model with a given VarName |
| `LogDensityFunction` | [`DynamicPPL.LogDensityFunction`](@extref) | A struct containing all information about how to evaluate a model. Mostly for advanced users |

### Inference
5 changes: 2 additions & 3 deletions src/mcmc/Inference.jl
@@ -4,9 +4,8 @@
using DynamicPPL:
Metadata,
VarInfo,
TypedVarInfo,
# TODO(mhauru) all_varnames_grouped_by_symbol isn't exported by DPPL, because it is only
# implemented for TypedVarInfo. It is used by mh.jl. Either refactor mh.jl to not use it
# implemented for NTVarInfo. It is used by mh.jl. Either refactor mh.jl to not use it
# or implement it for other VarInfo types and export it from DPPL.
all_varnames_grouped_by_symbol,
syms,
@@ -161,7 +160,7 @@
end

# TODO: make a nicer `set_namedtuple!` and move these functions to DynamicPPL.
function DynamicPPL.unflatten(vi::TypedVarInfo, θ::NamedTuple)
function DynamicPPL.unflatten(vi::DynamicPPL.NTVarInfo, θ::NamedTuple)
set_namedtuple!(deepcopy(vi), θ)
return vi
end
85 changes: 61 additions & 24 deletions src/mcmc/gibbs.jl
@@ -21,6 +21,11 @@ isgibbscomponent(spl::ExternalSampler) = isgibbscomponent(spl.sampler)
isgibbscomponent(::AdvancedHMC.HMC) = true
isgibbscomponent(::AdvancedMH.MetropolisHastings) = true

function can_be_wrapped(ctx::DynamicPPL.AbstractContext)
return DynamicPPL.NodeTrait(ctx) isa DynamicPPL.IsLeaf
end
can_be_wrapped(ctx::DynamicPPL.PrefixContext) = can_be_wrapped(ctx.context)

# Basically like a `DynamicPPL.FixedContext` but
# 1. Hijacks the tilde pipeline to fix variables.
# 2. Computes the log-probability of the fixed variables.
@@ -54,8 +59,13 @@ for type stability of `tilde_assume`.
# Fields
$(FIELDS)
"""
struct GibbsContext{VNs,GVI<:Ref{<:AbstractVarInfo},Ctx<:DynamicPPL.AbstractContext} <:
DynamicPPL.AbstractContext
struct GibbsContext{
VNs<:Tuple{Vararg{VarName}},GVI<:Ref{<:AbstractVarInfo},Ctx<:DynamicPPL.AbstractContext
} <: DynamicPPL.AbstractContext
"""
the VarNames being sampled
"""
target_varnames::VNs
"""
a `Ref` to the global `AbstractVarInfo` object that holds values for all variables, both
those fixed and those being sampled. We use a `Ref` because this field may need to be
@@ -67,26 +77,14 @@ struct GibbsContext{VNs,GVI<:Ref{<:AbstractVarInfo},Ctx<:DynamicPPL.AbstractCont
"""
context::Ctx

function GibbsContext{VNs}(global_varinfo, context) where {VNs}
if !(DynamicPPL.NodeTrait(context) isa DynamicPPL.IsLeaf)
error("GibbsContext can only wrap a leaf context, not a $(context).")
end
return new{VNs,typeof(global_varinfo),typeof(context)}(global_varinfo, context)
end

function GibbsContext(target_varnames, global_varinfo, context)
if !(DynamicPPL.NodeTrait(context) isa DynamicPPL.IsLeaf)
if !can_be_wrapped(context)
error("GibbsContext can only wrap a leaf context, not a $(context).")
end
if any(vn -> DynamicPPL.getoptic(vn) != identity, target_varnames)
msg =
"All Gibbs target variables must have identity lenses. " *
"For example, you can't have `@varname(x.a[1])` as a target variable, " *
"only `@varname(x)`."
error(msg)
end
vn_sym = tuple(unique((DynamicPPL.getsym(vn) for vn in target_varnames))...)
return new{vn_sym,typeof(global_varinfo),typeof(context)}(global_varinfo, context)
target_varnames = tuple(target_varnames...) # Allow vectors.
return new{typeof(target_varnames),typeof(global_varinfo),typeof(context)}(
target_varnames, global_varinfo, context
)
end
end

@@ -96,8 +94,10 @@ end

DynamicPPL.NodeTrait(::GibbsContext) = DynamicPPL.IsParent()
DynamicPPL.childcontext(context::GibbsContext) = context.context
function DynamicPPL.setchildcontext(context::GibbsContext{VNs}, childcontext) where {VNs}
return GibbsContext{VNs}(Ref(context.global_varinfo[]), childcontext)
function DynamicPPL.setchildcontext(context::GibbsContext, childcontext)
return GibbsContext(
context.target_varnames, Ref(context.global_varinfo[]), childcontext
)
end

get_global_varinfo(context::GibbsContext) = context.global_varinfo[]
@@ -129,7 +129,9 @@ function get_conditioned_gibbs(context::GibbsContext, vns::AbstractArray{<:VarNa
return map(Base.Fix1(get_conditioned_gibbs, context), vns)
end

is_target_varname(::GibbsContext{VNs}, ::VarName{sym}) where {VNs,sym} = sym in VNs
function is_target_varname(ctx::GibbsContext, vn::VarName)
return any(Base.Fix2(subsumes, vn), ctx.target_varnames)
end

function is_target_varname(context::GibbsContext, vns::AbstractArray{<:VarName})
num_target = count(Iterators.map(Base.Fix1(is_target_varname, context), vns))
@@ -145,6 +147,37 @@ end
# Tilde pipeline
function DynamicPPL.tilde_assume(context::GibbsContext, right, vn, vi)
child_context = DynamicPPL.childcontext(context)

# Note that `child_context` may contain `PrefixContext`s -- in which case
# we need to make sure that vn is appropriately prefixed before we handle
# the `GibbsContext` behaviour below. For example, consider the following:
# @model inner() = x ~ Normal()
# @model outer() = a ~ to_submodel(inner())
# If we run this with `Gibbs(@varname(a.x) => MH())`, then when we are
# executing the submodel, the `context` will contain the `@varname(a.x)`
# variable; `child_context` will contain `PrefixContext(@varname(a))`; and
# `vn` will just be `@varname(x)`. If we just simply run
# `is_target_varname(context, vn)`, it will return false, and everything
# will be messed up.
# TODO(penelopeysm): This 'problem' could be solved if we made GibbsContext a
# leaf context and wrapped the PrefixContext _above_ the GibbsContext, so
# that the prefixing would be handled by tilde_assume(::PrefixContext, ...)
# _before_ we hit this method.
# In the current state of GibbsContext, doing this would require
# special-casing the way PrefixContext is used to wrap the leaf context.
# This is very inconvenient because PrefixContext's behaviour is defined in
# DynamicPPL, and we would basically have to create a new method in Turing
# and override it for GibbsContext. Indeed, a better way to do this would
# be to make GibbsContext a leaf context. In this case, we would be able to
# rely on the existing behaviour of DynamicPPL.make_evaluate_args_and_kwargs
# to correctly wrap the PrefixContext around the GibbsContext. This is very
# tricky to correctly do now, but once we remove the other leaf contexts
# (i.e. PriorContext and LikelihoodContext), we should be able to do this.
# This is already implemented in
# https://github.com/TuringLang/DynamicPPL.jl/pull/885/ but not yet
# released. Exciting!
vn, child_context = DynamicPPL.prefix_and_strip_contexts(child_context, vn)

return if is_target_varname(context, vn)
# Fall back to the default behavior.
DynamicPPL.tilde_assume(child_context, right, vn, vi)
@@ -177,6 +210,8 @@ function DynamicPPL.tilde_assume(
)
# See comment in the above, rng-less version of this method for an explanation.
child_context = DynamicPPL.childcontext(context)
vn, child_context = DynamicPPL.prefix_and_strip_contexts(child_context, vn)

return if is_target_varname(context, vn)
DynamicPPL.tilde_assume(rng, child_context, sampler, right, vn, vi)
elseif has_conditioned_gibbs(context, vn)
@@ -232,9 +267,11 @@ end
wrap_in_sampler(x::AbstractMCMC.AbstractSampler) = x
wrap_in_sampler(x::InferenceAlgorithm) = DynamicPPL.Sampler(x)

to_varname_list(x::Union{VarName,Symbol}) = [VarName(x)]
to_varname(x::VarName) = x
to_varname(x::Symbol) = VarName{x}()
to_varname_list(x::Union{VarName,Symbol}) = [to_varname(x)]
# Any other value is assumed to be an iterable of VarNames and Symbols.
to_varname_list(t) = collect(map(VarName, t))
to_varname_list(t) = collect(map(to_varname, t))

"""
Gibbs
4 changes: 2 additions & 2 deletions test/Project.toml
@@ -40,7 +40,7 @@ TimerOutputs = "a759f4b9-e2f1-59dc-863e-4aeb61b1ea8f"

[compat]
AbstractMCMC = "5"
AbstractPPL = "0.9, 0.10"
AbstractPPL = "0.9, 0.10, 0.11"
AdvancedMH = "0.6, 0.7, 0.8"
AdvancedPS = "=0.6.0"
AdvancedVI = "0.2"
@@ -52,7 +52,7 @@ Combinatorics = "1"
Distributions = "0.25"
DistributionsAD = "0.6.3"
DynamicHMC = "2.1.6, 3.0"
DynamicPPL = "0.35"
DynamicPPL = "0.36"
FiniteDifferences = "0.10.8, 0.11, 0.12"
ForwardDiff = "0.10.12 - 0.10.32, 0.10"
HypothesisTests = "0.11"
55 changes: 42 additions & 13 deletions test/mcmc/gibbs.jl
@@ -160,10 +160,6 @@ end
return Inference.setparams_varinfo!!(model, unwrap_sampler(sampler), state, params)
end

function target_vns(::Inference.GibbsContext{VNs}) where {VNs}
return VNs
end

# targets_and_algs will be a list of tuples, where the first element is the target_vns
# of a component sampler, and the second element is the component sampler itself.
# It is modified by the capture_targets_and_algs function.
@@ -174,7 +170,7 @@ end
return nothing
end
if context isa Inference.GibbsContext
push!(targets_and_algs, (target_vns(context), sampler))
push!(targets_and_algs, (context.target_varnames, sampler))
end
return capture_targets_and_algs(sampler, DynamicPPL.childcontext(context))
end
@@ -240,14 +236,14 @@ end
chain = sample(test_model(-1), sampler, 2)

expected_targets_and_algs_per_iteration = [
((:s,), mh),
((:s, :m), mh),
((:m,), pg),
((:xs,), hmc),
((:ys,), nuts),
((:ys,), nuts),
((:xs, :ys), hmc),
((:s,), mh),
((@varname(s),), mh),
((@varname(s), @varname(m)), mh),
((@varname(m),), pg),
((@varname(xs),), hmc),
((@varname(ys),), nuts),
((@varname(ys),), nuts),
((@varname(xs), @varname(ys)), hmc),
((@varname(s),), mh),
]
@test targets_and_algs == vcat(
expected_targets_and_algs_per_iteration, expected_targets_and_algs_per_iteration
@@ -727,6 +723,39 @@ end
end
end

@testset "non-identity varnames" begin
struct Wrap{T}
a::T
end
@model function model1(::Type{T}=Float64) where {T}
x = Vector{T}(undef, 1)
x[1] ~ Normal()
y = Wrap{T}(0.0)
return y.a ~ Normal()
end
model = model1()
spl = Gibbs(@varname(x[1]) => HMC(0.5, 10), @varname(y.a) => MH())
@test sample(model, spl, 10) isa MCMCChains.Chains
spl = Gibbs((@varname(x[1]), @varname(y.a)) => HMC(0.5, 10))
@test sample(model, spl, 10) isa MCMCChains.Chains
end

@testset "submodels" begin
@model inner() = x ~ Normal()
@model function outer()
a ~ to_submodel(inner())
_ignored ~ to_submodel(prefix(inner(), @varname(b)), false)
return _also_ignored ~ to_submodel(inner(), false)
end
model = outer()
spl = Gibbs(
@varname(a.x) => HMC(0.5, 10), @varname(b.x) => MH(), @varname(x) => MH()
)
@test sample(model, spl, 10) isa MCMCChains.Chains
spl = Gibbs((@varname(a.x), @varname(b.x), @varname(x)) => MH())
@test sample(model, spl, 10) isa MCMCChains.Chains
end

@testset "CSMC + ESS" begin
rng = Random.default_rng()
model = MoGtest_default
10 changes: 3 additions & 7 deletions test/optimisation/Optimisation.jl
@@ -71,13 +71,9 @@ using Turing
end

@testset "With prefixes" begin
function prefix_μ(model)
return DynamicPPL.contextualize(
model, DynamicPPL.PrefixContext{:inner}(model.context)
)
end
m1 = prefix_μ(model1(x))
m2 = prefix_μ(model2() | (var"inner.x"=x,))
vn = @varname(inner)
m1 = prefix(model1(x), vn)
m2 = prefix((model2() | (x=x,)), vn)
ctx = Turing.Optimisation.OptimizationContext(DynamicPPL.LikelihoodContext())
@test Turing.Optimisation.OptimLogDensity(m1, ctx)(w) ==
Turing.Optimisation.OptimLogDensity(m2, ctx)(w)