DynamicPPL 0.36 #2535

Open: penelopeysm wants to merge 7 commits into main from py/dppl-0.36

Conversation

penelopeysm (Member) commented Apr 27, 2025

This PR adds compatibility with DynamicPPL 0.36.

Given that there are some reasonably significant breaking changes in the way that submodels behave, I've decided to increment the minor version number for Turing.

Because submodels now also prefix VarNames in a 'proper' way, submodel variables now have non-identity lenses. In order to allow Gibbs to sample models with submodels, this PR also fixes #2403 by letting GibbsContext store a tuple of VarNames rather than just Symbols. Performance is compared below.
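
For concreteness, here's a rough sketch (illustrative only, not code from this PR; the model and variable names are made up) of the kind of model this affects: a submodel wrapped in to_submodel, whose inner variable gets a prefixed VarName and hence a non-identity lens, sampled with a Gibbs combination keyed by VarNames.

using Turing, DynamicPPL

@model function inner()
    z ~ Normal()
end

@model function outer()
    # `to_submodel` prefixes the inner variable, so `z` is stored as `a.z`,
    # i.e. a VarName with a non-identity lens.
    a ~ to_submodel(inner())
    y ~ Normal()
end

# With GibbsContext keyed on VarNames rather than Symbols, the prefixed
# submodel variable can be assigned to its own component sampler.
sampler = Gibbs(@varname(a.z) => MH(), @varname(y) => MH())
chain = sample(outer(), sampler, 100)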

There's a nasty lack of reproducibility in HMCDA sampling, which is causing the failures in test/mcmc/Inference. I haven't yet been able to reproduce this locally. However, I would like to deal with it in a separate PR, as the failure isn't introduced by this PR (the last CI run on main was already failing).

This PR supersedes the CompatHelper PRs #2501, #2532, and #2533.

Benchmarks

I'm not 100% convinced by these numbers, because my benchmarks have returned wildly variable results when run multiple times or after restarting my Julia session. Sometimes when I run sample(rng, model, spl, 10000) it finishes in under a second, and sometimes it takes 4 seconds (a rough sketch of this call is included after the benchmark code below). This is after running it multiple times in the same session, so it's not an issue of JIT compilation. I think I would feel a bit more reassured if somebody else could test these benchmarks out.

However, broadly speaking, I think it's safe to say that I don't have any evidence of performance getting worse. Another noteworthy point is that CI for test/mcmc/gibbs (on ubuntu-latest, 1 thread) ran in 1 hr 25 min for this PR, which is the same time the same job took on main. So it seems unlikely that there are any serious performance implications (good or bad).

using AbstractMCMC: step
using DynamicPPL, Turing
using StableRNGs: StableRNG
using Chairmarks

@model function f()
    x ~ Normal()
    y ~ Normal()
end
rng = StableRNG(468)
model = f()
spl = Gibbs(@varname(x) => MH(), @varname(y) => MH())
dspl = DynamicPPL.Sampler(spl)

# This benchmark compares the initial step of MCMC
@be step($rng, $model, $dspl)

# Before: 91.250 μs
# After: 91.208 μs

# This compares the 2nd step onwards
_, st = step(rng, model, dspl)
@be step($rng, $model, $dspl, $st)

# Before: 66.500 μs
# After: 66.417 μs
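
For reference, the end-to-end timing that showed the variability mentioned above was roughly of this form (a sketch reusing the model and spl defined above; the exact invocation I ran may have differed):

# Wall-clock check of a full run; this is the call whose timing fluctuated
# between under a second and around 4 seconds across repeated runs in the
# same session.
@time sample(rng, model, spl, 10_000)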

penelopeysm force-pushed the py/dppl-0.36 branch 2 times, most recently from 2327e65 to c699b8c on April 27, 2025 at 22:26

codecov bot commented Apr 27, 2025

Codecov Report

Attention: Patch coverage is 94.11765% with 1 line in your changes missing coverage. Please review.

Project coverage is 84.27%. Comparing base (8b7c571) to head (697c208).

Files with missing lines      Patch %    Lines
src/mcmc/Inference.jl         0.00%      1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2535      +/-   ##
==========================================
+ Coverage   84.05%   84.27%   +0.21%     
==========================================
  Files          21       21              
  Lines        1455     1456       +1     
==========================================
+ Hits         1223     1227       +4     
+ Misses        232      229       -3     



coveralls commented Apr 27, 2025

Pull Request Test Coverage Report for Build 14816610977

Details

  • 16 of 17 (94.12%) changed or added relevant lines in 2 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.2%) to 84.272%

Changes Missing Coverage      Covered Lines    Changed/Added Lines    %
src/mcmc/Inference.jl         0                1                      0.0%

Totals Coverage Status
  • Change from base Build 14816607950: 0.2%
  • Covered Lines: 1227
  • Relevant Lines: 1456


penelopeysm marked this pull request as draft on April 28, 2025 at 00:51
penelopeysm force-pushed the py/dppl-0.36 branch 2 times, most recently from 8c5f782 to 17721a9 on May 4, 2025 at 01:38
penelopeysm marked this pull request as ready for review on May 4, 2025 at 01:41

Successfully merging this pull request may close these issues:
  • Allow Gibbs sampler to have non-identity lenses for target variables