Releases · ThGaskin/NeuralABM
PNAS publication
This release produces the plots shown in the PNAS publication. The code is continuously being improved and reworked, and subsequent releases may not produce the same quantitative results.
What's Changed
- General code improvements by @ThGaskin in #6
- Use Logger; allow plotting network for non-generated HW data by @ThGaskin in #8
- Control training device from the config by @ThGaskin in #9
- Use CPU as default device for HW and SIR by @ThGaskin in #10
- General code improvements by @ThGaskin in #12
- Expand activation function selection syntax by @ThGaskin in #13
- Extend tests by @ThGaskin in #7
- Revert changes to HW/Sample_run cfg by @ThGaskin in #14
- Allow specifying biases on each layer of the neural net by @ThGaskin in #15
- Allow selecting loss function and interaction kernel smoothing from config by @ThGaskin in #16
- Allow loss function selection from config by @ThGaskin in #17
- Update README.md by @ThGaskin in #18
- Allow using PyTorch's default uniform distribution in bias initialisation by @ThGaskin in #20
- Add automatic testing pipeline by @ThGaskin in #22
- Minor improvements to HW and SIR models and graph generation by @ThGaskin in #21
- Tests and improvements to data ops by @ThGaskin in #24
- Add Kuramoto model by @ThGaskin in #23
- Add HarrisWilsonNW model by @ThGaskin in #25
- Correction for 'xr.where' function and update to README by @ThGaskin in #26
- Add alpha parameter to Kuramoto by @ThGaskin in #27
- Add convexity analysis by @ThGaskin in #28
- Minor plot formatting corrections by @ThGaskin in #30
- Remove LaTeX requirement by @ThGaskin in #32
- Improve SIR densities by @ThGaskin in #35
Full Changelog: https://github.com/ThGaskin/NeuralABM/commits/v1.0.0