
[BUG] flatten_tensordicts defaults to False #2497

Closed
3 tasks done
HGGshiwo opened this issue Oct 17, 2024 · 3 comments · Fixed by #2502
Labels: bug (Something isn't working)

Comments


HGGshiwo commented Oct 17, 2024

Describe the bug

flatten_tensordicts defaults to False, while the docstring says the default should be True.





Checklist

  • I have checked that there is no similar issue in the repo (required)
  • I have read the documentation (required)
  • I have provided a minimal working example to reproduce the bug (required)
@HGGshiwo HGGshiwo added the bug Something isn't working label Oct 17, 2024

vmoens commented Oct 17, 2024

Hey, can you provide a bit more detail about this issue? Which annotation are we talking about? Do you mean docstrings? Of which function?
Thanks!

HGGshiwo (Author) replied:

> Hey, can you provide a bit more detail about this issue? Which annotation are we talking about? Do you mean docstrings? Of which function? Thanks!

It is an arg of Trainer.__init__; see https://github.com/pytorch/rl/blob/main/torchrl/trainers/trainers.py#L639. You can see the default is False, but the docstring says True.
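A quick way to confirm this kind of docstring/default mismatch is to read the runtime signature with `inspect.signature`. The snippet below uses a hypothetical stand-in function rather than importing torchrl (so it runs anywhere), but the same call works on the real `Trainer.__init__`:

```python
import inspect

# Hypothetical stand-in for Trainer.__init__, for illustration only:
# the docstring claims the default is True, but the signature says False.
def trainer_init(*, flatten_tensordicts: bool = False):
    """Illustrative only.

    Args:
        flatten_tensordicts (bool): whether to flatten the tensordicts.
            Defaults to True.  # <- stale docstring
    """

# inspect.signature reads the actual runtime default,
# regardless of what the docstring says.
default = inspect.signature(trainer_init).parameters["flatten_tensordicts"].default
print(default)  # False
```

Running the same one-liner against `torchrl.trainers.Trainer.__init__` is what exposes the discrepancy with the docstring.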


vmoens commented Oct 18, 2024

Ah ok, trainer was not mentioned!
