Loss stays stable after the first several batches? #18

Open
Pichairen opened this issue Apr 2, 2019 · 1 comment

Pichairen commented Apr 2, 2019

The loss never goes down after many batches; even after many epochs it is still between 0.1 and 0.2. Any help?

Pichairen changed the title from "loss keep stable af" to "loss keep stable after first several batch?" on Apr 2, 2019
hengchuan (Owner) commented

It may make sense to decay the learning rate after some epochs.
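
For reference, a minimal sketch of what a step-wise decay schedule could look like in TensorFlow 1.x (the API current at the time of this issue). The base rate, decay factor, and batch counts are assumptions for illustration, not values taken from this repo:

```python
import tensorflow as tf  # TensorFlow 1.x API

# Hypothetical values; tune these for your own setup.
base_lr = 1e-4
batches_per_epoch = 1000   # depends on dataset size and batch size
decay_every_epochs = 20    # decay the rate every N epochs

# Stand-in for the model's training loss so the snippet runs on its own.
w = tf.Variable(1.0)
loss = tf.square(w - 3.0)

global_step = tf.train.get_or_create_global_step()

# Halve the learning rate every `decay_every_epochs` epochs
# (staircase schedule: the rate drops in discrete steps).
learning_rate = tf.train.exponential_decay(
    base_lr,
    global_step,
    decay_steps=decay_every_epochs * batches_per_epoch,
    decay_rate=0.5,
    staircase=True)

train_op = tf.train.AdamOptimizer(learning_rate).minimize(
    loss, global_step=global_step)
```

With `staircase=True` the rate falls in jumps (1e-4 → 5e-5 → 2.5e-5 → ...) instead of decaying continuously, which matches the "decay after some epochs" suggestion.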
