
The loss decreases to 0 quickly during Epoch 2 of training on my dataset #13

Open
unanan opened this issue May 15, 2018 · 1 comment

@unanan

unanan commented May 15, 2018

Hi, I have what may be a silly question.
I am using this repo to train on my own dataset, a 2-class semantic segmentation dataset.
Everything is fine during Epoch 1 (I used the Epoch 1 .pth checkpoint to predict on some pictures, and the results look reasonable; the segmentation performs well in some places).
Things go strange at Epoch 2: the loss drops quickly to 0 within about 100 steps.
I'm confused by this. Could it be because the number of classes in my dataset is too small? How can I fix it?
Thank you in advance.
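
A common reason for a segmentation loss collapsing to ~0 with only 2 classes is that the ground-truth masks effectively contain a single class (e.g. everything labeled background), so the network learns a trivial constant prediction. A minimal sanity check is sketched below; it assumes a PyTorch-style dataset that yields (image, mask) pairs, and the dataset/variable names are placeholders rather than this repo's API.

```python
# Minimal sanity check: count how often each class index appears in the
# ground-truth masks. Assumes the dataset yields (image, mask) pairs where
# the mask is a PIL image, numpy array, or CPU tensor of class indices.
from collections import Counter

import numpy as np


def check_label_distribution(dataset, max_samples=50):
    """Return a Counter mapping class index -> pixel count over a few samples."""
    counts = Counter()
    for i, (_, mask) in enumerate(dataset):
        if i >= max_samples:
            break
        mask = np.asarray(mask)
        values, freqs = np.unique(mask, return_counts=True)
        for v, f in zip(values.tolist(), freqs.tolist()):
            counts[int(v)] += int(f)
    return counts


# Example usage (train_dataset is a placeholder for your dataset object):
# print(check_label_distribution(train_dataset))
# For a 2-class task you would expect exactly the keys 0 and 1; seeing only
# one key (e.g. everything 0) means the model can drive the loss to ~0 by
# predicting that single class everywhere.
```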

@nader93k

It may be too late to help you, but if you could upload your code and a description of your dataset (number of training images, number of validation images, crop size, actual image resolutions, and a few sample images), maybe we can trace your problem.
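
For gathering the dataset summary requested above, a small script like the following can help; the directory paths and the `*.png` pattern are placeholders and should be adjusted to the actual layout.

```python
# Print image count and the set of resolutions found in one dataset split.
from pathlib import Path

from PIL import Image


def summarize_split(image_dir, pattern="*.png"):
    """Report how many images a split contains and which resolutions occur."""
    paths = sorted(Path(image_dir).glob(pattern))
    sizes = {Image.open(p).size for p in paths}
    print(f"{image_dir}: {len(paths)} images, resolutions (w, h): {sizes}")


# Example usage (paths are placeholders):
# summarize_split("data/train/images")
# summarize_split("data/val/images")
```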
