This repository has been archived by the owner on Sep 16, 2024. It is now read-only.
Hi, I wanted to ask about your dataloader for the NYUD dataset.
In my ToTensor() function I read the data, convert it to float, normalize it, and convert it to a tensor.
Is this correct? If possible, could you please share the dataloaders used in this paper?
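For reference, here is a minimal sketch of the kind of ToTensor-style transform described above. This is not the repository's actual implementation: the sample keys (`image`, `depth`, `segm`) and the ImageNet mean/std constants are assumptions, and the sketch uses plain NumPy so the channel-first layout PyTorch expects is explicit.

```python
import numpy as np

# Assumed ImageNet statistics; swap in the dataset's own values if they differ.
IMG_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMG_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def to_tensor(sample):
    """Scale the image to [0, 1], normalize, and move channels first (CHW).

    Assumes `sample["image"]` is HxWx3 uint8, `sample["depth"]` is HxW,
    and `sample["segm"]` is HxW class indices. These key names are
    illustrative, not taken from the repo.
    """
    image = sample["image"].astype(np.float32) / 255.0
    image = (image - IMG_MEAN) / IMG_STD      # broadcasts over the channel axis
    image = image.transpose(2, 0, 1)          # HWC -> CHW, as PyTorch expects
    depth = sample["depth"].astype(np.float32)
    segm = sample["segm"].astype(np.int64)    # class indices stay integral
    return {"image": image, "depth": depth, "segm": segm}
```

Targets (depth, segmentation masks) are deliberately not normalized here; only the input image is.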
Hi, thanks for the reply.
I referred to the DenseTorch repo and tried running the MTL model with depth and segmentation; it actually works, and after 600 epochs the loss and mean IoU became stable with no further improvement, as shown below. (I used your NYUD dataset.)
Issue 2:
If I want to use the same encoder and decoder as in the multi-task model (MobileNet, Light-Weight RefineNet) for training individual heads, for example training only depth, what exactly do I need to change?
And what parameters should I pass for depth if I use MobileNet and RefineNet instead of Xception-65 and DeepLab-v3+, as below (multi-task on the left, single-task on the right)?
If you want to change the number of tasks, you need to adapt masks_names, criterions, loss_coeffs, num_classes, and metrics accordingly. You can compare the configs of the single-task and multi-task training examples to confirm that.
For exact hyperparameters, it is better to refer to the original paper(s); if I recall correctly, the DenseTorch default values differ from those in the paper.
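To make the "adapt every per-task parameter in lockstep" point concrete, here is a hypothetical sketch. The parameter names follow the ones listed in this thread, but the values and the dict layout are placeholders, not DenseTorch's actual config format or the paper's settings.

```python
# Hypothetical multi-task config: one entry per task, in the same order
# everywhere (segm first, depth second). Values are illustrative only.
multi_task = {
    "masks_names": ("segm", "depth"),
    "criterions": ("cross_entropy", "inv_huber"),
    "loss_coeffs": (0.5, 0.5),
    "num_classes": (40, 1),
    "metrics": ("mean_iou", "rmse"),
}

# Single-task depth: keep only the depth entry (index 1) of every
# per-task tuple, so all five parameters stay consistent with each other.
single_depth = {key: (values[1],) for key, values in multi_task.items()}
```

The point is that dropping a task means trimming the same position from all five tuples; changing only some of them would leave the heads, losses, and metrics misaligned.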