COCO performance #6

Open
zbf1991 opened this issue Jan 3, 2022 · 2 comments

zbf1991 commented Jan 3, 2022

Hi,
I also reproduced the pseudo labels on COCO, but their final mIoU only reaches 39.9%, which is much lower than the number reported in your paper. Could you please share your log file or your final pseudo labels so that I can check what the problem is? My log is below.

I used your code directly, without any changes.

Here are the main results from my log file:

result/cam_RIB_coco
[0.2722552891237194, 0.2786577865531708, 0.2848951651555352, 0.29093880903208663, 0.2967588555960802, 0.3023511215664053, 0.30768588721569845, 0.3127349367603064, 0.3174742008072476, 0.3219033934796489, 0.32601260238173524, 0.3297762164192385, 0.33319474995831166, 0.3362583752094398, 0.3389686334201114, 0.3413257565050376, 0.34331515012800756, 0.3449353523677618, 0.3461883620849829, 0.34709014345965156, 0.34764327026979114, 0.34783290477587275, 0.34766379734497427, 0.34713057606322156, 0.3462663810632868, 0.3450816387680953, 0.3435831083609862, 0.3417649041730949, 0.33964585020688953, 0.33723733585199694, 0.33452463044941305, 0.3315106653971657, 0.3282249115321002, 0.3246616273393496, 0.32083791932265093, 0.3167660542320165, 0.31243464959554595, 0.30784246947610633, 0.3030084580507455, 0.2979410572954709, 0.2926588965608258]
0.34783290477587275

step.eval_sem_seg: Sun Jan 2 12:33:03 2022
total images 82783
0.07819244921909785 0.15218531542532188
0.4476158785550421 0.1573540223429845
{'iou': array([0.76962224, 0.57405399, 0.45372461, 0.42301975, 0.66804454,
0.44788392, 0.68736178, 0.43598339, 0.45119599, 0.34367053,
0.16606077, 0.51578795, 0.37524895, 0.56005069, 0.34876581,
0.35976116, 0.65993155, 0.60979095, 0.58667596, 0.51476726,
0.60375777, 0.76226191, 0.61406926, 0.75800884, 0.65777299,
0.20870498, 0.58212256, 0.12488388, 0.27468946, 0.53710008,
0.57887801, 0.09641619, 0.24943139, 0.26747072, 0.30592797,
0.14781007, 0.12192337, 0.32480459, 0.14675007, 0.17803882,
0.33367101, 0.22747043, 0.29426168, 0.08872426, 0.10570477,
0.05069256, 0.30010995, 0.59472847, 0.46239021, 0.47639645,
0.57624603, 0.39865673, 0.32390225, 0.54283463, 0.61383191,
0.57325726, 0.46447526, 0.25194392, 0.46235557, 0.30174565,
0.5464725 , 0.23060425, 0.49820743, 0.43210952, 0.5080675 ,
0.19334627, 0.26914282, 0.48374467, 0.49687464, 0.43566503,
0.39545811, 0.20909918, 0.24580516, 0.41792358, 0.37384429,
0.23341467, 0.29604958, 0.23709656, 0.6229902 , 0.09117136,
0.21931909]), 'miou': 0.3996546933767093}
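
For reference, the 'iou' array and 'miou' value above are standard per-class intersection-over-union scores computed from a confusion matrix accumulated over all 82783 images. A minimal sketch of that computation (not the repository's exact evaluation code; per_class_iou is an illustrative name):

import numpy as np

def per_class_iou(conf):
    # conf[i, j] counts pixels with ground-truth class i predicted as class j
    # (81 x 81 for COCO here: background plus 80 object classes).
    tp = np.diag(conf).astype(np.float64)   # correctly labelled pixels per class
    fp = conf.sum(axis=0) - tp              # pixels wrongly predicted as the class
    fn = conf.sum(axis=1) - tp              # ground-truth pixels that were missed
    denom = tp + fp + fn
    iou = np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)
    return iou, float(np.nanmean(iou))      # per-class IoU and their mean (mIoU)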

jbeomlee93 (Owner) commented

Hi @zbf1991

We are sorry that our COCO result could not be reproduced on your side. I'll check it and let you know.

But one thing I want to confirm: your initial seed result contains only 41 classes, while COCO has 81 classes. Did you show only part of them, or are those the only outputs you obtained?

Thanks.


zbf1991 commented Jan 8, 2022

@jbeomlee93
The first part of the result is the mIoU at different thresholds (lines 116-117 in run_sample_coco.py), not the per-class result.
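
In other words, each of the 41 numbers in the first list is the pseudo-label mIoU obtained with one candidate threshold, and the single number printed after it (0.34783...) is the maximum of that sweep. A minimal sketch of such a sweep, assuming the threshold is a background score used to turn soft foreground CAMs into hard labels; the helper names and the exact threshold range are illustrative, not the actual code in run_sample_coco.py:

import numpy as np

def cam_to_label(cam, bg_threshold):
    # cam: (num_fg_classes, H, W) foreground scores in [0, 1].
    # A pixel becomes background (class 0) when every foreground score is below the threshold.
    bg = np.full((1,) + cam.shape[1:], bg_threshold, dtype=cam.dtype)
    return np.concatenate([bg, cam], axis=0).argmax(axis=0)

def miou(preds, gts, num_classes=81):
    # Confusion-matrix mIoU, as in the per-class sketch above.
    conf = np.zeros((num_classes, num_classes), dtype=np.int64)
    for p, g in zip(preds, gts):
        idx = g.reshape(-1) * num_classes + p.reshape(-1)
        conf += np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
    tp = np.diag(conf)
    denom = conf.sum(0) + conf.sum(1) - tp
    return float(np.nanmean(np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)))

def sweep_threshold(cams, gts, thresholds=np.linspace(0.20, 0.60, 41)):
    # One mIoU per candidate threshold; report the whole curve and its best value.
    mious = [miou([cam_to_label(c, t) for c in cams], gts) for t in thresholds]
    best = int(np.argmax(mious))
    return mious, float(thresholds[best]), mious[best]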
Thanks a lot!
