You mentioned in the paper that training with batch_size == 8 on four 2080Ti GPUs takes about three days.
![image](https://private-user-images.githubusercontent.com/69712084/295504501-5631f226-baa1-41de-879a-34e1bcf016ac.png)
![image](https://private-user-images.githubusercontent.com/69712084/295503955-769b6097-d4b8-44d1-855c-661a5af7642f.png)
![image](https://private-user-images.githubusercontent.com/69712084/295504225-48f27a09-db72-462d-91eb-b4c98db5c5a6.png)
But when I set batch_size >= 4 on a V100, I get an error.
The problem is the same as the one in issue #23.
Is there something wrong with my configuration? As far as I know, the 2080Ti has 11 GB of memory, while the V100 I used has 32 GB, so this is puzzling to me.
Looking forward to your answer, thank you!
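
For what it's worth, here is my own back-of-the-envelope arithmetic, assuming the training script splits batch_size evenly across GPUs under DistributedDataParallel (just an assumption on my part, I have not checked how this repo's launcher actually handles it):

```python
# Rough sanity check (assumption: the launcher divides the global batch_size
# across GPUs, as typical PyTorch DDP scripts do -- not verified for this repo).

def per_gpu_batch(total_batch_size: int, num_gpus: int) -> int:
    """Number of samples each GPU holds when the batch is split evenly."""
    return total_batch_size // num_gpus

# Paper setting: batch_size == 8 over four 2080Ti cards (11 GB each)
print(per_gpu_batch(8, 4))  # -> 2 samples per GPU

# My setting: batch_size == 4 on a single V100 (32 GB)
print(per_gpu_batch(4, 1))  # -> 4 samples per GPU, i.e. twice the per-GPU load
```

If the batch really is split that way, my single-V100 run puts more samples on one card than the paper's four-GPU setup did, so I'm not sure whether comparing total memory alone is the right way to look at it.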