Requirement: a GPU (such as an A10 or A100) is needed to run this. trl: https://github.com/huggingface/trl
I reduced max_length from 512 to 128 for the Llama 2 model; with that change it could run for a while on an 8x24GB A10 machine.
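As a minimal sketch of that change (assuming the standard Hugging Face tokenizer API and the 7B Llama 2 checkpoint; the trl example scripts expose the same limit as a `max_length` argument), capping sequences at 128 tokens instead of 512 looks like this:

```python
# Sketch: cap sequence length at 128 tokens so Llama 2 fine-tuning fits on 8x24GB A10 GPUs.
# Assumes the Hugging Face `transformers` tokenizer API; model name is illustrative.
from transformers import AutoTokenizer

MAX_LENGTH = 128  # reduced from the default 512 to lower per-GPU memory use

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

def tokenize(example):
    # Truncate (and pad) every sample to MAX_LENGTH tokens.
    return tokenizer(
        example["text"],
        truncation=True,
        max_length=MAX_LENGTH,
        padding="max_length",
    )
```

Shorter sequences shrink both activation memory and the KV cache during generation, which is why the run fits on 24 GB cards after this change.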