Many thanks to the author for releasing the video action detection source code. Are there plans to provide a demo? I have a few questions about it:

1. Is it end-to-end like YOWOv2 action detection, where only a video clip needs to be fed in? The provided projects/evad/test_net.py seems to additionally require person bounding-box coordinates. In actual use, can this coordinate input be set to None, with the person box annotations used only for evaluating the model's mAP?
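For context, the pattern asked about here is common in action detection evaluation: ground-truth person boxes are only needed when computing mAP, while at demo time boxes can come from a person detector (or from the model itself, if it predicts them). The sketch below is purely illustrative and is not EVAD's actual API; the function name and signature are hypothetical.

```python
# Hypothetical sketch (NOT EVAD's actual interface): illustrates using
# ground-truth person boxes only for mAP evaluation, and falling back to
# detector proposals when no annotations are supplied (demo inference).
from typing import List, Optional, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in pixels

def select_person_boxes(
    gt_boxes: Optional[List[Box]],
    detector_boxes: List[Box],
) -> List[Box]:
    """Return ground-truth boxes when available (evaluation mode);
    otherwise fall back to detector proposals (demo mode)."""
    if gt_boxes is not None:
        return gt_boxes
    return detector_boxes

# Demo mode: annotations are None, so detector output is used.
demo_boxes = select_person_boxes(None, [(10.0, 20.0, 110.0, 220.0)])

# Evaluation mode: ground-truth boxes take precedence for mAP scoring.
eval_boxes = select_person_boxes(
    [(0.0, 0.0, 50.0, 100.0)], [(10.0, 20.0, 110.0, 220.0)]
)
```

Whether test_net.py itself accepts None for the box input depends on the repository's data loader, so the maintainers' confirmation is still needed.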