# GraphAttentionRL

This paper discusses the use of Graph Attention Networks (GATs) in deep Q-networks (DQNs). While network modifications generally aim to enhance performance, that is not the sole goal here: the paper instead focuses on the insights that can be drawn from the parameters (attention weights, in this case) learnt during training. OpenAI Gym's MsPacman-v0 environment is used for interacting with the model. The report can be found here.
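For orientation, the attention weights referred to above are the per-edge coefficients a GAT layer computes when aggregating neighbour features. Below is a minimal NumPy sketch of one GAT forward pass; the toy graph, feature sizes, and random parameters are illustrative assumptions, not taken from this repository.

```python
import numpy as np

# Sketch of a single Graph Attention layer forward pass (illustrative only).
# All shapes, the toy adjacency, and the random weights are assumptions.
rng = np.random.default_rng(0)

N, F_in, F_out = 4, 5, 3             # nodes, input features, output features
H = rng.normal(size=(N, F_in))       # node feature matrix
A = np.array([[1, 1, 0, 0],          # adjacency with self-loops
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]])

W = rng.normal(size=(F_in, F_out))   # shared linear transform
a = rng.normal(size=(2 * F_out,))    # attention vector

Wh = H @ W                           # transformed features, shape (N, F_out)

# Unnormalised attention logits: e_ij = LeakyReLU(a^T [Wh_i || Wh_j])
e = np.full((N, N), -np.inf)         # -inf masks non-edges out of the softmax
for i in range(N):
    for j in range(N):
        if A[i, j]:
            z = np.concatenate([Wh[i], Wh[j]]) @ a
            e[i, j] = z if z > 0 else 0.2 * z   # LeakyReLU, slope 0.2

# Softmax over each node's neighbourhood yields the attention weights alpha_ij
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

h_out = alpha @ Wh                   # attended node representations

print(alpha.round(3))                # each row sums to 1; zeros on non-edges
```

Inspecting `alpha` after training (rather than only the Q-values) is the kind of interpretability analysis the paper pursues: the weights show how strongly each node attends to each neighbour.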