LiveChess2FEN: a Framework for Classifying Chess Pieces based on CNNs

Automatic digitization of chess games using computer vision is a significant technological challenge. The problem is of great interest to tournament organizers and to amateur or professional players who want to broadcast their over-the-board (OTB) games online or analyze them with chess engines. Previous work has shown promising results, but the recognition accuracy and latency of state-of-the-art techniques still need further enhancement to allow practical and affordable deployment. We have investigated how to implement them effectively on an Nvidia Jetson Nano single-board computer. Our first contribution is accelerating the chessboard detection algorithm. Subsequently, we analyzed different Convolutional Neural Networks for chess piece classification and how to map them efficiently onto our embedded platform. Notably, we have implemented a functional framework that automatically digitizes a chess position from an image in less than 1 second, with 92% accuracy when classifying the pieces and 95% when detecting the board.
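The final step of such a pipeline, once the board has been detected and each square classified, is encoding the position as FEN. As a minimal sketch (the helper name and sample position are assumptions, not taken from the paper), the piece-placement field of a FEN string can be built from an 8x8 grid of classifier outputs:

```python
def board_to_fen(board):
    """Encode the piece-placement field of a FEN string.

    board: 8 rows of 8 symbols, rank 8 first; '' marks an empty square.
    Pieces use standard FEN letters (uppercase = white, lowercase = black).
    """
    ranks = []
    for row in board:
        rank, empty = "", 0
        for square in row:
            if square == "":
                empty += 1          # count consecutive empty squares
            else:
                if empty:
                    rank += str(empty)
                    empty = 0
                rank += square
        if empty:
            rank += str(empty)      # flush trailing empties in this rank
        ranks.append(rank)
    return "/".join(ranks)          # ranks are separated by '/'

# Example: the standard starting position.
start = [
    list("rnbqkbnr"),
    list("pppppppp"),
    [""] * 8, [""] * 8, [""] * 8, [""] * 8,
    list("PPPPPPPP"),
    list("RNBQKBNR"),
]
print(board_to_fen(start))
# rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR
```

A full FEN record would append side to move, castling rights, and move counters, which require game context beyond a single image.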

D. Mallasén Quintana, A. A. Del Barrio García, and M. Prieto Matías, “LiveChess2FEN: a Framework for Classifying Chess Pieces based on CNNs,” arXiv:2012.06858 [cs], Dec. 2020.
@article{mallasen2020livechess2fen,
  title = {{{LiveChess2FEN}}: A {{Framework}} for {{Classifying Chess Pieces}} Based on {{CNNs}}},
  shorttitle = {{{LiveChess2FEN}}},
  author = {Mallas{\'e}n Quintana, David and Del Barrio Garc{\'i}a, Alberto Antonio and Prieto Mat{\'i}as, Manuel},
  year = {2020},
  month = dec,
  journal = {arXiv:2012.06858 [cs]},
  eprint = {2012.06858},
  eprinttype = {arxiv},
  primaryclass = {cs},
  archiveprefix = {arXiv}
}