Jon-Ingvar-Skanoy/hex
Overview

This project explores the application of the Graph Tsetlin Machine (GTM) to classifying the winner of a game of Hex, a deterministic two-player board game played on boards of various sizes. It evaluates the GTM's performance across multiple board sizes on completed boards, boards 2 moves before completion, and boards 5 moves before completion. To find the best performance, multiple data representations, hyperparameter sets, and dataset-generation techniques are evaluated.

Acknowledgements

This project was completed by Group TseHex:

  • Jon Andreas Bull Larsen
  • Jon Ingvar Skånøy
  • Isak Killingrød

Key Objectives:

  1. Achieve 100% classification accuracy on completed boards for as large a board size as feasible.
  2. Achieve high classification accuracy when classifying incomplete boards.
  3. Analyze the effects of different graph representations of the board on model performance.
  4. Analyze the impact of the hyperparameters used in both model training and dataset generation.

Dataset

Generated by simulating random play:

  • The starting player is randomized, to prevent the model from simply counting pieces and thereby achieving 100% accuracy without understanding the game
  • Games that were not played well by the random players are filtered out
  • Games completed in fewer moves than a given threshold are considered well played
  • Incomplete boards were generated by playing until the game is complete, then undoing the final 2 or 5 moves
  • Each dataset was generated for a single board size and contains exclusively completed boards, boards 2 moves prior to completion, or boards 5 moves prior to completion
  • For each combination of board size and final board state, datasets were generated with multiple move count thresholds (a sketch of the procedure follows this list)
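
The sketch below illustrates this generation procedure under stated assumptions: uniformly random move selection, a breadth-first winner check, and a simple move-count filter. The function names and parameters are illustrative, not the repository's actual code.

```python
# Illustrative sketch of the dataset generation described above (assumptions:
# uniform random play, BFS winner check, move-count filtering, move undoing).
import random
from collections import deque

NEIGHBOURS = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]

def winner(board, n):
    """Return 1 if player 1 connects top to bottom, 2 if player 2 connects
    left to right, or 0 if the game is undecided."""
    for player, starts, reached_goal in (
        (1, [(0, c) for c in range(n)], lambda r, c: r == n - 1),
        (2, [(r, 0) for r in range(n)], lambda r, c: c == n - 1),
    ):
        frontier = deque((r, c) for r, c in starts if board[r][c] == player)
        seen = set(frontier)
        while frontier:
            r, c = frontier.popleft()
            if reached_goal(r, c):
                return player
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if 0 <= nr < n and 0 <= nc < n and (nr, nc) not in seen \
                        and board[nr][nc] == player:
                    seen.add((nr, nc))
                    frontier.append((nr, nc))
    return 0

def random_game(n, rng):
    """Play uniformly random moves, starting with a random player, until one
    side wins. Returns the final board, the winner, and the move sequence."""
    board = [[0] * n for _ in range(n)]
    cells = [(r, c) for r in range(n) for c in range(n)]
    rng.shuffle(cells)
    player = rng.choice((1, 2))            # random starting player
    moves = []
    for r, c in cells:
        board[r][c] = player
        moves.append((r, c))
        w = winner(board, n)
        if w:
            return board, w, moves
        player = 3 - player
    return board, 0, moves                 # unreachable: Hex cannot end in a draw

def generate_dataset(n, games, max_moves, undo=0, seed=0):
    """Keep only games decided within max_moves (the 'well-played' filter),
    then undo the final `undo` moves (2 or 5 in the report) to obtain
    incomplete boards labelled with the eventual winner."""
    rng = random.Random(seed)
    data = []
    while len(data) < games:
        board, win, moves = random_game(n, rng)
        if win and len(moves) <= max_moves:
            for r, c in moves[len(moves) - undo:]:
                board[r][c] = 0
            data.append((board, win))
    return data
```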

Methodology

Board representations:

  • Boards were encoded into a set of features to be compatible with the GTM (a sketch of one possible encoding follows this list).
  • The best-performing set of features was tested, and the model was also trained on datasets with one feature removed at a time to estimate the impact of individual features.
  • All tests were performed on both complete and incomplete boards of multiple sizes to estimate how the effects of the features differ across board types.
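
As a rough illustration of such an encoding, the sketch below turns a board into graph nodes with symbolic properties plus neighbour edges, which is the general form of input a Graph Tsetlin Machine operates on. The specific symbols used here (piece colour and border contact) are assumptions for illustration, not the exact feature set evaluated in the report.

```python
# Library-agnostic sketch: one node per cell, symbolic node properties, and
# edges between hexagonally adjacent cells. Symbol names are illustrative.
from typing import Dict, List, Tuple

NEIGHBOURS = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]

def board_to_graph(board: List[List[int]]) -> Tuple[Dict[str, List[str]], List[Tuple[str, str]]]:
    """Return (node_symbols, edges) for one board. Nodes are named "r,c"."""
    n = len(board)
    node_symbols: Dict[str, List[str]] = {}
    edges: List[Tuple[str, str]] = []
    for r in range(n):
        for c in range(n):
            name = f"{r},{c}"
            # cell occupancy: 0 = empty, 1 = player 1, 2 = player 2
            props = [("EMPTY", "P1", "P2")[board[r][c]]]
            # border-contact symbols for the sides each player must connect
            if r == 0:
                props.append("P1_TOP")
            if r == n - 1:
                props.append("P1_BOTTOM")
            if c == 0:
                props.append("P2_LEFT")
            if c == n - 1:
                props.append("P2_RIGHT")
            node_symbols[name] = props
            # add each undirected neighbour edge exactly once
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if 0 <= nr < n and 0 <= nc < n and (nr, nc) > (r, c):
                    edges.append((name, f"{nr},{nc}"))
    return node_symbols, edges
```

These node symbols and edges can then be loaded into whichever graph interface the GTM implementation provides; removing one symbol at a time from `props` corresponds to the feature-ablation experiments described above.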

Hyperparameters:

  • Used Optuna to tune the GTM hyperparameters (a sketch follows this list).
  • Manually tested multiple move count thresholds for dataset generation.
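
The sketch below shows how such an Optuna study might be wired up. The search space (number of clauses, threshold T, specificity s, message-passing depth) and the train_and_evaluate helper are assumptions standing in for the project's actual training pipeline.

```python
# Hedged sketch of Optuna-driven hyperparameter tuning; train_and_evaluate is
# a placeholder for training a Graph Tsetlin Machine and returning accuracy.
import optuna

def train_and_evaluate(clauses: int, T: int, s: float, depth: int) -> float:
    """Placeholder: train the model with these hyperparameters and return
    validation accuracy."""
    raise NotImplementedError

def objective(trial: optuna.Trial) -> float:
    clauses = trial.suggest_int("clauses", 1000, 20000, log=True)
    T = trial.suggest_int("T", 100, 20000, log=True)
    s = trial.suggest_float("s", 1.0, 20.0)
    depth = trial.suggest_int("depth", 1, 5)
    return train_and_evaluate(clauses, T, s, depth)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```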

Results

  1. 100% accuracy on 7x7 completed boards and >99% accuracy on completed boards up to 13x13
  2. Insight into the impact of feature design
  3. A stricter move count threshold during dataset generation led to higher accuracy
  4. High accuracy on incomplete boards

Hardware

  • Shared Server: Tesla V100 GPU, 32GB VRAM
  • Local Machines: NVIDIA RTX 3090, 4070, 3080, 3070 GPUs

Key Findings

  • Well-designed features can significantly improve performance, although a poorly designed feature may give away too much information and prevent the model from learning useful patterns
  • Completed boards with fewer moves played are easier to classify, indicating that minimizing noise beyond the relevant information makes the data easier for the model to classify.
