Dataset: COCO Image Captioning 2014
This repository contains the code for a web-based image caption generator, built on the architecture described in the paper Show and Tell: A Neural Image Caption Generator.
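At inference time, a Show and Tell-style model encodes the image with a CNN and then decodes a caption word by word with an LSTM until an end token is produced. The sketch below illustrates that greedy decoding loop conceptually; the `toy_step` function and its vocabulary are stand-ins for the trained decoder, not part of this repository.

```python
# Conceptual sketch of the Show and Tell inference loop: the image is
# encoded into a feature vector, and a decoder emits the caption one
# token at a time until it produces "<end>".

def generate_caption(image_features, decode_step, max_len=20):
    """Greedy decoding: repeatedly pick the most likely next word."""
    caption = ["<start>"]
    state = image_features  # decoder state initialized from image features
    for _ in range(max_len):
        word, state = decode_step(caption[-1], state)
        if word == "<end>":
            break
        caption.append(word)
    return caption[1:]  # drop the "<start>" token

# Toy decoder standing in for the trained LSTM: a fixed word-to-word map.
_toy_transitions = {"<start>": "a", "a": "dog", "dog": "running", "running": "<end>"}

def toy_step(prev_word, state):
    return _toy_transitions[prev_word], state

print(generate_caption(None, toy_step))  # -> ['a', 'dog', 'running']
```

The real model replaces `toy_step` with an LSTM forward pass over learned word embeddings, but the control flow is the same.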
- Clone the repository onto your local machine
git clone git@github.com:iv97n/image-captioning.git
- In the root directory of the project, create a folder called model for storing the pretrained model's .ckpt file
cd image-captioning
mkdir model
- Download the nic.ckpt file containing the model weights and save it into the model directory you just created
- Also in the root directory, create a folder called data for storing the images you would like to pass to the model
mkdir data
- Install the dependencies by running poetry install in the project root directory. This may take a few minutes.
poetry install
- Now you are ready to rock some captions 😎
- Make sure the image you want to caption is within the data folder
- Run the following command, replacing example.jpg with your image's filename
poetry run python main.py --imagepath data/example.jpg
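The only flag shown above is --imagepath; a minimal sketch of how main.py might parse it with argparse is below. Everything except the flag name is an assumption about the script's internals, included only to show the expected invocation shape.

```python
import argparse

# Hypothetical argument parsing for main.py; only the --imagepath flag
# is taken from the command above, the rest is assumed.

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Generate a caption for an image")
    parser.add_argument("--imagepath", required=True,
                        help="Path to the image, e.g. data/example.jpg")
    return parser.parse_args(argv)

args = parse_args(["--imagepath", "data/example.jpg"])
print(args.imagepath)  # -> data/example.jpg
```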