nabarunbaruaAIML/BERT_NER

Introduction

This repository showcases a DistilBERT implementation for Named Entity Recognition (NER) built on PyTorch. We use the distilbert-base-uncased pretrained model from the Hugging Face Transformers library and fine-tune it for token classification, i.e. NER.
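For reference, the sketch below shows how a distilbert-base-uncased model can be set up for token classification with the Transformers library. It is a minimal illustration, not this repository's training code; the CoNLL-style label list and the sample sentence are assumptions.

```python
# Minimal sketch: load distilbert-base-uncased for token classification (NER).
# The label list below is a common CoNLL-style tag set and is an assumption,
# not the label set used by this repository.
import torch
from transformers import DistilBertTokenizerFast, DistilBertForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForTokenClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=len(labels),
)

# Tokenize a sample sentence and predict a tag id per token (the classification
# head is untrained here, so predictions are random until fine-tuning).
inputs = tokenizer("Nabarun works at Google in London", return_tensors="pt")
with torch.no_grad():
    pred_ids = model(**inputs).logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print([(tok, labels[i]) for tok, i in zip(tokens, pred_ids.tolist())])
```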

Since GitHub blocks pushes of files larger than 100 MB (https://help.github.com/en/github/managing-large-files/conditions-for-large-files), please download the weights file from Google Drive (https://drive.google.com/file/d/1MTdgl1qfOEo-TuT39ByrbQsFZQ-y7v3a/view?usp=sharing).
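The weights can also be fetched programmatically. The sketch below uses the gdown package (not part of this repository's instructions; install it with `pip install gdown`); the output filename `model_weights.bin` is an assumption, so rename it to whatever clientApp.py expects.

```python
# Sketch: download the weights file from Google Drive with gdown.
# "model_weights.bin" is an assumed filename, not one confirmed by the repository.
import gdown

file_id = "1MTdgl1qfOEo-TuT39ByrbQsFZQ-y7v3a"
gdown.download(f"https://drive.google.com/uc?id={file_id}", "model_weights.bin", quiet=False)
```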

Output

(Screenshot of sample NER output from the app.)

Run Locally

Clone the project

  git clone https://github.com/nabarunbaruaAIML/BERT_NER.git

Go to the project directory

  cd BERT_NER

Install dependencies

  conda env create -f environment.yml

All dependencies are included in the environment.yml file.

Start the server

  python3 clientApp.py
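If you prefer to run inference without the web app, the sketch below loads the downloaded weights directly. It assumes the Google Drive file is a PyTorch state dict compatible with DistilBertForTokenClassification and that nine labels were used during fine-tuning; both are assumptions, not details confirmed by this repository.

```python
# Sketch: load the fine-tuned weights and tag a sentence.
# Assumes "model_weights.bin" is a state dict saved with torch.save and that the
# classification head was trained with 9 labels (both are assumptions).
import torch
from transformers import DistilBertTokenizerFast, DistilBertForTokenClassification

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=9
)
model.load_state_dict(torch.load("model_weights.bin", map_location="cpu"), strict=False)
model.eval()

inputs = tokenizer("Sundar Pichai lives in California", return_tensors="pt")
with torch.no_grad():
    pred_ids = model(**inputs).logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print(list(zip(tokens, pred_ids)))
```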
