Artificial Intelligence: Using BERT via code in Python Language

The goal is to use BERT via code in the Python language.

  • BERT (Bidirectional Encoder Representations from Transformers): It’s a specific model based on the Transformer architecture, but it uses only the Encoder part of the Transformer. It was designed by the Google team in 2018 to understand the context of words in a text bidirectionally (i.e., looking at both what comes before and after a word). It is pre-trained on large amounts of text and then fine-tuned for specific tasks like question answering or sentiment analysis.

  • In summary: BERT is a specialized model based on the Transformer, focused on understanding the meaning of text.

Tools used:

Important point:

  • In my case, I used Google Colab as the platform to write the script, but you can use any platform you prefer.

Walkthrough:

1 - First, install the libraries: transformers, beautifulsoup4, requests, and openai.

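A minimal sketch of what that install step might look like (the original shows it only as a screenshot); in Google Colab the "!" prefix runs a shell command inside a notebook cell:

```python
# Run once in a Colab/Jupyter cell; the "!" prefix executes a shell command.
!pip install transformers beautifulsoup4 requests openai
```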

2 - Next, declare (import) the libraries, as shown in the sketch after this list:

  1. Libraries to request documents from the Web
  2. Libraries to operate with BERT models
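
A minimal sketch of those imports (the original shows them only as a screenshot); the exact set may differ from the author's notebook:

```python
import requests                    # 1. Request documents from the Web
from bs4 import BeautifulSoup      # 1. Parse the HTML of the downloaded page
from transformers import pipeline  # 2. Operate with BERT models via Hugging Face pipelines
```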

3 - Function to prepare our dataset: you only need to pass in a URL, and the function joins the page's visible text, separated by spaces.
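
A hedged sketch of such a function, assuming the name `prepare_dataset` (the author's exact code is only visible in the screenshot); it downloads the page, keeps the visible text, and joins it with spaces:

```python
import requests
from bs4 import BeautifulSoup

def prepare_dataset(url):
    """Download a page and return its visible text joined by spaces."""
    response = requests.get(url)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    # Drop tags whose content is not visible text.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    # get_text with a space separator joins the visible text fragments.
    return soup.get_text(separator=" ", strip=True)

# Example: build the dataset from the Wikipedia article mentioned later in this README.
dataset = prepare_dataset("https://pt.wikipedia.org/wiki/Microsoft_Office")
```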

Using BERT

1 - Q&A (Question and Answer)

This tool is ideal for situations where you need to answer a question, for example in a chatbot.

In this case, when building the code, include the BERT model:
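
A minimal sketch of loading that model with the `transformers` question-answering pipeline (the author's exact code is in the screenshot):

```python
from transformers import pipeline

# Load a BERT model fine-tuned on SQuAD for question answering.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)
```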

In the case above, the "bert-large-uncased-whole-word-masking-finetuned-squad" model was used; you can use another specific model if you prefer. Where to find models: https://huggingface.co/

Remember that the dataset is about Windows history, so the question needs to be about that topic:
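
A hedged example of asking such a question, assuming the `qa` pipeline and the `dataset` text from the sketches above; the question itself is only illustrative:

```python
# The dataset text is passed as the context; the question must be about its content.
result = qa(question="When was Windows first released?", context=dataset)
print(result["answer"], result["score"])
```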

2 - Summarization

This tool is ideal for situations where you need to condense a large text into a shorter one.

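A minimal sketch of the summarization step (shown only as a screenshot in the original), assuming the `dataset` text from earlier; the model here is the pipeline's default, used only as an illustration, since the screenshot may specify a different one:

```python
from transformers import pipeline

# Default summarization pipeline model, used only as an illustration.
summarizer = pipeline("summarization")

# Summarize the first part of the dataset (pipelines have input length limits).
summary = summarizer(dataset[:1000], max_length=130, min_length=30, do_sample=False)
print(summary[0]["summary_text"])
```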

To reinforce: the summary created here is based on the dataset built from https://pt.wikipedia.org/wiki/Microsoft_Office

Notice that the code above has parameters; how do you know when and which ones to use? On the site https://huggingface.co/, under "Models", select a model, and on the model's page there is an explanation of how to use it, with the necessary code.

3 - Sentiment Analysis

This tool is ideal for opinions and reviews, because it rates text as "Positive", "Neutral", or "Negative".

In the example below, we evaluate a sentence directly: "Microsoft is a fantastic company, which contains wonderful products that make our daily lives easier."
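
A minimal sketch of that evaluation, using the `nlptown/bert-base-multilingual-uncased-sentiment` model described below (the author's exact code is in the screenshot):

```python
from transformers import pipeline

sentiment = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

text = ("Microsoft is a fantastic company, which contains wonderful "
        "products that make our daily lives easier.")

# Returns a star rating, from "1 star" to "5 stars", with its score.
print(sentiment(text))
```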

Result: the model returns a star rating (1 to 5) and its probability.

  • Model used:
    • The "nlptown/bert-base-multilingual-uncased-sentiment" model is a BERT variant trained for multilingual sentiment analysis. It returns the probability of different sentiment levels (from 1 to 5 stars).
      • 1-2 stars: “Negative”.
      • 3 stars: “Neutral”.
      • 4-5 stars: “Positive”.

Example using Python code:
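
A hedged end-to-end sketch tying the steps together (dataset preparation, Q&A, summarization, and sentiment analysis); it assumes the variable names and default models from the sketches above rather than the exact code in the repository:

```python
import requests
from bs4 import BeautifulSoup
from transformers import pipeline

# 1. Build the dataset: the visible text of a Web page, joined by spaces.
page = requests.get("https://pt.wikipedia.org/wiki/Microsoft_Office")
soup = BeautifulSoup(page.text, "html.parser")
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()
dataset = soup.get_text(separator=" ", strip=True)

# 2. Question answering with a BERT model fine-tuned on SQuAD.
qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")
answer = qa(question="What is Microsoft Office?", context=dataset)

# 3. Summarization (default pipeline model, used only as an illustration).
summarizer = pipeline("summarization")
summary = summarizer(dataset[:1000], max_length=130, min_length=30)

# 4. Sentiment analysis with the multilingual BERT model described above.
sentiment = pipeline("sentiment-analysis",
                     model="nlptown/bert-base-multilingual-uncased-sentiment")
opinion = sentiment("Microsoft is a fantastic company, which contains wonderful "
                    "products that make our daily lives easier.")

print(answer["answer"])
print(summary[0]["summary_text"])
print(opinion)
```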
