Example

Here’s a single example of our workflow using the NN model.

Let’s pull a random entry from our dataset:

text = 'TOOK OFF WITH ONE GENERATOR INOPERATIVE, THE OTHER THEN FAILED, CAUSING INSUFFICIENT POWER TO LOCK LANDING GEAR.'
label = 'AU'
label_encoded = 2

This text description corresponds to label AU, which is encoded as 2.
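
One common way to produce this kind of integer encoding is scikit-learn's LabelEncoder. The sketch below is only an illustration with a placeholder label set; the actual label-to-integer mapping depends on the full set of labels used during training.

from sklearn.preprocessing import LabelEncoder

# Placeholder label set, not the real one used for this dataset
labels = ['AF', 'AG', 'AU', 'AZ']
encoder = LabelEncoder().fit(labels)
encoder.transform(['AU'])  # array([2]) with this placeholder ordering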

First, we tokenize our input:

from transformers import AutoTokenizer

model_path = '../model/'
tokenizer = AutoTokenizer.from_pretrained(model_path)
inputs = tokenizer(text, return_tensors='pt')
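
If you want to inspect what the tokenizer produced, the call above returns a dict-like BatchEncoding of PyTorch tensors (input_ids and attention_mask, plus token_type_ids for some models):

# Peek at the tokenized input
print(inputs.keys())
print(inputs['input_ids'].shape)  # (1, sequence_length)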

Now we pass the inputs to the model:

from transformers import AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained(model_path)
with torch.no_grad():
    logits = model(**inputs).logits

Let’s check our results:

logits.argmax().item()
5
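
To see how confident the model is, we can also turn the logits into class probabilities with a softmax; this is an optional extra step, not part of the original workflow:

# Softmax over the class dimension converts raw logits to probabilities
probs = torch.softmax(logits, dim=-1)
probs.squeeze().tolist()  # one probability per class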

As we can see, the model predicted class 5, which does not match the true label AU (encoded as 2), so it is incorrect on this example.
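
If the fine-tuned model's config stores the label names (this is an assumption; with a default config, id2label only contains generic names like 'LABEL_5'), we can map the predicted index back to a label string:

# id2label maps class indices to the label names stored in the model config
predicted_id = logits.argmax().item()
model.config.id2label[predicted_id]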