Example

Here's a single example of our workflow using the NN model. Let's pull a random entry from our dataset:

text = 'TOOK OFF WITH ONE GENERATOR INOPERATIVE, THE OTHER THEN FAILED, CAUSING INSUFFICIENT POWER TO LOCK LANDING GEAR.'
label = 'AU'
label_encoded = 2

This text description corresponds with label AU, which is encoded as 2.
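For context, the string label was mapped to an integer earlier in the workflow. A minimal sketch of that encoding step, assuming labels are assigned ids in sorted order; the full label set is not shown here, so the other labels below are placeholders, and only 'AU' -> 2 comes from this example:

```python
# Hypothetical label set; only 'AU' is known from the example above.
labels = ['AD', 'AF', 'AU', 'AZ']

# Assign each class an integer id by sorted order (as e.g. sklearn's
# LabelEncoder would).
label2id = {lab: i for i, lab in enumerate(sorted(labels))}

print(label2id['AU'])  # 2 under this assumed ordering
```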
First, we tokenize our input:
from transformers import AutoTokenizer

model_path = '../model/'
tokenizer = AutoTokenizer.from_pretrained(model_path)
inputs = tokenizer(text, return_tensors='pt')
Now we pass the inputs to the model:
from transformers import AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained(model_path)
with torch.no_grad():
    logits = model(**inputs).logits
Let's check our results:
logits.argmax().item()
5
As we can see, the model predicts class 5, but the true encoded label is 2, so the model is incorrect on this example.
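To see how confident the model was in its wrong answer, the logits could be converted to probabilities with a softmax. A minimal pure-Python sketch with made-up logits (the actual logit values were not printed above); since softmax is monotonic, its argmax matches the argmax of the raw logits:

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, then normalize exponentials.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for illustration; index 5 is the largest,
# matching the argmax result above.
logits = [0.1, -1.2, 0.8, -0.5, 0.3, 2.4]
probs = softmax(logits)
pred = max(range(len(probs)), key=probs.__getitem__)
print(pred)  # 5, the same class argmax picks
```

Inspecting the full probability vector (rather than just the argmax) helps distinguish a confident misclassification from a near-tie between the predicted and true classes.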