Questions tagged [roberta]

Roberta is a graphical open-source IDE designed for multiple robot systems, such as the Calliope mini, LEGO Mindstorms, or the NAO. Its main audience is children taking their first steps in programming.


37 questions
23 votes • 2 answers

AutoModelForSequenceClassification requires the PyTorch library but it was not found in your environment

I am trying to use the RoBERTa transformer and a pre-trained model, but I keep getting this error: ImportError: AutoModelForSequenceClassification requires the PyTorch library but it was not found in your environment. Check out the instructions…
Mahmoud Khaled • 395
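
For readers hitting the same error: it usually just means transformers was installed without a deep-learning backend. A minimal sketch of the usual fix, assuming a standard pip environment (the checkpoint name here is illustrative, not the asker's):

    # Install the missing backend first:  pip install torch
    from transformers import AutoModelForSequenceClassification

    # "roberta-base" is a placeholder; any RoBERTa checkpoint loads the same way.
    model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
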
3 votes • 1 answer

Fine-tuning an LM vs. prompt-engineering an LLM

Is it possible to fine-tune a much smaller language model like RoBERTa on, say, a customer-service dataset and get results as good as one might get by prompting GPT-4 with parts of the dataset? Can a fine-tuned RoBERTa model learn to follow…
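
For scale: with the Trainer API, fine-tuning a small classifier is only a few lines. A minimal sketch, where the two-example dataset is a hypothetical stand-in for a real customer-service corpus:

    from datasets import Dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("roberta-base")
    model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

    # Hypothetical stand-in for a labelled customer-service dataset.
    train = Dataset.from_dict({"text": ["I want a refund", "Great support, thanks"],
                               "label": [0, 1]})
    train = train.map(lambda b: tokenizer(b["text"], truncation=True,
                                          padding="max_length", max_length=64),
                      batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="out", num_train_epochs=1,
                               per_device_train_batch_size=2),
        train_dataset=train,
    )
    trainer.train()
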
2 votes • 1 answer

Output tensors of a Functional model must be the output of a TensorFlow `Layer`

So I'm trying to extend the pretrained RoBERTa model, and I was building a basic model for testing, but I'm getting this error from TensorFlow: ValueError: Output tensors of a Functional model must be the output of a TensorFlow Layer, which is from the…
DeadSec • 808
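
Without the asker's code the exact cause is a guess, but this error commonly appears when a non-Keras output (e.g. from a PyTorch model or a plain tensor op) is wired into a functional graph. A sketch of the working pattern, since TFRobertaModel is itself a Keras layer:

    import tensorflow as tf
    from transformers import TFRobertaModel

    roberta = TFRobertaModel.from_pretrained("roberta-base")

    input_ids = tf.keras.Input(shape=(128,), dtype=tf.int32, name="input_ids")
    attention_mask = tf.keras.Input(shape=(128,), dtype=tf.int32, name="attention_mask")

    outputs = roberta(input_ids, attention_mask=attention_mask)
    cls_token = outputs.last_hidden_state[:, 0, :]  # first-token representation
    logits = tf.keras.layers.Dense(2, activation="softmax")(cls_token)

    model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=logits)
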
2 votes • 0 answers

ValueError: not enough values to unpack (expected 2, got 1) with a pretrained RoBERTa model

I used a pretrained RoBERTa model, and this is my RoBERTa model. The pretrained model is https://huggingface.co/rinna/japanese-roberta-base, and I installed sentencepiece (!pip install sentencepiece). class RoBERTaClass(torch.nn.Module): def…
Chloe • 21
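
A common cause of this exact error, assuming the forward pass unpacks two values (e.g. `_, pooled = self.roberta(...)`): recent transformers versions return a ModelOutput object rather than a tuple. A sketch of the attribute-based alternative:

    import torch
    from transformers import AutoModel

    class RoBERTaClass(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.roberta = AutoModel.from_pretrained("rinna/japanese-roberta-base")
            self.classifier = torch.nn.Linear(768, 2)  # hidden size 768 for this base model

        def forward(self, ids, mask):
            output = self.roberta(ids, attention_mask=mask, return_dict=True)
            pooled = output.last_hidden_state[:, 0]  # access by name, not tuple unpacking
            return self.classifier(pooled)
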
2 votes • 1 answer

Error(s) in loading state_dict for RobertaForSequenceClassification

I am using a fine-tuned RoBERTa model, unbiased-toxic-roberta, trained on Jigsaw data: https://huggingface.co/unitary/unbiased-toxic-roberta. It is fine-tuned on 16 classes, and I am writing my code for binary classification: Metrics to calculate…
MAC • 1,345
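
If the mismatch is in the classification head (16 classes in the checkpoint vs. 2 in the new model), the usual remedy is to drop and re-initialise the mismatched weights. A minimal sketch:

    from transformers import AutoModelForSequenceClassification

    model = AutoModelForSequenceClassification.from_pretrained(
        "unitary/unbiased-toxic-roberta",
        num_labels=2,
        ignore_mismatched_sizes=True,  # discard the 16-class head, re-init a 2-class one
    )
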
2 votes • 1 answer

How do you get single embedding vector for each word (token) from RoBERTa?

As you may know, RoBERTa (BERT, etc.) has its own tokenizer, and sometimes you get pieces of a given word as tokens, e.g. embeddings » embed, #dings. Given the nature of the task I am working on, I need a single representation for each word. How do I…
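
One common approach, assuming mean-pooling of sub-word vectors is acceptable for the task: use the fast tokenizer's word_ids() to group pieces back into words.

    import torch
    from transformers import AutoModel, AutoTokenizer

    # RoBERTa's fast tokenizer needs add_prefix_space=True for pre-tokenized input.
    tokenizer = AutoTokenizer.from_pretrained("roberta-base", add_prefix_space=True)
    model = AutoModel.from_pretrained("roberta-base")

    words = ["embeddings", "matter"]
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)

    word_vectors = []
    for idx in range(len(words)):
        piece_positions = [i for i, w in enumerate(enc.word_ids()) if w == idx]
        word_vectors.append(hidden[piece_positions].mean(dim=0))  # one vector per word
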
1 vote • 0 answers

Keras callback function TypeError: unsupported operand type(s)

I'm creating a text-classification model using RoBERTa. I keep encountering this error: TypeError: unsupported operand type(s) for *: 'WarmUp' and 'int' whenever I use either ReduceLROnPlateau or LearningRateScheduler in my callback…
yh01 • 11
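
A plausible cause, assuming the optimizer was built with a WarmUp schedule as its learning rate: ReduceLROnPlateau multiplies the current lr by a numeric factor, which fails when the lr is a schedule object rather than a number. Use one mechanism or the other, e.g.:

    import tensorflow as tf

    # A plain float learning rate lets the callback adjust it numerically.
    optimizer = tf.keras.optimizers.Adam(learning_rate=2e-5)
    callbacks = [tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss",
                                                      factor=0.5, patience=2)]
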
1 vote • 1 answer

RoBERTa transformer for NER gives an index-out-of-range error

I have a function below that tokenizes and aligns my labels, but it is giving me an error: def tokenize_and_align_labels(examples, label_all_tokens=True): tokenized_inputs = tokenizer(examples["tokens"], truncation=True,…
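
For comparison, a sketch of the standard alignment recipe (following the Hugging Face token-classification example), assuming word-level label ids live in examples["ner_tags"]; note that RoBERTa's fast tokenizer also needs add_prefix_space=True for pre-tokenized input:

    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("roberta-base", add_prefix_space=True)

    def tokenize_and_align_labels(examples, label_all_tokens=True):
        tokenized_inputs = tokenizer(examples["tokens"], truncation=True,
                                     is_split_into_words=True)
        labels = []
        for i, word_labels in enumerate(examples["ner_tags"]):
            word_ids = tokenized_inputs.word_ids(batch_index=i)
            previous, label_ids = None, []
            for word_id in word_ids:
                if word_id is None:                          # special tokens
                    label_ids.append(-100)
                elif word_id != previous or label_all_tokens:
                    label_ids.append(word_labels[word_id])   # stays in range of the word list
                else:
                    label_ids.append(-100)
                previous = word_id
            labels.append(label_ids)
        tokenized_inputs["labels"] = labels
        return tokenized_inputs

    # Tiny usage example with hypothetical data:
    tokenize_and_align_labels({"tokens": [["Hello", "world"]], "ner_tags": [[0, 0]]})
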
1 vote • 0 answers

AttributeError: module 'fastai.train' has no attribute 'iloc'

I'm using Python 3.9 and fastai 1.0.58. I tried to implement the code from this source: https://towardsdatascience.com/fastai-with-transformers-bert-roberta-xlnet-xlm-distilbert-4f41ee18ecb2, but I'm getting an error when creating the DataBunch: train =…
1 vote • 0 answers

Node: 'Cast_1' Cast string to float is not supported [[{{node Cast_1}}]] [Op:__inference_train_function_24202] for Roberta

I am getting this error: "Node: 'Cast_1' Cast string to float is not supported [[{{node Cast_1}}]] [Op:__inference_train_function_24202]". I wrote code for IMDB sentiment analysis on 5000 rows of data in…
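
A likely cause, assuming the sentiment labels reach model.fit as strings: TensorFlow cannot cast "positive"/"negative" to float, so they must be mapped to integers first. A minimal sketch with hypothetical label names:

    import numpy as np

    raw_labels = ["positive", "negative", "positive"]  # hypothetical string labels
    label_map = {"negative": 0, "positive": 1}
    y_train = np.array([label_map[label] for label in raw_labels], dtype=np.int32)
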
1 vote • 0 answers

RoBERTa tokenizer issue for certain characters

I am using RobertaTokenizerFast to tokenize some sentences and align them with annotations. I noticed an issue with some characters: from transformers import BatchEncoding, RobertaTokenizerFast from tokenizers import Encoding tokenizer =…
1 vote • 1 answer

Getting an "Input to reshape is a tensor with 3368 values, but the requested shape has 2048" error while fine-tuning RoBERTa

I have a CSV file that has two input columns and one class column with multiple labels, which means I'm trying to do multi-class classification using a fine-tuned RoBERTa model. This is the structure of my CSV file (df): text …
anthino12 • 770
1 vote • 2 answers

Is AllenNLP biased towards BERT?

At my university's research group we have been pre-training a RoBERTa model for Portuguese, as well as a domain-specific one, also based on RoBERTa. We have been conducting a series of benchmarks using Hugging Face's transformers library, and the…
1 vote • 1 answer

PyTorch RAM is not freed after every epoch

I am training a multi-label text classifier using PyTorch with RoBERTa. However, after the 2nd epoch RAM fills up and the kernel crashes. I checked, and RAM is not freed after every epoch. I have 64 GB of RAM and 8 CPU cores. What can the problem be? Here is my…
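
One common culprit, assuming the loop accumulates the loss tensor across steps: keeping `loss` itself keeps its whole computation graph alive. A sketch of the leak-free pattern:

    def train_epoch(model, loader, optimizer):
        total_loss = 0.0
        for batch in loader:
            optimizer.zero_grad()
            loss = model(**batch).loss
            loss.backward()
            optimizer.step()
            total_loss += loss.item()  # .item() detaches; `+= loss` would retain the graph
        return total_loss / len(loader)
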
1 vote • 1 answer

How can I save the model while training in torch?

I am training a RoBERTa model for a new language, and it takes some hours to train. So I think it is a good idea to save the model during training so that I can continue from where it stopped next time. I am using torch…
robel • 109
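
A minimal checkpointing sketch, assuming a plain PyTorch training loop (names are illustrative):

    import torch

    def save_checkpoint(model, optimizer, epoch, path="checkpoint.pt"):
        torch.save({"epoch": epoch,
                    "model_state_dict": model.state_dict(),
                    "optimizer_state_dict": optimizer.state_dict()}, path)

    def load_checkpoint(model, optimizer, path="checkpoint.pt"):
        ckpt = torch.load(path)
        model.load_state_dict(ckpt["model_state_dict"])
        optimizer.load_state_dict(ckpt["optimizer_state_dict"])
        return ckpt["epoch"] + 1  # epoch to resume from
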