Deep Learning for Text with PyTorch
Shubham Jain
Instructor
GPT2LMHeadModel: the GPT-2 model with a language modeling head for next-token generation
GPT2Tokenizer: converts text into the token IDs GPT-2 expects, and decodes IDs back to text
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
seed_text = "Once upon a time"
input_ids = tokenizer.encode(seed_text, return_tensors='pt')
output = model.generate(input_ids, max_length=40, temperature=0.7,
                        do_sample=True,  # temperature only takes effect when sampling
                        no_repeat_ngram_size=2,
                        pad_token_id=tokenizer.eos_token_id)
generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_text)
Generated Text: Once upon a time, the world was a place of great beauty
and great danger. The world of the gods was the place where the great gods were
born, and where they were to live.
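The `temperature=0.7` argument rescales the logits before the softmax: probabilities become `softmax(logits / T)`, so values below 1 sharpen the distribution and make sampling more conservative, while values above 1 flatten it. A minimal sketch with made-up logits (not taken from GPT-2) shows the effect:

```python
import torch

# Made-up next-token logits for a three-token vocabulary (illustrative only)
logits = torch.tensor([2.0, 1.0, 0.5])

sharp = torch.softmax(logits / 0.7, dim=0)  # lower temperature: more peaked
flat = torch.softmax(logits / 1.5, dim=0)   # higher temperature: more uniform

# The most likely token gets more probability mass at the lower temperature
print(sharp[0].item() > flat[0].item())  # True
```

In the limit, a very low temperature approaches greedy decoding, and a very high one approaches uniform random sampling.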
t5-small: Text-to-Text Transfer Transformer
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
input_prompt = "translate English to French: 'Hello, how are you?'"
input_ids = tokenizer.encode(input_prompt, return_tensors="pt")
output = model.generate(input_ids, max_length=100)
generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print("Generated text:", generated_text)
Generated text:
"Bonjour, comment êtes-vous?"
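Under the hood, `generate` with greedy decoding runs an autoregressive loop: score the tokens produced so far, take the argmax as the next token, append it, and stop at end-of-sequence. The sketch below makes that loop explicit with a toy stand-in scoring function instead of a real model (the vocabulary and scoring rule are invented for illustration):

```python
import torch

# Toy vocabulary; index 0 plays the role of the end-of-sequence token
vocab = ["<eos>", "Bonjour", ",", "comment", "êtes", "-", "vous", "?"]

def next_token_logits(ids):
    # Stand-in for a model forward pass: favour the next vocab entry in order,
    # then <eos> once the vocabulary is exhausted (illustrative rule only)
    step = len(ids) + 1
    logits = torch.full((len(vocab),), -1e9)
    logits[step if step < len(vocab) else 0] = 0.0
    return logits

ids = []
while True:
    token = int(torch.argmax(next_token_logits(ids)))  # greedy: pick the argmax
    if token == 0:  # stop at <eos>
        break
    ids.append(token)

print(" ".join(vocab[i] for i in ids))  # Bonjour , comment êtes - vous ?
```

A real model replaces `next_token_logits` with a forward pass over the ids so far; options like `temperature` and `no_repeat_ngram_size` modify how the next token is picked inside this same loop.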