Questions tagged [text-generation]
31 questions
35 votes · 3 answers
Can BERT be used for sentence generating tasks?
I am a new learner in NLP and I am interested in the sentence-generating task. As far as I know, one state-of-the-art method is CharRNN, which uses an RNN to generate a sequence of words. However, BERT came out several weeks ago and is…
by ch271828n
11 votes · 3 answers
Why are LLMs able to reproduce bodies of known text exactly?
Mathematically, I wouldn't expect LLMs to be able to reproduce source texts exactly unless the source text was the probable outcome given some prompt. However, I have now tested HuggingFaceH4/zephyr-7b-beta, TheBloke/Llama-2-7B-Chat-GGUF, and…
by Grant Curell
10 votes · 1 answer
How do I use GPT-2 to summarise text?
Section 3.6 of the OpenAI GPT-2 paper mentions summarising text, but the method is described only in very high-level terms:
To induce summarization behavior we add the text TL;DR: after the article and generate 100 tokens…
by Tom Hale
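The TL;DR: trick quoted in this question can be sketched with the Hugging Face transformers library. The library choice, the public `gpt2` checkpoint name, and the helper function names below are assumptions for illustration, not part of the question:

```python
def build_tldr_prompt(article: str) -> str:
    # The paper's trick: append "TL;DR:" after the article so the model
    # continues with a summary-like completion.
    return article.strip() + "\nTL;DR:"


def summarise(article: str, max_new_tokens: int = 100) -> str:
    # Heavy imports kept local so the prompt helper has no dependencies.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = build_tldr_prompt(article)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,  # the paper generates 100 tokens
        do_sample=True,
        top_k=2,  # the paper samples with top-k = 2
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(output[0], skip_special_tokens=True)
    # Return only the continuation after the prompt.
    return text[len(prompt):].strip()
```

Note this is zero-shot prompting, not fine-tuning: the model is never trained on summaries, it is merely nudged toward them by the TL;DR: cue.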
9 votes · 1 answer
How much of the ChatGPT output is copied from its training set (vs. being abstractively generated)?
One of the main concerns of using ChatGPT answers on Stack Exchange is that it may copy verbatim or almost verbatim some text from its training set, which may infringe the source text's license. This makes me wonder how much of the ChatGPT output is…
by Franck Dernoncourt
4 votes · 3 answers
How can GPT-3 be used for designing electronic circuits from text descriptions?
I was wondering if it is possible to use GPT-3 to translate a text description of a circuit into any circuit-design language, which in turn can be used to make the circuit. If it is possible, what approach would you suggest?
by Aether
3 votes · 0 answers
T5 or BERT for a sentence correction/generation task?
I have sentences with some grammatical errors, no punctuation, and digits written out in words, something like the example below:
As you can observe, the proper noun 'winston' isn't capitalized in the Sample column, 'People' is spelled wrong, and…
by Varun kadekar
3 votes · 1 answer
Is there a complement to GPT/2/3 that can be trained using supervised learning methods?
This is a bit of a soft question; I'm not sure if it's on topic, so please let me know how I can improve it if it doesn't meet the criteria for the site. GPT models are unsupervised in nature and are (from my understanding) given a prompt and then they…
by user6916458
3 votes · 2 answers
How to use an LSTM to generate a paragraph
An LSTM model can be trained to generate text sequences from a first word. After feeding the first word, the model will generate a sequence of words (a sentence): feed the first word to get the second word, feed the first word + the second…
by Dan D
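The feed-back loop this question describes (each generated word is appended to the input for the next step) can be sketched with a toy stand-in for the trained model. The bigram table below is invented purely for illustration; a real system would call a trained LSTM at each step:

```python
# Toy next-word table standing in for a trained LSTM (invented data).
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}


def generate(first_word: str, length: int) -> list[str]:
    """Autoregressive loop: feed the sequence so far, append the prediction."""
    words = [first_word]
    for _ in range(length - 1):
        # An LSTM would condition on the whole sequence via its hidden
        # state; the toy table only looks at the last word.
        nxt = BIGRAMS.get(words[-1])
        if nxt is None:  # stop when the model has no continuation
            break
        words.append(nxt)
    return words
```

For example, `generate("the", 5)` walks the table four steps and returns `["the", "cat", "sat", "on", "the"]`. To generate a whole paragraph rather than a sentence, the same loop simply continues past sentence-ending tokens until a paragraph-length budget or an end-of-paragraph token is reached.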
2 votes · 1 answer
How can websites like this one generate names based on prompts?
I just found this site that provides AI-powered name generators. They can generate names based on prompts that can be either keywords or descriptions (which provide more control over the outputs). While I couldn't find any…
by user1678860
2 votes · 1 answer
Clarification on GANs for text generation
A GAN-like architecture for text generation is proposed in 'Generative Adversarial Networks for Text Generation'. The setup is the following: the generator of the GAN is proposed to be a recurrent neural network that is by itself a text…
by Ramiro Hum-Sah
2 votes · 0 answers
How should I design a reward function for an NLP problem where two models interoperate?
I would like to design a reward function. I am training two models: the first model classifies a set of texts (paragraphs and keywords), and from it I also get some hidden states. The second model tries to generate keywords for those paragraphs. I…
by No Na
2 votes · 0 answers
Pretrained Models for Keyword-Based Text Generation
I'm looking for an implementation that allows me to generate text based on a pre-trained model (e.g. GPT-2). An example would be gpt-2-keyword-generation (click here for demo). As the author notes, there is
[...] no explicit mathematical/theoretical…
by Comfort Eagle
2 votes · 0 answers
How to tell if two hotel reviews address the same thing
I am playing with a large dataset of hotel reviews, which contains both positive and negative reviews (the reviews are labeled). I want to use this dataset to perform textual style transfer: given a positive review, output a negative review which…
by Nadav Borenstein
1 vote · 1 answer
Method to generate blindly/exhaustively
Much like image generators can be mined for unique and interesting outputs prompt-free, is there a way to blindly generate high-probability, unique texts? I'm trying to make a movie-script generator that I can use to generate unique stories without…
by user1354917
1 vote · 1 answer
Discrepancies in the Exclusion of Elements in Image vs. Text Generation
(The following comments concern DALLE; I have not tested it with other image-generating tools, but would be curious to hear if the same happens.) When generating images, it seems that ChatGPT (i.e. DALLE) fails to exclude specific elements from an…
by Léreau