Questions tagged [natural-language-generation]

For questions about the generation of human-readable text from structured data. This is related to natural-language-processing (NLP), and statistical NLP models can be used in NLG too. However, NLG also applies to templating, report writing and scripted conversations.

See https://en.wikipedia.org/wiki/Natural-language_generation

24 questions

9 votes · 3 answers

Can AI write good jokes yet?

I just watched a recent WIRED video on virtual assistants' performance at telling jokes. The jokes are composed by humans, but I'd like to know if AI has gotten good enough to write some.

4 votes · 1 answer

How can I leverage artificial intelligence and virtual reality to create intelligent automatic story generation?

How can I leverage artificial intelligence and virtual reality to create intelligent automatic story generation? My idea is to come up with a system that empowers the user’s imagination to create and experience stories that engage and immerse…

4 votes · 2 answers

Why do current language models no longer generate texts that are too long or too short?

One of the biggest strengths of ChatGPT is that it generates fitting text with respect to the input query. It usually stays on topic, answers the question completely, and in particular does not start producing gibberish or repeating itself. This behaviour…

4 votes · 1 answer

What puts the "chat" in a system like ChatGPT?

So I read Wolfram's What Is ChatGPT Doing … and Why Does It Work? but it left one really big question in my mind. His summary [if it could be called that!] really emphasizes that the core model is trained to "continue a piece of text that it’s been…

4 votes · 0 answers

How is ChatGPT maintaining context?

It has been suggested in the answer to this earlier question that it is just remembering a certain amount of recent information. The reference used is this post by OpenAI which says that ChatGPT should only be able to maintain a context of around…
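A rough sketch of how chat context is commonly maintained when the underlying model itself is stateless: the application re-sends recent conversation turns with every request and drops the oldest turns once a context-window budget is exceeded. The `MAX_TOKENS` value and `count_tokens` helper below are illustrative stand-ins, not ChatGPT's actual limit or tokenizer.

```python
# Minimal sketch of context handling for a stateless chat model.
# MAX_TOKENS and count_tokens() are illustrative placeholders only.

MAX_TOKENS = 4096

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer (e.g. BPE): one token per word.
    return len(text.split())

def build_prompt(history: list[str], new_message: str) -> str:
    """Concatenate as much recent history as fits, preserving order."""
    kept: list[str] = [new_message]
    budget = MAX_TOKENS - count_tokens(new_message)
    for turn in reversed(history):
        cost = count_tokens(turn)
        if cost > budget:
            break          # older turns fall out of the window
        kept.insert(0, turn)
        budget -= cost
    return "\n".join(kept)

history = ["User: What is NLG?", "Assistant: Generating text from data."]
print(build_prompt(history, "User: Can you give an example?"))
```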

4 votes · 1 answer

Why do language models produce different outputs for the same prompt?

For conventional 'Neural Networks', the weights simply act as a transformation in a high-dimensional space; for a forward pass, the output is always the same since there is no stochastic component in the process. However, in…
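For context, the usual explanation is that the forward pass is indeed deterministic, but the decoder samples from the model's output distribution, often controlled by a temperature parameter. A minimal sketch with made-up logits:

```python
# Minimal sketch: a deterministic forward pass produces the same logits
# every time, but sampling from the resulting distribution does not.
import numpy as np

rng = np.random.default_rng()

def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Sample a token index from softmax(logits / temperature)."""
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

logits = np.array([2.0, 1.5, 0.3, -1.0])   # hypothetical scores for 4 tokens
print([sample_next_token(logits, temperature=0.8) for _ in range(5)])
# As temperature approaches 0 this reduces to argmax and becomes deterministic.
```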

4 votes · 1 answer

How are certain machine learning models able to produce variable-length outputs given variable-length inputs?

Most machine learning models, such as multilayer perceptrons, require a fixed-length input and output, but generative (pre-trained) transformers can produce sentences or full articles of variable length. How is this possible?
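A minimal sketch of the standard answer, autoregressive decoding: the model is called once per output token and generation stops at an end-of-sequence token, so the output length is decided at run time. `model_step` below is a hypothetical stand-in for a real transformer forward pass.

```python
# Minimal sketch of autoregressive decoding: one model call per new token;
# generation stops at an end-of-sequence token (or a hard cap).

EOS = 0

def model_step(tokens: list[int]) -> int:
    # Placeholder for a real model: here we just stop after 3 extra tokens.
    return EOS if len(tokens) >= 3 else len(tokens) + 1

def generate(prompt: list[int], max_new_tokens: int = 50) -> list[int]:
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        next_token = model_step(tokens)
        tokens.append(next_token)
        if next_token == EOS:
            break
    return tokens

print(generate([5]))   # e.g. [5, 2, 3, 0]; the length is decided at run time
```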

3 votes · 1 answer

How can I improve this toy Graph Neural Network generative language model?

Background: I'm an undergraduate student with research interests in a field of physics that has significant overlap with graph theory, and a working knowledge of how simple neural nets work and how to build them with TensorFlow and Keras. As many…

3 votes · 1 answer

How can a system like Jarvis understand the commands and take actions?

I am looking to make an AI like Jarvis. A perfect real-life example of this type of system is the simple AI that Mark Zuckerberg has recently built. Here is a description of how his AI works. From what I understand, the AI understands keywords,…
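One plausible reading of "understands keywords" is simple rule-based intent matching. The sketch below is only an illustration of that idea, not a description of the actual system; the commands and actions are made up.

```python
# Minimal sketch of keyword-based command handling; purely illustrative.

def turn_on_lights() -> str:
    return "lights on"

def play_music() -> str:
    return "playing music"

COMMANDS = {
    "lights": turn_on_lights,
    "music": play_music,
}

def handle(utterance: str) -> str:
    for keyword, action in COMMANDS.items():
        if keyword in utterance.lower():
            return action()
    return "sorry, I did not understand that"

print(handle("Jarvis, turn the lights on please"))   # -> "lights on"
```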

2 votes · 2 answers

How does an LLM (transformer) pick words from its vocabulary?

I have a very rough understanding of the "attention/self-attention" mechanism of transformer models, of how it can be used to process a set of word vectors provided as an input/prompt to the encoder of a network, and of how this will produce…
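The usual final step, sketched below with a toy vocabulary and random weights: the decoder's last hidden state is projected onto the vocabulary, softmax turns the scores into a probability distribution, and a token is chosen greedily or by sampling.

```python
# Minimal sketch of the last step of a decoder: hidden state -> vocabulary
# logits -> softmax -> greedy or sampled choice. The tiny vocabulary and
# random weights are illustrative only.
import numpy as np

vocab = ["the", "cat", "sat", "mat", "<eos>"]
hidden_size = 8

rng = np.random.default_rng(0)
W_out = rng.normal(size=(hidden_size, len(vocab)))   # output projection
hidden_state = rng.normal(size=hidden_size)          # from the transformer

logits = hidden_state @ W_out                        # one score per word
probs = np.exp(logits - logits.max())
probs /= probs.sum()

greedy = vocab[int(np.argmax(probs))]
sampled = vocab[int(rng.choice(len(vocab), p=probs))]
print(greedy, sampled)
```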

2 votes · 1 answer

What are the most effective methods and tools for summarizing long-form content like articles, editorials, and discussion threads for an app?

With users expecting instantaneous information and no compromise on in-depth details, app developers are challenged to condense long-form content such as articles, editorials, and discussion threads into concise summaries. To ensure that users still…
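One commonly used tool for this is an abstractive summarization model served through the Hugging Face `transformers` pipeline. The model name and length limits in the sketch below are illustrative choices, not a recommendation specific to this question.

```python
# Minimal sketch using the Hugging Face `transformers` summarization
# pipeline; the model name and length limits are illustrative choices.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = "Paste the long-form article, editorial, or thread text here ..."

summary = summarizer(article, max_length=130, min_length=30, do_sample=False)
print(summary[0]["summary_text"])
```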

2 votes · 1 answer

I have 5000 HTML files (structured text); how can I generate a new one that "resembles" those?

I don't know anything about ML or NLP, but I was asked by someone to create brand new statutes (written laws) that resemble the ones currently in effect in my country. I have already gathered the laws and now have 5000 HTML files, one per law. The…
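One low-tech baseline, sketched below, is to strip the HTML to plain text and fit a word-level Markov chain that emits text "resembling" the corpus. The folder path and the bigram order are illustrative, and a fine-tuned language model would give far better results.

```python
# Minimal sketch: strip HTML to text, fit a bigram Markov chain, sample text.
# The folder path is hypothetical; requires beautifulsoup4.
import glob, random, collections
from bs4 import BeautifulSoup

words: list[str] = []
for path in glob.glob("laws/*.html"):          # hypothetical folder of laws
    with open(path, encoding="utf-8") as f:
        words += BeautifulSoup(f.read(), "html.parser").get_text().split()

# bigram model: word -> possible next words
chain = collections.defaultdict(list)
for a, b in zip(words, words[1:]):
    chain[a].append(b)

current = random.choice(words)
output = [current]
for _ in range(200):
    nxt = chain.get(current)
    if not nxt:
        break
    current = random.choice(nxt)
    output.append(current)
print(" ".join(output))
```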

2 votes · 1 answer

Are there any meaningful books entirely written by an artificial intelligence?

Are there any meaningful books entirely written by an artificial intelligence? I mean something with meaning, unlike random words or empty books. Something that can be characterised as fiction literature. If yes, then I think it is also interesting…

1 vote · 3 answers

Would maximizing (instead of minimizing) error of an LLM/HMM lead to complex behavior?

Imagine we have some sort of "next token predictor", either with a transformer architecture, an LSTM, or just an HMM (though the terminology I use here will be less aligned with HMMs, I believe the question is generalizable to all generative NLP). We…

1 vote · 1 answer

Why doesn't ChatGPT ask questions?

As far as I understand, ChatGPT has been trained on a vast array of data, and it does understand questions; but it seems never to ask any. Even where a person would ask clarifying questions (which I assume are in the training set), ChatGPT doesn't, opting…