6

Like many people, I've been experimenting with AI and using it for my own personal needs, for example as a way to summarize search results or as a more personalized recommendation engine for entertainment. I have also used it professionally, in my capacity as a programmer, to write and maintain parts of my employer's software, with encouragement from my employer and coworkers.

That said, I also want to become a published writer of fiction, and in that context I know there is a general feeling of discontent, if not outright outrage, among authors regarding the unprecedented amount of model training that was done using copyrighted materials without permission or compensation. While I do not feel the same degree of outrage that most authors do, I agree that authors deserve more control over, or at least appropriate compensation for, this usage.

Currently, I do not think generative AI is competent enough to replace professional authors for long-form fiction, at least not at the same level of quality, and there is a certain wariness among both authors and the general population regarding AI-written works; authors with a particular writing style are sometimes even accused of using AI to write for them. I've been on the receiving end of such accusations myself due to my relatively formal writing style, even in very informal settings like chatrooms...

Because of that, I do not want to use any output of generative AI directly in my writing, whether that is to write, rewrite, edit, proofread or otherwise directly interact with any of my own writing present in the final work. I want the final text to be entirely conceived by a human brain and written by human hands, such that I can say "this work was not written by AI".

That said, at an earlier stage, when I was still exploring AI, I used it (specifically ChatGPT) for generic market research. More precisely, I gave it a high-level summary of the world I've built, the type of story I want to tell in that world, initial work-in-progress details of some characters, and a one-sentence plot outline, with the goal of finding other works of fiction with shared properties.

During this research, I also asked for and received general advice on interactive narrators, something I find intriguing and want to integrate into the finished work. These two queries are research I could have done through traditional search engines as well, but in my experience, search engines have deteriorated rapidly in quality over the past couple of years due to overzealous SEO, the increasing prevalence of subscription walls, and engine scope creep.

I'm curious whether this use of AI for research, limited in both amount and scope, could be considered enough for a reasonable person to say that the resulting work "was written by AI". I personally don't think it is, but I also tend toward a narrow, "letter of the law" interpretation of such things. I can definitely see a professional author with a ferocious opposition to AI considering such a work to be partly written by AI...

Ben
Nzall

4 Answers

5

The discussion about the use of AI in publishing is in its early stages, and it is difficult to guess what the consensus will be in the future. From what I can discern, one prominent current position is that authors should be transparent about their use of AI. I'll give you two examples: one from a major publishing platform for mostly fiction, and one from a university that trains future researchers.

1.

Amazon allows authors who publish through Kindle Direct Publishing to publish works made with the help of AI, but it requires authors to disclose whether their works are AI-generated. Amazon differentiates between "AI-generated" and "AI-assisted" content (from its Content Guidelines):

AI-generated: We define AI-generated content as text, images, or translations created by an AI-based tool. If you used an AI-based tool to create the actual content (whether text, images, or translations), it is considered "AI-generated," even if you applied substantial edits afterwards.

AI-assisted: If you created the content yourself, and used AI-based tools to edit, refine, error-check, or otherwise improve that content (whether text or images), then it is considered "AI-assisted" and not "AI-generated." Similarly, if you used an AI-based tool to brainstorm and generate ideas, but ultimately created the text or images yourself, this is also considered "AI-assisted" and not "AI-generated." It is not necessary to inform us of the use of such tools or processes.

For Amazon, your usage of AI would probably fall into the category of "AI-assisted" content, meaning you would not have to disclose your use of AI if you wanted to publish on Amazon.

2.

One of the ethical standards in scientific work is transparency. My local university, the University of Tübingen (Germany), has begun educating students in the use of AI tools for science and scientific writing, and has published guidelines for AI usage (see Guidelines for using generative AI tools). The key guidelines are:

Firstly, members of the University must tackle GenAI in a critical and reflexive way. Many of the problems with GenAI (including at a social level) arise from a naive attitude. A critical and reflexive attitude demands not only the development of a sophisticated awareness of the problems, but also the ability to realistically assess the potential and limitations of GenAI.

Secondly, use of GenAI should be organized transparently. If GenAI is used (whether for research, studies, administration or science communication), the procedure must be documented and made visible/transparent in the relevant format.

Thirdly, all members of the University are exhorted to use GenAI responsibly in line with good scientific practice. Among other things this means that when working with GenAI it is the individual responsibility of the authors to ensure that their texts do not contain any plagiarism and all sources are critically examined. It also means that each user must assume responsibility for how and to what end the data provided by AI systems are processed.

In your case this would mean:

  1. Consider what using AI in your scholarly undertaking actually means for the outcomes of your research and whether it changes its validity.
  2. Explain exactly what you did (e.g. in the methods section if you used AI for your research or in a disclaimer if you merely used it in your writing).
  3. Make absolutely sure that your use of AI does not infringe on anyone's copyright, and double-check the veracity of the information the AI gives you. ChatGPT has been documented to be biased and to make errors.

I'm sure other universities have published similar guidelines. You may want to check those of your institution.

Laurel
Ben
5

It is not written by AI unless you copy text produced by the AI.

Absolutely zero text produced by the AI can be copyrighted by you, because it just might be taken from an already copyrighted work.

Of course, individual words cannot be copyrighted; so sure, you can use AI to help you find an appropriate adjective or verb. You could use AI to translate a phrase you wrote in English into Spanish or German.

And of course you can use AI to do research into something you know little about, such as military tactics, or who did what in a medieval castle, or how dogs herd sheep, or Native American religious beliefs, etc.

The point is -- don't put anything it writes into your book verbatim.

Otherwise, AI is just an accelerated form of Google. I particularly like Microsoft's Copilot for this purpose; it can provide links to web pages in footnotes that support its answers, so you can go to those websites and read material by actual authors, which the AI is often quoting -- it is quoting copyrighted text.

Which proves the point -- if it is published on somebody's web page, it is copyrighted! You can use the knowledge you gain, but you can't just copy it verbatim and stick it in your work. You cannot trust anything the AI says to be copyright-free.

But for research, it can be invaluable. For brainstorming? Sure. Plots and story ideas are not copyrightable. (Although character names and place names and other invented words can be trademarked: e.g. Harry Potter, Hogwarts, 007, James Bond.)

There have been many books written about kids and/or adults being magical and going to a school to learn how to use magic. Many books about international spies.

AI can be useful as a research tool, even a brainstorming tool. Just don't use anything it writes verbatim, no matter how attractive it is -- it just may be attractive because it is the copyrighted work of a very good published writer! Even with name changes, that would still be a copyright violation.

Amadeus
0

Can a work be considered "not written by AI" if AI was used for research and brainstorming but was not used to (re)write, edit or proofread text?

Yeah, that doesn't count as AI writing. If you wrote it and used AI to learn about a topic and gain ideas, then it's your writing. The same could be said if someone with a degree in a field of study helped you learn about that field. If you adapted the knowledge you took from them into your work, it would still be your writing.

-2

I disagree with the implicit claim in Amazon's guidelines that "If you created the content yourself, and used AI-based tools to edit, refine, error-check, or otherwise improve that content (whether text or images), then it is considered 'AI-assisted' and not 'AI-generated.'" The fact is that pretty much any form of LLM usage will definitely and undeniably create new content, because LLMs are by design made to generate content (whether useful or not). Not only would the style change, but extra bits would also be added.

Thus I would always consider LLM-edited work to be partly "written by AI". If you use ChatGPT to refine an essay, the result would likely be half yours and half LLM-generated content.

However, the way you are using LLMs is different, because you stated: "I do not want to use any output of generative AI directly in my writing, whether that is to write, rewrite, edit, proofread or otherwise directly interact with any of my own writing present in the final work." Therefore your writing would be entirely your own, even though its contents and style are influenced by what LLMs have told you, which is no different from how you are influenced by what other humans tell you.

user21820