Ethical (and Unethical) Uses of AI
- Amanda Melton
- Mar 27
- 2 min read
The header image for this article was generated by AI, as an example.
After my college assignment on artificial intelligence in writing and learning, I wanted to explore my ideas more deeply, without the buckets of information I had to dig up for class.
What most people fail to realize about AI is that we, as writers, have been using forms of it for as long as writing software has existed. AI in this sense is not new - spell and grammar checks in Google Docs and Microsoft Word have been a significant part of my writing craft for as long as I can remember.
Then came lighter-weight writing checkers like Grammarly and the Hemingway Editor. Search engines such as Google and Bing, and the search built into browsers like Firefox, also use AI processing to surface the results closest to what you are looking for.
But what about the 'bigger' forms of AI that have been the topic of controversy?
My attitude toward AI has changed the more I read about it. For this conversation, I will discuss tools like ChatGPT and Copilot and their potential do's and don'ts in the writing industry. For simplicity, ChatGPT, Copilot, and the like will be clumped together as 'AI' for this article.
DON'Ts
Don't use AI to generate a scene for you. When you ask AI to write a difficult scene and drop it into your story, readers can tell. And if that one scene in your book isn't yours, readers may wonder what other parts aren't either.
Don't over-rely on it. Avoid having AI do all the heavy lifting. This ties in with Do #2 below.
DOs
Ask AI for prompts. Sometimes a fresh prompt to play with is the best way to overcome writer's block.
Use AI as a tool, not a replacement. Let AI assist with brainstorming, outlining, or editing, but keep your unique voice and creativity at the forefront.
As of this posting, this list is my two cents on AI's uses, emphasizing its potential to be a transformative tool when wielded responsibly, ethically, and with a commitment to transparency and creativity. It serves as a reminder that AI should amplify human ingenuity, not overshadow it.