When AI tools first emerged in the creative landscape, I had mixed feelings. As a children's book author with fifteen published titles, I was simultaneously intrigued by the possibilities and concerned about maintaining the human touch that makes children's literature special. Would using AI somehow make my books less authentic? Would I be cheating my readers by incorporating these new tools?

Three years and seven AI-assisted books later, my perspective has evolved considerably. I've discovered that like any tool—from a thesaurus to illustration software—artificial intelligence can either enhance or detract from creative work depending on how it's used. Today, I'd like to share my journey navigating this new territory, including the principles I've developed for using AI ethically and effectively in children's book creation.
The Author's Voice: Maintaining Authenticity
My first experiment with AI was a disaster—and taught me perhaps the most valuable lesson about this technology.
I was working on a rhyming picture book about friendship and decided to try generating the entire first draft using AI. The resulting manuscript checked all the technical boxes: perfect meter, consistent rhyme scheme, age-appropriate vocabulary. But it lacked something essential—a distinctive voice. The text felt generic, like it could have been written by anyone (or no one). It was technically correct but creatively hollow.
This experience led to my first and most important principle: AI should enhance your voice, not replace it.
Now, instead of asking AI to generate complete manuscripts, I use it as a collaborative brainstorming partner. For my recent book about my daughters' magical journey, I wrote the core narrative in my own voice, then used AI to help brainstorm:
- Alternative rhymes for lines where I felt stuck
- Different metaphors for describing anxiety and wonder in child-friendly terms
- Ways to rephrase concepts that seemed too complex for my target age group
The resulting book still sounds distinctly like me—because the foundational text and creative vision remained mine. The AI simply helped me consider options I might not have thought of independently.
A practical approach I've found effective is to write the first draft completely on my own, then use AI for specific revision challenges. This ensures the book's heart and voice come from me, while the technology helps refine and enhance what's already there.

Ethical Review: The Human Filter
Early in my AI experimentation, I discovered another critical lesson when working on a chapter book about history and time travel. I asked the AI for ideas about cultural traditions to include, and while many suggestions were excellent, I noticed some problematic elements—oversimplifications, stereotypes, and in a few cases, completely inaccurate information.
This experience crystallized another crucial principle: Always apply critical human judgment to AI output, especially regarding cultural sensitivity, accuracy, and age-appropriateness.
AI systems learn from existing content, which means they can perpetuate existing biases, inaccuracies, or oversimplifications. This is particularly concerning in children's literature, where we're helping shape young readers' understanding of themselves and the world.
Now, I apply a consistent review process to any AI-assisted content:
- Fact-check all specific information against reliable sources
- Assess cultural representations for accuracy and respect, consulting sensitivity readers when necessary
- Consider developmental appropriateness based on established child development principles
- Evaluate for inadvertent biases or stereotypes
- Ensure the content aligns with my own values and the message I want to convey
This process led me to discard some AI suggestions and substantially modify others for my book about history and time travel. The final book was stronger because I combined AI's creative potential with careful human oversight.
This responsibility cannot be delegated to technology. As authors, we must remain the final ethical filter for everything that reaches our young readers, regardless of how that content was initially generated.
Collaboration, Not Delegation: Understanding AI's Role

Another pivotal moment in my AI journey came during a school visit. A fourth-grader asked me a deceptively simple question: “Do you write your books, or does a computer write them?”
The question forced me to clearly articulate something I'd been feeling but hadn't fully expressed: AI works best as a collaborative tool, not a replacement for human creativity.
I explained to the student that writing a book is like building a house. I'm still the architect and builder—I design the house, lay the foundation, build the walls, and make all the important decisions. AI is like having an assistant who can quickly fetch different materials, suggest alternative designs for a particular room, or help me solve specific problems. The house is still fundamentally my creation, but the assistant helps me build it more efficiently.
This framing has guided my approach ever since. In practical terms, this means:
- I establish the core concept, characters, and narrative arc before involving AI
- I use targeted, specific queries rather than open-ended requests
- I maintain creative control by selecting, modifying, or rejecting AI suggestions based on my vision
- I view AI as one tool in my creative toolkit, not the primary creator
For my chapter book series about a young woman coming of age in the Victorian era, I created the protagonist, supporting characters, setting, and basic mystery structure entirely on my own. Then I used AI to help brainstorm unique mystery scenarios, potential red herrings, and age-appropriate clues. The result was still entirely my creative vision, but enhanced by collaborative ideation.
Transparency with Stakeholders: Finding the Right Approach
One of the most challenging aspects of incorporating AI into my process has been determining the appropriate level of transparency with different stakeholders—publishers, parents, educators, and young readers themselves.
I initially worried about potential stigma. Would admitting to using AI tools somehow devalue my work in publishers' or readers' eyes? Would people assume the book was “written by AI” rather than understanding the collaborative nature of the process?
Through conversations with colleagues, editors, and my manager, I developed a principle that works for me: Be appropriately transparent about AI use based on context and stakeholder needs.

In practice, this means:
- With publishers and professional collaborators: I'm fully transparent about which elements involved AI assistance, particularly for legal and contractual clarity
- With educators and parents: I explain AI's role when relevant, focusing on how technology enhances rather than replaces human creativity
- With young readers: I focus on the creative process in age-appropriate terms, explaining that I use many tools (including computers) to help make my books the best they can be
During one school visit, a fifth-grader asked specifically about AI tools. I explained that sometimes when I'm stuck on a rhyme or trying to explain something complicated in simple terms, I ask the computer for suggestions—just like I might ask a human friend. But I always decide which suggestions to use and change them to match my style. “It's like having a really fast brainstorming partner,” I explained, “but I'm still the author making all the decisions.”
This balanced approach to transparency has worked well, allowing me to be honest about my process while ensuring my books are still recognized as fundamentally human creative works.
Continuous Learning: Adapting as Technology Evolves
Perhaps the most important principle I've developed is to approach AI with a learning mindset, continuously evaluating and adjusting how I use these tools.
AI technology is evolving rapidly, as are the ethical frameworks, best practices, and industry standards surrounding it. What works today might need reconsideration tomorrow. This requires ongoing education and reflection.
I've established several practices to support this principle:
- Monthly review of my AI usage to assess what's working and what isn't
- Regular conversations with fellow authors about their experiences and approaches
- Staying informed about developments in AI ethics and children's publishing industry standards
- Soliciting feedback from editors, educators, and young readers about the books I create
This learning mindset has led me to continually refine my approach. I use AI differently now than I did a year ago, and I expect my process will continue to evolve.
Finding Your Own AI Balance

Every author's relationship with creative tools is unique, and there's no single “right way” to incorporate AI into your children's book creation process. The principles that guide my work might need adaptation to fit your creative style, values, and the specific books you create.
If you're a children's book creator interested in exploring AI tools, I suggest starting small. Rather than trying to apply AI to your entire process, identify a specific pain point—perhaps you struggle with rhyming, or writing flap copy, or creating illustration briefs. Experiment with using AI just for that element, while maintaining your traditional approach to everything else.
As you explore, develop your own set of principles that align with your values as a creator. Consider questions like:
- What elements of my creative process are most important to maintain as purely human?
- Where could I benefit most from AI assistance?
- What review processes do I need to ensure quality and alignment with my vision?
- How comfortable am I discussing AI use with various stakeholders?
- What boundaries will I establish regarding AI's role in my creative work?
There's no universal answer to these questions. The “right amount” of AI involvement varies by creator, project, and context.
What remains universal, however, is our responsibility to young readers. Children deserve books created with integrity, care, and genuine human connection—regardless of the tools used in the creative process. AI can help us create better books more efficiently, but the heart, vision, and purpose of children's literature must remain deeply human.
In my journey, I've found that the most successful integration of AI into my creative process comes when I view technology not as a replacement for human creativity but as an amplifier of it—a tool that helps me create even more engaging, thoughtful books for the young readers who inspire my work every day.
What principles guide your use of creative tools, AI or otherwise? I'd love to hear about your experiences in the comments below!