
By Kenny Akridge, Managing Director at PhoenixTeam

If you have been following this series and have already experimented with one or more large language models (LLMs) such as ChatGPT, you are undoubtedly convinced that Gen AI is completely revolutionary and here to stay. If not, I encourage you to visit our earlier posts and try out some of the simple examples. Before we explore the rules, let’s get a clear understanding of what Gen AI is.

Generative AI (aka Gen AI or GenAI): a type of artificial intelligence that can create new content, pictures [like the graphic above], music, or writing, all on its own. It learns how to do this by studying lots of examples, and then it uses what it has learned to make new things that have never been made before. Think of it like a really smart robot that can draw, write stories, or compose music just by understanding patterns from what it has seen before.

Full transparency – ChatGPT generated that definition. I think it is mostly a good definition though. That brings us to the first rule.

Rule Number One of Gen AI: “AI should almost always do most of the work.”

Why would I write a definition for Gen AI from scratch when the AI can do it for me? I have applied this rule so often for a little over a year now that it has become second nature. To a large degree, it has replaced Google for me. Need to write a complex Excel formula? Gen AI can do that. Need to OCR something? Gen AI can do that. Want to find patterns in a series of data? You guessed it – Gen AI can do that too. There really isn’t much that it can’t do. I find myself constantly telling people around me to “just let the AI do it.”
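
To make “just let the AI do it” a little more concrete, here is a minimal sketch of handing one of those tasks to a model from code rather than a chat window. It assumes the OpenAI Python SDK with an API key in your environment; the model name and prompt are purely illustrative.

```python
# A minimal sketch of "just let the AI do it": asking an LLM to draft an
# Excel formula instead of writing it by hand.
# Assumes: the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY
# set in the environment. The model name below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Write an Excel formula that sums column B only where column A "
    "equals 'Closed' and column C holds a date within the last 30 days."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; use whichever model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

The same pattern covers the OCR and pattern-finding examples above: describe the task in plain language and let the model do the heavy lifting.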

So, this is amazing, right? I mean, ChatGPT can complete most of the work for my tasks. You might even wonder if this article was composed by AI. Sadly, ChatGPT did not write this article. It would have saved me a lot of time for sure. This article simply falls outside the “almost always” condition of the rule. When I need something to be genuine, to be authentic, to be delivered in my own voice, there is no substitute (at least not yet). That said, I did consult with ChatGPT on many aspects of this article. “Is ‘almost always’ a clause?” ChatGPT said no, it’s a stipulation or a condition.

Ok, so why does Gen AI only do most of the work? Enter the second rule.

Rule Number Two of Gen AI: “You almost always need a human for the last mile.”

I borrowed the term “last mile” from the transportation industry. In short, it means the final leg of delivering goods or services. Despite all its powerful abilities, Gen AI doesn’t always produce high-quality outputs with acceptable accuracy. Sometimes it guesses, and guesses wrong. Sometimes it hallucinates – telling fiction as if it were truth. That’s why most work product needs to be reviewed and polished by a human.

Let’s expand on the human element in AI-assisted application development. Building on these rules, we hypothesize that an AI-based collaboration framework, enhanced by both machine and human content curation, will dramatically reduce the time to value in software development. By integrating human insight and feedback directly into the AI development and deployment cycles, we can create high-quality software much more rapidly. We will dive deeper into this process in future articles.
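
As a rough sketch of what that human-in-the-loop cycle could look like (an illustration of the idea, not our framework), picture the AI producing a first draft and a person approving or correcting it before anything ships. The example below assumes the OpenAI Python SDK, and the function names and task are hypothetical.

```python
# A sketch of Rule One plus Rule Two in a development loop: the AI drafts,
# a human handles the last mile. The review step here is a console prompt;
# in practice it would be a code review, QA pass, or product owner sign-off.
# Assumes the OpenAI Python SDK and OPENAI_API_KEY; model name is illustrative.
from openai import OpenAI

client = OpenAI()


def draft_with_ai(task: str) -> str:
    """Rule One: let the AI do most of the work by producing the first draft."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[{"role": "user", "content": task}],
    )
    return response.choices[0].message.content


def human_last_mile(draft: str) -> str:
    """Rule Two: a human reviews, corrects, and approves before anything ships."""
    print("--- AI draft ---")
    print(draft)
    verdict = input("Accept the draft as-is? (y/n): ").strip().lower()
    if verdict == "y":
        return draft
    return input("Paste your corrected version: ")


if __name__ == "__main__":
    # Hypothetical task; in practice this could be a user story, a test case,
    # or a piece of documentation produced during development.
    task = "Write a short release note for a new document upload feature."
    final = human_last_mile(draft_with_ai(task))
    print("--- Approved content ---")
    print(final)
```

The key design choice is where the human sits: at the end of the pipeline, on the last mile, rather than out of it entirely.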


You’ve made it this far… so make sure to hit that follow button to stay connected. We’re here to share the latest and greatest insights on all things AI including upcoming interviews from AI experts.

Check out previous articles from our AI series: