This month, I set out to use ChatGPT to generate content about NBA games, specifically passing parameters to learn how ChatGPT handles them.
I wanted to do the following:
- Identify five variables covering game result, box score, and odds data that ChatGPT can use to write an article
- Define the prompt’s requirements and constraints to produce output in a desired format
- Pass the variables to the prompt and test the output
- Produce an article that combines NBA game data, player information, and odds into a full game preview
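As a rough sketch of what those five variables might look like, here is a small Python example. Every name and value below is hypothetical, invented for illustration; it only shows the shape of the data, not the actual inputs from the project.

```python
# Hypothetical example of the kind of variables a script might collect
# before handing them to ChatGPT. None of these names or values come
# from real project data; they only illustrate the structure.
game_variables = {
    "matchup": "Boston Celtics at Denver Nuggets",
    "final_score": "Nuggets 115, Celtics 109",
    "top_scorer": "Nikola Jokic: 31 PTS, 12 REB, 9 AST",
    "point_spread": "Nuggets -4.5",
    "over_under": 224.5,
}

# Print each variable so the full set can be reviewed at a glance.
for name, value in game_variables.items():
    print(f"{name}: {value}")
```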
Although I did not follow the weekly plan I set out for myself, I was able to make significant progress on, or fully achieve, all of the bullet points above.
I learned more than expected about prompt engineering: requirements have to be spelled out in painstaking detail, and subtle changes in the prompt can lead to very different results. I also learned that ChatGPT sometimes fails to comprehend basic instructions, disregarding certain requirements even when they are stated explicitly.
Here’s an example of just part of a prompt, showing how complex this can get:
Although I did not directly connect to the database and pass live game information, I was able to generate a passable game preview with hypothetical data. With the structure obtained, real data could be passed via a dynamic string generated by a script.
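A minimal sketch of that dynamic-string idea, assuming a script that formats game data into the prompt before sending it to a model API (the function name, field names, and instructions here are all hypothetical, not the project's actual prompt):

```python
# Hypothetical sketch: turn structured game data into a prompt string.
# The field names and prompt wording are illustrative assumptions only.
def build_prompt(game: dict) -> str:
    """Assemble a game-preview prompt from a dict of game data."""
    lines = [
        "Write a 300-word NBA game preview in a neutral, journalistic tone.",
        "Use only the data below; do not invent statistics.",
        "",
        f"Matchup: {game['matchup']}",
        f"Point spread: {game['spread']}",
        f"Over/under: {game['over_under']}",
        f"Key player note: {game['player_note']}",
    ]
    return "\n".join(lines)

# Example data a database query might return (values invented).
sample_game = {
    "matchup": "Boston Celtics at Denver Nuggets",
    "spread": "Nuggets -4.5",
    "over_under": 224.5,
    "player_note": "Jayson Tatum questionable (ankle)",
}

print(build_prompt(sample_game))
```

With live data, the same function would simply receive the query result instead of a hard-coded dict.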
Here’s some of the output, with redactions so as not to give away all the inputs:
Through all this, I have witnessed the power of generative artificial intelligence. I have also learned that significant human knowledge will be needed to use these systems, and that humans will be required to engineer the pipelines that these models tap into. The robots may be coming, but humans will still be very much in demand.
On to August.