What NOT to Do With Modern LLMs
Certain interaction patterns reliably produce poor results, or worse, results that look plausible but are subtly wrong. Here are two anti-patterns that come from rushing.
What unites these anti-patterns is rushing, frustration, or both. You already know those states well, so I'll just say: refine your internal sense of when they're happening, and resist the temptations below.
Just Asking for a Result: "Give me a Python script that does X" (cold, no context). "Write me a report about Y" (no constraints, no audience, no purpose). These prompts will get you something, but it will be generic, it may not fit your actual situation, and it gives you no frame for evaluating it. The less specific your request, the more the model fills in assumptions, and those assumptions may not match yours.
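The difference is easy to see side by side. A minimal sketch (the prompt text and the helper function are illustrative examples, not a prescribed format):

```python
# A cold request leaves the model to guess at context, constraints, and audience.
cold_prompt = "Give me a Python script that does X."

def build_prompt(task: str, context: str, constraints: str, audience: str) -> str:
    """Assemble a request that states its own evaluation criteria,
    so there is less room for the model's assumptions to diverge from yours."""
    return (
        f"Task: {task}\n"
        f"Context: {context}\n"
        f"Constraints: {constraints}\n"
        f"Audience: {audience}\n"
    )

# Hypothetical example values; the point is that each field pins down
# an assumption the model would otherwise invent.
specific_prompt = build_prompt(
    task="Write a Python script that deduplicates rows in a CSV export",
    context="The export comes from our CRM; rows duplicate when a sync retries",
    constraints="Python 3.11, stdlib only, must stream files too large for memory",
    audience="Maintained by our ops team, so favor clarity over cleverness",
)
```

Each field doubles as a checklist: when the result comes back, you can evaluate it against the constraints you actually stated rather than guessing whether it fits.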
Repeatedly Listing Errors Without Investigation: "I get this error: [paste]. Please fix." (without reading the error). "Still broken, here's the new error." (cycle repeats). This pattern produces cruft: progressively messier code as the model patches symptoms without understanding the root cause, and crucially, without knowledge of the standards and patterns you wanted to work within. The model may have access to your full environment, and may even be able to run your code, but framing the request this way steers it away from that investigation, which often leads to strange patches and convoluted workarounds.
Better approach: Sometimes you just gotta debug. Before pasting an error, read it, and try to identify the file, line, or condition it points to. Describe what you expected to happen and what happened instead. Tell the model "before you write any code, let's understand what went wrong." The model's job at that point is to help you reason through a problem that you understand, or that it can find and explain, and only then to fix it.
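The "read it first" step can itself be partly mechanized. A small sketch using Python's stdlib `traceback` module to pull out the file, line, and failing condition before you paste anything (the function name and format are my own illustration):

```python
import traceback

def summarize_failure(exc: BaseException) -> str:
    """Condense an exception into the facts worth investigating first:
    the error itself, plus the innermost frame where it was raised."""
    frames = traceback.extract_tb(exc.__traceback__)
    last = frames[-1]  # the frame where the exception actually originated
    return (
        f"{type(exc).__name__}: {exc}\n"
        f"  at {last.filename}:{last.lineno} in {last.name}\n"
        f"  source: {last.line or '<unavailable>'}"
    )

# Example: investigate before asking for a fix.
try:
    {}["missing_key"]
except KeyError as exc:
    summary = summarize_failure(exc)
```

A summary like this, paired with "here's what I expected to happen," gives the model a root cause to reason about instead of a symptom to patch.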