Sometimes a prompt or set of lyrics won’t go through.
When that happens, it’s not personal — it’s the model being cautious.
## What’s Actually Happening
Songer itself doesn’t manually moderate your lyrics.
However, the underlying AI model used for generation has built-in safety filters that can’t be disabled.
If something in your prompt or lyrics trips those filters, the model may reject or flag it.
Frustrating? Yes.
Avoidable? Usually.
## Common Triggers
Certain themes tend to cause issues more often than others:
- Direct references to death, funerals, coffins, or burial
- Explicit violent imagery
- Copyrighted or trademarked lyrics, phrases, or song titles
Even when used creatively, these can stop generation altogether.
## What Usually Works Better
If a prompt gets flagged, small wording changes often solve it.
- Abstract or symbolic language tends to pass more reliably
- Suggesting themes instead of stating them directly helps
- Using original lyrics or content generated inside Songer works best
The model responds better to implication than to blunt detail.
## How to Adjust Your Lyrics
If something gets flagged, try:
- Replacing direct terms with a metaphor
- Softening explicit references
- Rephrasing rather than removing the idea entirely
For example:
| Instead of... | Try... |
| --- | --- |
| “kill” | “hurt” or “destroy” |
| “gun” | “weapon” or “shadow” |
| “die” | “fade away” |
| “suicide” | “losing hope” or “giving up inside” |
| “blood everywhere” | “a crimson scene” or “a heavy silence” |
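If you pre-process lyrics programmatically before submitting them, the substitutions above can be applied with a simple find-and-replace pass. This is purely an illustrative sketch — Songer exposes no such API, and the `SOFTER_TERMS` mapping is just the table above; adjust it to taste.

```python
import re

# Hypothetical mapping taken from the substitution table above.
# Pick whichever alternative fits the song's tone.
SOFTER_TERMS = {
    "blood everywhere": "a crimson scene",  # multi-word phrases first
    "suicide": "losing hope",
    "kill": "hurt",
    "gun": "shadow",
    "die": "fade away",
}

def soften_lyrics(lyrics: str) -> str:
    """Replace commonly flagged terms with softer alternatives."""
    for term, softer in SOFTER_TERMS.items():
        # \b keeps matches to whole words, so "gun" won't touch "begun"
        pattern = rf"\b{re.escape(term)}\b"
        lyrics = re.sub(pattern, softer, lyrics, flags=re.IGNORECASE)
    return lyrics

print(soften_lyrics("I don't want to die alone"))
# → I don't want to fade away alone
```

A blunt replacement like this is only a starting point: rewording by hand, as the tips above suggest, almost always reads more naturally than a mechanical swap.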