I mean, pretty much all code is just strings subject to certain parsing constraints by the compiler/interpreter/lexer. What GP is talking about is semantic difference, which is super important in programming.
Well, strings are immutable in many programming languages, so that's not it.
In strings I'd say the _content_ matters. If the value needs to be preserved at runtime, it's a string; if the value doesn't survive compilation and is only used for binding at compile time, it's a symbol. So, the semantic litmus test I'd propose: if you search-and-replaced the value and recompiled, would you notice?
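A minimal sketch of that litmus test, in Ruby (an assumption on my part that symbols here mean Ruby-style symbols; the names are purely illustrative):

```ruby
# A symbol is used only for binding/lookup: rename it everywhere
# consistently and the program behaves identically.
config = { status: "ok" }     # :status is just a key
value  = config[:status]      # looked up by identity, content never shown

# A string's content survives to runtime: change it and you notice.
message = "status: #{value}"  # the characters themselves reach the user

puts message
```

Search-and-replace `:status` with `:state` at every use site and recompile: nothing observable changes. Replace the text inside `"ok"` or the message string and the output is different, so by the test above those are real strings.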
uhhh... no? The difference depends on the language, but really it boils down to the semantic meaning behind certain strings. x == y has a different semantic meaning than "x == y", yet both are literally strings of text.
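To spell that out in Ruby (assuming, illustratively, that's the language under discussion): the same characters are an evaluated expression in one position and inert text in the other.

```ruby
x, y = 2, 2

comparison = (x == y)  # code: the parser evaluates this to a boolean
text       = "x == y"  # string: the same characters, but just data

puts comparison  # true
puts text        # x == y
```

Both appear as identical character sequences in the source file; only the parsing context gives the first one its comparison semantics.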
One of my favourite things about the language in fact.