I hate dogma. I hate it even more when a valid scientific observation becomes dogma.
One case concerns the infamous goto statement in programming languages. It is true that a programming language does not need a goto statement in order to be universal. Unfortunately, this observation led some, most notably the late Edsger Dijkstra, to conclude that goto is actually harmful. While it is true that goto can be misused, and that misusing the constructs of a programming language can lead to bad code, I don't think goto is unique in this regard (it is certainly no more harmful than pointers, global variables, or the side effects of passing variables by reference, to name just a few examples). Nonetheless, with Dijkstra's 1968 letter, "Go To Statement Considered Harmful," on record, the making of a dogma was well under way.
And here I am, some 40 years later, trying to write a simple piece of code whose logic flows like this:
LET X = A
LABEL:
Do something using X
IF some condition is not satisfied THEN LET X = B and GOTO LABEL
The condition, in particular, is always satisfied when X = B, so the jump back to LABEL is taken at most once.
Yes, I know how to rewrite the above code using a loop construct, to satisfy structured programming purists. But why should I have to, when the most natural way to express this particular algorithm is through the use of a conditional jump, not a loop? Oh wait… it's because someone who actually believed in dogma prevailed when JavaScript was designed, and therefore goto never made it into the language.
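For the record, here is one way the loop-based rewrite might look in JavaScript. This is only a sketch: doSomething and conditionHolds are hypothetical placeholders standing in for the "do something using X" step and the condition in the pseudocode above.

let x = a;
do {
  doSomething(x);              // "Do something using X"
  if (conditionHolds(x)) {     // always true once x === b
    break;
  }
  x = b;                       // LET X = B, then "GOTO LABEL" via the loop
} while (true);

Because the condition always holds once x === b, the loop body runs at most twice; the while (true) / break pair is doing the work of a single backward jump, which is exactly my point.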