You are somewhat correct, but one of the reasons I love Common Lisp is exactly that it sort of overcame this "bad timing", which is very unforgiving to lesser languages. The fact that today I can write Lisp apps in a language whose primary platform (the Lisp machine) died 20 years ago, and still enjoy a relatively active ecosystem, is quite impressive.
Do you think Clojure has the potential to be useful in 20 years if, hypothetically, the JVM died today? I don't know. I think yes, because it has a bunch of good ideas, but it isn't at all clear. Take JS as an opposite example: if the web were to disappear, JS would disappear as well, no matter how many other platforms it is ported to. It just isn't good enough to stand on its own.
I could be talking out of my ass here, but my point was that CL has a good track record for survival, so it has the potential to be a 100-year language, while Clojure has yet to prove itself for such a task. These things are extremely hard to predict; tech history has a habit of throwing such wild cards as personal computing and the web. We'll see.
The primary platform of CL did not die. CL was ALWAYS developed on multiple platforms. From day -1 there were implementations on non-Lisp-Machine platforms. People actually ran companies to do so: Lucid (Unix), Franz (Unix, PC), Coral (Mac), Goldworks (PC), Procyon (PC, Mac), Harlequin (Unix), ... Carnegie Mellon University developed CMUCL for Unix, Japanese developers created KCL (Unix, ...), CLISP came from Germany, ...