Turning the Buddhist (or other) teachings into deeds
is not too hard once you have understood who you are,
and, maybe more importantly, who you are not. Figuring
/that/ out can be tough and require a lot of practice.
What people perceive as modest is really an acceptance
or even appreciation of what is. My apartment has not
been renovated in decades, I repair what needs repair and
otherwise leave things to themselves. I wear clothes until they
disintegrate, and my hardware is already old when I
buy it. This is the course of things. Things age and
change and at some point disappear. Why prefer the new
over the old? Why the old over the new? It is just that
things and beings get old on their own, and it is much
more joyful to witness this than trying to resist it.
Many thanks to everybody who wrote in this thread! Your words mean a lot to me! I will reply to some individual messages. If I don't, please substitute "thank you!" :)
Thank you so much for reading my books and describing my work in such
beautiful words! You basically answered your own question! My motivation
is just the creation of something I find beautiful. The vision, to pass
knowledge to those who seek it in the simplest possible way, where
"simple" does not necessarily mean in the tersest form, but in a form
that invites being digested.
I do not usually talk much about "myself". I tried, but with no-one
asking, I find it difficult to say anything.
I just bought ePubs for your Raja Yoga Revisited (I usually study SRF material, but alternatives are good!) and Scheme 9 from Empty Space. Your web site is very nice, I loved the ‘Who am I?’ page. I have been using Lisp languages since 1978, but except for studying Peter Norvig’s Lisp in Python, I have never dropped below the abstraction layer into a Lisp implementation, so I am looking forward to that.
tug2024 wrote:
> Doesn’t lisp extend lambda calculus (abstraction . application)? As a consequence, lisp (abstraction . application . environment)!
Another valid question downvoted into oblivion.
The environment in (lexically scoped) LISP is an implementation detail. Lambda calculus does not need an environment, because variables are substituted on a sheet of paper. So lambda calculus equals lexically scoped LAMBDA in LISP.
Sure, you could view LISP as LC plus some extra functions (that are not easily implemented in LC).
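The point about the environment being an implementation detail can be made concrete. Below is a hypothetical Python sketch (all names invented for illustration) that evaluates the same small term twice: once by paper-style substitution with no environment anywhere, and once the way a Lisp implementation would, with closures capturing an environment. Both agree on the answer.

```python
# Lambda-calculus terms as tuples:
# ('var', name), ('lam', name, body), ('app', fn, arg)

def subst(term, name, value):
    """Replace free occurrences of `name` in `term` with `value`,
    paper-style. Assumes bound names are distinct (no capture handling)."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        if term[1] == name:          # name is shadowed here; stop
            return term
        return ('lam', term[1], subst(term[2], name, value))
    return ('app', subst(term[1], name, value), subst(term[2], name, value))

def eval_subst(term):
    """Normal-order evaluation by textual substitution: no environment.
    Stuck applications are returned as-is."""
    if term[0] == 'app':
        f = eval_subst(term[1])
        if f[0] == 'lam':
            return eval_subst(subst(f[2], f[1], term[2]))
    return term

def eval_env(term, env):
    """The same calculus, evaluated as an implementation would:
    lambdas become closures that capture the current environment."""
    kind = term[0]
    if kind == 'var':
        return env[term[1]]
    if kind == 'lam':
        return ('closure', term, env)
    _, (_, param, body), cenv = eval_env(term[1], env)
    return eval_env(body, {**cenv, param: eval_env(term[2], env)})

# ((lambda (x) (lambda (y) x)) a) applied to b -- the K combinator
K = ('lam', 'x', ('lam', 'y', ('var', 'x')))
term = ('app', ('app', K, ('var', 'a')), ('var', 'b'))
```

Both evaluators reduce the term to `a`; the environment only appears in the second one, as machinery.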
Thanks. I recently had to reinvent LISP to script my CRDT database.
That was not much work, because I already had the notation (I use RDX, a JSON superset with CRDT types).
Still, I stumbled at the idiosyncratic LISP bracketing. Luckily, RDX allows for different tuple notations. So, I styled it to look less alien to a curly-braced developer. Like this https://github.com/gritzko/go-rdx/blob/main/test/13-getput.j...
For example, print change-dir make-dir; is equivalent to (print (change-dir (make-dir) ) ) in the old money. I wonder if I am reinventing too much here.
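For the single-argument chaining shown here, the rewrite into sexprs is mechanical: each word takes the rest of the line as its argument. A hypothetical Python sketch (the real RDX grammar is of course richer than this):

```python
def to_sexpr(stmt):
    """Rewrite a flat word sequence like 'print change-dir make-dir'
    into the nested sexpr it denotes, innermost call last."""
    words = stmt.split()
    expr = f"({words[-1]})"              # last word is a zero-argument call
    for w in reversed(words[:-1]):       # wrap outward, right to left
        expr = f"({w} {expr})"
    return expr

print(to_sexpr("print change-dir make-dir"))  # (print (change-dir (make-dir)))
```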
Did LISPers try to get rid of the brackets in the past?
There have been many attempts to get rid of sexprs in favor of a “better” syntax. Even John McCarthy, the inventor (discoverer?) of Lisp had plans for an “M-expression” syntax to replace “S-expressions.” It never happened. The secret is that Lispers actually view sexprs as an advantage, not something to be worked around. Once you discover symbolic editing and code manipulation based on sexprs, you’ll never go back to weak line editing. That said, some Lisp dialects (e.g. Clojure and Racket) have embraced other symbols like square and curly brackets to keep the code more terse overall and optically break up longer runs of parentheses.
Probably the best example of a “Lisp without parentheses” is Dylan. Originally, Dylan was developed as a more traditional Lisp with sexprs, but they came up with a non-sexpr “surface syntax” before launching it to avoid scaring the public.
Exactly. I also like Clojure’s use of square brackets for vectors and curly braces for maps. It eliminates all the “vector-” and “map-” function calls.
Those are big quality of life improvements. I wish the other lisps would follow suit. I suppose I could just implement them myself with some macros, but having it standard would be sweet.
The Revised Revised Revised Revised Revised Revised Report on the Algorithmic Language Scheme (R6RS) specified that square brackets should be completely interchangeable with round brackets, which allows you to write let bindings or cond clauses like so:
(let ([a (get-some-foo 1)]
      [b (get-some-foo 2)])
  (cond [(> a b) -1]
        [(< a b)  1]
        [else     0]))
...but I hate that, I'd much prefer if square brackets were only used for vectors, which is why I have reader macros for square brackets -> vectors and curly brackets -> hash tables in my SBCL run commands.
I think the R6RS behavior helps with visual matching, but it squanders square brackets that could be used for something more useful (e.g. vectors), which is a shame. Another thing Clojure does is copy Arc in eliminating parentheses around the pairs of forms in let bindings and cond forms, which really aren’t needed. It just expects pairs of forms, and the compiler objects if given an odd number. The programmer can use whitespace (notably newlines) to format the code so the pairings are visibly apparent. That removes a surprising number of needless parentheses, because let binding forms are used all over (less so cond forms).
True, with modern machine-generated mass operations, refactoring is easier than with older tools, but that doesn't mean a given set of brackets is 'useless'.
I wouldn’t go as far as “pretty awful,” but yes, it’s a keystroke more to manipulate two sequential forms instead of one. And yes, there is a slight indentation advantage when the test and the conditional code won’t fit on the same line. It’s easy enough to use “do” when the conditional clause has multiple forms, however. Personally, I’ll take those trade-offs for the reduction in clutter.
I think it's a matter of whether you're programming in a mostly applicative way† or in a more imperative way. Especially in the modern age of generational GC, Lisp cons lists support applicative programming with efficient applicative update, but sacrifice efficiency for certain common operations: indexing to a numerical position in a large list, appending to a list, or doing a lookup in a finite map such as an alist. So, in Common Lisp or Scheme, we are often induced to use vectors or hash tables, sacrificing applicative purity for efficiency—thus Perlis's quip about how purely applicative languages are poorly applicable, from https://www.cs.yale.edu/homes/perlis-alan/quotes.html.
In general a sequence of expressions of which only the value of the last is used, like C's comma operator or the "implicit progn" of conventional cond and let bodies, is only useful for imperative programming where the non-last expressions are executed for their side effects.
Clojure's HAMTs can support a wider range of operations efficiently, so Clojure code, in my limited experience, tends to be more purely applicative than code in most other Lisps.
Incidentally, a purely applicative finite map data structure I recently learned about (in December 02023) is the "hash trie" of Chris Wellons and NRK: https://nullprogram.com/blog/2023/09/30/. It is definitely less efficient than a hash table, but, in my tests so far, it's still about 100ns per hash lookup on my MicroPC and 250ns on my cellphone, compared to maybe 50ns or 100ns respectively for an imperative hash table without FP-persistence. It uses about twice as much space. This should make it a usable replacement for hash tables in many applications where either FP-persistence, probabilistically bounded insertion time, or lock-free concurrent access is required.
This "hash trie" is unrelated to Knuth's 01986 "hash trie" https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-..., and I think it's a greatly simplified HAMT, but I don't yet understand HAMTs well enough to be sure. Unlike HAMTs, it can also support in-place mutating access (and in fact my performance measurements above were using it).
______
† sometimes called "functional", though that can alternatively refer to programming with higher-order functions
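The hash trie described above is simple enough to sketch. The following is a minimal Python rendering of the idea (not Wellons's C code, and without his lock-free concurrency support): each node carries one key/value pair plus four children, chosen two hash bits at a time, and a persistent insert copies only the nodes along the search path.

```python
class Node:
    __slots__ = ('key', 'val', 'kids')
    def __init__(self, key, val, kids=(None, None, None, None)):
        self.key, self.val, self.kids = key, val, kids

def lookup(node, key):
    h = hash(key) & 0xFFFFFFFF           # 32 hash bits steer the descent
    while node is not None:
        if node.key == key:
            return node.val
        node = node.kids[h & 3]          # branch on the low two bits...
        h >>= 2                          # ...then shift them away
    return None

def insert(node, key, val, h=None):
    """FP-persistent insert: only nodes on the search path are copied,
    so older versions of the map remain valid and unchanged."""
    h = hash(key) & 0xFFFFFFFF if h is None else h
    if node is None:
        return Node(key, val)
    if node.key == key:
        return Node(key, val, node.kids)  # replace value, share children
    i = h & 3
    kids = list(node.kids)
    kids[i] = insert(kids[i], key, val, h >> 2)
    return Node(node.key, node.val, tuple(kids))

m1 = insert(None, 'a', 1)
m2 = insert(m1, 'b', 2)
m3 = insert(m2, 'a', 9)                  # m2 is untouched by this update
```

The path copying is what buys FP-persistence: after building `m3`, the older `m2` still maps `'a'` to `1`. In-place mutation, as in the measurements above, would simply overwrite `node.val` instead of allocating.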
I sometimes wonder if the issue is really the parentheses or the ease of nesting. In LISP it’s natural to write
(f (g (h x))).
Whereas most people are used to:
a = h(x);
b = g(a);
c = f(b);
In C/C++ most functions return error codes, forcing the latter form.
And then there are functional languages allowing:
x -> h -> g -> f
but I think the implicit parameter passing doesn’t sit well with a lot of programmers either.
Interesting comment. I found the lisp/sexpr form instantly understandable. While the others weren't hard to grasp, it took a moment to consciously parse them before their meaning was as clear. Perhaps the functional arrow notation is least appreciated because it seems more abstract, or maybe the arrows are just confusing.
More likely than not it's a matter of what a person gets used to. I've enjoyed working in Lisp/Scheme and C, but not so much in primarily functional languages. No doubt programmers have varied histories that explain their preferences.
As you imply, in C one could write nested functions as f (g (h (x))) if examining return values is unnecessary. OTOH in Lisp return values are also often needed, prompting use of (let ...) forms, etc., which can make function nesting unclear. In reality programming languages are all guilty of potential obscurity. We just develop a taste for what flavor of obscurity we prefer to work with.
Loading programs, testing individual functions, examining data. I usually have two windows/consoles open: one for writing code, one for loading and testing.
No idea why emorning3's comment is marked [dead]. It's a good question, and Klong arrays are indeed very much like S-expressions. In fact their implementation is pretty much the same, down to using CAR and CDR to manipulate them.
Klong could easily be implemented as a bunch of LISP functions and macros.