libera/commonlisp - IRC Chatlog
Search
4:12:52
mzan
Hi, I'm learning CL. So far this is the most complex code I've written https://aoc-benchmarks.dokmelody.org/linux/program.php?test=aoc2021_day03b&lang=lisp&id=3
4:14:10
mzan
I like the (iter ...) macro a lot. But the (iter ...) macro sometimes initializes variables to "nil", and the SBCL compiler refuses to apply some type optimizations because it sees an initial nil value and is not always smart enough to figure out that the "nil" value will never be used.
4:16:22
mzan
My impression is that the majority of macros work well together, but sometimes there are leaks. It is acceptable, but it is rather frustrating if one has coded in more rigid languages like Haskell.
4:17:48
mzan
In this specific case, the leak was only related to optimizations, not to real semantics. There are probably not many leaks on the semantic side.
4:19:13
moon-child
mzan: in such 'rigid' languages, you would probably not be allowed to write such code in the first place
4:21:02
moon-child
re iter/optimization: there was some loop/iter-like macro that purported to generate good code for pipelines of transformations. I can't remember now what it's called, but someone else may...
4:21:03
beach
mzan: There are several irritating style problems with your code, like mysterious blank lines, wrong number of semicolons in comments, incorrect indentation.
4:26:36
mzan
sm2n: I read something about type declarations in "iterate", but I will study it more thoroughly. Thanks!
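As a follow-up on the type-declaration point: a minimal sketch of iterate's `(the type var)` declaration syntax, assuming Quicklisp and the ITERATE system are available.

```lisp
;; Minimal sketch of iterate type declarations, assuming Quicklisp
;; and the ITERATE system are available.
(ql:quickload :iterate)
(use-package :iterate)

(defun sum-of-squares (n)
  (declare (type fixnum n))
  ;; (the fixnum i) tells iterate the driver's type, so the variable
  ;; need not start out as NIL and SBCL can keep its type inference.
  (iter (for (the fixnum i) from 0 below n)
        (sum (* i i))))
```

For example, (sum-of-squares 4) returns 14.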
4:27:34
mzan
BTW, in case of doubt I can expand the code generated by the "iterate" macro, so the problem was certainly not obscured by the environment.
4:28:45
mzan
Probably in the future I will try to micro-optimize the code, removing the remaining compiler notes signaled by SBCL, and benchmark the difference.
4:30:36
mzan
moon-child: I tried SERIES. In theory it should be superior to "iterate", because you can reuse chunks of code that transform data in different places. SERIES supports this.
4:31:20
mzan
In practice, so far, I rarely find myself writing more code in "iterate" than in SERIES. And "iterate" is more flexible and more in line with the imperative semantics of some parts of CL.
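A brief sketch of the SERIES style being discussed, assuming the SERIES system is available via Quicklisp:

```lisp
;; scan turns a list into a series, map-fn and choose-if transform it,
;; and collect materializes the result; the intermediate series are
;; fused rather than built as real lists.
(ql:quickload :series)

(series:collect
  (series:choose-if #'evenp
                    (series:map-fn t #'1+
                                   (series:scan '(1 2 3 4 5)))))
;; => (2 4 6)
```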
4:34:16
EdLangley[m]
I think the sort of implicit stream-fusion SERIES does ends up being pretty limited in CL
4:35:14
EdLangley[m]
I've been experimenting with transducers (from Clojure), which work more explicitly by passing a continuation around, and I think they're a better fit for lisps
4:40:14
mzan
Another thing I don't like about CL is that it lacks some basic data structures. Haskell has a rather efficient IntMap, for example, among other data structures. C++ has plenty of them.
4:40:36
EdLangley[m]
The trick behind them is pretty simple: https://twitter.com/fwoaroof/status/1337667255727886337
4:47:16
mzan
ahhh "(1+ ...)". I didn't know this function existed, but I suspected there was something similar! :-)
4:50:41
mzan
EdLangley[m]: I don't know TRANSDUCERS, but your example seems counter-intuitive to me. You are using a "reduce", but you pass a function that creates a stream (a vector in this case).
4:51:26
mzan
In your example, "compose" apparently does not work on streams using fusion, but only on the entire result.
4:53:47
mzan
In my ideal world, I would like stream transformers with an optional internal state, that can be fused, followed by some final reducers.
4:53:59
EdLangley[m]
So, functions like MAPPING return a function that takes a continuation as an argument
4:54:54
EdLangley[m]
This returns a function with the signature you'd pass to REDUCE: (ACC NEXT) -> whatever
4:55:46
EdLangley[m]
This function, instead of building up a result with an explicit constructor, calls RF and passes an appropriate value of ACC and NEXT to it
4:56:09
EdLangley[m]
So, for MAPPING, this means something like (funcall rf acc (funcall transform next))
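A minimal sketch of that shape; MAPPING here is a hypothetical helper written from the description above, not a specific library's API.

```lisp
;; Hypothetical MAPPING: it takes a transform and returns a function
;; of RF (the reducing function), which in turn returns the (ACC NEXT)
;; function you'd hand to REDUCE.
(defun mapping (transform)
  (lambda (rf)
    (lambda (acc next)
      (funcall rf acc (funcall transform next)))))

;; Used with a list-building reducing function:
(nreverse
 (reduce (funcall (mapping #'1+)
                  (lambda (acc next) (cons next acc)))
         '(1 2 3)
         :initial-value '()))
;; => (2 3 4)
```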
5:01:06
mzan
I don't know CL well enough to understand all of your code. But reading it is useful for me.
5:04:08
EdLangley[m]
But (compose (mapping f) (mapping g)) doesn't build up an intermediate collection
5:04:35
moon-child
i.e. instead of (compose (mapping f) (mapping g)), you can say (mapping (compose f g))
5:05:24
EdLangley[m]
With a full library, you can do like: (compose (mapping #'1+) (filtering #'evenp) . . .)
5:05:58
EdLangley[m]
Basically, any operation that can be expressed through the lambda you pass to REDUCE, you can do this way
5:06:38
EdLangley[m]
Yeah, but the point is to preserve the (mapcar (lambda (b) ...) (mapcar (lambda (a) ...))) style
5:07:58
EdLangley[m]
This also abstracts over the concrete result type and over limitations like "the input sequence must be finite"
5:08:39
EdLangley[m]
You can express an operation once and pick whether it builds a list, a vector or puts results on a channel of some sort at the use-site
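A sketch of that use-site choice; MAPPING, FILTERING, and COMP are hypothetical helpers written from this discussion, not a specific library's API.

```lisp
(defun mapping (transform)
  (lambda (rf)
    (lambda (acc next)
      (funcall rf acc (funcall transform next)))))

(defun filtering (pred)
  (lambda (rf)
    (lambda (acc next)
      (if (funcall pred next)
          (funcall rf acc next)
          acc))))

(defun comp (f g)
  (lambda (x) (funcall f (funcall g x))))

;; One transformation, defined once...
(defparameter *xform* (comp (mapping #'1+) (filtering #'evenp)))

;; ...collected into a list at one use-site:
(nreverse
 (reduce (funcall *xform* (lambda (acc next) (cons next acc)))
         '(1 2 3 4 5)
         :initial-value '()))
;; => (2 4 6)

;; ...and into a vector at another, with no change to *XFORM*:
(reduce (funcall *xform*
                 (lambda (acc next) (vector-push-extend next acc) acc))
        '(1 2 3 4 5)
        :initial-value (make-array 0 :adjustable t :fill-pointer 0))
;; => #(2 4 6)
```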
5:08:52
moon-child
my primary exposure to this sort of style is apl (not clojure), which is very strongly oriented _away_ from streams
5:08:54
mzan
EdLangley[m]: SERIES are composable, in the sense that you can reuse a complex SERIES definition as the starting point of another SERIES. Is your paradigm composable?
5:10:45
mzan
so something like (defun even1+ () (compose (mapping #'1+) (filtering #'evenp))) can be reused, while a "loop" cannot.
12:04:04
gamaliel
Hi, does anyone have experience with the vgplot system? I don't know how to return a plot as a tk widget. When I run (vgplot:plot) it returns an empty string.
13:14:43
qhong
gamaliel: why do you want to get a widget object? vgplot (or gnuplot) seems to rely extensively on global state and I think there's no way to do that
13:15:50
qhong
gamaliel: iirc vgplot doesn't manage its own GUI, it just sends commands to a separate gnuplot process. I could be wrong. I was using it a lot for scientific computing but I had no need to hack it
13:51:57
nij-
I just watched beach's talk again on FCGE (first-class global environments). A main issue it addresses is that it's not easy to implement them without sacrificing run-time performance. But I fail to understand how his implementation resolves this issue.. any idea?
13:56:08
Bike
oh. well, beach's implementation uses cells, so that at runtime to look up a definition you just grab it from a cell that was compiled in, which is pretty quick. that's not in the presentation?
14:17:25
nij-
I did stop and re-watch many parts of it, but still failed to see the reason. Hmm.. I also need to think more about what you said.
14:26:54
beach
nij-: An indirection through a CONS cell has the same cost as an indirection through a symbol, which is what most implementations do.
14:27:22
beach
nij-: And as Bike said, previous work used a hash-table lookup for each function call.
14:27:26
nij-
And by grabbing it from a cell, do you mean the same thing as (slot-value ..) in CLOS? I'm not sure how CLOS is implemented.. but whenever (slot-value ..) is evaluated, isn't it doing a hash-table lookup under the hood?
14:28:30
beach
nij-: More like (funcall (internal-car (load-time-value (find-function-cell <name>))) arg...)
14:30:49
nij-
I suppose there should still be a mechanism for it to find the right cell among a collection of cells, right? Doesn't that make it as inefficient as a hash-table lookup?
14:31:54
phoe
by the time INTERNAL-CAR is called at all, LOAD-TIME-VALUE has already called FIND-FUNCTION-CELL and installed the reference to the concrete Lisp object in its stead
14:38:40
nij-
Hmm.. so the goal is to make it efficient at run time, and the heavy work has been done before run time..?
14:40:55
beach
nij-: A typical Common Lisp implementation has an indirection through a symbol. The work to find that symbol at load time is the same as that of FIND-FUNCTION-CELL.
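A toy sketch of the cell scheme being described; *FUNCTION-CELLS* and FIND-FUNCTION-CELL here are hypothetical stand-ins for the environment's internals, not beach's actual implementation, but they show why each call costs only a CAR and a FUNCALL.

```lisp
;; Toy model: the environment maps names to CONS cells. The hash-table
;; lookup happens once, at load time, never per call.
(defvar *function-cells* (make-hash-table :test #'equal))

(defun find-function-cell (name)
  (or (gethash name *function-cells*)
      (setf (gethash name *function-cells*)
            (cons (lambda (&rest args)
                    (declare (ignore args))
                    (error "Undefined function ~S" name))
                  name))))

;; A call site compiles into something like this: LOAD-TIME-VALUE is
;; evaluated once when the code is loaded, so calling MY-FN later is
;; just CAR + FUNCALL, no table lookup.
(defun call-site-example (x)
  (funcall (car (load-time-value (find-function-cell 'my-fn))) x))

;; (Re)defining MY-FN just stores a new function into its cell:
(setf (car (find-function-cell 'my-fn)) #'1+)

(call-site-example 41)
;; => 42
```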
14:51:19
phoe
it's fun to be a bystander to this, because I'm working on an article documenting a library utility that directly depends on LOAD-TIME-VALUE
15:00:27
beach
It is also fun to see how it is not widely known what a typical Common Lisp implementation has to do at load time in order to find the symbol corresponding to a function call in source code. Nor is it widely known how a typical Common Lisp implementation represents the global (null-lexical) environment, as opposed to the lexical compile-time environments of CLtL2.
15:01:14
phoe
I think it's generally not widely known what a typical CL implementation has to do in order to achieve X, unless you're something of an implementer yourself
15:01:30
lisp123
TIL one of the inventors of Lisp machines (Tom Knight) founded a company worth $17bn as of last year
15:02:18
lisp123
I do wonder if there is some secret LISP code in there that they are keeping a trade secret
15:03:25
nij-
Sigh. I'm starting to wonder if JS is really that bad. Is there any formal argument that shows JS is inferior to Lisp?
15:35:27
qhong
and it can yield a bunch of sensible results, like: mutation is real power, call/cc is real power, and delim/cc = call/cc + mutation > call/cc
15:37:08
lisp123
Not that I plan on doing it, but is it possible to call generic functions in an eval-when form (you know, the ones that run code at load time)?
15:44:37
Alfr
lisp123, the default behavior is for forms only to be evaluated when a compiled file is loaded, not when they are compiled.
15:48:28
lisp123
(eval-when (:compile-toplevel :load-toplevel :execute) (defgeneric test (a)) (defmethod test ((a list)) (print "test")) (test '(1 2 3)))
15:48:35
White_Flame
all functionality works at all run/load/compile times. the question is whether or not that functionality is loaded/present at that time
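For the record, lisp123's snippet does work; a hedged variant with a hypothetical GREET generic function, showing the call happening in all three situations:

```lisp
;; GREET is a hypothetical example. Because the whole form is wrapped
;; in EVAL-WHEN with all three situations, the generic function and
;; its method are also available in the compilation environment, so
;; the call at the end works at compile time, load time, and at the
;; REPL alike.
(eval-when (:compile-toplevel :load-toplevel :execute)
  (defgeneric greet (x))
  (defmethod greet ((x list))
    (format nil "a list of ~D elements" (length x)))
  (greet '(1 2 3)))
;; => "a list of 3 elements"
```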