freenode/lisp - IRC Chatlog
23:39:37
kagevf
jcowan: I don't think White_Flame left one? but the basic idea was to create a macro that you could wrap around a defun and the macro would set compiler optimizations, etc
23:42:44
kagevf
if you update a macro you have to re-compile every defun that uses it ... something I didn't know, and it caused me a lot of confusion ... so FYI for anybody who didn't know that hehe
23:43:39
pjb
kagevf: that's why projects often have macros in a separate file, so that we can write file dependencies in asdf and have things recompiled automatically.
23:44:56
pjb
phoe: it's the opposite: each major implementation has a custom FFI. Therefore there can be a CFFI backend for them.
23:44:59
no-defun-allowed
Whatever happened to White_Flame? He got a macro that made his ears burn...
23:45:23
White_Flame
(defmacro fast-body (&body body) `(locally (declare (optimize (speed 3) (safety 0) (space 0) (compilation-speed 0))) ,@body))
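White_Flame's macro can be exercised like this; a minimal sketch, where the `sum-to` function is purely illustrative:

```lisp
;; The macro as given above: wraps its body in a LOCALLY form that
;; sets aggressive optimization qualities for just that body.
(defmacro fast-body (&body body)
  `(locally (declare (optimize (speed 3) (safety 0)
                               (space 0) (compilation-speed 0)))
     ,@body))

;; Hypothetical use: only the loop is compiled with the fast settings.
(defun sum-to (n)
  (fast-body
    (loop for i of-type fixnum from 1 to n sum i)))
```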
23:45:53
pjb
moon-child: cmucl may still be used, since it has some differentiating characteristics. For example, it can be compiled with 8-bit characters instead of unicode. That may be useful.
23:51:54
drmeister
Is there a way to get a list of all source files in an asdf system from within Common Lisp?
23:52:56
drmeister
I'm writing a report and I'm taking an inventory of the amount of code that we compile into our runtime.
23:53:42
drmeister
So far: 580,000 lines of C++; 236,000 lines of Common Lisp code; 81 ASDF systems.
23:54:10
drmeister
The last one doesn't mean anything to non-CL people - so I'm trying to get an estimate of the number of lines of CL code that come from ASDF systems.
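drmeister's question can be answered with ASDF 3's component protocol; a sketch, assuming the systems are findable, using `asdf:component-children` to walk the component tree (the helper names `system-source-files` and `count-lines` are made up for illustration):

```lisp
(require "asdf")

;; Collect the pathnames of all Lisp source files in an ASDF system
;; by recursively walking its component tree.
(defun system-source-files (system-name)
  (let ((files '()))
    (labels ((walk (component)
               (typecase component
                 (asdf:cl-source-file
                  (push (asdf:component-pathname component) files))
                 (asdf:parent-component
                  (mapc #'walk (asdf:component-children component))))))
      (walk (asdf:find-system system-name))
      (nreverse files))))

;; Count the lines in one file, for the inventory report.
(defun count-lines (pathname)
  (with-open-file (stream pathname)
    (loop while (read-line stream nil) count t)))

;; (reduce #'+ (system-source-files "some-system") :key #'count-lines)
```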
23:56:33
White_Flame
speaking of dynamic-extent from upstream in the logs, is there a legal way to push to a lexical list var while having all the cells be dynamic-extent?
23:58:04
pjb
White_Flame: of course, you can do whatever you want with the cells. That's the danger of dynamic-extent declarations.
23:58:42
pjb
White_Flame: note that the declaration is on the variable holding the value, not the value, while it applies obviously to the allocated values…
0:00:02
pjb
(let ((list '()) (cell '())) (declare (dynamic-extent cell)) (loop repeat 10 do (setf cell (cons 'a list)) (setf list cell)) (prin1 list) (values)) #| (a a a a a a a a a a) |#
0:00:57
pjb
assuming of course, that prin1 doesn't store the list somewhere for later (this would be dragons out of the nose).
0:01:17
kagevf
pjb: wouldn't asdf:load-system recompile any defuns anyway even if they're in the same file as the defmacros they are using? and even if the macros are in a different system, wouldn't that still be the case? if what I just said is wrong, I'd be very interested in knowing, since that might explain some things I've observed and don't fully understand ...
0:21:21
White_Flame
but, (let ((cell (cons ...))) (declare (dynamic-extent cell)) (push cell list)) might do it, it just leaves the scope of the LET which might make it immediately invalid depending on the compiler
0:21:57
White_Flame
which is the weirdness I hit in terms of dynamically consing up stuff on the stack (which may or may not be possible based on the stack discipline of the compiler as well)
1:27:32
pjb
kagevf: in the same file yes, but not in other files, if they don't depend on the file where the macros are defined.
1:30:06
pjb
kagevf: if you modify a system and reload it, this doesn't make asdf reload the systems that depend on it! You need to load a leaf system to have all the dependencies recompiled and reloaded. So if you have N systems in a project, you could define an artificial N+1 bottom system that would depend on all the N systems, and that you could reload when you modify any of the N systems, to have it and all its dependencies be reloaded.
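pjb's N+1 "bottom" system might look like this; all system names here are placeholders:

```lisp
;; Hypothetical bottom system: it depends on every system in the
;; project, so loading it makes ASDF recompile and reload whatever
;; changed in any of them.
(asdf:defsystem "my-project/all"
  :depends-on ("my-project/core"
               "my-project/macros"
               "my-project/ui"))

;; After editing any subsystem:
;; (asdf:load-system "my-project/all")
```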
1:33:23
pjb
White_Flame: well, dynamic-extent doesn't imply stack. An implementation could use a separate heap, managed similarly to the old Pascal heap, which is basically a parallel data stack, instead of a garbage-collected heap.
1:34:21
White_Flame
still, my example exits the lexical scope the var is defined in, while it still keeps it alive, be it the process stack or not
1:34:27
pjb
White_Flame: upon entry into a function, the function records the size of this dynamic-extent heap; dynamic-extent objects are allocated there (simply incrementing the size of the heap), and when the function returns, it resets the size of the heap to the saved value.
1:36:27
saturn2
my impression was that in sbcl, dynamic-extent doesn't affect setf assignments at all
1:36:42
pjb
White_Flame: and remember all declarations but special declarations can be ignored by the implementation.
1:37:55
saturn2
you have to do some sort of continuation-passing kind of thing if you want to build a dynamic-extent list a little at a time
1:39:39
pjb
saturn2: not needed. You say to the compiler that any new object stored in the declared variable is dynamic-extent. What you do with this object later is not the compiler's problem anymore. It's your problem. My code is conforming and produces, in list (a variable not declared), a list of cons cells that are dynamic-extent.
1:39:59
White_Flame
(now, it doesn't have to, but at least SBCL doesn't include the stack magic to dynamic-extent + tco)
1:40:48
White_Flame
pjb: the standard says that the declaration declares that the values become inaccessible outside the form
1:40:51
pjb
White_Flame: for an object to be dynamic-extent doesn't imply anything!!! It's up to the implementation to allocate it wherever it wants, and to free it sooner rather than later!
1:41:23
pjb
Which is indeed all that conforming programs need to know: the fuck don't access the fucking objects outside of the form!
1:42:11
pjb
White_Flame: yes, because the implementation can ignore dynamic extent so it is possible that we may!
1:44:49
White_Flame
when I said "expressibility", that is creating the assertions for this situation. The declarations assert things which may or may not be checked or used by the implementation
1:45:34
White_Flame
however, dynamic-extent does say "for each value ... that vari takes on", which is quite broader
1:46:17
White_Flame
especially since data that has a larger extent than the current form might be placed in there, too, without practical issue
5:21:18
dieggsy
is there a standard way to "dispatch" on platform? like (os-dispatch (linux do-this) (macos do-that))
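There is no standard `os-dispatch` operator, but the usual approach is read-time conditionals on `*features*`; a sketch, noting that feature names such as `:linux` and `:darwin` are common but not guaranteed across implementations:

```lisp
;; Dispatch on platform using #+/#- reader conditionals. Which
;; feature keywords appear in *FEATURES* is implementation-defined,
;; so the set tested here is an assumption.
(defun platform ()
  #+linux :linux
  #+darwin :macos
  #+(or windows win32 mswindows) :windows
  #-(or linux darwin windows win32 mswindows) :unknown)
```

Because the conditionals are resolved at read time, only one branch survives into the compiled code.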
5:28:20
lukego
I'm feeling the urge to make a github fork of every dependency so that I can easily read/write them somewhere other than my own home directory. this way I could fix things and send changes upstream instead of just quietly working-around locally. does anyone do this? any workflow tips?
5:31:59
lukego
beach: yes, once you have a fork already established, but I don't have that and in the moment it's always too much hassle to create one. Ideally I'd preemptively fork everything and also keep my forks up to date e.g. with latest quicklisp versions.
5:33:37
lukego
maintaining one repo containing patch files might be easier, but you can't send pull requests from that
5:34:51
Nilby
lukego: I just fork, clone, and link into ~/quicklisp/local-projects. Then I can switch back and forth between the local hacked one and the quicklisp one, by adding and removing the link. It's convenient to have a command to do it for you.
5:35:49
lukego
Nilby: hard to install your application on another computer if it depends on the contents of ~/quicklisp/local-projects though?
5:36:59
kagevf
I tried to do touch ./new-directory/new-sub-directory/new-directory but bash complained because the parent directories didn't exist, so I just wrapped ensure-directories-exist in a very tiny app with sb-ext:save-lisp-and-die since it did what I wanted ...
5:39:39
Nilby
Or just have a list of your patched dependencies and re-create the fork directory from github, then have it populate local-projects.
5:40:21
lukego
splittist: I think so and what I'd need is a script that syncs it from quicklisp so that it doesn't go stale
5:43:37
lukego
Nilby: sounds messy for me, going to bite me if I want to e.g. hook up a CI that tests the same code as I'm developing, etc.
5:53:49
White_Flame
I always end up M-.'ing into QL dependencies to tweak stuff and then stare at it wondering what to do with that :-P
6:23:29
lukego
splittist: Yeah PRs are a whole other dimension. In the olden days of patch files you could manage your sources any way you wanted but nowadays you really need to maintain a whole github repo to participate in code sharing
6:26:23
lukego
borodust also does a quicklisp distro for the gaming stuff and that seems to work well
6:32:06
easye
What would be the minimal non-CL-dependencies way of styling org files under Hunchentoot?
6:37:57
easye
ACTION wonders about executing elisp in semi-portable ANSI CL for the second time this week.
6:57:42
kagevf
saturn2: I know about mkdir -p, but I have a list of file names from which I want to generate the directories ... if I use mkdir -p, it will turn the entire file path into a directory ... if I try to turn that into a file bash displays an error
6:58:44
kagevf
if I do mkdir -p directory/filename.txt even filename.txt becomes a directory, and I can't do echo "abc123" > directory/filename.txt
7:00:02
kagevf
maybe I could have done something with awk, but just wrapping ensure-directories-exist into a mini-app was faster
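kagevf's mini-app boils down to a call to the standard `ensure-directories-exist`; a minimal sketch (the paths are illustrative):

```lisp
;; ENSURE-DIRECTORIES-EXIST creates any missing directories in the
;; directory portion of a pathname; the file-name portion is left
;; alone, so the file itself can then be written normally.
(let ((path "new-directory/new-sub-directory/file.txt"))
  (ensure-directories-exist path)
  (with-open-file (out path :direction :output
                            :if-exists :supersede)
    (write-line "abc123" out)))
```

This is exactly the behavior `mkdir -p` lacks when handed a full file path: it would turn the final component into a directory too.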
8:02:22
loke[m]
Sorry, I didn't notice that I'm on an IRC-backed channel. This must have looked horrible in an IRC client.
8:05:18
Nilby
ACTION doesn't like unix, which makes crazytown more crazy (let ((file "~/tmp/gralt/baz/moo.txt")) (!= "mkdir" "-p" (nos:dirname file)) (! "echo foo > " file))
8:05:46
nij
Did anyone (intend to) annotate CLHS and make it more friendly, say, with more examples?
8:08:53
beach
nij: The Common Lisp HyperSpec cannot be annotated as-is. It has a very restrictive copyright.
8:11:22
beach
I may not be remembering this right, but I think the ANSI standards document cannot legally be copied, but the dpANS can. And I think the Common Lisp HyperSpec was created from the dpANS. However, the HTML markup is also protected by copyright. I may be wrong, and ANSI may have given permission to LispWorks to create the Common Lisp HyperSpec from the standards document.
8:12:58
beach
nij: I think the right solution is to create a new document from dpANS, and I am pretty sure that is what phoe did.
8:13:33
beach
What we do need is a version of the dpANS in the form of one single LaTeX document, rather than as one TeX document per chapter.
8:13:49
TMA
it depends on the jurisdiction of the creator's birth/citizenship/residence/death at the time the work was first published (or of the publisher, if published posthumously), and it is generally at least 70 years after the event, even 100 in some cases
8:18:31
Nilby
If I didn't dislike both HTML and TeX so much I would try to do it. I've been fixing a texinfo copy for years and it's still messed up.
8:30:39
beach
Again, I think the best thing to do would be to create a single LaTeX document (with multiple files obviously) from the per-chapter TeX files of the dpANS.
8:31:21
beach
Then we could use it for the Common Lisp UltraSpec, but also as a basis for a Common Lisp reference manual, and for the WSCL specification.
8:35:02
scymtym
beach: is there more information on this idea of reformatting the dpANS sources anywhere? like previous attempts, necessary steps, expected outcome, etc?
8:36:29
beach
Nilby: If it is correctly structured, it could be parsed. But that's not the case for the dpANS.
8:37:14
beach
Nilby: The dpANS is pure TeX with lots of custom macros, many of which are for aesthetic purposes only.
8:37:43
beach
Nilby: I totally agree, but that's just making the task even more complicated up front.
8:38:50
scymtym
beach: i see. i meant information that would be useful for making another attempt. that would also include how to get the sources and an explanation of the legal situation, i guess
8:39:59
beach
scymtym: I remember it was tricky to find the sources. But I have them, so in the worst case, I can make them available.
8:41:42
theothornhill
beach: But if it's mostly hindered by the hassle of manually doing things, more hands would surely help?
8:42:10
scymtym
i think making the sources available along with a little explanation of the situation would make the project more approachable if that makes sense
8:43:14
beach
I was aiming for LaTeX to avoid having to translate all the macros, and so that we could use cross references, bibliography references, etc.
8:45:19
beach
I wish gilberth were here. He is the expert in recycling stuff like this. He created the "annotatable" CLIM spec from the LaTeX source.
8:45:52
scymtym
beach: not sure. people chipping away at the problem in private and a coordinated effort don't seem to be mutually exclusive
8:46:04
theothornhill
phoe: Is your effort available somewhere? If not, could you make it available? Otherwise I can try to set up something
8:46:32
scymtym
beach: yeah, seeing the CLIM spec "parser", i wondered how different the CLHS and CLIM are in that regard
8:46:43
Nilby
Profusion of macros seems like what you get when you have super-lisp-hackers writing in TeX.
8:47:56
Duuqnd
It would be nice if there was a free software clone of Symbolics Concordia and the document examiner. Not sure if it would be at all useful for this though.
8:51:57
Nilby
Concordia was resting on quite a highly evolved stack of things that I don't think we have free versions of yet.
8:52:32
Duuqnd
Concordia itself can't be salvaged, I was more thinking something like a clone written on CLIM or something.
8:54:18
beach
Duuqnd: I think jackdaniel is working on some kind of CLIM-based documentation system.
8:57:11
beach
I honestly think the best action would be for each person to take a section or a chapter, whipping up the PDF of the standards document, and then copy-paste from dpANS into a LaTeX document or some other kind of document.
8:59:00
loke[m]
beach: I believe any transformation of the spec should be made into some kind of machine-readable form. LaTeX is fine, as long as the content is parseable without LaTeX itself. Storing the information in sexp form would be ideal.
8:59:42
phoe
but I'll need to restart this from scratch someday because my methodology sucked at the time
9:00:08
phoe
if I ever restart CLUS, I'll want to do it in a reproducible way that starts with dpANS sources and mechanically and reproducibly converts them into whatever format is required
9:00:40
phoe
the most important is *reproducible* because the standard really needs to be copied verbatim, as all holy scriptures must be
9:00:44
beach
loke[m]: But then we have the eternal problem of choosing a format. We never seem to be able to agree upon such a thing.
9:01:17
beach
loke[m]: The reason is simple. There are so many formats to choose from, each one is going to have only a small minority of proponents.
9:01:21
loke[m]
If it's in some sexp form, transforming it to HTML, LaTeX, org-mode or whatever would be trivial.
9:02:22
loke[m]
Well, the details don't matter. It could be something like '("Some text " (:bold "bold text here"))
9:02:56
loke[m]
I dunno, point is that I don't think most people would complain, since it's not any specific format, and every proponent of some other format knows they can easily convert it to their favourite style.
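A toy renderer for the kind of sexp markup loke[m] describes might look like this; the tag set and the HTML output are purely illustrative, not any existing format:

```lisp
;; Render one markup node: a string stands for itself, and a list
;; (:tag child...) maps its keyword tag to an HTML element name and
;; renders the children recursively.
(defun render-html (node)
  (etypecase node
    (string node)
    (cons
     (destructuring-bind (tag &rest children) node
       (let ((name (ecase tag (:bold "b") (:italic "i"))))
         (format nil "<~a>~{~a~}</~a>"
                 name (mapcar #'render-html children) name))))))

;; (mapcar #'render-html '("Some text " (:bold "bold text here")))
;; => ("Some text " "<b>bold text here</b>")
```

Targeting LaTeX or org-mode instead is just a different table of tag translations, which is the point loke[m] is making.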
9:03:32
beach
loke[m]: But if we need to collaborate on this effort, then we can't have each person choose a different format.
9:03:47
no-defun-allowed
Frob it, just make the format #<CLHS {12345678}> - no one will know what it is without a nice print-object method and thus no one can complain.
9:04:22
loke[m]
True. But you'll have the dictator choose the base sexp format. My example above is what I use for my transformed version of the Maxima documentation, and it works fine.
9:05:01
loke[m]
I have written a documentation browser as part of Climaxima that displays the content in a much nicer way than the official HTML docs, and has better search, so I think it's a success.
9:06:16
Nilby
The trouble is, we can see how well separation of semantics from presentation is going for the web.
9:06:36
loke[m]
I was working on an automatic converter for the dpANS documents that created precisely this.
9:07:10
loke[m]
Nilby: My apologies for making assumptions, but did you take a look at the dpANS source? It already contains that information directly in the TeX source.
9:09:02
Nilby
loke[m]: Yes, and I know they did a pretty good job with TeX, but it's certainly not perfect, and they weren't exactly focusing on the hyperspec aspects.
9:11:08
Nilby
TMA: Thanks. I think I started with one from GCL, but Xach's is probably in better shape.
9:13:28
no-defun-allowed
If you can stomach him, sure why not, but that's not my cup of tea. I dunno.
9:15:41
no-defun-allowed
For one not borderline "political" example, his two hundred (thousand?) year languages are very terse, but I write verbose code, because that is what I can read.
9:16:06
splittist
ACTION would be wary of anything funded, as funding implies official, and constitutional conventions are unpredictable see e.g. French revolution, Russian revolution etc. (:
9:20:28
beach
nij: Like I said, the Common Lisp HyperSpec was created either from the dpANS or perhaps from the standards document with special permission. I forget who did it. Maybe Kent Pitman?
9:21:59
no-defun-allowed
What I would do is start a venture commune to get common property for any CL work. This has the pleasant side effect of competing with PG.
9:24:06
beach
nij: Not at all, it was derived from the standardization documents. And those documents took dozens of people to create.
9:24:41
beach
nij: But, if you followed the discussion, the dpANS is in a format that is not easy to exploit.
9:26:05
beach
theothornhill: I am not going to attempt to answer that, because, like I said, any particular format would be defended by only a small minority of us.
9:26:50
beach
theothornhill: And, as I often say, I would like for the entire thing to be encoded as standard objects, with a well defined protocol.
9:27:23
beach
That way, we can't possibly disagree on some surface syntax, because the surface syntax would not be what defines the format.
9:27:43
theothornhill
beach: yeah, but now it looks pretty gnarly, so maybe converting it to a cleaner (as in human readable) format first could be a smart first step?
9:30:18
Nilby
I also agree with phoe that it would be good for it to be reproducible to go from dpANS to an object encoding.
9:35:04
beach
contrapunctus: Thanks for reminding me. Then CommonDoc might be the best recommendation. Let me see how it can be used.
9:36:11
splittist
The problem is that documents (for human consumption) are not trees. At least, not at any reasonable level of abstraction.
9:37:17
beach
splittist: That's why I would like to see documentation split into (say) "chunks" where each "chunk" has a unique label and can be inserted into documents in different ways.
9:38:26
contrapunctus
splittist: reeeeaaally? o.O HTML, XML, and anything that targets them like Org, Markdown, etc, as well as XML schemas like ODF...aren't they all trees?
9:39:01
Nilby
I don't know specifically about CommonDoc, but an object representation is really a graph, and can have circularity, etc.
9:43:12
splittist
headers, footers, footnotes, endnotes, call-outs, tables, pictures; sections of text that don't map to textual content, but to its presentation; annotations that span units of text; styles that do - or don't - compose.
9:44:45
splittist
title pages, front matter, colophons, appendices, schedules, annexes, indexes, tables of contents, tables of authorities, bibliographies, watermarks, borders for paragraphs, borders for pages, ...
9:45:14
nij
So instead of @begin(enum) @item(This is the first item) @item(And this the second) @end(enum), we can just write
9:45:47
beach
nij: Because, again, the main idea is to have the internal CLOS protocol be the main definition of the format. Not the surface syntax.
9:51:50
splittist
CommonDoc seems to be basically the markdown subset of html with nestable sections
9:56:26
TMA
nij: I have had success with running: for i in chap-{1..26}.tex chap-a.tex ; do pdftex $i ; done
9:57:43
TMA
nij: chap-0 contains table of contents, index, list of figures and also some problems that prevent it from finishing pdftex chap-0 successfully without modification
10:02:17
TMA
for the table of contents: replace ".tc" with ".toc" in chap-0.tex ; for the index: run: sort -t: -k2 chap-*.idx > index.idx
10:03:28
White_Flame
nij: CLHS added formatting & links to what might have been plain text in the tex
10:04:14
edgar-rft
nij: the code examples in the CLHS and the "issues" pages were added by Kent Pitman; they are not part of the dpANS TeX version.
10:09:25
Duuqnd
The \input stuff there reminds me a bit of how in Concordia most records were just links to other records. (screenshot for reference: https://i.imgur.com/J9ZWbrC.png)
10:10:24
Nilby
nij: TeX is hard to convert by anything that isn't TeX or TeX-based. PDF and DVI are pre-rendered formats that lose much of semantic content. And all of those formats are designed to be smushed onto dead trees, not clicky-pixels.
10:11:20
Nilby
Of course, one of the cool things about Concordia is that it's kind of a Zmacs mode and has live objects in it.
10:12:45
Duuqnd
I should probably stop playing around with it because it's making me dislike Unix-likes and Windows more than I already did.
10:19:06
pjb
dieggsy: have a look at: (com.informatimago.tools.manifest:distribution) https://github.com/informatimago/lisp/blob/master/tools/manifest.lisp#L144 ; of course, totally ad-hoc.
10:20:37
Duuqnd
Being able to click on an example and having it run is also pretty neat: https://i.imgur.com/h1pAQI1.png
10:25:47
pjb
nij: I guess PG doesn't fund Common Lisp, because he wasn't a billionaire after selling Viaweb. So instead he founded the Y Combinator VC to fund startups to make him eventually a billionaire. Note that if you can pitch a good startup to Y Combinator (and use Common Lisp to implement it), then PG would fund CL, indirectly.
10:27:08
pjb
nij: oh, you meant a standardization process? But once your startup has made it, you will be able to do that yourself, which would be logical, since you will have used CL to build your products!
10:27:49
pjb
PG sold Viaweb to Yahoo!, who hurried to rewrite it in C++… So there's some lost interest in CL there.
10:32:57
Nilby
I think foundational Lisp is a thing that's above the normal economy, like other technologies, e.g. GPS: you can't just throw money at it to make it happen, and it doesn't produce profit directly, but it's super useful and enables whole classes of other profitable endeavors.
10:35:28
no-defun-allowed
It does not transcend reality. No other programming languages make money on their own, other than creating a net loss for everyone dumb enough to use some.
11:24:51
daphnis
to remove the end of a string, is there something more straightforward than (subseq str 0 (1- (length str))) ?