freenode/#lisp - IRC Chatlog
19:17:13
pjb
O(1)=k₁ O(n)=n*k₂ There are k₁ and k₂ such that, for any n you can store in an actual machine number, O(1)>>O(n). But usually it's O(1)<<O(n), even assuming that n cannot be bigger than the size of the memory.
0:04:06
learning_
what's a good resource for learning to do threading with CCL? the docs just confuse me
0:12:20
pjb
The only CCL-specific things are ccl:process-interrupt and ccl:process-suspend, but you're better off not using them in general.
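For basic use, the pieces actually needed are the ordinary thread-spawning and locking operators. A minimal sketch (CCL-specific API; the bordeaux-threads library is the portable wrapper over these same operators, and the function and variable names here are illustrative):

```lisp
;; Minimal CCL threading sketch: spawn a thread and guard shared
;; state with a lock.  (CCL-specific; bordeaux-threads offers a
;; portable equivalent of process-run-function and with-lock-grabbed.)
(defvar *counter* 0)
(defvar *counter-lock* (ccl:make-lock "counter"))

(defun start-worker ()
  ;; ccl:process-run-function creates a new thread running the
  ;; given function; the lock serializes access to *counter*.
  (ccl:process-run-function
   "worker"
   (lambda ()
     (ccl:with-lock-grabbed (*counter-lock*)
       (incf *counter*)))))
```

Interrupting or suspending threads from outside (process-interrupt, process-suspend) is rarely needed when each worker just runs a function to completion like this.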
0:13:07
pjb
Don't do that with slime, since swank has open sockets, and it'll fail when reloading the image.
0:14:27
pjb
But it should not be needed, since slime loads it automatically. Unless you want to be able to launch the lisp image independently, and connect with slime after the fact.
0:19:00
learning_
I'm reading CCL's page about images; I realize now what you were talking about with the executable. That's pretty cool: just modify your lisp system to be the application you want, save it, and then bam, instead of a lisp dev environment you just have your app
0:20:07
pjb
While developing, you can keep saving the executable lisp image with the default startup function (the lisp toplevel REPL). And when you want to deliver, you can save it with the main function of your application instead.
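Concretely, both cases are a single call to ccl:save-application; note that saving exits the running lisp once the image is written, and MAIN below is a stand-in for your own entry point:

```lisp
;; While developing: save a standalone executable image that starts
;; the usual toplevel REPL.  (save-application exits the lisp after
;; writing the image.)
(ccl:save-application "dev-image" :prepend-kernel t)

;; For delivery: the same call, but with your application's entry
;; point (MAIN is a hypothetical function of yours) as the toplevel.
(ccl:save-application "my-app"
                      :toplevel-function #'main
                      :prepend-kernel t)
```

:prepend-kernel t is what makes the result a self-contained executable rather than a heap image that must be passed to an existing kernel.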
0:21:16
pjb
On the other hand, since having reproducible builds from sources is convenient and desirable too, you may instead have a script used to build the application from the sources, calling quickload etc.
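Such a build script might look like the following (file, system, and function names are placeholders, and it assumes Quicklisp is installed at its usual path); it would be run in a fresh image, e.g. with `ccl --no-init --load build.lisp`:

```lisp
;; build.lisp -- hypothetical reproducible build script.
;; Run from a clean image so the result depends only on the sources:
;;   ccl --no-init --load build.lisp
(load "~/quicklisp/setup.lisp")   ; assumes a standard Quicklisp install
(ql:quickload "my-app")           ; "my-app" is a placeholder system name
(ccl:save-application "my-app"
                      ;; Resolve the entry point at run time so this
                      ;; file reads even before the system is loaded.
                      :toplevel-function (lambda ()
                                           (uiop:symbol-call :my-app :main))
                      :prepend-kernel t)
```

Rebuilding from a clean image each time avoids shipping whatever state has accumulated in a long-lived development image.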
4:26:09
BW^-
and, a non-deterministic algorithm in the sense of http://courses.csail.mit.edu/6.854/16/Projects/B/dynamic-graphs-survey.pdf connected components, what would that mean, how does it impact us that it is non-deterministic??
4:30:18
BW^-
beach: ah so it's like "make 10 iterations and each will have a 10% probability of success, hence you'll have 100% success on average within 10 iterations"
4:31:21
beach
I didn't read the article. I just know very elementary stuff. It very likely means linear time.
4:38:18
beach
Not necessarily. It could be that 0.001% of them need to be "touched". But the number of the ones that need to be "touched" is proportional to the total number of objects. Complexity theory doesn't care about constant factors.
4:43:01
beach
Designing good data structures is both very hard and very interesting. Take for instance the data structure that we all use on a daily basis, namely what I call an "editable sequence", and that we use in the form of an Emacs buffer.
4:43:08
beach
The best worst-time complexity one can hope for with such a data structure is O(log n). But if you choose such a data structure, almost every operation will take that much time.
4:43:09
beach
Emacs uses a data structure that has O(n) worst-time complexity. But in almost all cases, it takes O(1) time, so for the kind of use case that Emacs caters to, the best (according to complexity theory) data structure is really not the best.
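The structure Emacs actually uses is a gap buffer, which can be sketched in a few lines of portable Common Lisp (names here are illustrative, not any editor's real API). The text sits in one vector with a "gap" at the cursor: inserting at the cursor is O(1) amortized, but moving the cursor far away shifts characters across the gap, which is the O(n) worst case mentioned above:

```lisp
;; Minimal gap-buffer sketch: a vector with a "gap" at the cursor.
(defstruct gap-buffer
  (chars (make-string 16))
  (gap-start 0)      ; index of the first free cell
  (gap-end 16))      ; one past the last free cell

(defun gb-insert (gb char)
  "Insert CHAR at the cursor; amortized O(1)."
  (when (= (gap-buffer-gap-start gb) (gap-buffer-gap-end gb))
    ;; Gap is full: double the storage, keeping text on both sides.
    (let* ((old (gap-buffer-chars gb))
           (len (length old))
           (start (gap-buffer-gap-start gb))
           (new (make-string (* 2 len))))
      (replace new old :end2 start)
      (replace new old :start1 (+ start len) :start2 start)
      (setf (gap-buffer-chars gb) new
            (gap-buffer-gap-end gb) (+ start len))))
  (setf (aref (gap-buffer-chars gb) (gap-buffer-gap-start gb)) char)
  (incf (gap-buffer-gap-start gb)))

(defun gb-move (gb pos)
  "Move the cursor to POS; O(distance moved), hence O(n) worst case."
  (let ((chars (gap-buffer-chars gb)))
    (loop while (< pos (gap-buffer-gap-start gb))
          do (decf (gap-buffer-gap-start gb))
             (decf (gap-buffer-gap-end gb))
             (setf (aref chars (gap-buffer-gap-end gb))
                   (aref chars (gap-buffer-gap-start gb))))
    (loop while (> pos (gap-buffer-gap-start gb))
          do (setf (aref chars (gap-buffer-gap-start gb))
                   (aref chars (gap-buffer-gap-end gb)))
             (incf (gap-buffer-gap-start gb))
             (incf (gap-buffer-gap-end gb)))))

(defun gb-string (gb)
  "The buffer contents with the gap squeezed out."
  (concatenate 'string
               (subseq (gap-buffer-chars gb) 0 (gap-buffer-gap-start gb))
               (subseq (gap-buffer-chars gb) (gap-buffer-gap-end gb))))
```

Typing "helo", moving the cursor to index 3, and inserting #\l yields "hello": since editing is overwhelmingly local, the gap rarely moves far, and nearly every operation stays O(1) in practice.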
4:44:10
beach
I know, in some cases I should use Θ(n) rather than O(n), but I think most people here know what I mean.
4:48:16
BW^-
beach: see this paper, http://epubs.siam.org/doi/pdf/10.1137/1.9781611973105.81 , slides https://pdfs.semanticscholar.org/d3b9/3729d50038a868d6dde1ac32f919168877fa.pdf
4:51:25
beach
BW^-: I am sorry, but I have a huge number of much more important tasks on my plate, so you need to look for help with this elsewhere. Reading and understanding an article like that can easily take a week or more of full-time work, and I am guessing that my rate is way too high for you.
5:15:42
BW^-
beach: ah, you mentioned that you have a rate, sounded like you do consultancy at least as a side gig; ok.
5:16:13
BW^-
would "log^2 n * V" normally mean "log(n) * log(n) * V", where log would be the natural logarithm? or the base-10 logarithm?
5:16:29
beach
I could. But I am strapped for time, not for cash. So I set a higher rate for work that is not important for my research.
5:18:35
beach
jackdaniel: Not necessarily. In complexity, the base of the log is unimportant (it's just a constant). It is more likely that it means log squared.
5:32:47
beach
BW^-: Yes, log^2(n) * V normally means what you said, and since the base of the logarithm only makes a difference by a constant factor, the base doesn't matter, since complexity theory doesn't care about constant factors, as I have already pointed out.
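Both points can be seen in a couple of lines of Lisp, using the change-of-base identity log_b(n) = ln(n) / ln(b) (function names here are just for illustration; CL's log already accepts an optional base argument):

```lisp
;; Change of base: log_b(n) = ln(n) / ln(b).  Any two bases differ
;; only by the constant factor 1/ln(b), which O(...) throws away.
(defun log-base (n b)
  (/ (log n) (log b)))     ; the same thing CL's (log n b) computes

;; And log^2 n conventionally means (log n) squared, not log(log n):
(defun log-squared (n &optional (base 2))
  (expt (log n base) 2))
```

For n = 1024, log-base gives 10 with base 2 and about 3.01 with base 10; the ratio between the two is the same constant 1/log₂(10) for every n, so O(log n) names a single class regardless of base.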
5:32:51
jackdaniel
BW^-: I admit to beach that I'm most likely wrong with my rushed suggestion that it was a typo
5:33:58
beach
BW^-: That is why, in complexity theory, only log(n) should be used without reference to a base. If you see the base being used in an O(...) expression, you can count on the authors being second-rate.
5:35:18
beach
You can, of course use the base when you estimate the exact number of operations, but that is not usually done in complexity theory, simply because the number of operations is usually meaningless (because the cost of different operations can vary by orders of magnitude).
6:06:08
beach
jackdaniel: By the way, the international convention for logarithms states that the base 2 logarithm should be written lb n, the base 10 logarithm should be written lg n and the natural logarithm should be written ln n. As usual, Americans are the first to totally ignore international conventions, so you won't necessarily find that all CS papers respect those conventions.
6:20:36
jackdaniel
I've got it after seeing the first link (as if hearing it from you weren't enough ;) but thanks
6:22:11
beach
You know how everything said here in #lisp must be questioned, debated, altered, debunked, denied, etc. I was trying to avoid as much of that as possible.
6:22:59
jackdaniel
people usually tend to agree. the only exceptions are situations when they have strong presuppositions
6:23:55
beach
Correct, but there are so many participants here that if only a small percentage is in the "disagree" category, there is still likely to be a long debate.
6:24:42
beach
"Well, *I* am going to continue using what *I* have always used, because all those pages are wrong".