freenode/#sicl - IRC Chatlog
15:44:28
Bike
By "subject to capture" I meant (block name (f (lambda () (return-from name)))) where F is unknown. <- the dynamic environment doesn't have to be captured there, just the timestamp-like-part, which it already is, like we discussed
15:49:27
Bike
still need to pass the dynamic environment to calls, guess it can just be a normal argument...
16:51:05
Bike_
oh, and tagbody wouldn't need a successor to the end, there's no way to nonlocally enter there (unless there's a tag at the end, in which case it would have one anyway)
20:30:29
drmeister
beach: I'm looking at improving performance in HIR processing. One of the things that I've been looking at for a while is the map-instructions-xxx functions. As you probably recall they use a hash-table to keep track of nodes that have been visited.
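[editor's note: a minimal sketch of the visited-set pattern described above, where a fresh EQ hash table tracks already-seen instructions during graph traversal; SUCCESSORS stands in for the real Cleavir accessor and is an assumption here]

```lisp
;; Sketch of a MAP-INSTRUCTIONS-style traversal: walk the HIR graph
;; from START, calling FUNCTION once per instruction, using an EQ
;; hash table so shared/cyclic nodes are visited only once.
(defun map-instructions-sketch (function start)
  (let ((visited (make-hash-table :test #'eq))
        (worklist (list start)))
    (loop while worklist
          for instruction = (pop worklist)
          unless (gethash instruction visited)
            do (setf (gethash instruction visited) t)
               (funcall function instruction)
               ;; SUCCESSORS is a placeholder for the real accessor.
               (dolist (succ (successors instruction))
                 (push succ worklist)))))
```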
20:30:54
drmeister
In Clasp they rehash 8 or 9 times as they ramp up in size and then they are tossed away.
20:31:21
drmeister
I'm trying to measure the impact of this when I compile-file in parallel - multiple threads each growing hash-tables like that.
20:32:01
drmeister
I've gone and changed Clasp's hash tables to use open addressing rather than chaining. It's easier to reason about their size that way.
20:33:17
drmeister
make-hash-table has a :size argument - and the first thing I'm trying is to pass a large number that represents the upper limit of hash-table size when running map-instructions on forms that generate some of the largest HIR graphs.
20:34:23
drmeister
I'm using (make-hash-table :test #'eq :size 524288) currently - that's the largest hash-table that I've seen while building babel.
20:34:44
drmeister
How do you feel about the :size argument to make-hash-table? Is this a reasonable thing to do?
20:36:12
drmeister
That is going to take up about 8MB - I figure memory is cheap and these hash tables are temporary - so why not start them off really large.
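[editor's note: the arithmetic behind the 8MB figure, assuming an open-addressing table that stores one 8-byte key word and one 8-byte value word per slot, ignoring load-factor headroom and header overhead]

```lisp
;; 524288 slots, 16 bytes per slot (8-byte key + 8-byte value):
(* 524288 (+ 8 8)) ; 8388608 bytes = 8 MiB
```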
20:37:25
drmeister
An alternative plan would be to try to guess how large the hash-tables will get: measure the size of the AST, correlate it with the size of the resulting HIR graph, and use that correlation to make a prediction.
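[editor's note: a hypothetical sketch of that prediction, assuming a roughly linear AST-to-HIR correlation; the RATIO and SLACK values are made-up placeholders that would have to be measured empirically]

```lisp
;; Guess a :size for the visited table from the AST node count.
;; RATIO approximates HIR nodes per AST node; SLACK leaves headroom
;; so the table is unlikely to rehash.
(defun estimated-table-size (ast-node-count &key (ratio 4) (slack 3/2))
  (ceiling (* ast-node-count ratio slack)))

;; Usage: (make-hash-table :test #'eq
;;                         :size (estimated-table-size ast-count))
```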
21:49:39
drmeister
Setting the size to the worst case has problems - it slows things down on average.