
9th European Lisp Symposium - Confession 62

2016.05.11 13:58:40

As I'm writing this I'm still in Krakow. Sitting next to me is Till, who joined me for ELS this year. It was a blast, but I'm also really exhausted and my throat is still hurting a bit from all the talking over the past three days. Our flight back to Zürich leaves in about an hour, and I have a test to study for this coming Thursday; I actually would've really liked to stay a bit longer, especially considering there were a few people I would've loved to talk to a bit more. Alas, you can't always get what you want.

But before I go through the entire thing by backtracking, let's instead reverse time all the way back to Sunday. Our flight to Krakow was scheduled for 17:00, so we had ample time to lounge around at home and try to relax a bit before the inevitable stress that is airport security and flying in general. Till had also packed way too much stuff, so we unloaded a bunch to make the carrying lighter. In hindsight I'm really glad we did that, as it turned out that we had to walk around quite a bit in Krakow.

At around two we then set off for the airport, where we had a quick lunch and noticed that Till had left his boarding pass in a book he had packed but that we had then left at home. Fortunately enough –after a bit of trouble with the Swiss website– we managed to download a copy of it to his tablet, so that all turned out fine. I suppose you really can't go on any kind of journey without at least some kind of oversight that gives you a hefty scare.

The plane we flew in was a small jet, but it was still pretty packed. I assume it was mostly Polish people returning home after a quick holiday break. On the flight my stomach got a bit upset, but otherwise it went by just fine. Once we finally arrived in Krakow we got a bit confused about the airport layout, as it was under pretty heavy construction. After some wandering about we managed to find the proper bus stop and get our tickets. I also exchanged way too much money for zloty, most of which is still in my wallet now. I didn't get much of an opportunity to waste it.

The bus ride to the hotel took around three quarters of an hour, so we got to have a good look around the outskirts of the city and its landscapes. The hotel itself was located in an area that looked rather worn down: the streets were not up to par and long stretches of the sidewalk were opened up for construction. After check-in and a short look-see at our room we decided to head on down to the bar and wait for someone to show up. Most of the conference people had already arrived in Krakow before us and, from what I overheard, were having a jolly time at the pre-conference registration party.

About an hour later we were joined by Christian Schafmeister and Joram Schrijver and the discussions immediately fired up. We talked a lot about Clasp and its near future– Christian was pretty worried about what he could show for his talk. He had the impression that people wouldn't be impressed by a new Common Lisp alone. I can't say I agree with that viewpoint; Clasp brings a lot of new stuff to the table that makes it a great addition to the list of implementations. The C++ interop alone is already noteworthy enough, but there are lots of smaller features that could prove very useful for larger projects. The biggest problem with Clasp remains, however: there just aren't enough people working on it to move it along quicker. Even with Christian's incredible speed and dedication, there's only so much he can do on his own. I've been trying to push Clasp into a situation where it is more accessible to other people for quite a while now, but especially with recent changes there's a lot left to be done for that– something that was reflected again throughout the discussions we had during the conference.

Later we were joined by a group of other lispers that were just returning from their previous party to start a new one at the bar. Things got rather lively and all sorts of topics got brought up. At around midnight, however, I had to excuse myself, as I wanted to be at least somewhat fresh the coming morning. The hotel room was alright; at least it didn't smell terrible and was otherwise nicely roomy. Unfortunately the heating was also turned up enough that I couldn't fall asleep for about an hour. Opening the window cooled things down sufficiently and we finally managed to get some good rest in.

Finding the conference building in the morning was a bit tricky, but we managed to discover a kiosk along the way to get some snacks and drinks in. The conference provided plenty of that on its own, but I was still glad to have a nice bottle of ice tea in my bag at all times. We got to the conference hall in time for registration, but the actual conference organisation was oddly delayed. Nevertheless, discussion between the few people that had already shown up sparked almost immediately, so it didn't feel like we had to wait at all.

It was great to see Robert Strandh again as well, although I barely got to talk to him this year, much to my dismay. I'm hoping to remedy that next year. I also met Masatoshi Sano again, but I only got to talk to him on the second day. I met a few other people that I already knew from the previous ELS and had the pleasure of talking to them, but I unfortunately am terrible at remembering names, so I can't list them all here. My apologies.

Moving on to the talks. The first was about Lexical Closures and Complexity, by Francis Sergeraert. At points it was unfortunately –despite the rather heavy focus on Math at ETH– a bit over my head or moving too quickly, so I had a bit of trouble following exactly what was happening. From what I could gather, he uses closures to model potentially infinite or very large problem spaces and then performs various computations and mappings on those, thus still being able to compute real-valued results without wasting enormous amounts of resources trying to model it all.
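To make that idea a bit more concrete, here's a minimal sketch of my own (not code from the talk, and vastly simpler than what was presented): a closure stands in for an infinite sequence, mapping composes closures, and values are only ever computed on demand.

    ;; An "infinite sequence" is just a closure from index to value.
    (defun naturals ()
      "Return a closure representing the sequence 0, 1, 2, ..."
      (lambda (n) n))

    ;; Mapping composes closures instead of materialising anything.
    (defun map-infinite (function sequence)
      "Return a new closure applying FUNCTION to each element lazily."
      (lambda (n) (funcall function (funcall sequence n))))

    (defun infinite-ref (sequence n)
      "Compute only the single element we actually ask for."
      (funcall sequence n))

    ;; (infinite-ref (map-infinite (lambda (x) (* x x)) (naturals)) 1000000)
    ;; => 1000000000000, without ever building the sequence in memory.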

Next up was the language design section of the talks, the first of which focused on automated refactoring tools to aid students in finding style problems in their Racket code. It was fairly interesting to see a brief introduction to the tools used to both analyse and restructure source forms automatically to determine more succinct and idiomatic ways of achieving the same semantic result. It was also rather depressing to see some real-world snippets of code they had gathered from actual students. I can't say I'm surprised that this kind of absolutely horrendous code gets written by people being introduced to a language or programming in general, though one can't help but wonder whether these 20-level-deep nested ifs might stem from something other than the writer being new to it.

Following this was the demonstration of a library that extends the CL type system with a way to type-check sequences, allowing you to express things like plists as a type, which you otherwise could not. It appears to me that this system might prove useful for succinct pattern checking, but since these type definitions don't actually communicate with the compiler in any way, it is unfortunately rather useless for inference or other potential optimisations that could be done if the compiler had actual knowledge of what kind of structure is being described. The code presented also used (declare (type ..)), which is the wrong way to go about something like this. Declarations are intended as promises from the programmer to the compiler. The fact that all sane compilers also insert checks for the declared type at standard optimisation levels is not something that should be relied upon. check-type on the other hand would be perfectly suitable for this. However, at that point you might as well drop the type charade altogether and just have something like a check-pattern function that performs the test. Still, the talk presented an interesting view into how the actual sequence type descriptors are compiled into efficient finite state machines. The mechanism behind the type set merging was very intriguing.
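To illustrate that distinction with nothing but standard CL (process-options-declared, process-options-checked, and check-plist are my own hypothetical examples, not the library's API):

    ;; A declaration is a promise from the programmer to the compiler; whether
    ;; a violation is actually caught depends on the implementation and on the
    ;; optimisation settings in effect.
    (defun process-options-declared (options)
      (declare (type list options))
      (getf options :name))

    ;; CHECK-TYPE is a real, portable runtime assertion: it signals a
    ;; correctable TYPE-ERROR whenever OPTIONS is not a list.
    (defun process-options-checked (options)
      (check-type options list)
      (getf options :name))

    ;; Dropping the type charade entirely: a plain check function that
    ;; verifies the (keyword-keyed) plist structure directly.
    (defun check-plist (plist)
      (assert (evenp (length plist)) (plist)
              "~s does not have an even number of elements." plist)
      (loop for key in plist by #'cddr
            do (check-type key keyword))
      plist)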

Afterwards we heard a talk from Robert about his implementation of editor buffers, presenting an efficient and extensible way to handle text editing. I had read his paper on it before so I was already familiar with the ideas behind it, but it was a nice refresher to hear about it again. I'll make sure to see about hooking in his system when I inevitably get to the point of writing a source editor for QUI. It was also pretty surprising to hear about a topic like this, since when using editors one usually doesn't think about the potential efficiency problems they present– it all just works so well most of the time already. The biggest grievance for me in Emacs at the moment isn't necessarily the editing of text itself, even if that slows down to a crawl sometimes when I'm cruising about with a couple hundred cursors at the same time; no, the problem is dynamic line wrapping. Emacs slows to a complete crawl if you have any kind of buffer with long lines and no line breaks. This, however, I assume has more to do with how the buffer is displayed than with the internal text manipulation algorithm. Maybe Robert has ideas for a good way to solve that problem as well.

During lunch I had the great pleasure of meeting and talking to Chris Bagley of CEPL fame. I'll really have to look into that myself to see which parts of it can be incorporated into Trial. I'll definitely incorporate the Varjo part so that we can use sexprs for GLSL code, but there might be lots of other little gems in there that can be repurposed.

Following lunch was the session on DSLs, starting off with a system to describe statistical tests in lisp. For this talk too I felt like I wasn't familiar enough with the areas it touched upon to really appreciate what was being done. Apparently the system was used to do some hefty number-crunching. Regardless, it's always great to see new discoveries in language evolution that allow a convenient description of a problem without sacrificing computational power for it.

The following talk stepped right into that line as well, presenting a high-performance image processing language called CMera (I believe). It does some really nifty stuff like automatically unrolling loops to avoid having to do edge testing in your tight loops all the time, expanding a single loop over an image into nine different regions that are each optimised for their particular edge conditions. When comparing the code written in it against an implementation of the same algorithm in hand-written C++, the difference in size is absolutely astounding. Not only does this reduction in code size make things much more readable and maintainable, it also allows them to move much faster and prototype things quickly, all without having to sacrifice any computation speed. If anything they can gain speed by letting the compiler use higher-level information about the code to transform it into specialised, large, but performant code.
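As a rough illustration of the region-splitting idea in plain CL (my own one-dimensional toy version, not anything generated by the system from the talk): peel the edges off into their own pieces so that the hot interior loop carries no bounds checks at all. In 2D the same trick yields the nine regions mentioned above: four corners, four edges, and the interior.

    (defun clamped-ref (vector i)
      "Read VECTOR at I, clamping the index into the valid range."
      (aref vector (max 0 (min i (1- (length vector))))))

    (defun box-blur (input)
      "1D three-tap box blur with edge handling peeled out of the hot loop."
      (let* ((n (length input))
             (output (make-array n)))
        (assert (plusp n))
        ;; Left edge: neighbour access needs clamping.
        (setf (aref output 0)
              (/ (+ (clamped-ref input -1) (aref input 0) (clamped-ref input 1)) 3))
        ;; Interior: tight loop with no edge tests whatsoever.
        (loop for i from 1 below (1- n)
              do (setf (aref output i)
                       (/ (+ (aref input (1- i)) (aref input i) (aref input (1+ i))) 3)))
        ;; Right edge: neighbour access needs clamping again.
        (setf (aref output (1- n))
              (/ (+ (clamped-ref input (- n 2)) (aref input (1- n)) (clamped-ref input n)) 3))
        output))

    ;; (box-blur #(1 2 3 4 5)) => #(4/3 2 3 4 14/3)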

Finally the last session of the day focused on demonstrations, firing right off with CL-MPI, a library to use the MPI system from within lisp while taking care of all the usual C idiosyncrasies, thus presenting a very handy API to run highly parallel and distributed code. Really interesting to me was their system to synchronise memory between the individual machines automatically. While the system seems pretty neat, I couldn't help but wonder whether this might a bit too easily lead to either assuming synchronisation when none is present, or introducing a bottleneck in the system when it has to synchronise too often. Either way, I'm glad I don't have to worry about highly distributed systems myself– measly threading alone is enough of a headache for me.

After this we had a really nice showing of an interactive computer vision system in Racket using the Kinect. That sounded like a very fun way to introduce students to Racket and computer vision in general. The few demos he showed also seemed very promising. Given last year's talk on computer vision, I might really have to take the time to look into this stuff more closely some time.

The last talk for the day focused on the problem of lexical variables in CL when debugging. Since lexical variables are often compiled away completely, it's tough to see their values during debugging. SBCL often does a pretty good job at retaining this information when compiling with (debug 3) in my experience, but there are certainly times when it doesn't, and I can see the value in implementing a system that ensures this gets preserved in every case, especially one that's portable across implementations. Apparently there's some really nasty code walking necessary to get the job done, and there's still no code walker around that actually works as expected on all major implementations, which is a bit of a downer.
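For reference, a minimal sketch of the kind of situation meant here, assuming SBCL's usual behaviour (other implementations may differ): the high debug setting is what tends to keep the lexicals around for inspection.

    (defun sum-of-squares (a b)
      ;; With a low DEBUG setting the compiler is free to compile X and Y away
      ;; or at least lose their names; with (debug 3) they tend to remain
      ;; visible by name in the debugger frame when we hit the BREAK below.
      (declare (optimize (debug 3)))
      (let ((x (* a a))
            (y (* b b)))
        (break "Try inspecting this frame for X and Y.")
        (+ x y)))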

As usual closing off the day was a session of lightning talks. This year mine was a form of continuation on my last year's talk about Qtools. I talked very briefly about Qtools-UI, the effort to replace Qt parts that aren't extensible enough and thus provide a more convenient base for people to work with. I'm not sure if I managed to convince anyone to contribute to it, but hopefully it'll at least linger around in some people's heads so that they might remember it if they ever come across the need to write a GUI.

The rest of the lightning talks I'm afraid to say I can't quite remember. My memory is rather shoddy at the moment and the only reason I remember all the other talks is because I looked up their titles on the website. So, my apologies for skipping out on this, but I think the article is already plenty long as it is so going into detail on all these would only make it all the longer.

The first day was concluded by Chris, Christian, Joram, Till, and me going back to the hotel for a brief chat at the bar, followed by a quest in search of pizza. We looked up a bunch of places near the hotel and went on our way. The first one we encountered was too full and the second was near a campus filled with students drinking booze and having a BBQ; we deemed it a bit too lively for us. The third one was inside a student dorm building, but had enough space for us to spend some hours talking and eating. The pizza tasted very different from what I'm used to. It wasn't bad, but also not really my kind of thing.

Some more talking and a good night's rest later it was already Tuesday. Time flies when you're having a blast. We got up a tad later this time around and walked through a convenience store. I was relieved to see that, just like in all the stores I'm used to, the layout is as confusing as possible, so that you have to waste lots of time walking past everything except what you're looking for.

Since we were close on time and there was a group photo to shoot, we didn't get any time to talk before the first talk. It started off with a presentation of the Julia language by Stefan Karpinski. Julia seems like a really nice replacement for Matlab and I'd very much welcome it if it gained more ground that way. However, some of the points that were presented here didn't really make much sense to me. One thing that was emphasised as distinguishing Julia is that number types and arithmetic aren't in the specification, but rather defined in Julia code itself. This sounds like a neat little thing to do for curiosity's sake, but I just can't see the benefit of it. Not to mention that now, instead of reading some pages of a spec, you have to read some pages of code with possibly arcane interconnections and optimisations going on. Whether this makes anything clearer is really dubious to me. I'm guessing that this part was mentioned at all mostly to bring something new to the table, since it would otherwise be pretty hard to impress lispers. Another thing I was confused about is that he seemed to hint at the possibility of writing functions that receive the compiler's inferred type information and can use it to generate different code. That is something I've missed in CL in places where macro functions could be further optimised with that kind of information. However, the example he showed didn't seem to use that information in any way, or even obtain it at all, so I'm not sure whether I missed that part or what exactly is going on there.

A quick break later we got to the implementations part of the talks. Robert presented his modern implementation of the loop macro, which uses a system of combinatory parsing and full CLOS to allow it to be extensible. I'd love to have a portable way to extend loop as iterate really doesn't appeal to me much at all and there's currently nothing else that is extensible for custom sequences and clauses and the like. I'm not sure if his implementation will be adopted, but I would definitely welcome it.

The next two talks, which were about source translation in Racket and STM in Clojure, I'm sad to say I can't really talk about, because I was distracted by a bug I had discovered moments earlier and couldn't help but try to fix. I got absorbed all too easily, so I didn't catch much of them.

During the lunch break I got to talk to Masatoshi Sano for a good while; we mostly discussed the prospect of using Roswell for my Portacle project and talked about some of the difficulties of, and ways to deal with, what I'll just call “The Windows Situation”. I later talked to Joram a bit about potentially getting him involved in the Colleen3 or Trial projects, for which I'd heartily welcome some contributors or even just discussion partners. I'm very excited about the prospect of working with him on that.

And then came the big one. Christian's talk was, just like last year, pretty comparable to a bomb dropping. Some suspect that he's not of this world. Ignoring the question of his conception, hearing him in his element talking about chemistry is always a treat. He showed off a nice demo of what CANDO is capable of and it really looks like a nicely lispy way of performing chemistry modelling. Mind you, this is said with my practically nonexistent knowledge of chemistry, so I can't really claim to have a grasp on what you can actually do with it. Given that he's been at this for such a long time though, I'm convinced he knows what he needs to do in order to create things like what he presented to us– a perfect water filtering membrane. I'm still glad to have gotten involved with Clasp; it has given me lots of really great talking and thinking opportunities. It's exciting to listen in on or discuss the in-depth details of what's going on inside Clasp. Now though, Christian needs to get his chemistry work off the ground so that he can get enough funding to continue Clasp. Unfortunately grants have been hard to come by for him, and that looming pressure has haunted the project many times before. Hopefully he'll be able to prove the worth of Clasp and Cando in the near future. I wish him all the luck.

Next up we had a presentation on the question of how different implementations of a depth of field effect perform on different hardware. This was mostly a concerning example of how much code still needs to be tuned to the hardware it's being run on today. Maybe the effect is even stronger now, since lots of hardware in the same category still differs a lot in what it is adept at. Thankfully I am mostly staying clear of such optimisation lunacy.

Some more coffee passed by and we were ready for the last session of talks for the conference. The debut was made by James Anderson, presenting his research into how source files are connected with each other and what kind of dependencies exist between them. He wrote a system that analysed the entire Quicklisp ecosystem's files for symbol references between things and then crunched all that data down into interesting graphs using his own database technology. The graphs for larger systems like qtools-ui look like a complete jumble, as expected. He also mentioned the difficulties of trying to extract this kind of relationship information, since he could only inspect code by reading it in. This is a particular problem for methods, since they are likely to be defined in lots of different source files and potentially packages, but without at least type inference or even runtime information you can't really know where the dependency goes. His initial motivation for doing this seemed to be that he doesn't want to have to write the dependency information into his system definition files and that the system should be able to infer it automatically. I'm not so sure that this is a good idea, or that it is in fact such a problem. It seems like a rather minor inconvenience to me, but then again I've never written systems on a very large scale.

Closing it all off we had a presentation of Bazel and how it can be used to build Lisp, the most promising feature being that you are able to use it to statically link libraries into an SBCL binary. I'm not convinced that Bazel is the tool to use for building unless you already have a gigantic project or ecosystem surrounding it, however. It seems ridiculously heavy-weight, and paying the price for its different way of configuration and operation does not seem worth the benefits unless you really need static linking and can't make do with shared libraries. Still, it gave me some things to think about for the eventual effort of writing my own build system, whenever that will happen.

Then as before we had some more lightning talks to round it all off. I didn't do a second one this time around, mostly because I didn't really know what to talk about for Trial. It did not seem finished enough to present yet. Maybe next year.

Finally the conference was rounded off by some goodbye messages from the conference organisation and the announcement that the next ELS might be happening in Brussels. We then had two hours left before the banquet. Michal Herda guided us into the inner parts of the city where we got to see some nice architecture. Along the way Chris Bagley and I chatted about the problems in writing game engines and games in general and some other assorted topics.

Once we arrived at the banquet I was pretty beat. The place we stayed at looked oddly high-brow. The tables were set with all the usual things you get in fancy restaurants– multiple wine glasses, forks, and knives. It seemed in odd conflict with the rest of the getup of the conference attendees; we all looked far too casual for this kind of thing. Due to my pickiness I couldn't eat much of the food that was being served either. It certainly looked fancy, but I don't think the taste was in accordance with that. From what I've heard from others or noticed in their expressions, it was nothing exceptional. No matter for me either way though, since I came all this way not to eat, but to finally be with people that understood me and vice versa. And I got ample opportunity to do exactly that. During the dinner I mostly talked with Joram about Colleen3 and Markless.

Soon enough it hit 22:00 and we had to leave. Due to the long way back to the hotel and other delays in saying goodbye to everyone, we only arrived around two hours later, at which point I just slammed myself into bed after making sure that I had the boarding pass for the next morning.

And so today I woke up at six, just to make sure that we had plenty of time for potential mistakes. Three quarters of an hour later we were on the bus to the airport. Half an hour later we had already passed through security and were waiting at the gate, at which point I started writing this. The flight after was rather annoying. It was pretty packed and there were lots of screaming children on board, the bane of any flight passenger. I have no idea why there were so many families on board, let alone on a Wednesday morning, let alone from Krakow to Zürich. Despite hellish screams of tortured souls haunting us along the way we made it back safely.

Now it's already 16:00 and aside from getting home and continuing to write this I only got the time to cook some nice lunch– the kitchen remains to be cleaned.

I suppose I should try to form some sort of a conclusion here to end this overly long article on a good note. If it wasn't already apparent from my descriptions, I had a grand time at the conference and I'm really glad that I could attend again this year. A huge thanks to everyone that was willing to talk to me and especially to all the people that got involved to make it all happen. Hopefully it'll happen again next year; I'm definitely looking forward to it.

For now though I have to get back to work hacking lisp. There's so much left to be done.


Written by shinmera