Here is the second part of the April technical review.

I mentioned in a previous post that Peter Norvig recently released his famous book, Paradigms of Artificial Intelligence Programming. Now I hear there are also experimental translations in Python and in Haskell (h/t @Jose_A_Alonso).

Although I am quite happy with how this site looks by now, I came across a very nice Jekyll setup, Lanyon, with a base16 color scheme. If you are after clean web pages, don’t forget Butterick’s Practical Typography: there is so much to learn there about choosing the right font and building well-crafted documents, including designing websites. Last but not least, it has been built using Racket and the Pollen publishing engine. Regarding color themes, I really like base16, and one of the themes I liked a lot for Emacs was Spacegray, which is based on it. I also recently discovered the Nord theme (there is also a custom package for Doom Emacs, which is what I actually use), and I must admit I like it a lot. Color schemes are available for Vim and iTerm2 as well.

Did you know that Algebraic Topology, by Allen Hatcher, is freely available? I really like what the author says on the homepage: “I have tried very hard to keep the price of the paperback version as low as possible.” If you need more reading for the weekend, and if you work in the ML or AI world, the Proceedings of Machine Learning Research from the International Conference on Artificial Intelligence and Statistics are available on the JMLR site. Or, if you are more versed in linear algebra, go check out Linear Algebra with Applications, by W. Keith Nicholson; it is free, and very complete.

Twenty years. It has been more than 20 years since Thomas Lumley wrote an article in the Journal of Statistical Software presenting an implementation of Generalized Estimating Equation models in LispStat. Remember that beautiful software that Luke Tierney wrote for his PhD thesis? It is very likely that nobody uses it anymore, but I still have this little piece of code running on my MacBook. As Jan de Leeuw once said, “(a)bandoning XLISP-STAT” late in the 90s soon appeared as a necessity as S and then R became the lingua franca of statistics, even if Lisp remains a nice PL in its own right.
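As a quick refresher on the method itself, independent of any implementation, the estimating equations at the heart of GEE take the standard form

$$U(\beta) = \sum_{i=1}^{n} D_i^{\top} V_i^{-1} \left( y_i - \mu_i(\beta) \right) = 0,$$

where, for cluster $i$, $\mu_i(\beta)$ is the mean under the marginal model, $D_i = \partial \mu_i / \partial \beta$, and $V_i$ is the working covariance matrix built from a chosen working correlation structure; robust (sandwich) standard errors then protect against misspecifying that structure.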

> Learning a new programming language every once in a while is both necessary and refreshing. Professional programmers are sometimes advised to learn at least one new language each year. It is seldom pleasurable, however, to switch one’s main language, for the same reason why it is stressful to move to another country in the middle of one’s life. This must be one of the reasons why a language such as FORTRAN refuses to die. It does its job well, and people have no reason to switch. I was not so fortunate, since I had to give up PL/I in the seventies, APL in the eighties, and now Lisp in the nineties.

We can’t do data science in a GUI, but I still do not believe my web browser is the best text editor ever. I came across a blog post discussing why a Jupyter notebook is not an option when it comes to writing a scientific paper (with all honesty and glory, and the like). See also Jupyter, Mathematica, and the Future of the Research Paper, and the related HN thread. In the meantime, I discovered CodaLab, which lets you build computational notebooks in any language (via a Docker container) (h/t Lynn Cherny).

I learned that there are many things VS Code can do, and also that Atom has been revamped, with Teletype and more IDE-like features. I miss the ability to hack the editor using some old-fashioned Emacs Lisp, and Magit of course, but I believe it is still the best option for people who do not want to invest much time in learning Vim or Emacs.

Probability Theory (For Scientists and Engineers) is a new tutorial written by Michael Betancourt. It relies on R and Stan. There is more to see on Michael’s website, e.g. Robust Statistical Workflow with RStan. By the way, if you haven’t followed what is going on with the brms R package, it now supports the LOO model comparison and model weighting approach to Bayesian computation developed by Aki Vehtari and colleagues.
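The intuition behind LOO-based model weights is easy to sketch: given each model’s estimated expected log predictive density (elpd_loo), pseudo-BMA weights are simply proportional to exp(elpd). Here is a toy illustration in Python (the elpd values and model names are made up; a real workflow would get them from PSIS-LOO, e.g. via the loo package):

```python
import math

def pseudo_bma_weights(elpd):
    """Pseudo-BMA weights: a softmax over elpd_loo values.

    `elpd` maps model names to their estimated expected log
    predictive density (elpd_loo); higher is better.
    """
    # Subtract the max before exponentiating, for numerical stability.
    m = max(elpd.values())
    raw = {k: math.exp(v - m) for k, v in elpd.items()}
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

# Hypothetical elpd_loo estimates for two candidate models.
weights = pseudo_bma_weights({"m1": -245.3, "m2": -243.1})
```

Note that Vehtari and colleagues actually recommend stacking (or pseudo-BMA+ with a Bayesian bootstrap) over these plain weights, which brms exposes through its model weighting functions.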

While it is now well over, I keep reading some Advent of Code solutions from people I follow. I enjoyed listening to Joel Grus, who happened to use Python 3 for the occasion. More recently, Fred Hebert did it in Erlang.
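For a taste of the genre, here is a sketch of a classic Advent of Code warm-up, summing a list of signed frequency changes in the spirit of the 2018 day 1 puzzle (the input below is the puzzle’s own small example):

```python
def total_frequency(changes):
    """Sum a sequence of signed frequency changes like '+1' or '-2'."""
    # int() already understands a leading '+' or '-' sign.
    return sum(int(c) for c in changes)

result = total_frequency(["+1", "-2", "+3", "+1"])  # 1 - 2 + 3 + 1 = 3
```

Part of the fun of following other people’s solutions is seeing how idiomatic each language makes even a one-liner like this.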

The Cars • Candy-O