
Metaprogramming Erlang the Easy Way

Alex Bowe

[Image: a hypercube]

I’ve recently taken Erlang back up [1], and I wanted to use this blog post to talk about something cool I learned over the weekend.

I am implementing a data structure. Reimplementing it, actually: it is the structure from my thesis – a succinct text index (I will post a blog on this soon).

Why am I reimplementing it in Erlang? The structure involves many bit-level operations, and I wanted to try out Erlang’s primitive Binary type, which seems to allow efficient splitting and concatenation (which I require). Erlang’s approach to concurrency will hopefully also help me experiment with distributing the structure. As a bonus, the functional approach has lent itself well to this math-centric data structure, and my code is much, MUCH cleaner because of it (I love reduce and other higher order functions).

One function I needed to implement was popcount (the number of set bits in a bitvector). For example, if b = 1010 then pop(b) is 2.

There are many methods listed on this blog. One of them is the table method, which precomputes the popcounts of all 16-bit integers (or any width you have space for):

POPCOUNT_TABLE16 = [0] * 2**16
for index in xrange(len(POPCOUNT_TABLE16)):
    # Each entry reuses the already-computed popcount of index >> 1.
    POPCOUNT_TABLE16[index] = (index & 1) + POPCOUNT_TABLE16[index >> 1]

I translated this to Erlang:

View the code on Gist.
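In rough outline the translation looks like this – a sketch only, since the real code lives in the gist; the module name popcount_table and the internal details are just illustrative:

%% popcount_table.erl – a sketch (names are illustrative; see the gist for the real code).
-module(popcount_table).
-export([gen_table/1]).

%% Build a tuple of the popcounts of 0..2^Bits - 1, reusing the already
%% computed entry for I bsr 1, just like the Python loop above.
gen_table(Bits) ->
    Size = 1 bsl Bits,
    Table = lists:foldl(
              fun(I, Acc) ->
                      array:set(I, (I band 1) + array:get(I bsr 1, Acc), Acc)
              end,
              array:new(Size, [{default, 0}]),
              lists:seq(1, Size - 1)),
    list_to_tuple(array:to_list(Table)).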

So now gen_table(Bits) will generate a tuple for me with the popcounts of all integers from 0 to $2^{Bits} - 1$. However, we may gain performance [2] from knowing how big we want the table to be at compile time. If we know the number of bits we want our popcount table to work for, we could type the table directly into the source. But that would impede our flexibility, and make our code ugly.

Enter ct_expand. We can use ct_expand:term( <code to execute> ) to run gen_table() for us at compile time! Now gen_table() only runs when we compile the module – after that, the table is embedded in the compiled .beam file.

First, check out ct_expand (git clone https://github.com/esl/parse_trans.git), and compile it with make [3]. Then just move the .beam files into your project directory. Now we’re ready for some metaprogramming :)
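To see the idea in isolation first, here is a throwaway module (the name ct_demo is mine, purely for illustration; it assumes ct_expand.beam is on the code path):

%% ct_demo.erl – a throwaway illustration of ct_expand.
-module(ct_demo).
-compile({parse_transform, ct_expand}).
-export([seq/0]).

%% ct_expand evaluates lists:seq(1, 10) at compile time and embeds the
%% literal [1,2,3,4,5,6,7,8,9,10] in the .beam, so nothing is computed at runtime.
seq() ->
    ct_expand:term(lists:seq(1, 10)).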

I created a new module for generating our table at compile time:

View the code on Gist.
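Again, the real code is in the gist; roughly, it does something like this (the table-module name popcount_table is my illustrative stand-in):

%% popcount.erl – a sketch (see the gist for the real code).
-module(popcount).
%% Run the ct_expand parse transform when compiling this module.
-compile({parse_transform, ct_expand}).
-export([popcount16/1]).

%% ct_expand:term/1 is evaluated at compile time, so the 2^16-entry tuple is
%% baked into the .beam. popcount_table must already be compiled and on the
%% code path when this module is built. Tuples are 1-indexed, hence N + 1.
popcount16(N) ->
    element(N + 1, ct_expand:term(popcount_table:gen_table(16))).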

The reason we need a new module is that ct_expand:term() requires its parameter to be something it knows about at compile time. This could be an inline fun (which I tried, but it wasn’t pretty), or it can be something you compile before ct_expand:term() is executed – which is why gen_table/1 lives in its own, already-compiled module. Note that our module also needs the -compile({parse_transform, ct_expand}) attribute, with parse_trans and ct_expand compiled and available when we build it.

Let’s test it out:

1> c(popcount).
{ok,popcount}
2> popcount:popcount16(2#1010).
2
3> popcount:popcount16(2#101011).
4

Nice! Thanks to Ulf Wiger for making that so easy :)
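For the index itself, the plan is to popcount whole bitvectors (Erlang binaries) by splitting them into 16-bit chunks and summing the table lookups – something like this sketch (the bitvector module and pop/1 are illustrative, not part of the code above):

%% bitvector.erl – a sketch of popcounting a whole binary/bitstring
%% 16 bits at a time, using popcount:popcount16/1 from above.
-module(bitvector).
-export([pop/1]).

pop(<<>>) ->
    0;
pop(<<Chunk:16, Rest/bitstring>>) ->
    popcount:popcount16(Chunk) + pop(Rest);
pop(Tail) ->
    %% Fewer than 16 bits left: read them as one small integer
    %% (zero-padding doesn't change the popcount).
    Size = bit_size(Tail),
    <<N:Size>> = Tail,
    popcount:popcount16(N).

For example, bitvector:pop(<<255, 0, 2#1010:4>>) gives 10.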

(Image stolen from http://improvisazn.wordpress.com/2010/02/22/meta/)


  1. My only other encounter with it was when I wrote an essay on the rationale of Erlang, which is something I will convert to blog format and post later if anyone is interested. 
  2. The compiler may be able to apply further optimisations, so table access might be faster itself, but the main benefit comes from not having to generate the table while your code is running. Note that I haven’t experimented with it. I will try to run some tests this week and update the blog post. 
  3. If rebar gives you an error about crypto being undefined, you will need the erlang-crypto package. On Mac OS X: sudo port install erlang +ssl. (I had this issue!)


