On 27.04.21 at 14:02, Lietz, Haiko wrote:
The 'polbooks' dataset has 105 nodes. An SBM with one block (B=1) has a DL of about 1550 bits. The DL is minimized (DL_min=1300) at B=5. When each node is in its own block (B=105), the DL is maximized (DL_max=1950). Can't I make states of different graphs comparable by taking DL_min/DL_max? It seems like a straightforward application of normalized entropy (https://en.wikipedia.org/wiki/Entropy_(information_theory)#Efficiency_(norma...)) to me.
It's difficult to comment, because I don't know what the objective of the comparison is. If you compute the ratio of the minimum DL to the DL for B=1, this would give you the compression ratio when compared to a baseline random graph model. If you compare this ratio between two networks of different sizes, this gives you an idea of how much more random one is than the other, when compared to a fully random graph with the same density, but no deeper insight.

Best,
Tiago

--
Tiago de Paula Peixoto <tiago@skewed.de>
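For concreteness, a minimal sketch of this computation with graph-tool (assuming the 'polbooks' graph from graph_tool.collection and the BlockState / minimize_blockmodel_dl interface; exact DL values depend on the model variant, e.g. degree correction, and on the stochastic optimization, so they will not match the quoted numbers exactly):

    import math
    import graph_tool.all as gt

    g = gt.collection.data["polbooks"]

    # B = 1: every node in the same block (the fully random baseline).
    state_b1 = gt.BlockState(g, b=g.new_vertex_property("int"))  # all zeros
    dl_b1 = state_b1.entropy() / math.log(2)   # entropy() is in nats; convert to bits

    # B = N: every node in its own block.
    state_bN = gt.BlockState(g, b=g.vertex_index.copy("int"))
    dl_bN = state_bN.entropy() / math.log(2)

    # Minimized DL over B (heuristic, so results fluctuate between runs).
    state_min = gt.minimize_blockmodel_dl(g)
    dl_min = state_min.entropy() / math.log(2)

    print(f"DL(B=1) = {dl_b1:.0f} bits")
    print(f"DL(B=N) = {dl_bN:.0f} bits")
    print(f"DL_min  = {dl_min:.0f} bits")
    print(f"normalized ratio DL_min/DL(B=N): {dl_min / dl_bN:.3f}")
    print(f"compression ratio vs. B=1 baseline: {dl_min / dl_b1:.3f}")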