Now I understand that `edges_dl` specifically encodes the flat prior. I have two follow-up questions:

1. How can I access the individual terms in Eq. (41) of the PRE paper, i.e. the level-wise entropies of the edge counts, each given by Eq. (42)? For the "lesmis" dataset, the bottom-most level has the entropy:
In [•]: nested_state.level_entropy(0)
Out[•]: 630.133156768878
This is exactly the sum of the three entropy terms "adjacency" (332.24632), "degree_dl" (170.10951), and "partition_dl" (127.77732), but I could not find an explanation for why the edge-count entropy is missing (see the P.S. below for a sketch of the decomposition I am after).

2. I found that `nested_state.levels[0].entropy(deg_entropy=True) - nested_state.levels[0].entropy(deg_entropy=False) < 0`. I expected this difference to equal the negative logarithm of Eq. (28) of the paper, which should be positive, so I am not sure what went wrong.

Thanks,
Tzu-Chi
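P.S. In case it helps, here is a minimal sketch of the decomposition I am trying to obtain for question 1. I am assuming that the boolean keyword arguments of BlockState.entropy() (adjacency, degree_dl, partition_dl, edges_dl) act as independent toggles, so that each term can be isolated as a difference; also, minimize_nested_blockmodel_dl() is stochastic, so this snippet will not reproduce the exact numbers quoted above.

import graph_tool.all as gt

g = gt.collection.data["lesmis"]
nested_state = gt.minimize_nested_blockmodel_dl(g)   # stochastic fit; numbers differ per run

bstate = nested_state.levels[0]                      # bottom-most level

S_full = bstate.entropy()                            # all terms switched on
# take each term as the drop in entropy when only that term is switched off
S_adjacency    = S_full - bstate.entropy(adjacency=False)
S_degree_dl    = S_full - bstate.entropy(degree_dl=False)
S_partition_dl = S_full - bstate.entropy(partition_dl=False)
S_edges_dl     = S_full - bstate.entropy(edges_dl=False)

print(S_adjacency, S_degree_dl, S_partition_dl, S_edges_dl)
# in my runs the sum of the first three matches level_entropy(0), with no edges_dl contribution
print(S_adjacency + S_degree_dl + S_partition_dl, nested_state.level_entropy(0))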
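P.P.S. And this is essentially the comparison behind question 2, continuing from the state above; I read the printed difference as the negative logarithm of Eq. (28), so I expected it to be positive:

S_with    = bstate.entropy(deg_entropy=True)
S_without = bstate.entropy(deg_entropy=False)
print(S_with - S_without)   # comes out negative in my runs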