I have computed the local clustering coefficient for my network and then created a histogram as follows:

import graph_tool.all as gt

g = gt.load_graph('graph_no_multi.gt')

# "The clustering coefficient is normalized only for _simple_ graphs, with at
# most one edge between nodes."
gt.remove_parallel_edges(g)

# create new property map
clustering = g.new_vertex_property("float")

# calculate clustering coefficient
gt.local_clustering(g, prop=clustering, undirected=False)

# make the property map internal
g.vp.clust = clustering

# initialise dictionary mapping each degree to its list of clustering coefficients
clust_k_hist = {}

for v in g.vertices():
    k = v.out_degree()
    c = g.vp.clust[v]
    if k in clust_k_hist:
        clust_k_hist[k].append(c)
    else:
        clust_k_hist[k] = [c]

However, if I now type `print max(clust_k_hist[0])` I get an answer of `2`, which surprises me slightly (similarly, for `k=1` I get a value of `2`). Firstly, I wasn't expecting a clustering coefficient greater than `1`, but also, for a degree of `0` I would expect to find only clustering coefficients of `0`. The documentation states that the out-degree is used in calculating the local clustering coefficient, so I am using the out-degree when compiling the histogram. Have I gone wrong somewhere?
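
In case it helps pin this down, here is a minimal diagnostic sketch (assuming the same graph `g` and the internal `clust` property map created above) that lists every vertex whose stored coefficient exceeds 1, together with its in-, out- and total degree, so one can see which degree the surprising values correspond to:

for v in g.vertices():
    c = g.vp.clust[v]
    if c > 1.0:
        # report the offending vertex with its in-, out- and total degree
        print int(v), v.in_degree(), v.out_degree(), v.in_degree() + v.out_degree(), c

That should at least show whether the vertices with values above 1 are the same ones that end up in the `k = 0` and `k = 1` bins of the histogram.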