I am curious what is being used to calculate the standard deviation of the
average in gt.vertex_average and gt.edge_average
>>> t2=gt.Graph()
>>> t2.add_vertex(2)
>>> t2.add_edge(t2.vertex(0), t2.vertex(1))
>>> gt.vertex_average(t2, "in")
(0.5, 0.35355339059327373)
Now, shouldn't the std be σ(n) = sqrt(((0-0.5)^2 + (1-0.5)^2)/2) = 0.5 ?
Also σ(n-1) = sqrt((0.5^2 + 0.5^2)/(2-1)) ≈ 0.70710.
0.3535 is sqrt(2)/4, which happens to be σ(n-1)/2, so it seems there is some
relation to that.
A little bigger graph.
>>> t3=gt.Graph()
>>> t3.add_vertex(5)
>>> t3.add_edge(t3.vertex(0), t3.vertex(1))
>>> gt.vertex_average(t3, "in")
(0.2, 0.17888543819998318)
Now, we should have the series 0,1,0,0,0 for the vertex in-degrees.
Windows calc gives σ(n) = 0.4 and σ(n-1) ≈ 0.44721, so where does 0.1788854
come from?
The reason I am asking is that I have a large graph where the average looks
quite alright, but the std makes no sense: going by the histogram, the degree
values are quite a bit more spread out than the std would indicate.

View this message in context: http://maindiscussionlistforthegraphtoolproject.982480.n3.nabble.com…
Sent from the Main discussion list for the graphtool project mailing list archive at Nabble.com.
Hello!
I've got graphviz installed with the python2.7 bindings.
However, when I do:
graph_draw(ug, vprops={"label": ug.vertex_index},
output="twonodes.pdf", layout="fdp")
I get the following error:
graph G {
graph [outputorder=edgesfirst, mode=major, overlap=true, ratio=fill,
size="5.905512,5.905512", start=3019255687130444316];
node [label="\N", shape=circle, width="0.105", height="0.105",
style=filled, color="#2e3436", fillcolor="#a40000"];
edge [arrowsize="0.3", color="#2e3436", penwidth="1.0"];
graph [bb="0,0,425,425"];
0 [label=0, width="0.47", height="0.47", pos="330,101"];
1 [label=1, width="0.47", height="0.47", pos="93,324"];
0 -- 1 [pos="318,113 277,152 147,274 106,312"];
}
Error: renderer for pdf is unavailable
Out[8]: <PropertyMap object with key type 'Vertex' and value type
'vector<double>', for Graph 0x5ff3f90, at 0x6001290>
At the same time, if I save the above dot expression as test1.dot and
run the following command from the command line:
$ dot -Tpng test1.dot -o graph1.png
I get a beautiful graph1.png.
Graphviz is installed in a non-standard location, as I don't have admin
rights. However, the graphviz folder is in LD_LIBRARY_PATH and gv.py
is in PYTHONPATH.
I would very much appreciate your advice on what else graph-tool might
need in order to find the libraries, and/or how to test this further.
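One way to narrow this down from the command line (assuming the graphviz CLI
tools come from the same install the Python bindings use) is to ask dot which
renderers it knows about, and to regenerate its plugin configuration, which is
a common fix for "renderer ... is unavailable" with relocated installs:

```shell
# An unrecognized -T value makes dot list every format it supports
dot -Tbogus /dev/null 2>&1 | head -n 1
# Regenerate graphviz's plugin config file next to its libraries
dot -c
```

These are diagnostic commands whose output depends on the local graphviz
install; if "pdf" is missing from the list, the cairo plugin was not found.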
Many thanks,
Mikhail
Hi Tiago,
I am trying to calculate the shortest distances of a graph after applying a
filter. I have code that looks like this:
g = gt.load_graph("myGraph.xml", format="xml")
# for later use
distances = gt.shortest_distance(g)
# extract the components of the graph
comp = gt.label_components(g)
# This splits the graph into several components.
# I want to calculate the shortest distances
# for component 2, for example.
filtering = g.new_vertex_property("boolean")
for v in g.vertices():
    if comp[v] == 2:
        filtering[v] = True
    else:
        filtering[v] = False
# set the vertex filter
g.set_vertex_filter(filtering)
distances_comp = gt.shortest_distance(g)
The last line of code raises a segmentation fault. I have plotted the
filtered graph and it is correct; I can also calculate the
local clustering coefficient without problems. Am I doing something wrong?
Is there any other way to filter the graph and calculate the shortest
distances? Is this a bug?
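As an aside, the boolean filter can be built without the per-vertex Python
loop by working on the property maps' value arrays (in graph-tool, comp.a and
filtering.a expose numpy arrays, so filtering.a = (comp.a == 2) should do it).
A sketch of just the mask step, using a plain numpy array to stand in for
comp.a:

```python
import numpy as np

# Stand-in for comp.a: per-vertex component labels
comp_a = np.array([0, 2, 1, 2, 2, 0])

# Boolean mask selecting the vertices of component 2; with graph-tool
# this array would be assigned to the filter via filtering.a
mask = comp_a == 2
print(mask.tolist())  # [False, True, False, True, True, False]
```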
Thanks so much,
Juan
Hi
I asked earlier how to utilize parallel processing in my code. I have now
installed graph-tool using the precompiled packages for Ubuntu. Does that mean
I'll have to reinstall it from source with the OpenMP option enabled in the
configure command?
Shubham

http://about.me/shubham.bhushan
Hi,
I am trying to read graph-tool's GraphML output using networkx.
https://github.com/networkx/networkx/issues/843
Unfortunately this does not work with any of the vector_* type property maps
which graph-tool uses. Have you encountered this issue before?
It seems the right thing to do might be to extend your GraphML to hold the
vector_* attributes as detailed here:
http://graphml.graphdrawing.org/primer/graphml-primer.html#EXT
Is there some reason why it was done the way it is? How do you manage
reading/writing GraphML data to and from other tools?
In the meantime, it might be possible to hack some read support for
graph-tool's XML into networkx. To this end, could you please advise how to
parse the 'key1' data (it should be two floats)?
<node id="n1">
<data key="key0">6</data>
<data key="key1">0x1.5c71d0cb8d943p+3, 0x1.70db7f4083655p+3</data>
</node>
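For reference, those tokens are C99 hexadecimal floating-point literals
(writing doubles this way lets them round-trip exactly), and Python can parse
them directly with float.fromhex. A minimal parser for such a vector<double>
value (the function name is just for illustration) might look like:

```python
def parse_vector_double(text):
    # The vector<double> value is a comma-separated list of
    # C99 hex float literals, e.g. "0x1.8p+1" == 3.0
    return [float.fromhex(tok.strip()) for tok in text.split(",")]

vals = parse_vector_double("0x1.5c71d0cb8d943p+3, 0x1.70db7f4083655p+3")
print(vals)  # two ordinary doubles, roughly [10.89, 11.53]
```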
thanks


Hi
New to graph-tool. I built it from source on Fedora 16 (64-bit), with
cairomm enabled. I tried to run the quick-start example; however, when
executing
graph_draw(g, vertex_text=g.vertex_index, vertex_font_size=18,
... output_size=(200, 200), output="twonodes.pdf")
I get the undefined symbol
_ZN5Cairo7ContextC1EP6_cairob
However, the symbol above is defined in libcairomm-1.0.so; see the nm
output below:
nm /usr/local/lib/libcairomm-1.0.so | grep -i _ZN5Cairo7ContextC1EP6_cairob
0000000000014b90 T _ZN5Cairo7ContextC1EP6_cairob
Interestingly, the library libgraph_tool_draw.so does indeed have the
undefined symbol:
U _ZN5Cairo7ContextC1EP6_cairob
but does not depend on the cairomm library (see the ldd output below). Unless
the library is loaded dynamically, I do not understand why.
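For reference, the mangled name can be decoded with c++filt (part of
binutils), which shows exactly which constructor is unresolved and confirms
it belongs to the cairomm ABI:

```shell
# Demangle the unresolved symbol from the import error
c++filt _ZN5Cairo7ContextC1EP6_cairob
# -> Cairo::Context::Context(_cairo*, bool)
```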
Any advise appreciated
regards Sasho
ldd
/usr/local/lib/python2.7/site-packages/graph_tool/draw/libgraph_tool_draw.so
linux-vdso.so.1 => (0x00007fffc46da000)
libstdc++.so.6 => /usr/lib64/libstdc++.so.6 (0x00007f99fd6cd000)
libpython2.7.so.1.0 => /usr/lib64/libpython2.7.so.1.0 (0x00007f99fd30e000)
libboost_iostreams.so.1.49.0 => /usr/local/lib/libboost_iostreams.so.1.49.0
(0x00007f99fd0f6000)
libboost_python.so.1.49.0 => /usr/local/lib/libboost_python.so.1.49.0
(0x00007f99fceaa000)
libboost_regex.so.1.49.0 => /usr/local/lib/libboost_regex.so.1.49.0
(0x00007f99fcbc8000)
libCGAL.so.9 => /usr/local/lib/libCGAL.so.9 (0x00007f99fc9a5000)
libexpat.so.0 => /usr/local/lib/libexpat.so.0 (0x00007f99fc77c000)
libbz2.so.1 => /lib64/libbz2.so.1 (0x00007f99fc54d000)
libm.so.6 => /lib64/libm.so.6 (0x00007f99fc2c9000)
libc.so.6 => /lib64/libc.so.6 (0x00007f99fbf11000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f99fbcfb000)
/lib64/ld-linux-x86-64.so.2 (0x00000030cea00000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f99fbadf000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f99fb8db000)
libutil.so.1 => /lib64/libutil.so.1 (0x00007f99fb6d7000)
libz.so.1 => /lib64/libz.so.1 (0x00007f99fb4c0000)
librt.so.1 => /lib64/librt.so.1 (0x00007f99fb2b8000)
libgmpxx.so.4 => /usr/lib64/libgmpxx.so.4 (0x00007f99fb0b3000)
libboost_thread.so.1.49.0 => /usr/local/lib/libboost_thread.so.1.49.0
(0x00007f99fae98000)
libgmp.so.3 => /usr/lib64/libgmp.so.3 (0x00007f99fac41000)

Hi,
I'm new to the graph-tool library.
I was wondering if there is a function for computing the rich-club
coefficient, i.e. the coefficient as a function of the vertex degree (slightly
different from the assortativity coefficient).
To be more specific a function that does the same as this function from
networkx:
http://networkx.lanl.gov/reference/generated/networkx.algorithms.richclub.r…
I think it might be one of these functions:
http://graph-tool.skewed.de/static/doc/correlations.html , but I don't know
which one specifically.
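For what it's worth, I don't believe graph-tool ships a rich-club function,
but the unnormalized quantity is simple enough to compute directly from an
edge list: φ(k) = 2·E_k / (N_k·(N_k − 1)), where N_k counts nodes of degree
greater than k and E_k counts edges among them. A plain-Python sketch (note
networkx additionally offers normalization against a rewired graph, which
this skips):

```python
from collections import defaultdict

def rich_club(edges):
    # Degrees from an undirected edge list
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    phi = {}
    for k in range(max(deg.values())):
        # Nodes with degree strictly greater than k
        rich = {n for n, d in deg.items() if d > k}
        nk = len(rich)
        if nk < 2:
            continue  # coefficient undefined for fewer than 2 nodes
        # Edges with both endpoints in the rich set
        ek = sum(1 for u, v in edges if u in rich and v in rich)
        phi[k] = 2.0 * ek / (nk * (nk - 1))
    return phi

phi = rich_club([(0, 1), (1, 2), (0, 2), (2, 3)])  # triangle plus a pendant
print(phi)  # {0: 0.666..., 1: 1.0}
```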
Thank you in advance.
Pablo
I have a function that I'm running on subgraphs. It is effectively a
breadth-first outward search for all paths from the originating vertex. It
assigns probabilities to each of the endpoints, which are then used for
computing a type of index.
In terms of speed, I've wondered about three things:
1 - Whether it would be faster to adapt one of the existing graph-tool search
algorithms and find a way to assign probabilities from the results;
2 - Whether there are ways to implement this algorithm more speedily by
rejigging it;
3 - And whether implementing it in Cython may provide speedups.
Regarding #1, I've looked at using the built-in breadth-first search, but I
haven't yet figured out how to use the visitor object to create the list of
probabilities, though I will try to take a closer look at this sometime soon.
Regarding #3, I've tried a Cython implementation, but it doesn't appear to
provide any significant speedup. I suspect this is because the function uses
graph and vertex objects, and I'm assuming Cython can't handle these
efficiently.
Any pointers would be appreciated. Thanks.
Here is the function: (Currently a .pyx file for Cython)
from scipy.stats import entropy

def nodeOutProb(g, v):
    # Calculates probabilities of all possible routes from node within cutoff distance
    visitedNodes = set()
    claimedEdges = set()
    branches = {}
    probabilities = []
    # add origin
    cdef float o = 1.0
    branches[v] = o  # add starting node with starting probability
    visitedNodes.add(v)  # add starting node to visited set
    cdef float cV, newVal
    # start iterating through branches...adding and removing as you go
    while len(branches) > 0:
        # iterate over a snapshot, since branches is modified inside the loop
        for key, val in list(branches.items()):
            newVal = 0.0
            newEdges = []
            for e in key.all_edges():
                if e not in claimedEdges:
                    newVal += 1
                    claimedEdges.add(e)
                    newEdges.append(e)
            if newVal == 0:
                probabilities.append(1 / val)
            else:
                cV = newVal * val
                for n in key.all_neighbours():
                    if n not in visitedNodes:
                        visitedNodes.add(n)
                        branches[n] = cV
                    elif g.edge(key, n) in newEdges:
                        probabilities.append(1 / cV)
            del branches[key]
    return entropy(probabilities, base=2)
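On #2/#3: the per-iteration work here is all Python-object manipulation, so
one cheap experiment is a pure-Python rework that drops the Vertex/Edge
wrappers entirely and operates on an adjacency dict, with edges represented
as frozensets so both directions compare equal. The following is only a
sketch of the same propagation logic under those assumptions (node_out_probs
and its input format are made up for this example), not a drop-in
replacement:

```python
import math

def node_out_probs(adj, v):
    # adj: dict mapping node -> set of neighbour nodes (undirected graph)
    visited = {v}
    claimed = set()        # edges already used, stored as frozensets
    branches = {v: 1.0}    # frontier node -> cumulative branch count
    probabilities = []
    while branches:
        # snapshot, since branches is modified inside the loop
        for key, val in list(branches.items()):
            new_edges = [frozenset((key, n)) for n in adj[key]
                         if frozenset((key, n)) not in claimed]
            claimed.update(new_edges)
            if not new_edges:
                probabilities.append(1.0 / val)  # dead end: emit this branch
            else:
                c_v = len(new_edges) * val
                for n in adj[key]:
                    if n not in visited:
                        visited.add(n)
                        branches[n] = c_v
                    elif frozenset((key, n)) in new_edges:
                        probabilities.append(1.0 / c_v)  # cycle closure
            del branches[key]
    return probabilities

# Star graph: centre 0 with three leaves -> each leaf gets probability 1/3
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
probs = node_out_probs(star, 0)
ent = -sum(p * math.log2(p) for p in probs)
print(probs, ent)  # three probabilities of 1/3 each, entropy log2(3) ≈ 1.585
```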
Hello,
I am trying to compile the graph-tool source code I downloaded from github,
because I need to use the function clustering.motifs with the ordering of
motif vertex maps fixed.
According to INSTALL one needs to run ./configure. Since this file is not
shipped, I ran "autoreconf -fi" first, and I am getting the following
error:
configure.ac:192: error: possibly undefined macro: AC_PYTHON_DEVEL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
Please, how should I proceed?
Are there precompiled packages (with the ordering of motif vertex maps
fixed)?
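AC_PYTHON_DEVEL is not part of core autoconf; it comes from the Autoconf
Macro Archive, packaged on most distributions as autoconf-archive. Assuming
that is the missing piece, something along these lines should get configure
generated (package manager and package names vary by distro):

```shell
# Install the macro archive so aclocal can find AC_PYTHON_DEVEL
sudo yum install autoconf-archive    # or: sudo apt-get install autoconf-archive
# then regenerate the build system and configure as usual
autoreconf -fi
./configure
```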
Thank you for any help you can provide.
Kind regards,
Sabrina.
