Hi, everyone! I ran into a problem while learning how to use graph-tool. I read the paper "Network reconstruction and community detection from dynamics", and I am trying to reproduce its results. When I used the same settings for real networks with synthetic dynamics, the posterior similarity was only about 0.2. My question is how to control the average number of infection events per node, a, for the first model, and the number of microstates, M, for the second model (I show below how I currently measure these from the generated data). The whole process is as follows.
import graph_tool.all as gt
from matplotlib import cm
import numpy as np  # used below for mean/std of the collected betas
g = gt.collection.konect_data["openflights"]  # airport network with SIS dynamics
gt.remove_parallel_edges(g)
g = gt.extract_largest_component(g, prune=False)
# Simulation of an empirical dynamical model
# The algorithm accepts multiple independent time series for the
# reconstruction. We will generate 100 SIS cascades starting from a
# random node each time, with a uniform infection probability beta=0.2.
ss = []
for i in range(100):
    si_state = gt.SISState(g, beta=.2)
    s = [si_state.get_state().copy()]
    for j in range(10):
        si_state.iterate_sync()
        s.append(si_state.get_state().copy())
    # Each time series should be represented as a single vector-valued
    # vertex property map with the states for each node at each time.
    s = gt.group_vector_property(s)
    ss.append(s)
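To make my question about a concrete: as far as I understand the paper, a is the average number of infection events per node, which in the setup above I can only control indirectly, through the number of cascades (100) and their length (10 steps). This is the helper I use to measure it from the generated data (my own code, not from the paper; the function name is mine):

def infection_events_per_node(g, ss):
    # Count 0 -> 1 transitions between consecutive time steps of each
    # cascade, and average the accumulated total over all nodes.
    counts = np.zeros(g.num_vertices())
    for s in ss:
        m = np.array([s[v] for v in g.vertices()])  # shape (N, T+1)
        counts += (np.diff(m, axis=1) > 0).sum(axis=1)
    return counts.mean()

print("Measured infection events per node a:", infection_events_per_node(g, ss))

Is increasing the number or length of the cascades the intended way to raise a, or is there a more direct knob?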
# Prepare the initial state of the reconstruction as an empty graph
u = g.copy()
u.clear_edges()
ss = [u.own_property(s) for s in ss]  # time series properties need to be 'owned' by graph u
# Create the reconstruction state
rstate = gt.EpidemicsBlockState(u, s=ss, beta=None, r=1e-6, global_beta=.2,
                                state_args=dict(B=20), nested=False,
                                aE=g.num_edges())
# Now we collect the marginals for exactly 10,000 sweeps, at
# intervals of 10 sweeps:
gm = None
bm = None
betas = []
def collect_marginals(s):
    global gm, bm
    gm = s.collect_marginal(gm)
    b = gt.perfect_prop_hash([s.bstate.b])[0]
    bm = s.bstate.collect_vertex_marginals(bm, b=b)
    betas.append(s.params["global_beta"])
gt.mcmc_equilibrate(rstate, force_niter=1000, mcmc_args=dict(niter=10, xstep=0), callback=collect_marginals)
print("Posterior similarity: ", gt.similarity(g, gm, g.new_ep("double", 1), gm.ep.eprob)) print("Inferred infection probability: %g ± %g" % (mean(betas), std(betas)))
##########################################################
# A food web network with Ising dynamics
g = gt.GraphView(gt.collection.konect_data["maayan-foodweb"], directed=True)
gt.remove_parallel_edges(g)
# The algorithm accepts multiple independent time series for the
# reconstruction. We will generate 1000 independent Ising (Glauber)
# samples, each with a uniform inverse temperature beta=0.1.
ss = []
for i in range(1000):
    si_state = gt.IsingGlauberState(g, beta=.1)
    s = [si_state.get_state().copy()]
    si_state.iterate_async(niter=1000)
    s.append(si_state.get_state().copy())
    # Each time series should be represented as a single vector-valued
    # vertex property map with the states for each node at each time.
    s = gt.group_vector_property(s)
    ss.append(s)
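Similarly for M: if I read the paper correctly, M is simply the number of independent microstates handed to the reconstruction. In the loop above each run contributes two snapshots (the initial random state and the state after 1000 asynchronous sweeps), so by my count:

# 1000 runs x 2 snapshots per run (initial + equilibrated):
M = len(ss) * 2
print("Number of microstates M:", M)

If that reading is right, M could be increased either by generating more runs, or by appending additional well-separated snapshots per run (e.g. calling iterate_async(niter=1000) again between successive get_state().copy() calls). Is that the intended way to control it?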
u = g.copy()
u.clear_edges()
ss = [u.own_property(s) for s in ss]
# Create the reconstruction state. Note: the time series are owned by u,
# so u (the empty graph) is passed here, as in the SIS example above.
rstate = gt.PseudoIsingBlockState(u, s=ss, beta=0.1, state_args=dict(B=1),
                                  nested=False, aE=g.num_edges())
gm = None
bm = None
betas = []
def collect_marginals(s):
    global gm, bm
    gm = s.collect_marginal(gm)
    b = gt.perfect_prop_hash([s.bstate.b])[0]
    bm = s.bstate.collect_vertex_marginals(bm, b=b)
    betas.append(s.params["beta"])
gt.mcmc_equilibrate(rstate, force_niter=1000, mcmc_args=dict(niter=10, xstep=0), callback=collect_marginals)
print("Posterior similarity: ", gt.similarity(g, gm, g.new_ep("double", 1), gm.ep.eprob)) print("Inversed temperature: %g ± %g" % (mean(betas), std(betas)))
Moreover, I also wonder how to run a nested (hierarchical) version of the reconstruction for the same networks; my current guess is sketched below. Please let me know if you need more information on the question; otherwise, I hope to hear how this can be achieved using graph-tool.
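For reference, here is my guess at the nested variant for the SIS case. I am not sure the marginal collection is right: I assume that with nested=True, rstate.bstate becomes a NestedBlockState, so the node partition would live at the lowest level of the hierarchy:

rstate = gt.EpidemicsBlockState(u, s=ss, beta=None, r=1e-6, global_beta=.2,
                                nested=True, aE=g.num_edges())

def collect_marginals(s):
    global gm, bm
    gm = s.collect_marginal(gm)
    # Assuming the lowest level of the hierarchy holds the node partition:
    b = gt.perfect_prop_hash([s.bstate.levels[0].b])[0]
    bm = s.bstate.levels[0].collect_vertex_marginals(bm, b=b)
    betas.append(s.params["global_beta"])

Is this the correct way to set it up, or does the marginal collection need to be done differently for nested states?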
Thanks,
Gege Hou