Hi,
Thank you for providing this great library. The convenience of Python and the speed of native code really is a happy marriage.
I'm missing one thing, though. It would be great if you could bias the random jump vector for the PageRank algorithm. This modification is called "personalization". By providing non-uniform jump probabilities you can selectively downplay or raise the importance of certain nodes. In natural language processing, where I work, this trick has quite a few applications.
For reference, the parameterization is implemented in NetworkX [1]. Unfortunately, it becomes very slow once the graphs get large.
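For concreteness, a personalized call in NetworkX looks roughly like this (toy graph and jump weights invented purely for illustration; the personalization dict is normalized internally):

    import networkx as nx

    G = nx.DiGraph()
    G.add_edges_from([("a", "b"), ("b", "c"), ("c", "a"), ("a", "c")])

    # Non-uniform jump probabilities: the random surfer teleports to "a"
    # three times as often as to "b", and never to "c".
    pers = {"a": 0.75, "b": 0.25, "c": 0.0}

    ranks = nx.pagerank(G, alpha=0.85, personalization=pers)
    print(ranks)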
Thanks,
Anders Johannsen
[1] http://networkx.lanl.gov/reference/generated/networkx.algorithms.link_analys...
Hi Anders,
On 06/27/2011 03:34 PM, Anders Johannsen wrote:
Thank you for providing this great library. The convenience of Python and the speed of native code really is a happy marriage.
I'm missing one thing, though. It would be great if you could bias the random jump vector for the PageRank algorithm. This modification is called "personalization". By providing non-uniform jump probabilities you can selectively downplay or raise the importance of certain nodes. In natural language processing, where I work, this trick has quite a few applications.
For reference, the parameterization is implemented in NetworkX [1]. Unfortunately, it becomes very slow once the graphs get large.
This seems like a good idea, and it should be easy enough to implement. I'll add it when I find some time. Would you mind opening a ticket for it on the website, so that I don't forget?
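Just so we agree on what is involved, here is a rough, untested NumPy sketch of the iteration (dense and purely illustrative; the actual graph-tool interface and the C++ implementation will of course look different):

    import numpy as np

    def personalized_pagerank(A, d=0.85, pers=None, tol=1e-10, max_iter=1000):
        """A[i, j] != 0 means an edge i -> j; pers is the (unnormalized) jump vector."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        # Default to the usual uniform jump vector; otherwise normalize "pers".
        pers = np.ones(n) / n if pers is None else np.asarray(pers, dtype=float) / np.sum(pers)
        out = A.sum(axis=1)            # out-degrees (or out-strengths)
        dangling = out == 0
        # Row-stochastic transition matrix; dangling rows are left as zeros
        # and their rank mass is redistributed according to "pers" below.
        P = np.divide(A, out[:, None], out=np.zeros_like(A), where=out[:, None] > 0)
        r = np.ones(n) / n
        for _ in range(max_iter):
            new = d * (np.dot(r, P) + r[dangling].sum() * pers) + (1 - d) * pers
            delta = np.abs(new - r).sum()
            r = new
            if delta < tol:
                break
        return r

With pers=None this falls back to the uniform jump vector, so the default behaviour would stay exactly as it is now.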
Cheers, Tiago
On Mon, Jun 27, 2011 at 15:43, Tiago de Paula Peixoto tiago@skewed.de wrote:
This seems like a good idea, and it should be easy enough to implement. I'll add it when I find some time. Would you mind opening a ticket for it on the website, so that I don't forget?
That's great news! I just created the ticket.
Anders