Log File Analysis
Steven Koolen created numerous scripts to parse the results of crawling the Tribler network.
Everything is stored on
SuperStorage3:/home/steven/permid_mrt-nov
This includes the data files as well as the .gnuplot and .py scripts.
1. Get all raw crawl data files.
2. Run tribler_log.py; this creates .ips files.
3. Run tribler_log_duplicate.py, which filters out duplicate entries. Optionally, also run tribler_log_filter_multipleinstance.py to filter out users who switch back and forth between multiple computers (see the sketch after this list).
4. Run any of the tribler_log_*.py files to create graphs from the data.
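The filtering logic lives in the scripts themselves; purely as an illustration, here is a minimal Python sketch of the step 3 deduplication idea. It assumes a hypothetical line format for the .ips files of "timestamp ip permid" (whitespace-separated); check the actual scripts for the real format.

    import sys

    def filter_duplicates(in_path, out_path):
        # Keep only the first occurrence of each (ip, permid) pair.
        # Hypothetical line format: "timestamp ip permid".
        seen = set()
        with open(in_path) as src, open(out_path, 'w') as dst:
            for line in src:
                fields = line.split()
                if len(fields) < 3:
                    continue  # skip malformed lines
                key = (fields[1], fields[2])
                if key not in seen:
                    seen.add(key)
                    dst.write(line)

    if __name__ == '__main__':
        filter_duplicates(sys.argv[1], sys.argv[2])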
As an example of a step 4 graphing script, tribler_log_permids_day.py shows the number of users per day.
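Again purely as a sketch (not the actual script), counting unique PermIDs per day and writing a plain "date count" data file that gnuplot can plot could look like this, under the same hypothetical input format with the timestamp given in seconds since the epoch:

    import sys
    import time
    from collections import defaultdict

    def permids_per_day(in_path, out_path):
        # Count unique PermIDs per calendar day (UTC) and write
        # "YYYY-MM-DD count" lines, ready for plotting with gnuplot.
        per_day = defaultdict(set)
        with open(in_path) as src:
            for line in src:
                fields = line.split()
                if len(fields) < 3:
                    continue  # skip malformed lines
                day = time.strftime('%Y-%m-%d', time.gmtime(float(fields[0])))
                per_day[day].add(fields[2])  # permid
        with open(out_path, 'w') as dst:
            for day in sorted(per_day):
                dst.write('%s %d\n' % (day, len(per_day[day])))

    if __name__ == '__main__':
        permids_per_day(sys.argv[1], sys.argv[2])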
tribler_log_runall.sh runs all of the step 4 scripts and creates .eps graph files.