What is this?

If you don't know what Torrent Paradise is, see the website.

This repository contains all the tools I use to build and run torrent-paradise.ml. The project's codename is nextgen (next-gen torrent search), so don't be surprised if that name comes up somewhere.

Can you help me?

Maybe. Open an issue, and be sure to show that you have already tried to solve the problem yourself.

Setup

Here's what the setup looks like right now:

  • VPS, Debian Stretch, 2 GB RAM
    • PostgreSQL 9.6. pg_hba.conf contains this:

      local   all             all                                      peer
      # IPv4 local connections:
      host    nextgen         nextgen          localhost               md5
      
    • IPFS v0.4.18

    • a user named nextgen on the server

  • my laptop w/ Linux
    • Go toolchain installed
    • node v10.15 & npm
    • Python 3 (required only for index-generator/fix-metajson.py)

The programs create the tables they need in the database on their own. The database name is "nextgen".

The first thing I did after getting the server up and running was to import the TPB dump. Download https://thepiratebay.org/static/dump/csv/torrent_dump_full.csv.gz into the import-tpb-dump directory and run go run there, as sketched below.

Usage

Generate the index

See update-index.sh. Before running it for the first time, you need to create the materialized view fresh; for instructions, see the first paragraph of snippets.sql.

Spider the DHT

Run go build in spider/ to compile, then scp the binary to the server. You can use the systemd service file in spider/spider.service to start the spider on server boot.

Scraping trackers for seed/leech data

Run go build in tracker-scraper/ to compile, then scp the binary to the server. Run it every time you want to fetch fresh seed/leech data for all torrents.

tracker-scraper saves the results into the trackerdata table in the database.

Contributing

Before working on something, open an issue to ask whether it would be welcome. I would love to keep the project simple (KISS).