23 OCT 2025 - We are back! If you have been following us over the last few years, you will know that the last two months have been rough: the website was practically not loading. Sorry for the mess. We are back, though, and everything should run smoothly now. New servers, updated domains, and new owners. We invite you all to start uploading torrents again!
Create text, HTML, RSS and XML sitemaps to help search engines like Google and Yahoo crawl and index your website. The crawler is feature-rich and supports many website crawling options:

- Configure the number of simultaneous connections to use.
- Supports crawler filters, robots.txt, custom connection and read timeout values, removal of session IDs, scanning of JavaScript and CSS files, proxy setup, website login, and various other options.
- Alias paths during a scan, e.g. for sites that use multiple domain names with the same content.
- Scan websites from multiple start paths, useful for websites that are not fully crosslinked.
- Scan local and online websites on the internet, localhost, LAN, CD-ROM and disks.
- Scan static and dynamic websites such as portals, online stores, blogs and forums.
- View reports on broken and redirected links (where to and where from).
- Rich template support for HTML sitemaps.
- Generate sitemap files for ASP.NET controls.
- Supports splitting and compressing XML sitemaps.
- Can set and calculate priority, change frequency and last-modified values in XML sitemaps.
- Change the root path used in generated text, HTML, RSS and XML sitemaps, useful if you have scanned a mirror or localhost copy of a website.
- Integrated FTP sitemap upload; can also ping and notify search engines of sitemap changes.
- Command line support includes: load project, scan website, build sitemap, FTP upload and ping search engines.
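The XML sitemaps the tool generates follow the sitemaps.org 0.9 protocol, including the optional priority, change frequency and last-modified fields, and can be gzip-compressed. As an illustration only (this is not the tool's own code), here is a minimal Python sketch that emits a compressed sitemap in that format; the function name and the dict-based input are assumptions for the example:

```python
import gzip
from xml.sax.saxutils import escape

def build_sitemap(urls, path="sitemap.xml.gz"):
    """Write a gzip-compressed XML sitemap (sitemaps.org 0.9 schema).

    `urls` is a list of dicts: `loc` is required; `lastmod`,
    `changefreq` and `priority` are the optional per-URL fields
    described in the feature list above. Returns the XML text.
    """
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for u in urls:
        parts.append("  <url>")
        # Escape the URL so &, < and > stay well-formed XML.
        parts.append("    <loc>%s</loc>" % escape(u["loc"]))
        for tag in ("lastmod", "changefreq", "priority"):
            if tag in u:
                parts.append("    <%s>%s</%s>" % (tag, u[tag], tag))
        parts.append("  </url>")
    parts.append("</urlset>")
    xml = "\n".join(parts)
    with gzip.open(path, "wt", encoding="utf-8") as f:
        f.write(xml)
    return xml

sitemap = build_sitemap([
    {"loc": "https://example.com/", "lastmod": "2025-10-23",
     "changefreq": "daily", "priority": "1.0"},
    {"loc": "https://example.com/about", "priority": "0.5"},
])
```

The protocol also caps a single sitemap file at 50,000 URLs, which is why a generator needs the splitting support mentioned above: larger sites are written as several files referenced from a sitemap index.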