grab-site is an easy preconfigured web crawler designed for backing up websites. Give grab-site a URL and it will recursively crawl the site and write WARC files. Internally, grab-site uses a fork of wpull for crawling.

grab-site gives you:

- a dashboard with all of your crawls, showing which URLs are being grabbed, how many URLs are left in the queue, and more.
- the ability to add ignore patterns while the crawl is already running.
- an extensively tested default ignore set (global), as well as additional (optional) ignore sets for forums, reddit, etc.
- duplicate page detection: links are not followed on pages whose content duplicates an already-seen page.
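A minimal usage sketch, assuming grab-site is already installed and on your `PATH` (the URL below is a placeholder; flag names are as documented by the project, but check `grab-site --help` on your version):

```shell
# Start the dashboard server (the crawler reports progress to it).
gs-server &

# Crawl a site recursively; WARC files are written to a new
# directory created in the current working directory.
grab-site 'https://example.com/'

# Optionally enable an additional ignore set on top of the
# default "global" set, e.g. for forum software.
grab-site --igsets=forums 'https://example.com/forum/'
```

While the crawl runs, ignore patterns can still be added through the dashboard without restarting the crawl.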