Does anemone have a memory leak issue when crawling large sites?

I've been experimenting with anemone to crawl a massive site, and in Activity Monitor the memory usage keeps growing for both the MongoDB process and my spider.rb process. I posted a question on Stack Overflow a little while ago.