I'm trying to create a crawler that uses a nested loop. The outer loop iterates over the list of URLs I want it to go through, and the inner loop iterates over a list of information extracted from each page (the program defaults my action to a loop, and based on my experience with other crawlers, a loop action makes complete sense here).
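For anyone unfamiliar with the pattern, here is a rough sketch of what that nested-loop structure does, written as plain Python with in-memory stand-ins instead of live page fetches. The names (`urls`, `fetch_items`) are purely illustrative and not from any specific crawler tool:

```python
# Outer loop input: the list of URLs to visit.
urls = ["https://example.com/page1", "https://example.com/page2"]

def fetch_items(url):
    # Stand-in for downloading and parsing a page; a real crawler
    # would issue an HTTP request here. Each page yields several
    # items, which is what the inner loop iterates over.
    return [f"{url}#item{i}" for i in range(3)]

results = []
for url in urls:                    # outer loop: one iteration per URL
    for item in fetch_items(url):   # inner loop: items found on that page
        results.append({"source": url, "data": item})
```

The key point is that the inner extraction must run fresh for every URL the outer loop visits, which is exactly the part that seems to break after the crawler is saved and reopened.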
When I first created the crawler, everything worked fine. The data extracted in the builder preview was exactly what I needed, but if I save and exit, then reopen my crawler on another day, I have to redo the inner loop extraction. Sometimes even that doesn't work: no data is pulled by the inner loop at all, or it is pulled only for the first URL, or pulled for every URL except the first.
I cannot for the life of me figure out why this is happening. Everything looks to be set up properly, and it works fine on initial testing, but I can't ever leave it or it all breaks. Having to recreate this task every time I need to run it (which is multiple times per week) is not ideal.