It seems that pages or page elements sometimes fail to open/download when using a medium/slow proxy, regardless of a big timeout.



  • That's why I do it like this:
    ignore errors > load page > wait 3 sec > ignore errors > load SAME PAGE > wait 10 sec

    This way, when it attempts to load the page a second time, the thread will already have cached files such as images, CSS, JS, and other resources.

    I was wondering if there is a way to "load the cache" from offline storage, because my threads do the same job over and over, and sometimes they fail to fetch a file or time out waiting for an element that could already be in the cache files. I believe this is due to the slow/medium proxy and a big, unoptimized webpage.

    Or maybe this is a stupid idea that won't work? Or could it be a footprint and lead to a ban of all my accounts?
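    The double-load pattern above can be sketched roughly in Python (this is not BAS itself; `load_page` is a hypothetical stand-in for whatever action loads your page, and the waits mirror the 3/10 seconds from the recipe):

    ```python
    import time

    def load_twice(load_page, wait_between=3, wait_after=10, sleep=time.sleep):
        """Mimic the pattern: ignore errors > load page > wait >
        ignore errors > load SAME page > wait.

        The second attempt can benefit from files (images/CSS/JS)
        cached during the first attempt. `load_page` is any callable
        that loads the page and may raise on failure."""
        try:
            load_page()          # first attempt, errors ignored
        except Exception:
            pass
        sleep(wait_between)      # give the slow proxy time to finish transfers
        try:
            load_page()          # second attempt hits the warm cache
            result = True
        except Exception:
            result = False
        sleep(wait_after)
        return result
    ```

    The `sleep` parameter is injectable only so the logic can be exercised without real delays; in practice the defaults apply.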



  • Your question is interesting! I am having problems with backconnect proxies on 5, 10, and 15 minute rotation behind the proxy's gateway: many times the IP switches right while the page is loading, and the page load fails. I would like a way to time this IP change, or have it detected automatically by BAS, so that "if the IP change is happening, BAS does a STOP action and then continues". I will try to build something with the http-client blocks: every minute, before an action, check with a ping/GET of any page; if it succeeds, continue to the action; if it fails, go back to the proxy source. That way I "cheat the program" and make it repeat the process without raising an error or failing to load the page.
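    One way to detect the rotation, sketched in Python under an assumption: `get_ip` is a hypothetical callable that returns the current external IP (for example, a GET to an IP-echo page through the proxy) and raises while the gateway is mid-switch. The idea is to block until two consecutive checks agree, i.e. the backconnect gateway has finished rotating:

    ```python
    import time

    def wait_for_stable_ip(get_ip, checks=2, interval=1.0, sleep=time.sleep):
        """Poll the current exit IP and return it once `checks`
        consecutive polls agree, meaning the gateway is not rotating."""
        last = None
        stable = 0
        while True:
            try:
                ip = get_ip()
            except Exception:
                # Gateway unreachable: treat as rotation in progress.
                last, stable = None, 0
                sleep(interval)
                continue
            if ip == last:
                stable += 1
                if stable >= checks - 1:
                    return ip        # same IP seen `checks` times in a row
            else:
                stable = 0
            last = ip
            sleep(interval)
    ```

    Calling this before each page-load action would play the role of the "ping, get" check described above: the thread simply pauses instead of loading a page on a dying IP.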



  • @peellpeell

    If you know when the rotation happens, why not set a timer?

    Run 2 threads: one waiting 14 minutes (or whatever the rotation period is), the second doing your job.

    Tell the first one to set a variable to false or true when the 14 minutes have passed, and while that variable is false/true, don't load any page in the other thread.

    (PS: you might want to use global variables)
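    The two-thread idea can be sketched in Python, with a `threading.Event` standing in for the BAS global variable (`load_page` is again a hypothetical placeholder for your page-load action; the rotation period is shortened here only for illustration):

    ```python
    import threading
    import time

    # Shared flag playing the role of a global variable: set while page
    # loads are allowed, cleared when the known rotation time approaches.
    ip_stable = threading.Event()

    def timer(period_sec, sleep=time.sleep):
        """Thread 1: allow page loads, wait out the rotation period,
        then clear the flag so the worker stops loading pages."""
        ip_stable.set()
        sleep(period_sec)
        ip_stable.clear()

    def worker(load_page, jobs, poll=0.01):
        """Thread 2: perform `jobs` page loads, but only proceed
        while the flag is set; otherwise idle and re-check."""
        done = 0
        while done < jobs:
            if ip_stable.wait(timeout=poll):
                load_page()
                done += 1
        return done
    ```

    In BAS itself the equivalent would be a global variable that the worker thread checks before every "Load page" action, exactly as described above.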