BrowserAutomationStudio 22.3.0 has been released



  • Changes in the current version are related to parsing.

    Added the ability to generate a selector for several elements at once. This is done by choosing key elements of two types: those that must be present in the list, and those that must not be there. BAS will try to guess the selector based on this information. Each time the list of key elements changes, BAS updates the selector. Thus, you can tweak the result until it matches what you need, gradually increasing the number of key points if required.

    Here is how it looks:

    https://i.imgur.com/tyiZkq0.gif

    Short video with subtitles: https://www.youtube.com/watch?v=5XxXsgPj75U
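    The guessing idea above can be sketched as a toy heuristic: pick a CSS class that every wanted key element has and no unwanted key element has. This is not BAS's actual algorithm (which is not public); the function and the sample class sets below are illustrative assumptions only.

```python
# Toy sketch of "guess a selector from key elements" -- NOT BAS's real
# algorithm, just the idea: find a CSS class shared by every element
# that must be in the list and absent from every element that must not.

def guess_class_selector(wanted, unwanted):
    """wanted / unwanted: lists of per-element sets of CSS classes."""
    common = set.intersection(*wanted)                       # on every wanted element
    forbidden = set.union(*unwanted) if unwanted else set()  # on any unwanted element
    candidates = sorted(common - forbidden)
    return "." + candidates[0] if candidates else None

# Hypothetical class sets for illustration:
wanted = [{"product", "row"}, {"product", "alt"}]
unwanted = [{"row", "banner"}]
print(guess_class_selector(wanted, unwanted))  # -> .product
```

    Adding more key points shrinks `common` or grows `forbidden`, which is what "gradually increasing the number of key points" does to refine the guess.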

    Actions that are performed for each element in a loop can now be created through a new dialog. It appears automatically when the insertion point for an action is inside a loop. The new method is more intuitive than the separate submenu that was used previously.

    https://i.imgur.com/nzJxqyh.png

    Clicking on the element counter moves the focus to the next element, so by clicking this link repeatedly you can traverse all found elements.

    https://i.imgur.com/gHuoufP.png

    A new mechanism has been added for selecting elements that are near or overlap each other. Using the up/down arrow keys, you can cycle through every element under the cursor; the Enter key selects an element.

    https://i.imgur.com/bztGlRf.gif

    Information about the tag, classes and identifier of the selected element is also displayed in the new version. You can see how it looks in the previous video.

    Added a smart action for parsing links: "Get link URL".

    Several minor improvements:

    • Fixed issues when running an element loop inside another element loop.
    • Fixed issues when running a function recursively inside a condition.
    • Fixed the script hanging when the thread number was set to 0.
    • New startup page.
    • The HTTP client may now set several headers with the same name.
    • Disabled autofocus when running the browser in manual mode.
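    On the "several headers with the same name" item: at the HTTP level this simply means emitting the same header name more than once in a request. A minimal illustration with Python's standard library (this is not BAS code; the local server and the `X-Token` header name are made up for the demo):

```python
import http.client
import http.server
import threading

seen = {}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # get_all() returns every value of a repeated header, in order
        seen["x-token"] = self.headers.get_all("X-Token")
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client lets us send the same header name twice, which is what
# "set several same headers" means on the wire
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.putrequest("GET", "/")
conn.putheader("X-Token", "first")
conn.putheader("X-Token", "second")
conn.endheaders()
conn.getresponse().read()
conn.close()
server.shutdown()

print(seen["x-token"])  # -> ['first', 'second']
```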


  • @support said in BrowserAutomationStudio 22.3.0 has been released:

    Disabled autofocus when running the browser in manual mode.

    Thanks



  • @support said in BrowserAutomationStudio 22.3.0 has been released:

    Disabled autofocus when running the browser in manual mode.

    No fix for the HTTP proxy issue again, and my customers and I were waiting for that fix. So disappointed.



  • @gudolik said in BrowserAutomationStudio 22.3.0 has been released:

    @support said in BrowserAutomationStudio 22.3.0 has been released:

    Disabled autofocus when running the browser in manual mode.

    No fix for the HTTP proxy issue again, and my customers and I were waiting for that fix. So disappointed.

    Everyone wants their own request implemented.



  • @gudolik what's the issue?



  • @support HTTP proxies give "Recv failure: Connection was reset", and some SOCKS5 proxies give
    "SSL read: error:00000000:lib(0):func(0):reason(0), errno 10054" on websites with SSL, in the GET and POST commands in BAS.
    It's a cURL-related issue, not a proxy issue. A normal HTTP request in C# works with the same proxy and the same headers; it works because it is not cURL. I also tried the Bash version of cURL with the same proxy that was not working in BAS, and there it works. If it works in a normal POST request and in a normal cURL request with the same proxy, then it is a BAS issue. I should note that I tried 5 paid proxy providers and the issue exists with all of them.

    I found an explanation of why it's happening here: https://curl.haxx.se/docs/sslcerts.html
    I tested without a proxy and got the same error even then:
    https://gyazo.com/8ffcf7cbbe26197fc122745162b28904
    and this is what happens when these options are applied to cURL:
    CURLOPT_SSL_VERIFYHOST, 0
    CURLOPT_SSL_VERIFYPEER, 0

    or -k on the command line, as their website says (https://curl.haxx.se/docs/sslcerts.html):
    https://gyazo.com/9799aa19de384c060e94941314fa824b

    I sent everything via Skype; I don't know whether you or Fox manages the Skype account.



  • @gudolik This issue is reproducible only on the website that you are working on, and only with some types of proxies.

    Please use MassTunneler, or the project which was sent to you via Skype. The project uses a Dante socks server with default settings.

    I can't fix the issue because I can't see any meaningful description or a way to reproduce it.



  • @gudolik If I'm not right, please attach a curl command line which works correctly.



  • @gudolik Or any HTTP client that works correctly with that website and those proxies.



  • @support said in BrowserAutomationStudio 22.3.0 has been released:

    @gudolik This issue is reproducible only on website that you working on and only with some types of proxies.

    Please use MassTunneler, or project, which was sent to you by skype. Project uses Dante socks server with default settings.

    I can't fix see an issue because I can't see any meaningful description or a way to reproduce it.

    I can send you a huge list of websites that are not working. Again, did you not check what I wrote? It's an existing issue in cURL, and it happens even without a proxy; cURL has historically been flaky with SSL certificates. Dante socks works sometimes but mostly returns "SSL read: error:00000000:lib(0):func(0):reason(0), errno 10054".

    @support said in BrowserAutomationStudio 22.3.0 has been released:

    @gudolik If I'm not right, please attach curl command line, which works correct.

    The cookies are only valid for a short time. And as the documentation says, you should add --insecure or -k to bypass this TLS/SSL error on the request; I gave two example screenshots to show you.

    But here you can try

    curl "https://www.abercrombie.com/api/ecomm/a-uk/session?rememberMe=true" -H "authority: www.abercrombie.com" -H "accept: application/json, text/javascript, */*; q=0.01" -H "origin: https://www.abercrombie.com" -H "x-requested-with: XMLHttpRequest" -H "user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.79 Safari/537.36" -H "content-type: application/json" -H "sec-fetch-site: same-origin" -H "sec-fetch-mode: cors" -H "referer: https://www.abercrombie.com/shop/AccountRouterView?storeId=19658^&isUserLogin=true^&catalogId=11556^&langId=-1" -H "accept-encoding: gzip, deflate, br" -H "accept-language: bg,en-US;q=0.9,en;q=0.8,tr;q=0.7" -H "cookie: uPref=^%^7B^%^22brnd^%^22^%^3A^%^22anf^%^22^%^7D; languageFix=fr; af-minicart-19159=undefined; _abck=5B5D0628A29EA57DF00C024FC86C2994~0~YAAQlUISAknpKttuAQAAeGbG5wPQXNvcbqSs0bk0N0vxdo++Lj4Jsj7DJEdkuGs1cD2UIImCYduhkyqrkcS3vYeKe3nodzCxbcz7RI0v7Oiv80pjpsKvNS4677B98my2ZcVlnQY2v+GZ7bWkbtqT2P/ZaGLOoZX8sk26vgbqxFavjpr56Ul2EWQB0HQyeEHVlY+l1TrWvP7mAcT/OliRMwPezq22IE7QDvfynmbBvBFoa5eaVvFXbaFCeWpvEQVVOVWOxoFrNrurYbSHkbQL6GLLnWPfS3sdN15iCS1ZT+JTNzGr6iFvmlmwBLlxW4von/sNRKT9IZVF5osR~-1~-1~-1; userType=G; geoLocation=GB:EN:; af-minicart-19658=undefined; bm_sz=272DD01613470A0763EDBD7D9D68766A~YAAQRY9lX3xBlw9vAQAAS2DhHQbxKFM24LDYbvsgjeJYWAke56HvufxLyCYk62yT9rEwvmpxnJzS1kEBetVeKoabSLug9svhcfL9UI5H/HIdUtiz1VfIb3bg1nbnner+Aq6ZNjruKbZU/J3AOPzDEGI0GvMEfr3tuVgLxYvGtSvayKofspLTGjDP58gds7nNstl5Vuo=; dtCookie==3=srv=3=sn=32180D71BE32994E0486C8CD92C61D4D=perc=100000=ol=0=mul=1; ANF_AUTHENTICATION=1576754307426; ANFSession=196581576754307431; xSeg=c=BL-0; cf659858fe44f7b954e9749467f78c3b=d5d5f8e557026efa65ef5af46a57fbd4; 
ak_bmsc=304891A0859A25AF1FC27254CC91F5F15F658F45F2430000835CFB5DAB02F156~plnhWn9aLE+Zz3g1GhlKTSRoTC3im82msrbGk0+X074IgryNEnOct5pMhTJ7lXI3AlAevr1AcuwsFZ7s6G+bs5x+1HwpiCXiu63w/2kcywTexkL0lir39fg3Ci31XAUhM2FZvCM5WbmCmR8foHYpr4nsIoXukHJWep+o3xuZwYi6VvvmfegslHkzUUt2JFwuveEgK9uY63pu4k+g3xJZkA2A143q/aNwlUI36+zkPId44=; JSESSIONID=0000dwxNlAKdADzvwOp_j_zIfoh:-1; WC_SESSION_ESTABLISHED=true; WC_PERSISTENT=tbowUVANpp^%^2Bs08sO^%^2BEBWNUS^%^2B3WdBAmnuC4IFmPavx8s^%^3D^%^3B2019-12-19+06^%^3A18^%^3A30.805_1534516807021-281112_19159_892236114^%^2C-2^%^2CEUR^%^2C2019-12-14+22^%^3A11^%^3A36.951_19658_-1002^%^2C-1^%^2CGBP^%^2C2019-12-19+06^%^3A18^%^3A30.805_19658; WC_AUTHENTICATION_-1002=-1002^%^2CJdh5JVzqovGDGuZIS9A0yAc11LCiDPi1CWNOoTX^%^2B9xo^%^3D; WC_ACTIVEPOINTER=-1^%^2C19658; WC_USERACTIVITY_-1002=-1002^%^2C19658^%^2C0^%^2Cnull^%^2Cnull^%^2Cnull^%^2Cnull^%^2Cnull^%^2Cnull^%^2Cnull^%^2C412054666^%^2Cver_null^%^2CvoqoVPEf5GCGlnbS3B1YlIxidyfmkF6r2qpemC4LuQ137Zf9Bz2wHMB6^%^2BewsXfWQVG5uWvfw3Uji5HrwHZ^%^2FFHXVKGs3lFLM^%^2B3HEXavPEOAwkpUR9DVFq8w3Wrw81rLknPWzRdpp7yVH8sMWlD76UOGutGb33^%^2FrhiAUhUH4XFJB3lmXI263RMoFSsAigAftWXN0^%^2BganppH0tCi^%^2F5o88i2s45A9Wa1tl5ZxRRyU6nsZAs^%^3D; WC_ACTIVITYDATA_-1002=G^%^2C-1^%^2CGBP^%^2C11557^%^2C19470^%^2C-2000^%^2C1534516807021-281112^%^2C-1^%^2CGBP^%^2CEB3wyMIcfZaE1I2xmi2ycaYpzsiKLO3SnN61mOr^%^2BX5ejud6TJYRFxCY8WvNNh09DCEYr^%^2BjKe1J9PdQR0S6mgREb8WBlBACajb4goa0xmodv0OyY5TiE^%^2BLnvwjOvRkgDmgOLAOlnVEczqNpUJJzL6JMoSAGvUyIZQmfivbzhnojx3y9ZSofFzu6BTHWU4kR10KWCkF9AyqI2x3wQx^%^2BkPua1UqwW1UsOx9bM4XrPLLFL0^%^3D; bm_mi=ADD5C6D97E500A613468BE44CC2B7AE7~EREkhm6/w1HtxTVPucuVsv4y13UKhLaud5O8aAu4Lzh2RyB0pRwcyvre2bjh0Y9EmR2bxhaJNV5UxYtrt0FWdvgPk3XlKKyC8D6Ai9vlcoIbg2Oiz8Tumdxe+4QSi32E7KFb9T3N0Obc6LM2j4vACSM9khl3W3TeYBMDoqldWC+jHZ/Zw1fNNdSe9911hL+Jn2vrhy/lvjHcd9DhCxffyAW4afTc5hBZ7Gvi50jsqv6NPqNCWLelt27fKgvDIokL; 
bm_sv=D4A614967CDC1059E4F4B509EBA5E170~+pYGwgwOPVIogHp5mZsgYniVgM1hzuCK4575W8SyEtwMcGzPLrWhBwM50f+EiU53XgFZN3EIZQ/nO60PTMC0+2Eut/Lbkym0a6QnsryLGybvjxw0w639IDY3I4NaJxvZg+b97QEDwGCVWre8BxYlUCcUVLHRtK+nJ3aj8rqc79E=" --data-binary "^{^\^"logonId^\^":^\^"daeba^@abv.bg^\^",^\^"password^\^":^\^"dsadsa^\^"^}" --compressed
    

    If the cookies do not expire before you get your hands on it, it would give the same error even without a proxy:
    https://gyazo.com/8ffcf7cbbe26197fc122745162b28904
    and if you add -k or --insecure as the documentation says, it should work:
    https://gyazo.com/9799aa19de384c060e94941314fa824b

    libcurl performs peer SSL certificate verification by default. This is done by using a CA certificate store that the SSL library can use to make sure the peer's server certificate is valid.
    
    If you communicate with HTTPS, FTPS or other TLS-using servers using certificates that are signed by CAs present in the store, you can be sure that the remote server really is the one it claims to be.
    
    If the remote server uses a self-signed certificate, if you don't install a CA cert store, if the server uses a certificate signed by a CA that isn't included in the store you use or if the remote host is an impostor impersonating your favorite site, and you want to transfer files from this server, do one of the following:
    
    Tell libcurl to not verify the peer. With libcurl you disable this with curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, FALSE);
    
    With the curl command line tool, you disable this with -k/--insecure.
    

    adding

    CURLOPT_SSL_VERIFYHOST, 0
    CURLOPT_SSL_VERIFYPEER, 0
    

    is a few seconds' job; it won't change anything for others or break anything, it will just fix the HTTP proxy issue.
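    For context, the two libcurl options being discussed, their command-line equivalent, and a rough Python-stdlib analogue of the same setting (illustrative only; BAS's HTTP client is libcurl-based, and disabling verification removes protection against impostor servers, as the quoted docs warn):

```python
import ssl

# libcurl:   curl_easy_setopt(h, CURLOPT_SSL_VERIFYPEER, 0L)
#            curl_easy_setopt(h, CURLOPT_SSL_VERIFYHOST, 0L)
# curl CLI:  -k / --insecure
# Rough Python analogue of the same "skip certificate checks" behaviour
# (check_hostname must be disabled before verify_mode is set to CERT_NONE):
ctx = ssl.create_default_context()
ctx.check_hostname = False       # ~ CURLOPT_SSL_VERIFYHOST, 0
ctx.verify_mode = ssl.CERT_NONE  # ~ CURLOPT_SSL_VERIFYPEER, 0

# Such a context could then be passed to an HTTPS client, e.g.
#   http.client.HTTPSConnection(host, context=ctx)
print(ctx.verify_mode == ssl.CERT_NONE)  # -> True
```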

    @support said in BrowserAutomationStudio 22.3.0 has been released:

    @gudolik Or any httpclient that works correctly with that website and that proxies.

    Additional info: for testing I used this C# library, https://github.com/csharp-leaf/Leaf.xNet, and it works well with these proxies and this website, but it works because it is not cURL :)



  • @gudolik

    You told me that you have an issue on a certain website. The issue was "No response for a certain amount of time from the HTTP client when working with a proxy". Correct?

    I told you that the issue is with that particular server, which is very sensitive to headers and just holds the connection if it doesn't like something. I proved this by providing tests with and without headers.

    I've also sent you a project with fixes (those fixes make the HTTP client behave more like a browser). The project works with Dante in its default configuration.
    I've also shown you a video with MassTunneler, which also works well.

    Now the questions.

    Are we talking about the same issue?

    If we are talking about the same issue, I don't understand why, because it is resolved; I've sent you the solution.
    If we are talking about a different issue, please define it properly and explain what is working incorrectly.


  • @support said in BrowserAutomationStudio 22.3.0 has been released:

    You told me that you have an issue on certain website. The issue was "No response for a certain amount of time from http client if you are working with proxy". Correct?

    Okay, let me explain from the beginning. I have about 10 websites/projects to complete in BAS, and all of them require HTTP requests. In my first test with a proxy I was getting "Recv failure: Connection was reset", but without a proxy it seemed to work, so I contacted you on Skype to report this, and you came back and said the issue was caused by a missing header. But actually it is the same with or without the header: sometimes cURL passes and sometimes it does not, with or without a proxy. Here is an example of cURL with my proxy and no header where the request passes: https://gyazo.com/87c018c05c03d0bdb2a59a8971926e2e

    Then I reported back that it was still happening, and explained that the test project I gave you used a public proxy, which is why you thought it was the header. Then you said it works with SOCKS5 proxies; in my tests I also saw that the success rate is higher with SOCKS5, but I still sometimes get the error even with your Dante socks server, namely "SSL read: error:00000000:lib(0):func(0):reason(0), errno 10054", which again points to the cURL SSL error. To verify the issue I put the request in a loop (https://gyazo.com/a8284bc049dfbfcbfefb430d73bbc4b1), and indeed the same proxy first gave "Recv failure: Connection was reset", then on the 5th request it passed and was fine for the next 6 requests, then the cookies died and it gave a 403 Access Denied. This is how I work around it now, but it really slows down the bot, because sometimes a request will not pass even on the 15th attempt in the loop.

    Finally, I tried pure cURL in a Bash command, because you said on Skype that you had tried pure cURL too and it works. After trying pure cURL I saw the issue still exists, as in the screenshots above, even without a proxy. But adding -k or --insecure fixes it, and that's what I'm asking you to do: add these options to the HTTP cURL requests to fix the issue. Looking at the cURL command you sent on Skype, it includes --insecure, and that's why it worked for you too :) I hope this is a clearer explanation.



  • @gudolik

    Ok, here we go again. Your server is sensitive to the input information. That is not the rule but an exception on the web, because when I tested it without a proxy and without the required headers, it just held the session indefinitely. That is not how servers usually work; they usually close the session with some error code. There are several factors involved, like headers, but not only headers. I don't own the server source code, so I can't say for sure about everything.

    This thing is also important.

    The browser makes a connection, sends a request to ****, then makes a couple more requests, maybe opens more parallel connections, and then keeps them alive for a long time before maybe closing them. The browser also saves the cookies from the first request and then sends them to ****/uk/account/loginmobile. The HTTP client makes only one connection and closes it immediately.
    

    Your initial case was "No response for a certain amount of time from http client if you are working with proxy".

    It is resolved, because I've sent you a project which resolves the issue and always works. I've even provided a proxy to test with.

    I've tried to understand at least what problem we are trying to solve, but there is no answer to the direct question. You told me the history of what you have tried and what results you got. That is not an issue report; it is your experience.

    If you get a certain error, it might be because of the proxy, BAS, cURL, or anything else. An issue report must contain the steps to reproduce it and a strict description. I can't see those here.

    The only issue that I saw in your last message is:

    @gudolik said in BrowserAutomationStudio 22.3.0 has been released:

    But adding -k or --insecure fixes it, and that's what I'm asking you to do: add these options to the HTTP cURL requests to fix the issue.

    Which is obviously not correct, because:

    1. --insecure is included by default. https://github.com/bablosoft/BAS/search?q=CURLOPT_SSL_VERIFYPEER&unscoped_q=CURLOPT_SSL_VERIFYPEER
    2. If the issue were with --insecure, it could easily be fixed by adding it to the curl command line, but it cannot. See the failure examples that I sent to you.
    3. A missing --insecure command-line parameter would give a different error, something like "SSL peer certificate or SSH remote key was not OK".
    4. The issue reproduces with the http:// scheme, which is obviously not affected by this parameter.

    If you want to report an issue, please use the following guide:
    https://community.bablosoft.com/topic/2707/how-to-correctly-report-about-error
    Further messages about this case which do not follow the guide will be ignored.

