#1 - by Mr3olba - 16 October, 2021 - 06:35 PM (last edited 02 July, 2024 by Mr3olba)
[Original post hidden: author banned]
#2 - by Atomical (last edited 16 October, 2021 - 10:20 PM)

I don't know a lot about workers, but I use multithreading for account gens.
This thread may help you.
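
If it helps, a rough sketch of that basic multithreading pattern; the gen_account function and the count of 10 are just hypothetical stand-ins for whatever each thread actually does:

Code:
import threading

def gen_account(index):
    # Placeholder for the real per-account work.
    print(f"generating account {index}")

threads = []
for i in range(10):
    t = threading.Thread(target=gen_account, args=(i,))
    t.start()
    threads.append(t)

# Block until every worker thread has finished.
for t in threads:
    t.join()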
#3 - by Mr3olba
Appending functions to a task list while using requests still ends up processing them one by one. I need to implement aiohttp with async and I have no idea how to do it xD
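
For context, the one-by-one behaviour described above looks like this with plain requests; each call blocks until the previous response arrives (the URLs are placeholders):

Code:
import requests

sites = ["https://youtube.com", "https://google.com", "https://yahoo.com"]

# Each get() blocks the whole program, so total time is the
# sum of every individual request.
for url in sites:
    res = requests.get(url)
    print(url, res.status_code)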
#4 - by SirDank (last edited 17 October, 2021 - 07:25 PM)
This isn't asyncio, but here:
 
Code:
from concurrent.futures import ThreadPoolExecutor, as_completed

def process_file(file):
    # Replace the body with whatever work each task should do.
    print(f"{file}")

file_list = ["a.txt", "b.txt", "c.txt"]  # whatever you're iterating over

futures = []
executor = ThreadPoolExecutor(max_workers=100)
for file in file_list:
    # Submit the loop variable itself, not a separate placeholder.
    futures.append(executor.submit(process_file, file))

for future in as_completed(futures):
    try:
        future.result()
    except Exception:
        pass  # ignore per-task failures; log them if you need to
futures.clear()
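
Side note: ThreadPoolExecutor also works as a context manager, which joins all the worker threads for you when the block exits. A small variation on the snippet above, reusing its process_file and file_list:

Code:
with ThreadPoolExecutor(max_workers=100) as executor:
    futures = [executor.submit(process_file, f) for f in file_list]
    for future in as_completed(futures):
        future.result()  # re-raises any exception from the worker thread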
#5 - by PayByte
Code:
import asyncio
import aiohttp
from time import time

async def scrape_site(url: str, session: aiohttp.ClientSession) -> None:
    # Reuse the shared session instead of opening a new one per request.
    async with session.get(url) as res:
        print(f"Request sent to site {url} (status {res.status})")

async def main() -> None:
    sites = ["https://youtube.com", "https://google.com", "https://yahoo.com"]

    start = time()
    # gather() schedules every coroutine at once, so the requests overlap.
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*[scrape_site(site, session) for site in sites])
    print(f"Finished in {time() - start:.2f}s")  # time() measures seconds, not ms

if __name__ == "__main__":
    asyncio.run(main())

Quick example I made, I hope this helps!
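
If you also want to cap how many requests run at once (the "workers" question from earlier in the thread), an asyncio.Semaphore is the usual tool. A rough sketch building on the example above; the limit of 10 is arbitrary:

Code:
import asyncio
import aiohttp

async def fetch(url, session, sem):
    # The semaphore allows at most N requests in flight at a time.
    async with sem:
        async with session.get(url) as res:
            print(url, res.status)

async def main():
    sites = ["https://youtube.com", "https://google.com", "https://yahoo.com"]
    sem = asyncio.Semaphore(10)  # arbitrary worker limit
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*[fetch(url, session, sem) for url in sites])

asyncio.run(main())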
