I previously used scrapy to crawl proxies from a few free proxy sites. Sites such as Zhihu throttle crawlers by IP, and once your IP is blocked you need proxies to keep accessing them, so stockpiling some proxies in advance is good insurance. The catch is that of the proxies these free sites publish, maybe one or two out of ten actually work, so a small program to validate them is a must; this is also the basic idea behind a proxy pool. Verifying a proxy is simple: send an HTTP request through it and check whether the request succeeds. In Python, requests is the easiest tool for that, but validating many proxies one by one with synchronous requests is bound to be slow, so we need either multithreading (a thread-pool sketch follows below) or an asynchronous approach. Python supports asynchronous requests via coroutines, and the aiohttp framework packages this up nicely. The code below compares the efficiency of requests and aiohttp when validating 20 proxy addresses (request target: Baidu):
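As a reference point for the multithreading route just mentioned (the post itself only benchmarks the synchronous and async versions), here is a minimal thread-pool sketch; validate_proxy, the worker count, and the target URL are illustrative assumptions, not code from the original:

import requests
from concurrent.futures import ThreadPoolExecutor

# Hypothetical thread-pool variant: each worker issues a blocking requests
# call, so up to `workers` timeouts overlap instead of running back to back.
def validate_proxy(proxy):
    try:
        resp = requests.get('http://www.baidu.com',
                            proxies={'http': proxy}, timeout=5)
        return proxy, resp.status_code == 200
    except Exception:
        return proxy, False

def validate_all(proxies, workers=20):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [p for p, ok in pool.map(validate_proxy, proxies) if ok]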
requests
import requests
import time
from requests import ConnectTimeout
from lear_scrapy.util.redisclient import RedisClient

api = 'http://www.baidu.com'
# proxies are stored in redis; fetch them through RedisClient (implementation omitted here)
rc = RedisClient()
proxies = rc.get_proxies(3500, 3520)
# print(proxies)
# record the start time
start = time.time()
for proxy in proxies:
    try:
        print(proxy.decode())
        # set a 5-second timeout so a dead proxy doesn't block the request for too long
        resp = requests.get(api, proxies={'http': proxy.decode()}, timeout=5)
        print(proxy.decode(), ':', resp.status_code)
    except ConnectTimeout as e:
        print(e)
    except Exception as e:
        print(e)
end = time.time()
print('takes:', (end - start))
After 20 sequential requests to Baidu:
takes: 74.56189751625061
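As an aside before the aiohttp version: the RedisClient used in both scripts is omitted from the post. A minimal sketch of what it might look like, assuming the proxies sit in a Redis list (the 'proxies' key name and connection settings are assumptions):

import redis

# Hypothetical sketch of the omitted RedisClient; the key name and connection
# settings are guesses. lrange returns bytes, which is why the callers
# call decode() on each proxy.
class RedisClient:
    def __init__(self, host='localhost', port=6379):
        self.__db = redis.StrictRedis(host=host, port=port)

    def get_proxies(self, start, end):
        # a slice of the list, e.g. get_proxies(3500, 3520) above
        return self.__db.lrange('proxies', start, end)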
aiohttp
import aiohttp
import asyncio
import time
from lear_scrapy.util.redisclient import RedisClient


class ProxyValidate:

    def __init__(self, proxies, api):
        self.__proxies = proxies
        self.__api = api

    def validate_all(self):
        # schedule one validation coroutine per proxy and run them concurrently
        loop = asyncio.get_event_loop()
        useful_proxies = []
        tasks = [self.validate_single(proxy.decode(), useful_proxies) for proxy in self.__proxies]
        loop.run_until_complete(asyncio.wait(tasks))
        print(useful_proxies)

    async def validate_single(self, proxy, useful_proxies):
        try:
            async with aiohttp.ClientSession() as session:
                async with session.get(self.__api, timeout=5, proxy=proxy) as resp:
                    if resp.status == 200:
                        print(proxy, ':useful')
                        useful_proxies.append(proxy)
        except aiohttp.ClientProxyConnectionError as error:
            print(proxy, ':bad')
            print(error)
        except Exception as e:
            print(proxy, ':bad')
            print(e)


if __name__ == '__main__':
    rc = RedisClient()
    proxies = rc.get_proxies(3500, 3520)
    # proxies = [b'http://125.40.238.181:56834', b'http://117.127.0.201:8080', b'http://221.2.174.6:8060', b'http://121.41.120.245:80']
    start = time.time()
    validate = ProxyValidate(proxies, 'http://www.baidu.com')
    validate.validate_all()
    end = time.time()
    print('validate proxies takes:', (end - start))
The same 20 requests to Baidu:
validate proxies takes: 5.147261142730713
Amazingly, it took only about 5 seconds.
Conclusion: oh my god — for I/O-bound work like proxy validation, the asynchronous aiohttp version is roughly 14x faster in this test than the synchronous requests loop.
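A caveat if you run this on a newer interpreter: passing bare coroutines to asyncio.wait() was deprecated in Python 3.8 and removed in 3.11. A sketch of validate_all rewritten with asyncio.run and asyncio.gather, assuming the rest of ProxyValidate is unchanged:

# Drop-in replacement for ProxyValidate.validate_all on Python 3.8+
# (paste inside the class); asyncio.gather accepts coroutines directly
# and asyncio.run creates and closes the event loop for us.
def validate_all(self):
    useful_proxies = []

    async def run_all():
        tasks = [self.validate_single(proxy.decode(), useful_proxies)
                 for proxy in self.__proxies]
        await asyncio.gather(*tasks)

    asyncio.run(run_all())
    print(useful_proxies)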