requests.exceptions.ConnectionError: HTTPConnectionPool(host='jy-qj.com.cn', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x00000000032ED358>: Failed to establish a new connection: [Errno 11004] getaddrinfo failed',))
Today, while scraping roughly 300,000 second-hand listing URLs from 58.com's used-goods market, the script suddenly started throwing this error in the afternoon. A quick Google search suggested adding a timeout to the requests call; I tried that, but the error persisted.
I went into debug mode; the code is as follows:
from bs4 import BeautifulSoup
import requests

# Grab the category links from the listing page of 58.com's second-hand market
start_url = 'http://cd.58.com/sale.shtml'
url_host = 'http://cd.58.com'

def get_index_url(url):
    try:
        wb_data = requests.get(url)
        print(wb_data)
        soup = BeautifulSoup(wb_data.text, 'lxml')
        links = soup.select('#ymenu-side > ul > li > span.dlb > a')
        for link in links:
            page_url = url_host + link.get('href', '')
            print(page_url)
    except requests.exceptions.ConnectionError:
        print(None)  # swallow the connection error and just print None

get_index_url(start_url)
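The timeout that the search results suggested only bounds a single attempt; wrapping the call in a small retry loop makes transient failures survivable. A hedged sketch, not part of the original script; `fetch_with_retry` and its parameters are names I'm introducing here:

```python
import time

def fetch_with_retry(fetch, url, retries=3, delay=1.0):
    """Call fetch(url) up to `retries` times with a linear back-off.

    `fetch` is any callable that raises on failure, for example
    lambda u: requests.get(u, timeout=10).  Returns the first
    successful result, or re-raises the last exception.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return fetch(url)
        except Exception as exc:  # with requests, catch RequestException instead
            last_exc = exc
            time.sleep(delay * (attempt + 1))  # wait a bit longer each round
    raise last_exc
```

Inside get_index_url, the bare requests.get(url) would then become something like fetch_with_retry(lambda u: requests.get(u, timeout=10), url).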
channel_list='''
http://cd.58.com/shouji/
http://cd.58.com/tongxunyw/
http://cd.58.com/danche/
http://cd.58.com/diandongche/
http://cd.58.com/diannao/
http://cd.58.com/shuma/
http://cd.58.com/jiadian/
http://cd.58.com/ershoujiaju/
http://cd.58.com/yingyou/
http://cd.58.com/fushi/
http://cd.58.com/meirong/
http://cd.58.com/yishu/
http://cd.58.com/tushu/
http://cd.58.com/wenti/
http://cd.58.com/bangong/
http://cd.58.com/shebei.shtml
http://cd.58.com/chengren/
'''
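channel_list above is stored as one triple-quoted string, so to loop over the channels it first has to be split into a real list. A small sketch with a shortened copy of the string (the variable name `channels` is mine):

```python
channel_list = '''
http://cd.58.com/shouji/
http://cd.58.com/tongxunyw/
http://cd.58.com/danche/
'''

# str.split() with no arguments splits on any whitespace and drops
# the empty leading/trailing pieces produced by the newlines.
channels = channel_list.split()
print(channels[0])    # http://cd.58.com/shouji/
print(len(channels))  # 3
```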
None of the domains were resolving; because of the try/except, the script just printed None.
It felt a bit like a port was being occupied, so I opened a fresh Python file, ceshi.py:
from bs4 import BeautifulSoup
import requests

url = 'http://www.baidu.com'
a = requests.get(url)
print(a)
The output was still requests.exceptions.ConnectionError. At that point I opened a terminal to inspect port 80 from the command line with netstat -an | grep :80, which showed:
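The "[Errno 11004] getaddrinfo failed" buried in the original traceback points at DNS resolution, not the HTTP layer, so the two failure modes can be separated with the standard library before blaming ports. A hedged sketch; `dns_resolves` is a name I'm introducing:

```python
import socket

def dns_resolves(host):
    """Return True if `host` resolves to an address.

    A failure here raises socket.gaierror, which is exactly the
    'getaddrinfo failed' wrapped inside the ConnectionError traceback.
    """
    try:
        socket.getaddrinfo(host, 80)
        return True
    except socket.gaierror:
        return False
```

If dns_resolves('www.baidu.com') returns False, the problem is DNS or the network itself, and no amount of timeout tuning in requests will help.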
root@kk-HP-450-Notebook-PC:/home/kk# netstat -an |grep :80
tcp 0 0 0.0.0.0:80 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:8099 0.0.0.0:* LISTEN
tcp 0 0 192.168.0.116:46340 203.208.39.218:80 ESTABLISHED
tcp 0 0 192.168.0.116:55916 203.208.51.80:80 ESTABLISHED
tcp 0 0 192.168.0.116:60568 203.208.39.242:80 ESTABLISHED
tcp 1 1 192.168.0.116:49790 117.174.144.48:80 LAST_ACK
tcp 0 0 192.168.0.116:56002 203.208.51.89:80 ESTABLISHED
tcp 1 1 192.168.0.116:36678 183.232.231.117:80 LAST_ACK
tcp 0 0 192.168.0.116:51214 42.159.236.181:80 TIME_WAIT
tcp 0 0 192.168.0.116:36000 203.208.39.252:80 ESTABLISHED
tcp 0 0 192.168.0.116:60844 203.208.51.82:80 ESTABLISHED
tcp 0 0 192.168.0.116:56086 203.208.51.89:80 ESTABLISHED
tcp 1 1 192.168.0.116:53838 111.13.101.191:80 LAST_ACK
tcp 0 0 192.168.0.116:56090 203.208.51.89:80 ESTABLISHED
tcp 0 0 192.168.0.116:60846 203.208.51.82:80 ESTABLISHED
tcp 0 0 192.168.0.116:46276 203.208.39.218:80 ESTABLISHED
tcp 0 0 192.168.0.116:54686 203.208.51.58:80 ESTABLISHED
tcp 0 0 192.168.0.116:39776 203.208.43.109:80 ESTABLISHED
tcp 0 0 192.168.0.116:46334 203.208.39.218:80 ESTABLISHED
tcp 1 1 192.168.0.116:49796 117.174.144.48:80 LAST_ACK
tcp 1 1 192.168.0.116:53258 180.97.104.146:80 LAST_ACK
tcp 0 0 192.168.0.116:38318 203.208.39.225:80 ESTABLISHED
tcp 0 0 192.168.0.116:46232 203.208.43.91:80 ESTABLISHED
tcp 0 0 192.168.0.116:56088 203.208.51.89:80 ESTABLISHED
tcp 0 0 192.168.0.116:46302 203.208.39.218:80 ESTABLISHED
tcp 0 0 192.168.0.116:39772 203.208.43.109:80 ESTABLISHED
tcp 0 0 192.168.0.116:43920 203.208.51.45:80 ESTABLISHED
tcp 0 0 192.168.0.116:43922 203.208.51.45:80 ESTABLISHED
tcp 0 1 192.168.0.116:44452 111.202.114.35:80 FIN_WAIT1
tcp 0 0 192.168.0.116:58922 203.208.51.81:80 ESTABLISHED
tcp 0 0 192.168.0.116:55918 203.208.51.80:80 ESTABLISHED
tcp6 0 0 :::80 :::* LISTEN
Then I used lsof -i:80 to find the PID of the process occupying the port, killed it with kill <pid>, and restarted PyCharm and the network connection.
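Whether a local process is actually holding a port can also be checked from Python instead of eyeballing netstat output: binding fails with EADDRINUSE when it is. A sketch; `port_in_use` is my own name, and unlike the lsof/kill step above it only detects the conflict, it does not free the port:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if another local process is already bound to `port`.

    Binding raises OSError (EADDRINUSE) when the port is taken,
    which corresponds to a LISTEN line in the netstat output.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
        except OSError:
            return True
        return False
```

For the privileged port 80 itself this check needs no root, since a failed bind reports EADDRINUSE or EACCES either way and both mean "not usable".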
Running ceshi.py again returned <Response [200]>.
Unless otherwise stated, articles on this site are original; if you repost, please cite the source: 爬虫.requests.exceptions.ConnectionErro - Python技术站