The complete error message:
'(MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/vocab.txt (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f1320354880>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: 625af900-631f-4614-9358-30364ecacefe)')' thrown while requesting HEAD https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt
'(MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/added_tokens.json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f1320354d60>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: 1679a995-7441-4afe-a685-9a7bd6da9f2a)')' thrown while requesting HEAD https://huggingface.co/bert-base-uncased/resolve/main/added_tokens.json
'(MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/special_tokens_map.json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f13202fb250>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: 9af5b73e-5230-45d7-8886-5d37d38f09a8)')' thrown while requesting HEAD https://huggingface.co/bert-base-uncased/resolve/main/special_tokens_map.json
'(MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/tokenizer_config.json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f13202fb730>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: 12136040-d033-4099-821c-dcb80fb50018)')' thrown while requesting HEAD https://huggingface.co/bert-base-uncased/resolve/main/tokenizer_config.json
Traceback (most recent call last):
File "/tmp/pycharm_project_494/Zilean-Classifier/main.py", line 48, in <module>
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
File "/root/miniconda3/envs/DL/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1838, in from_pretrained
raise EnvironmentError(
OSError: Can't load tokenizer for 'bert-base-uncased'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'bert-base-uncased' is the correct path to a directory containing all relevant files for a BertTokenizer tokenizer.
The root cause of this error is that your server cannot reach huggingface.co. You can verify this directly on the server:
ping huggingface.co
In my case, no data came back at all. Note that this check only makes sense if the server has network access in the first place (try ping www.baidu.com to confirm basic connectivity).
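Since ICMP is often filtered even when HTTPS works (or vice versa), a TCP check against port 443 tests what the library actually needs more directly than ping. A minimal sketch:

```python
import socket

def can_reach(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# prints True or False depending on whether huggingface.co is reachable
print(can_reach("huggingface.co"))
```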
Solution 1
Use a VPN. This approach works best when the machine is your own; on a rented server it is more hassle, because rented servers are periodically wiped. Setting up a VPN on Linux is straightforward, and a quick search turns up plenty of guides.
Solution 2 (recommended)
The second approach suits rented machines: upload a pretrained model that you have already downloaded locally (your local machine must be able to reach the internet). If you have already run the code locally, the files are already in your local cache; if not, download them from the official site first. Start by locating the local cache path (on Windows it is a hidden .cache folder under your user directory, so enable showing hidden folders):
Download the specified model locally (your local machine needs unrestricted internet access):
from transformers import BertModel, BertTokenizer

# using bert-large-uncased
model = BertModel.from_pretrained('bert-large-uncased')
tokenizer = BertTokenizer.from_pretrained('bert-large-uncased')
At this point the downloaded files will appear on your machine.
Find the downloaded model files locally:
- If you are a Windows user, look under your User folder in the hidden .cache/huggingface/hub/ directory (enable showing hidden folders);
- If you are a macOS user, look in ~/.cache/huggingface/hub/models--bert-base-uncased
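On every platform the default hub cache root is ~/.cache/huggingface/hub (overridable via the HF_HUB_CACHE or HF_HOME environment variables), so you can compute the expected location in a few lines:

```python
import os
from pathlib import Path

# default cache root used by huggingface_hub, unless overridden by env vars
cache_dir = Path(os.environ.get("HF_HUB_CACHE",
                                Path.home() / ".cache" / "huggingface" / "hub"))

model_dir = cache_dir / "models--bert-base-uncased"
print(model_dir)
```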
Upload the files to the server
Upload the local files to the same path on the server: ~/.cache/huggingface/hub/models--bert-base-uncased
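Assuming standard SSH tooling, the upload itself can be done with scp; the user and server names below are placeholders:

```shell
# create the cache directory on the server, then copy the whole model folder into it
ssh user@server 'mkdir -p ~/.cache/huggingface/hub'
scp -r ~/.cache/huggingface/hub/models--bert-base-uncased user@server:~/.cache/huggingface/hub/
```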
Now you can run your code. One small caveat: at startup it will still complain that it cannot download these files. Be patient; once the connection attempts time out, the code falls back to the local files and runs normally.
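If the startup delay bothers you, you can also skip the network entirely by running in offline mode; these environment variables are honored by recent transformers / huggingface_hub versions:

```shell
# force transformers and huggingface_hub to use only locally cached files
export TRANSFORMERS_OFFLINE=1
export HF_HUB_OFFLINE=1
python main.py
```

Equivalently, you can pass `local_files_only=True` to each `from_pretrained` call in the code.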
If you do not want those error messages to show up at all, change how the model is loaded. The original loading code was:
config = BertConfig.from_pretrained(model_name)
Change it to:
# point at the local BERT model directory
bert_model_dir = "/path/to/bert/model"
config = transformers.BertConfig.from_pretrained(bert_model_dir)
That's it.
That concludes this article on fixing the error MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443):xxx").