What I want to achieve
pip install --user transformers==3.5
Environment
Python 3.10.5, on Windows with Visual Studio 2019 Community (per the paths in the error log below)
Background
I am trying to install transformers version 3.5,
but the installation fails with the error below.
Note that when I run
pip install transformers --user
the output is
Successfully installed transformers-4.26.1
so the latest version, 4.26.1, does install without problems.
What I need, however, is transformers version 3.5.
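For what it's worth, either of these commands can be used to double-check which version actually ended up installed:

pip show transformers
python -c "import transformers; print(transformers.__version__)"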
Problem / error message
pip install --user transformers==3.5
Defaulting to user installation because normal site-packages is not writeable
Collecting transformers==3.5
.
.(omitted)
.
Requirement already satisfied: charset-normalizer<3,>=2 in c:\program files\python310\lib\site-packages (from requests->transformers==3.5) (2.1.1)
Requirement already satisfied: certifi>=2017.4.17 in c:\program files\python310\lib\site-packages (from requests->transformers==3.5) (2022.9.14)
Requirement already satisfied: joblib in c:\program files\python310\lib\site-packages (from sacremoses->transformers==3.5) (1.1.0)
Collecting click
Using cached click-8.1.3-py3-none-any.whl (96 kB)
Requirement already satisfied: six in c:\program files\python310\lib\site-packages (from sacremoses->transformers==3.5) (1.16.0)
Building wheels for collected packages: sentencepiece, tokenizers
Building wheel for sentencepiece (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
    C:\Users\Masarina\AppData\Roaming\Python\Python310\site-packages\setuptools\dist.py:788: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
      warnings.warn(
    running bdist_wheel
    running build
    running build_py
    creating build
    creating build\lib.win-amd64-cpython-310
    copying sentencepiece.py -> build\lib.win-amd64-cpython-310
    running build_ext
    building '_sentencepiece' extension
    creating build\temp.win-amd64-cpython-310
    creating build\temp.win-amd64-cpython-310\Release
    "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30133\bin\HostX86\x64\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD "-IC:\Program Files\Python310\include" "-IC:\Program Files\Python310\Include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30133\include" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\cppwinrt" /EHsc /Tpsentencepiece_wrap.cxx /Fobuild\temp.win-amd64-cpython-310\Release\sentencepiece_wrap.obj /MT /I..\build\root\include
    cl : Command line warning D9025 : overriding '/MD' with '/MT'
    sentencepiece_wrap.cxx
    sentencepiece_wrap.cxx(2777): fatal error C1083: Cannot open include file: 'sentencepiece_processor.h': No such file or directory
    error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.29.30133\\bin\\HostX86\\x64\\cl.exe' failed with exit code 2
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for sentencepiece
Running setup.py clean for sentencepiece
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [47 lines of output]
    running bdist_wheel
    running build
    running build_py
    creating build
    creating build\lib.win-amd64-cpython-310
    creating build\lib.win-amd64-cpython-310\tokenizers
    copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
    creating build\lib.win-amd64-cpython-310\tokenizers\models
    copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
    creating build\lib.win-amd64-cpython-310\tokenizers\decoders
    copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
    creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
    copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
    creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
    copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
    creating build\lib.win-amd64-cpython-310\tokenizers\processors
    copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
    creating build\lib.win-amd64-cpython-310\tokenizers\trainers
    copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
    creating build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
    copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
    copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
    copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
    copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
    copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
    copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
    copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
    running build_ext
    running build_rust
    error: can't find Rust compiler

    If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.

    To update pip, run:

        pip install --upgrade pip

    and then retry package installation.

    If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build sentencepiece tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
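Reading the log, there seem to be two separate failures: sentencepiece fails to compile because cl.exe cannot find sentencepiece_processor.h, and tokenizers fails because no Rust compiler is on the PATH. My understanding (possibly wrong) is that both happen because pip found no prebuilt wheels of these older pinned dependencies for Python 3.10 and fell back to building them from source. As a diagnostic, forcing pip to accept only prebuilt wheels should fail immediately and name exactly which dependencies have no wheel for this interpreter:

pip install --user --only-binary :all: transformers==3.5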
What I have tried
pip install --user Cmake
pip install --user --upgrade pip
pip install --user --upgrade setuptools
(PC restarted after each of these commands)
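One thing I have not yet tried is installing a Rust toolchain, even though the tokenizers error message itself suggests it. If that is the right direction, I assume the steps on Windows would be roughly: download rustup-init.exe from https://rustup.rs, run it, then confirm in a new terminal that the toolchain is on the PATH:

rustup-init.exe
rustc --version
cargo --version

Even with Rust installed, I suspect the separate sentencepiece C++ build failure would remain, so I am not sure this alone would be enough.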
Supplementary information (framework/tool versions, etc.)
This is a slightly roundabout piece of information, but I am including the story so far, just in case.
To practice implementing BERT, I am working through this notebook:
https://github.com/YutaroOgawa/BERT_Japanese_Google_Colaboratory/blob/master/2_BERT_livedoor_news_on_Google_Colaboratory.ipynb
The notebook does not mention which Python version it uses, so my current situation is:
"it would probably work on some Python version earlier than 3.10.5,"
"but I have already set up a cupy environment and don't want to break it."
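If a separate environment is the way to go, I imagine something like the following would leave the existing cupy setup untouched. This is only a sketch: it assumes an older Python (e.g. 3.8) is also installed and reachable through the py launcher, and the environment name bert35 is just an example:

py -3.8 -m venv bert35
bert35\Scripts\activate
pip install transformers==3.5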
Other
I am only doing this on a whim, so please go easy on me (; ・`д・´)
