I recently became interested in computer vision and have worked through a couple of tutorials with the goal of solving some problems in my business. There are four buildings on the site, and I need to count the number of people entering and exiting each of them. I also need to account for the movement of service personnel.
I tried using the following repository to solve these problems:
https://github.com/theAIGuysCode/yolov4-deepsort
It looks like it will solve my problem, but processing speed is now a concern: the script took 211 seconds to process a ten-second video fragment, which is much too slow for working through whole CCTV recordings.
What can I do to improve the processing speed? Please point me in the right direction.
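From what I have read so far, the usual levers are running inference on a GPU, downscaling frames, and skipping frames, since counting people rarely needs every frame of the footage. As an illustration of the direction I mean, here is a rough subsampling sketch (process_frame is a hypothetical placeholder for the repository's detect-and-track step, not its real API):

import cv2

VIDEO_PATH = "fragment.mp4"  # hypothetical input file
FRAME_STRIDE = 5             # run detection on every 5th frame only
TARGET_WIDTH = 640           # downscale frames before inference

def process_frame(frame):
    # placeholder for the YOLOv4 + DeepSORT detect-and-track step
    pass

cap = cv2.VideoCapture(VIDEO_PATH)
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % FRAME_STRIDE == 0:
        h, w = frame.shape[:2]
        small = cv2.resize(frame, (TARGET_WIDTH, int(h * TARGET_WIDTH / w)))
        process_frame(small)
    frame_idx += 1
cap.release()

Is this kind of subsampling a reasonable direction, or is GPU acceleration the only realistic option here?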
Separately, I get an error when trying to install OpenVINO, which I wanted to try in order to speed up inference:
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3.6 /home/baurzhan/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmpr56p_xyt
cwd: /tmp/pip-install-pfzy6bre/tokenizers_dce0e65cae1e4e7c9325570d12cd6d63
Complete output (51 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.6
creating build/lib.linux-x86_64-3.6/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers
creating build/lib.linux-x86_64-3.6/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/models
creating build/lib.linux-x86_64-3.6/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/decoders
creating build/lib.linux-x86_64-3.6/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/normalizers
creating build/lib.linux-x86_64-3.6/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-3.6/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/processors
creating build/lib.linux-x86_64-3.6/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/trainers
creating build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-3.6/tokenizers/implementations
creating build/lib.linux-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-3.6/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-3.6/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:
pip install --upgrade pip
and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
----------------------------------------
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
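The log itself suggests two ways out; a hedged sketch of the corresponding commands (note: on Python 3.6, pip can only upgrade to the last release that still supports 3.6, and recent tokenizers releases may not ship 3.6 wheels at all):

# Option 1: upgrade pip so it can install a prebuilt tokenizers wheel, if one exists
python3.6 -m pip install --upgrade pip

# Option 2: install the Rust toolchain via rustup, then retry the failed install
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"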