Whisper
[[Deep Learning]]
#contents
&color(red){※Prerequisite: this information is described based on Whisper 1.5.0.};
* Whisper [#v9e2bc78]
This solution is Whisper, open-sourced by OpenAI; naturally, it is written in Python.
GitHub repository:
https://github.com/openai/whisper
Reference:
https://blog.csdn.net/xiaohucxy/article/details/134838912
Model download location:
https://huggingface.co/ggerganov/whisper.cpp/tree/main
//models uploaded by different contributors
https://huggingface.co/Systran
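For quick reference, here is a minimal transcription sketch using the openai/whisper Python package; the model name ("small") and the audio path ("audio.mp3") are placeholders, not values from this page.
#codeprettify{{
# Minimal example, assuming openai-whisper is installed (pip install -U openai-whisper).
import whisper

model = whisper.load_model("small")      # any size from "tiny" up to "large" works
result = model.transcribe("audio.mp3")   # placeholder path to an audio file
print(result["text"])
}}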
** CPU [#vca4d666]
Install the following two packages via NuGet:
- Whisper.net
- Whisper.net.Runtime
** GPU [#i3b54b68]
Install the following packages via NuGet (choose the runtime that matches your GPU):
- Whisper.net
- Whisper.net.Runtime.Clblast
- Whisper.net.Runtime.Cublas: Cublas refers to cuBLAS, the GPU-accelerated BLAS library provided by NVIDIA; use this runtime on NVIDIA GPUs with CUDA.
- Whisper.net.Runtime.Clblast: Clblast refers to CLBlast, a tuned BLAS library for OpenCL; use this runtime on GPUs that support OpenCL.
* Fast-Whisper [#e9d4529a]
Although that is already quite simple, it is still not streamlined enough for programmers, who generally want to drive everything from code.
Hence the faster, leaner Faster-Whisper. Faster-Whisper is not an official OpenAI project; it reimplements the Whisper model on top of the CTranslate2 inference engine.
To sum up, it is faster than Whisper: the project's own claim is that it is up to 4 times faster than openai/whisper for the same accuracy while using less memory.
GitHub repository:
https://github.com/SYSTRAN/faster-whisper
CUDA download page:
https://developer.nvidia.com/cuda-downloads
Run a model saved locally (the example below loads faster-whisper-small; large-v3 works the same way):
#codeprettify{{
from faster_whisper import WhisperModel

model_size = "small"
path = r"E:\aSer\whisper\faster-whisper-small"

# Run on CPU with INT8
model = WhisperModel(model_size_or_path=path, device="cpu", compute_type="int8")
# or run on GPU with FP16
# model = WhisperModel(model_size, device="cuda", compute_type="float16")
# or run on GPU with INT8
# model = WhisperModel(model_size, device="cuda", compute_type="int8_float16")

segments, info = model.transcribe("E:\\aSer\\whisper\\202...")  # audio file to transcribe

for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
}}
** Download links [#td954730]
These Hugging Face links may require a VPN/proxy to reach from mainland China.
|large-v3 model|https://huggingface.co/Systran/faster-whisper-large-v3|
|large-v2 model|https://huggingface.co/guillaumekln/faster-whisper-large-v2|
|medium model|https://huggingface.co/guillaumekln/faster-whisper-medium|
|small model|https://huggingface.co/guillaumekln/faster-whisper-small|
|base model|https://huggingface.co/guillaumekln/faster-whisper-base|
|tiny model|https://huggingface.co/guillaumekln/faster-whisper-tiny|
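If downloading from the browser is inconvenient, the same repositories can be fetched programmatically. A minimal sketch, assuming the huggingface_hub package is installed; the repo_id and the local target directory below are example values, not taken from this page.
#codeprettify{{
# Download a faster-whisper model repository to a local folder.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Systran/faster-whisper-large-v3",             # any repository from the table above
    local_dir=r"E:\aSer\whisper\faster-whisper-large-v3",  # example target directory
)
print("Model saved to:", local_dir)
}}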
Download cuBLAS and cuDNN:
https://github.com/Purfview/whisper-standalone-win/releases
** Environment setup [#ae0befd5]
Create the environment: create a Python runtime environment with conda
conda create -n faster_whisper python=3.9  # a recent Python 3 (3.8+) is required
Activate the virtual environment
conda activate faster_whisper
Install the faster-whisper dependency
pip install faster-whisper
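After installation, a quick sanity check confirms that the package and a model load correctly. A minimal sketch; the "tiny" model and the audio path are placeholder choices, not values from this page.
#codeprettify{{
# Verify the faster-whisper installation: load a small model and detect the language of a file.
from faster_whisper import WhisperModel

model = WhisperModel("tiny", device="cpu", compute_type="int8")  # "tiny" is downloaded automatically if not cached
segments, info = model.transcribe("audio.wav")                   # placeholder audio path
print("Detected language '%s' with probability %f" % (info.language, info.language_probability))
}}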
* Distil-Whisper [#e89abc06]
Distil-Whisper is a distilled version of Whisper that is 6 times faster, 49% smaller, and performs within 1% WER of Whisper on out-of-distribution evaluation sets.
https://github.com/huggingface/distil-whisper
Model download location:
https://huggingface.co/distil-whisper/distil-large-v3
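Distil-Whisper models are typically run through the Hugging Face Transformers ASR pipeline. A minimal sketch, assuming transformers and torch are installed; the input file name is a placeholder.
#codeprettify{{
# Transcribe a file with distil-large-v3 via the Transformers ASR pipeline.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"
asr = pipeline(
    "automatic-speech-recognition",
    model="distil-whisper/distil-large-v3",
    torch_dtype=torch.float16 if device != "cpu" else torch.float32,
    device=device,
)
result = asr("audio.mp3")  # placeholder path to an audio file
print(result["text"])
}}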
#hr();
Comments:
#comment_kcaptcha