1000 Commits

Author SHA1 Message Date
samiabat
59164a4cb5 add ref 2025-03-25 02:28:05 +03:00
samiabat
73f08615bf change webui to api 2025-03-25 02:13:20 +03:00
samiabat
4354225311 fix port 2025-03-25 01:06:30 +03:00
samiabat
944119b720 remove load weight false part 2025-03-25 00:44:09 +03:00
samiabat
9e96a70762 add all 2025-03-25 00:19:45 +03:00
samiabat
28f3bd48f5 change the app that is going to run 2025-03-24 23:57:43 +03:00
samiabat
28f9e4d14c add all 2025-03-24 22:30:36 +03:00
samiabat
ee8e5f8ab8 add latest 2025-03-24 22:16:53 +03:00
samiabat
b66e483fe8 add the correct path 2025-03-24 22:00:21 +03:00
samiabat
102f08d7f1 add local files 2025-03-24 21:38:57 +03:00
samiabat
550c37528c add all 2025-03-24 21:09:02 +03:00
samiabat
f623a5bb96 add git cloning 2025-03-24 20:45:38 +03:00
samiabat
82c300b417 fix the dockerfile issue 2025-03-24 11:44:32 +03:00
samiabat
636ea89e4a fix pytorch issue 2025-03-24 11:34:44 +03:00
samiabat
b0f4a3927f remove git lfs 2025-03-24 11:16:02 +03:00
samiabat
2cb70c0889 add 2025-03-24 10:53:02 +03:00
samiabat
7533685af9 change the git 2025-03-24 10:52:07 +03:00
samiabat
8c8f9e7b40 change to api 2025-03-24 10:13:54 +03:00
samiabat
9450cd951d add new dockerfile 2025-03-24 05:57:40 +03:00
samiabat
bf993dff2b remove the download.py 2025-03-24 05:19:21 +03:00
ChasonJiang
165882d64f
Fix bug caused by redundant comments (#2158) 2025-03-05 18:22:01 +08:00
RVC-Boss
271db6a4de
fix torch.inference_mode() RuntimeError: Inplace update to inference tensor outside InferenceMode is not allowed.
2025-03-05 18:07:47 +08:00
ChasonJiang
053a356ffe
Fix the gpt padding mask issue (#2153)
* Fix the gpt padding mask issue

* rollback tts_config
2025-03-05 17:14:43 +08:00
KamioRinn
fe2f04bdb8
API for V3 (#2154) 2025-03-05 17:13:46 +08:00
ChasonJiang
6dd2f72090
Change the gpt parallel-inference mask strategy to padding left (#2144)
* Change the gpt parallel-inference mask strategy to padding left so that batch_infer behaves closer to naive_infer
Reduce redundant operations and use torch_sdpa to improve inference speed

* rollback tts_infer.yaml
2025-03-04 16:45:37 +08:00
KamioRinn
959a2ddbeb
Matching fast_langdetect update (#2140) 2025-03-04 14:10:58 +08:00
Fridemn
bb8a8efeca
fix: Fix the Linux one-click installation script failing to run (#2142)
When installing the pyopenjtalk library, not only must the gcc version be no higher than 14, but the environment variables must also be updated to the newly installed gcc/g++/cmake before running pip install -r requirements.txt.
Therefore, setting the environment variables was added after installing all three, and hash -r is used to make sure they take effect.
2025-03-04 14:10:37 +08:00
RVC-Boss
df33574a26
Fix a bug where the handling of volume exceeding 1 after super-resolution was written incorrectly
2025-03-01 19:17:03 +08:00
Fridemn
4fd57b0ea7
Fix pyopenjtalk failing to compile on Linux (#2127) 20250228v3 2025-02-28 22:21:34 +08:00
RVC-Boss
118c2bed68
Update Changelog_CN.md 2025-02-28 16:53:33 +08:00
RVC-Boss
a32a2b8934
v3sovits model inference supports passing a speech-rate parameter from the webui to adjust synthesis speed
2025-02-28 16:50:35 +08:00
RVC-Boss
c38b169019
v3sovits model inference supports passing a speech-rate parameter from the webui to adjust synthesis speed
2025-02-28 16:50:12 +08:00
RVC-Boss
a69be1eae7
Merge pull request #2122 from KamioRinn/Fix-Short-CJK-LangSegmenter
Fix Short CJK LangSegmenter
2025-02-28 11:23:05 +08:00
RVC-Boss
9ef2c00bbc
Merge pull request #2123 from SapphireLab/Doc
Update Documentation
2025-02-28 11:01:30 +08:00
starylan
b446f7954b Update ChangeLog for other languages 2025-02-28 02:52:15 +08:00
starylan
ff299d17d3 Update ReadMe for other languages 2025-02-28 02:51:24 +08:00
starylan
e9af7921fa Remove redundant indentation 2025-02-28 02:51:04 +08:00
KamioRinn
3356fc9e09 Fix get_phones_and_bert 2025-02-28 02:25:58 +08:00
KamioRinn
531a38f119 Fix Short CJK LangSegmenter 2025-02-28 02:04:25 +08:00
starylan
780524e0cc Update i18n 2025-02-28 01:41:20 +08:00
starylan
e35ade8f60 Split i18n 2025-02-28 01:41:04 +08:00
starylan
fc1400bdba Update webui 2025-02-28 01:40:50 +08:00
RVC-Boss
2cd843dcbc
Update inference_webui.py 2025-02-27 22:46:29 +08:00
RVC-Boss
94ffcbe616
Update inference_webui.py 2025-02-27 22:43:18 +08:00
RVC-Boss
a0ff6fa7a2
Update inference_webui.py 2025-02-27 22:28:02 +08:00
RVC-Boss
0b20a949ed
Update audio_sr.py 2025-02-27 22:25:24 +08:00
RVC-Boss
a68e3c4354
Update TTS.py 2025-02-27 22:14:51 +08:00
RVC-Boss
ffcba8e553
Update requirements.txt 2025-02-27 21:05:53 +08:00
RVC-Boss
250b1c73cb
fix 24k to 48k inference 2025-02-27 20:49:17 +08:00
RVC-Boss
060a0d91dc
fix 24k to 48k inference
2025-02-27 19:05:54 +08:00