How to implement multi-party LoRA fine-tuning of large models #1855
Unanswered — Silver-Glacier asked this question in Q&A
Replies: 0 comments
How can this framework be used to implement LoRA fine-tuning of large models in a multi-party scenario? Could you give an example of fine-tuning an LLM, for instance something like fate-llm in the FATE framework?
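The specific framework's API isn't shown in this discussion, but the usual idea behind multi-party LoRA fine-tuning (as in FATE's fate-llm) is that each party trains only the small low-rank adapter matrices locally, and a coordinator aggregates those adapters (e.g. by federated averaging) while the frozen base model weights never leave any party. A minimal sketch of that aggregation step, with hypothetical names (`fedavg_lora`, the `lora_A`/`lora_B` keys) and toy data rather than any real framework call:

```python
import numpy as np

def fedavg_lora(adapters, weights=None):
    """Federated averaging over LoRA adapter matrices only.

    `adapters` is a list of dicts, one per party, each mapping an
    adapter name (e.g. "lora_A", "lora_B") to a weight matrix.
    The frozen base model is never communicated, only these
    small low-rank matrices.
    """
    n = len(adapters)
    if weights is None:
        weights = [1.0 / n] * n  # uniform weighting across parties
    return {
        name: sum(w * party[name] for w, party in zip(weights, adapters))
        for name in adapters[0]
    }

# Toy example: 3 parties, rank-4 LoRA on a 16x16 linear layer.
rng = np.random.default_rng(0)
parties = [
    {"lora_A": rng.normal(size=(4, 16)), "lora_B": np.zeros((16, 4))}
    for _ in range(3)
]
global_adapter = fedavg_lora(parties)
print(global_adapter["lora_A"].shape)  # (4, 16)
```

In a real deployment each round would be: parties run local LoRA training steps, upload only the adapter deltas, the coordinator averages them as above, and broadcasts the merged adapter back; consult the framework's own documentation for its actual federated job API.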