Author | Commit | Message | Date
hiyouga | 90d6df6222 | release v0.9.0 (real) | 2024-09-09 01:00:25 +08:00
hiyouga | 94d5b1bd8f | add e2e tests | 2024-09-05 21:52:28 +08:00
hiyouga | 359ef8bb0e | support Yi-Coder models | 2024-09-05 03:12:24 +08:00
hiyouga | 57497135bf | add vl_feedback dataset | 2024-09-04 03:13:03 +08:00
hiyouga | 194064fdae | add pokemon dataset | 2024-09-02 01:02:25 +08:00
hiyouga | a8f8a2ad8a | update readme | 2024-09-01 23:32:39 +08:00
hiyouga | 753c5fb36c | update wechat | 2024-09-01 23:30:57 +08:00
hoshi-hiyouga | 6f9e455af4 | Merge pull request #5317 from ByronHsu/patch-1: Add liger kernel link | 2024-09-01 23:30:12 +08:00
hiyouga | 8e49940746 | add rlhf-v dataset | 2024-09-01 22:57:41 +08:00
Byron Hsu | b8a9cb554e | Add liger kernel link | 2024-08-30 17:16:16 -07:00
hiyouga | e08045a946 | add examples | 2024-08-30 21:43:19 +08:00
hiyouga | 3382317e32 | refactor mm training | 2024-08-30 02:14:31 +08:00
hiyouga | d14edd350d | add extra requires | 2024-08-27 12:52:12 +08:00
hiyouga | 72bc8f0111 | support liger kernel | 2024-08-27 11:20:14 +08:00
hiyouga | 3804ddec9e | update readme | 2024-08-19 23:32:04 +08:00
codingma | 625a0e32c4 | add tutorial and doc links | 2024-08-13 16:13:10 +08:00
hiyouga | c93d55bfb0 | update readme | 2024-08-10 10:17:35 +08:00
hiyouga | 576a894f77 | update readme | 2024-08-09 20:46:02 +08:00
hiyouga | c75b5b83c4 | add magpie ultra dataset | 2024-08-09 20:28:55 +08:00
hiyouga | dc770efb14 | add qwen2 math models | 2024-08-09 20:20:35 +08:00
hiyouga | e2a28f51c6 | add adam_mini to readme | 2024-08-09 20:02:03 +08:00
hiyouga | 86f7099fa3 | update scripts | 2024-08-09 19:16:23 +08:00
hiyouga | b7ca6c8dc1 | fix #5048 | 2024-08-05 23:48:19 +08:00
hoshi-hiyouga | 9e409eadb0 | Update README.md | 2024-07-30 01:53:19 +08:00
hoshi-hiyouga | 8d5a41f2cd | Update README.md | 2024-07-30 01:52:35 +08:00
liudan | b9ed9d45cc | Added MiniCPM to the supported models list on the homepage; the official MiniCPM GitHub also added a friendly link to LLama_factory | 2024-07-29 10:58:28 +08:00
hiyouga | 668654b5ad | tiny fix | 2024-07-26 11:51:00 +08:00
hoshi-hiyouga | b8896b9b8b | Merge pull request #4970 from HardAndHeavy/add-rocm: Add ROCm support | 2024-07-26 11:41:23 +08:00
hoshi-hiyouga | 1186ad53d4 | Update README.md | 2024-07-26 11:29:28 +08:00
hoshi-hiyouga | f97beca23a | Update README.md | 2024-07-26 11:29:09 +08:00
HardAndHeavy | c8e18a669a | Add ROCm support | 2024-07-25 21:29:28 +03:00
khazic | ceba96f9ed | Added the reference address for TRL PPO details. | 2024-07-25 09:03:21 +08:00
hiyouga | 77cff78863 | fix #4959 | 2024-07-24 23:44:00 +08:00
hoshi-hiyouga | 5626bdc56d | Update README.md | 2024-07-24 21:07:14 +08:00
hiyouga | 26533c0604 | add llama3.1 | 2024-07-24 16:20:11 +08:00
hiyouga | 87346c0946 | update readme | 2024-07-03 19:39:05 +08:00
wangzhihong | 22da47ba27 | add LazyLLM to `Projects using LLaMA Factory` in `README.md` | 2024-07-03 11:12:20 +08:00
hiyouga | d4e2af1fa4 | update readme | 2024-07-01 00:22:52 +08:00
hiyouga | d74244d568 | fix #4398 #4592 | 2024-06-30 21:28:51 +08:00
hiyouga | 0e0d69b77c | update readme | 2024-06-28 06:55:19 +08:00
hiyouga | 6f63050e1b | add Gemma2 models | 2024-06-28 01:26:50 +08:00
hiyouga | e44a4f07f0 | tiny fix | 2024-06-27 20:14:48 +08:00
hoshi-hiyouga | 64b131dcfa | Merge pull request #4461 from hzhaoy/feature/support-flash-attn: support flash-attn in Dockerfile | 2024-06-27 20:05:26 +08:00
hiyouga | ad144c2265 | support HQQ/EETQ #4113 | 2024-06-27 00:29:42 +08:00
hzhaoy | e19491b0f0 | add flash-attn installation flag in Dockerfile | 2024-06-27 00:13:30 +08:00
hiyouga | efb81b25ec | fix #4419 | 2024-06-25 01:51:29 +08:00
hiyouga | 41086059b1 | tiny fix | 2024-06-25 01:15:19 +08:00
hoshi-hiyouga | 5dc8fa647e | Update README.md | 2024-06-25 01:03:38 +08:00
MengqingCao | d7207e8ad1 | update docker files: 1. add docker-npu (Dockerfile and docker-compose.yml); 2. move cuda docker to docker-cuda and tiny changes to adapt to the new path | 2024-06-24 10:57:36 +00:00
hiyouga | e507e60638 | update readme | 2024-06-24 18:22:12 +08:00