Hugging Face Hub model search results (active filter: "sea")
SeaLLMs/SeaLLMs-v3-7B-Chat • Text Generation • 8B • 1.34k downloads • 70 likes
mlx-community/SeaLLM-7B-v2-4bit-mlx • 4 downloads • 3 likes
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • 145 downloads • 6 likes
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • 3 downloads
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • 3 downloads
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • 1 download
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • 4 downloads
sail/Sailor-1.8B-Chat-gguf • 2B • 318 downloads • 3 likes
sail/Sailor-0.5B-Chat-gguf • 0.6B • 563 downloads • 4 likes
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • 133 downloads • 8 likes
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • 3 downloads • 2 likes
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • 21 downloads • 1 like
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • 60 downloads • 1 like
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • 123 downloads • 1 like