Inferact
Our mission is to grow vLLM as the world's AI inference engine and accelerate AI progress by making inference cheaper and faster.
- 118 followers
- United States of America
- https://inferact.ai/
- contact@inferact.ai
Popular repositories
- vllm-large-scale-serving (Public, forked from vllm-project/vllm): A high-throughput and memory-efficient inference and serving engine for LLMs. Written in Python.