GGML - AI at the edge
ggml is a tensor library for machine learning that enables large models and high performance on commodity hardware. It is used by llama.cpp and whisper.cpp. A minimal usage sketch follows the feature list below.
- Low-level cross-platform implementation
- Integer quantization support
- Broad hardware support
- No third-party dependencies
- Zero memory allocations during runtime
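As a rough illustration of what "tensor library" means here, the sketch below builds and evaluates a tiny expression, f(x) = a*x^2 + b, using ggml's C API. This is a minimal sketch based on ggml's public headers; names such as ggml_graph_compute_with_ctx reflect one version of the API and may differ in newer releases.

```c
#include <stdio.h>
#include "ggml.h"

int main(void) {
    // all tensors and graph metadata live in one pre-allocated buffer,
    // which is how ggml avoids memory allocations while evaluating
    struct ggml_init_params params = {
        .mem_size   = 16 * 1024 * 1024, // 16 MB arena
        .mem_buffer = NULL,             // let ggml allocate it once
        .no_alloc   = false,
    };
    struct ggml_context * ctx = ggml_init(params);

    // define the computation graph for f(x) = a*x^2 + b
    struct ggml_tensor * x = ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 1);
    struct ggml_tensor * a = ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 1);
    struct ggml_tensor * b = ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 1);
    struct ggml_tensor * f = ggml_add(ctx, ggml_mul(ctx, a, ggml_mul(ctx, x, x)), b);

    struct ggml_cgraph * gf = ggml_new_graph(ctx);
    ggml_build_forward_expand(gf, f);

    // set input values and evaluate
    ggml_set_f32(x, 2.0f);
    ggml_set_f32(a, 3.0f);
    ggml_set_f32(b, 4.0f);
    ggml_graph_compute_with_ctx(ctx, gf, /*n_threads=*/1);

    printf("f = %f\n", ggml_get_f32_1d(f, 0)); // expected: 16.0

    ggml_free(ctx);
    return 0;
}
```

The single context arena reserved by ggml_init is what the "zero memory allocations during runtime" point above refers to: under this design, graph construction and evaluation reuse that buffer rather than allocating per operation.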
The ggml way
- Minimal: We like simplicity and aim to keep the codebase as small and as simple as possible.
- Open Core: The library and related projects are freely available under the MIT license. The development process is open and everyone is welcome to join. In the future we may choose to develop extensions that are licensed for commercial use.
- Explore and have fun!: We built ggml in the spirit of play. Contributors are encouraged to try crazy ideas, build wild demos, and push the edge of what's possible.
Contributing
Join the development at ggml-org on GitHub.
Company
ggml.ai is a company founded by Georgi Gerganov to support the development of ggml. Nat Friedman and Daniel Gross provided the pre-seed funding.
We are currently seeking to hire full-time developers who share our vision and would like to help advance the idea of on-device inference. If you are interested and have already contributed to any of the related projects, please contact us at [email protected].
Business inquiries
For any business-related topics, including support or enterprise deployment, please contact us at [email protected].