About
Deep Infra is a powerful, self-serve machine learning platform that turns models into scalable APIs in just a few clicks. Sign up for a Deep Infra account or log in with GitHub, choose from hundreds of the most popular ML models, and call your model through a simple REST API. Deploying models to production on its serverless GPUs is faster and cheaper than building the infrastructure yourself. Pricing depends on the model: some language models are billed per token, while most other models are billed by inference execution time, so you only pay for what you use. There are no long-term contracts or upfront costs, and you can scale up and down as your business needs change. All models run on A100 GPUs, optimized for inference performance and low latency, and the system automatically scales capacity to match demand.
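To make the REST workflow above concrete, here is a minimal sketch of calling a hosted, per-token chat model over HTTP. The endpoint path, model name, response shape, and the DEEPINFRA_API_KEY variable are assumptions for illustration, not a definitive description of the service's API.

```python
import os
import requests

# Minimal sketch: call a hosted chat model over an OpenAI-style REST endpoint.
# The URL, model name, and environment variable below are assumed for illustration.
api_key = os.environ["DEEPINFRA_API_KEY"]

resp = requests.post(
    "https://api.deepinfra.com/v1/openai/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",  # assumed model id
        "messages": [
            {"role": "user", "content": "Summarize serverless GPU inference in one sentence."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```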
|
About
FriendliAI is a generative AI infrastructure platform that offers fast, efficient, and reliable inference for production environments. It provides a suite of tools and services designed to optimize the deployment and serving of large language models (LLMs) and other generative AI workloads at scale. Key offerings include Friendli Endpoints, which let users build and serve custom generative AI models, cutting GPU costs and accelerating inference, and which integrate seamlessly with popular open source models from the Hugging Face Hub for high-performance serving. FriendliAI's core technologies, such as Iteration Batching, the Friendli DNN Library, Friendli TCache, and Native Quantization, contribute to significant cost savings (50–90%), reduced GPU requirements (6× fewer GPUs), higher throughput (10.7×), and lower latency (6.2×).
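As a rough sketch of what querying a Hugging Face Hub model served through an endpoint like this might look like from the client side (the base URL, model identifier, and FRIENDLI_TOKEN variable are assumptions for illustration):

```python
import os
import requests

# Hypothetical sketch of querying a hosted open source model over HTTP.
# The endpoint URL, model name, and environment variable are assumptions.
token = os.environ["FRIENDLI_TOKEN"]

resp = requests.post(
    "https://api.friendli.ai/serverless/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {token}"},
    json={
        "model": "meta-llama-3.1-8b-instruct",  # assumed model id
        "messages": [{"role": "user", "content": "What does iteration batching optimize?"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```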
|
About
SiliconFlow is a high-performance, developer-focused AI infrastructure platform offering a unified, scalable solution for running, fine-tuning, and deploying both language and multimodal models. It provides fast, reliable inference across open source and commercial models, with low latency and high throughput, and flexible deployment options such as serverless endpoints, dedicated compute, or private cloud. Platform capabilities include one-stop inference, fine-tuning pipelines, and reserved GPU access, all delivered via an OpenAI-compatible API and complete with built-in observability, monitoring, and cost-efficient smart scaling. For diffusion-based tasks, SiliconFlow offers the open source OneDiff acceleration library, while its BizyAir runtime supports scalable multimodal workloads. Designed for enterprise-grade stability, it includes features such as BYOC (Bring Your Own Cloud), robust security, and real-time metrics.
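Because the API is described as OpenAI-compatible, an existing OpenAI client can usually be pointed at it by changing only the base URL. A minimal sketch follows; the base URL, model name, and SILICONFLOW_API_KEY variable are assumptions for illustration.

```python
import os
from openai import OpenAI  # reuse the standard OpenAI SDK against a compatible endpoint

# Base URL, model name, and environment variable are assumed for illustration.
client = OpenAI(
    base_url="https://api.siliconflow.com/v1",
    api_key=os.environ["SILICONFLOW_API_KEY"],
)

completion = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # assumed model id
    messages=[{"role": "user", "content": "Explain smart scaling in one sentence."}],
)
print(completion.choices[0].message.content)
```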
|
About
python-sql is a library for writing SQL queries in a Pythonic way. It covers simple selects; selects with a WHERE condition, one or more joins, GROUP BY, ORDER BY, output column names, a sub-select, or another schema; insert queries with default values, explicit values, or values from another query; update queries with values, a WHERE condition, or a FROM list; and delete queries with a WHERE condition or a sub-query. It also supports configurable LIMIT styles and multiple parameter styles, including qmark and numeric.
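A short usage sketch of the query-building style described above, adapted from the library's documented examples (exact output depends on the configured parameter style; the default uses format-style %s placeholders):

```python
from sql import Table

user = Table('user')

# Simple select
select = user.select()
print(tuple(select))
# ('SELECT * FROM "user" AS "a"', ())

# Select with a WHERE condition
select.where = user.name == 'foo'
print(tuple(select))
# ('SELECT * FROM "user" AS "a" WHERE ("a"."name" = %s)', ('foo',))

# Insert query with values
insert = user.insert(columns=[user.name], values=[['foo']])
print(tuple(insert))
# ('INSERT INTO "user" ("name") VALUES (%s)', ('foo',))
```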
|
|||
Platforms Supported
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
|
Platforms Supported
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
|
Platforms Supported
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
|
Platforms Supported
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
|
|||
Audience
Anyone in search of a solution to run the top AI models to improve their machine learning outcomes
|
Audience
AI infrastructure engineers wanting a solution to manage AI models across various workloads
|
Audience
Developers and AI teams seeking a solution to easily run, manage, and scale language and multimodal models in production
|
Audience
Developers searching for a solution offering a library to write SQL queries
|
|||
Support
Phone Support
24/7 Live Support
Online
|
Support
Phone Support
24/7 Live Support
Online
|
Support
Phone Support
24/7 Live Support
Online
|
Support
Phone Support
24/7 Live Support
Online
|
|||
API
Offers API
|
API
Offers API
|
API
Offers API
|
API
Offers API
|
|||
Pricing
$0.70 per 1M input tokens
Free Version
Free Trial
|
Pricing
$5.90 per hour
Free Version
Free Trial
|
Pricing
$0.04 per image
Free Version
Free Trial
|
Pricing
Free
Free Version
Free Trial
|
|||
Reviews/
|
Reviews/
|
Reviews/
|
Reviews/
|
|||
Training
Documentation
Webinars
Live Online
In Person
|
Training
Documentation
Webinars
Live Online
In Person
|
Training
Documentation
Webinars
Live Online
In Person
|
Training
Documentation
Webinars
Live Online
In Person
|
|||
Company Information
Deep Infra
deepinfra.com
|
Company Information
FriendliAI
Founded: 2021
United States
friendli.ai/
|
Company Information
SiliconFlow
Founded: 2023
Singapore
www.siliconflow.com
|
Company Information
Python Software Foundation
United States
pypi.org/project/python-sql/
|
|||
Integrations
Codestral Mamba
DeepSeek-V2
FLUX.1
FLUX.1 Kontext
Gemma 3
Kimi K2
Kubernetes
Le Chat
Llama 3
Llama 3.1
|
Integrations
Codestral Mamba
DeepSeek-V2
FLUX.1
FLUX.1 Kontext
Gemma 3
Kimi K2
Kubernetes
Le Chat
Llama 3
Llama 3.1
|
Integrations
Codestral Mamba
DeepSeek-V2
FLUX.1
FLUX.1 Kontext
Gemma 3
Kimi K2
Kubernetes
Le Chat
Llama 3
Llama 3.1
|
Integrations
Codestral Mamba
DeepSeek-V2
FLUX.1
FLUX.1 Kontext
Gemma 3
Kimi K2
Kubernetes
Le Chat
Llama 3
Llama 3.1
|
|||