Koyeb supports deploying LLM models for inference, but you have to pay for the whole running machine, and that doesn't scale for some use cases. It would be great if Koyeb added a pay-per-use API for some open models.
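To make the idea concrete, here is a rough sketch of what calling such a pay-per-use API could look like, assuming an OpenAI-compatible chat endpoint billed per token. The URL, model name, and endpoint shape are all hypothetical, not an existing Koyeb API.

```python
import json
import urllib.request

# Hypothetical endpoint and model -- just an illustration of the feature request.
API_URL = "https://inference.koyeb.example/v1/chat/completions"
API_KEY = "koyeb_api_key_here"

# Pay-per-token request payload (OpenAI-compatible schema as an assumed format).
payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 64,
}

# Build the request; it is not sent here since the endpoint does not exist.
req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
print(req.full_url)
```

With billing per token instead of per running instance, small or bursty workloads wouldn't need to keep a GPU machine up around the clock.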