
We at ottex.ai use bunny.net to deploy an OpenRouter-like speech-to-text API globally (5 continents, 26 locations, ~$3 idle cost).

Highly recommend their Edge Containers product: super simple, with nice primitives for deploying low-latency workloads globally.

We connect all containers to one Redis pub/sub server to push important events like user billing overages, top-ups, etc. Super simple, very fast, one config to manage all locations.
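The pattern above can be sketched in Go with nothing but the standard library, since Redis PUBLISH is a simple RESP frame over TCP. The event fields, channel name, and `respPublish` helper below are illustrative assumptions, not ottex's actual schema; a production setup would more likely use a client library such as go-redis.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net"
)

// BillingEvent is a hypothetical event shape; the real fields are not public.
type BillingEvent struct {
	Type   string `json:"type"` // e.g. "overage", "top_up"
	UserID string `json:"user_id"`
	Amount int64  `json:"amount_cents"`
}

// respPublish encodes a Redis PUBLISH command in the RESP wire format:
// an array of 3 bulk strings: PUBLISH, channel, payload.
func respPublish(channel, payload string) []byte {
	return []byte(fmt.Sprintf("*3\r\n$7\r\nPUBLISH\r\n$%d\r\n%s\r\n$%d\r\n%s\r\n",
		len(channel), channel, len(payload), payload))
}

// publish sends one event to the shared Redis instance. Every edge
// container dials the same address, which is why a single config line
// covers all 26 locations.
func publish(addr string, ev BillingEvent) error {
	body, err := json.Marshal(ev)
	if err != nil {
		return err
	}
	conn, err := net.Dial("tcp", addr)
	if err != nil {
		return err
	}
	defer conn.Close()
	_, err = conn.Write(respPublish("billing-events", string(body)))
	return err
}

func main() {
	// Show the wire frame without needing a live Redis server.
	frame := respPublish("billing-events", `{"type":"top_up"}`)
	fmt.Printf("%q\n", frame)
}
```

A central subscriber (the billing service) would then SUBSCRIBE to the same channel and react to overages and top-ups as they arrive from any region.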



That's inspirational. Do you perhaps have an architectural writeup somewhere?


Nope, but I'll think about it, thanks for the idea. Maybe it's time to start a technical blog for ottex.


Are cold starts an issue?


There are no cold starts at all; the containers run non-stop.

Bunny bills for resources actually utilized (not provisioned), and since the backend runs on Go, each idle container consumes about 0.01 CPU and 15 MB RAM, which costs pennies.



