Shing Lyu

Disclaimer: This content reflects my personal opinions, not those of any organizations I am or have been affiliated with. Code samples are provided for illustration purposes only; use them with caution and test thoroughly before deployment.

Serverless Rust on GCP Cloud Run: From Basic Deployment to Optimized Container Builds

A few weeks ago, I published a comparison of Rust across AWS Lambda, Azure Functions, and GCP’s serverless offerings. I was pretty harsh on GCP, essentially dismissing Cloud Run as “not really serverless” and suggesting Google was missing the boat entirely.

The GCP community didn’t let that slide.

Multiple comments and messages pointed out that I hadn’t given Cloud Run a fair evaluation. They were right. I was so focused on function-as-a-service models that I didn’t appreciate what GCP was actually offering with their “serverless containers” approach.

I had wrongly assumed that “containers” meant always-on hosts, but Cloud Run actually uses a microVM security model that provides strong isolation while still billing you only for the time your code runs - just like other serverless platforms.

So I did what any curious engineer should do: I went back and gave it another try.

And honestly? I was pleasantly surprised. Cloud Run’s container-first approach actually works brilliantly with Rust, and the developer experience is surprisingly smooth once you understand the optimization patterns.

(continue reading...)


Rust Serverless on the Big Three Clouds: AWS, Azure, and GCP Compared

When you’re choosing a cloud platform for your next Rust serverless project, the landscape can be confusing. Each of the big three cloud providers—AWS, Azure, and Google Cloud Platform—offers different levels of maturity and support for Rust in serverless environments. I’ve spent some time experimenting with two platforms (more on why I’m missing one later), and the differences are more significant than you might expect.

Why does this comparison matter? Rust brings compelling advantages to serverless computing: it has low resource requirements and high performance, which translate to faster execution and lower cloud costs. Additionally, Rust eliminates many bugs at compile time, so your code is less like a ticking time bomb waiting to break at runtime. That means less late-night on-call madness and happier developers.
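As a tiny illustration of that compile-time safety (a minimal sketch, not taken from any of the posts below), consider how Rust forces you to handle the "no value" case before you can use a result - there is no null pointer to dereference at runtime:

```rust
// A hypothetical lookup that may or may not find a user.
// Returning Option<T> makes "not found" part of the type signature.
fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        _ => None, // no such user; the caller is forced to handle this
    }
}

fn main() {
    // The compiler rejects any attempt to use the value without
    // checking it first, so this class of crash never ships.
    match find_user(42) {
        Some(name) => println!("found: {}", name),
        None => println!("no such user"),
    }
}
```

In a language with nulls, forgetting that check is a production incident; in Rust, it is a compile error.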

What I Mean by “Serverless”

Before diving into the comparison, let me clarify what I mean by “serverless” in this context. Many cloud providers give vague definitions, so here’s my own practical definition that guided this evaluation.

(continue reading...)


When Customers Say No: The Unexpected Door Solution Architects Should Never Ignore

“Sorry, but that just won’t work for our organization.”

Ever been in a customer meeting where the conversation seems to hit a brick wall? As a solution architect, you pitch your best cloud strategy, only to be met with a polite—sometimes not-so-polite—rejection. Most folks would pack up and move on. But what if this is the moment you should lean in and ask… why?

(continue reading...)


Building a Database-Backed API with Serverless Rust on Azure

In my previous post, we deployed a simple “Hello World” Rust function to Azure. Now we’re ready to build something production-worthy: a database-backed REST API that handles real business logic. We’ll create a cake ordering system for a local bakery, complete with data persistence, input validation, and proper error handling.

Business Case

(continue reading...)


Serverless Rust on Azure: Deploying a Rust Azure Function

This is the first post in my new series on serverless Rust on Azure. We’ll start simple by following the official Azure tutorial, with my notes on the caveats and tweaks I’ve made along the way. Having worked with AWS for the past 6 years, I’ll also highlight key differences from AWS Lambda.

You can find the complete code on GitHub: https://github.com/shinglyu/serverless-rust-on-azure.

(continue reading...)