Azure Functions — Serverless Computing Without the Complexity
Your API receives three requests per hour on weekdays and zero on weekends, but your VM runs 24/7 costing you $70 a month. That is the exact problem serverless computing solves. Azure Functions lets you write code that runs only when triggered — and you pay only for execution time measured in milliseconds. No server provisioning, no capacity planning, no idle compute burning money.
What Are Azure Functions?
Azure Functions is Microsoft's serverless compute platform. You write a function, define what triggers it, and Azure handles everything else — scaling, infrastructure, OS patching, and load balancing.
Key characteristics:
- Event-driven: Functions execute in response to triggers (HTTP requests, queue messages, timers, database changes)
- Auto-scaling: Scales from zero to thousands of instances automatically
- Pay-per-execution: Consumption plan charges per execution and GB-seconds
- Multiple languages: C#, JavaScript, Python, Java, PowerShell, TypeScript, Go, Rust (custom handler)
Triggers and Bindings
This is the core concept that makes Azure Functions powerful. A trigger defines what starts the function. Bindings connect the function to other services without writing integration code.
Think of it this way: the trigger is the doorbell, and bindings are the pipes that carry data in and out.
| Trigger Type | Fires When | Common Use Case |
|---|---|---|
| HTTP | HTTP request received | REST APIs, webhooks |
| Timer | Cron schedule hits | Cleanup jobs, report generation |
| Queue Storage | Message arrives in queue | Async processing |
| Blob Storage | File uploaded/modified | Image processing, ETL |
| Cosmos DB | Document created/updated | Change feed processing |
| Event Grid | Event published | Resource change reactions |
| Service Bus | Message arrives in topic/queue | Enterprise messaging |
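Bindings work in both directions. As a sketch of an output binding (the route, queue name, and function name here are illustrative), an HTTP-triggered function can drop a message onto a Storage queue without writing any queue SDK code:

```python
import azure.functions as func

app = func.FunctionApp()

# Hypothetical example: HTTP trigger in, queue output binding out.
# The runtime writes whatever is passed to set() onto the queue for us.
@app.route(route="orders", auth_level=func.AuthLevel.ANONYMOUS)
@app.queue_output(arg_name="outmsg", queue_name="order-intake",
                  connection="AzureWebJobsStorage")
def CreateOrder(req: func.HttpRequest,
                outmsg: func.Out[str]) -> func.HttpResponse:
    outmsg.set(req.get_body().decode("utf-8"))  # enqueue the raw request body
    return func.HttpResponse("Accepted", status_code=202)
```

The function body never touches a queue client; the binding declaration in the decorator is the whole integration.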
Hosting Plans
| Feature | Consumption | Premium | Dedicated (App Service) |
|---|---|---|---|
| Scaling | 0 to 200 instances | 1 to 100 instances | Manual/auto scale |
| Cold Start | Yes (seconds) | Pre-warmed (none) | None |
| Timeout | 5 min (max 10) | 30 min (unlimited) | 30 min (unlimited) |
| VNet Integration | No | Yes | Yes |
| Price | Per execution + GB-s | Per vCPU-s + memory | Monthly fixed |
| Free Grant | 1M executions + 400K GB-s/month | None | None |
Start with Consumption. Move to Premium when you need VNet access, longer execution times, or cannot tolerate cold starts.
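To build intuition for Consumption-plan pricing, here is a rough estimator using illustrative list rates (roughly $0.20 per million executions and $0.000016 per GB-second, after the monthly free grant of 1M executions and 400,000 GB-s). These numbers are assumptions for the sketch; check current Azure pricing before relying on them.

```python
# Rough Consumption-plan cost model: illustrative rates, not a quote.
EXEC_RATE = 0.20 / 1_000_000   # $ per execution beyond the free grant
GBS_RATE = 0.000016            # $ per GB-second beyond the free grant
FREE_EXECS = 1_000_000
FREE_GBS = 400_000

def monthly_cost(executions: int, avg_duration_s: float,
                 memory_gb: float = 0.128) -> float:
    # Billing rounds memory up to a 128 MB minimum per execution
    gb_seconds = executions * avg_duration_s * max(memory_gb, 0.128)
    cost = max(executions - FREE_EXECS, 0) * EXEC_RATE
    cost += max(gb_seconds - FREE_GBS, 0) * GBS_RATE
    return cost

# 10M invocations/month, 1 s each at 512 MB:
print(f"${monthly_cost(10_000_000, 1.0, 0.5):.2f}")  # → $75.40
```

Note how a low-traffic API (the three-requests-an-hour example from the intro) stays entirely inside the free grant and costs nothing.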
Creating Your First Function — HTTP Trigger
Prerequisites
# Install Azure Functions Core Tools
npm install -g azure-functions-core-tools@4 --unsafe-perm true
# Verify installation
func --version
Python HTTP Function
# Create a new function project
func init my-function-app --python
# Navigate to project
cd my-function-app
# Create an HTTP-triggered function
func new --name HttpGreeting --template "HTTP trigger" --authlevel anonymous
The generated function looks like this in function_app.py:
import azure.functions as func
import json
import logging

app = func.FunctionApp()

@app.route(route="HttpGreeting", auth_level=func.AuthLevel.ANONYMOUS)
def HttpGreeting(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("HTTP trigger function processed a request.")
    name = req.params.get("name")
    if not name:
        try:
            req_body = req.get_json()
            name = req_body.get("name")
        except ValueError:
            pass
    if name:
        return func.HttpResponse(
            json.dumps({"message": f"Hello, {name}!"}),
            mimetype="application/json",
            status_code=200
        )
    else:
        return func.HttpResponse(
            json.dumps({"error": "Pass a name in the query string or request body"}),
            mimetype="application/json",
            status_code=400
        )
JavaScript HTTP Function
// src/functions/HttpGreeting.js
const { app } = require("@azure/functions");

app.http("HttpGreeting", {
    methods: ["GET", "POST"],
    authLevel: "anonymous",
    handler: async (request, context) => {
        context.log("HTTP trigger function processed a request.");
        let name = request.query.get("name");
        if (!name && request.method === "POST") {
            // request.json() throws on an empty or non-JSON body
            const body = await request.json().catch(() => ({}));
            name = body.name;
        }
        if (name) {
            return {
                jsonBody: { message: `Hello, ${name}!` }
            };
        }
        return {
            status: 400,
            jsonBody: { error: "Pass a name parameter" }
        };
    }
});
Run Locally
# Start the function locally
func start
# Test it
curl "http://localhost:7071/api/HttpGreeting?name=Azure"
# Output: {"message": "Hello, Azure!"}
Timer Triggers for Scheduled Jobs
Timer triggers use NCRONTAB expressions (six-field cron). Perfect for cleanup tasks, report generation, or periodic data syncs.
@app.timer_trigger(schedule="0 0 9 * * 1-5", arg_name="timer",
                   run_on_startup=False)
def DailyReport(timer: func.TimerRequest) -> None:
    if timer.past_due:
        logging.warning("Timer is past due — catching up!")
    logging.info("Running daily report at 9:00 AM UTC, weekdays only")
    # Generate report logic here
The schedule 0 0 9 * * 1-5 reads field by field: second 0, minute 0, hour 9, any day of the month, any month, Monday through Friday. Schedules are evaluated in UTC by default.
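A few more NCRONTAB schedules for reference, each with the same six fields:

```python
# Common NCRONTAB schedules (six fields: sec min hour day month day-of-week)
COMMON_SCHEDULES = {
    "0 */5 * * * *": "every 5 minutes",
    "0 0 * * * *":   "top of every hour",
    "0 30 2 * * 0":  "2:30 AM UTC every Sunday",
    "0 0 0 1 * *":   "midnight UTC on the first of every month",
}

# Sanity check: every NCRONTAB expression has exactly six fields
for expr in COMMON_SCHEDULES:
    assert len(expr.split()) == 6
```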
Queue Triggers for Async Processing
Offload heavy work to a queue and process it asynchronously:
@app.queue_trigger(arg_name="msg",
                   queue_name="order-processing",
                   connection="AzureWebJobsStorage")
def ProcessOrder(msg: func.QueueMessage) -> None:
    order = json.loads(msg.get_body().decode("utf-8"))
    logging.info(f"Processing order {order['orderId']} "
                 f"for customer {order['customerId']}")
    # Process the order:
    # validate inventory, charge payment, send confirmation
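The producer side is a plain storage-queue client. A sketch, assuming the `azure-storage-queue` package is installed; note that the Functions host decodes queue messages as Base64 by default, so the payload is encoded before sending:

```python
import base64
import json
import os

def encode_order(order: dict) -> str:
    # The Functions runtime expects Base64-encoded queue messages by default
    return base64.b64encode(json.dumps(order).encode("utf-8")).decode("ascii")

def enqueue_order(order: dict) -> None:
    # Imported here so the encoding helper stays usable without the SDK
    from azure.storage.queue import QueueClient
    queue = QueueClient.from_connection_string(
        os.environ["AzureWebJobsStorage"], "order-processing")
    queue.send_message(encode_order(order))
```

Forgetting the Base64 step is a classic source of "poison" messages that land in the queue's poison sub-queue after repeated decode failures.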
Durable Functions for Workflows
Regular functions are stateless and short-lived. Durable Functions let you build stateful, long-running workflows by chaining functions together.
A common pattern is fan-out/fan-in — start multiple tasks in parallel and wait for all to complete:
import azure.functions as func
import azure.durable_functions as df

app = func.FunctionApp()
bp = df.Blueprint()

@bp.orchestration_trigger(context_name="context")
def batch_processor(context: df.DurableOrchestrationContext):
    # Get the list of items to process
    items = yield context.call_activity("GetItems", None)
    # Fan out — process all items in parallel
    tasks = [context.call_activity("ProcessItem", item) for item in items]
    results = yield context.task_all(tasks)
    # Fan in — aggregate results
    summary = yield context.call_activity("CreateSummary", results)
    return summary

@bp.activity_trigger(input_name="item")
def ProcessItem(item: str) -> str:
    # Process an individual item
    return f"Processed: {item}"

app.register_functions(bp)
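If the orchestration syntax looks opaque, the fan-out/fan-in shape is the durable cousin of `asyncio.gather`. A plain-Python analogy, with no durability or replay, just the control flow:

```python
import asyncio

async def process_item(item: str) -> str:
    # Stand-in for the ProcessItem activity
    return f"Processed: {item}"

async def batch_processor(items: list[str]) -> str:
    # Fan out: start every item in parallel, then wait for all of them
    results = await asyncio.gather(*(process_item(i) for i in items))
    # Fan in: aggregate the results into a summary
    return f"{len(results)} items processed"

print(asyncio.run(batch_processor(["a", "b", "c"])))  # → 3 items processed
```

The difference is that Durable Functions checkpoints each step to storage, so the orchestration survives host restarts, something `asyncio.gather` cannot do.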
Deploying to Azure
# Create a Function App in Azure
az functionapp create \
--resource-group rg-serverless-demo \
--name func-app-prod-2025 \
--consumption-plan-location eastus \
--runtime python \
--runtime-version 3.11 \
--functions-version 4 \
--storage-account stfuncprod2025 \
--os-type Linux
# Deploy from local project
func azure functionapp publish func-app-prod-2025
# View function URLs
az functionapp function list \
--resource-group rg-serverless-demo \
--name func-app-prod-2025 \
--output table
Solving Cold Starts
Cold starts happen when the Consumption plan spins up a new instance. The first request waits for the runtime to initialize — anywhere from 1 to 10 seconds depending on language and dependencies.
Solutions:
- Premium Plan — Pre-warmed instances eliminate cold starts entirely
- Keep warm with a timer — A timer function that runs every 5 minutes keeps at least one instance alive
- Optimize dependencies — Fewer imports means faster startup
- Use Linux — Linux cold starts are faster than Windows for Python and Node.js
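A minimal keep-warm timer in the same style as the earlier examples (the function name and log message are placeholders; the function only needs to wake the host):

```python
import azure.functions as func
import logging

app = func.FunctionApp()

# Fires every 5 minutes to keep one Consumption-plan instance warm
@app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer")
def KeepWarm(timer: func.TimerRequest) -> None:
    logging.info("Keep-warm ping")
```

This is a workaround, not a guarantee: the platform can still recycle the instance, so latency-sensitive workloads belong on the Premium plan below.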
# Switch to Premium plan for zero cold starts
az functionapp plan create \
--resource-group rg-serverless-demo \
--name plan-premium-prod \
--location eastus \
--sku EP1 \
--is-linux true \
--min-instances 1 \
--max-burst 10
Monitoring with Application Insights
Every function should be connected to Application Insights for observability:
# Create Application Insights
az monitor app-insights component create \
--resource-group rg-serverless-demo \
--app insights-func-prod \
--location eastus \
--kind web
# Link it to your Function App (the connection string is the recommended
# setting; the older instrumentation key is deprecated)
az functionapp config appsettings set \
--resource-group rg-serverless-demo \
--name func-app-prod-2025 \
--settings "APPLICATIONINSIGHTS_CONNECTION_STRING=$(az monitor app-insights component show \
--resource-group rg-serverless-demo \
--app insights-func-prod \
--query connectionString -o tsv)"
Once connected, you get automatic tracking of:
- Execution count and duration
- Success/failure rates
- Dependency calls (databases, HTTP, queues)
- Custom logs from your function code
- Live metrics stream for real-time debugging
Wrapping Up
Azure Functions removes the undifferentiated heavy lifting of infrastructure management. Start with a Consumption plan and an HTTP trigger to build your first API. Add timer triggers for scheduled work, queue triggers for async processing, and Durable Functions when you need multi-step workflows. The key is choosing the right trigger for each job — functions are the glue that connects every Azure service together without running a single server.
Next up: We will explore Azure DevOps Pipelines — building CI/CD workflows that take your code from commit to production with approvals, stages, and automated testing.
