
Azure Foundations & Azure Functions

Review of Azure fundamentals, with a special focus on Azure Functions - a powerful serverless compute service that enables event-driven, scalable applications without managing infrastructure.

Part 1: Azure Foundations Review

1.1 Core Azure Concepts

Before diving into Azure Functions, let's consolidate the foundational knowledge from our Azure modules. Understanding these concepts is essential because everything in Azure builds on this hierarchy.

Azure Resource Hierarchy

Understanding how Azure organizes resources is critical for proper cloud architecture. Think of it like organizing files on your computer - you have folders within folders, and Azure works the same way but with cloud resources:

Key Principle: Always group related resources together. When you delete a Resource Group, all resources inside are deleted too. This makes cleanup easy - delete one Resource Group and everything associated with your project is gone.

1.2 Essential Azure CLI Commands

The Azure CLI is your primary tool for managing Azure resources from the command line. While you can do everything in the Azure Portal (web interface), the CLI is faster, scriptable, and essential for automation.

Why Use CLI Instead of Portal?

| Portal (Web) | CLI (Command Line) |
| --- | --- |
| Click through menus | Type commands |
| Good for learning, exploring | Good for automation, scripting |
| Hard to repeat exact steps | Easy to save commands as scripts |
| Takes time with many resources | Fast, can process many resources quickly |

Account & Subscription Management

When you first start with Azure, you need to authenticate (login) and select which subscription to work with. Think of a subscription as your Azure "account" - it's where all your resources live and where billing happens.

Understanding Subscriptions:

Most students have a single "Azure for Students" subscription with $100 credit. In enterprise environments, companies often have multiple subscriptions to separate billing and access:

  • Development subscription - for testing, experimentation

  • Production subscription - for live applications

  • Sandbox subscription - for learning, no restrictions
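The typical login workflow looks like this (the subscription name is an example; use whatever `az account list` shows for your account):

```shell
# Log in interactively (opens a browser window)
az login

# See which subscriptions your account can use
az account list --output table

# Select the subscription that new resources should go into
az account set --subscription "Azure for Students"
```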

Resource Group Operations

Resource Groups are containers that hold related resources for an Azure solution. Think of them as project folders - everything for one project goes in one Resource Group.
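The basic lifecycle of a Resource Group looks like this (the group name and region are examples):

```shell
# Create a Resource Group in a region
az group create --name my-project-rg --location westeurope

# List all Resource Groups in the current subscription
az group list --output table

# Delete the group and everything inside it (skips the confirmation prompt)
az group delete --name my-project-rg --yes --no-wait
```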


Command Flag Reference:

| Flag | What It Does |
| --- | --- |
| --name or -n | Name of the resource |
| --location or -l | Azure region (westeurope, eastus, etc.) |
| --output or -o | Output format: table, json, yaml, tsv |
| --yes or -y | Skip confirmation prompts |
| --no-wait | Don't wait for operation to complete |

Region & Location Management

Azure has datacenters around the world. When you create resources, you choose where they physically live. This affects:

  • Latency: Resources closer to users respond faster

  • Compliance: Some data must stay in specific countries

  • Cost: Some regions are cheaper than others

  • Availability: Not all services are available in all regions

Common European Regions:

| Region Name | Display Name | Use Case |
| --- | --- | --- |
| westeurope | West Europe | Netherlands - main EU region |
| northeurope | North Europe | Ireland - backup/DR region |
| germanywestcentral | Germany West Central | Data residency in Germany |
| polandcentral | Poland Central | Closest to Poland |

1.3 Virtual Machines - Key Commands

Virtual Machines (VMs) are the foundation of cloud computing. A VM is a computer running in Azure's datacenter that you control remotely. You can install any software, run any application - it's like having a server without buying hardware.

When to Use VMs:

  • You need full control over the operating system

  • Running legacy applications that can't be containerized

  • Need specific software that only runs on Windows/Linux

  • Learning and experimentation

When NOT to Use VMs:

  • Running simple web applications (use App Service)

  • Running containerized microservices (use Container Apps)

  • Running event-driven code (use Azure Functions)

Creating Virtual Machines

Understanding the VM Create Parameters:

| Parameter | What It Means |
| --- | --- |
| --resource-group | Which Resource Group to put the VM in |
| --name | Name of your VM |
| --image | Operating system to install (Ubuntu2204, Win2022Datacenter, etc.) |
| --size | How powerful the VM is (CPU, RAM) |
| --admin-username | Username for logging in |
| --generate-ssh-keys | Create SSH keys for secure access (Linux only) |
| --public-ip-address | Create a public IP so you can connect from internet |
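Putting these parameters together, a typical create command for a small Linux learning VM might look like this (the group and VM names are examples):

```shell
az vm create \
  --resource-group my-project-rg \
  --name my-vm \
  --image Ubuntu2204 \
  --size Standard_B2s \
  --admin-username azureuser \
  --generate-ssh-keys
```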

Common VM Sizes for Learning:

| Size | vCPUs | RAM | Cost/month (approx) | Use Case |
| --- | --- | --- | --- | --- |
| Standard_B1s | 1 | 1 GB | ~$8 | Very basic, testing only |
| Standard_B2s | 2 | 4 GB | ~$30 | Development, learning |
| Standard_B2ms | 2 | 8 GB | ~$60 | Small production workloads |
| Standard_D2s_v3 | 2 | 8 GB | ~$70 | General purpose production |

B-series VMs are "burstable" - perfect for learning because they're cheap for workloads that don't use 100% CPU constantly.

Finding Images and Sizes
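The CLI can list what images and sizes are available before you create anything:

```shell
# List the most popular VM images (cached, fast)
az vm image list --output table

# List VM sizes available in a specific region
az vm list-sizes --location westeurope --output table
```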

Controlling VMs (Start, Stop, Deallocate)

Understanding the difference between stop and deallocate is crucial for cost management:

| State | Description | Billing |
| --- | --- | --- |
| Running | VM is on and working | Full cost |
| Stopped | OS shutdown, but resources still allocated | Still charged for compute! |
| Deallocated | Resources released back to Azure | No compute charges |
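The corresponding commands (group and VM names are examples):

```shell
az vm start --resource-group my-project-rg --name my-vm
az vm stop --resource-group my-project-rg --name my-vm        # OS shutdown, compute still billed
az vm deallocate --resource-group my-project-rg --name my-vm  # releases compute, billing stops
```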


Getting Information and Remote Access
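Common commands for inspecting and connecting to a VM (names are examples; replace the IP placeholder with the address the CLI reports):

```shell
# Show VM details
az vm show --resource-group my-project-rg --name my-vm --output table

# Get the VM's public and private IP addresses
az vm list-ip-addresses --resource-group my-project-rg --name my-vm --output table

# Connect over SSH (Linux VM created with --generate-ssh-keys)
ssh azureuser@<public-ip>
```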

1.4 Networking Fundamentals

Networking in Azure is about connecting your resources together and controlling what can communicate with what. This is fundamental for security - you don't want your database accessible from the internet!

Key Networking Concepts:

| Concept | What It Is |
| --- | --- |
| VNet | Virtual Network - your private network in Azure |
| Subnet | A range of IP addresses within a VNet |
| NSG | Network Security Group - firewall rules for traffic |
| Public IP | An IP address accessible from the internet |
| Private IP | An IP address only accessible within your VNet |
| NAT Gateway | Allows outbound internet access from private subnets |

Virtual Network (VNet) Operations

A VNet is your private, isolated network in Azure. Resources in a VNet can communicate with each other by default, but are isolated from other VNets and the internet.

IP Address Planning:

When you create a VNet, you define an address space using CIDR notation:

  • 10.0.0.0/16 = 65,536 IP addresses (10.0.0.0 to 10.0.255.255)

  • 10.0.1.0/24 = 256 IP addresses (10.0.1.0 to 10.0.1.255)
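You can verify this CIDR arithmetic with Python's standard ipaddress module:

```python
import ipaddress

vnet = ipaddress.ip_network("10.0.0.0/16")
subnet = ipaddress.ip_network("10.0.1.0/24")

print(vnet.num_addresses)      # 65536 addresses in the /16
print(subnet.num_addresses)    # 256 addresses in the /24
print(subnet.subnet_of(vnet))  # True: the /24 fits inside the VNet's /16
```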

Common Subnet Layout:

| Subnet Name | CIDR | Purpose |
| --- | --- | --- |
| web-subnet | 10.0.1.0/24 | Web servers (public access) |
| app-subnet | 10.0.2.0/24 | Application servers (internal) |
| db-subnet | 10.0.3.0/24 | Database servers (most secure) |
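A layout like this can be created from the CLI (VNet and group names are examples):

```shell
# Create a VNet with a /16 address space and a first /24 subnet
az network vnet create \
  --resource-group my-project-rg \
  --name my-vnet \
  --address-prefixes 10.0.0.0/16 \
  --subnet-name web-subnet \
  --subnet-prefixes 10.0.1.0/24

# Add another subnet to the existing VNet
az network vnet subnet create \
  --resource-group my-project-rg \
  --vnet-name my-vnet \
  --name app-subnet \
  --address-prefixes 10.0.2.0/24
```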

Network Security Groups (NSGs)

NSGs are Azure's firewall. They contain security rules that allow or deny network traffic to resources. Think of them as a bouncer at a club - they check every "packet" against rules and decide if it can enter or leave.

NSG Rule Components:

| Component | What It Means |
| --- | --- |
| Priority | Lower number = evaluated first (100-4096) |
| Direction | Inbound (coming in) or Outbound (going out) |
| Access | Allow or Deny |
| Protocol | TCP, UDP, ICMP, or * (any) |
| Source/Destination Address | Where traffic is coming from/going to |
| Source/Destination Port | Which port (22=SSH, 80=HTTP, 443=HTTPS, etc.) |

Common Security Rules:

| Rule Name | Direction | Port | Purpose |
| --- | --- | --- | --- |
| Allow-SSH | Inbound | 22 | Remote access to Linux VMs |
| Allow-RDP | Inbound | 3389 | Remote access to Windows VMs |
| Allow-HTTP | Inbound | 80 | Web traffic (unencrypted) |
| Allow-HTTPS | Inbound | 443 | Web traffic (encrypted) |
| Deny-All | Inbound | * | Block everything else (catch-all) |
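Creating an NSG and a rule from the table above looks like this (NSG and group names are examples):

```shell
az network nsg create --resource-group my-project-rg --name my-nsg

az network nsg rule create \
  --resource-group my-project-rg \
  --nsg-name my-nsg \
  --name Allow-SSH \
  --priority 100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 22
```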

Private Subnets & NAT Gateway

Security Best Practice: VMs should NOT have public IP addresses by default. Instead:

  1. Put VMs in private subnets (no direct internet access)

  2. Use NAT Gateway for outbound access (VMs can reach internet, but internet can't reach VMs)

  3. Use Azure Bastion for administrative access (instead of public IPs)

1.5 Storage Fundamentals

Azure Storage is used to store files, blobs (binary large objects), queues, and tables. The most common use is Blob Storage - storing files like images, documents, backups, and logs.

Storage Account Hierarchy:

Storage Redundancy Options:

| SKU | Description | Durability |
| --- | --- | --- |
| Standard_LRS | 3 copies in one datacenter | 99.999999999% |
| Standard_ZRS | 3 copies across 3 availability zones | 99.9999999999% |
| Standard_GRS | 6 copies: 3 local + 3 in another region | 99.99999999999999% |
| Standard_RAGRS | Like GRS but can read from secondary region | Highest |

For learning: Standard_LRS is cheapest and sufficient.

Accessing Storage Securely:

There are several ways to access Azure Storage:

| Method | Use Case |
| --- | --- |
| --auth-mode login | Use your Azure AD credentials (recommended) |
| Account Key | Full access to everything (dangerous if leaked!) |
| SAS Token | Limited, time-bound access to specific resources |
| Managed Identity | For Azure services to access storage securely |
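A typical create-and-upload workflow using your Azure AD login (the account name is an example; storage account names must be globally unique and lowercase):

```shell
# Create a storage account with locally redundant storage
az storage account create \
  --resource-group my-project-rg \
  --name mystorageacct12345 \
  --location westeurope \
  --sku Standard_LRS

# Create a container and upload a file using your own credentials
az storage container create --account-name mystorageacct12345 --name uploads --auth-mode login
az storage blob upload \
  --account-name mystorageacct12345 \
  --container-name uploads \
  --file ./photo.jpg \
  --name photo.jpg \
  --auth-mode login
```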

Part 2: Azure Functions - Serverless Computing

Now let's dive into Azure Functions - one of the most powerful and practical Azure services for modern applications.

2.1 What Are Azure Functions?

Azure Functions is a serverless compute service that lets you run code without managing servers. You write the code, Azure handles everything else: scaling, infrastructure, patching, and availability.

2.2 Key Characteristics of Azure Functions

| Feature | Description |
| --- | --- |
| Event-Driven | Functions are triggered by events (HTTP requests, timers, queues, file uploads) |
| Auto-Scaling | Automatically scales from 0 to thousands of instances based on demand |
| Pay-Per-Execution | You only pay when your code runs (Consumption plan) |
| Multiple Languages | Supports C#, JavaScript, Python, Java, PowerShell, TypeScript |
| Stateless by Default | Each execution is independent (use Durable Functions for stateful workflows) |

2.3 Azure Functions Triggers

A trigger is what causes a function to execute. Understanding triggers is key to understanding Azure Functions:

Trigger Types Explained

| Trigger | Use Case | Example |
| --- | --- | --- |
| HTTP | REST APIs, webhooks, web services | User submits form → process data |
| Timer | Scheduled tasks, cron jobs | Every night at 2 AM → cleanup old files |
| Blob Storage | File processing | Image uploaded → resize/thumbnail |
| Queue Storage | Async processing, work queues | Order placed → process payment |
| Event Hub | High-volume event streaming | IoT devices → process sensor data |
| Cosmos DB | Database change events | Document updated → sync to search index |
| Service Bus | Enterprise messaging | Message received → notify systems |

2.4 Hosting Plans

Azure Functions offers three hosting plans, each with different characteristics:

| Plan | Scaling | Max Timeout | Cold Start | Best For |
| --- | --- | --- | --- | --- |
| Consumption | Auto (0 → ∞) | 5-10 min | Yes | Intermittent workloads, pay-per-use |
| Premium | Auto (1+ → ∞) | Unlimited | No | Production APIs, low latency required |
| Dedicated (App Service) | Manual | Unlimited | No | Existing App Service infrastructure |

Cold Start Explained:

When using the Consumption plan, if your function hasn't been called recently, Azure may deallocate the resources. The next request needs to "wake up" the function, causing a delay (cold start). Premium plan keeps instances warm.

2.5 Real-World Scenarios

Here are practical examples of how Azure Functions are used in production:

Scenario 1: Image Processing Pipeline

The Problem: A company allows users to upload profile photos. These photos need to be resized into multiple formats (thumbnail, medium, large) for different parts of the website. Traditional approach would require running a server 24/7 just waiting for uploads.

The Serverless Solution: Use a Blob-triggered Azure Function that automatically activates whenever a new image is uploaded to Azure Blob Storage.

How It Works:

  1. User uploads a photo through a web form

  2. The photo is saved to Azure Blob Storage (container: "uploads")

  3. Azure detects the new blob and automatically triggers the Function

  4. The Function reads the image, creates resized versions using image processing libraries

  5. Processed images are saved to another container (e.g., "processed")

  6. The Function terminates and no compute resources are consumed until the next upload

Why This Is Powerful: If 1,000 users upload photos simultaneously, Azure automatically spins up 1,000 function instances to process them in parallel. When uploads stop, the system scales back to zero - you pay nothing while waiting.

Scenario 2: Microservices API Backend

The Problem: Building a REST API typically requires setting up a web server (like Express.js or Flask), configuring routing, managing server infrastructure, and handling scaling. For APIs with variable traffic, you end up paying for idle server time.

The Serverless Solution: Use HTTP-triggered Azure Functions where each API endpoint is a separate function. Azure handles all the server infrastructure, routing, and scaling automatically.

How It Works:

  1. Frontend application (mobile app or website) makes an HTTP request to the function URL

  2. Azure receives the request and routes it to the appropriate function based on the URL path

  3. The function executes: reads request data, queries the database, processes the result

  4. The function returns a JSON response to the frontend

  5. After the response is sent, the function instance can be reused or terminated

Why This Is Powerful:

Each function can scale independently. If your /users endpoint gets 10x more traffic than /orders, Azure will automatically allocate more instances for /users while keeping /orders at minimal capacity.

This is the essence of microservices architecture - small, independent, single-purpose services.

Scenario 3: Scheduled Data Processing

The Problem: Many business operations need to run on a schedule - generating daily reports, cleaning up old database records, syncing data from external systems, or creating backups. Traditionally, this requires running a dedicated server with cron jobs or Windows Task Scheduler.

The Serverless Solution: Use Timer-triggered Azure Functions that run on a schedule defined using CRON expressions. No server needed - Azure wakes up the function at the specified time, runs it, and you pay only for execution time.

How It Works:

  1. You define a CRON expression in the function configuration: "0 0 2 * * *" means "at 2:00 AM every day"

  2. Azure's scheduler monitors this schedule and triggers the function at exactly 2:00 AM

  3. The function performs ETL (Extract-Transform-Load): pulls data from sources, transforms it, and loads it to the destination

  4. Function logs execution status and any errors for monitoring

  5. After completion, the function terminates until the next scheduled run

Understanding CRON Expressions:

  • "0 0 2 * * *" = At 2:00:00 AM every day

  • "0 */5 * * * *" = Every 5 minutes

  • "0 0 9-17 * * 1-5" = Every hour from 9 AM to 5 PM, Monday to Friday

  • Format: {second} {minute} {hour} {day} {month} {day-of-week}

Why This Is Powerful:

You don't pay for a server running 24/7 just to execute a 5-minute job once per day. The function runs, completes its task, and you're billed only for those few minutes of execution.

Scenario 4: Real-Time Event Processing (IoT)

The Problem: IoT (Internet of Things) systems generate massive amounts of data - thousands of sensors sending readings every second. Processing this data in real-time requires infrastructure that can handle millions of events and scale instantly when traffic spikes.

The Serverless Solution: Use Event Hub-triggered Azure Functions. Azure Event Hub acts as a buffer that can ingest millions of events per second, and Azure Functions process these events in real-time, scaling automatically to match the incoming data rate.

How It Works:

  1. IoT devices (temperature sensors, motion detectors, smart meters) continuously send data

  2. Azure Event Hub receives all incoming events and stores them temporarily in partitions

  3. Azure Functions are triggered by new events in Event Hub (batch processing for efficiency)

  4. Each function instance processes a batch of events: analyzes data, detects anomalies, triggers alerts

  5. Results are stored in a database and alerts are sent to monitoring systems

  6. Functions scale up/down based on the event backlog in Event Hub

Real-World Example: A smart building with 10,000 temperature sensors. Each sensor sends a reading every 10 seconds. That's 1,000 events per second! If a temperature exceeds a threshold, the function immediately sends an alert to the maintenance team and logs the event.

Why This Is Powerful: Event Hub decouples the event producers (sensors) from consumers (functions). If the processing temporarily slows down, events are safely buffered. Functions scale independently to catch up with the backlog.

Scenario 5: Webhook Integration

The Problem: Modern applications need to integrate with external services - payment processors (Stripe), version control (GitHub), communication platforms (Slack, Twilio). These services notify your application about events via webhooks - HTTP POST requests sent to your endpoint.

The Serverless Solution: Use HTTP-triggered Azure Functions as webhook endpoints. When an external service sends a webhook, your function processes the event and takes appropriate actions (update database, send notifications, trigger workflows).

How It Works:

  1. You create an HTTP-triggered function and get a public URL (e.g., https://myfunc.azurewebsites.net/api/github-webhook)

  2. You configure the external service (GitHub, Stripe) to send webhooks to this URL

  3. When an event occurs (code pushed, payment received), the service sends a POST request with event data

  4. Your function parses the webhook payload, validates the signature (for security), and processes the event

  5. The function performs actions: updates database, sends Slack notification, triggers other workflows

  6. Returns a 200 OK response to confirm receipt

Security Consideration: Always validate webhook signatures! Services like GitHub and Stripe include a secret signature in the request header. Your function should verify this signature to ensure the webhook is authentic and not from a malicious source.
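Signature checking can be sketched with Python's standard hmac module. The scheme below follows GitHub's webhook convention (an HMAC-SHA256 of the raw request body sent as "sha256=<hex digest>" in a header); the secret and payload values are illustrative:

```python
import hashlib
import hmac

def is_valid_signature(secret: bytes, payload: bytes, signature_header: str) -> bool:
    """Verify a GitHub-style webhook signature of the form 'sha256=<hex digest>'."""
    expected = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through comparison timing
    return hmac.compare_digest(expected, signature_header)

secret = b"my-webhook-secret"   # shared between you and the external service
body = b'{"action": "push"}'    # raw request body as received
header = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()

print(is_valid_signature(secret, body, header))         # True
print(is_valid_signature(secret, b"tampered", header))  # False
```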

Why This Is Powerful: You get instant notifications about external events without polling APIs repeatedly. The serverless model is perfect for webhooks - they're sporadic (you don't know when they'll arrive), short-lived (just process and respond), and vary in volume (Black Friday vs. regular Tuesday).

2.6 Benefits of Azure Functions

| Benefit | Description |
| --- | --- |
| No Infrastructure Management | Focus on code, not servers. No OS updates, no patching. |
| Automatic Scaling | Handles 1 request or 1 million - automatically |
| Cost Efficiency | Pay only for execution time (Consumption plan: first 1M executions free!) |
| Fast Development | Quick to deploy, easy to test, integrate with Azure services |
| Event-Driven Architecture | Natural fit for microservices and reactive systems |
| DevOps Friendly | Easy CI/CD integration, infrastructure as code with Terraform |

2.7 Azure Functions vs Other Compute Options

When should you use Functions vs VMs vs Containers vs App Service?

| Need | Best Choice | Reason |
| --- | --- | --- |
| Run code occasionally, pay per use | Azure Functions | Scales to zero, pay-per-execution |
| API with consistent traffic | App Service or Premium Functions | Always warm, predictable performance |
| Full control over OS | Virtual Machines | Complete control, complex deps |
| Containerized microservices | Container Apps / AKS | Container orchestration, complex apps |
| Background job processing | Azure Functions (Queue trigger) | Event-driven, auto-scaling |
| Real-time data processing | Azure Functions (Event Hub) | High throughput, serverless |

2.8 Creating Your First Azure Function

Creating an Azure Function involves two main steps: setting up the infrastructure (Function App, storage account) and writing the actual function code. Let's walk through both.

What You Need to Understand First:

  • Function App - A container that hosts one or more functions. Think of it like a project that groups related functions together.

  • Storage Account - Required by Azure Functions to store internal data (function code, logs, trigger state). Every Function App needs one.

  • Consumption Plan - The serverless billing model where you pay only for execution time. Functions scale automatically and can scale to zero when idle.

Here's how to create and deploy a simple HTTP-triggered function.

Prerequisite: Creating the Cloud Infrastructure (Azure CLI)

Before you can build and publish any lab in this section, you must first create the cloud resources in Azure that will hold them. The following Azure CLI commands create the resource group, the storage account, and the "empty house" Function App where your code will eventually live.

Run these commands in your terminal once before starting the labs below:
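A minimal sketch of those commands. The resource group and storage account names are examples; myuniquefunctionapp123 is the app name used in the labs below, but Function App names must be globally unique, so adjust it if the name is taken:

```shell
# 1. Resource Group to hold everything
az group create --name functions-lab-rg --location westeurope

# 2. Storage account required by the Functions runtime
az storage account create \
  --name funclabstorage123 \
  --resource-group functions-lab-rg \
  --location westeurope \
  --sku Standard_LRS

# 3. The "empty house" Function App on the Consumption plan
az functionapp create \
  --name myuniquefunctionapp123 \
  --resource-group functions-lab-rg \
  --storage-account funclabstorage123 \
  --consumption-plan-location westeurope \
  --runtime python \
  --runtime-version 3.11 \
  --functions-version 4 \
  --os-type Linux
```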

Lab: Creating and Deploying Your First Azure Function

This step-by-step manual explains the entire lifecycle of an Azure Function—from generating the initial structure to writing your own code and deploying it to the cloud.

Prerequisites:

  1. You must have the Azure Functions Core Tools installed.

  2. You must have already run the Azure CLI commands above to create the empty myuniquefunctionapp123 resource in Azure.

Step 1: Generate the Project "Skeleton" (CLI)

You don't start by writing code from scratch. Instead, you use the CLI to generate a ready-to-use template on your local machine.
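With the Azure Functions Core Tools installed, generating the skeleton looks like this:

```shell
# Create the project folder using the Python programming model
func init MyFunctionProject --python
cd MyFunctionProject

# Add an HTTP-triggered function from the built-in template
func new --name HttpTrigger --template "HTTP trigger"
```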

What exactly happened here?

  • func init set up the core project folder with necessary settings (host.json, local.settings.json) and a requirements.txt file for Python dependencies.

  • func new did NOT write your custom application logic. It physically created the function_app.py file on your disk and pasted a default, Microsoft-provided "Hello World" template inside it.

Step 2: Write Your Custom Code (Code Editor)

The CLI cannot read your mind. To make the function do what you want, you must manually edit the generated skeleton.

  1. Open your project folder (MyFunctionProject) in a code editor like Visual Studio Code.

  2. Open the function_app.py file. You will see the default "Hello World" code.

  3. Replace or modify this code with your custom Python logic.

Here is what the standard HTTP trigger code looks like inside function_app.py:
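A sketch of that code, reconstructed to match the line-by-line explanation below (the exact template text Microsoft generates may differ slightly):

```python
import json

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="HttpTrigger")
@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # Read the 'name' query parameter, defaulting to "World"
    name = req.params.get('name', 'World')
    return func.HttpResponse(
        json.dumps({"message": f"Hello, {name}!"}),
        status_code=200,
        mimetype="application/json",
    )
```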

Line-by-Line Explanation:

  • app = func.FunctionApp() - Creates the main application object that will host all your functions

  • @app.function_name(name="HttpTrigger") - Gives this function a name for identification in Azure Portal

  • @app.route(route="hello", ...) - Defines the HTTP endpoint. The function will be accessible at https://yourapp.azurewebsites.net/api/hello

  • auth_level=func.AuthLevel.ANONYMOUS - No authentication required (anyone can call this endpoint)

  • req: func.HttpRequest - The incoming HTTP request object containing headers, query params, body

  • req.params.get('name', 'World') - Gets the name query parameter, defaults to "World" if not provided

  • func.HttpResponse(...) - Returns an HTTP response with JSON body and status code

Step 3: Run and Test Locally (CLI)

Before sending code to the cloud, you should verify it works on your machine. The Core Tools simulate the Azure environment locally.
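From inside the project folder, start the local Functions host:

```shell
func start
```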

Wait for the host to start. It will display a localhost URL. Test it in a new terminal window:
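Assuming the function is exposed on the Core Tools default port 7071 at the route "hello":

```shell
# Default response ("Hello, World!")
curl http://localhost:7071/api/hello

# With a query parameter
curl "http://localhost:7071/api/hello?name=Azure"
```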

Step 4: Deploy to Azure (CLI)

Once you've verified your custom code works, it's time to send it to the empty Azure Function App infrastructure you created in the Prerequisite steps.
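From the project folder, publish to the Function App created earlier:

```shell
func azure functionapp publish myuniquefunctionapp123
```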

How does the code get to Azure? When you run publish, the CLI takes your entire project folder (your function_app.py, configuration, and dependencies), zips it up, and securely uploads it to Azure over the internet. Azure extracts the ZIP file onto its servers and immediately starts serving your updated code.

Step 5: Test the Deployed Function in the Cloud

The Azure CLI will output a public URL once the deployment finishes. Test the live, cloud-hosted version of your code:
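Assuming the default azurewebsites.net hostname for the app name used above (use the exact URL printed by the publish command):

```shell
curl "https://myuniquefunctionapp123.azurewebsites.net/api/hello?name=Cloud"
```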

Lab: Creating a Timer Trigger Function

Timer-triggered functions run on a schedule. They're perfect for recurring tasks like backups, reports, or data synchronization.

Step 1: Generate the Timer Function "Skeleton" (CLI)

Assuming you are already inside your Azure Functions project folder (MyFunctionProject):
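Generate the timer function from its template (the name DailyCleanup matches the code used below):

```shell
func new --name DailyCleanup --template "Timer trigger"
```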

Step 2: Write Your Custom Code (Code Editor)

  1. Open your code editor and go to function_app.py.

  2. Delete the default generated code for the timer.

  3. Paste the following custom logic into function_app.py:
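A sketch of that logic using the Python v2 programming model, matching the key points explained below (the cleanup body itself is a placeholder):

```python
import datetime
import logging

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="DailyCleanup")
@app.schedule(schedule="0 0 2 * * *", arg_name="timer", run_on_startup=False)
def daily_cleanup(timer: func.TimerRequest) -> None:
    # past_due is True if Azure fired the timer later than scheduled
    if timer.past_due:
        logging.warning("Timer is running later than scheduled!")
    now = datetime.datetime.now(datetime.timezone.utc)
    logging.info("Cleanup started at %s", now.isoformat())
    # ... delete old files, purge expired records, etc. ...
```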

Key Points About Timer Triggers:

  • schedule="0 0 2 * * *" - CRON expression defining when to run (2:00 AM daily)

  • run_on_startup=False - Don't run when the Function App starts (only on schedule)

  • timer: func.TimerRequest - Contains information about the timer (past due status, schedule)

  • Timer functions don't return anything (-> None) since there's no caller to respond to

Step 3: Test Locally (CLI)

You can test if the function loads correctly without waiting until 2:00 AM.
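Start the local host; it lists the loaded functions and their next scheduled run without waiting for the schedule:

```shell
func start
```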

The terminal will show that the DailyCleanup function is loaded and will give you the exact timestamp of its next scheduled execution.

Lab: Creating a Blob Trigger Function

Blob-triggered functions react to file uploads in Azure Blob Storage. They're ideal for file processing pipelines.

Step 1: Generate the Blob Function "Skeleton" (CLI)

Inside your MyFunctionProject folder:
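Generate the blob function from its template (the name BlobProcessor is an example):

```shell
func new --name BlobProcessor --template "Blob trigger"
```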

Step 2: Configure Local Storage Simulation

Blob triggers require a storage account to monitor. When running locally, you must provide a connection string in local.settings.json. Open local.settings.json and ensure "AzureWebJobsStorage": "UseDevelopmentStorage=true" is set (this uses the local Azurite emulator).

Step 3: Write Your Custom Code (Code Editor)

  1. Open function_app.py.

  2. Clear the previous code.

  3. Paste the following logic to process uploaded blobs:
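A sketch of that logic, matching the key points explained below (the processing body is a placeholder):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="BlobProcessor")
@app.blob_trigger(arg_name="blob", path="uploads/{name}",
                  connection="AzureWebJobsStorage")
def process_blob(blob: func.InputStream) -> None:
    # Reads the whole file into memory - be careful with large blobs
    content = blob.read()
    logging.info("Processing blob: %s (%d bytes)", blob.name, len(content))
    # ... resize the image, parse the file, etc. ...
```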

Key Points About Blob Triggers:

  • path="uploads/{name}" - Watches the "uploads" container. {name} captures the filename

  • connection="AzureWebJobsStorage" - References the storage connection defined in your configuration

  • blob: func.InputStream - Provides access to the uploaded file as a stream

  • blob.read() - Reads the entire file content into memory (be careful with large files!)

Important Consideration: Blob triggers have a slight delay (can be up to several minutes for large containers). For real-time processing, consider Event Grid triggers instead.

Step 4: Deploy and Monitor (CLI)

Once deployed with func azure functionapp publish <AppName>, Azure will automatically monitor the real Blob Storage container linked to the app and fire the function whenever a file is uploaded. You can view the live processing logs using:
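```shell
func azure functionapp logstream <AppName>
```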

2.9 Azure Functions + DevOps Integration

Azure Functions integrate perfectly with the DevOps tools you've learned. This section shows how Docker and Terraform work together with serverless computing to create a complete, automated deployment pipeline.

Why Integrate Functions with DevOps Tools?

  • Docker: Ensures your function runs identically in development, testing, and production

  • Terraform: Automates infrastructure creation, making deployments repeatable and version-controlled

  • Together: You can deploy entire serverless architectures with a single command

Lab: Containerizing Azure Functions with Docker

You can containerize Azure Functions for consistent deployments. This is especially useful when your function has complex dependencies that are hard to install on Azure directly, or you want to run functions on Azure Container Apps.

Step 1: Create the Dockerfile

In your Azure Functions project directory, create a file named Dockerfile (without any extension) and paste the following:
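A Dockerfile along the lines the explanation below describes, using Microsoft's official Python 3.11 base image:

```dockerfile
# Official Azure Functions Python base image (includes the Functions runtime)
FROM mcr.microsoft.com/azure-functions/python:4-python3.11

# Tell the runtime where the code lives and enable console logging
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

# Install Python dependencies first (better layer caching)
COPY requirements.txt /
RUN pip install -r /requirements.txt

# Copy the function code into the container
COPY . /home/site/wwwroot
```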

Understanding the Dockerfile:

  • FROM ... - Uses Microsoft's official Azure Functions base image with Python 3.11. This image includes the Functions runtime.

  • AzureWebJobsScriptRoot - Tells the Functions runtime where to find your code inside the container.

  • AzureFunctionsJobHost... - Enables console logging so you can see output when running locally.

  • COPY requirements.txt and RUN pip install - Installs your Python dependencies.

  • COPY . ... - Copies your function code into the container.

Step 2: Build and Run Locally

Now, use the Docker CLI to build the image and run it on your machine.
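The image name is an example; the Functions runtime inside the container listens on port 80:

```shell
# Build the image (run from the project folder containing the Dockerfile)
docker build -t my-function-image .

# Run it, mapping container port 80 to local port 8080
docker run -p 8080:80 my-function-image
```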

Step 3: Test the Containerized Function

In a separate terminal, test if your function works inside the container:
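Assuming the container's port 80 was mapped to local port 8080 and the HTTP function from the earlier lab is deployed:

```shell
curl "http://localhost:8080/api/hello?name=Docker"
```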

Lab: Deploying Azure Functions with Terraform

Infrastructure as Code (IaC) allows you to define your entire serverless infrastructure in code, version it with Git, and deploy it automatically.

Step 1: Create the Terraform Configuration File

Create a new, empty directory on your machine for your Terraform project. Inside it, create a file named main.tf and paste the following Azure infrastructure definition:
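A sketch of that definition, matching the resources explained below (resource and storage account names are examples; the app name my-unique-function-app is the one referenced in the deployment step, but Function App names must be globally unique):

```hcl
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "rg" {
  name     = "functions-terraform-rg"
  location = "West Europe"
}

resource "azurerm_storage_account" "storage" {
  name                     = "funcstorageunique123"
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_service_plan" "plan" {
  name                = "consumption-plan"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  os_type             = "Linux"
  sku_name            = "Y1" # Consumption (serverless) plan
}

resource "azurerm_linux_function_app" "function_app" {
  name                       = "my-unique-function-app"
  resource_group_name        = azurerm_resource_group.rg.name
  location                   = azurerm_resource_group.rg.location
  service_plan_id            = azurerm_service_plan.plan.id
  storage_account_name       = azurerm_storage_account.storage.name
  storage_account_access_key = azurerm_storage_account.storage.primary_access_key

  site_config {
    application_stack {
      python_version = "3.11"
    }
  }
}
```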

Understanding the Terraform Resources:

  • azurerm_resource_group - Container for all related Azure resources.

  • azurerm_storage_account - Storage required for internal function operations.

  • azurerm_service_plan with sku_name = "Y1" - The Consumption (serverless) plan that scales automatically.

  • azurerm_linux_function_app - The actual Function App. References the storage and service plan created above.

Step 2: Deploy Infrastructure

Open a terminal in the folder where you created the main.tf file and run the following commands in order:
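```shell
terraform init    # download the azurerm provider
terraform plan    # preview what will be created
terraform apply   # create the resources (asks for confirmation)
```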

The Power of IaC: With this main.tf file, you can recreate your entire serverless infrastructure in any Azure subscription with a single terraform apply command.

Wait, Where is the Python Code? Terraform creates the empty infrastructure (the house), but it doesn't automatically put your code inside it (the furniture). Once the infrastructure is created by terraform apply shown above, you must map your local Python project (from Lab 1) to this new infrastructure.

Step 3: Deploy Your Python Code to the Terraform Infrastructure

  1. Go back to your Azure Functions Python project folder (where function_app.py lives).

  2. Look at the main.tf file "function_app" block. It created an app named my-unique-function-app.

  3. Deploy your code to that specific app using the Core Tools:
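```shell
func azure functionapp publish my-unique-function-app
```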

Step 4: Cleanup (Optional)

To avoid unnecessary costs, tear down the environment when you are done. Never do this in production!
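```shell
terraform destroy   # deletes everything defined in main.tf
```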

2.10 Security Best Practices for Azure Functions

Security is critical for serverless applications because your functions are exposed to the internet and process potentially untrusted input. Unlike traditional servers where you control the network perimeter, serverless functions require a different security mindset.

Quick Reference Table:

| Practice | Implementation |
| --- | --- |
| Use Authentication | Configure Azure AD, API keys, or managed identities |
| Secure Connection Strings | Use Azure Key Vault, not app settings for secrets |
| Network Restrictions | Use VNet integration, private endpoints |
| HTTPS Only | Enforce HTTPS for all HTTP triggers |
| Input Validation | Always validate and sanitize input data |
| Least Privilege | Use managed identities with minimal permissions |

Understanding Authentication Levels

Azure Functions provide three built-in authentication levels for HTTP triggers:
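In the Python v2 programming model, the level is set per route (a sketch; route names are examples):

```python
import azure.functions as func

app = func.FunctionApp()

# Public: anyone can call it, no key required
@app.route(route="public", auth_level=func.AuthLevel.ANONYMOUS)
def public_endpoint(req: func.HttpRequest) -> func.HttpResponse:
    return func.HttpResponse("open to everyone")

# Protected: a function key must accompany the request
@app.route(route="secure", auth_level=func.AuthLevel.FUNCTION)
def secure_endpoint(req: func.HttpRequest) -> func.HttpResponse:
    return func.HttpResponse("key required")
```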

What Each Level Means:

  • ANONYMOUS - Anyone can call your function without any key. Use only for truly public endpoints.

  • FUNCTION - Requires a function-specific key in the request. Each function can have its own key.

  • ADMIN - Requires the master key (also called host key). This key has access to ALL functions in the app.

How to Call a Protected Function:
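With FUNCTION-level auth, the key can be passed as a query parameter or a header (the app URL and key placeholder are examples):

```shell
# Function key as a query parameter
curl "https://myapp.azurewebsites.net/api/secure?code=<FUNCTION_KEY>"

# Or as a request header
curl -H "x-functions-key: <FUNCTION_KEY>" "https://myapp.azurewebsites.net/api/secure"
```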

Enforcing HTTPS

Never allow HTTP (unencrypted) connections to your functions. All data, including API keys, would be visible to attackers on the network.
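HTTPS-only mode can be enforced from the CLI (app and group names are examples):

```shell
az functionapp update --name myapp --resource-group my-rg --set httpsOnly=true
```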

Using Managed Identities

Instead of storing credentials in your code or configuration, use Managed Identities. Azure automatically provides your function with an identity that can authenticate to other Azure services.

The Problem with Traditional Credentials:

The Solution with Managed Identity:
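A sketch of the secret-free approach (requires the azure-identity and azure-storage-blob packages; the storage account URL and container name are examples):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# No connection string, no account key: the function's Managed Identity
# is used to obtain an access token automatically.
credential = DefaultAzureCredential()
client = BlobServiceClient(
    account_url="https://mystorageacct.blob.core.windows.net",
    credential=credential,
)

container = client.get_container_client("uploads")
for blob in container.list_blobs():
    print(blob.name)
```

Contrast this with the traditional approach of embedding a full connection string (with the account key inside) in code or app settings: anyone who reads that string gets full access to the storage account.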

How Managed Identity Works:

  1. You enable Managed Identity for your Function App in Azure

  2. You grant that identity permissions to access other resources (e.g., "read blobs from storage account X")

  3. Your code uses DefaultAzureCredential() which automatically obtains tokens from Azure

  4. No secrets to store, rotate, or accidentally commit to Git!

Input Validation

Always validate and sanitize input data. Never trust data coming from users or external systems.

Common Validation Checks:

  • Check for required fields

  • Validate data types (string, number, boolean)

  • Validate formats (email, phone, date)

  • Check string lengths (prevent oversized payloads)

  • Sanitize data before using in database queries (prevent SQL injection)
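A minimal, framework-free sketch of these checks in Python (the field names, length limits, and email pattern are illustrative):

```python
import re

def validate_user_payload(data: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is valid."""
    errors = []
    # Required fields
    for field in ("name", "email"):
        if field not in data:
            errors.append(f"missing required field: {field}")
    # Type and length checks
    name = data.get("name")
    if name is not None:
        if not isinstance(name, str):
            errors.append("name must be a string")
        elif not (1 <= len(name) <= 100):
            errors.append("name must be 1-100 characters")
    # Format check
    email = data.get("email")
    if email is not None and not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", str(email)):
        errors.append("email format is invalid")
    return errors

print(validate_user_payload({"name": "Ada", "email": "ada@example.com"}))  # []
print(validate_user_payload({"name": "", "email": "not-an-email"}))
```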

Storing Secrets in Azure Key Vault

Never store sensitive data (API keys, passwords, connection strings) in plain text in your function configuration.
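Store the secret in Key Vault instead (vault and group names are examples; the secret name DatabasePassword matches the reference used below in this section):

```shell
az keyvault create --name my-keyvault --resource-group my-rg --location westeurope
az keyvault secret set --vault-name my-keyvault --name DatabasePassword --value "<the-secret>"
```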

Then reference the secret in your Function App configuration:
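A Key Vault reference is set as an app setting; app, group, and vault names are examples:

```shell
az functionapp config appsettings set \
  --name myapp \
  --resource-group my-rg \
  --settings "DatabasePassword=@Microsoft.KeyVault(SecretUri=https://my-keyvault.vault.azure.net/secrets/DatabasePassword/)"
```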

Your code can now read DatabasePassword from environment variables, and Azure automatically fetches the actual value from Key Vault.

Part 3: Quick Reference Cheatsheet

Azure CLI Essential Commands
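The commands used throughout this module, condensed (angle-bracket values are placeholders):

```shell
# Login & subscriptions
az login
az account set --subscription "<subscription>"

# Resource groups
az group create --name <rg> --location westeurope
az group delete --name <rg> --yes

# Virtual machines
az vm create --resource-group <rg> --name <vm> --image Ubuntu2204 --generate-ssh-keys
az vm deallocate --resource-group <rg> --name <vm>

# Storage
az storage account create --name <account> --resource-group <rg> --sku Standard_LRS

# Function Apps
az functionapp create --name <app> --resource-group <rg> --storage-account <account> \
  --consumption-plan-location westeurope --runtime python --functions-version 4 --os-type Linux
func start
func azure functionapp publish <app>
```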

Azure Services Comparison

| Service | Use Case | Scaling | Cost Model |
| --- | --- | --- | --- |
| Virtual Machines | Full control, legacy apps | Manual | Per hour (always on) |
| App Service | Web apps, APIs | Auto/Manual | Per plan tier |
| Azure Functions | Event-driven, serverless | Automatic | Per execution |
| Container Apps | Containerized microservices | Automatic | Per resource usage |
| AKS | Complex container orchestration | Manual/Auto | Per cluster + nodes |
