
Azure Foundations & Azure Functions
Review of Azure fundamentals, with a special focus on Azure Functions - a powerful serverless compute service that enables event-driven, scalable applications without managing infrastructure.
Part 1: Azure Foundations Review
1.1 Core Azure Concepts
Before diving into Azure Functions, let's consolidate the foundational knowledge from our Azure modules. Understanding these concepts is essential because everything in Azure builds on this hierarchy.
Azure Resource Hierarchy
Understanding how Azure organizes resources is critical for proper cloud architecture. Think of it like organizing files on your computer - you have folders within folders, and Azure works the same way but with cloud resources:

Management Group → Subscription → Resource Group → Resources (VMs, storage accounts, databases, etc.)
Key Principle: Always group related resources together. When you delete a Resource Group, all resources inside are deleted too. This makes cleanup easy - delete one Resource Group and everything associated with your project is gone.
1.2 Essential Azure CLI Commands
The Azure CLI is your primary tool for managing Azure resources from the command line. While you can do everything in the Azure Portal (web interface), the CLI is faster, scriptable, and essential for automation.
Why Use CLI Instead of Portal?
Portal: click through menus; good for learning and exploring; hard to repeat exact steps; slow when managing many resources.
CLI: type commands; good for automation and scripting; easy to save commands as scripts; fast, and can process many resources quickly.
Account & Subscription Management
When you first start with Azure, you need to authenticate (login) and select which subscription to work with. Think of a subscription as your Azure "account" - it's where all your resources live and where billing happens.
Understanding Subscriptions:
Most students have a single "Azure for Students" subscription with $100 credit. In enterprise environments, companies often have multiple subscriptions to separate billing and access:
Development subscription - for testing, experimentation
Production subscription - for live applications
Sandbox subscription - for learning, no restrictions
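A typical first session looks like this (the subscription name is illustrative - use whatever `az account list` shows for your account):

```shell
# authenticate in the browser
az login

# see which subscriptions you have access to
az account list --output table

# select the subscription to work in, then confirm
az account set --subscription "Azure for Students"
az account show --output table
```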
Resource Group Operations
Resource Groups are containers that hold related resources for an Azure solution. Think of them as project folders - everything for one project goes in one Resource Group.
Best Practice: Create a new Resource Group for each lab or project. This way, when you're done, you can delete the entire Resource Group and clean up all resources at once.
Command Flag Reference:
--name or -n: name of the resource
--location or -l: Azure region (westeurope, eastus, etc.)
--output or -o: output format (table, json, yaml, tsv)
--yes or -y: skip confirmation prompts
--no-wait: return immediately instead of waiting for the operation to complete
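Putting the flags together, a typical Resource Group lifecycle (the group name `rg-lab01` is illustrative) looks like:

```shell
# create a resource group for this lab
az group create --name rg-lab01 --location westeurope

# list all groups in the current subscription
az group list --output table

# when done, delete the group and everything in it
az group delete --name rg-lab01 --yes --no-wait
```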
Region & Location Management
Azure has datacenters around the world. When you create resources, you choose where they physically live. This affects:
Latency: Resources closer to users respond faster
Compliance: Some data must stay in specific countries
Cost: Some regions are cheaper than others
Availability: Not all services are available in all regions
Common European Regions:
westeurope (West Europe): Netherlands - main EU region
northeurope (North Europe): Ireland - backup/DR region
germanywestcentral (Germany West Central): data residency in Germany
polandcentral (Poland Central): closest to Poland
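You can list every region available to your subscription directly from the CLI:

```shell
az account list-locations --output table
```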
1.3 Virtual Machines - Key Commands
Virtual Machines (VMs) are the foundation of cloud computing. A VM is a computer running in Azure's datacenter that you control remotely. You can install any software, run any application - it's like having a server without buying hardware.
When to Use VMs:
You need full control over the operating system
Running legacy applications that can't be containerized
Need specific software that only runs on Windows/Linux
Learning and experimentation
When NOT to Use VMs:
Running simple web applications (use App Service)
Running containerized microservices (use Container Apps)
Running event-driven code (use Azure Functions)
Creating Virtual Machines
Understanding the VM Create Parameters:
--resource-group: which Resource Group to put the VM in
--name: name of your VM
--image: operating system image to install (Ubuntu2204, Win2022Datacenter, etc.)
--size: how powerful the VM is (CPU, RAM)
--admin-username: username for logging in
--generate-ssh-keys: create SSH keys for secure access (Linux only)
--public-ip-address: create a public IP so you can connect from the internet
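A complete `az vm create` call combining these parameters might look like this (resource group and VM names are illustrative):

```shell
az vm create \
  --resource-group rg-lab01 \
  --name vm-demo \
  --image Ubuntu2204 \
  --size Standard_B2s \
  --admin-username azureuser \
  --generate-ssh-keys
```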
Common VM Sizes for Learning:
Standard_B1s: 1 vCPU, 1 GB RAM, ~$8/month - very basic, testing only
Standard_B2s: 2 vCPU, 4 GB RAM, ~$30/month - development, learning
Standard_B2ms: 2 vCPU, 8 GB RAM, ~$60/month - small production workloads
Standard_D2s_v3: 2 vCPU, 8 GB RAM, ~$70/month - general-purpose production
B-series VMs are "burstable" - perfect for learning because they're cheap for workloads that don't use 100% CPU constantly.
Finding Images and Sizes
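The CLI can enumerate both the OS images and the sizes available in your region:

```shell
# list popular OS images
az vm image list --output table

# list VM sizes available in a given region
az vm list-sizes --location westeurope --output table
```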
Controlling VMs (Start, Stop, Deallocate)
Understanding the difference between stop and deallocate is crucial for cost management:
Running: VM is on and working - full cost
Stopped: OS shut down, but resources still allocated - still charged for compute!
Deallocated: resources released back to Azure - no compute charges
Important: If you just run az vm stop, you still pay! Always use az vm deallocate to stop billing.
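The three lifecycle commands side by side (resource group and VM names are illustrative):

```shell
az vm stop --resource-group rg-lab01 --name vm-demo        # OS shutdown, still billed!
az vm deallocate --resource-group rg-lab01 --name vm-demo  # resources released, billing stops
az vm start --resource-group rg-lab01 --name vm-demo       # back to Running
```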
Getting Information and Remote Access
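Typical commands for inspecting a VM and connecting to it (names are illustrative; replace `<public-ip>` with the address the first command prints):

```shell
# show VM details; -d adds runtime state such as the public IP
az vm show --resource-group rg-lab01 --name vm-demo -d --output table

# list just the IP addresses
az vm list-ip-addresses --resource-group rg-lab01 --name vm-demo --output table

# connect over SSH (Linux VMs)
ssh azureuser@<public-ip>
```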
1.4 Networking Fundamentals
Networking in Azure is about connecting your resources together and controlling what can communicate with what. This is fundamental for security - you don't want your database accessible from the internet!
Key Networking Concepts:
VNet: Virtual Network - your private network in Azure
Subnet: a range of IP addresses within a VNet
NSG: Network Security Group - firewall rules for traffic
Public IP: an IP address accessible from the internet
Private IP: an IP address only reachable within your VNet
NAT Gateway: allows outbound internet access from private subnets
Virtual Network (VNet) Operations
A VNet is your private, isolated network in Azure. Resources in a VNet can communicate with each other by default, but are isolated from other VNets and the internet.
IP Address Planning:
When you create a VNet, you define an address space using CIDR notation:
10.0.0.0/16 = 65,536 IP addresses (10.0.0.0 to 10.0.255.255)
10.0.1.0/24 = 256 IP addresses (10.0.1.0 to 10.0.1.255)
Common Subnet Layout:
web-subnet (10.0.1.0/24): web servers (public access)
app-subnet (10.0.2.0/24): application servers (internal)
db-subnet (10.0.3.0/24): database servers (most secure)
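Creating a VNet with this layout might look like the following (all names are illustrative):

```shell
# create the VNet with its first subnet
az network vnet create \
  --resource-group rg-lab01 \
  --name vnet-demo \
  --address-prefix 10.0.0.0/16 \
  --subnet-name web-subnet \
  --subnet-prefix 10.0.1.0/24

# add further subnets to the same VNet
az network vnet subnet create \
  --resource-group rg-lab01 \
  --vnet-name vnet-demo \
  --name app-subnet \
  --address-prefixes 10.0.2.0/24
```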
Network Security Groups (NSGs)
NSGs are Azure's firewall. They contain security rules that allow or deny network traffic to resources. Think of them as a bouncer at a club - they check every "packet" against rules and decide if it can enter or leave.
NSG Rule Components:
Priority: lower number = evaluated first (100-4096)
Direction: Inbound (coming in) or Outbound (going out)
Access: Allow or Deny
Protocol: TCP, UDP, ICMP, or * (any)
Source/Destination Address: where traffic is coming from/going to
Source/Destination Port: which port (22 = SSH, 80 = HTTP, 443 = HTTPS, etc.)
Common Security Rules:
Allow-SSH (inbound, port 22): remote access to Linux VMs
Allow-RDP (inbound, port 3389): remote access to Windows VMs
Allow-HTTP (inbound, port 80): web traffic (unencrypted)
Allow-HTTPS (inbound, port 443): web traffic (encrypted)
Deny-All (inbound, port *): block everything else (catch-all)
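Creating an NSG and one of these rules from the CLI could look like this (names are illustrative):

```shell
# create the NSG
az network nsg create --resource-group rg-lab01 --name nsg-web

# allow SSH from anywhere (tighten the source range in production!)
az network nsg rule create \
  --resource-group rg-lab01 \
  --nsg-name nsg-web \
  --name Allow-SSH \
  --priority 100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 22
```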
Private Subnets & NAT Gateway
Security Best Practice: VMs should NOT have public IP addresses by default. Instead:
Put VMs in private subnets (no direct internet access)
Use NAT Gateway for outbound access (VMs can reach internet, but internet can't reach VMs)
Use Azure Bastion for administrative access (instead of public IPs)
1.5 Storage Fundamentals
Azure Storage is used to store files, blobs (binary large objects), queues, and tables. The most common use is Blob Storage - storing files like images, documents, backups, and logs.
Storage Account Hierarchy:
Storage Account → Container → Blob. A storage account holds containers (similar to folders), and each container holds blobs (the actual files).
Storage Redundancy Options:
Standard_LRS: 3 copies in one datacenter - 99.999999999% (11 nines) durability
Standard_ZRS: 3 copies across 3 availability zones - 99.9999999999% (12 nines)
Standard_GRS: 6 copies, 3 local + 3 in another region - 99.99999999999999% (16 nines)
Standard_RAGRS: same durability as GRS, plus read access to the secondary region
For learning: Standard_LRS is cheapest and sufficient.
Accessing Storage Securely:
There are several ways to access Azure Storage:
--auth-mode login: use your Azure AD credentials (recommended)
Account Key: full access to everything (dangerous if leaked!)
SAS Token: limited, time-bound access to specific resources
Managed Identity: for Azure services to access storage securely
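A minimal workflow using Azure AD authentication (storage account names must be globally unique; all names here are illustrative):

```shell
# create a storage account with the cheapest redundancy
az storage account create \
  --resource-group rg-lab01 \
  --name stlab01demo \
  --location westeurope \
  --sku Standard_LRS

# create a container and upload a file using your AD identity
az storage container create \
  --account-name stlab01demo \
  --name uploads \
  --auth-mode login

az storage blob upload \
  --account-name stlab01demo \
  --container-name uploads \
  --file ./photo.jpg \
  --name photo.jpg \
  --auth-mode login
```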
Part 2: Azure Functions - Serverless Computing
Now let's dive into Azure Functions - one of the most powerful and practical Azure services for modern applications.
2.1 What Are Azure Functions?
Azure Functions is a serverless compute service that lets you run code without managing servers. You write the code, Azure handles everything else: scaling, infrastructure, patching, and availability.

2.2 Key Characteristics of Azure Functions
Event-Driven: functions are triggered by events (HTTP requests, timers, queues, file uploads)
Auto-Scaling: automatically scales from 0 to thousands of instances based on demand
Pay-Per-Execution: you only pay when your code runs (Consumption plan)
Multiple Languages: supports C#, JavaScript, Python, Java, PowerShell, TypeScript
Stateless by Default: each execution is independent (use Durable Functions for stateful workflows)
2.3 Azure Functions Triggers
A trigger is what causes a function to execute. Understanding triggers is key to understanding Azure Functions:

Trigger Types Explained
HTTP: REST APIs, webhooks, web services (e.g., user submits form → process data)
Timer: scheduled tasks, cron jobs (e.g., every night at 2 AM → clean up old files)
Blob Storage: file processing (e.g., image uploaded → resize/thumbnail)
Queue Storage: async processing, work queues (e.g., order placed → process payment)
Event Hub: high-volume event streaming (e.g., IoT devices → process sensor data)
Cosmos DB: database change events (e.g., document updated → sync to search index)
Service Bus: enterprise messaging (e.g., message received → notify other systems)
2.4 Hosting Plans
Azure Functions offers three hosting plans, each with different characteristics:
Consumption: automatic scaling (0 → ∞); timeout of 5 minutes by default, 10 maximum; has cold starts. Best for intermittent workloads, pay-per-use.
Premium: automatic scaling (1+ → ∞); unlimited timeout; no cold starts. Best for production APIs where low latency is required.
Dedicated (App Service): manual scaling; unlimited timeout; no cold starts. Best when you already run App Service infrastructure.
Cold Start Explained:
When using the Consumption plan, if your function hasn't been called recently, Azure may deallocate the resources. The next request needs to "wake up" the function, causing a delay (cold start). Premium plan keeps instances warm.

2.5 Real-World Scenarios
Here are practical examples of how Azure Functions are used in production:
Scenario 1: Image Processing Pipeline
The Problem: A company allows users to upload profile photos. These photos need to be resized into multiple formats (thumbnail, medium, large) for different parts of the website. Traditional approach would require running a server 24/7 just waiting for uploads.
The Serverless Solution: Use a Blob-triggered Azure Function that automatically activates whenever a new image is uploaded to Azure Blob Storage.

How It Works:
User uploads a photo through a web form
The photo is saved to Azure Blob Storage (container: "uploads")
Azure detects the new blob and automatically triggers the Function
The Function reads the image, creates resized versions using image processing libraries
Processed images are saved to another container (e.g., "processed")
The Function terminates and no compute resources are consumed until the next upload
Why This Is Powerful: If 1,000 users upload photos simultaneously, Azure automatically scales out function instances to process them in parallel. When uploads stop, the system scales back to zero - you pay nothing while waiting.
Scenario 2: Microservices API Backend
The Problem: Building a REST API typically requires setting up a web server (like Express.js or Flask), configuring routing, managing server infrastructure, and handling scaling. For APIs with variable traffic, you end up paying for idle server time.
The Serverless Solution: Use HTTP-triggered Azure Functions where each API endpoint is a separate function. Azure handles all the server infrastructure, routing, and scaling automatically.

How It Works:
Frontend application (mobile app or website) makes an HTTP request to the function URL
Azure receives the request and routes it to the appropriate function based on the URL path
The function executes: reads request data, queries the database, processes the result
The function returns a JSON response to the frontend
After the response is sent, the function instance can be reused or terminated
Why This Is Powerful:
Each function can scale independently. If your /users endpoint gets 10x more traffic than /orders, Azure will automatically allocate more instances for /users while keeping /orders at minimal capacity.
This is the essence of microservices architecture - small, independent, single-purpose services.
Scenario 3: Scheduled Data Processing
The Problem: Many business operations need to run on a schedule - generating daily reports, cleaning up old database records, syncing data from external systems, or creating backups. Traditionally, this requires running a dedicated server with cron jobs or Windows Task Scheduler.
The Serverless Solution: Use Timer-triggered Azure Functions that run on a schedule defined using CRON expressions. No server needed - Azure wakes up the function at the specified time, runs it, and you pay only for execution time.

How It Works:
You define a CRON expression in the function configuration: "0 0 2 * * *" means "at 2:00 AM every day"
Azure's scheduler monitors this schedule and triggers the function at exactly 2:00 AM
The function performs ETL (Extract-Transform-Load): pulls data from sources, transforms it, and loads it to the destination
Function logs execution status and any errors for monitoring
After completion, the function terminates until the next scheduled run
Understanding CRON Expressions:
"0 0 2 * * *"= At 2:00:00 AM every day"0 */5 * * * *"= Every 5 minutes"0 0 9-17 * * 1-5"= Every hour from 9 AM to 5 PM, Monday to FridayFormat:
{second} {minute} {hour} {day} {month} {day-of-week}
Why This Is Powerful:
You don't pay for a server running 24/7 just to execute a 5-minute job once per day. The function runs, completes its task, and you're billed only for those few minutes of execution.
Scenario 4: Real-Time Event Processing (IoT)
The Problem: IoT (Internet of Things) systems generate massive amounts of data - thousands of sensors sending readings every second. Processing this data in real-time requires infrastructure that can handle millions of events and scale instantly when traffic spikes.
The Serverless Solution: Use Event Hub-triggered Azure Functions. Azure Event Hub acts as a buffer that can ingest millions of events per second, and Azure Functions process these events in real-time, scaling automatically to match the incoming data rate.

How It Works:
IoT devices (temperature sensors, motion detectors, smart meters) continuously send data
Azure Event Hub receives all incoming events and stores them temporarily in partitions
Azure Functions are triggered by new events in Event Hub (batch processing for efficiency)
Each function instance processes a batch of events: analyzes data, detects anomalies, triggers alerts
Results are stored in a database and alerts are sent to monitoring systems
Functions scale up/down based on the event backlog in Event Hub
Real-World Example: A smart building with 10,000 temperature sensors. Each sensor sends a reading every 10 seconds. That's 1,000 events per second! If a temperature exceeds a threshold, the function immediately sends an alert to the maintenance team and logs the event.
Why This Is Powerful: Event Hub decouples the event producers (sensors) from consumers (functions). If the processing temporarily slows down, events are safely buffered. Functions scale independently to catch up with the backlog.
Scenario 5: Webhook Integration
The Problem: Modern applications need to integrate with external services - payment processors (Stripe), version control (GitHub), communication platforms (Slack, Twilio). These services notify your application about events via webhooks - HTTP POST requests sent to your endpoint.
The Serverless Solution: Use HTTP-triggered Azure Functions as webhook endpoints. When an external service sends a webhook, your function processes the event and takes appropriate actions (update database, send notifications, trigger workflows).

How It Works:
You create an HTTP-triggered function and get a public URL (e.g., https://myfunc.azurewebsites.net/api/github-webhook)
You configure the external service (GitHub, Stripe) to send webhooks to this URL
When an event occurs (code pushed, payment received), the service sends a POST request with event data
Your function parses the webhook payload, validates the signature (for security), and processes the event
The function performs actions: updates database, sends Slack notification, triggers other workflows
Returns a 200 OK response to confirm receipt
Security Consideration: Always validate webhook signatures! Services like GitHub and Stripe include a secret signature in the request header. Your function should verify this signature to ensure the webhook is authentic and not from a malicious source.
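Signature validation can be done with the standard library alone. A sketch following GitHub's "sha256=&lt;hex&gt;" header convention (the header name and secret are assumptions; adapt for other providers):

```python
import hashlib
import hmac

def verify_signature(payload: bytes, secret: str, signature_header: str) -> bool:
    """Check a GitHub-style 'sha256=<hex>' signature header against the raw payload."""
    expected = "sha256=" + hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    # hmac.compare_digest runs in constant time, which prevents timing attacks
    return hmac.compare_digest(expected, signature_header)
```

Inside the function, read the raw request body and the signature header, and reject the request with a 401 if `verify_signature` returns False.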
Why This Is Powerful: You get instant notifications about external events without polling APIs repeatedly. The serverless model is perfect for webhooks - they're sporadic (you don't know when they'll arrive), short-lived (just process and respond), and vary in volume (Black Friday vs. regular Tuesday).
2.6 Benefits of Azure Functions
No Infrastructure Management: focus on code, not servers - no OS updates, no patching
Automatic Scaling: handles 1 request or 1 million, automatically
Cost Efficiency: pay only for execution time (Consumption plan: first 1M executions free!)
Fast Development: quick to deploy, easy to test, integrates with Azure services
Event-Driven Architecture: a natural fit for microservices and reactive systems
DevOps Friendly: easy CI/CD integration, infrastructure as code with Terraform
2.7 Azure Functions vs Other Compute Options
When should you use Functions vs VMs vs Containers vs App Service?
Run code occasionally, pay per use → Azure Functions (scales to zero, pay-per-execution)
API with consistent traffic → App Service or Premium Functions (always warm, predictable performance)
Full control over the OS → Virtual Machines (complete control, complex dependencies)
Containerized microservices → Container Apps / AKS (container orchestration, complex apps)
Background job processing → Azure Functions with a Queue trigger (event-driven, auto-scaling)
Real-time data processing → Azure Functions with an Event Hub trigger (high throughput, serverless)
2.8 Creating Your First Azure Function
Creating an Azure Function involves two main steps: setting up the infrastructure (Function App, storage account) and writing the actual function code. Let's walk through both.
What You Need to Understand First:
Function App - A container that hosts one or more functions. Think of it like a project that groups related functions together.
Storage Account - Required by Azure Functions to store internal data (function code, logs, trigger state). Every Function App needs one.
Consumption Plan - The serverless billing model where you pay only for execution time. Functions scale automatically and can scale to zero when idle.
Here's how to create and deploy a simple HTTP-triggered function.
Azure Functions can be written in a different language than your main application!
Using Azure CLI
The following commands create all the infrastructure needed to run Azure Functions. Each command builds on the previous one.
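A sketch of that sequence (resource names are illustrative; storage account and Function App names must be globally unique):

```shell
# storage account required by the Functions runtime
az storage account create \
  --resource-group rg-lab01 \
  --name stfuncdemo123 \
  --location westeurope \
  --sku Standard_LRS

# Function App on the serverless Consumption plan
az functionapp create \
  --resource-group rg-lab01 \
  --name func-demo-app-123 \
  --storage-account stfuncdemo123 \
  --consumption-plan-location westeurope \
  --runtime python \
  --runtime-version 3.11 \
  --functions-version 4 \
  --os-type Linux

# deploy your local project (requires Azure Functions Core Tools)
func azure functionapp publish func-demo-app-123
```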
Simple Python HTTP Function
Now let's look at the actual function code. Azure Functions use a decorator-based approach where you define what triggers the function using Python decorators.
Understanding the Code Structure:
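A sketch of the decorator-based function described in this section, using the v2 Python programming model (the JSON message format is illustrative):

```python
import json

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="HttpTrigger")
@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # read the optional ?name= query parameter, defaulting to "World"
    name = req.params.get('name', 'World')
    body = json.dumps({"message": f"Hello, {name}!"})
    return func.HttpResponse(body, status_code=200, mimetype="application/json")
```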
Line-by-Line Explanation:
app = func.FunctionApp()- Creates the main application object that will host all your functions@app.function_name(name="HttpTrigger")- Gives this function a name for identification in Azure Portal@app.route(route="hello", ...)- Defines the HTTP endpoint. The function will be accessible athttps://yourapp.azurewebsites.net/api/helloauth_level=func.AuthLevel.ANONYMOUS- No authentication required (anyone can call this endpoint)req: func.HttpRequest- The incoming HTTP request object containing headers, query params, bodyreq.params.get('name', 'World')- Gets thenamequery parameter, defaults to "World" if not providedfunc.HttpResponse(...)- Returns an HTTP response with JSON body and status code
Testing Your Function:
Once deployed, you can test it with:
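For example, with curl (replace the hostname with your own Function App's URL):

```shell
# deployed endpoint
curl "https://<your-app>.azurewebsites.net/api/hello?name=Azure"

# during local development, run `func start` and call localhost instead
curl "http://localhost:7071/api/hello?name=Azure"
```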
Timer Trigger Example
Timer-triggered functions run on a schedule. They're perfect for recurring tasks like backups, reports, or data synchronization.
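A minimal sketch of a timer-triggered function in the v2 Python model (the function name and log messages are illustrative):

```python
import datetime
import logging

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="NightlyCleanup")
@app.schedule(schedule="0 0 2 * * *", arg_name="timer", run_on_startup=False)
def nightly_cleanup(timer: func.TimerRequest) -> None:
    # past_due is True when the scheduler fired later than planned
    if timer.past_due:
        logging.warning("Timer is running late!")
    logging.info("Cleanup started at %s", datetime.datetime.utcnow().isoformat())
    # ... delete old files, archive records, etc.
```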
Key Points About Timer Triggers:
schedule="0 0 2 * * *"- CRON expression defining when to run (2:00 AM daily)run_on_startup=False- Don't run when the Function App starts (only on schedule)timer: func.TimerRequest- Contains information about the timer (past due status, schedule)Timer functions don't return anything (
-> None) since there's no caller to respond to
Blob Trigger Example
Blob-triggered functions react to file uploads in Azure Blob Storage. They're ideal for file processing pipelines.
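A minimal sketch of a blob-triggered function in the v2 Python model (the function name is illustrative; the container path and connection setting match the explanation below):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="ProcessUpload")
@app.blob_trigger(arg_name="blob", path="uploads/{name}",
                  connection="AzureWebJobsStorage")
def process_upload(blob: func.InputStream) -> None:
    logging.info("Processing blob: %s (%s bytes)", blob.name, blob.length)
    content = blob.read()  # careful: loads the whole file into memory
    # ... resize the image, parse the CSV, etc.
```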
Key Points About Blob Triggers:
path="uploads/{name}"- Watches the "uploads" container.{name}captures the filenameconnection="AzureWebJobsStorage"- References the storage connection string from app settingsblob: func.InputStream- Provides access to the uploaded file as a streamblob.read()- Reads the entire file content into memory (be careful with large files!)
Important Consideration: Blob triggers have a slight delay (can be up to several minutes for large containers). For real-time processing, consider Event Grid triggers instead.
2.9 Azure Functions + DevOps Integration
Azure Functions integrate perfectly with the DevOps tools you've learned. This section shows how Docker and Terraform work together with serverless computing to create a complete, automated deployment pipeline.
Why Integrate Functions with DevOps Tools?
Docker: Ensures your function runs identically in development, testing, and production
Terraform: Automates infrastructure creation, making deployments repeatable and version-controlled
Together: You can deploy entire serverless architectures with a single command
With Docker
You can containerize Azure Functions for consistent deployments. This is especially useful when:
Your function has complex dependencies that are hard to install on Azure
You want to test locally with the exact same environment as production
You need to run functions on Azure Container Apps or Kubernetes instead of the built-in Functions host
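A Dockerfile matching the explanation below, based on Microsoft's official Python base image:

```dockerfile
FROM mcr.microsoft.com/azure-functions/python:4-python3.11

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY requirements.txt /
RUN pip install -r /requirements.txt

COPY . /home/site/wwwroot
```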
Understanding the Dockerfile:
FROM mcr.microsoft.com/azure-functions/python:4-python3.11 - uses Microsoft's official Azure Functions base image with Python 3.11; it includes the Functions runtime and all necessary tools
AzureWebJobsScriptRoot=/home/site/wwwroot - tells the Functions runtime where to find your code
AzureFunctionsJobHost__Logging__Console__IsEnabled=true - enables console logging so you can see output when running locally
COPY requirements.txt and RUN pip install - installs your Python dependencies
COPY . /home/site/wwwroot - copies your function code into the container
Building and Running Locally:
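With a Dockerfile in place, local build and test is the usual Docker workflow (the image tag is illustrative; Functions containers listen on port 80):

```shell
docker build -t my-function-app .
docker run -p 8080:80 my-function-app

# in another terminal, call the function through the mapped port
curl "http://localhost:8080/api/hello?name=Docker"
```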
With Terraform
Infrastructure as Code for Functions allows you to define your entire serverless infrastructure in code, version it with Git, and deploy it automatically.
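A sketch of such a configuration using the azurerm provider (all resource names are illustrative; storage account and Function App names must be globally unique):

```hcl
resource "azurerm_resource_group" "rg" {
  name     = "rg-functions-demo"
  location = "westeurope"
}

resource "azurerm_storage_account" "storage" {
  name                     = "stfuncdemo123"
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_service_plan" "plan" {
  name                = "plan-functions-demo"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  os_type             = "Linux"
  sku_name            = "Y1" # Consumption (serverless) tier
}

resource "azurerm_linux_function_app" "func" {
  name                       = "func-demo-app-123"
  resource_group_name        = azurerm_resource_group.rg.name
  location                   = azurerm_resource_group.rg.location
  storage_account_name       = azurerm_storage_account.storage.name
  storage_account_access_key = azurerm_storage_account.storage.primary_access_key
  service_plan_id            = azurerm_service_plan.plan.id

  site_config {}

  app_settings = {
    "FUNCTIONS_WORKER_RUNTIME" = "python"
  }
}
```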
What This Terraform Configuration Creates:
A Resource Group to contain all resources
A Storage Account (required by Functions for internal operations)
A Service Plan (defines the hosting model - Consumption, Premium, or Dedicated)
The Function App itself
Understanding the Terraform Resources:
azurerm_resource_group - container for all related Azure resources
azurerm_storage_account - storage for function code, logs, and trigger state; LRS means locally redundant (the cheapest option)
azurerm_service_plan with sku_name = "Y1" - the Consumption (serverless) plan; "Y1" is Azure's code for the Consumption tier
azurerm_linux_function_app - the actual Function App; references the storage account and service plan created above
Deploying with Terraform:
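The standard three-step Terraform workflow applies here:

```shell
terraform init    # download the azurerm provider
terraform plan    # preview what will be created
terraform apply   # create the infrastructure
```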
The Power of IaC: With this Terraform file, you can recreate your entire serverless infrastructure in any Azure subscription with a single terraform apply command. This is invaluable for disaster recovery, testing environments, and multi-region deployments.
2.10 Security Best Practices for Azure Functions
Security is critical for serverless applications because your functions are exposed to the internet and process potentially untrusted input. Unlike traditional servers where you control the network perimeter, serverless functions require a different security mindset.
Quick Reference Table:
Use Authentication: configure Azure AD, API keys, or managed identities
Secure Connection Strings: use Azure Key Vault, not app settings, for secrets
Network Restrictions: use VNet integration and private endpoints
HTTPS Only: enforce HTTPS for all HTTP triggers
Input Validation: always validate and sanitize input data
Least Privilege: use managed identities with minimal permissions
Understanding Authentication Levels
Azure Functions provide three built-in authentication levels for HTTP triggers:
What Each Level Means:
ANONYMOUS - Anyone can call your function without any key. Use only for truly public endpoints.
FUNCTION - Requires a function-specific key in the request. Each function can have its own key.
ADMIN - Requires the master key (also called host key). This key has access to ALL functions in the app.
How to Call a Protected Function:
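For a FUNCTION-level endpoint, the key can go in either the `code` query parameter or the `x-functions-key` header (the hostname is illustrative; keep the key itself out of scripts and logs):

```shell
# key as a query parameter
curl "https://func-demo-app-123.azurewebsites.net/api/hello?code=<function-key>"

# key as a request header
curl -H "x-functions-key: <function-key>" \
  "https://func-demo-app-123.azurewebsites.net/api/hello"
```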
Enforcing HTTPS
Never allow HTTP (unencrypted) connections to your functions. All data, including API keys, would be visible to attackers on the network.
Using Managed Identities (Recommended)
Instead of storing credentials in your code or configuration, use Managed Identities. Azure automatically provides your function with an identity that can authenticate to other Azure services.
The Problem with Traditional Credentials:
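A sketch of the anti-pattern (account name and key are illustrative placeholders):

```python
# Anti-pattern: credentials hardcoded in source.
# Anyone with access to the repository can read this key,
# and rotating it requires a code change and redeploy.
STORAGE_CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=stfuncdemo123;"
    "AccountKey=<account-key>;"
    "EndpointSuffix=core.windows.net"
)
```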
The Solution with Managed Identity:
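A sketch using Managed Identity instead, assuming the azure-identity and azure-storage-blob packages are installed and the Function App's identity has a role assignment on the (illustrative) storage account:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# no secrets in code or config: the credential is obtained from Azure at runtime
credential = DefaultAzureCredential()
client = BlobServiceClient(
    account_url="https://stfuncdemo123.blob.core.windows.net",
    credential=credential,
)

for blob in client.get_container_client("uploads").list_blobs():
    print(blob.name)
```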
How Managed Identity Works:
You enable Managed Identity for your Function App in Azure
You grant that identity permissions to access other resources (e.g., "read blobs from storage account X")
Your code uses DefaultAzureCredential(), which automatically obtains tokens from Azure
There are no secrets to store, rotate, or accidentally commit to Git!
Input Validation
Always validate and sanitize input data. Never trust data coming from users or external systems.
Common Validation Checks:
Check for required fields
Validate data types (string, number, boolean)
Validate formats (email, phone, date)
Check string lengths (prevent oversized payloads)
Sanitize data before using in database queries (prevent SQL injection)
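A stdlib-only sketch of these checks (the field names and limits are illustrative):

```python
import re

def validate_order(data: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is valid."""
    errors = []
    # required fields
    for field in ("email", "quantity"):
        if field not in data:
            errors.append(f"missing field: {field}")
    # format check
    if "email" in data:
        if not isinstance(data["email"], str) or \
                not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", data["email"]):
            errors.append("invalid email format")
    # type and range check
    if "quantity" in data:
        if not isinstance(data["quantity"], int) or not (1 <= data["quantity"] <= 100):
            errors.append("quantity must be an integer between 1 and 100")
    return errors
```

For database access, pair this with parameterized queries rather than string concatenation - validation and parameterization together are what prevent SQL injection.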
Storing Secrets in Azure Key Vault
Never store sensitive data (API keys, passwords, connection strings) in plain text in your function configuration.
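Creating a vault and storing a secret might look like this (vault names must be globally unique; all names here are illustrative):

```shell
az keyvault create \
  --resource-group rg-lab01 \
  --name kv-lab01-demo \
  --location westeurope

az keyvault secret set \
  --vault-name kv-lab01-demo \
  --name DatabasePassword \
  --value '<the-actual-password>'
```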
Then reference the secret in your Function App configuration:
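This uses the Key Vault reference syntax in an app setting; it requires the Function App's managed identity to have permission to read the vault's secrets (names are illustrative):

```shell
az functionapp config appsettings set \
  --resource-group rg-lab01 \
  --name func-demo-app-123 \
  --settings "DatabasePassword=@Microsoft.KeyVault(SecretUri=https://kv-lab01-demo.vault.azure.net/secrets/DatabasePassword/)"
```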
Your code can now read DatabasePassword from environment variables, and Azure automatically fetches the actual value from Key Vault.
Part 3: Quick Reference Cheatsheet
Azure CLI Essential Commands
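A compact recap of the commands covered in this lesson (all resource names are illustrative):

```shell
az login                                    # authenticate
az account set --subscription "<name>"      # choose a subscription
az group create -n rg-demo -l westeurope    # create a resource group
az group delete -n rg-demo --yes            # delete it and everything inside
az vm create -g rg-demo -n vm1 --image Ubuntu2204 --generate-ssh-keys
az vm deallocate -g rg-demo -n vm1          # stop compute billing
az storage account create -g rg-demo -n stdemo123 --sku Standard_LRS
az functionapp create -g rg-demo -n func-demo --storage-account stdemo123 \
  --consumption-plan-location westeurope --runtime python --functions-version 4
```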
Azure Services Comparison
Virtual Machines: full control, legacy apps - manual scaling - billed per hour (always on)
App Service: web apps, APIs - auto/manual scaling - billed per plan tier
Azure Functions: event-driven, serverless - automatic scaling - billed per execution
Container Apps: containerized microservices - automatic scaling - billed per resource usage
AKS: complex container orchestration - manual/auto scaling - billed per cluster + nodes