Building with Code: A Hands-On Developer’s Guide to Core AWS Services (EC2, S3, Lambda, and EBS)


As cloud platforms mature, the real value for developers lies not in clicking around dashboards but in scripting infrastructure, deploying apps, and automating workflows with precision.

This blog walks through hands-on, code-driven interactions with key AWS services: EC2, S3, Lambda, and EBS. Whether you’re building backends, automating DevOps, or creating event-driven architectures, this post is your entry point to real-world AWS development.


1. Launching and Managing EC2 Instances with CLI + Boto3

EC2 (Elastic Compute Cloud) is your go-to for VM-based compute.

Spin up an EC2 instance using CLI:

aws ec2 run-instances \
  --image-id ami-0abcdef1234567890 \
  --count 1 \
  --instance-type t2.micro \
  --key-name my-key \
  --security-groups my-sg

Note: --security-groups resolves group names only in the default VPC. In a custom VPC, pass --security-group-ids with the group ID instead.

Stop or terminate programmatically (Python):

import boto3

ec2 = boto3.client('ec2')
ec2.stop_instances(InstanceIds=['i-0123456789abcdef0'])
# or remove the instance entirely:
ec2.terminate_instances(InstanceIds=['i-0123456789abcdef0'])

Use cloud-init user-data scripts to bootstrap apps automatically. Example (user-data.sh):

#!/bin/bash
sudo apt update
sudo apt install -y python3-pip
pip3 install flask

Provision this script via --user-data file://user-data.sh.
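The same launch can be driven from Python. A minimal boto3 sketch, reusing the AMI, key, and user-data file from the CLI example above (the helper names here are illustrative, not an AWS API):

```python
def build_run_params(ami, key_name, user_data, instance_type="t2.micro"):
    """Assemble keyword arguments for ec2.run_instances."""
    return {
        "ImageId": ami,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "MinCount": 1,
        "MaxCount": 1,
        "UserData": user_data,  # boto3 base64-encodes this for you
    }

def launch_dev_instance(user_data_path="user-data.sh"):
    """Live call: needs AWS credentials configured. Returns the new instance ID."""
    import boto3

    with open(user_data_path) as f:
        params = build_run_params("ami-0abcdef1234567890", "my-key", f.read())
    resp = boto3.client("ec2").run_instances(**params)
    return resp["Instances"][0]["InstanceId"]
```

launch_dev_instance() mirrors the CLI command above, with the bootstrap script attached in a single step.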


2. Automating File Storage with Amazon S3

S3 is ideal for code assets, logs, backups, and static web hosting.

Create and upload to a bucket:

aws s3 mb s3://my-coding-assets
aws s3 cp ./script.py s3://my-coding-assets/

Python automation:

import boto3

s3 = boto3.client('s3')
s3.upload_file('data.json', 'my-coding-assets', 'data.json')
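For buckets with many objects, list_objects_v2 returns at most 1,000 keys per call, so listing needs pagination. A sketch using boto3's paginator (bucket name from the example above; the helper names are mine):

```python
def filter_by_suffix(keys, suffix):
    """Keep only keys ending with the given suffix, e.g. '.json'."""
    return [k for k in keys if k.endswith(suffix)]

def list_keys(bucket, prefix=""):
    """Live call: yield every object key under a prefix, following pagination."""
    import boto3

    paginator = boto3.client("s3").get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):  # "Contents" is absent on empty pages
            yield obj["Key"]
```

For example, filter_by_suffix(list_keys("my-coding-assets"), ".json") collects every JSON object in the bucket.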

Tip: Use pre-signed URLs to securely share objects:

url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-coding-assets', 'Key': 'data.json'},
    ExpiresIn=3600
)
print("Download link:", url)

Bonus: Enable static website hosting:

aws s3 website s3://my-coding-assets/ --index-document index.html

Note: the site isn't reachable until the bucket policy allows public reads (and S3 Block Public Access is relaxed for the bucket).

3. Writing Serverless Code with AWS Lambda

Lambda lets you write code without provisioning infrastructure—ideal for webhooks, ETL, or automation.

Hello World with Python:

def lambda_handler(event, context):
    return {"statusCode": 200, "body": "Hello from Lambda!"}

Deploy via CLI:

zip function.zip lambda_function.py
aws lambda create-function \
  --function-name HelloLambda \
  --runtime python3.12 \
  --role arn:aws:iam::123456789012:role/execution-role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip
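Once deployed, you can exercise the function without any HTTP front end by invoking it directly. A sketch via boto3 (the helper names are mine; the function name matches the deployment above):

```python
import json

def parse_invoke_response(payload):
    """Decode the JSON payload bytes returned by lambda.invoke()."""
    return json.loads(payload.decode("utf-8"))

def invoke_hello():
    """Live call: needs AWS credentials and the deployed HelloLambda function."""
    import boto3

    client = boto3.client("lambda")
    resp = client.invoke(FunctionName="HelloLambda", Payload=b"{}")
    return parse_invoke_response(resp["Payload"].read())
```

Given the handler above, invoke_hello() should return the handler's dict, e.g. {'statusCode': 200, 'body': 'Hello from Lambda!'}.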

Trigger it from S3 uploads (event-driven, shown here as a CloudFormation bucket snippet):

{
  "Type": "AWS::S3::Bucket",
  "Properties": {
    "NotificationConfiguration": {
      "LambdaConfigurations": [{
        "Event": "s3:ObjectCreated:*",
        "Function": "arn:aws:lambda:us-east-1:123456789012:function:HelloLambda"
      }]
    }
  }
}
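One prerequisite the snippet above leaves implicit: S3 can only invoke the function once Lambda's resource policy allows it (in CloudFormation that is a separate AWS::Lambda::Permission resource). A boto3 sketch of the same grant, reusing the example account and bucket values (the builder function is mine):

```python
def build_s3_invoke_permission(function_name, bucket, account_id):
    """Keyword arguments for lambda.add_permission so S3 may invoke the function."""
    return {
        "FunctionName": function_name,
        "StatementId": "AllowS3Invoke",
        "Action": "lambda:InvokeFunction",
        "Principal": "s3.amazonaws.com",
        "SourceArn": f"arn:aws:s3:::{bucket}",
        "SourceAccount": account_id,  # guards against bucket-name reuse in other accounts
    }

def grant_s3_invoke():
    """Live call: needs AWS credentials."""
    import boto3

    boto3.client("lambda").add_permission(
        **build_s3_invoke_permission("HelloLambda", "my-coding-assets", "123456789012")
    )
```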

4. EBS: Attaching and Scripting Persistent Block Storage

EBS volumes power persistent storage for EC2.

Create and attach a volume:

aws ec2 create-volume \
  --availability-zone us-east-1a \
  --size 20 \
  --volume-type gp3

(gp3 is the current-generation general-purpose type; it is cheaper per GB than gp2 and lets you provision IOPS and throughput independently.)

Attach:

aws ec2 attach-volume \
  --volume-id vol-1234567890abcdef0 \
  --instance-id i-0123456789abcdef0 \
  --device /dev/xvdf

Mount programmatically (init script):

#!/bin/bash
# Format only if the device has no filesystem yet (mkfs wipes existing data)
if ! blkid /dev/xvdf; then
  mkfs -t ext4 /dev/xvdf
fi
mkdir -p /mnt/data
mount /dev/xvdf /mnt/data

On Nitro-based instances the volume surfaces as an NVMe device (e.g. /dev/nvme1n1) rather than /dev/xvdf.

You can also create and manage snapshots for backups:

aws ec2 create-snapshot --volume-id vol-1234567890abcdef0
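Snapshot creation is asynchronous: create-snapshot returns before the data is fully captured. A boto3 sketch that tags the snapshot and blocks on the built-in waiter until it completes (volume ID from above; tag values and helper names are illustrative):

```python
def snapshot_description(volume_id, date_str):
    """Description string embedded in the snapshot for easy auditing."""
    return f"backup of {volume_id} taken {date_str}"

def backup_volume(volume_id):
    """Live call: needs AWS credentials. Returns the completed snapshot ID."""
    import datetime
    import boto3

    ec2 = boto3.client("ec2")
    snap = ec2.create_snapshot(
        VolumeId=volume_id,
        Description=snapshot_description(volume_id, datetime.date.today().isoformat()),
        TagSpecifications=[{
            "ResourceType": "snapshot",
            "Tags": [{"Key": "CreatedBy", "Value": "backup-script"}],
        }],
    )
    # Block until the snapshot's data is fully captured
    ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])
    return snap["SnapshotId"]
```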

Infrastructure as Code: Automate It All

Use AWS CloudFormation, Terraform, or Pulumi to codify and version-control your AWS infrastructure.

Example: Terraform for EC2 + EBS

resource "aws_instance" "dev" {
  ami           = "ami-0abcdef1234567890"
  instance_type = "t2.micro"
}

resource "aws_ebs_volume" "dev_data" {
  availability_zone = "us-east-1a"
  size              = 10
}

resource "aws_volume_attachment" "dev_attach" {
  device_name = "/dev/xvdf"
  volume_id   = aws_ebs_volume.dev_data.id
  instance_id = aws_instance.dev.id
}

Final Thoughts: Cloud Dev = Code-First Thinking

Whether you’re writing Lambda functions, automating EC2 provisioning, or scripting S3 pipelines, AWS becomes exponentially more powerful when approached as code.

As you scale your apps and infrastructure, think in scripts, not clicks.

Next Steps:

  • Explore AWS CDK (Cloud Development Kit)
  • Integrate CI/CD pipelines (GitHub Actions, CodePipeline)
  • Monitor resources with CloudWatch and track spend with Cost Explorer

TL;DR

| AWS Service | Use Case       | Scripting Highlight             |
|-------------|----------------|---------------------------------|
| EC2         | VM compute     | CLI + Boto3 to launch/terminate |
| S3          | Object storage | Uploads + pre-signed URLs       |
| Lambda      | Serverless     | Event-driven Python functions   |
| EBS         | Block storage  | Automated volume attach/mount   |