Build a Serverless File Uploader with Python and AWS in 20 Minutes

Ever wanted to upload files to the cloud without setting up a full backend? Today, I’ll show you how to build a serverless file uploader using Python and AWS in just 20 minutes. Whether you’re a beginner or a pro, this guide is practical and easy to follow.

Why Serverless?

Serverless apps save you from managing servers. AWS handles scaling, security, and infrastructure. You only write the code that matters—your business logic!

  • AWS S3 → store files
  • AWS Lambda → run Python code without servers
  • API Gateway → expose your uploader as an endpoint

Step 1: Create an S3 Bucket

  1. Go to AWS S3 console.
  2. Click Create bucket.
  3. Give it a unique name, e.g., my-file-uploader-bucket.
  4. Keep Block all public access enabled (we’ll manage access via presigned URLs).

✅ Your bucket is now ready.

Step 2: Create a Lambda Function

  1. Go to AWS Lambda → Create function.
  2. Choose Author from scratch, runtime: Python 3.11.
  3. Function name: serverless-file-uploader.

Step 3: Add S3 Permission

In your function, open Configuration → Permissions, click the execution role to open it in IAM, then choose Add permissions → Create inline policy. Use this JSON policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::my-file-uploader-bucket/*"
    }
  ]
}

✅ This allows your Lambda to read and write objects in the bucket—nothing more.

Step 4: Write Python Code

In Lambda, add this code:

import boto3
import json
from urllib.parse import unquote_plus

s3_client = boto3.client('s3')
BUCKET_NAME = 'my-file-uploader-bucket'

def lambda_handler(event, context):
    try:
        params = event.get('queryStringParameters') or {}
        file_name = unquote_plus(params['filename'])
        # Presigned PUT URL: the client uploads directly to S3,
        # so the file never passes through Lambda.
        url = s3_client.generate_presigned_url(
            ClientMethod='put_object',
            Params={'Bucket': BUCKET_NAME, 'Key': file_name},
            ExpiresIn=3600  # URL is valid for one hour
        )
        return {
            "statusCode": 200,
            "body": json.dumps({"upload_url": url})
        }
    except KeyError:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "missing 'filename' query parameter"})
        }
    except Exception as e:
        return {
            "statusCode": 500,
            "body": json.dumps({"error": str(e)})
        }

This generates a presigned URL for secure file uploads.

Step 5: Expose via API Gateway

  1. Go to API Gateway → Create API → HTTP API.
  2. Add route /upload → integrate with Lambda.
  3. Deploy → note the API endpoint.

Step 6: Upload Files from Python

import requests

API_URL = "https://your-api-id.execute-api.your-region.amazonaws.com/upload"
file_path = "example.txt"
file_name = "example.txt"

# Get a presigned URL from the Lambda endpoint
response = requests.get(API_URL, params={"filename": file_name})
response.raise_for_status()
upload_url = response.json()["upload_url"]

# PUT the file directly to S3 via the presigned URL
with open(file_path, "rb") as f:
    r = requests.put(upload_url, data=f)
print("Upload status:", r.status_code)

✅ Your file is now in S3—serverless and effortless!

Step 7: Test & Celebrate

  • Save the Step 6 script as upload.py, then run python upload.py.
  • Check your S3 bucket → the file should be there.
  • 🎉 You just built a serverless file uploader in under 20 minutes!

Why This Matters

  • No server management → saves hours
  • Scalable → handles thousands of files
  • Secure → presigned URLs expire automatically

Next Steps

  • Add a frontend with React/HTML for browser uploads.
  • Add authentication with AWS Cognito.
  • Track uploads with CloudWatch logs.

💬 Your Turn: Have you built a serverless project before? What’s your favorite use-case? Reply in the comments—I read every one!

Follow for more hands-on guides!
