Inspiration

In the past I have experienced moments where large images significantly increased a website's page load time and caused frustration, especially on slow connections. Existing tools can solve this, but as someone who spends a lot of time coding, it is important for me to understand the mechanism of what is actually happening, and that curiosity is what fueled me to create Lambda Images. Optimized images also play a crucial role not only in keeping users engaged but also in improving SEO, as search engines prioritize fast-loading sites.

What it does: Application Overview

Lambda Images is a fully serverless, headless image-management and processing platform built on AWS and a lightweight Hono router.

Core Image Compression

  • Lambda & Sharp
    • Node.js 22.x Lambda (CDK-packaged) with a native Sharp Layer
    • Downloads originals from S3, re-encodes to WebP at 80% quality
    • Writes compressed files to a parallel folder in S3
    • Records metadata (dimensions, sizes, keys, timestamps) in a relational database (Drizzle ORM)
    • Enables on-the-fly image transformations (resize, crop, format conversion, blur, sharpen, grayscale, etc.) via dynamic API routes powered by Sharp
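
The heart of that flow, as a minimal sketch (the bucket layout, prefix names, and helper name are assumptions of mine, not the exact implementation):

```ts
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";
import sharp from "sharp";

const s3 = new S3Client({});

// Hypothetical helper: download an original, re-encode to WebP at 80% quality,
// and write it to a parallel "compressed/" prefix in the same bucket.
export async function compressImage(bucket: string, originalKey: string) {
  const original = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: originalKey }));
  const inputBuffer = Buffer.from(await original.Body!.transformToByteArray());

  const { data, info } = await sharp(inputBuffer)
    .webp({ quality: 80 })
    .toBuffer({ resolveWithObject: true });

  const compressedKey = originalKey
    .replace(/^originals\//, "compressed/")
    .replace(/\.\w+$/, ".webp");

  await s3.send(new PutObjectCommand({
    Bucket: bucket,
    Key: compressedKey,
    Body: data,
    ContentType: "image/webp",
  }));

  // Metadata (dimensions, sizes, keys, timestamps) is then persisted via Drizzle ORM.
  return {
    compressedKey,
    width: info.width,
    height: info.height,
    originalBytes: inputBuffer.byteLength,
    compressedBytes: info.size,
  };
}
```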

AI-Driven Features

  1. Text-to-Image Generation

    • Uses Meta Llama to refine the user’s description into an optimal prompt
    • Feeds prompt to Titan to generate a 512×512 PNG
    • Uploads the generated image to S3 and runs the compression pipeline
  2. Alt-Text Generation

    • Retrieves a compressed image from S3
    • Streams it into Meta Llama-4 with a request for detailed alt-text
    • Returns the generated description via SSE or JSON
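
A minimal sketch of the text-to-image step above, assuming the Titan Image Generator on Bedrock; the model ID and request shape here are assumptions rather than the exact deployment:

```ts
import { BedrockRuntimeClient, InvokeModelCommand } from "@aws-sdk/client-bedrock-runtime";

const bedrock = new BedrockRuntimeClient({});

// Generate a 512×512 PNG from an (already Llama-refined) prompt.
export async function generateImage(refinedPrompt: string): Promise<Buffer> {
  const response = await bedrock.send(new InvokeModelCommand({
    modelId: "amazon.titan-image-generator-v1", // assumed model ID
    contentType: "application/json",
    accept: "application/json",
    body: JSON.stringify({
      taskType: "TEXT_IMAGE",
      textToImageParams: { text: refinedPrompt },
      imageGenerationConfig: { numberOfImages: 1, width: 512, height: 512 },
    }),
  }));

  const payload = JSON.parse(new TextDecoder().decode(response.body));
  // Titan returns base64-encoded images; this buffer is then uploaded to S3
  // and pushed through the same compression pipeline as user uploads.
  return Buffer.from(payload.images[0], "base64");
}
```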

Delivery & Transformations

  • CloudFront
    All images are served through a CloudFront distribution.

  • REST Endpoints

    • Direct redirects to compressed or original images (HTML or Open Graph use)
    • Dynamic URL-based transforms (resize, crop, blur, sharpen, format conversion, grayscale) with Sharp-generated variants cached in S3
    • Free-form transform POST that returns original & optimized metadata in JSON
    • Workspace image listings with cursor-based pagination (Zod-validated query params)
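
A minimal sketch of a dynamic transform route, assuming a Hono app, a query-parameter scheme of my own choosing, and a transforms/ cache prefix in S3:

```ts
import { Hono } from "hono";
import sharp from "sharp";
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";

const app = new Hono();
const s3 = new S3Client({});
const BUCKET = process.env.BUCKET_NAME!; // assumed environment variable

// e.g. GET /images/:id/transform?w=400&grayscale=1
app.get("/images/:id/transform", async (c) => {
  const id = c.req.param("id");
  const w = Number(c.req.query("w")) || undefined;
  const h = Number(c.req.query("h")) || undefined;
  const grayscale = c.req.query("grayscale") === "1";

  // The cache key encodes the transform, so repeat requests are served from
  // S3 (and ultimately CloudFront) instead of re-running Sharp.
  const cacheKey = `transforms/${id}_w${w ?? "auto"}_h${h ?? "auto"}${grayscale ? "_gs" : ""}.webp`;
  const headers = { "Content-Type": "image/webp" };

  try {
    const cached = await s3.send(new GetObjectCommand({ Bucket: BUCKET, Key: cacheKey }));
    return new Response(await cached.Body!.transformToByteArray(), { headers });
  } catch {
    // Cache miss: transform the compressed original and persist the variant.
    const source = await s3.send(new GetObjectCommand({ Bucket: BUCKET, Key: `compressed/${id}.webp` }));
    let pipeline = sharp(Buffer.from(await source.Body!.transformToByteArray()));
    if (w || h) pipeline = pipeline.resize(w, h);
    if (grayscale) pipeline = pipeline.grayscale();
    const output = await pipeline.webp({ quality: 80 }).toBuffer();

    await s3.send(new PutObjectCommand({ Bucket: BUCKET, Key: cacheKey, Body: output, ContentType: "image/webp" }));
    return new Response(output, { headers });
  }
});

export default app;
```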

Infrastructure & Security

  • CORS enabled on every route for cross-origin clients
  • API Gateway with API-key protection for all routes except the image route
  • nanoid for generating unique public IDs
  • IAM Roles grant least-privilege access to S3 and Bedrock
  • Scalable, pay-per-use pipeline for image creation, optimization, metadata management, and delivery
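
As a hedged CDK sketch of that setup (construct names, asset paths, and the exact policy scoping are assumptions of mine):

```ts
import * as cdk from "aws-cdk-lib";
import * as apigateway from "aws-cdk-lib/aws-apigateway";
import * as iam from "aws-cdk-lib/aws-iam";
import * as lambda from "aws-cdk-lib/aws-lambda";
import * as s3 from "aws-cdk-lib/aws-s3";
import { Construct } from "constructs";

export class LambdaImagesStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const imageBucket = new s3.Bucket(this, "ImageBucket");

    // Node.js 22 Lambda on ARM64; the native Sharp layer would be attached here as well.
    const apiHandler = new lambda.Function(this, "ApiHandler", {
      runtime: lambda.Runtime.NODEJS_22_X,
      architecture: lambda.Architecture.ARM_64,
      handler: "index.handler",
      code: lambda.Code.fromAsset("dist/api"), // assumed build output path
      environment: { BUCKET_NAME: imageBucket.bucketName },
    });

    // REST API with CORS; every method requires an API key except the public
    // image route, which would override apiKeyRequired.
    const api = new apigateway.LambdaRestApi(this, "ImagesApi", {
      handler: apiHandler,
      defaultCorsPreflightOptions: { allowOrigins: apigateway.Cors.ALL_ORIGINS },
      defaultMethodOptions: { apiKeyRequired: true },
    });

    const key = api.addApiKey("ClientKey");
    const plan = api.addUsagePlan("DefaultPlan", {
      apiStages: [{ api, stage: api.deploymentStage }],
    });
    plan.addApiKey(key);

    // Least-privilege access: S3 read/write plus invoking Bedrock models only.
    imageBucket.grantReadWrite(apiHandler);
    apiHandler.addToRolePolicy(new iam.PolicyStatement({
      actions: ["bedrock:InvokeModel", "bedrock:InvokeModelWithResponseStream"],
      resources: ["arn:aws:bedrock:*::foundation-model/*"], // narrowed to specific model ARNs in practice
    }));
  }
}
```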

How we built it

The front-end of the application was created with Next.js (TypeScript), leveraging:

  • Server-Side Rendering & ISR for fast first-load and incremental updates
  • React Hooks & SWR for seamless client-side data fetching and cache invalidation
  • Tailwind CSS (and shadcn/ui components) for a utility-first, consistent design system
  • Drizzle ORM to communicate with the database
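
A minimal sketch of the SWR side, assuming a hypothetical /api/workspaces/.../images proxy route and a response shape of my own invention:

```ts
// hooks/useWorkspaceImages.ts (hypothetical file)
import useSWR from "swr";

interface WorkspaceImage {
  publicId: string;
  compressedUrl: string;
  width: number;
  height: number;
}

interface ImagesPage {
  images: WorkspaceImage[];
  nextCursor: string | null;
}

const fetcher = (url: string) => fetch(url).then((res) => res.json() as Promise<ImagesPage>);

// Fetch one page of workspace images through the Next.js proxy route,
// which injects the API key server-side before calling API Gateway.
export function useWorkspaceImages(workspaceId: string, cursor?: string) {
  const query = cursor ? `?cursor=${encodeURIComponent(cursor)}` : "";
  const { data, error, isLoading, mutate } = useSWR(
    `/api/workspaces/${workspaceId}/images${query}`,
    fetcher
  );
  return { page: data, error, isLoading, refresh: mutate };
}
```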

All image optimization and AI communication happens in AWS Lambda functions (TypeScript) deployed via AWS CDK:

  • AWS CDK provisions:
    • Node.js 22.x Lambdas with a native Sharp Layer for ARM64, handling on-the-fly compression and transforms
    • API Gateway (REST) with CORS and API-Key auth.
    • S3 buckets (original vs. compressed/transformed folders) and a CloudFront CDN fronting them
    • IAM roles scoped to S3 and Amazon Bedrock model ARNs for least-privilege access
  • Hono framework in each Lambda:
    • Compression routes that download from S3, re-encode to WebP (80% quality), upload back, and record metadata (Drizzle ORM on Aurora Serverless)
    • Dynamic URL-based transforms (resize, crop, blur, sharpen, format, grayscale) with Sharp and cached variants in S3
    • AI-powered endpoints:
      • Text-to-Image: Meta Llama → Titan image generator → S3 upload → compression pipeline
      • Alt-Text Generation: Stream a compressed image into Llama-4 via ConverseCommand / ConverseStreamCommand → return detailed descriptions
  • Drizzle ORM for type-safe queries and migrations against PostgreSQL
  • Nanoid for collision-resistant public IDs
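
A minimal sketch of the alt-text call above via the Bedrock Converse API, with an assumed model ID:

```ts
import { BedrockRuntimeClient, ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

const bedrock = new BedrockRuntimeClient({});

// Ask a Llama model on Bedrock to describe a compressed image pulled from S3.
export async function generateAltText(imageBytes: Uint8Array): Promise<string> {
  const response = await bedrock.send(new ConverseCommand({
    modelId: "meta.llama4-maverick-17b-instruct-v1:0", // assumed model ID
    messages: [{
      role: "user",
      content: [
        { text: "Write detailed, concise alt-text for this image." },
        { image: { format: "webp", source: { bytes: imageBytes } } },
      ],
    }],
  }));

  // The Converse API returns the assistant message as content blocks.
  return response.output?.message?.content?.[0]?.text ?? "";
}
```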

This stack delivers a fully serverless, scalable, and maintainable image-management platform, separating a modern Next.js front end from event-driven AWS Lambdas.

Challenges I ran into

  • Using Sharp
    Sharp relies on native binaries that must be compiled for the Lambda runtime's underlying architecture. When I initially tried to bundle it directly, I ran into errors, so I had to use a native Sharp layer built for ARM64.

  • Introducing an API Key on API Gateway
    When I first enabled API-key authentication for the Lambda-backed endpoints, all of my client-side fetch calls immediately started failing: none of them included the x-api-key header, and exposing that key in the browser is a security no-no. Since I hadn't worked with API Gateway's auth model before, I didn't realize the implications, so I ended up refactoring every request from the frontend into Next.js server-side API routes. Those routes now inject the API key securely and proxy the calls to the Lambdas.
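
A minimal sketch of one of those proxy routes, assuming the App Router route-handler style and environment variable names of my own choosing:

```ts
// app/api/images/[id]/route.ts (hypothetical path)
import { NextRequest, NextResponse } from "next/server";

const API_BASE = process.env.API_GATEWAY_URL!; // assumed env vars, kept server-side only
const API_KEY = process.env.API_GATEWAY_KEY!;

// Proxy the browser's request to API Gateway, injecting the x-api-key header
// so the key never reaches the client.
export async function GET(req: NextRequest, { params }: { params: { id: string } }) {
  const upstream = await fetch(`${API_BASE}/images/${params.id}${req.nextUrl.search}`, {
    headers: { "x-api-key": API_KEY },
  });
  const body = await upstream.json();
  return NextResponse.json(body, { status: upstream.status });
}
```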

Accomplishments that I am proud of

  • Fully Serverless Pipeline
    Deployed a zero-server, pay-per-use architecture using AWS Lambda, API Gateway, S3, CloudFront, and Amplify, eliminating server management and scaling automatically under load.
  • High-Performance Image Processing
    Achieved sub-second end-to-end image compression and on-the-fly transformations by leveraging ARM64-optimized Sharp layers, keeping cold starts low and function bundles lean.
  • AI Creativity
    Integrated a two-phase Bedrock workflow: Meta Llama for prompt refinement, Titan for image generation, plus Llama-4 for detailed alt-text—delivering rich, automated media creation and accessibility features.
  • Dynamic Caching & CDN
    Automated caching rules via S3 key conventions and CloudFront, so once a transformation is generated it’s served from the edge with immutable long-term cache headers.
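
The idea, sketched under assumed key and bucket names, is that each generated variant is uploaded once with immutable long-term cache headers so CloudFront can keep serving it from the edge:

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({});

// Store a generated variant with immutable long-term caching, so CloudFront
// and browsers never need to revalidate it (the key encodes the transform).
export async function storeVariant(bucket: string, key: string, body: Buffer) {
  await s3.send(new PutObjectCommand({
    Bucket: bucket,
    Key: key,
    Body: body,
    ContentType: "image/webp",
    CacheControl: "public, max-age=31536000, immutable",
  }));
}
```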

What I learned

I am really proud to have gained a deeper understanding of AWS Lambda's functionality and how to integrate it with API Gateway to create secure endpoints that can run and scale heavy compute workloads effortlessly.

What's next for Lambda Images

After the judging period, I plan to add more functionality to the optimization endpoint so that it accepts more parameters.
