
Quickstart

In this quickstart you’ll start LocalStack and deploy a simple serverless API — a Lambda function backed by DynamoDB — entirely on your local machine. No AWS account needed.

By the end you will have:

  • LocalStack running locally in Docker
  • A Lambda function deployed and invokable via a public URL
  • A DynamoDB table storing data written by the Lambda
  • Confirmed that your local environment behaves like real AWS

Choose your preferred deployment style: the steps below use the AWS CLI (awslocal); the same flow works with Terraform (tflocal).

The fastest way to get LocalStack running locally is with lstk, a lightweight CLI that handles authentication and image setup automatically.

Install lstk:

```bash
brew install localstack/tap/lstk   # macOS / Linux with Homebrew
npm install -g @localstack/lstk    # or via npm
```

Then start LocalStack:

```bash
lstk start
```

On first run, lstk opens a browser login to authenticate, then pulls the image and starts the container automatically.

Wait for the container to report ready — you’ll see a log line like Ready. or you can verify with:

```bash
curl -s http://localhost:4566/_localstack/health | grep '"running"'
```
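If you script your setup, you can poll the same health endpoint instead of grepping. A minimal sketch in Python, assuming the default edge port 4566 and the service states current LocalStack versions report ("available" until a service is first used, then "running"):

```python
import json
import time
import urllib.request

HEALTH_URL = "http://localhost:4566/_localstack/health"

def is_ready(health):
    """True when every service in the health payload is usable."""
    services = health.get("services", {})
    return bool(services) and all(
        state in ("available", "running") for state in services.values()
    )

def wait_for_localstack(timeout=60):
    """Poll the health endpoint until LocalStack reports ready, or time out."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(HEALTH_URL) as resp:
                if is_ready(json.loads(resp.read())):
                    return True
        except OSError:
            pass  # container not accepting connections yet
        time.sleep(2)
    return False
```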

Now deploy a Lambda function and a DynamoDB table.

Install the awslocal wrapper if you haven’t already:

```bash
pip install awscli-local
```

Create the Lambda function:

```bash
mkdir -p /tmp/localstack-demo
cat > /tmp/localstack-demo/handler.py << 'EOF'
import json, boto3, os, uuid

def handler(event, context):
    table = boto3.resource('dynamodb').Table(os.environ['TABLE_NAME'])
    method = event.get('requestContext', {}).get('http', {}).get('method', 'GET')
    if method == 'POST':
        item = {'id': str(uuid.uuid4()), **json.loads(event.get('body', '{}'))}
        table.put_item(Item=item)
        return {'statusCode': 200, 'body': json.dumps(item)}
    result = table.scan()
    return {'statusCode': 200, 'body': json.dumps(result['Items'])}
EOF
cd /tmp/localstack-demo && zip handler.zip handler.py
```
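The handler routes on the HTTP method nested inside the Function URL event (payload format 2.0). That extraction is pure Python, so you can sanity-check it before deploying — a quick sketch:

```python
def http_method(event):
    """Pull the HTTP method out of a Lambda Function URL event,
    defaulting to GET when the shape is missing (e.g. a direct invoke)."""
    return event.get("requestContext", {}).get("http", {}).get("method", "GET")

post_event = {"requestContext": {"http": {"method": "POST"}}, "body": "{}"}
assert http_method(post_event) == "POST"
assert http_method({}) == "GET"  # bare invoke falls back to the scan path
```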

Create the DynamoDB table:

```bash
awslocal dynamodb create-table \
  --table-name Messages \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST
```
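If you prefer creating the table from Python, the same arguments work through boto3 pointed at LocalStack's edge endpoint. A sketch, assuming the default http://localhost:4566 (the live create_table call is left commented so the snippet runs without a container):

```python
def messages_table_spec(name="Messages"):
    """Keyword arguments for create_table matching the awslocal command above."""
    return {
        "TableName": name,
        "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    }

# With LocalStack running:
#   import boto3
#   dynamodb = boto3.resource("dynamodb", endpoint_url="http://localhost:4566")
#   dynamodb.create_table(**messages_table_spec())
```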

Deploy the Lambda:

```bash
awslocal lambda create-function \
  --function-name messages-api \
  --runtime python3.12 \
  --handler handler.handler \
  --zip-file fileb:///tmp/localstack-demo/handler.zip \
  --role arn:aws:iam::000000000000:role/lambda-role \
  --environment Variables={TABLE_NAME=Messages}
awslocal lambda wait function-active --function-name messages-api
```

Create a public function URL:

```bash
awslocal lambda create-function-url-config \
  --function-name messages-api \
  --auth-type NONE
```

Retrieve the URL:

```bash
LAMBDA_URL=$(awslocal lambda list-function-url-configs \
  --function-name messages-api \
  --query 'FunctionUrlConfigs[0].FunctionUrl' \
  --output text)
echo "$LAMBDA_URL"
```

Store a message:

```bash
curl -X POST "$LAMBDA_URL" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, LocalStack!"}'
```

You should get back a response like:

```json
{ "id": "a1b2c3d4-...", "message": "Hello, LocalStack!" }
```

List all messages:

```bash
curl "$LAMBDA_URL"
```
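The two curl calls above fold naturally into a repeatable smoke test. A sketch using only the standard library, assuming LocalStack is running and LAMBDA_URL is exported as shown earlier (the live round trip is left commented):

```python
import json
import os
import urllib.request

def post_message(base_url, payload):
    """POST a JSON payload to the Function URL and return the decoded reply."""
    req = urllib.request.Request(
        base_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def message_ids(items):
    """Collect the ids from a list-messages response body."""
    return [item["id"] for item in items]

# With LocalStack running and LAMBDA_URL exported:
#   url = os.environ["LAMBDA_URL"]
#   created = post_message(url, {"message": "smoke test"})
#   with urllib.request.urlopen(url) as resp:
#       items = json.loads(resp.read())
#   assert created["id"] in message_ids(items)
```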

That’s the win. You just invoked a real Lambda function that wrote to a real DynamoDB table — all running locally, with no AWS account and no cloud costs.

You can browse the resources you just deployed in the LocalStack Web Application. Navigate to your Default Instance and click through to Lambda or DynamoDB to see your running infrastructure.

When you’re done, stop LocalStack to tear down all local resources:

```bash
lstk stop        # if using lstk
localstack stop  # if using the LocalStack CLI
```

LocalStack is ephemeral by default — stopping it removes all provisioned resources. To persist state across restarts, see Persistence or Cloud Pods.

  • Tutorials — Deeper dives into specific AWS services and application stacks
  • Supported Services — Full list of emulated AWS services
  • CI/CD Setup — Run LocalStack in GitHub Actions and other pipelines
  • AI & Agent Workflows — Use LocalStack with AI coding tools and agents
  • Tooling — awslocal, tflocal, LocalStack Desktop, and more