AWS recently announced Lambda Managed Instances, a new deployment option that runs Lambda functions on customer-owned EC2 instances while AWS manages the operational aspects. This brings an important change to the Lambda programming model.
What's Different: Multi-Concurrent Invocations
Unlike traditional Lambda functions where one execution environment handles one invocation at a time, Lambda Managed Instances support multi-concurrent invocations. This means one execution environment can handle multiple invocations simultaneously, each processed by a different runtime worker. As AWS describes it:
Multiple invocations can execute simultaneously within the same execution environment, each handled by a different runtime worker.
The execution environment remains continuously active, processing invocations as they arrive without freezing between invocations. This yields better utilization of underlying EC2 instances.
Lambda Managed Instances are best suited for:
- High-volume, predictable workloads - Steady-state workloads without unexpected traffic spikes
- Performance-critical applications - Access to the latest CPUs, varying memory-to-CPU ratios, and high network throughput
- Regulatory requirements - Granular governance with control over VPC and instance placement
- A variety of applications - Event-driven applications, media/data processing, web applications, and legacy workloads migrating to serverless
Swift Runtime: Built for Concurrency from Day One
Good news: the Swift AWS Lambda Runtime was designed with concurrency support from the beginning. However, in its current form, the Swift runtime handles invocations in sequence.
We've tested it with Lambda Managed Instances and confirmed it works as expected, handling multiple invocations correctly without cross-contamination between requests.
In the future, we might consider adding support for concurrent invocations (handling multiple requests in parallel), depending on user feedback and demand. There are multiple strategies to implement this: launching multiple Swift processes, each polling for events and handling them sequentially within its own process, or creating a Task when an event is received and running your handler inside that Task.
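The second strategy mentioned above, a Task per event, could be sketched as follows. This is a minimal illustration, not runtime code: `Event`, `pollNextEvent()`, and `handle(_:)` are hypothetical stand-ins for the runtime's actual polling and handler APIs.

```swift
// A hypothetical event type; in the real runtime this would carry
// the invocation payload and context.
struct Event: Sendable {
    let requestID: String
    let payload: String
}

// Stand-in for long-polling the Lambda runtime API for the next event.
// Returns nil here so the sketch terminates immediately.
func pollNextEvent() async -> Event? {
    nil
}

// Stand-in for the user's handler.
func handle(_ event: Event) async -> String {
    "processed \(event.requestID)"
}

// Instead of awaiting each handler before polling again, spawn a
// child task per event so invocations run concurrently within the
// same execution environment.
func runLoop() async {
    await withTaskGroup(of: Void.self) { group in
        while let event = await pollNextEvent() {
            group.addTask {
                _ = await handle(event)
            }
        }
    }
}
```

The task group ensures all in-flight invocations finish before the loop returns, which matters for clean shutdown of the execution environment.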
Important Considerations for Your Code
While the runtime handles concurrency safely, you need to ensure your function code is concurrency-safe:
1. Respect Swift Concurrency Safety
If you're using workarounds to bypass Swift's concurrency safety checks (such as @unchecked Sendable or unsafe transfer mechanisms), your code may break in a multi-concurrent environment. The compiler's concurrency safety features exist for good reasons—trust them.
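As a concrete illustration of letting the compiler help you: rather than marking a shared counter `@unchecked Sendable`, model it as an actor so concurrent invocations serialize their access. This is a minimal sketch with a hypothetical `InvocationCounter` type, not part of the runtime's API.

```swift
// An actor protects its mutable state: the compiler guarantees that
// concurrent invocations cannot race on `count`, with no unsafe
// annotations required.
actor InvocationCounter {
    private var count = 0

    func increment() -> Int {
        count += 1
        return count
    }
}

// Each concurrent invocation can safely call:
//     let n = await counter.increment()
```

The same pattern applies to caches, metrics, or any other state that outlives a single invocation.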
2. Avoid Shared State Between Invocations
Each invocation must be isolated. Pay special attention to:
- File operations: Use unique file names for each invocation. Include the request ID in file names to prevent conflicts:
let fileName = "/tmp/\(context.requestID)-data.json"
- Global variables: Avoid mutable global state that could be accessed by multiple concurrent invocations
- Shared resources: Database connections, file handles, and other resources must be managed with concurrency in mind
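For the shared-resources point, one concurrency-safe approach is to put the resource behind an actor. The sketch below assumes a hypothetical `DatabaseConnection` type standing in for your real client; the actor serializes checkout and checkin so several invocations in the same execution environment cannot hand out the same connection twice.

```swift
// Hypothetical stand-in for a real database client.
struct DatabaseConnection: Sendable {
    func query(_ sql: String) async -> [String] { [] }
}

// The actor guarantees that pool bookkeeping is serialized, even
// when multiple invocations run concurrently.
actor ConnectionPool {
    private var available: [DatabaseConnection]

    init(size: Int) {
        available = (0..<size).map { _ in DatabaseConnection() }
    }

    // Checks out a connection, runs the body, and returns the
    // connection to the pool. Returns nil if the pool is exhausted.
    func withConnection<T: Sendable>(
        _ body: (DatabaseConnection) async -> T
    ) async -> T? {
        guard let connection = available.popLast() else { return nil }
        let result = await body(connection)
        available.append(connection)
        return result
    }
}
```

A production pool would also wait for a free connection rather than returning nil, but the isolation principle is the same.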
Getting Started
The Swift AWS Lambda Runtime works seamlessly with Lambda Managed Instances—no code changes required if your function is already concurrency-safe. Simply configure your Lambda function with a capacity provider and deploy.
For more details on Lambda Managed Instances and the multi-concurrent execution model, see the AWS documentation.