


ML Development Environment Architecture

A serverless, fully managed environment that lets data scientists move from raw data to a low-latency prediction API in minutes using Amazon SageMaker Studio, Amazon S3, and Amazon SageMaker real-time endpoints. Security, cost control, and CI/CD best practices are built into every layer.
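The deployment step described above can be sketched with boto3. This is a minimal illustration, not the document's actual pipeline: the model name, instance type, and variant name are hypothetical placeholders, and the `deploy` function (which requires AWS credentials and an existing SageMaker model) is shown but not invoked here.

```python
"""Hedged sketch: creating a SageMaker real-time endpoint with boto3.
All resource names below are hypothetical examples."""
import json


def build_endpoint_config(model_name: str,
                          instance_type: str = "ml.m5.large") -> dict:
    # The request shape expected by boto3's create_endpoint_config:
    # one production variant serving all traffic on a single instance.
    return {
        "EndpointConfigName": f"{model_name}-config",
        "ProductionVariants": [
            {
                "VariantName": "AllTraffic",
                "ModelName": model_name,
                "InitialInstanceCount": 1,
                "InstanceType": instance_type,
            }
        ],
    }


def deploy(model_name: str) -> None:
    # Actual API calls (need credentials and a registered model);
    # included for illustration only, not executed in this sketch.
    import boto3
    sm = boto3.client("sagemaker")
    sm.create_endpoint_config(**build_endpoint_config(model_name))
    sm.create_endpoint(
        EndpointName=f"{model_name}-endpoint",
        EndpointConfigName=f"{model_name}-config",
    )


# Inspect the config locally without touching AWS.
config = build_endpoint_config("churn-xgboost")
print(json.dumps(config, indent=2))
```

Once the endpoint is `InService`, clients call it through the `sagemaker-runtime` client's `invoke_endpoint`, which is what backs the low-latency prediction API mentioned above.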