Aretove Reviews Amazon SageMaker

Aretove reviews Amazon SageMaker for running various machine learning algorithms in a production environment, beyond Jupyter notebooks. AWS services run the gamut from basic EC2 instances to full-blown machine learning tools and the Internet of Things. In 2017, AWS claimed 57 percent of the public cloud market, and Amazon SageMaker played a major role in how algorithms were implemented. It brought the power of the cloud to Data Scientists through a range of built-in algorithms, while also allowing them to bring their own deep learning frameworks such as TensorFlow, MXNet, PyTorch, and Caffe2.

SageMaker makes it easy to build, train, and deploy machine learning models at scale, letting a Data Scientist go beyond Jupyter notebooks with little effort. With Machine Learning as a Service (MLaaS), AWS gives Data Scientists at small and medium-sized businesses the tools to explore a variety of machine learning algorithms in pursuit of their business goals.


To use Amazon SageMaker, you follow a series of steps: create a notebook instance, define jobs, create models, and train them. Finally, when your tests are complete, you can deploy the endpoints for use in production.

Create Notebook Instance

Once instantiated, the notebook instance comes with 5 GB of storage by default. Open the notebook to start using it. From there you can either explore the existing Amazon SageMaker examples or write your own code in the notebook. As in a regular Jupyter notebook, you can import various Python-based ML libraries (e.g., NumPy, pandas, matplotlib) to use alongside the SageMaker libraries.
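As a minimal sketch of that workflow, here is the kind of exploratory cell you might run first, assuming NumPy and pandas are available in the notebook's kernel (the tiny inline dataset is a stand-in for data you would normally pull from S3):

```python
import numpy as np
import pandas as pd

# A tiny illustrative dataset, standing in for data downloaded from S3.
df = pd.DataFrame({
    "units_sold": [120, 95, 143, 110],
    "price": [9.99, 12.49, 8.75, 10.25],
})

# Quick exploration, exactly as in any Jupyter notebook.
df["revenue"] = df["units_sold"] * df["price"]
print(df.describe())
print("total revenue:", round(df["revenue"].sum(), 2))
```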

Common steps for most implementations

1. Prepare your notebook.

2. Download the training and test data into the Amazon SageMaker notebook. You can refer to any file hosted on S3, or anywhere on the internet over HTTP.

3. Investigate and transform the data to suit the chosen algorithm.

4. Estimate a model based on the algorithm.

5. Evaluate the effectiveness of the model.

6. Set the model up for ongoing predictions.
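The modeling steps above can be sketched in plain NumPy as well; the outline below mirrors steps 2 through 5, with synthetic data standing in for a real download and ordinary least squares standing in for a SageMaker built-in algorithm:

```python
import numpy as np

# Steps 2-3: synthetic data standing in for a dataset fetched from S3/HTTP.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=100)

# Hold out a test split.
X_train, X_test = X[:80], X[80:]
y_train, y_test = y[:80], y[80:]

# Step 4: estimate a model (least squares with a bias column appended).
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Step 5: evaluate effectiveness on the held-out data.
A_test = np.hstack([X_test, np.ones((len(X_test), 1))])
rmse = np.sqrt(np.mean((A_test @ coef - y_test) ** 2))
print("slope:", round(coef[0], 2), "intercept:", round(coef[1], 2), "rmse:", round(rmse, 3))
```

In SageMaker the estimation and evaluation steps would typically run as training jobs on managed instances rather than inside the notebook kernel, but the shape of the workflow is the same.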


Once you have trained the model, you can deploy it behind a real-time endpoint. You pass the instance type (e.g., ml.m4.xlarge) to the deploy method. The initial instance count defines the minimum number of instances to spin up for this job; based on load, the endpoint can then auto-scale.
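A sketch of calling such an endpoint from client code, assuming the `boto3` SDK is installed and credentials are configured; the endpoint name `my-endpoint` and the feature values are placeholders, and nothing contacts AWS until `invoke` is actually called:

```python
import json

def to_csv_payload(features):
    """Serialize a feature vector as the CSV body many built-in algorithms accept."""
    return ",".join(str(f) for f in features)

def invoke(endpoint_name, features):
    # boto3 and live AWS credentials are assumed here.
    import boto3
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="text/csv",
        Body=to_csv_payload(features),
    )
    return json.loads(response["Body"].read())

# Example (requires a deployed endpoint):
# prediction = invoke("my-endpoint", [4.2, 1.3, 0.7])
```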

SageMaker Deploy Job


Implementing machine learning systems with Amazon SageMaker is one of the core initiatives behind the Data Science services Aretove provides. It will be interesting to apply algorithms such as XGBoost and DeepAR for time-series forecasting to prediction and forecasting problems in retail and eCommerce. Stay tuned for more updates on this…
