
Configure cross-account access to Amazon Redshift clusters in Amazon SageMaker Studio using VPC peering


With cloud computing, as compute power and data became more available, machine learning (ML) is now making an impact across every industry and is a core part of every business.

Amazon SageMaker Studio is the first fully integrated ML development environment (IDE) with a web-based visual interface. You can perform all ML development steps and have complete access, control, and visibility into each step required to build, train, and deploy models.

Amazon Redshift is a fully managed, fast, secure, and scalable cloud data warehouse. Organizations often want to use SageMaker Studio to get predictions from data stored in a data warehouse such as Amazon Redshift.

As described in the AWS Well-Architected Framework, separating workloads across accounts enables your organization to set common guardrails while isolating environments. This can be particularly helpful for certain security requirements, as well as to simplify cost controls and monitoring between projects and teams. Organizations with a multi-account architecture typically have Amazon Redshift and SageMaker Studio in two separate AWS accounts. As a best practice, Amazon Redshift and SageMaker Studio are also typically configured in VPCs with private subnets to improve security and reduce the risk of unauthorized access.

Amazon Redshift natively supports cross-account data sharing when RA3 node types are used. If you're using any other Amazon Redshift node type, such as DS2 or DC2, you can use VPC peering to establish a cross-account connection between Amazon Redshift and SageMaker Studio.

In this post, we walk through step-by-step instructions to establish a cross-account connection to any Amazon Redshift node type (RA3, DC2, DS2) by connecting the Amazon Redshift cluster located in one AWS account to SageMaker Studio in another AWS account in the same Region using VPC peering.

Solution overview

We start with two AWS accounts: a producer account with the Amazon Redshift data warehouse, and a consumer account for Amazon SageMaker ML use cases that has SageMaker Studio set up. The following is a high-level overview of the workflow:

  1. Set up SageMaker Studio with VPCOnly mode in the consumer account. This prevents SageMaker from providing internet access to your Studio notebooks. All SageMaker Studio traffic is routed through the specified VPC and subnets.
  2. Update your SageMaker Studio domain to turn on SourceIdentity to propagate the user profile name.
  3. Create an AWS Identity and Access Management (IAM) role in the Amazon Redshift producer account that the SageMaker Studio IAM role will assume to access Amazon Redshift.
  4. Update the SageMaker IAM execution role in the SageMaker Studio consumer account that SageMaker Studio will use to assume the role in the producer Amazon Redshift account.
  5. Set up a peering connection between the VPCs in the Amazon Redshift producer account and the SageMaker Studio consumer account.
  6. Query Amazon Redshift in SageMaker Studio in the consumer account.

The following diagram illustrates our solution architecture.

Prerequisites

The steps in this post assume that Amazon Redshift is launched in a private subnet in the Amazon Redshift producer account. Launching Amazon Redshift in a private subnet provides an additional layer of security and isolation compared to launching it in a public subnet, because the private subnet is not directly accessible from the internet and is better protected from external attacks.

To download public libraries, you must create a VPC with a private and a public subnet in the SageMaker consumer account. Then launch a NAT gateway in the public subnet and add a route to it in the private subnet's route table so that SageMaker Studio can access the internet. For instructions, refer to How do I set up a NAT gateway for a private subnet in Amazon VPC?

Set up SageMaker Studio with VPCOnly mode in the consumer account

To create SageMaker Studio with VPCOnly mode, complete the following steps:

  1. On the SageMaker console, choose Studio in the navigation pane.
  2. Launch SageMaker Studio, choose Standard setup, and choose Configure.

If you're already using AWS IAM Identity Center (successor to AWS Single Sign-On) for accessing your AWS accounts, you can use it for authentication. Otherwise, you can use IAM for authentication and use your existing federated roles.

  1. In the General settings section, select Create a new role.
  2. In the Create an IAM role section, optionally specify your Amazon Simple Storage Service (Amazon S3) buckets by selecting Any, Specific, or None, then choose Create role.

This creates a SageMaker execution role, such as AmazonSageMaker-ExecutionRole-00000000.

  1. Under the Network and Storage section, choose the VPC, subnet (private subnet), and security group that you created as a prerequisite.
  2. Select VPC Only, then choose Next.

Update your SageMaker Studio domain to turn on SourceIdentity to propagate the user profile name

SageMaker Studio is integrated with AWS CloudTrail to enable administrators to monitor and audit user activity and API calls from SageMaker Studio notebooks. You can configure SageMaker Studio to record the user identity (specifically, the user profile name) in CloudTrail events.

To log user activity among multiple user profiles, we recommend that you turn on SourceIdentity to propagate the user profile name through the SageMaker Studio domain. This persists the user information in the session so you can attribute actions to a specific user. The attribute is also preserved when you chain roles, so you can get fine-grained visibility into a user's actions in the producer account. At the time this post was written, you can only configure this using the AWS Command Line Interface (AWS CLI) or another command line tool.

To update this configuration, all apps in the domain must be in the Stopped or Deleted state.

Use the following code to enable the propagation of the user profile name as the SourceIdentity:

aws sagemaker update-domain \
    --domain-id <domain-id> \
    --domain-settings-for-update ExecutionRoleIdentityConfig=USER_PROFILE_NAME

This requires that you add sts:SetSourceIdentity in the trust relationship for your execution role.
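As a sketch, the execution role's trust policy would then look similar to the following (the service principal shown is the standard one for SageMaker; adapt it to your setup):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "sagemaker.amazonaws.com"
            },
            "Action": [
                "sts:AssumeRole",
                "sts:SetSourceIdentity"
            ]
        }
    ]
}
```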

Create an IAM role in the Amazon Redshift producer account that SageMaker Studio assumes to access Amazon Redshift

To create a role that SageMaker will assume to access Amazon Redshift, complete the following steps:

  1. Open the IAM console in the Amazon Redshift producer account.

  1. Choose Roles in the navigation pane, then choose Create role.

  1. On the Select trusted entity page, select Custom trust policy.
  2. Enter the following custom trust policy into the editor, providing your SageMaker consumer account ID and the SageMaker execution role that you created:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<<SageMaker-Consumer-Account-ID>>:role/service-role/AmazonSageMaker-ExecutionRole-XXXXXX"
            },
            "Action": [
                "sts:AssumeRole",
                "sts:SetSourceIdentity"
            ]
        }
    ]
}

  1. Choose Next.
  2. On the Add required permissions page, choose Create policy.
  3. Add the following sample policy and make the necessary edits based on your configuration.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "GetRedshiftCredentials",
            "Effect": "Allow",
            "Action": "redshift:GetClusterCredentials",
            "Resource": [
                "arn:aws:redshift:<<redshift-region-name>>:<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:dbname:<<redshift-cluster-name>>/<<redshift-db-name>>",
                "arn:aws:redshift:<<redshift-region-name>>:<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:dbuser:<<redshift-cluster-name>>/${redshift:DbUser}",
                "arn:aws:redshift:<<redshift-region-name>>:<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:cluster:<<redshift-cluster-name>>"
            ],
            "Condition": {
                "StringEquals": {
                    "redshift:DbUser": "${aws:SourceIdentity}"
                }
            }
        },
        {
            "Sid": "DynamicUserCreation",
            "Effect": "Allow",
            "Action": "redshift:CreateClusterUser",
            "Resource": "arn:aws:redshift:<<redshift-region-name>>:<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:dbuser:<<redshift-cluster-name>>/${redshift:DbUser}",
            "Condition": {
                "StringEquals": {
                    "redshift:DbUser": "${aws:SourceIdentity}"
                }
            }
        },
        {
            "Effect": "Allow",
            "Action": "redshift:JoinGroup",
            "Resource": "arn:aws:redshift:<<redshift-region-name>>:<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:dbgroup:<<redshift-cluster-name>>/*"
        },
        {
            "Sid": "DataAPIPermissions",
            "Effect": "Allow",
            "Action": [
                "redshift-data:ExecuteStatement",
                "redshift-data:CancelStatement",
                "redshift-data:ListStatements",
                "redshift-data:GetStatementResult",
                "redshift-data:DescribeStatement",
                "redshift-data:ListDatabases",
                "redshift-data:ListSchemas",
                "redshift-data:ListTables",
                "redshift-data:DescribeTable"
            ],
            "Resource": "*"
        },
        {
            "Sid": "ReadPermissions",
            "Effect": "Allow",
            "Action": [
                "redshift:Describe*",
                "redshift:ViewQueriesInConsole"
            ],
            "Resource": "*"
        }
    ]
}

  1. Save the policy with a name, such as RedshiftROAPIUserAccess.

The SourceIdentity attribute is used to tie the identity of the original SageMaker Studio user to the Amazon Redshift database user. The user's actions in the producer account can then be monitored using CloudTrail and Amazon Redshift database audit logs.

  1. On the Name, review, and create page, enter a role name, review the settings, and choose Create role.

Update the IAM role in the SageMaker consumer account that SageMaker Studio assumes in the Amazon Redshift producer account

To update the SageMaker execution role so that it can assume the role we just created, complete the following steps:

  1. Open the IAM console in the SageMaker consumer account.
  2. Choose Roles in the navigation pane, then choose the SageMaker execution role that we created (AmazonSageMaker-ExecutionRole-*).
  3. In the Permissions policy section, on the Add permissions menu, choose Create inline policy.

  1. In the editor, on the JSON tab, enter the following policy, where <StudioRedshiftRoleARN> is the ARN of the role you created in the Amazon Redshift producer account:
{
    "Version": "2012-10-17",
    "Statement": {
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": "<StudioRedshiftRoleARN>"
    }
}

You can get the ARN of the role created in the Amazon Redshift producer account on the IAM console, as shown in the following screenshot.

  1. Choose Review policy.
  2. For Name, enter a name for your policy.
  3. Choose Create policy.

Your permissions policies should look similar to the following screenshot.

Set up a peering connection between the VPCs in the Amazon Redshift producer account and SageMaker Studio consumer account

To establish communication between the SageMaker Studio VPC and the Amazon Redshift VPC, the two VPCs must be peered using VPC peering. Complete the following steps to establish a connection:

  1. In either the Amazon Redshift or SageMaker account, open the Amazon VPC console.
  2. In the navigation pane, choose Peering connections, then choose Create peering connection.
  3. For Name, enter a name for your connection.
  4. Under Select a local VPC to peer with, choose a local VPC.
  5. Under Select another VPC to peer with, specify another VPC in the same Region and another account.
  6. Choose Create peering connection.

  1. Review the VPC peering connection and choose Accept request to activate it.

After the VPC peering connection is successfully established, you create routes on both the SageMaker and Amazon Redshift VPCs to complete connectivity between them.

  1. In the SageMaker account, open the Amazon VPC console.
  2. Choose Route tables in the navigation pane, then choose the route table that's associated with the SageMaker VPC and edit the routes.
  3. Add the CIDR of the destination Amazon Redshift VPC with the peering connection as the target.
  4. Additionally, add a route to the NAT gateway.
  5. Choose Save changes.

  1. In the Amazon Redshift account, open the Amazon VPC console.
  2. Choose Route tables in the navigation pane, then choose the route table that's associated with the Amazon Redshift VPC and edit the routes.
  3. Add the CIDR of the destination SageMaker VPC with the peering connection as the target.
  4. Additionally, add a route to the internet gateway.
  5. Choose Save changes.
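One prerequisite worth checking before you peer: the two VPCs' CIDR blocks must not overlap, or the peering routes will conflict. A quick local check with Python's ipaddress module (the CIDRs below are placeholders for your SageMaker and Amazon Redshift VPC ranges):

```python
import ipaddress

# Placeholder CIDR ranges -- replace with your actual VPC CIDRs.
sagemaker_vpc = ipaddress.ip_network("10.0.0.0/16")
redshift_vpc = ipaddress.ip_network("172.31.0.0/16")

# VPC peering requires non-overlapping CIDR blocks.
if sagemaker_vpc.overlaps(redshift_vpc):
    raise ValueError("VPC CIDRs overlap; choose disjoint ranges before peering")
print("CIDRs are disjoint; safe to peer")
```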

You can connect to SageMaker Studio from your VPC through an interface endpoint in your VPC instead of connecting over the internet. When you use a VPC interface endpoint, communication between your VPC and the SageMaker API or runtime is conducted entirely and securely within the AWS network.

  1. To create a VPC endpoint, in the SageMaker account, open the Amazon VPC console.
  2. Choose Endpoints in the navigation pane, then choose Create endpoint.
  3. Specify the SageMaker VPC, the respective subnets, and appropriate security groups to allow inbound and outbound NFS traffic for your SageMaker Studio domain, then choose Create VPC endpoint.

Query Amazon Redshift in SageMaker Studio in the consumer account

After all the networking has been successfully established, follow the steps in this section to connect to the Amazon Redshift cluster in the SageMaker Studio consumer account using the AWS SDK for pandas library:

  1. In SageMaker Studio, create a new notebook.
  2. If the AWS SDK for pandas package is not installed, you can install it with the following:
!pip install awswrangler  # AWS SDK for pandas

This installation is not persistent and will be lost if the KernelGateway app is deleted. Custom packages can be added as part of a Lifecycle Configuration.

  1. Enter the following code in the first cell and run it. Replace the RoleArn and region_name values based on your account settings:
import boto3
import awswrangler as wr
import pandas as pd
from datetime import datetime
import json

sts_client = boto3.client('sts')

# Call the assume_role method of the STS client and pass the role
# ARN and a role session name.
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::<<REDSHIFT-PRODUCER-ACCOUNT-ID>>:role/<<redshift-account-role>>",
    RoleSessionName="RedshiftSession"
)
credentials = assumed_role_object['Credentials']

# Use the temporary credentials that AssumeRole returns to create
# a boto3 session for connecting to Amazon Redshift
redshift_session = boto3.Session(
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
    region_name="<<redshift-region-name>>",
)

  1. Enter the following code in a new cell and run it to get the current SageMaker user profile name:
def get_userprofile_name():
    # SageMaker Studio writes app metadata, including the user profile
    # name, to this well-known path inside the kernel container.
    metadata_file_path = "/opt/ml/metadata/resource-metadata.json"
    with open(metadata_file_path, 'r') as f:
        metadata = json.load(f)
    return metadata.get("UserProfileName")

  1. Enter the following code in a new cell and run it:
con_redshift = wr.redshift.connect_temp(
    cluster_identifier="<<redshift-cluster-name>>",
    database="<<redshift-db-name>>",
    user=get_userprofile_name(),
    auto_create=True,
    db_groups=[<<list-redshift-user-group>>],
    boto3_session=redshift_session
)

To successfully query Amazon Redshift, your database administrator needs to assign the newly created user the required read permissions within the Amazon Redshift cluster in the producer account.
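For example, the administrator could grant read-only access to the group that you pass in db_groups; the schema and group names below are illustrative:

```sql
-- Run as a database administrator on the producer cluster.
GRANT USAGE ON SCHEMA public TO GROUP analysts;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO GROUP analysts;
```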

  1. Enter the following code in a new cell, update the query to match your Amazon Redshift table, and run the cell. This should return the records successfully for further data processing and analysis.
df = wr.redshift.read_sql_query(
    sql="SELECT * FROM users",
    con=con_redshift
)

You can now start building your data transformations and analysis based on your business requirements.
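For instance, once the query result is in a pandas DataFrame, standard pandas operations apply; the toy DataFrame below stands in for the result of read_sql_query:

```python
import pandas as pd

# Toy stand-in for the DataFrame returned by wr.redshift.read_sql_query.
df = pd.DataFrame({"user_id": [1, 2, 3], "spend": [120.0, 80.5, 230.0]})

# Typical first steps: inspect the shape and compute summary statistics.
print(df.shape)
print(df["spend"].mean())
```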

Clean up

To avoid incurring recurring costs, delete the SageMaker VPC endpoints, Amazon Redshift cluster, and SageMaker Studio apps, users, and domain. Also delete any S3 buckets and objects you created.

Conclusion

In this post, we showed how to establish a cross-account connection between private Amazon Redshift and SageMaker Studio VPCs in different accounts using VPC peering, and how to access Amazon Redshift data in SageMaker Studio using IAM role chaining, while also logging the user identity when a user accesses Amazon Redshift from SageMaker Studio. With this solution, you eliminate the need to manually move data between accounts. We also walked through how to access the Amazon Redshift cluster using the AWS SDK for pandas library in SageMaker Studio and prepare the data for your ML use cases.

To learn more about Amazon Redshift and SageMaker, refer to the Amazon Redshift Database Developer Guide and the Amazon SageMaker documentation.


About the Authors

Supriya Puragundla is a Senior Solutions Architect at AWS. She helps key customer accounts on their AI and ML journey. She is passionate about data-driven AI and deep learning.

Marc Karp is a Machine Learning Architect with the Amazon SageMaker team. He focuses on helping customers design, deploy, and manage ML workloads at scale. In his spare time, he enjoys traveling and exploring new places.

