Scaling RISE with SAP data and AWS Glue


Customers often need to augment and enrich SAP source data with other non-SAP source data. Such analytic use cases can be enabled by building a data warehouse or data lake. Customers can now use the AWS Glue SAP OData connector to extract data from SAP. The SAP OData connector supports both on-premises and cloud-hosted (native and SAP RISE) deployments. By using the AWS Glue OData connector for SAP, you can work seamlessly with your data on AWS Glue and Apache Spark in a distributed fashion for efficient processing. AWS Glue is a serverless data integration service that makes it easier to discover, prepare, move, and integrate data from multiple sources for analytics, machine learning (ML), and application development.

The AWS Glue OData connector for SAP uses the SAP ODP framework and OData protocol for data extraction. This framework acts in a provider-subscriber model to enable data transfers between SAP systems and non-SAP data targets. The ODP framework supports full data extraction and change data capture through the Operational Delta Queues (ODQ) mechanism. As a source for data extraction from SAP, you can use SAP data extractors, ABAP CDS views, SAP BW or BW/4HANA sources, HANA information views in SAP ABAP sources, or any ODP-enabled data sources.

SAP source systems can hold historical data, and can receive constant updates. For this reason, it's important to enable incremental processing of source changes. This blog post details how you can extract data from SAP and implement incremental data transfer from your SAP source using the SAP ODP OData framework with source delta tokens.
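Conceptually, delta-token extraction works like a cursor: each request returns the changed records plus a token that marks where the next request should resume. A minimal plain-Python sketch of that pattern follows; the `fetch_changes` function and the record layout are illustrative assumptions, not the SAP or AWS Glue API:

```python
def fetch_changes(source, delta_token=None):
    """Return (records, next_token). With no token this is a full initial
    load; with a token, only the changes recorded since that token."""
    if delta_token is None:
        return list(source["all"]), source["latest_token"]
    return list(source["changes"].get(delta_token, [])), source["latest_token"]

# Hypothetical source system state: the full dataset, plus changes keyed
# by the token that was handed out on the previous extraction.
source = {
    "all": [{"matnr": "M1"}, {"matnr": "M2"}],
    "latest_token": "tok1",
    "changes": {"tok1": [{"matnr": "M2", "dml_status": "UPDATED"}]},
}

records, token = fetch_changes(source)     # initial load: everything plus a token
delta, _ = fetch_changes(source, token)    # next run: only changes since the token
```

The rest of this post implements the same idea with AWS Glue job tags as the place where the token survives between runs.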

Solution overview

Example Corp wants to analyze the product data stored in their SAP source system. They want to understand their current product offering, in particular the number of products that they have in each of their material groups. This will include joining data from the SAP material master and material group data sources from their SAP system. The material master data is available for incremental extraction, while the material group is only available as a full load. These data sources should be combined and made available to query for analysis.

Prerequisites

To complete the solution presented in this post, start by completing the following prerequisite steps:

  1. Configure operational data provisioning (ODP) data sources for extraction in the SAP Gateway of your SAP system.
  2. Create an Amazon Simple Storage Service (Amazon S3) bucket to store your SAP data.
  3. In the AWS Glue Data Catalog, create a database called sapgluedatabase.
  4. Create an AWS Identity and Access Management (IAM) role for the AWS Glue extract, transform, and load (ETL) job to use. The role must grant access to all resources used by the job, including Amazon S3 and AWS Secrets Manager. For the solution in this post, name the role GlueServiceRoleforSAP. Use the following policies:
    • AWS managed policies:
    • Inline policy:
      {
             "Version": "2012-10-17",
             "Statement": [
                    {
                            "Sid": "VisualEditor0",
                            "Effect": "Allow",
                            "Action": [
                                   "s3:PutObject",
                                   "s3:GetObjectAcl",
                                   "s3:GetObject",
                                   "s3:GetObjectAttributes",
                                   "s3:ListBucket",
                                   "s3:DeleteObject",
                                   "s3:PutObjectAcl"
                            ],
                            "Resource": [
                                   "arn:aws:s3:::",
                                   "arn:aws:s3:::/*"
                            ]
                    }
             ]
      }
      

Create the AWS Glue connection for SAP

The SAP connector supports both CUSTOM (which is SAP BASIC authentication) and OAUTH authentication methods. For this example, you will be connecting with BASIC authentication.

  1. Use the AWS Management Console for AWS Secrets Manager to create a secret called ODataGlueSecret for your SAP source. Details in AWS Secrets Manager should include the elements in the following code. You will need to enter your SAP system username in place of <your SAP username> and its password in place of <your SAP username password>.
    {
       "basicAuthUsername": "<your SAP username>",
       "basicAuthPassword": "<your SAP username password>",
       "basicAuthDisableSSO": "True",
       "customAuthenticationType": "CustomBasicAuth"
    }
    

  2. Create the AWS Glue connection GlueSAPOData to your SAP system by selecting the new SAP OData data source.
  3. Configure the connection with the appropriate values for your SAP source.
    1. Application host URL: The host must have the SSL certificates for the authentication and validation of your SAP host name.
    2. Application service path: /sap/opu/odata/iwfnd/catalogservice;v=2;
    3. Port number: Port number of your SAP source system.
    4. Client number: Client number of your SAP source system.
    5. Logon language: Logon language of your SAP source system.
  4. In the Authentication section, select CUSTOM as the Authentication Type.
  5. Select the AWS Secret created in the preceding steps: ODataGlueSecret.
  6. In the Network Options section, enter the VPC, subnet, and security group used for the connection to your SAP system. For more information on connecting to your SAP system, see Configure a VPC for your ETL job.
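The secret from step 1 can also be created programmatically. Below is a minimal sketch that only builds the secret payload; the actual Secrets Manager call is shown commented out because it requires boto3 and AWS credentials, and the placeholder values follow the post:

```python
import json

def build_sap_secret(username, password):
    """Serialize the BASIC-auth secret in the shape the SAP OData connector expects."""
    return json.dumps({
        "basicAuthUsername": username,
        "basicAuthPassword": password,
        "basicAuthDisableSSO": "True",
        "customAuthenticationType": "CustomBasicAuth",
    })

payload = build_sap_secret("<your SAP username>", "<your SAP username password>")

# With boto3 and credentials in place, the secret would then be created with:
# import boto3
# boto3.client("secretsmanager").create_secret(
#     Name="ODataGlueSecret", SecretString=payload
# )
```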

Create an ETL job to ingest data from SAP

In the AWS Glue console, create a new Visual Editor AWS Glue job.

  1. Go to the AWS Glue console.
  2. In the navigation pane under ETL Jobs choose Visual ETL.
  3. Choose Visual ETL to create a job in the Visual Editor.
  4. For this post, edit the default name to be Material Master Job and choose Save.

On your Visual Editor canvas, select your SAP sources.

  1. Choose the Visual tab, then choose the plus sign to open the Add nodes menu. Search for SAP and add the SAP OData Source.
  2. Choose the node you just added and name it Material Master Attributes.
    1. For SAP OData connection, select the GlueSAPOData connection.
    2. Select the material attributes service and entity set from your SAP source.
    3. For Entity Name and Sub Entity Name, select the SAP OData entity from your SAP source.
    4. From the Fields, select Material, Created on, Material Group, Material Type, Old Matl number, GLUE_FETCH_SQ, DELTA_TOKEN and DML_STATUS.
    5. Enter limit 100 in the filter section, to limit the data during design time.

Note that this service supports delta extraction, so Incremental transfer is the default selected option.

After the AWS Glue service role details have been selected, the data preview is available. You can adjust the preview to include the three new available fields, which are:

  • glue_fetch_sq: This is a sequence field, generated from the epoch timestamp in the order the record was received, and is unique for each record. This can be used if you need to know or establish the order of changes in the source system.
  • delta_token: All records will have this field value blank, except for the last record passed, which will contain the value for the ODQ token to capture any changed records (CDC). This record is not a transactional record from the source and is only there for the purpose of passing the delta token value.
  • dml_status: This will show UPDATED for all newly inserted and updated records from the source and DELETED for records that have been deleted from the source.

For delta-enabled extraction, the last record passed will contain the value DELTA_TOKEN and the delta_token field will be filled as mentioned above.
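Downstream, this convention means the trailing token record must be separated from real transactional rows before analysis. A small pure-Python sketch of that split, with the record layout assumed from the field list above and a made-up token value:

```python
def split_delta_batch(records):
    """Separate transactional rows from the single trailing row that
    carries the ODQ delta token in its delta_token field."""
    data = [r for r in records if not r.get("delta_token")]
    tokens = [r["delta_token"] for r in records if r.get("delta_token")]
    return data, (tokens[-1] if tokens else None)

batch = [
    {"matnr": "M1", "dml_status": "UPDATED", "delta_token": ""},
    {"matnr": "M2", "dml_status": "DELETED", "delta_token": ""},
    # last row: no business data, only the token for the next extraction
    {"matnr": "", "dml_status": "", "delta_token": "D20241108123456_000042"},
]
rows, token = split_delta_batch(batch)
```

The sap_odata_state_management library used later in this post performs essentially this extraction on the Spark DataFrame before saving the token.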

  1. Add another SAP OData source connection to your canvas, and name this node Material Group Text.
    1. Select the material group service and entity set from your SAP source.
    2. For Entity Name and Sub Entity Name, select the SAP OData entity from your SAP source.

Note that this service supports full extraction, so Full transfer is the default selected option. You can also preview this dataset.

  1. When previewing the data, notice the language key. SAP passes all languages, so add a filter of SPRAS = 'E' to only extract English. Note this uses the SAP internal value of the field.
  2. Add a Change Schema transform node to the canvas after the Material Group Text node.
    • Rename the material group field in the target key to matkl2, so it is different from your first source.
    • Under Drop, select spras, odq_changemode, odq_entitycntr, dml_status, delta_token and glue_fetch_sq.

  3. Add a join transform to your canvas, bringing together both source datasets.
    1. Make sure the node parents of both Material Master Attributes and Change Schema have been selected.
    2. Select the Join type of Left join.
    3. Select the join conditions as the key fields from each source:
      • Under Material Master Attributes, select matkl.
      • Under Change Schema, select matkl2.

You can preview the output to make sure the correct data is being returned. Now, you're ready to store the result.
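The join configured above is an ordinary left join on the material group key. In plain-Python terms (with toy rows standing in for the two datasets, and a generic `left_join` helper written for this illustration):

```python
def left_join(left_rows, right_rows, left_key, right_key):
    """Left join: keep every left row, attach matching right-side fields when present."""
    index = {r[right_key]: r for r in right_rows}
    joined = []
    for row in left_rows:
        match = index.get(row[left_key], {})
        joined.append({**row, **match})
    return joined

# Toy stand-ins for the material master and the renamed material group text
materials = [{"matnr": "M1", "matkl": "G1"}, {"matnr": "M2", "matkl": "G9"}]
groups = [{"matkl2": "G1", "txtsh": "Raw materials"}]

result = left_join(materials, groups, "matkl", "matkl2")
```

A left join is the right choice here because every material should survive the join even when its group has no text row (as with M2 above).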

  1. Add the S3 bucket target to your canvas.
    1. Make sure the node parent is Join.
    2. For Format, select Parquet.
    3. For S3 Target Location, browse to the S3 bucket you created in the prerequisites and add materialmaster/ to the S3 target location.
    4. For the Data Catalog update options, select Create a table in the Data Catalog and on subsequent runs, update the schema and add new partitions.
    5. For Database, select the name of the AWS Glue database created earlier, sapgluedatabase.
    6. For Table name, enter materialmaster.
  2. Choose Save to save your job. Your job should look like the following figure.

Clone your ETL job and make it incremental

After your ETL job has been created, it's ready to clone and extend with incremental data handling using delta tokens.

To do this, you will need to modify the job script directly. You will modify the script to add a statement that retrieves the last delta token (stored in the job tags) and adds the delta token value to the request (or execution of the job), which will enable the delta-enabled SAP OData service when retrieving the data on the next job run.

The first execution of the job will not have a delta token value in the tags; therefore, the call will be an initial run, and the delta token will subsequently be stored in the tags for future executions.
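The tag-based approach can be pictured with two small helpers. The function names here are illustrative, and the real boto3 Glue calls that would back them (get_tags and tag_resource) are shown commented out because they need credentials and a job ARN:

```python
def read_delta_token(tags):
    """First run: no token tag exists yet, so the load is an initial (full) load."""
    return tags.get("delta_token")

def write_delta_token(tags, token):
    """Persist the token so the next run becomes a delta load."""
    return {**tags, "delta_token": token}

# Against a real job, the tags would come from the Glue API, e.g.:
# glue = boto3.client("glue")
# tags = glue.get_tags(ResourceArn=job_arn)["Tags"]
# glue.tag_resource(ResourceArn=job_arn, TagsToAdd={"delta_token": token})

tags = {}                                  # first execution: no tag yet
first = read_delta_token(tags)             # None -> initial load
tags = write_delta_token(tags, "D0042")    # token saved after the run
second = read_delta_token(tags)            # "D0042" -> delta load
```

The sap_odata_state_management library added in the next steps encapsulates exactly this read-then-write cycle.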

  1. Go to the AWS Glue console.
  2. In the navigation pane under ETL Jobs choose Visual ETL.
  3. Select the Material Master Job, choose Actions, and select Clone job.
  4. Change the name of the job to Material Master Job Delta, then choose the Script tab.
  5. You need to add an additional Python library that will manage storing and retrieving the delta tokens for each job execution. To do this, navigate to the Job Details tab, scroll down and expand the Advanced Properties section. In the Python library path, add the following path:
    s3://aws-blogs-artifacts-public/artifacts/BDB-4789/sap_odata_state_management.zip

  1. Now choose the Script tab and choose Edit script in the top right corner. Choose Confirm to confirm that your job will be script-only.

Apply the following changes to the script to enable the delta token.

  1. Import the SAP OData state management library classes you added in step 5 above, by adding the following code to row 8.
    from sap_odata_state_management.state_manager import StateManagerFactory, StateManagerType, StateType

  2. The next few steps will retrieve and persist the delta token in the job tags so it can be accessed by the next job execution. The delta token is added to the request back to the SAP source, so the incremental changes are extracted. If no token is passed, the load will run as an initial load, and the token will be persisted for the next run, which will then be a delta load. To initialize the sap_odata_state_management library, extract the connection options into a variable and update them using the state manager. Do this by adding the following code to line 16 (after the job.init statement).

You can find these values in the currently generated script under # Script generated for node Material Master Attributes. Be sure to substitute them with the appropriate values.

key = ""
state_manager = StateManagerFactory.create_manager(
    manager_type=StateManagerType.JOB_TAG, state_type=StateType.DELTA_TOKEN, options={"job_name": args['JOB_NAME'], "logger": glueContext.get_logger()}
)
options = {
    "connectionName": "GlueSAPOData",
    "entityName": "",
    "ENABLE_CDC": "true"
}
connector_options = state_manager.get_connector_options(key)
options.update(connector_options)

  3. Comment out the existing script generated for node Material Master Attributes by adding a #, and add the following replacement snippet.
     = glueContext.create_dynamic_frame.from_options(connection_type="sapodata", connection_options=options, transformation_ctx="")

  4. To extract the delta token from the dynamic frame and persist it in the job tags, add the following code snippet just above the last line in your script (before job.commit()).
    state_manager.update_state(key, .toDF())

This is what your final script should look like:

import sys
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.dynamicframe import DynamicFrame
from sap_odata_state_management.state_manager import StateManagerFactory, StateManagerType, StateType

args = getResolvedOptions(sys.argv, ['JOB_NAME'])
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args['JOB_NAME'], args)

key = "MaterialMasterAttributes_node1730873953236"
state_manager = StateManagerFactory.create_manager(
    manager_type=StateManagerType.JOB_TAG, state_type=StateType.DELTA_TOKEN, options={"job_name": args['JOB_NAME'], "logger": glueContext.get_logger()}
)
options = {
    "connectionName": "GlueSAPOData",
    "entityName": "/sap/opu/odata/sap/ZMATERIAL_ATTR_SRV/EntityOf0MATERIAL_ATTR",
    "ENABLE_CDC": "true"
}
connector_options = state_manager.get_connector_options(key)
options.update(connector_options)

# Script generated for node Material Group Text
MaterialGroupText_node1730874412841 = glueContext.create_dynamic_frame.from_options(connection_type="sapodata", connection_options={"ENABLE_CDC": "false", "connectionName": "GlueSAPOData", "FILTER_PREDICATE": "SPRAS = 'E'", "ENTITY_NAME": "/sap/opu/odata/sap/ZMATL_GROUP_SRV/EntityOf0MATL_GROUP_TEXT"}, transformation_ctx="MaterialGroupText_node1730874412841")

# Script generated for node Material Master Attributes
#MaterialMasterAttributes_node1730873953236 = glueContext.create_dynamic_frame.from_options(connection_type="sapodata", connection_options={"ENABLE_CDC": "true", "connectionName": "GlueSAPOData", "FILTER_PREDICATE": "limit 100", "SELECTED_FIELDS": "MATNR,MTART,MATKL,BISMT,ERSDA,DML_STATUS,DELTA_TOKEN,GLUE_FETCH_SQ", "ENTITY_NAME": "/sap/opu/odata/sap/ZMATERIAL_ATTR_SRV/EntityOf0MATERIAL_ATTR"}, transformation_ctx="MaterialMasterAttributes_node1732755261264")
MaterialMasterAttributes_node1730873953236 = glueContext.create_dynamic_frame.from_options(connection_type="sapodata", connection_options=options, transformation_ctx="MaterialMasterAttributes_node1730873953236")

# Script generated for node Change Schema
ChangeSchema_node1730875214894 = ApplyMapping.apply(frame=MaterialGroupText_node1730874412841, mappings=[("matkl", "string", "matkl2", "string"), ("txtsh", "string", "txtsh", "string")], transformation_ctx="ChangeSchema_node1730875214894")

# Script generated for node Join
MaterialMasterAttributes_node1730873953236DF = MaterialMasterAttributes_node1730873953236.toDF()
ChangeSchema_node1730875214894DF = ChangeSchema_node1730875214894.toDF()
Join_node1730874996674 = DynamicFrame.fromDF(MaterialMasterAttributes_node1730873953236DF.join(ChangeSchema_node1730875214894DF, (MaterialMasterAttributes_node1730873953236DF['matkl'] == ChangeSchema_node1730875214894DF['matkl2']), "left"), glueContext, "Join_node1730874996674")

# Script generated for node Amazon S3
AmazonS3_node1730875848117 = glueContext.write_dynamic_frame.from_options(frame=Join_node1730874996674, connection_type="s3", format="json", connection_options={"path": "s3://sapglueodatabucket", "compression": "snappy", "partitionKeys": []}, transformation_ctx="AmazonS3_node1730875848117")
state_manager.update_state(key, MaterialMasterAttributes_node1730873953236.toDF())
job.commit()

  1. Choose Save to save your changes.
  2. Choose Run to run your job. Note that there are currently no tags on your job details.
  3. Wait for your job run to complete successfully. You can see the status on the Runs tab.
  4. After your job run is complete, you will notice on the Job Details tab that a tag has been added. The next job run will read this token and run a delta load.

Query your SAP source data

The AWS Glue job run has created an entry in the Data Catalog, enabling you to query the data immediately.

  1. Go to the Amazon Athena console.
  2. Choose Launch query editor.
  3. Make sure you have an appropriate workgroup assigned, or create a workgroup if required.
  4. Select the sapgluedatabase and run a query (such as the following) to start analyzing your data.
    select matkl, txtsh, count(*)
    from materialmaster
    group by 1, 2
    order by 1, 2;
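For intuition, the aggregation this query performs, a count of materials per (matkl, txtsh) pair, looks like this in plain Python, with toy rows standing in for the materialmaster table:

```python
from collections import Counter

def count_by_group(rows):
    """GROUP BY matkl, txtsh with COUNT(*), ordered by the group keys."""
    counts = Counter((r["matkl"], r["txtsh"]) for r in rows)
    return sorted((matkl, txtsh, n) for (matkl, txtsh), n in counts.items())

rows = [
    {"matkl": "G1", "txtsh": "Raw materials"},
    {"matkl": "G1", "txtsh": "Raw materials"},
    {"matkl": "G2", "txtsh": "Packaging"},
]
summary = count_by_group(rows)
```

Athena runs the same logic against the Parquet files in S3, with no data movement required.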

Clean up

To avoid incurring charges, clean up the resources used in this post from your AWS account, including the AWS Glue jobs, SAP OData connection, Glue Data Catalog entry, Secrets Manager secret, IAM role, the contents of the S3 bucket, and the S3 bucket itself.

Conclusion

In this post, we showed you how to create a serverless incremental data load process for multiple SAP data sources. The approach used AWS Glue to incrementally load the data from an SAP source using SAP ODP delta tokens, and then load the data into Amazon S3.

The serverless nature of AWS Glue means that there is no infrastructure to manage, and you pay only for the resources consumed while your jobs are running (plus storage costs for outputs). As organizations increasingly become more data driven, this SAP connector can provide an efficient, cost-effective, performant, and secure way to include SAP source data in your big data and analytics outcomes. For more information, see AWS Glue.


About the authors

Allison Quinn is a Sr. ANZ Analytics Specialist Solutions Architect for Data and AI based in Melbourne, Australia, working closely with Financial Services customers in the region. Allison worked for over 15 years with SAP products before concentrating her analytics technical specialty on AWS native services. She's very passionate about all things data, and about democratizing it so that customers of all types can drive business benefit.

Pavol is an Innovation Solution Architect at AWS, specializing in SAP cloud adoption across EMEA. With over 20 years of experience, he helps global customers migrate and optimize SAP systems on AWS. Pavol develops tailored strategies to transition SAP environments to the cloud, leveraging AWS's agility, resiliency, and performance. He assists clients in modernizing their SAP landscapes using AWS's AI/ML, data analytics, and application services to enhance intelligence, automation, and performance.

Partha Pratim Sanyal is a Software Development Engineer with AWS Glue in Vancouver, Canada, specializing in data integration, analytics, and connectivity. With extensive backend development expertise, he is dedicated to crafting impactful, customer-centric solutions. His work focuses on building capabilities that empower users to effortlessly analyze and understand their data. Partha's commitment to addressing complex user needs drives him to create intuitive and value-driven experiences that elevate data accessibility and insights for customers.

Diego is an experienced Enterprise Solutions Architect with over 20 years' experience across SAP technologies, specializing in SAP innovation and data and analytics. He has worked both as a partner and as a customer, giving him a complete perspective on what it takes to sell, implement, and run systems and organizations. He is passionate about technology and innovation, focusing on customer outcomes and delivering business value.

Luis Alberto Herrera Gomez is a Software Development Engineer with AWS Glue in Vancouver, specializing in backend engineering, microservices, and cloud computing. With 7-8 years of experience, including roles as a backend and full-stack developer for several startups before joining Amazon and AWS, Luis focuses on developing scalable and efficient cloud-based applications. His expertise in AWS technologies allows him to design high-performance systems that handle complex data processing tasks. Luis is passionate about leveraging cloud computing to solve challenging business problems.
