MuleSoft Accelerator for SAP


Use case 4 - Real-time and bulk data lakes integration

Send sales order, inventory, and customer information from source systems to common data lake storage repositories.


Overview

Businesses benefit from understanding the current supply and demand for their products and from anticipating future trends. Greater visibility into sales orders and inventory allows a customer to plan better, align resources, and operate more efficiently, particularly in Manufacturing, Transportation, Retail, and Marketing Logistics/Operations Services businesses.

This solution sends sales order, inventory, and customer information from source systems to common data lake storage repositories such as Amazon S3 or Microsoft Azure Blob Storage, in either real time or batch. This lays the groundwork for sending key objects to a structured data warehouse such as Snowflake or Amazon Redshift, and ultimately to an analytics platform such as Tableau for predictive business intelligence.

Create order/inventory forecasts in Tableau

This solution also enables the creation of order and inventory forecasts directly in Tableau from Amazon S3 data, using the Amazon Athena Connector. Amazon Athena, a serverless interactive query service, allows the analysis of data stored in Amazon S3 using standard SQL queries. See the Amazon Athena setup guide for more information.
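
As an illustration of that query path, the following sketch starts an Athena query over S3-resident order data using the AWS SDK for Python (boto3); the database, table, and result-bucket names are hypothetical:

```python
# Minimal sketch: querying order data in S3 through Amazon Athena.
# Assumes boto3 is installed and AWS credentials are configured;
# the database, table, and bucket names below are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="""
        SELECT order_date, SUM(total_amount) AS daily_total
        FROM sales_orders          -- hypothetical table over the S3 data
        GROUP BY order_date
        ORDER BY order_date
    """,
    QueryExecutionContext={"Database": "datalake_db"},           # hypothetical
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("Started Athena query:", response["QueryExecutionId"])
```

Tableau's Amazon Athena connector issues the same kind of SQL behind the scenes when building forecasts over the S3 data.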

Use case description

This solution consumes inventory, order, and customer transaction data, either in real time or through a batch scheduler, and stores it in an underlying data lake. From there, the data can be consumed by an analytics platform such as Tableau to provide intelligence and order or inventory predictions.

Glossary

Term | Definition
CIM | The Cloud Information Model defines a set of standard data structures that can be used as canonical representations of common entities for integrating systems.
Global Data | A Global Data service provides an accurate, consistent, and complete copy of business data for use by enterprise applications and business partners, while also providing a means to link that copy to occurrences in other systems.

High-level architecture

sap-datalakes-architecture.png

Sequence diagrams

Order sequence diagrams

SAP S/4HANA order create:

sap-datalakes-saps4hana-order-create-sequence-diagram.png

Salesforce order create:

sap-datalakes-salesforce-order-create-sequence-diagram.png

Customer sequence diagrams

SAP customer update:

sap-datalakes-sap-customer-update-sequence-diagram.png

Salesforce customer update:

sap-datalakes-salesforce-customer-profile-sequence-diagram.png

SAP Inventory sequence diagram

sap-datalakes-inventory-sequence-diagram.png

Batch

SAP customers batch sequence diagram:

sap-datalakes-batch-sap-customers-sequence-diagram.png

Salesforce customers batch sequence diagram:

sap-datalakes-batch-salesforce-customers-sequence-diagram.png

SAP orders batch sequence diagram:

sap-datalakes-batch-sap-orders-sequence-diagram.png

Salesforce orders batch sequence diagram:

sap-datalakes-batch-salesforce-orders-sequence-diagram.png

Use case considerations

  • SAP S/4HANA or Salesforce is the system of record for customers and orders depending upon the business need.
  • SAP S/4HANA is the system of record for inventory.

Technical considerations

  • The Cloud Information Model (CIM) is used as the canonical model for all business object types.
  • Objects flow one way, from Salesforce or SAP S/4HANA to Amazon S3 and Microsoft Azure Blob Storage.
  • Each business object is stored separately in the data lake; users can join objects in Tableau to forecast or visualize the data based on the business need.
  • Both real-time and batch data transfer from the core systems (Salesforce and SAP S/4HANA) to the data lake are supported.
  • Tableau requires at least five data points in the time series to estimate a trend, and enough data points for at least two seasons or one season plus five periods to estimate seasonality. For example, at least nine data points are required to estimate a model with a four-quarter seasonal cycle (4 + 5), and at least 24 to estimate a model with a twelve-month seasonal cycle (2 * 12); see the sketch after this list.
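
One reading of that Tableau sizing rule, expressed as a small helper (our own illustration, not part of the accelerator):

```python
# Sketch of the Tableau forecasting data-point rule described above:
# a trend needs at least five points, and seasonality needs either two
# full seasons or one season plus five periods, whichever is larger.
def min_points_for_forecast(season_length: int) -> int:
    trend_minimum = 5
    seasonal_minimum = max(2 * season_length, season_length + 5)
    return max(trend_minimum, seasonal_minimum)

assert min_points_for_forecast(4) == 9    # four-quarter cycle: 4 + 5
assert min_points_for_forecast(12) == 24  # twelve-month cycle: 2 * 12
```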

End-to-end scenarios

  • Historical data is first transferred to the data lake from either Salesforce or SAP S/4HANA, depending upon which system is the system of record (SOR) for these objects, to seed trending and forecasting.
  • After the historical load of these objects, the customer can choose either real-time or batch transfer of Customer, Order, and Inventory data to the data lake.
  • For incremental loads, the data is added to the existing object in the Amazon S3 bucket or Microsoft Azure Blob Storage, as sketched below.
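
One way to picture that incremental addition (a hedged sketch only, not the accelerator's implementation; the bucket and key names are hypothetical, and this example assumes the records are stored as newline-delimited JSON):

```python
# Hypothetical sketch of an incremental load: fetch the existing object,
# append the new newline-delimited JSON records, and write it back.
import json
import boto3

s3 = boto3.client("s3")
BUCKET, KEY = "my-data-lake", "orders/orders.jsonl"   # hypothetical names

def append_records(new_records: list[dict]) -> None:
    try:
        existing = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode()
    except s3.exceptions.NoSuchKey:
        existing = ""  # first load: no object exists yet
    payload = existing + "".join(json.dumps(r) + "\n" for r in new_records)
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=payload.encode())
```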

Systems involved

  • Amazon S3
  • Microsoft Azure Blob Storage
  • Salesforce
  • SAP S/4HANA

Goals

Support the ingestion of the following business objects into cloud data storage for analytics, inventory forecasting, and demand planning:

  • Customer
  • Order
  • Inventory




Before you begin

The Getting Started with MuleSoft Accelerators guide provides general information on getting started with the accelerator components, including instructions on setting up your local workstation for configuring and deploying the applications.

Processing logic

The following entities and their associated data are sent to the data lake using MuleSoft APIs, in real time or batch, based on customer needs:

  • Customer
  • Order
  • Inventory

Real-time

Customer:

  1. Customer changes created in Salesforce and SAP are consumed by the Salesforce Topic Listener and the SAP S/4HANA Event Listener, respectively, and then converted to CIM objects.
  2. The CIM object is published to the Customer Exchange, which forwards it to the Customer data storage queues.
  3. The Data Storage Process API listens for messages on the Customer data storage queue and applies the following filter (steps 3 through 5 are sketched after this list):
    • Duplicate customer records are excluded.
  4. If the AWS flag is true, make a call to the AWS Data Storage System API.
  5. If the Azure flag is true, make a call to the Azure Data Storage System API.
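
A minimal sketch of steps 3 through 5, assuming hypothetical flag names and stand-in functions for the System API calls (the accelerator itself implements this as Mule flows):

```python
# Sketch of the Data Storage Process API logic for customers:
# drop duplicates, then fan out to whichever targets are enabled.
AWS_ENABLED = True        # hypothetical stand-ins for the deployment flags
AZURE_ENABLED = True

def send_to_aws_data_storage(record: dict) -> None:
    ...  # stand-in for a call to the AWS Data Storage System API

def send_to_azure_data_storage(record: dict) -> None:
    ...  # stand-in for a call to the Azure Data Storage System API

seen_customer_ids: set[str] = set()

def process_customer_message(cim_customer: dict) -> None:
    customer_id = cim_customer["id"]
    if customer_id in seen_customer_ids:
        return                                    # filter: duplicates excluded
    seen_customer_ids.add(customer_id)
    if AWS_ENABLED:
        send_to_aws_data_storage(cim_customer)    # step 4
    if AZURE_ENABLED:
        send_to_azure_data_storage(cim_customer)  # step 5
```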

Order:

  1. Order changes created in Salesforce and SAP are consumed by the Salesforce Topic Listener and the SAP S/4HANA Event Listener, respectively, and then converted to SalesOrder CIM objects.
  2. The CIM object is published to the Order Exchange, which forwards it to the Order data storage queues.
  3. The Data Storage Process API listens for messages on the Order data storage queue and applies the following filters (expressed as a predicate after this list):
    • Exclude updates to canceled orders.
    • Include replacement orders.
    • Include only orders in a fully created state (status = CREATED, not mid-checkout).
  4. If the AWS flag is true, make a call to the AWS Data Storage System API.
  5. If the Azure flag is true, make a call to the Azure Data Storage System API.
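
The order filters in step 3 could be read as the following predicate (field names such as isReplacement are hypothetical simplifications of the CIM SalesOrder shape):

```python
# Sketch of the order filters from step 3: skip updates to canceled
# orders, keep replacement orders, and only accept fully created orders.
def should_store_order(cim_order: dict) -> bool:
    if cim_order.get("status") == "CANCELED":
        return False                          # exclude updates to canceled orders
    if cim_order.get("isReplacement"):        # hypothetical field name
        return True                           # replacement orders are included
    return cim_order.get("status") == "CREATED"  # fully created, not mid-checkout

assert should_store_order({"status": "CREATED"})
assert not should_store_order({"status": "CANCELED"})
```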

Inventory:

  1. The SAP S/4HANA Poller Process API runs on a schedule that can be modified based on business needs (a minimal polling loop is sketched after this list).
  2. The SAP S/4HANA Inventory System API is called to fetch the SAP inventory changes in a given time interval and map them to CIM objects.
  3. The CIM object is published to the Inventory Exchange, which forwards it to the Inventory data storage queues.
  4. The Data Storage Process API listens for messages on the Inventory data storage queue.
  5. If the AWS flag is true, make a call to the AWS Data Storage System API.
  6. If the Azure flag is true, make a call to the Azure Data Storage System API.
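
A minimal polling loop illustrating steps 1 through 3 (the interval, function names, and publisher are hypothetical stand-ins for the Mule components):

```python
# Sketch of the scheduled inventory poll: compute the window since the
# last run and ask the System API for the changes that fall inside it.
from datetime import datetime, timedelta, timezone
import time

POLL_INTERVAL = timedelta(minutes=15)   # hypothetical; the schedule is configurable

def fetch_inventory_changes(since: datetime, until: datetime) -> list[dict]:
    ...  # stand-in for the SAP S/4HANA Inventory System API call
    return []

def publish_to_inventory_exchange(cim_inventory: dict) -> None:
    ...  # stand-in for publishing the CIM object to the Inventory Exchange

def run_poller() -> None:
    last_run = datetime.now(timezone.utc)
    while True:
        time.sleep(POLL_INTERVAL.total_seconds())
        now = datetime.now(timezone.utc)
        for change in fetch_inventory_changes(last_run, now):  # step 2
            publish_to_inventory_exchange(change)              # step 3
        last_run = now
```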

Batch

Customer:

  1. The Data Storage Process API scheduler runs, fetches the Customer data from Salesforce or SAP based on the business need, and maps it to the CIM object.
  2. If the AWS flag is true, make a call to the AWS Data Storage System API.
  3. If the Azure flag is true, make a call to the Azure Data Storage System API.

Order:

  1. The Data Storage Process API scheduler runs, fetches the Order data from Salesforce or SAP, and maps it to the SalesOrder CIM object (a batch run is sketched after this list).
  2. If the AWS flag is true, make a call to the AWS Data Storage System API.
  3. If the Azure flag is true, make a call to the Azure Data Storage System API.
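
Both batch flows follow the same fetch, map, and store pattern; a hedged sketch, with stand-in functions for the System API calls and mappings:

```python
# Sketch of a batch run: fetch source records, map them to CIM, and
# store them in whichever targets are enabled. All names hypothetical.
AWS_ENABLED, AZURE_ENABLED = True, True   # hypothetical deployment flags

def fetch_source_records(object_type: str) -> list[dict]:
    ...  # stand-in for the Salesforce or SAP S/4HANA System API call
    return []

def map_to_cim(record: dict, object_type: str) -> dict:
    ...  # stand-in for the mapping to the CIM object
    return record

def send_to_aws_data_storage(record: dict) -> None:
    ...  # stand-in for the AWS Data Storage System API

def send_to_azure_data_storage(record: dict) -> None:
    ...  # stand-in for the Azure Data Storage System API

def run_batch(object_type: str) -> None:
    for record in fetch_source_records(object_type):
        cim = map_to_cim(record, object_type)
        if AWS_ENABLED:
            send_to_aws_data_storage(cim)
        if AZURE_ENABLED:
            send_to_azure_data_storage(cim)

run_batch("Customer")
run_batch("SalesOrder")
```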

Successful outcome

After the processing of Customer, Inventory, and Order completes successfully for all target systems, the following conditions are met (a spot-check is sketched after this list):

  • AWS S3 bucket holds the records for Customer, Inventory, and Order.
  • Azure Blob Storage holds the records for Customer, Inventory, and Order.
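
A simple spot-check of that outcome might list what landed in each store (hypothetical bucket and container names; assumes boto3 and azure-storage-blob are installed and credentials are configured):

```python
# Sketch for spot-checking the successful outcome: list the objects in
# the S3 bucket and the blobs in the Azure container. Names hypothetical.
import boto3
from azure.storage.blob import BlobServiceClient

s3 = boto3.client("s3")
for obj in s3.list_objects_v2(Bucket="my-data-lake").get("Contents", []):
    print("S3:", obj["Key"])

blob_service = BlobServiceClient.from_connection_string("<connection-string>")
container = blob_service.get_container_client("datalake")   # hypothetical container
for blob in container.list_blobs():
    print("Azure:", blob.name)
```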

Mappings

Source type mapping

Source System | Source Type | CIM Types | Mapping Notes
SAP | Customer | Individual, Customer, Contact Address, Contact Email, Contact Phone |
Salesforce | Person Account | Individual, Customer, Contact Address, Contact Email, Contact Phone | Person Account is represented as a record type, mapping to Account and Contact
Salesforce | Account | Organization, Customer, Contact Address, Contact Email, Contact Phone | Account is mapped separately from Contacts
Salesforce | Contact | Individual, Contact Address, Contact Email, Contact Phone | Similar mappings as for Person Accounts, but at the Individual level only
Salesforce | Customer Profile | Individual, Customer, Contact Address, Contact Email, Contact Phone | One B2C customer profile will map to multiple CIM objects
SAP | Order | SalesOrder, SalesOrderProduct | Order line items are added as sales order products
Salesforce | Order, OrderItem | SalesOrder, SalesOrderProduct | Order line items are added as sales order products
SAP | Inventory | Product |
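
To make the Person Account row concrete, here is a hedged sketch of how one Salesforce record fans out into several CIM objects; the field names are simplified stand-ins for the actual CIM schema:

```python
# Illustrative sketch of the Person Account mapping above: one
# Salesforce record becomes several CIM objects. Field names are
# hypothetical simplifications, not the literal CIM definitions.
def map_person_account(sf: dict) -> dict:
    return {
        "Individual": {"firstName": sf["FirstName"], "lastName": sf["LastName"]},
        "Customer": {"customerNumber": sf["Id"]},
        "ContactAddress": {"addressLine1": sf.get("BillingStreet")},
        "ContactEmail": {"emailAddress": sf.get("PersonEmail")},
        "ContactPhone": {"telephoneNumber": sf.get("Phone")},
    }

example = map_person_account({
    "Id": "001xx000003DGb0",   # hypothetical Salesforce record
    "FirstName": "Ada", "LastName": "Lovelace",
    "BillingStreet": "1 Market St", "PersonEmail": "ada@example.com",
    "Phone": "+1-555-0100",
})
```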

Target type mapping

Target System | CIM Type | Target Types | Mapping Notes
AWS Data Lake | Customer | Customer | Metadata: information about the data; Data = CIM
AWS Data Lake | SalesOrder | SalesOrder | Metadata: information about the data; Data = CIM
AWS Data Lake | Inventory | Inventory | Metadata: information about the data; Data = CIM
Azure Data Lake | Customer | Customer | Metadata: information about the data; Data = CIM
Azure Data Lake | SalesOrder | SalesOrder | Metadata: information about the data; Data = CIM
Azure Data Lake | Inventory | Inventory | Metadata: information about the data; Data = CIM
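
The "Metadata: information about the data; Data = CIM" pattern can be pictured as a simple envelope around each stored record (the exact envelope fields here are hypothetical):

```python
# Sketch of the metadata-plus-data layout in the target mapping: each
# stored record wraps the CIM payload with information about the data
# itself. The envelope field names are hypothetical.
import json
from datetime import datetime, timezone

def wrap_for_data_lake(cim_type: str, cim_payload: dict) -> str:
    envelope = {
        "metadata": {
            "type": cim_type,               # e.g. Customer, SalesOrder, Inventory
            "sourceSystem": "SAP S/4HANA",  # or Salesforce
            "loadedAt": datetime.now(timezone.utc).isoformat(),
        },
        "data": cim_payload,                # the CIM object itself
    }
    return json.dumps(envelope)
```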

Downloadable assets

System APIs

Process APIs

Additional System APIs


