Extending SAC Planning – Creating custom calculations or ML (2023)

Learn how a SAC Planning model can be populated with data coming from custom calculations or Machine Learning. We describe this concept in a series of three blogs:

  • Accessing planning data with SAP Datasphere
    • Create a simple planning model in SAC
    • Make the planning data available in SAP Datasphere, so that it can be used by a Machine Learning algorithm
  • Creating custom calculations or ML (this blog)
    • Define the Machine Learning Logic
    • Create a REST-API that makes the Machine Learning logic accessible for SAC Planning
  • Orchestrating the end-to-end business process
    • Import the predictions into the planning model
    • Operationalise the process

This diagram shows the architecture and process from a high level:

Extending SAC Planning – Creating custom calculations or ML (1)

The whole concept and blog series have been put together by Maria TZATSOU, Andreas Forster, Gonzalo Hernan Sendra and Vlad-Andrei SLADARIU.

The previous blog explained how to capture input from the planning user and how to expose this information in the backend as a remote table in SAP Datasphere. In this blog we will use that data in a very simple custom-calculation example. We train a Machine Learning model (a regression) on a product's history to learn how the price we charged (UnitPrice) correlates with the quantities we sold (UnitsSold). This model is then applied to the prices we intend to charge in the coming months to estimate the sales quantities we can expect each month.

The same architecture and concept can be used for more complex requirements. You have access to a very comprehensive Machine Learning library with 100+ algorithms. For example, we have used this concept to add a risk assessment to SAC Planning by integrating Monte Carlo simulations. The Monte Carlo topic could be worth a separate blog; let us know if you would find this valuable.

We are using the Machine Learning that is embedded in SAP Datasphere / SAP HANA Cloud, which avoids data extraction and keeps the architecture lean. This is possible, as the built-in Machine Learning frameworks PAL (Predictive Analysis Library) and APL (Automated Predictive Library) can work directly with the planning data.

Besides the Machine Learning frameworks, SAP Datasphere also includes other analytical engines, such as geospatial or text analysis. Since all these engines are built into SAP Datasphere / SAP HANA Cloud, they work directly on the data without having to extract the content elsewhere.

In this blog we use Python to trigger the Predictive Analysis Library. This is possible thanks to our hana_ml package, which provides a convenient option to trigger those engines from your favourite Python environment. This means you can remain in your familiar interface while triggering the algorithms on your planning data. Personally, I prefer to script Python in Jupyter Notebooks, but you can use the environment of your choice.

For an introduction to this package and how to use it with SAP Datasphere, please see the previous blog in this series. In this blog, we assume some familiarity with that package and only mention the most important steps needed for the SAC Planning extension. All necessary code is listed in this blog, but you can also download the actual files from the samples repository.

To work with Jupyter Notebooks you can, for example, install Anaconda. In the “Anaconda Prompt” type “jupyter lab” to open the notebook environment. Install the hana_ml package with:

!pip install hana_ml
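
A quick check confirms that the package can be imported and shows which version pip resolved; a minimal sketch:

import hana_ml
print(hana_ml.__version__)  # the exact version string depends on what pip installed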

Connect to SAP Datasphere from Python. This requires:

  • The Database User that was created in the earlier blog (i.e. “EXTEND_SAC_PLANNING#_TU”). Remember that for this user the option “Enable Automated Predictive Library (APL) and Predictive Analysis Library (PAL)” is activated. This is only possible with an activated Script Server (see SAP Note 2994416).
  • SAP Datasphere must allow your external IP address to connect (see the documentation for details)

When your system has been configured, you can test establishing a connection from your local Jupyter notebook with logon credentials in clear text.

import hana_ml.dataframe as dataframe
conn = dataframe.ConnectionContext(address='[YOURENDPOINT]', port=443, user='[YOURDBUSER]', password='[YOURPASSWORD]')
conn.connection.isconnected()

If this succeeds, you can use the SAP HANA User Store to keep the credentials in a safe place, to avoid having to use a password in clear text. If you would like to set this up, please see the bonus section at the bottom of this blog.

import hana_ml.dataframe as dataframe
conn = dataframe.ConnectionContext(userkey='MYDATASPHERE')
conn.connection.isconnected()

Extending SAC Planning – Creating custom calculations or ML (2)

To start working with the data, create a hana_ml DataFrame that points to the view accessing the user input. This is the view that was created in the previous blog (i.e. “ExtSACP01_FactData_View”). To improve calculation performance, save the data into a temporary table in SAP Datasphere, here called “#PLANNINGDATA”. All data stays in the cloud. Just display a few rows to verify that the data is available.

df_remote = conn.table('ExtSACP01_FactData_View', schema='EXTEND_SAC_PLANNING')
df_remote = df_remote.save('#PLANNINGDATA', table_type='LOCAL TEMPORARY', force=True)
df_remote.head(10).collect()

Extending SAC Planning – Creating custom calculations or ML (3)

First we train a simple Machine Learning model on the actuals data, using the UnitPrice as predictor to explain the UnitsSold. This requires some semantic data transformation. In the original structure the two values that belong to the same month (UnitPrice and UnitsSold) are spread across two rows of data. Structure the data so that both values for the same month become columns of the same row. Display a few rows to verify the output.

df_remote_act = df_remote.filter(""" "Version" = 'public.Actual'""")
df_remote_act = df_remote_act.filter('''"Account" = 'UnitPrice' ''').rename_columns({'Value': 'UnitPrice'}).select('Date', 'UnitPrice').set_index('Date').join(
    df_remote_act.filter('''"Account" = 'UnitsSold' ''').rename_columns({'Value': 'UnitsSold'}).select('Date', 'UnitsSold').set_index('Date'), how='left')
df_remote_act.head(5).collect()

Extending SAC Planning – Creating custom calculations or ML (4)

Similarly, prepare the data on which the model will be applied to estimate future sales quantities. We use the planned UnitPrice for the months for which no actuals are available yet (April to December 2023 in this example).

df_remote_plan = df_remote.filter(""" "Version" = 'public.Plan'""")
df_remote_plan = df_remote_plan.filter('''"Account" = 'UnitPrice' ''').rename_columns({'Value': 'UnitPrice'}).select('Date', 'UnitPrice').set_index('Date').join(
    df_remote_plan.filter('''"Account" = 'UnitsSold' ''').rename_columns({'Value': 'UnitsSold'}).select('Date', 'UnitsSold').set_index('Date'), how='left')
month_last_actual = df_remote_act.select('Date').max()
df_remote_plan = df_remote_plan.filter(f'''"Date" > '{month_last_actual}' ''')
df_remote_plan.head(10).collect()

Extending SAC Planning – Creating custom calculations or ML (5)

Trigger the training of the Machine Learning model, using the embedded Machine Learning in SAP Datasphere. We use a simple linear regression.

from hana_ml.algorithms.pal.unified_regression import UnifiedRegression
ur_hgbt = UnifiedRegression(func='LinearRegression')
ur_hgbt.fit(data=df_remote_act, features=['UnitPrice'], label='UnitsSold')

We are not testing the model further; let's just have a quick look at the model quality. An R² of 0.96 shows that the model describes the relationship between UnitPrice and UnitsSold extremely well. We are using sample data, though. Should your own models on real data come with similarly good statistics, you may want to be suspicious that something went wrong.

from hana_ml.visualizers.unified_report import UnifiedReport
UnifiedReport(ur_hgbt).build().display()
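
If you prefer raw numbers over the report, you can also pull the fit statistics directly into the notebook. A minimal sketch, assuming the statistics_ attribute that hana_ml's UnifiedRegression keeps after training:

# Assumption: the fitted UnifiedRegression exposes its PAL fit statistics (e.g. R2) as statistics_
ur_hgbt.statistics_.collect()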

Extending SAC Planning – Creating custom calculations or ML (6)

Now apply the trained model to predict the UnitsSold for the future months, based on the planned UnitPrice for each month. The prediction itself is just one line of code. The remainder of the code structures the data to be in the necessary format so that SAC Planning can import it later on.

df_rem_predicted = ur_hgbt.predict(data=df_remote_plan, features='UnitPrice', key='Date')
df_rem_predicted = df_rem_predicted.drop(['LOWER_BOUND', 'UPPER_BOUND', 'REASON'])
df_rem_predicted = df_rem_predicted.rename_columns({'SCORE': 'Value'})
df_rem_predicted = df_rem_predicted.add_constant('Version', 'public.Plan')
df_rem_predicted = df_rem_predicted.add_constant('Category', 'Planning')
df_rem_predicted = df_rem_predicted.add_constant('Account', 'UnitsSold')
df_rem_predicted = df_rem_predicted.select('Version', 'Date', 'Account', 'Value', 'Category')
df_rem_predicted.head(5).collect()

Extending SAC Planning – Creating custom calculations or ML (7)

And save those predictions to a physical table in SAP Datasphere, from where SAC Planning can import the data.

df_rem_predicted.save('CUSTOM_CALCULATIONS', force=True)

You can now see the forecasts in the table, for example via the SAP HANA Database Explorer or in tools like DBeaver.
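
You can also peek at the table straight from the Python session. A minimal sketch, assuming the table was created in your database user's default schema:

# Read the predictions back from the staging table and display a few rows
conn.table('CUSTOM_CALCULATIONS').head(10).collect()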

Extending SAC Planning – Creating custom calculations or ML (8)

Once we are happy with the code from the Jupyter Notebook, it needs to be callable via a REST-API so that it can be integrated into the SAC Planning workflow. For the deployment of that Python code you can use any environment that is able to expose such code as a REST-API, for instance Cloud Foundry, Kyma or SAP Data Intelligence.

In this example we use Cloud Foundry, which is a very lightweight deployment option on the SAP Business Technology Platform. In case you haven't deployed Python code on Cloud Foundry yet, you can familiarise yourself with the blog Scheduling Python code on Cloud Foundry. In our case, however, we don't need to schedule any code, so you can ignore that part of the blog. You do not need to create instances of the “Job Scheduler” and “Authorization & Trust Management (XSUAA)” services.
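
If your Cloud Foundry environment is not targeted yet, the usual first steps are logging on and selecting the org and space. A hedged sketch with placeholder values (the API endpoint depends on your SAP BTP region):

cf7 login -a https://api.cf.[YOURREGION].hana.ondemand.com
cf7 target -o [YOURORG] -s [YOURSPACE]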

For creating the REST-API that is to be called from SAC Planning, four files are required. All necessary code is listed in this blog, but remember that you can also download the actual files from the samples repository.

File 1 of 4: sacplanningmlunitssold.py

This file contains the code that gets executed when the REST-API is called. It is mostly the code that was created in the Jupyter Notebook. Personally, I create such a Python file in Visual Studio Code, but any Python environment should be fine.

Hint: The REST-API needs to have both a GET and a POST endpoint. The GET endpoint is required so that SAP Analytics Cloud can save a connection for that endpoint. The POST endpoint is required by the multi-action that operationalises the code.

from flask import Flask
from flask import Response
import os, json, IPython
from hana_ml import dataframe
from hana_ml.algorithms.pal.unified_regression import UnifiedRegression

app = Flask(__name__)

# Port number is required to fetch from env variable
# http://docs.cloudfoundry.org/devguide/deploy-apps/environment-variable.html#PORT
cf_port = os.getenv("PORT")

# Get SAP HANA logon credentials from user-provided variable in CloudFoundry
hana_credentials_env = os.getenv('HANACRED')
hana_credentials = json.loads(hana_credentials_env)
hana_address = hana_credentials['address']
hana_port = hana_credentials['port']
hana_user = hana_credentials['user']
hana_password = hana_credentials['password']

# POST method as required by SAC Planning
@app.route('/', methods=['GET', 'POST'])
def processing():
    # Connect to SAP HANA Cloud
    conn = dataframe.ConnectionContext(address=hana_address, port=hana_port, user=hana_user, password=hana_password)

    # Move planning data to staging table (for performance)
    df_remote = conn.table('ExtSACP01_FactData_View', schema='EXTEND_SAC_PLANNING')
    df_remote = df_remote.save('#PLANNINGDATA', table_type='LOCAL TEMPORARY', force=True)

    # Prepare the actuals data to train the model
    df_remote_act = df_remote.filter(""" "Version" = 'public.Actual'""")
    df_remote_act = df_remote_act.filter('''"Account" = 'UnitPrice' ''').rename_columns({'Value': 'UnitPrice'}).select('Date', 'UnitPrice').set_index('Date').join(
        df_remote_act.filter('''"Account" = 'UnitsSold' ''').rename_columns({'Value': 'UnitsSold'}).select('Date', 'UnitsSold').set_index('Date'), how='left')

    # Prepare the planning data to predict the future (months without actuals)
    df_remote_plan = df_remote.filter(""" "Version" = 'public.Plan'""")
    df_remote_plan = df_remote_plan.filter('''"Account" = 'UnitPrice' ''').rename_columns({'Value': 'UnitPrice'}).select('Date', 'UnitPrice').set_index('Date').join(
        df_remote_plan.filter('''"Account" = 'UnitsSold' ''').rename_columns({'Value': 'UnitsSold'}).select('Date', 'UnitsSold').set_index('Date'), how='left')
    month_last_actual = df_remote_act.select('Date').max()
    df_remote_plan = df_remote_plan.filter(f'''"Date" > '{month_last_actual}' ''')

    # Train and apply the Machine Learning model
    ur_hgbt = UnifiedRegression(func='LinearRegression')
    ur_hgbt.fit(data=df_remote_act, features=['UnitPrice'], label='UnitsSold')
    df_rem_predicted = ur_hgbt.predict(data=df_remote_plan, features='UnitPrice', key='Date')

    # Prepare dataset for returning to SAC Planning
    df_rem_predicted = df_rem_predicted.drop(['LOWER_BOUND', 'UPPER_BOUND', 'REASON'])
    df_rem_predicted = df_rem_predicted.rename_columns({'SCORE': 'Value'})
    df_rem_predicted = df_rem_predicted.add_constant('Version', 'public.Plan')
    df_rem_predicted = df_rem_predicted.add_constant('Category', 'Planning')
    df_rem_predicted = df_rem_predicted.add_constant('Account', 'UnitsSold')
    df_rem_predicted = df_rem_predicted.select('Version', 'Date', 'Account', 'Value', 'Category')

    # Save the data into staging table
    df_rem_predicted.save('CUSTOM_CALCULATIONS', force=True)

    # Process complete
    return Response("{'message':'The data has been processed'}", status=200, mimetype='application/json')

if __name__ == '__main__':
    if cf_port is None:
        app.run(host='0.0.0.0', port=5000, debug=True)
    else:
        app.run(host='0.0.0.0', port=int(cf_port), debug=True)
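
If you want to try the Flask app locally before pushing it, you can mimic the Cloud Foundry setup by providing the HANACRED variable yourself. A hedged sketch (Windows users would use set instead of export):

export HANACRED='{"address":"REPLACEWITHYOURHANASERVER","port":443,"user":"REPLACEWITHYOURUSER","password":"REPLACEWITHYOURPASSWORD"}'
python sacplanningmlunitssold.py
# The app then listens on http://localhost:5000/ since no PORT variable is set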

File 2 of 4: manifest.yml

The manifest specifies for instance the memory requirements for the application.

---
applications:
- memory: 128MB
  command: python sacplanningmlunitssold.py
  random-route: true

File 3 of 4: runtime.txt

The runtime file specifies the Python version that is to be used.

python-3.9.x

File 4 of 4: requirements.txt

The requirements file specifies the Python libraries that are to be installed.

Flask
hana-ml==2.15.23011100
shapely
IPython

Place these four files into a local folder on your laptop. Now push them to Cloud Foundry as an application. This concept was introduced in the blog mentioned above, Scheduling Python code on Cloud Foundry. Since the application requires the database credentials, which haven't been provided yet, the application is pushed but not started yet.

cf7 push sacplanningmlunitssold --no-start

Now store the database credentials in a user-defined variable in Cloud Foundry, from where the application can retrieve them.

cf7 set-env sacplanningmlunitssold HANACRED {\"address\":\"REPLACEWITHYOURHANASERVER\",\"port\":443,\"user\":\"REPLACEWITHYOURUSER\",\"password\":\"REPLACEWITHYOURPASSWORD\"}
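
To double-check that the variable has been stored, you can display the application's environment. Note that the credentials are shown in clear text, so only do this on a trusted machine:

cf7 env sacplanningmlunitssold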

Everything is in place. Start the Cloud Foundry application and the path of the REST-API is displayed.

cf7 start sacplanningmlunitssold
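
Should you need the route again later, it can be displayed at any time:

cf7 app sacplanningmlunitssold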

Extending SAC Planning – Creating custom calculations or ML (9)

Calling this REST-API triggers the logic in the sacplanningmlunitssold.py file. The code connects to the SAP HANA Cloud system that is part of SAP Datasphere and instructs SAP HANA Cloud to work with the SAC Planning values. A Machine Learning model is trained on the actuals data to explain the UnitsSold based on the UnitPrice. For future months the planned UnitPrice is used to predict the estimated UnitsSold. That forecast is written to the staging table, where SAC Planning can pick it up.

Before embedding the REST-API into the workflow, just give it a test. You can use Postman for instance. Call the POST endpoint, which should result in the staging table being updated.
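
If you prefer to stay in Python rather than Postman, a minimal sketch with the requests library does the same job; the placeholder stands for the route displayed by Cloud Foundry (the app does not enforce authentication in this minimal setup):

import requests

# Trigger the prediction run via the app's POST endpoint ([YOURAPPROUTE] is a placeholder)
response = requests.post('https://[YOURAPPROUTE]/')
print(response.status_code, response.text)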

Extending SAC Planning – Creating custom calculations or ML (10)

The REST-API is up and running. Calling it triggers the Machine Learning forecasts, which are written to the staging table. We conclude the blog by showing how to make this REST-API accessible to SAP Analytics Cloud.

SAP Analytics Cloud can access REST-APIs that support “Basic Authentication” or “OAuth 2.0 Client Credentials”. In our example we take the OAuth option. Hence, “Add a New OAuth Client” in SAP Analytics Cloud → Administration → App Integration. Here we name the client “CloudFoundry for Custom ML”. Keep the default settings; just enter the path of the REST-API into the “Redirect URI” option. Be sure to add “https://” in front.

Extending SAC Planning – Creating custom calculations or ML (11)

When the client has been created, note down the following values. They will be needed in the next step.

(Video) Sales and Revenue Planning in SAP Analytics Cloud

  • OAuth Client ID
  • Secret
  • Token URL

Still in SAP Analytics Cloud, now create a new connection of type “HTTP API”.

Extending SAC Planning – Creating custom calculations or ML (12)

Enter these values:

  • Connection Name: API CloudFoundry for Custom ML
  • Data Service URL: [The same URL as used in the previous step as “Redirect URI”]
  • Authentication Type: OAuth 2.0 Client Credentials
  • OAuth Client ID: [The ID you received in the previous step]
  • Secret: [The secret you received in the previous step]
  • Token URL: [also from the previous step]

Extending SAC Planning – Creating custom calculations or ML (13)

The connection has been created. SAP Analytics Cloud can now call the REST-API to trigger the Machine Learning.

Extending SAC Planning – Creating custom calculations or ML (14)

In the steps described in this blog we accessed the planning data from the SAC Planning model to create predictions, which are written into a staging table.

In the next blog you will see how this data can be imported into the planning model and how the whole process can be operationalised.

Early on in the coding example above it was mentioned that you can keep your database password safe in the Secure User Store. Here are the steps to set this up:

Step 1: Install the SAP HANA Client 2.0 on the machine on which you execute the notebook (probably your laptop). This also installs the Secure User Store.

Step 2: Save your database credentials in the Secure User Store. Open your operating system's command prompt (i.e. cmd.exe on Windows) and navigate to the SAP HANA Client's installation folder (i.e. 'C:\Program Files\SAP\hdbclient'). The following command stores the credentials under the key MYDATASPHERE. You can change the name of the key as you like.

hdbuserstore -i SET MYDATASPHERE "[YOURENDPOINT]:443" [YOURDBUSER]
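
You can verify the stored entry (the password itself is not displayed):

hdbuserstore LIST MYDATASPHERE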

Step 3: Now the hana_ml package can pull the credentials from the Secure User Store to establish a connection. The code does not contain the password in clear text anymore.

import hana_ml.dataframe as dataframe
conn = dataframe.ConnectionContext(userkey='MYDATASPHERE')
conn.connection.isconnected()
