Tuesday, 31 October 2017

ABAP On HANA – My experience in SAP Inside Track

I would like to share the knowledge I gained from SIT (SAP Inside Track). Before getting started with a particular topic, I would like to share my experience of the SIT that was conducted in Hyderabad. It was a great pleasure for me to be a part of SIT. There were many valuable sessions throughout the day across 3 tracks (Technical, Functional and Analytical). Since I come from the technical side, the topics they chose were really excellent and were presented in a very good manner.

Monday, 30 October 2017

Launch Classical UIs from HANA Cloud Portal

Until a few weeks ago, we could only use the Fiori Launchpad on HANA Cloud Platform to launch Fiori apps deployed within the platform, along with other content via URLs and Mobile Documents. The Fiori Launchpad is the entry point for accessing all SAP applications, and we see many customers using the Fiori Launchpad on HCP to launch standard and custom Fiori apps. However, there are still a lot of customers who also need to expose SAP GUI transactions and WebDynpro ABAP applications alongside these HTML5-based Fiori apps. They were looking for ways to provide one single access point for their end users to access all types of applications (on-premise and cloud). With the release of ABAP add-on SAP_UI 751, it is now possible to launch even classical UIs such as SAP GUI (HTML) and WebDynpro ABAP applications that are available in the backend SAP system. In this blog, I am going to show the steps required to do this.

Sunday, 29 October 2017

Unfair speed test – ABAP CDS on ASE vs HANA CDS on HANA

The title of the blog post says it all. This will be an unfair speed test.

If you’ve read the previous two posts of this series, you’ll note that I have successfully connected the ABAP 751 SP02 Developer Edition to a HANA Express 2.0 server as a secondary DB and used a simple ABAP program that helped replicate data over from ASE to HANA DB.

To perform this test, I created two very similar CDS views: one in ABAP, selecting from the ABAP 751 server running on ASE, and another directly on HANA, hosted on my HANA Express Edition server.

Saturday, 28 October 2017

Join cardinality setting in Calculation Views

Mechanism of join cardinality setting

“Cardinality” is a setting that can be applied to joins in calculation views. It specifies how many matching entries exist in one table of a join for the entries of the other table. It consists of two numbers: the left number describes the number of matching entries for entries of the right table, while the right number describes the number of matching entries for entries of the left table. As an example, assume a join on field “employee” between Table 1 (left table) and Table 2 (right table). A join cardinality of “1..n” specifies that each entry of Table 2 has at most 1 matching entry in Table 1. Conversely, each entry in Table 1 might have 0 to n matching entries in Table 2. The symbol “n” stands here for an arbitrary positive number. For example, entry “Alice” in Table 1 might have 0, 1, or an arbitrary number of matches in Table 2. Take another example, “1..1”. This indicates that each entry in Table 1, e.g., entry “Alice” of Table 1, has 0 or 1 matching entry in Table 2. Analogously, “Alice” of Table 2 also has at most 1 match in Table 1.
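As an illustration of these semantics (plain Python, not calculation view code; the sample tables are invented), the observed cardinality of a join can be measured by counting matches per key value on each side:

```python
from collections import Counter

def join_cardinality(left, right, key):
    """Measure the observed cardinality of a join: the maximum number of
    matching rows each table holds for a single value of the join key."""
    left_counts = Counter(row[key] for row in left)
    right_counts = Counter(row[key] for row in right)
    # Left number: matches in the left table for entries of the right table
    left_max = max((left_counts[k] for k in right_counts), default=0)
    # Right number: matches in the right table for entries of the left table
    right_max = max((right_counts[k] for k in left_counts), default=0)
    return left_max, right_max

# Table 1 (left): one row per employee; Table 2 (right): several rows per employee
table1 = [{"employee": "Alice"}, {"employee": "Bob"}]
table2 = [{"employee": "Alice"}, {"employee": "Alice"}, {"employee": "Carol"}]

print(join_cardinality(table1, table2, "employee"))  # (1, 2), i.e. "1..n"
```

Here “Alice” has one row in Table 1 but two in Table 2, so the join behaves as “1..n”.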

Friday, 27 October 2017

Simple SQL table export in ABAP for HANA

The idea was to play with HANA and try its functionality for educational purposes.

For the data extraction I wrote a simple ABAP report, which extracts the selected tables with their data and prepares an SQL script for import.

Here is the source code (see the attached file).

Here is a screenshot of the selection screen.
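The report's actual source is in the attached file; as a rough illustration of the idea, here is a Python sketch (using SQLite and an invented table name) that renders a table's contents as an SQL import script:

```python
import sqlite3

def export_table_as_sql(conn, table):
    """Generate CREATE TABLE + INSERT statements for one table, roughly
    what the ABAP report described above produces for import elsewhere."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    lines = [f"CREATE TABLE {table} ({', '.join(cols)});"]
    for row in cur:
        values = ", ".join("NULL" if v is None
                           else str(v) if isinstance(v, (int, float))
                           else "'" + str(v).replace("'", "''") + "'"
                           for v in row)
        lines.append(f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({values});")
    return "\n".join(lines)

# Invented sample table for demonstration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t001 (bukrs TEXT, butxt TEXT)")
conn.execute("INSERT INTO t001 VALUES ('1000', 'ACME Corp')")
print(export_table_as_sql(conn, "t001"))
```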

Thursday, 26 October 2017

HANA Savepoint Analysis

1. What are savepoints?

◉ Savepoints are required to synchronize changes in memory with the persistency on disk level. All modified pages of row and column store are written to disk during a savepoint.
◉ Each SAP HANA host and service has its own savepoints.
◉ The data belonging to a savepoint represents a consistent state of the data on disk and remains untouched until the next savepoint operation has been completed.

Wednesday, 25 October 2017

Apache Hadoop as NLS solution for BW/HANA Part 2

For part 1 : Apache Hadoop as NLS solution for SAP HANA Part 1

After searching the internet for hours and days, trying to figure out the HANA and Hadoop integration process, I realized that there are a number of articles out there that talk about the why's, the what's and the who's of the process, but not many have delved into the “how” aspect of it. So here is my humble attempt.

Tuesday, 24 October 2017

Demystifying SAP Database Migration tools for your SAP HANA migration and Cloud Adoption

This post demystifies SAP migration tools and their technology to facilitate your SAP HANA migration or cloud adoption, helping you define and choose the most appropriate SAP migration tool and strategy.

Monday, 23 October 2017

Apache Hadoop as NLS solution for SAP HANA Part 1


Apache Hadoop has become the poster child for big data, largely due to its highly scalable analytics platform capable of processing large volumes of structured and unstructured data. SAP HANA, on the other hand, has gained ground as the leading in-memory data analytics platform that lets you accelerate business processes and deliver quantifiable business intelligence at lightning speed. Both database platforms are independent of each other, and their respective pros and cons make them a perfect fit for a long-term, sustainable, high-performance data lake strategy for any large multinational corporation.

Tuesday, 17 October 2017

AMDP based BEx Customer HANA Exit

With the release of AS ABAP 7.4, many new capabilities were introduced; one of them is the AMDP methodology. This methodology is transformational, as developers can leverage the best of both the ABAP and SQL programming languages to build models and applications. SAP BW on HANA, S/4HANA embedded BW and BW/4HANA can also benefit from this framework.

Monday, 16 October 2017


SAP continues to gradually move its ERP products to HANA technology. One of the most critical changes is that the classic MRP is now replaced by a new product, “MRP Live”. MRP Live exists to take advantage of HANA's speed and efficiency.

Such changes and innovations have always had both advantages and disadvantages, and these changes are especially important for backend developers. The biggest advantage of MRP Live is speed and performance; the biggest disadvantage is the length of development time. Even for a simple customer requirement, we may need to write dozens of lines of code. This will of course vary with the requirements, but it seems we will spend much more time than in the previous development environment and language, the familiar SAP GUI and ABAP. MRP Live is fully developed with ABAP Managed Database Procedures (AMDP), and AMDP BAdIs are used for enhancements in MRP Live HANA development, following a top-down approach.

Sunday, 15 October 2017

Step by Step Hierarchies in S/4 HANA Analytics

I am going to show the step-by-step procedure to display hierarchies in S/4 using analytical CDS views. I will use a manager-employee hierarchy as the example here.

To achieve this you need:

◉ A dimension view with an employee as key, and his manager as attribute, and optionally some time-dependency information; this view gets the @Hierarchy annotations.
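The parent-child data behind such a view can be sketched in plain Python (the names are invented; in the real scenario the hierarchy is derived by the CDS @Hierarchy annotations, not by code like this):

```python
def build_hierarchy(rows):
    """Group employees under their managers from (employee, manager) pairs,
    mirroring the parent-child dimension view described above."""
    children = {}
    for employee, manager in rows:
        children.setdefault(manager, []).append(employee)
    return children

def descendants(children, node):
    """All employees below a given manager, depth-first."""
    result = []
    for child in children.get(node, []):
        result.append(child)
        result.extend(descendants(children, child))
    return result

# Invented sample data: (employee, manager)
rows = [("Bob", "Alice"), ("Carol", "Alice"), ("Dave", "Bob")]
children = build_hierarchy(rows)
print(descendants(children, "Alice"))  # ['Bob', 'Dave', 'Carol']
```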

Friday, 13 October 2017

Code Push Down for HANA Starts with ABAP Open SQL

What is Code Push Down?

One of the key differences when developing ABAP applications for HANA is that you can push data-intensive computations and calculations down to the HANA DB layer, instead of bringing all the data to the ABAP layer and processing it there to do the computations. This is what is termed the Code-to-Data paradigm in the context of developing ABAP applications optimized for HANA.
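The difference can be sketched with plain Python and SQLite standing in for the ABAP and HANA layers (the table and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("Alice", 10.0), ("Alice", 15.0), ("Bob", 7.5)])

# Data-to-code: fetch every row and aggregate in the application layer
totals = {}
for customer, amount in conn.execute("SELECT customer, amount FROM sales"):
    totals[customer] = totals.get(customer, 0.0) + amount

# Code-to-data: push the aggregation down to the database
pushed = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM sales GROUP BY customer"))

assert totals == pushed  # same result, far less data moved in the second case
print(pushed)
```

Both produce identical totals, but in the pushed-down variant only one aggregated row per customer crosses the layer boundary.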

Thursday, 12 October 2017

Under the HANA hood of an ABAP Managed Database Procedure

I’ve been looking into ABAP managed database procedures for HANA recently and decided to take a look at what’s actually created under the hood in the HANA database when an AMDP is created.

I created a small test class in our CRM on HANA system with a method to read a couple of columns from the crmd_orderadm_h table using SQLScript. The method takes one importing parameter, IV_OBJECT_ID, and has one exporting parameter, ET_ORDER.

Wednesday, 11 October 2017

HANA DB Row Store Reorganization

How to Perform HANA DB Row Store Reorganization

The row store memory size is a lot bigger than the actual data size in the row store and shows a high fragmentation ratio.

When a row store table requires more memory to store records, it takes a free page from existing segments. If no segment has a free page, a new segment is allocated. Deleting a large number of records may therefore leave a number of sparse segments behind. In such a case, row store reorganization is very useful and can be performed for memory compaction.
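As a back-of-the-envelope sketch (the numbers are invented; the real figures come from the HANA monitoring views), the fragmentation ratio compares allocated to used row store memory:

```python
def fragmentation_ratio(allocated_bytes, used_bytes):
    """Share of allocated row store memory that holds no live data."""
    return (allocated_bytes - used_bytes) / allocated_bytes

# After mass deletions, the allocated size stays high while the live data shrinks
allocated = 40 * 1024**3   # 40 GB allocated in row store segments
used = 12 * 1024**3        # 12 GB of actual records

ratio = fragmentation_ratio(allocated, used)
print(f"fragmentation: {ratio:.0%}")  # fragmentation: 70%
```

A ratio this high is exactly the sparse-segment situation described above, where reorganization reclaims the gap.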

Tuesday, 10 October 2017

SAP HANA System Copy- Homogeneous Recovery Method Using HANA Studio

SAP HANA system copy procedure - below is the HANA system copy from PRD to QAS using the recovery method in HANA Studio:
  • Take a backup of the HANA PRD system.
  • Copy/move the backup from the PRD host to the QAS host.
  • Ensure that QAS has enough space for the backup.
  • Log in to HANA Studio of the QAS system with the ID “SYSTEM”.
  • After logging in to the QAS system, go to the Backup and Recovery option.

Monday, 9 October 2017

Column Encryption Decryption on HANA

On the last project I worked on, we had special requirements to secure sensitive customer data (e.g. social security numbers, driver license numbers, credit card numbers, etc.) with encryption and decryption to protect it from a data breach. It has nothing to do with the recent data breach at Equifax; the client had been keen to protect sensitive customer data long before migrating to HANA. Since customer data security has become such a hot topic after the Equifax breach, I think it is worth sharing the information with the community.

Friday, 6 October 2017

Alexa, ask (on-Prem) HANA …

Voice assistants such as Apple Siri, Google Home, Amazon Echo and Microsoft Cortana are becoming a common sight in many aspects of personal life. We often use them to ask about weather and news updates, look up local businesses, get directions, etc. They are indeed a convenient user interface for on-demand updates without having to open an app, sign in, navigate menus, and so on.

This article presents a general architecture for enterprises that can leverage SAP Cloud Platform to integrate Amazon Alexa to on-Prem SAP applications, specifically SAP HANA in this case.

Thursday, 5 October 2017

Unassigned Member Null Handling Hierarchy in Calculation View

Not Assigned Members enabled hierarchies with SAP HANA Calculation Views

It often occurs that there are entries in the fact table for which corresponding master data doesn't exist, for instance sales records for unregistered customers. In such cases, when the two tables are joined with referential integrity, the sales data for unregistered customers is lost. Using the Unassigned Member Null Handling feature in a hierarchy, we can see all of the data in a hierarchical fashion without any loss, where the not-assigned members (the unregistered customers) are grouped under a dedicated node.

Wednesday, 4 October 2017

Getting started with Internet of Things Applications: Device Management

The world is talking about the Internet of Things. IoT has become a rapidly growing topic of discussion at the workplace and amongst tech-savvy engineers. Well, IoT is nothing but connecting to the Internet any device that can be switched on and off.

Being a HANA consultant, I tried my hand at the SAP HANA Cloud Platform for IoT.

Tuesday, 3 October 2017

Node.js Connecting to HANA + Mongo + Neo4J

I tried to document my recent learnings on consuming different kinds of data sources, such as the HANA database, MongoDB (document store), Neo4j (graph database) and the Google Maps APIs, using Node.js. This blog is my learning code to build a few use cases and understand the possibilities of using the different APIs. I am sure there are better ways of coding, better ways of connecting to the different databases (like connecting to Mongo or PostgreSQL using Cloud Foundry backing services) or of using cloud connectors.