Friday, 29 December 2017

The new ArcGIS Enterprise Geodatabase for HANA – First Impressions

In this blog, I’ll look at what it takes to create an enterprise geodatabase in HANA, enable it, and copy some feature classes into it from another enterprise geodatabase. I’ll discuss the creation and loading of utility models in another post.

As many of you know, the ArcGIS platform has been able to access tables in HANA using query layers since ArcGIS Server and Desktop 10.3 and ArcGIS Pro 1.2 were released in 2014. This allowed spatial data in HANA to be consumed and updated by the ArcGIS platform. As of ArcGIS Server 10.3.1, feature services against HANA were supported. This is commonly known as an agile spatial data mart or “sidecar” scenario.

Wednesday, 27 December 2017

‘Hello Block’ – HANA XS Blockchain Proof-of-Work Application

Welcome to my blog. This post is all about basic blockchain and how it actually works. I have come across lots of blogs (though not on SAP) where people discussed blockchain at length; the funniest was ‘How I explained blockchain to my grandma’. But I always wondered what it means to a developer like me: how can we implement a blockchain from scratch, with no APIs, building everything ourselves? Before jumping into the code, I would like to explain some core technical concepts that are very important.
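To give a flavour of what “from scratch, no APIs” means, here is a minimal proof-of-work sketch (my own illustration, not the post’s code): find a nonce so that the SHA-256 hash of the block data plus the nonce starts with a given number of leading zeros. The difficulty value is illustrative.

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4):
    """Search for a nonce such that sha256(block_data + nonce)
    starts with `difficulty` leading zero hex digits."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Mining is just brute force; verification is a single hash.
nonce, digest = proof_of_work("Hello Block", difficulty=4)
```

Anyone can verify the work cheaply by recomputing one hash with the found nonce, which is what makes proof-of-work useful for consensus.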

Friday, 22 December 2017

Using Topology for Data Analysis II

In this second part, we use Topological Data Analysis (TDA) on a dataset consisting of spatial information related to traffic. We’ll compare it with the usual DBSCAN method from Machine Learning.

DBSCAN is a method for finding clusters in data. The name stands for Density-Based Spatial Clustering of Applications with Noise. Besides the data, it usually requires two parameters: a radius, known as eps, and the minimum number of points required to form a cluster (that is the “density” part).

In any case, these parameters are unknown a priori. TDA can help here by giving connected components as an initial choice of clusters; moreover, being robust against noise, these clusters will persist.
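To make the DBSCAN side of the comparison concrete, here is a minimal pure-Python sketch of the procedure described above (my own illustration, not the post’s code; the eps and min_pts values are made up for the toy data):

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: assign each point a cluster id, or -1 for noise.
    A point is a core point if it has at least min_pts neighbours
    (itself included) within distance eps."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1              # provisionally noise
            continue
        labels[i] = cluster             # new core point: start a cluster
        seeds = [j for j in nbrs if j != i]
        while seeds:                    # expand the cluster outwards
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:  # only core points keep expanding
                seeds.extend(j_nbrs)
        cluster += 1
    return labels

# Two dense blobs plus one isolated noise point
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (20, 20)]
labels = dbscan(pts, eps=0.5, min_pts=2)
```

The two blobs come out as separate clusters and the isolated point is labelled -1, which illustrates the sensitivity to eps and min_pts that the TDA approach tries to sidestep.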

Wednesday, 20 December 2017

Using Topology for Data Analysis

When researching data we want to find features that help us understand the information. We look for insight in areas like Machine Learning or other fields in Mathematics and Artificial Intelligence. I want to present here a tool initially coming from Mathematics that can be used for exploratory data analysis and give some geometric insight before applying more sophisticated algorithms.

The tool I want to describe is Persistent Homology, a member of a family of techniques known as Topological Data Analysis. In this post I will describe the basic methodology when facing a common data analysis scenario: clustering.
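For clustering, the relevant part of persistent homology is dimension 0: as a radius grows, nearby points get connected and components merge, and the components that survive over a long range of radii are the persistent clusters. A tiny sketch of that idea (my own illustration, not the post’s code) using union-find:

```python
import math

def connected_components(points, radius):
    """Count connected components when every pair of points within
    `radius` of each other is joined (union-find)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= radius:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(len(points))})

# Two tight pairs of points: as the radius grows, 4 components
# merge to 2, then to 1; the 2-component stage persists longest.
pts = [(0, 0), (0.2, 0), (3, 3), (3.2, 3)]
counts = [connected_components(pts, r) for r in (0.1, 0.5, 5.0)]
```

Real persistent homology tracks these merges across all radii at once (a barcode), but even this fixed-scale sketch shows why the method needs no eps chosen up front.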

Monday, 18 December 2017

Bringing Machine Learning (TensorFlow) to the enterprise with SAP HANA

In this blog I aim to provide an introduction to TensorFlow and the SAP HANA integration, give you an understanding of the landscape and outline the process for using External Machine Learning with HANA.

There’s plenty of hype around Machine Learning, Deep Learning and of course Artificial Intelligence (AI), but understanding the benefits in an enterprise context can be more challenging. Being able to integrate the latest and greatest deep learning models into your enterprise via a high-performance in-memory platform could provide a competitive advantage, or at least keep you level with the competition.

Friday, 15 December 2017

SAP Automated Predictive Library (APL) Installation and configuration for SAP HANA

What is SAP HANA Automated Predictive Library (APL)?

SAP HANA APL is an Application Function Library (AFL) which lets you use the data mining capabilities of the SAP Predictive Analytics automated analytics engine on your customer datasets stored in SAP HANA.

Wednesday, 13 December 2017

Subsequent Document Splitting in S/4 HANA Finance 1709 (on Premise)

1. Purpose of this document

This document is intended for SAP FICO application consultants. With its help, you will be able to implement subsequent document splitting in S/4HANA. You should already know how document splitting works in general; that is a prerequisite.

Monday, 11 December 2017

HANA SDI | Smart Data Integration 2.0 – H2H Real-time Replication: Lessons Learned

In this blog entry I would like to share some of the experiences we gained during an SDI HANA-to-HANA (H2H) implementation project. To provide context, we will start with the scenario description and solution architecture.

These are the items that will be covered throughout this blog:

Friday, 8 December 2017

SAP HANA Under The Hood: HDB – by the SAP HANA Academy

Introduction

The objective of this SAP HANA Under The Hood series is to provide some insight into particular SAP HANA topics: like a mechanic, we look under the hood and take a component apart to examine how it works.

Wednesday, 6 December 2017

Introduction To SAP Landscape Transformation (SLT)

New to real-time replication? In this blog I would like to share basic information on SAP Landscape Transformation Replication Server (SLT), which runs on the SAP NetWeaver platform.

SLT is SAP’s first ETL tool that allows you to load and replicate data into the SAP HANA database in real time, or on a schedule, from both SAP and non-SAP source systems.

Monday, 4 December 2017

Importing the spatial map client included with SAP HANA Spatial

One of the advantages of SAP HANA Spatial is that it includes a map client and other content at no additional cost. This is an example of a spatial application created using the HERE map provided with SAP HANA.

Friday, 1 December 2017

Server-side SSL configuration on HANA for inter-node communication and System Replication using openSSL

I have been seeing a growing number of security-related questions from customers. This blog will cover the step-by-step configuration of SSL for internal communication and system replication. I hope this helps you out.

Security is one of the most significant features any product should possess. In SAP HANA, we can precisely configure both internal and external communication.