Saturday, 18 May 2019

Reverse GeoCode HANA Data with Google API

We recently came across a scenario with a client where we needed to reverse geocode their coordinate information. In essence, that means they'd provide latitude and longitude data, and we needed to retrieve the geopolitical information surrounding that coordinate: street name, number, city, state, country, and so on.
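To make that concrete, here is a small sketch in plain JavaScript of the kind of JSON a reverse geocoding service returns and how the pieces we want are pulled out. The miniature response below is fabricated for illustration; real Google responses contain many more results and components.

```javascript
// A trimmed, hypothetical example of a reverse geocoding response.
var sampleResponse = {
  status: "OK",
  results: [{
    address_components: [
      { long_name: "1600", short_name: "1600", types: ["street_number"] },
      { long_name: "Amphitheatre Parkway", short_name: "Amphitheatre Pkwy", types: ["route"] },
      { long_name: "Mountain View", short_name: "Mountain View", types: ["locality", "political"] },
      { long_name: "California", short_name: "CA", types: ["administrative_area_level_1", "political"] },
      { long_name: "United States", short_name: "US", types: ["country", "political"] }
    ]
  }]
};

// Pull out a single component by its "types" entry.
function findComponent(response, type) {
  var components = response.results[0].address_components;
  for (var i = 0; i < components.length; i++) {
    if (components[i].types.indexOf(type) !== -1) {
      return components[i].long_name;
    }
  }
  return null;
}

console.log(findComponent(sampleResponse, "country"));  // → United States
console.log(findComponent(sampleResponse, "locality")); // → Mountain View
```

The solution below does essentially this, just with more component types and writing the results back into a HANA table.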

Our starting point was an existing blog post by Kevin covering this exact requirement. Unfortunately for us, that blog was from 2014, and Google's API interface for reverse geocoding has changed a bit since then. Also, Kevin's solution is very sophisticated and dynamic, while our requirement was very basic, so we had to simplify it a bit.

So with that in mind, I wanted to provide a walkthrough of the solution that we came up with, built on top of Kevin's original one.

We are using the Google API for this; I'm sure there are other services available for the same purpose. Please note that the Google API is a paid product, so read through their pricing to understand the cost implications:
https://developers.google.com/maps/documentation/geocoding/usage-and-billing

STEP 1

The first thing we did was create a schema and a table to hold the coordinates (latitude and longitude), as well as the additional reverse geocoding attributes we're retrieving from Google's API.

First we created the schema "REVERSE_GEOCODE" and granted SELECT to the _SYS_REPO user:

CREATE SCHEMA "REVERSE_GEOCODE";

GRANT SELECT ON SCHEMA "REVERSE_GEOCODE" TO _SYS_REPO WITH GRANT OPTION;

We then created a simple table with latitude and longitude, as well as the geopolitical information to be retrieved from Google's API:

CREATE COLUMN TABLE "REVERSE_GEOCODE"."REVERSE_GEOCODE" (
    "LATITUDE"         DOUBLE CS_DOUBLE NOT NULL,
    "LONGITUDE"        DOUBLE CS_DOUBLE NOT NULL,
    "STREET_NO"        NVARCHAR(30),
    "ADDRESS"          NVARCHAR(100),
    "NEIGHBORHOOD"     NVARCHAR(100),
    "CITY"             NVARCHAR(100),
    "COUNTY"           NVARCHAR(100),
    "STATE"            NVARCHAR(100),
    "COUNTRY"          NVARCHAR(100),
    "POST_CODE"        NVARCHAR(30),
    "POST_CODE_SUFFIX" NVARCHAR(30),
    "STATUS"           NVARCHAR(30),
    PRIMARY KEY ("LATITUDE", "LONGITUDE")
) UNLOAD PRIORITY 5 AUTO MERGE;

We then inserted some random coordinate information to test the solution:

insert into "REVERSE_GEOCODE"."REVERSE_GEOCODE" values(36.778259,-119.417931,'','','','','','','','','','');
insert into "REVERSE_GEOCODE"."REVERSE_GEOCODE" values(39.742043,-104.991531,'','','','','','','','','','');
insert into "REVERSE_GEOCODE"."REVERSE_GEOCODE" values(41.881832,-87.623177,'','','','','','','','','','');
insert into "REVERSE_GEOCODE"."REVERSE_GEOCODE" values(42.331429,-83.045753,'','','','','','','','','','');
insert into "REVERSE_GEOCODE"."REVERSE_GEOCODE" values(51.509865,-0.118092,'','','','','','','','','','');
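Before loading real data, it can be worth sanity-checking the coordinates so you don't spend paid API calls on garbage rows. This small plain-JavaScript helper is not part of the original solution, just an illustrative check of the valid latitude and longitude ranges.

```javascript
// Latitude must be in [-90, 90], longitude in [-180, 180].
function isValidCoordinate(lat, lon) {
  return typeof lat === "number" && typeof lon === "number" &&
         lat >= -90 && lat <= 90 && lon >= -180 && lon <= 180;
}

var testRows = [
  [36.778259, -119.417931],
  [51.509865, -0.118092],
  [123.45, 10.0]  // invalid: latitude out of range
];

var invalid = testRows.filter(function (row) {
  return !isValidCoordinate(row[0], row[1]);
});
console.log(invalid.length); // → 1
```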

STEP 2

Ensure that before you start your development, you have an API key from Google, following the instructions below:

https://developers.google.com/maps/documentation/javascript/get-api-key

This will be required to actually query the API.

STEP 3

Now we can build our JavaScript objects in HANA. Let's start by creating an XS Project to house all the relevant objects.

Give it a name.

Ensure you select the correct repository workspace.

Ensure both the XS Application Access and XS Application Descriptor are checked. Also add the geodataEnrich XS JavaScript file and hit Finish.

Your XS project will be created with the selected objects.

Next, let's create the HTTP Destination Configuration file. Within this new package, click New.

Give it the name geocodeApiGoogleDest.xshttpdest (this must match the destination name the library code reads later), and enter the information below. Note that the key is the API key obtained in step 2. Also note that some of these parameters will need to be tailored based on your network settings:

host = "maps.googleapis.com";
port = 443;
description = "Google Geocoding API";

pathPrefix = "/maps/api/geocode/json?key=token_obtained_in_step2";
authType = none;
useProxy = false;
timeout = 0;
useSSL = true;
sslAuth = anonymous;
Next, activate the file by right-clicking it and selecting Activate.

We'll now create a JavaScript library file.

Give it the name geocodeApiGoogle.xsjslib, matching the $.import statement used later.

Once the file is created, enter the code below:

//------------------------------------------------------------------------------
// PUBLIC methods
//------------------------------------------------------------------------------
// function reverseGeocode(lat, lon)
// Takes a latitude and longitude value, returns an addressResults object containing these
// properties (see https://developers.google.com/maps/documentation/geocoding):
//     country
//     administrative_area_level_1
//     administrative_area_level_2
//     administrative_area_level_3
//     postal_code
//     status (see https://developers.google.com/maps/documentation/geocoding/#StatusCodes)
//------------------------------------------------------------------------------
function reverseGeocode(lat, lon) {
    // Init address data to remove all existing properties
    gAddressResults = {};
    // Call Google reverse geocoding API
    var dest = $.net.http.readDestination("ReverseGeocode", "geocodeApiGoogleDest");
    var client = new $.net.http.Client();
    var req = new $.web.WebRequest($.net.http.GET, "&latlng=" + lat + "," + lon);
    client.request(req, dest);
    var response = client.getResponse();

    // Parse results; this adds properties to gAddressResults as it goes
    var geoData = JSON.parse(response.body.asString());
    //log(req);
    //log(response.body.asString());
    //log("&latlng=" + lat + "," + lon);
    var i;
    var j;
    var component;
    if (geoData.hasOwnProperty("results")) {
        for (i = 0; i < geoData.results.length; i++) {
            if (geoData.results[i].hasOwnProperty("address_components")) {
                for (j = 0; j < geoData.results[i].address_components.length; j++) {
                    component = geoData.results[i].address_components[j];
                    if (component.hasOwnProperty("types")) {
                        if (component.types.indexOf("route") !== -1) {
                            gAddressResults['ADDRESS'] = component.short_name;
                        }
                        if (component.types.indexOf("street_number") !== -1) {
                            gAddressResults['STREET_NO'] = component.short_name;
                        }
                        if (component.types.indexOf("neighborhood") !== -1) {
                            gAddressResults['NEIGHBORHOOD'] = component.short_name;
                        }
                        if (component.types.indexOf("locality") !== -1) {
                            gAddressResults['CITY'] = component.short_name;
                        }
                        if (component.types.indexOf("administrative_area_level_2") !== -1) {
                            gAddressResults['COUNTY'] = component.long_name;
                        }
                        if (component.types.indexOf("administrative_area_level_1") !== -1) {
                            gAddressResults['STATE'] = component.long_name;
                        }
                        if (component.types.indexOf("country") !== -1) {
                            gAddressResults['COUNTRY'] = component.long_name;
                        }
                        if (component.types.indexOf("postal_code") !== -1) {
                            gAddressResults['POST_CODE'] = component.long_name;
                        }
                        if (component.types.indexOf("postal_code_suffix") !== -1) {
                            gAddressResults['POST_CODE_SUFFIX'] = component.long_name;
                        }
                    }
                }
            }
        }
    }
    // Status
    gAddressResults['STATUS'] = geoData.status || "Status unknown";
    return gAddressResults;
}

var gAddressResults = {};
var gDepth = 0;
var gStoreLastLongname = "emptyLongname";

// Only used by the commented-out log calls above; relies on gLog and gConfig
// being defined in the importing file
function log(s) {
    gLog += '\n';
    gLog += s.toString();
    // optionally copy log to hana trace files
    if (gConfig.detailedLogging === 'hana') {
        $.trace.info(s.toString());
    }
}
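The chain of if statements above maps Google component types to table columns one at a time. The same logic can also be expressed as a data-driven lookup; the following is an optional refactoring sketch in plain JavaScript, not part of the original solution, shown here with a tiny fabricated response.

```javascript
// Maps Google address_component types to target table columns, and which
// name variant (short_name or long_name) each column takes.
var TYPE_TO_COLUMN = {
  route:                       { column: "ADDRESS",          name: "short_name" },
  street_number:               { column: "STREET_NO",        name: "short_name" },
  neighborhood:                { column: "NEIGHBORHOOD",     name: "short_name" },
  locality:                    { column: "CITY",             name: "short_name" },
  administrative_area_level_2: { column: "COUNTY",           name: "long_name" },
  administrative_area_level_1: { column: "STATE",            name: "long_name" },
  country:                     { column: "COUNTRY",          name: "long_name" },
  postal_code:                 { column: "POST_CODE",        name: "long_name" },
  postal_code_suffix:          { column: "POST_CODE_SUFFIX", name: "long_name" }
};

function collectAddress(geoData) {
  var out = {};
  (geoData.results || []).forEach(function (result) {
    (result.address_components || []).forEach(function (comp) {
      (comp.types || []).forEach(function (type) {
        var rule = TYPE_TO_COLUMN[type];
        if (rule) {
          out[rule.column] = comp[rule.name];
        }
      });
    });
  });
  out.STATUS = geoData.status || "Status unknown";
  return out;
}

// Tiny fabricated response for illustration:
var demo = collectAddress({
  status: "OK",
  results: [{ address_components: [
    { long_name: "Illinois", short_name: "IL", types: ["administrative_area_level_1"] },
    { long_name: "Chicago",  short_name: "Chicago", types: ["locality"] }
  ]}]
});
console.log(demo); // → { STATE: 'Illinois', CITY: 'Chicago', STATUS: 'OK' }
```

The advantage is that adding a new attribute becomes a one-line change to the lookup table rather than another if block.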

Pay close attention to the $.net.http.readDestination call in the code above: it needs to reference the package that was created in your environment, as well as the correct HTTP destination file.

Activate the file.

Next, add the code below to the geodataEnrich.xsjs file that was created when we created the XS Project:

//------------------------------------------------------------------------------
// geodataEnrich.xsjs
//------------------------------------------------------------------------------
// Enriches records in a database table with address information.  For example, records
// with a latitude and longitude can have additional data populated giving country, region, postcode etc.  The address
// information is retrieved using Google's geocoding API.
//--------------------------------------------------------------------------------
// URL Parameters:
//   maxrecs    = max records to update eg 1000. Read google license about max per day
//   mindelayms = minimum delay in ms 
//   log        = omit parameter entirely for no logging, use log=active to see details on screen when url finishes, and use 
//                log=hana to write to hana trace file only (as an info level trace record)
//   simulate   = omit parameter entirely to update records, use simulate=active to not do any update
//   schema     = source schema name
//   table      = source table name
//   fldlat     = source field holding latitude
//   fldlon     = source field holding longitude
//   fldcty     = name of field in source table that will receive the Country address information (optional)
//   fldad1     = name of field in source table that will receive the admin level 1 information, like a region (optional)
//   fldad2     = name of field in source table that will receive the admin level 2 information, like a sub-region (optional)
//   fldad3     = name of field in source table that will receive the admin level 3 information, like a city (optional)
//   fldpost    = name of field in source table that will receive the post code or zip code information (optional)
//   fldblank   = name of field in source table that is used to identify records you want to write to, this is to prevent the
//                same records being written to over and over again.  So if this field is NULL then this service will attempt
//                to write to all target fields.  If this field is filled, the record will not be selected.
//   fldstat    = name of field in source table this will receive the status of the geocode API call (optional)
//--------------------------------------------------------------------------------
// Sample URLs:
//   All URLs will start as usual, http://<server>:80<instance>/<package path>/
//
//   Example 1
//   Simulate the update of 10 records to table "GEODATA"."testtable01", with 400ms delay between calls, logging to screen, and storing
//   result of geocode API call in the field STATUS.  The field to select on is COUNTRY (ie search for records with COUNTRY=NULL) and 
//   the fields to write to are ZIP and COUNTRY. 
//   geodataEnrich.xsjs?schema=GEODATAENRICH&table=testtable01&maxrecs=10&mindelayms=400&log=active&simulate=active&fldblank=COUNTRY&fldstat=STATUS&fldpost=ZIP&fldcty=COUNTRY
//
//   Example 2
//   Actual update of 2000 records, with 100ms delay between calls, with no logging.  The field to select on is COUNTRY and the fields to 
//   write to are POSTCODE, REGION and SUBREGION. 
//   geodataEnrich.xsjs?schema=GEODATAENRICH&table=testtable01&maxrecs=2000&mindelayms=100&fldblank=COUNTRY&fldpost=POSTCODE&fldad1=REGION&fldad2=SUBREGION
//
//--------------------------------------------------------------------------------
$.import("ReverseGeocode","geocodeApiGoogle");
var GEOAPIGOOGLE = $.ReverseGeocode.geocodeApiGoogle;

//--------------------------------------------------------------------------------
//GLOBALS
//--------------------------------------------------------------------------------
var gConstants = {
    EMPTY : 'EMPTY',
    NOTUSED : 'NOT USED',
};

var gConfig = {
    // Values below are used as defaults if values not specified in URL
    maxRecords : 2500,
    minDelayMS : 500,
    serviceProvider : 'google',
    detailedLogging : 'notactive',
    simulate : 'notactive',
};

var gTable = {
    // Inbound table-related globals (values below are used as defaults if not specified in URL)
    sourceSchema : gConstants.EMPTY,
    sourceTable : gConstants.EMPTY,
    sourceTableKey : gConstants.EMPTY,  // string of all key fields suitable for a select statement field list
    sourceTableKeyFieldList : [],       // list (well, an array) of key fields
    sourceTableKeyFieldCount : 0,       // count of key fields
    sourceFieldLat : 'LATITUDE',
    sourceFieldLon : 'LONGITUDE',
    // Processing table-related globals
    sourceTableKeyCount : 0,
    resultSet : null,
    resultSetFieldCount : 0,
    targetProperties : [],  // array of used JSON property names that the geocode xsjslib library call will return, indexed same as targetFieldnames
    targetFieldnames : [],  // array of table field names to write to, indexed same as targetProperties
    targetFieldCount : 0,   // count of fields that will be filled eg just country, or country + zip etc
    // Outbound table-related globals
    targetFieldCountry : gConstants.NOTUSED,
    targetFieldAdmin1 : gConstants.NOTUSED,
    targetFieldAdmin2 : gConstants.NOTUSED,
    targetFieldAdmin3 : gConstants.NOTUSED,
    targetFieldPostalCode : gConstants.NOTUSED,
    targetFieldThatIsBlank : gConstants.EMPTY,
    targetFieldStatus : gConstants.NOTUSED,
};

//------------------------------------------------------------------------------
//Entry point
//------------------------------------------------------------------------------
var gLog = '';  // global log
var gRecsProcessed = 0;
main();

function main() {
    try {
        prepareParameters();
        readDataFromTable();
        mainProcessing();
    } catch (e) {
        // on exception, force log to be shown
        gConfig.detailedLogging = 'active';
        if (e.name === 'UrlError' || e.name === 'TableError') {
            // Error already logged, nothing to do
        } else {
            throw(e);
        }
    } finally {
        //finish();
    }
}



//--------------------------------------------------------------------------------
// Functions
//--------------------------------------------------------------------------------
// global log
function log(s) {
    gLog += '\n';
    gLog += s.toString();
    // optionally copy log to hana trace files
    if (gConfig.detailedLogging === 'hana') {
        $.trace.info(s.toString());
    }
}

// Read parameters from URL or use defaults.  In this simplified version the URL
// parameters are not read; the gConfig defaults are used, and the schema, table
// and coordinate fields are hardcoded below.
function prepareParameters() {
    gTable.sourceSchema = 'REVERSE_GEOCODE';    //$.request.parameters.get("schema") || gTable.sourceSchema;
    gTable.sourceTable = 'REVERSE_GEOCODE';     //$.request.parameters.get("table") || gTable.sourceTable;
    gTable.sourceFieldLat = 'LATITUDE';         //$.request.parameters.get("fldlat") || gTable.sourceFieldLat;
    gTable.sourceFieldLon = 'LONGITUDE';        //$.request.parameters.get("fldlon") || gTable.sourceFieldLon;
    log('=== Parameters ====================================');
    log('Config');
    log('  Maximum API calls to make  : ' + gConfig.maxRecords);
    log('  Min delay between calls    : ' + gConfig.minDelayMS + ' milliseconds');
    log('Source data');
    log('  Source schema              : ' + gTable.sourceSchema);
    log('  Source table               : ' + gTable.sourceTable);
    log('  Source field for latitude  : ' + gTable.sourceFieldLat);
    log('  Source field for longitude : ' + gTable.sourceFieldLon);
}

function readDataFromTable() {
    //--------------------------------------------------------
    // Read the table's meta data
    //--------------------------------------------------------
    var query = prepareQueryForMetadata();
    var connSelect = $.db.getConnection();
    // query string with ? params
    var pstmtSelect = connSelect.prepareStatement(query);
    // parameter replacement
    pstmtSelect.setString(1, gTable.sourceSchema);
    pstmtSelect.setString(2, gTable.sourceTable);
    var rs = pstmtSelect.executeQuery();
    var fld = '';
    var keyCount = 0;
    // Build string representing table key and table key with parameters
    gTable.sourceTableKey = '';
    gTable.sourceTableKeyFieldList = [];
    while (rs.next()) {
        fld = rs.getString(1);
        gTable.sourceTableKey += ('\"' + fld + '\"' + ' ');
        gTable.sourceTableKeyFieldList[keyCount] = fld;
        keyCount = keyCount + 1;
    }
    gTable.sourceTableKey = gTable.sourceTableKey.trim();
    gTable.sourceTableKey = gTable.sourceTableKey.replace(/ /g, ', '); // global replace space, with space comma
    log('=== Table Information ============================');
    log('Table Metadata Query (template): ' + query);
    if (keyCount > 0) {
        log('Table Key: ' + gTable.sourceTableKey);
        gTable.sourceTableKeyFieldCount = keyCount;
        // not logging key field list, but could
    } else {
        log('*** ERROR: table ' + gTable.sourceTable + ' does not exist, or does not have a primary key');
        throw {name : "TableError", };
    }
    //--------------------------------------------------------
    // Read source table data proper
    //--------------------------------------------------------
    query = prepareQueryForMainRead();
    log('Main Select Query: ' + query);
    connSelect = $.db.getConnection();
    pstmtSelect = connSelect.prepareStatement(query);
    // Store results
    gTable.resultSet = pstmtSelect.executeQuery();
    gTable.sourceTableKeyCount = keyCount;
    gTable.resultSetFieldCount = keyCount + 2;  // number of fields in key plus 2 for lat lon
    log(' ');
}

// Prepare metadata selection query, returns query string (with params as ? to be filled later)
function prepareQueryForMetadata() {
    var select = 'SELECT \"COLUMN_NAME\"';
    var from = ' FROM \"SYS\".\"CONSTRAINTS\"';
    var where = ' WHERE \"SCHEMA_NAME\" = ? AND \"TABLE_NAME\" = ?';
    var orderby = ' ORDER BY \"COLUMN_NAME\"';
    return select + from + where + orderby;
}

// Prepare main selection query, returns query string (no params possible here)
function prepareQueryForMainRead() {
    var select = 'SELECT TOP ' + gConfig.maxRecords + ' ' + gTable.sourceTableKey;
    var from = ' FROM \"' + gTable.sourceSchema + '\".\"' + gTable.sourceTable + '\"';
    var where = ' WHERE \"STATUS\" != \'OK\' OR \"STATUS\" IS NULL';
    var orderby = ' ORDER BY ' + gTable.sourceTableKey;
    return select + from + where + orderby;
}

// Prepare update statement, returns query string.  Note that the values are
// concatenated directly into the SQL rather than bound as parameters, so an
// address value containing a single quote would break the statement.
function prepareQueryForUpdate(addressData, lat, lon) {
    // The UPDATE clause
    var qupdate = 'UPDATE \"' + gTable.sourceSchema + '\".\"' + gTable.sourceTable + '\"';
    // The SET clause
    var qset = ' SET ';
    for (var key in addressData) {
        qset += ('\"' + key + '\" = \'' + addressData[key] + '\'');
        qset += ', ';
    }
    qset = qset.slice(0, -2);  // drop trailing ", "
    // The WHERE clause
    var qwhere = ' WHERE \"LATITUDE\" = \'' + lat + '\' AND \"LONGITUDE\" = \'' + lon + '\'';
    return qupdate + qset + qwhere;
}

// Main processing, this loops over the result set of data, calls the geocode API to get the new data
// and writes it back to the database one record at a time.
function mainProcessing() {
    var rs = gTable.resultSet;
    var i = 0;
    var keyValue = '';
    var remainingTime = 0;
    var overallDuration = 0;
    // record-level vars
    var currentRecord = {
        // Current record-related working vars
        sourceFieldValues : '',  // all field values as string for logging
        lat : 0,
        lon : 0,
        timeStart : 0,
        timeEnd : 0,
        duration : 0,
        keyValues : [],      // key field values, used in update call
        addressData : null,  // the object returned by the geo API call with properties containing address values we want
    };
    log('=== Main Processing ===========================');
    // iterating a recordset is a one-way ticket, so have to do all processing per record
    while (rs.next()) {
        //--------------------------------------------------------
        // Main process per rs record: call geocode API, write to DB
        //--------------------------------------------------------
        // Clear previous record
        currentRecord.sourceFieldValues = '';
        currentRecord.lat = 0;
        currentRecord.lon = 0;
        currentRecord.timeStart = 0;
        currentRecord.timeEnd = 0;
        currentRecord.duration = 0;
        currentRecord.keyValues = [];
        currentRecord.addressData = null;
        // Examine the key values, for logging; later they are used in the Update statement
        for (i = 0; i < gTable.sourceTableKeyCount; i++) {
            keyValue = rs.getString(i + 1);
            currentRecord.sourceFieldValues = currentRecord.sourceFieldValues + ' ' + keyValue;
            currentRecord.keyValues[i] = keyValue;
        }
        log('Source record (selected fields): ' + currentRecord.sourceFieldValues);
        // Get lat lon from source record
        currentRecord.lat = parseFloat(rs.getString(1));
        currentRecord.lon = parseFloat(rs.getString(2));
        log('Current record lat: ' + currentRecord.lat.toString() + ' lon: ' + currentRecord.lon.toString());
        // Timer to ensure we don't swamp the google API and get banned
        currentRecord.timeStart = new Date().getTime();
        //--------------------------------------------------------
        // Call our library that wraps the Google service
        //--------------------------------------------------------
        currentRecord.addressData = GEOAPIGOOGLE.reverseGeocode(currentRecord.lat, currentRecord.lon);
        log('Reverse Geocode Results: ' + JSON.stringify(currentRecord.addressData));
        //--------------------------------------------------------
        // Write back to database
        //--------------------------------------------------------
        var queryUpdate = prepareQueryForUpdate(currentRecord.addressData, currentRecord.lat, currentRecord.lon);
        log('Record Update Query: ' + queryUpdate);
        var connUpdate = $.db.getConnection();
        var cstmtUpdate = connUpdate.prepareCall(queryUpdate);
        if (gConfig.simulate === 'notactive') {
            cstmtUpdate.execute();
            connUpdate.commit();
        } else {
            log('In simulate mode, no table update done.');
        }
        connUpdate.close();
        //--------------------------------------------------------
        // Wait until duration reached before allowing loop to continue
        //--------------------------------------------------------
        currentRecord.timeEnd = new Date().getTime();
        currentRecord.duration = currentRecord.timeEnd - currentRecord.timeStart;
        log('Execution Time (ms): ' + currentRecord.duration);
        remainingTime = gConfig.minDelayMS - currentRecord.duration;
        if (remainingTime > 50) {
            log('  sleeping...');
            // This blocks CPU, not ideal at all, but easier to implement than a callback in this case
            sleep(remainingTime);
            overallDuration = ((new Date().getTime()) - currentRecord.timeStart);
            log('  overall duration: ' + overallDuration + ' ms');
        }
        log(' ');
        gRecsProcessed++;
    }
}

// This blocks CPU, not ideal but works
function sleep(milliseconds) {
    var start = new Date().getTime();
    for (var i = 0; i < 1e7; i++) {
        if ((new Date().getTime() - start) > milliseconds) {
            break;
        }
    }
}

function finish() {
    $.response.contentType = "text/plain";
    if (gConfig.detailedLogging === 'active') {
        $.response.setBody(gLog);
    } else {
        $.response.setBody('Done. Processed ' + gRecsProcessed + ' records.');
    }
}
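To see what prepareQueryForUpdate actually produces, here is the same string-building logic replayed as standalone plain JavaScript against a small fabricated addressData object. The schema and table names follow the tutorial; the address values are invented for illustration.

```javascript
// Standalone re-creation of the UPDATE statement builder, for illustration only.
function buildUpdate(schema, table, addressData, lat, lon) {
  var qupdate = 'UPDATE "' + schema + '"."' + table + '"';
  var qset = ' SET ';
  for (var key in addressData) {
    qset += ('"' + key + '" = \'' + addressData[key] + '\'');
    qset += ', ';
  }
  qset = qset.slice(0, -2); // drop trailing ", "
  var qwhere = ' WHERE "LATITUDE" = \'' + lat + '\' AND "LONGITUDE" = \'' + lon + '\'';
  return qupdate + qset + qwhere;
}

var sql = buildUpdate("REVERSE_GEOCODE", "REVERSE_GEOCODE",
                      { CITY: "Chicago", STATUS: "OK" }, 41.881832, -87.623177);
console.log(sql);
// → UPDATE "REVERSE_GEOCODE"."REVERSE_GEOCODE" SET "CITY" = 'Chicago', "STATUS" = 'OK' WHERE "LATITUDE" = '41.881832' AND "LONGITUDE" = '-87.623177'
```

Because the values are concatenated rather than bound as parameters, keep in mind that an address containing a single quote would break the generated statement.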

Again, make sure you modify the package name in the $.import statement to reflect the one created in your system.

Also, update the prepareParameters function in the code to reflect the schema and table name that were created with the coordinate information.

Once that's done, activate the xsjs file.

You should now be able to run the URL to call the Google API:

https://your_tenant_host:port/Package_name/javascript_file.xsjs

This is what it looks like in our tenant, omitting the tenant name:

https://our_tenant.com:4390/ReverseGeocode/geodataEnrich.xsjs

Once the URL is executed, we should see the table populated with the necessary information.

We can now use this table in our HANA modelling, either by augmenting the data with these attributes, or by building name path hierarchies that allow us to drill into the geopolitical data.

STEP 4

We'll now walk through the process of scheduling this JavaScript so that the information can be updated on a pre-defined schedule.

We will create an XS Job Schedule file.

Provide a name and hit Finish.

Enter the syntax below:

{
    "description": "Reverse Geocode",
    "action": "ReverseGeocode:geodataEnrich.xsjs::main",
    "schedules": [
       {
          "description": "Reverse Geocode",
          "xscron": "* * * * * 59 59"
       }
    ]
}

Make sure you update the action to reference the package name and JavaScript file created in your system.

Activate the file.

Note that the scheduling notation is in XSCRON. Here’s the help reference page: https://help.sap.com/viewer/d89d4595fae647eabc14002c0340a999/2.0.02/en-US/6215446213334d9fa96c662c18fb66f7.html

Ensure the user scheduling the job has the correct roles assigned:

sap.hana.xs.admin.roles::HTTPDestAdministrator

sap.hana.xs.admin.roles::JobAdministrator

The xscron expression has seven fields: year, month, day, day of the week, hour, minute and second. So for our specific XS Job file, "* * * * * 59 59" schedules it to run every hour of every day, month and year, on the 59th minute and 59th second mark. You can change that based on your requirements.
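As a quick sanity check on an xscron string, the seven space-separated fields can be labelled programmatically. This plain-JavaScript helper is just illustrative, with the field names taken from SAP's xscron documentation:

```javascript
// xscron has seven fields, in this order:
// year month day dayOfWeek hour minute second
function parseXscron(expr) {
  var names = ["year", "month", "day", "dayOfWeek", "hour", "minute", "second"];
  var parts = expr.trim().split(/\s+/);
  var out = {};
  for (var i = 0; i < names.length; i++) {
    out[names[i]] = parts[i];
  }
  return out;
}

var schedule = parseXscron("* * * * * 59 59");
console.log(schedule);
// hour is "*" (every hour), minute and second are both "59"
```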

Next, we need to go to the XS Admin page using the URL below:

http://YOUR_HOST.com:8090/sap/hana/xs/admin/

Once logged in, click on the menu and select XS Artifact Administration.

You should see all of the packages in your system. Click on the arrow next to the package containing your XS Job file.

You should see some of your artifacts. Click on the XS Job that you created.

The job details will be displayed. Next, hit the Configuration tab.

Enter your user and password, tick the Active checkbox, hit the Save Job button, and voila: your job is scheduled. You can then click the View Logs button to see the status of the job.

And there you have it.
