Oct 18 2014

Hi,

Having recently come across this issue and several problems related to it, I took the following approach, which I would like to share with the wider community, and to get reviews on it:

Problem:

When designing cubes, there are often situations in which a cube does not have all the dimensions required for reporting purposes. The reason is that the planning users want to splash data across only a limited set of dimensions, while for analytical purposes they need some additional dimensions (which may be attributes of the dimensions already used in the planning cubes).

Or there may be other reasons involving lookups or complex computations between the planning and reporting cubes. In summary, the problem is keeping the cubes in sync in real time, so that the numbers remain consistent across the system and do not get out of balance, something otherwise confirmed only by numerous reconciliation reports that are tedious to keep an eye on.

Some argue that Rules can achieve this by linking multiple cubes. In my experience so far, Rules are great for computations on variables within a single cube (and still acceptable with lookup formulas when cube sizes are relatively small), but as soon as lookup functions such as "PALO.DATAC" start to appear in the rules, none of my rules were ever performant enough for the system to remain usable (my projects involved cubes with large data sets). And this is not just Jedox; even Microsoft confirms this behaviour for SSAS and suggests keeping cubes as self-contained as possible.

Microsoft:

Any calculation using the LookupCube function is likely to suffer from poor performance. Instead of using this function, consider redesigning your solution so that all of the data you need is present in one cube.

http://msdn.microsoft.com/en-au/library/ms144720.aspx

Solution:

At work I have learned to always keep things configurable (my golden rule), as something changes every day: rules, logic, constants. So the best mechanism I could plan to suit the flexibility requirements and keep the cubes in sync was to leverage Jedox ETL, custom-designing the ETL project to sync the cubes by the required variables and kicking off the ETL via its web service, triggered by the Supervision Server on the cell-change event. This way we won't lose any cube-change event, and all the affected cubes get synced up.

The configuration of the setup is as follows:

1. Configure "palo.ini" to monitor cell-change events via the Supervision Server.

Add the following command:

“use-cube-worker”

2. In the Jedox "svs/sample_scripts" folder, make a copy of "sep.inc.on_cell_change.php" and place it in the "custom_scripts" folder, renaming it as you like; in my case it is "sep.inc.on_cell_change_cube_sync.php".

Amend the code in the file as follows:


/**
* Sample Script for cell change event in Jedox OLAP Server. Version 6.0.0
* SVN: $Id: sep.inc.on_cell_change.php 239 2013-02-12 14:52:17Z vmalicevic $
*/

// adapt protocol, host, port here to match your setup
define('ETL_URL','http://127.0.0.1:7775');

// wait for job to finish prior returning status
define('WAIT_FOR_FINISH', true);

// max execution time of script in seconds used in combination with WAIT_FOR_FINISH
define('MAX_EXECUTION_TIME', 10);

class SEPEventHandler extends SEPEventHandlerBase
{
public function OnUserLogin($username)
{
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< User logged in: $username >>");
}

public function OnUserLogout($username)
{
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< User logged out: $username >>");
}

public function OnUserAuthenticate($username, $password) { // bool
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< User authenticate >>");
return true;
}

public function OnUserAuthorize($username, $password, array& $groups) { // bool
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< User authorize >>");
return true;
}

public function OnWindowsUserAuthorize($domain, $username, array $winGroups, array& $groups) { // bool
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< User Windows authorize, domain $domain, username $username >>");
$groups = $winGroups;
return true;
}

public function OnServerShutdown() { // void
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< Server shutdown handler >>");
}

public function OnDatabaseSaved($database) { // void
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< Databased saved : $database >>");
}

public function OnTermination() { // void
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< Termination handler >>");
}

# set the cubes, or parts of cubes, whose changes should trigger the worker
public function InitCubeWorker($database,$cube)
{
if($database == "Demo")
{
# set triggering area for cube 'Sales'
if($cube == "Sales")
{
$AreaA = array(DIMENSION_TOTAL,DIMENSION_TOTAL,DIMENSION_TOTAL, DIMENSION_TOTAL,DIMENSION_TOTAL,array('Units','Turnover'));
$this->WatchCubeArea($AreaA,'SalesCube');
}
}
}

public function OnDrillThrough( $database, $cube, $mode, $arg ) { // string
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< OnDrillThrough >>");
return "not implemented;\r\n";
}

public function InitDimensionWorker()
{
sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
sep_log("<< Get triggering dimensions >>");
}
}

function SalesCube($database, $cube, $areaid, $sid2, $coordinates, $value, $splashMode, $additive)
{

include_once('etl_code.php');

$user = get_user_for_sid($sid2); // user who made the change
$groups = get_groups_for_sid($sid2); // groups of the user who made the change

if ($coordinates[5] == 'Units')
{
# read the Price of the target product from its attribute cube
$unit_price = palo_dataa("SupervisionServer/$database", '#_Products',array('Price Per Unit',$coordinates[0]));

# calculate turnover
$turnover_cell = array ($coordinates[0], $coordinates[1], $coordinates[2], $coordinates[3], $coordinates[4], 'Turnover');
$turnover_value = $unit_price * $value;

# begin locking the database (needed to be allowed to change data)
# (notice that the time the server is locked for writing is kept to the minimum needed)
event_lock_begin($areaid,$sid2);

# get old units value
$oldUnitsVal = palo_data("SupervisionServer/$database", $cube, $coordinates[0], $coordinates[1], $coordinates[2], $coordinates[3], $coordinates[4], $coordinates[5]);

# write original dataset at given coordinates in 'Sales' cube
# (notice that in a triggered event you decide yourself whether the input data should be written to the
# database or not; if you want it written, you must not forget to write it yourself)
$write = palo_setdataa($value, FALSE, "SupervisionServer/$database", $cube, $coordinates);

# write calculated turnover into column 'Turnover' in 'Sales' cube
$write = palo_setdataa($turnover_value, FALSE, "SupervisionServer/$database",$cube, $turnover_cell);

# ending lock of affected database
event_lock_end();
}
elseif ($coordinates[5] == 'Turnover')
{
# Throws a popup notice in Excel
sep_error("Warning", "No one is allowed to change the Turnover");
}

// Put the ETL kick-off code at the bottom of the script, so that the changes intended by the user are already saved to the cube before the corresponding ETL job picks them up.
// The location of the ETL kick-off code is vital to meeting this requirement.

sep_log("ETL Function is about to be called !!!");
sep_log(dirname(__FILE__));

startJob();

}

Additionally, in the same folder, create the following file called "etl_code.php" for the custom code relating to the ETL kick-off:


<?php
// Helpers for kicking off a Jedox ETL job via the ETL server's SOAP web service.

function connectSoap()
{
    $wsdl_url = ETL_URL . '/etlserver/services/ETL-Server?wsdl';
    $server = @new SoapClient($wsdl_url, array('exceptions' => true, 'location' => $wsdl_url));
    return $server;
}

function startJob()
{
    sep_log("ETL Function is called...");
    $server = connectSoap();

    // Locator of the job to run: <project>.<type>.<name>
    $project = 'LearnDummy';
    $type = 'jobs';
    $name = '[J][JustFromEvent]';
    $locator = "$project.$type.$name";

    $response = $server->execute(array('locator' => $locator));
    sep_log("ETL Kick off has now been called...");
}
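Since the cube worker callback runs in line with the user's write, it is worth making the kick-off defensive, so that an unreachable ETL server only logs an error and never breaks the cell change. Below is a minimal sketch of such a wrapper (the function name startJobSafe and the try/catch are my additions, not part of the original sample); SalesCube() would then call startJobSafe() instead of startJob():

<?php
// Hedged sketch: call startJob() defensively so that a SOAP failure
// is only logged and never interrupts the already-committed cube write.
function startJobSafe()
{
    try {
        startJob();
    } catch (SoapFault $e) {
        sep_log("ETL kick-off failed: " . $e->getMessage());
    } catch (Exception $e) {
        sep_log("Unexpected error during ETL kick-off: " . $e->getMessage());
    }
}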

3. Following the above changes, amend the SVS configuration script "sep.inc.php" to point to your new custom script:

include './custom_scripts/sep.inc.on_cell_change_cube_sync.php';

4. Finally, restart the MOLAP service, fire up a Jedox spreadsheet, and with your inputs flowing into the cube, keep an eye on the Jedox ETL log accumulating entries, indicating that the process works and your ETL is being called upon any change occurring in the targeted cube region. (In my case the ETL project name was "LearnDummy" with the job name "[J][JustFromEvent]".)

[Screenshot: Jedox ETL log showing the job triggered by cell changes]

Note: One thing to be very careful about with this process: the mechanism is primarily built to track user-based inputs flowing into the cube from spreadsheets and sync them over. For any ETL-based cube load, make sure the SVS event is deactivated, for optimal performance of the nightly load processes; if missed, this can massively overburden a mechanism that was never designed for bulk loads.
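One simple way to implement such a switch (an assumption on my part, not something from the original setup) is a guard at the top of the cube worker callback that checks a flag file and returns early while a batch load is running:

<?php
// Hedged sketch: skip event handling during nightly ETL loads. The
// flag file is hypothetical; create it just before the batch load
// starts and delete it once the load completes.
function batchLoadInProgress()
{
    return file_exists(dirname(__FILE__) . '/etl_load_in_progress.flag');
}

// First line of SalesCube():
//     if (batchLoadInProgress()) { return; }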

The main benefit I experienced from this solution is that it is an asynchronous operation, so users don't notice any performance impact: the trapped user inputs are only used to kick off the ETL job, which works on its own, queuing jobs effectively under heavy load and ensuring all operations are completed, with logs in place to review and audit.

Hope it helps, Enjoy !!!

Oct 11 2014

Hi,

A seemingly trivial-looking task recently took me all over the board to figure out a solution for the business: users didn't want to create consolidated nodes in the dimensions to get the aggregations; instead they wanted the aggregated results via a subset filtered on attribute flags, phew !!!

Anyway, I thought this should be achievable with some tips and tricks in the area of Excel-based formulas. So instead of extending my words further, I will first present you with the picture that directly illustrates the problem and the solution, which I describe further below:

[Screenshot: Excel sheet with the "Sum by Subset" formula and its result]

As you can see above, the "Sum by Subset" row has the longest-looking formula, but once broken down into its components, this formula is very easy to interpret.

The formula first uses PALO.TFILTER to generate a filter on the Year dimension that includes only the elements "2011" and "2012".

This is then fed to the PALO.SUBSET formula, which is responsible for generating the subset that is fed into PALO.DATAV (be careful: this is DATAV, not DATAC; DATAV is the array-based retrieval of values from the cube via Excel).

Once we have PALO.DATAV working, we wrap its output in the Excel SUM formula and enter the whole formula as an array formula (Ctrl+Shift+Enter).

And there you go: you get the desired aggregated sums for the two years' worth of data in the Excel spreadsheet. This can also be leveraged in a paste view generated via the Jedox add-in for Excel.

Now, what you have read so far is the part of the story where Excel is in the picture, but the same story is not so successful in Jedox Web. I do acknowledge that Jedox has made a smart and rich UI to replicate Excel on the web, but some glitches remain, which is what elongated this story.

Jedox Web is not yet good at evaluating nested array formulas. I expect this to be fixed in upcoming releases; once it is, its awesomeness will increase further…

So the solution to the rescue, to achieve the same in Jedox Web, is a custom UDF via a macro.

[Screenshot: the subset_sum UDF in the Jedox Web macro editor]

function subset_sum($region, $yearPattern)
{
    // Connection settings (plain variables rather than constants,
    // so the UDF can safely be called more than once per request).
    $host = 'localhost';
    $port = '7777';
    $user = 'admin';
    $pass = 'admin';

    // Establish a new connection to the OLAP server.
    $connection = palo_init($host, $port, $user, $pass);
    $db_name = 'Demo';
    $cube_name = 'Sales';

    // Build a text filter on the Years dimension from the given pattern
    // and generate the subset of matching elements.
    $tFilter = palo_tfilter(array($yearPattern), true);
    $ySubset = palo_subset($connection, $db_name, "Years", 1, null, null, null, null, $tFilter, null);
    $ySubset = array_map(function($element){ return $element['name']; }, $ySubset);

    // Prefix the element list with the array dimensions expected by palo_datav.
    array_unshift($ySubset, 2, 1);
    $val = palo_datav($connection, $db_name, $cube_name,
        "All Products", $region, "Year", $ySubset, "Actual", "Units");

    // The first two entries of the result are not data values; drop them before summing.
    unset($val[0]);
    unset($val[1]);

    // Close the established connection and return the aggregated value.
    palo_disconnect($connection);
    return array_sum($val);
}
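Once the macro is saved with the workbook, the UDF can be called from a Jedox Web cell like any other spreadsheet function; for example (the region element name here is just an illustration, and the second argument is the pattern handed to palo_tfilter):

=subset_sum("West", "2011")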

Attached to this post are both the Excel and Jedox Web based solutions… and I look forward to comments if anything can be done better !!!

Hope it helps…

Download:

http://amolpandey.com/home/downloads/blog23_download.zip

Oct 09 2014

Hi,

In this post I would like to introduce you to a new feature of Microsoft SQL Server 2012: paged result sets using the "OFFSET FETCH" clause:

http://technet.microsoft.com/en-us/library/gg699618(v=sql.110).aspx

Now that we have the SQL in place to provide the flexibility, how do we go about leveraging it? For that, I have sketched out a very user-friendly interface with all the features of a paged report, along with features for the user to operate upon the rows:

[Screenshot: the paged report interface with sorting, paging, and row selection]

The above report provides sorting, paging, user selections, and operations upon the selected rows, all with the help of dynamic SQL constructed from variables, as sketched below.
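As a minimal sketch of that idea (the table, columns, and variable names below are placeholders, not the actual report code), the paged query can be assembled from the sort and paging variables like this:

<?php
// Hedged sketch: build a paged T-SQL query from variables. dbo.Orders
// and its columns are placeholders; in real code, validate the ORDER BY
// column against a whitelist and bind the paging values as parameters.
$page     = 3;                // 1-based page number
$pageSize = 25;               // rows per page
$orderBy  = 'OrderDate DESC'; // user-selected sort

$sql = sprintf(
    'SELECT OrderID, CustomerName, OrderDate
       FROM dbo.Orders
      ORDER BY %s
     OFFSET %d ROWS FETCH NEXT %d ROWS ONLY',
    $orderBy,
    ($page - 1) * $pageSize,
    $pageSize
);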

Give it a try with the code base provided in the download; I hope it provides the infrastructure required to achieve something similar for your feature-rich reports.

Downloads:

 http://amolpandey.com/home/downloads/blog22_download.zip

Oct 05 2014

"Screen shots", associated with the Windows keyboard's "Print Screen" function key, have been and still are utilized heavily to report errors occurring on the client side of an application. And clients not capturing/sending the screen state at the time of the error leads to difficulty reproducing the issue within the development arena in order to resolve it.

Thus, to simplify such situations, I thought it would be great if the screenshot could be captured at the moment the error occurs (in the error-handling "catch" block), sent across to the server, and saved as an image, without the client going through the pain of capturing, saving, attaching, and sending it via portal tickets/email.

Below is a small POC demonstrating this with the help of a JavaScript library and some custom PHP code.

The functionality illustrated above will allow you to capture a screenshot of the current spreadsheet in Jedox (though the widget area will remain blank, as the JavaScript API excludes iframe HTML components from the rendered screen-capture image). This technique is not restricted to Jedox; it can be part of any large-scale application written in custom HTML.

With this interesting technique I would also like to highlight its dangers, as it can be misused to target confidential client-side information, so please beware !!! (People with mischievous intent: shoo away!)

[Screenshot: the screen-capture widget rendering the spreadsheet to a canvas]

The process illustrated above has two steps: first, a screenshot of the target area is captured via "html2canvas.js", converted, and appended to the body of the page as an HTML "canvas" element; this is then picked up by the next JavaScript code block and sent to the server as raw blob content, where it is saved as a "PNG" file on the server.

Sample Code:

HTML/Javascript:


<html>
<body>
<p>Screen Shot Widget</p>
<input type="button" value="Capture Screenshot" onclick="takeScreenShot();">
<input type="button" value="Post Screenshot" onclick="postImage();">
<script src="/pr/custom/html2canvas.js"></script>
<script type="text/javascript">
 // Render the parent document's body to a canvas; replace any
 // previously captured canvas so repeated captures don't stack up.
 function takeScreenShot(){
     html2canvas(window.parent.document.body, {
         onrendered: function(canvas) {
             var cand = document.getElementsByTagName('canvas');
             if(cand[0] !== undefined && cand[0] !== null){
                 document.body.removeChild(cand[0]);
             }
             document.body.appendChild(canvas);
         }
     });
 }

 // Serialize the captured canvas to a PNG data URL and POST the raw
 // data to the server (a synchronous request, for simplicity).
 function postImage(){
     var cand = document.getElementsByTagName('canvas');
     var canvasData = cand[0].toDataURL("image/png");
     var ajax = new XMLHttpRequest();
     ajax.open("POST", '/pr/custom/testSave.php', false);
     ajax.setRequestHeader('Content-Type', 'application/upload');
     ajax.send(canvasData);
     alert('done');
 }

 takeScreenShot();

</script>

</body>

</html>

PHP:


<?php
if (isset($GLOBALS["HTTP_RAW_POST_DATA"]))
{
// Get the data
$imageData=$GLOBALS['HTTP_RAW_POST_DATA'];

// Remove the headers (data:,) part.
// A real application should use them according to needs such as to check image type
$filteredData=substr($imageData, strpos($imageData, ",")+1);

// Need to decode before saving since the data we received is already base64 encoded
$unencodedData=base64_decode($filteredData);

//echo "unencodedData".$unencodedData;

// Save file. This example uses a hard coded filename for testing,
// but a real application can specify filename in POST variable
$fp = fopen( 'test.png', 'wb' );
fwrite( $fp, $unencodedData);
fclose( $fp );
}
?>
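As a side note, $HTTP_RAW_POST_DATA was deprecated in later PHP releases (and removed in PHP 7), so on newer installations the raw POST body should be read from the php://input stream instead:

<?php
// On newer PHP versions, read the raw POST body via php://input
// instead of the deprecated $GLOBALS['HTTP_RAW_POST_DATA'].
$imageData = file_get_contents('php://input');
?>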

References:

http://permadi.com/blog/2010/10/html5-saving-canvas-image-data-using-php-and-ajax/

http://html2canvas.hertzen.com/

Download:

http://amolpandey.com/home/downloads/blog20_download.zip