Oct 22, 2014
 

Hi,

With another addition to my series of blog posts, this time I am trying to leverage the potential of .NET with Jedox. In this post I will write a custom .NET DLL and utilize it in a Jedox macro.

Solution:

Step 1: Create a “Class Library” C# project in Visual Studio, and remember to switch the target .NET Framework version to 3.5, as PHP’s DOTNET integration handles it more reliably than version 4 and beyond. (Also remember, when you fire up Visual Studio, to run it as Administrator.)

CropperCapture[16]

 

Step 2: For your project, generate a signing key, which is required for your DLL to be placed in the GAC (Global Assembly Cache, a .NET store for global library referencing; this is where PHP will look for your DLL).

CropperCapture[17]
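If you prefer the command line over the Signing tab in Visual Studio, a key file can also be generated with the .NET Strong Name tool (sn.exe, typically found alongside gacutil in the SDK Bin folder); the file name below is just an example, and you would then point the project’s signing settings at the generated .snk:

C:\>sn -k DotNetToJedox.snk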

Step 3: Navigate to the “AssemblyInfo.cs” file in your project and change the following line, making your assembly “ComVisible”.


// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(true)]

 

Step 4: Now you are ready to jot down your .NET functionality in your class file.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace DotNetToJedox
{
    public class JedoxInDotNet
    {
        public string GetData(string message)
        {
            return "Message:" + message;
        }
    }
}

Step 5: Fire up the “Command Prompt” as an administrator, and run the following command:

C:\>gacutil -i "<<Path to your dll>>\DotNetToJedox.dll"

Note: In case your command prompt suggests that it can’t find gacutil, just add the folder containing it to your system PATH; on my machine it lives under:

C:\Program Files\Microsoft SDKs\Windows\v6.0A

The following post can help you further on that:

http://stackoverflow.com/questions/3397479/where-is-gacutil-exe

The above will add your DLL to the machine’s GAC, which is then ready for your PHP/Jedox installation to utilize.

You can check that the assembly is physically present in the GAC by navigating to the following path and confirming:

C:\Windows\assembly

CropperCapture[18]
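Alternatively, you can query the GAC from the same admin command prompt; the listing also shows the Version, Culture and PublicKeyToken that you will need for the assembly string in the PHP macro below:

C:\>gacutil -l DotNetToJedox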

Step 6: Now it’s time to write a PHP macro in Jedox to utilize our DLL. The code in one of the spreadsheet macros is as follows:


function test_dotnet()
{
    $sampleClass = new DOTNET("DotNetToJedox, Version=1.0.0.0, Culture=neutral, PublicKeyToken=9fd42dd22c7fef0d, processorArchitecture=MSIL", "DotNetToJedox.JedoxInDotNet");
    return __msgbox($sampleClass->GetData('Get me the message !!'), 'Dot Net To Jedox');
}

And the result is:

CropperCapture[19]

 

Note: This mechanism works on a single machine only, as the GAC is system-wide, not network-wide. So if your application needs functionality from a .NET library, the DLL has to be deployed to the GAC on the server where Jedox is installed.

Hope it helps !!

Downloads:

http://amolpandey.com/home/downloads/blog25_download.zip

Oct 20, 2014
 

Hi,

Working with Jedox and saying the grid is not enough sounds a little sarcastic… there are cells all over the place !!!

Well, I thought to bring up this topic for a simple reason: although a product such as Jedox has enough grids to keep one busy, by their nature they are still a little limited in certain features, and the power of widgets combined with the many mature JavaScript grid components can spark life into your applications.

Limitations of not using the widgets/JS grid combo:

1. If you use a vertical dyna-range (Jedox), it dynamically expands the whole set of rows across all columns (like the insert-row functionality). That is good enough, but from an ease-of-use design perspective it severely limits the report/app design: many times, when working with dyna-ranges that span only a small number of columns (say 2–5), we have to leave the rest of the columns unused, because when the report is rendered any content placed alongside is pushed down as the dyna-range inserts rows to represent the table.

CropperCapture[11]

As you can see in the image above, the coloured area next to the dyna-range is wasted real estate on the UI, as we are unable to put any functionality there, such as a button or another report.

2. Advanced features such as a paged report, which I covered in my previous articles, are hard to get right in practice, as so many aspects of paging have to be taken care of when designing it manually; something off the shelf is easier and more robust to put to work.

3. Rich UI. Jedox in itself is a well-designed product, but in some places you need search functionality, a richer UI and more features as the business gets demanding.

And the list continues as you browse through the flashy new grid controls offered by JavaScript technology.

Solution

So in order to overcome the above limitations, I have built a sample report showcasing the usage of popular JavaScript grids, as follows:

1. DataTables

http://datatables.net/

CropperCapture[12]

2. Handsontable

http://handsontable.com/

CropperCapture[13]

3. ExtJS Grid

http://dev.sencha.com/extjs/5.0.0/examples/index.html

CropperCapture[14]

In all the above tables the data is sourced from a macro (except ExtJS, where it is local, but that can easily be amended to any source).

In the attached file the widgets are driven by custom HTML code files, which the widgets refer to and to which the macro sends the data array to be displayed on the grid; a minimal sketch of such a macro is shown below. Thus you can use the UI area more effectively for user functionality and build comprehensive applications with good reporting as well.
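As a rough sketch of the idea (the row contents here are placeholders, not the data from the attached workbook, where the rows are read from the cube with the palo_* functions shown elsewhere on this blog), a macro feeding one of these grids could look something like this:

function get_grid_data()
{
    // Placeholder rows - in the real workbook these are built from cube reads.
    $rows = array(
        array('region' => 'North', 'units' => 1200),
        array('region' => 'South', 'units' => 950),
        array('region' => 'East',  'units' => 780),
        array('region' => 'West',  'units' => 1430),
    );

    // The widget's HTML/JavaScript picks this JSON string up and hands it to the grid component.
    return json_encode($rows);
}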

The above method can greatly help you develop richer UIs and leverage more of the power of Jedox as a BI portal.

Hope this trick helps !!!

Downloads:

http://amolpandey.com/home/downloads/blog24_download.zip

Oct 18, 2014
 

Hi,

Having recently come across this issue and the problems relating to it, I took the following approach, which I would like to share to help the wider community and to get reviews on:

Problem:

Many times when designing cubes there are situations in which a cube may not have all the dimensions required for reporting purposes. The reasons vary: the planning users want to splash data only onto a certain limited set of dimensions, but for analytical purposes they need some additional dimensions (which may be attributes of the dimensions used in the planning cubes).

Or there are many other reasons, where lookups or complex computation is involved between the planning and reporting cubes. But the summary of the problem is keeping the cubes in sync in real time, so that the numbers remain homogeneous across the system and do not get out of balance, something otherwise confirmed only by numerous reconciliation reports that are tedious to keep an eye on.

Some argue that Rules can be used to link multiple cubes and achieve this, but in my experience so far, Rules are great for computations on variables within one cube itself (and still fine when cube sizes are relatively small, even with lookup formulas). As soon as lookup functions such as “PALO.DATAC” start to appear in the rules, none of my rules were ever performant enough for the system to remain usable (my projects involved cubes with large data sets). And it is not just Jedox: even Microsoft confirms this behaviour for SSAS and suggests keeping the cubes as self-contained as possible.

Microsoft:

Any calculation using the LookupCube function is likely to suffer from poor performance. Instead of using this function, consider redesigning your solution so that all of the data you need is present in one cube.

http://msdn.microsoft.com/en-au/library/ms144720.aspx

Solution:

At work I have learned to always keep things configurable (golden rule), as something changes every day: rules, logic, constants. So the best mechanism I could plan to meet the flexibility requirements of keeping the cubes in sync was to leverage Jedox ETL: custom-designing the ETL project to sync the cubes by the required variables, and kicking the ETL off through its web service, triggered by the Supervision Server on the cell-change event. This way we don’t lose any cube-change event, and all the affected cubes get synced up.

The configuration of the set up is as follows:

1. Configure “PALO.INI” for monitoring cell-change events via the Supervision Server.

Add the following command:

“use-cube-worker”

2. In the “SVS/sample_script” folder of Jedox, create a copy of “sep.inc.on_cell_change.php”, place it in the “custom_scripts” folder, and rename it as you like; in my case it is “sep.inc.on_cell_change_cube_sync.php”.

Amend the code in the file as follows:


/**
* Sample Script for cell change event in Jedox OLAP Server. Version 6.0.0
* SVN: $Id: sep.inc.on_cell_change.php 239 2013-02-12 14:52:17Z vmalicevic $
*/

// adapt protocol, host, port here to match your setup
define('ETL_URL','http://127.0.0.1:7775');

// wait for job to finish prior returning status
define('WAIT_FOR_FINISH', true);

// max execution time of script in seconds used in combination with WAIT_FOR_FINISH
define('MAX_EXECUTION_TIME', 10);

class SEPEventHandler extends SEPEventHandlerBase
{
    public function OnUserLogin($username)
    {
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< User logged in: $username >>");
    }

    public function OnUserLogout($username)
    {
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< User logged out: $username >>");
    }

    public function OnUserAuthenticate($username, $password) { // bool
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< User authenticate >>");
        return true;
    }

    public function OnUserAuthorize($username, $password, array& $groups) { // bool
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< User authorize >>");
        return true;
    }

    public function OnWindowsUserAuthorize($domain, $username, array $winGroups, array& $groups) { // bool
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< User Windows authorize, domain $domain, username $username >>");
        $groups = $winGroups;
        return true;
    }

    public function OnServerShutdown() { // void
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< Server shutdown handler >>");
    }

    public function OnDatabaseSaved($database) { // void
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< Database saved: $database >>");
    }

    public function OnTermination() { // void
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< Termination handler >>");
    }

    # setting the cubes or parts of cubes which should trigger the worker
    public function InitCubeWorker($database, $cube)
    {
        if ($database == "Demo")
        {
            # set the triggering area for cube 'Sales'
            if ($cube == "Sales")
            {
                $AreaA = array(DIMENSION_TOTAL, DIMENSION_TOTAL, DIMENSION_TOTAL, DIMENSION_TOTAL, DIMENSION_TOTAL, array('Units', 'Turnover'));
                $this->WatchCubeArea($AreaA, 'SalesCube');
            }
        }
    }

    public function OnDrillThrough($database, $cube, $mode, $arg) { // string
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< OnDrillThrough >>");
        return "not implemented;\r\n";
    }

    public function InitDimensionWorker()
    {
        sep_log("<< USING SAMPLE SCRIPT PLEASE ADJUST TO MATCH YOUR CONFIGURATION >>");
        sep_log("<< Get triggering dimensions >>");
    }
}

function SalesCube($database, $cube, $areaid, $sid2, $coordinates, $value, $splashMode, $additive)
{
    include_once('etl_code.php');

    $user = get_user_for_sid($sid2);     // user who made the change
    $groups = get_groups_for_sid($sid2); // groups of the user who made the change

    if ($coordinates[5] == 'Units')
    {
        # read the Price of the target product from its attribute cube
        $unit_price = palo_dataa("SupervisionServer/$database", '#_Products', array('Price Per Unit', $coordinates[0]));

        # calculate turnover
        $turnover_cell = array($coordinates[0], $coordinates[1], $coordinates[2], $coordinates[3], $coordinates[4], 'Turnover');
        $turnover_value = $unit_price * $value;

        # starting to lock the database (needed to be allowed to change data)
        # (notice that the time the server will be locked for writing has been kept to the minimum needed)
        event_lock_begin($areaid, $sid2);

        # get old units value
        $oldUnitsVal = palo_data("SupervisionServer/$database", $cube, $coordinates[0], $coordinates[1], $coordinates[2], $coordinates[3], $coordinates[4], $coordinates[5]);

        # write the original dataset at the given coordinates in the 'Sales' cube
        # (notice that in case of a triggered event you may decide yourself whether the input data should be written to
        # the database or not. In case you want it written you must not forget to write it into the database)
        $write = palo_setdataa($value, FALSE, "SupervisionServer/$database", $cube, $coordinates);

        # write the calculated turnover into column 'Turnover' in the 'Sales' cube
        $write = palo_setdataa($turnover_value, FALSE, "SupervisionServer/$database", $cube, $turnover_cell);

        # ending lock of the affected database
        event_lock_end();
    }
    elseif ($coordinates[5] == 'Turnover')
    {
        # Throws a popup notice in Excel
        sep_error("Warning", "No one is allowed to change the Turnover");
    }

    // Put the ETL kick-off code at the bottom of the script so that the changes intended by the user are saved
    // to the cube before they are picked up by the corresponding ETL job.
    // The location of the ETL kick-off code is vital to address the requirements.

    sep_log("ETL Function is about to be called !!!");
    sep_log(dirname(__FILE__));

    startJob();
}

Additionally, in the same folder, create the following file, called “etl_code.php”, for the custom code relating to the ETL kick-off:


function connectSoap()
{
    $wsdl_url = ETL_URL . '/etlserver/services/ETL-Server?wsdl';
    $server = @new SoapClient($wsdl_url, array('exceptions' => true, 'location' => $wsdl_url));
    return $server;
}

function startJob()
{
    sep_log("ETL Function is called...");
    $server = connectSoap();
    $project = 'LearnDummy';
    $type = 'jobs';
    $name = '[J][JustFromEvent]';
    $locator = "$project.$type.$name";
    $response = $server->execute(array('locator' => $locator));
    sep_log("ETL Kick off has now been called...");
}

3. Following the above changes, amend the “SVS” configuration script “sep.inc.php” to point to your new custom script:

include './custom_scripts/sep.inc.on_cell_change_cube_sync.php';

4. Finally, restart the MOLAP service, fire up a Jedox spreadsheet and, with your inputs flowing into the cube, keep an eye on the Jedox ETL log accumulating, indicating that the process works and your ETL is being called on any change occurring in the targeted cube region. (In my case the ETL project name was “LearnDummy” with the job name “[J][JustFromEvent]”.)

CropperCapture[6]

Note: One thing to be very careful about with this process: the mechanism is primarily built to track user inputs to the cube from spreadsheets and sync them over. For any ETL-based cube load, make sure the SVS event is deactivated, for optimal performance of the nightly load processes; if this is missed, it can hugely overburden a process the mechanism wasn’t designed to cater for.

The main benefit I experienced from this solution is that it is an asynchronous operation, so users don’t notice any performance impact: the trapped user inputs are only used to kick off the ETL job, which works on its own, effectively queuing jobs under heavy load and ensuring all operations are completed, with logs in place to review and audit.

Hope it helps, Enjoy !!!

Oct 11, 2014
 

Hi,

A seemingly trivial-looking task recently took me all over the board to figure out a solution for the business: the users didn’t want to create consolidated nodes in the dimensions to get the aggregations; instead they wanted the aggregated results via a subset filtered on attribute flags, phew !!!

Anyway, I thought this should be achievable with some tips and tricks applied in the area of Excel-based formulas. So instead of extending my words further, I will first present the picture that directly illustrates the problem and the solution, which I describe in more detail below:

CropperCapture[1]

From the above, as you can see, the “Sum by Subset” row has the longest-looking formula, but once broken down into its components this formula is very easy to interpret.

The formula first uses PALO.TFILTER to generate a filter on the Year dimension that includes only the elements “2011” and “2012”.

This is then fed to the PALO.SUBSET formula, which generates the subset required by PALO.DATAV (be careful: this is DATAV, not DATAC, i.e. the array-based retrieval of values from the cube via Excel).

Once we have PALO.DATAV working, we wrap its output in the Excel SUM formula and enter the whole thing as an array formula (Ctrl+Shift+Enter).

And there you go: you have the desired aggregated sum for the two years’ worth of data in the Excel spreadsheet. This can also be leveraged in a paste view generated via the Jedox add-in for Excel.

Now, what you have read so far is only part of the story, the part where Excel is in the picture. The same story is not quite as successful in Jedox Web: I do acknowledge that Jedox has built a smart and rich UI to replicate Excel on the web, but some glitches remain, which is what elongated this story.

Jedox Web is not yet good at evaluating nested array formulas. I expect this to be fixed in the coming releases; once it is, its awesomeness will increase further…

So the solution to the rescue, to achieve the same in Jedox Web, is a custom UDF via a macro.

CropperCapture[2]

function subset_sum($region, $yearPattern)
{
    // Define constant variables.
    define('HOST', 'localhost');
    define('PORT', '7777');
    define('USER', 'admin');
    define('PASS', 'admin');

    // Establishes a new connection to the OLAP server.
    $connection = palo_init(HOST, PORT, USER, PASS);

    // Create the variables db_name and cube_name.
    $db_name = 'Demo';
    $cube_name = 'Sales';

    $tFilter = palo_tfilter(array($yearPattern), true);
    $ySubset = palo_subset($connection, $db_name, "Years", 1, null, null, null, null, $tFilter, null);
    $ySubset = array_map(function($element){ return $element['name']; }, $ySubset);
    array_unshift($ySubset, 2, 1);

    $val = palo_datav($connection, $db_name, $cube_name,
        "All Products", $region, "Year", $ySubset, "Actual", "Units");
    unset($val[0]);
    unset($val[1]);

    // Closes the established connection.
    palo_disconnect($connection);
    return array_sum($val);
}
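For completeness: a UDF defined this way in the macro engine is then used like a normal spreadsheet function in the Jedox Web workbook. With hypothetical arguments (the region and year pattern below are just placeholders for whatever your report passes in), the cell entry would look along the lines of:

=subset_sum("West", "2011")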

Attached to this post are both the Excel and Jedox Web based solutions… and I look forward to comments if anything better can be done !!!

Hope it helps…

Download:

http://amolpandey.com/home/downloads/blog23_download.zip

Oct 09, 2014
 

Hi,

In this post I would like to introduce you to the new feature provided by the MSSQL Server 2012 syntax for paged result sets, using the “OFFSET FETCH” clause:

http://technet.microsoft.com/en-us/library/gg699618(v=sql.110).aspx

Now that we have the SQL in place to provide the flexibility, how do we go about leveraging it? For that, I have sketched out a very user-friendly interface with all the features of a paged report, along with features for the user to operate on the rows:

CropperCapture[4]

The above report provides the functionality of sorting, paging, user selections and operations on them, with the help of dynamic SQL constructed using variables; a rough sketch of how such a paged query can be assembled follows below.
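As a minimal sketch (the table, columns and sort options below are made up for illustration and are not the ones used in the attached report), the dynamic paged SQL with OFFSET/FETCH could be assembled roughly like this; how the resulting statement is then executed is left to whatever connection your report already uses:

function build_page_query($pageNumber, $pageSize, $sortColumn, $sortDirection)
{
    // Whitelist the sort column and direction, since identifiers cannot be parameterised safely.
    $allowedColumns = array('OrderID', 'CustomerName', 'OrderDate');
    if (!in_array($sortColumn, $allowedColumns)) {
        $sortColumn = 'OrderID';
    }
    $sortDirection = (strtoupper($sortDirection) === 'DESC') ? 'DESC' : 'ASC';

    // OFFSET/FETCH requires an ORDER BY clause; the offset is (page - 1) * page size.
    $pageSize = (int)$pageSize;
    $offset   = ((int)$pageNumber - 1) * $pageSize;

    return "SELECT OrderID, CustomerName, OrderDate
            FROM dbo.Orders
            ORDER BY $sortColumn $sortDirection
            OFFSET $offset ROWS FETCH NEXT $pageSize ROWS ONLY";
}

For example, build_page_query(3, 25, 'OrderDate', 'DESC') produces the statement for rows 51–75, sorted by the newest order date first.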

Give it a try with the code base provided in the download; I hope it provides the infrastructure required to achieve something similar for your own feature-rich reports.

Downloads:

 http://amolpandey.com/home/downloads/blog22_download.zip