1. To download a file from FTP, we first have to create a connection by providing the FTP server credentials (host URL, username, password, port number, and the connection type, i.e. FTP or SFTP). Component name: tFTPConnection
2. The next task is to provide the FTP file path (here we mention the FTP location of the files). Component name: tFTPFileList
3. Once done, we have to mention where we want to put the files (here we mention the local system path where the FTP files should be saved). Component name: tFTPGet



1. To untar a tar file there is a component tFileArchive, but instead of that I am using GzipCompressor via Java code in a tJava component.
2. Here we just need to drag and drop the tJava component, and in it provide the location of the tar file and the path where you want to untar it.

File dest = new File(dirName);

TarArchiveInputStream tarIn = new TarArchiveInputStream(
        new GzipCompressorInputStream(
                new BufferedInputStream(new FileInputStream(TarName))));

TarArchiveEntry tarEntry = tarIn.getNextTarEntry();

while (tarEntry != null) {

    // create a file with the same name as the tarEntry
    File destPath = new File(dest, tarEntry.getName());

    System.out.println("working: " + destPath.getCanonicalPath() + " -- Tar Entry: " + tarEntry.getName());

    System.out.println("\nCSV FILE Location ::::" + context.csvloc + "\n");

    System.out.println("Dest: " + dest);

    if (tarEntry.isDirectory()) {
        System.out.println("Creating directory: " + tarEntry.getName());
        destPath.mkdirs();
    } else {
        // copy the entry's bytes to the destination file
        byte[] btoRead = new byte[2048];
        BufferedOutputStream bout = new BufferedOutputStream(new FileOutputStream(destPath));
        int len; // number of bytes read per iteration
        while ((len = tarIn.read(btoRead)) != -1) {
            bout.write(btoRead, 0, len);
        }
        bout.close();
        btoRead = null;
    }

    tarEntry = tarIn.getNextTarEntry();

} // while loop ends here

tarIn.close();


(This code is capable of picking up the tar file in the given folder as well as untarring that file into the specified folder path.)

Here "dirName" denotes the folder into which the files are extracted and "TarName" denotes the name (path) of the tar file.
3. Regarding iteration, you can connect the tFTPGet component to this tJava component with an Iterate link. This way the tJava component gets one tar file at a time and processes it.

So lastly, the flow is similar to the picture below.


Reading multiple properties files in spring 3 web MVC


Spring allows us to externalize string literals in its context configuration files into external properties files, in order to separate application specific settings from framework specific configuration. Spring will read all the properties files declared by PropertyPlaceholderConfigurer bean to resolve the placeholders at application’s start up time.

Simple Example

Declare a PropertyPlaceholderConfigurer bean in Spring's application context as follows:
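The bean declaration itself did not survive in this post; a typical declaration (the properties file name here is illustrative) looks like:

```xml
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="location" value="" />
</bean>
```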
This tells Spring to load the named properties file. An exception will be thrown if Spring cannot find the specified properties file.
The file should contain name/value pairs, for example:
Name = xyz
Filepath = C:/Test/Setting.xml
Host = localhost
User = testuser

By default Spring looks for the properties files in the application directory. So if we specify a plain file name, it will find the file under the WEB-INF directory of the application in the case of a Spring MVC application.
We can use the prefix classpath: to tell Spring to load a properties file from the application's classpath. For example:
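The classpath example was lost from this post; a typical location value (file name illustrative) is:

```xml
<property name="location" value="" />
```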

In the case of a Spring web application the file should be present in the WEB-INF/classes directory, or in the source directory src in the Eclipse IDE.

Use the prefix file:/// or file: to load a properties file from the file system.
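The file-system example was also lost; a typical location value matching the path discussed below (file name illustrative) is:

```xml
<property name="location" value="file:///D:/Application/Config/" />
```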

The Spring context will then load the file from D:/Application/Config, and throws an exception if the file is not present in the specified path.
Loading multiple properties files
Spring also allows us to specify multiple properties files for the PropertyPlaceholderConfigurer as follows:
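The multi-file declaration was lost from this post; a typical form using the locations property (file names illustrative) is:

```xml
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="locations">
        <list>
            <value></value>
            <value></value>
            <value>file:///D:/Application/Config/</value>
        </list>
    </property>
</bean>
```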



We can add as many properties files to the list as needed. The Spring context will load all the properties files specified in the list at application start-up.

Loading Properties files values in spring controller
Suppose you have a properties file as follows:
Name = xyz
Filepath = C:/Test/Setting.xml
Host = localhost
User = testuser

We can use the values read from the properties files in a Spring controller by declaring the @Component annotation on the controller class and injecting the values with @Value, for example:

@Component
public class Controller {
    @Value("${Name}")
    String Name;
    @Value("${Filepath}")
    String Filepath;
    @Value("${Host}")
    String Host;
    @Value("${User}")
    String User;
}

The values from the properties files are automatically assigned to the declared variables by the Spring framework, and we can use these variables inside the application.
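Under the hood these files are plain java.util.Properties name/value files. As a standalone sketch of how such a file is parsed (the class and method names here are illustrative, not part of Spring):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PropsDemo {

    // Parse name=value pairs (the same format Spring reads) and return one value.
    public static String readProp(String contents, String key) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(contents));
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return props.getProperty(key);
    }

    public static void main(String[] args) {
        String file = "Name=xyz\nHost=localhost\nUser=testuser";
        System.out.println(readProp(file, "Host")); // prints: localhost
    }
}
```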

Muqtar Ahmed
Helical IT Solutions

Define JNDI for Pentaho BA server 5.0


Note: For illustration I'm showing the Oracle 11g configuration. Replace the Resource name, username, password, driverClassName, and url parameters, or any other relevant connection settings, with your own.

Add Driver

Place the appropriate driver for Oracle 11g, which is ojdbc6.jar, in this directory of the BA Server: /pentaho/server/biserver-ee/tomcat/lib/.

Note: There should be only one driver for one database in this directory. Ensure that there are no other versions of the same vendor’s driver in this directory. If there are, back up the old driver files and remove them to avoid version conflicts.

Specify JNDI connections

  1. Stop Tomcat & the BA Server.
  2. Edit the /tomcat/webapps/pentaho/WEB-INF/web.xml file.
  3. At the end of the <web-app> element, in the same part of the file where you see <!-- insert additional resource-refs -->, add a resource-ref entry for the datasource.
  4. Use myDatasource as its description and jdbc/myDatasource as its res-ref-name.

  5. Save & Close web.xml
  6. Open & modify this file /tomcat/webapps/pentaho/META-INF/context.xml.
  7. Add the datasource's <Resource> entry anywhere inside the <Context> element of context.xml.
  8. Save & Close context.xml
  9. Open the simple-jndi folder.
  10. Open & modify this file: biserver-ce/pentaho-solutions/system/simple-jndi/
  11. Add these lines
  12. myDatasource/type=javax.sql.DataSource
    myDatasource/driver=oracle.jdbc.OracleDriver
    myDatasource/url=jdbc:oracle:thin:@//localhost:1521/sysdba
    myDatasource/user=dbuser
    myDatasource/password=password

  13. Save & Close
  14. Delete the pentaho.xml file located in the /tomcat/conf/Catalina/ directory.
  15. Start tomcat & BA server.
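The XML snippets referenced in steps 3-4 and 7 are not reproduced above. For the Oracle example used here, typical entries look like the following (adjust the resource name and connection settings to your datasource; the pool attributes shown are illustrative):

```xml
<!-- web.xml (steps 3-4): inside <web-app>, after the
     "insert additional resource-refs" comment -->
<resource-ref>
  <description>myDatasource</description>
  <res-ref-name>jdbc/myDatasource</res-ref-name>
  <res-type>javax.sql.DataSource</res-type>
  <res-auth>Container</res-auth>
</resource-ref>

<!-- context.xml (step 7): inside the <Context> element -->
<Resource name="jdbc/myDatasource" auth="Container"
          type="javax.sql.DataSource"
          driverClassName="oracle.jdbc.OracleDriver"
          url="jdbc:oracle:thin:@//localhost:1521/sysdba"
          username="dbuser" password="password"
          maxActive="20" maxIdle="5" maxWait="10000" />
```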

Working with Crosstab Component In Pentaho Report Designer

This blog will teach the reader about working with the Crosstab component in Pentaho Report Designer.

Crosstab is an experimental feature in Pentaho Report Designer (PRD). By default we cannot see the crosstab component in PRD.

To enable this feature we need to make a small change under Edit -> Preferences.


In this panel, go to the "Other Settings" tab and check the second option, i.e. "Enable (unsupported) experimental features".

The crosstab component will then automatically be visible in PRD.

Data Source: JDBC: SampleData (Memory)

Query :


If we preview this query we will get data like below:


But in the above result set it is very difficult to analyze the data.

So, for better visualization, I decided to create a report with a crosstab on this query.

Steps :

  1. Go to Master Report in the Structure tab.
  2. Right-click on Master Report and click Add Crosstab Group.

  3. Then you will find the below window.

  4. Here you need to decide which field you want to see in the rows and which in the columns; set the fields accordingly and you will get the output.



Now preview the report; you will find something like below:




Rupam Bhardwaj

Helical IT Solutions


Dear readers, this blog will talk about how to create an event in MySQL along with some advanced MySQL commands. To create an event in MySQL, we have to follow the pattern written below.

1. Declare Delimiter

2. Define Name of the event

3. Define when to schedule

4. Start with “DO”

5. Then “Begin”

6. Define Business logic

(like variable declarations, the job which you want to schedule through your event, any conditions... so mainly it is the body of your event)

7. Then declare the end of your event like "END <delimiter>"

8. Change the delimiter back to the normal delimiter.

Example: Here I call a stored procedure in an event which is scheduled every 4 hours, passing two date parameters for that stored procedure along with some additional parameters. I also use a loop and an if-else condition in this event. (Tested and executed.)

delimiter $$

CREATE EVENT ue_schedule_test

ON SCHEDULE EVERY 4 HOUR

DO

BEGIN

DECLARE done INT DEFAULT FALSE;

DECLARE to_temp TEXT(25);

DECLARE from_temp TEXT(25);

DECLARE pv_temp TEXT(20);

-- the cursor query was not shown in the original post; the SELECT below is a placeholder
DECLARE curs1 CURSOR FOR SELECT pv_column FROM pv_table;

DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;

OPEN curs1;

read_loop: LOOP

FETCH curs1 INTO pv_temp;

IF done THEN

LEAVE read_loop;

END IF;

SELECT DATE_FORMAT(DATE_ADD(convert_tz(CURDATE(),'SYSTEM','+00:00'),INTERVAL -4 HOUR),'%Y-%m-%d %H:%i:%S') INTO from_temp;

SELECT DATE_FORMAT(DATE_ADD(convert_tz(CURDATE(),'SYSTEM','+00:00'),INTERVAL 4 HOUR),'%Y-%m-%d %H:%i:%S') INTO to_temp;

DELETE FROM availibility WHERE Date=from_temp;

CALL usp_availability_test(from_temp,to_temp,pv_temp,'Total','+00:00','custom');

END LOOP;

CLOSE curs1;

END $$

delimiter ;

Some Other Important MySQL Commands
1. To clear the console — \! clear
2. To delete a procedure — drop procedure <procedure_name>
3. To show all stored procedures — show procedure status
4. To get the 2nd highest salary — SELECT DISTINCT(Salary) FROM employee ORDER BY Salary DESC LIMIT 1,1
5. To get the 3rd highest salary — SELECT DISTINCT(Salary) FROM employee ORDER BY Salary DESC LIMIT 2,1
6. To convert a string into a datetime type — SELECT STR_TO_DATE(yourdatefield, '%m/%d/%Y') FROM yourtable
7. To check whether the event scheduler is ON/OFF — select @@event_scheduler
8. To start the event scheduler — set GLOBAL event_scheduler=ON
9. To delete duplicate records from a table —
delete from table1 USING table1, table1 as vtable
WHERE table1.ID < vtable.ID AND table1.field_name = vtable.field_name;
So, these are some advanced MySQL features which may help you.

Have a Good Day………!!!
Helical IT Solutions

Anonymous Authentication in Pentaho

This blog will be talking about anonymous authentication in Pentaho. You can bypass the built-in security on the BA Server by giving all permissions to anonymous users. An “anonymousUser” is any user, either existing or newly created, that you specify as an all-permissions, no-login user, and to whom you grant the Anonymous role. The procedure below will grant full BA Server access to the Anonymous role and never require a login.

1. Stop the BA Server.
2. Open the /pentaho/server/biserver-ee/pentaho-solutions/system/applicationContext-spring-security.xml file and ensure that a default anonymous role is defined. Match your bean definition and property value to the example below.

<bean id="anonymousProcessingFilter" class="">

<!-- omitted -->

    <property name="userAttribute" value="anonymousUser,Anonymous" />

</bean>



3. Find these two beans in the same file:
  o filterSecurityInterceptor
  o filterInvocationInterceptorForWS
Locate the objectDefinitionSource properties inside the beans and match their contents to this code example:

<bean id="filterInvocationInterceptor" class="">
    <property name="authenticationManager">
        <ref local="authenticationManager" />
    </property>
    <property name="accessDecisionManager">
        <ref local="httpRequestAccessDecisionManager" />
    </property>
    <property name="objectDefinitionSource">
        <value><![CDATA[ \A/.*\Z=Anonymous,Authenticated ]]></value>
    </property>
</bean>


4. Save the file, then open pentaho.xml in the same directory.
5. Find the anonymous-authentication lines of the pentaho-system section, and define the anonymous user and role.

<!-- omitted -->
    <anonymous-authentication>
        <anonymous-user>anonymousUser</anonymous-user>
        <anonymous-role>Anonymous</anonymous-role>
    </anonymous-authentication>
<!-- omitted -->

6. Open the file in the same directory.

a) Find the singleTenantAdminUserName and replace the value with the anonymousUser name.
b) Find the singleTenantAdminAuthorityName and replace the value with Anonymous.
c) Save the file.

Open the pentahoObjects.spring.xml file.
Find all references to the bean id=”Mondrian-UserRoleMapper” and make sure that the only one that is uncommented (active) is this one:

<bean id="Mondrian-UserRoleMapper">
    <property name="sessionProperty" value="MondrianUserRoles" />
</bean>

Save pentahoObjects.spring.xml and close the file.
Restart BA Server.
Enter http://localhost:8080/pentaho in the browser address field. You will find that the Pentaho home page opens without requiring a login.

Archana Verma
Helical IT Solutions

D3 Bubble Chart Integration with Jaspersoft

In this blog we will discuss D3 bubble chart integration with Jaspersoft using the HTML method of integration.

All the reports are developed using iReport 5.5 Professional and JasperReports Server 5.5.

As the HTML component of JasperReports Server does not load any scripts inside the HTML component, we loaded the script in one of the decorator pages (a JSP page). The page is located at the location:


In that page we included the scripts which we want to load. We added the following code in the JSP page at line no 46:

<script type="text/javascript" language="JavaScript"

The script to be added should be kept at location:


Bubble Chart

For this chart we need to include one more JS script file in the decorator page, as described at the start of the document.
The JS file used by the code below is dimple.js (the chart is built with the dimple library on top of D3, not sankey.js), which can be downloaded from the dimple project site.

Sample Code is shown below:

var svg = dimple.newSvg("#chartContainer", 1090, 500);
var data = [
  { "Analyst":"Cidalina Rivera", "Total Minutes":2114200, "Minutes Per Item":1100, "% SLA":80 },
  { "Analyst":"Kiran Parvathala", "Total Minutes":391800, "Minutes Per Item":1959, "% SLA":21 },
  { "Analyst":"KrishnaReddy Mavuru", "Total Minutes":1056125, "Minutes Per Item":1207, "% SLA":26 },
  { "Analyst":"Narasimha Dara", "Total Minutes":386740, "Minutes Per Item":610, "% SLA":30 },
  { "Analyst":"AslamJavid Shaik", "Total Minutes":1573856, "Minutes Per Item":1096, "% SLA":94 },
  { "Analyst":"Harini Vemulapalli", "Total Minutes":2846340, "Minutes Per Item":1890, "% SLA":100 },
  { "Analyst":"William Nelson", "Total Minutes":1205502, "Minutes Per Item":662, "% SLA":40 },
  { "Analyst":"Janaki Govindarajan", "Category":"Batch Job", "Total Minutes":72684, "Minutes Per Item":673, "% SLA":79 },
  { "Analyst":"Chiranjeevi Krishna Karne", "Total Minutes":542348, "Minutes Per Item":3307, "% SLA":52 },
  { "Analyst":"Masaru Hirata", "Total Minutes":129090, "Minutes Per Item":662, "% SLA":12 },
  { "Analyst":"Naveen Kodali", "Total Minutes":75076, "Minutes Per Item":548, "% SLA":84 },
  { "Analyst":"JoshyPeter Joseph", "Total Minutes":574860, "Minutes Per Item":2948, "% SLA":27 },
  { "Analyst":"Maheshwar Malkapuram", "Total Minutes":19845, "Minutes Per Item":105, "% SLA":92 },
  { "Analyst":"Sunil Bhalerao", "Total Minutes":353000, "Minutes Per Item":1765, "% SLA":20 },
  { "Analyst":"Tina Chan-Browne", "Total Minutes":51121, "Minutes Per Item":469, "% SLA":95 },
  { "Analyst":"Sirajuddin Mohammad", "Total Minutes":323363, "Minutes Per Item":1693, "% SLA":47 },
  { "Analyst":"Nishanth Nadam", "Total Minutes":620000, "Minutes Per Item":10000, "% SLA":22 },
  { "Analyst":"Santoshkumar Shinde", "Total Minutes":715000, "Minutes Per Item":11000, "% SLA":90 },
  { "Analyst":"Keith Moller", "Total Minutes":948000, "Minutes Per Item":12000, "% SLA":100 },
  { "Analyst":"AshokKumar Sangeetham", "Total Minutes":1166550, "Minutes Per Item":7070, "% SLA":88 }
];
// Note: the original post filtered on a "Date" field which is not present
// in the data above, so the call is kept here commented out:
// data = dimple.filterData(data, "Date", "1/1/2011");
var myChart = new dimple.chart(svg, data);
myChart.setBounds(400, 60, 500, 330);
myChart.addMeasureAxis("x", "Minutes Per Item");
myChart.addMeasureAxis("y", "Total Minutes");
myChart.addMeasureAxis("z", "% SLA");
myChart.addSeries(["Analyst", "Category"], dimple.plot.bubble);
myChart.addLegend(600, 10, 360, 30, "right");
myChart.draw();

    Integration with JasperServer:

The data which we use for developing the chart can be fetched from any database. The data fetched from the database is stored in a variable and is then accessed in the HTML component using the same variable. This process makes the report dynamic instead of static. Parameters can also be added to the report and used in the query and/or the HTML component.
Generally for these types of charts we pass a variable which contains the required data, containing date, hour, and a value associated with that particular date and hour. The string is created according to the JSON format, so that when accessed in the script tag it can easily be converted to a JSON object using the eval function.
Any variable/parameter can be accessed as shown below:
" var arr = " + $V{variable1} + " "
Parameter in query:
Select * from table_name
where date between $P{parameter1} and $P{parameter2}

The steps on how to integrate it with JasperReports Server were discussed in my previous blog (D3 Integration with Jasperserver).

Creating Report In iReport using Linear Gauge as component

This blog will teach the reader how to create a report in iReport using a linear gauge component and publish it on JasperReports Server.

Purpose: to compare the avg(salary) of male and female employees in an organization

Database server: PostgreSQL

Database name : foodmart

Table name : employee

Below are the steps :

# 1: Create two datasets named "MaleSalary" & "FemaleSalary" for calculating avg(salary) for male and female employees respectively:

Dataset 1 (MaleSalary): select gender, avg(salary) from employee where gender like 'M' group by gender

Dataset 2 (FemaleSalary): select gender, avg(salary) from employee where gender like 'F' group by gender

# 2: Drag and drop two "linear gauge" widgets from the WidgetPro palette in iReport.

# 3: Add the above datasets: MaleSalary for widget 1 and FemaleSalary for widget 2.

# 4: Right-click on the widget chart -> Edit Widget Properties.

linear gauge jasper report

Here, for each tab in the properties, we can customize our widget visualization.

Example: Suppose we need to add a % symbol after the widget pointer value. In that case we need to go to the Advanced Properties of the Widget Configuration and add Property Name: number suffix and Value Expression: '%'.

linear gauge jasper report 2

Example 2: Suppose we need to add color ranges for the widget. In the widget properties there is a Color Range option; we only have to give our condition there.


# 5: After publishing the report to JasperReports Server, the report will look like below:

linear gauge in iReport


Helical IT Solutions



In my previous blog, I shared how to install Liferay on an existing Tomcat using the Liferay source code. You can find my previous blog here.

This blog will talk about how to install Liferay on an existing Tomcat using the WAR file.

For this section, I will refer to your Tomcat installation folder as $TOMCAT_HOME. Before you begin, make sure that you have downloaded the latest Liferay WAR file. If you haven't downloaded it yet, you can get it from the Liferay downloads page (find the "Download Wars" section, and the portal dependencies files in the "Dependencies" section).

After downloading, you will get a liferay-portal-6.1.x-<date>.war and liferay-portal-dependencies-6.1.x-<date>.zip.

If you already have Liferay on your machine, you don't need to download liferay-portal-dependencies. You can use the same Liferay global library as your portal-dependencies files.

Follow these steps, to install Liferay war in Tomcat:


Create folder $TOMCAT_HOME/lib/ext.


Extract the Liferay dependencies file to $TOMCAT_HOME/lib/ext.

The best way to get the appropriate versions of these files is: if you have Liferay on your machine, copy all .jar files from $LIFERAY_HOME/lib/ext to $TOMCAT_HOME/lib/ext. (If you go through this step, ignore Step 3 and Step 4.)


Otherwise, download the Liferay source code and get them from there. Once you have downloaded the Liferay source, unzip it into a temporary folder and copy the following jars from $LIFERAY_SOURCE/lib/development to $TOMCAT_HOME/lib/ext:











Make sure the JDBC driver for your database is accessible by Tomcat: copy the JDBC driver for your version of the database server to $TOMCAT_HOME/lib/ext.



Liferay requires an additional jar to manage transactions. You may find this .jar here:


Now, Edit $TOMCAT_HOME/conf/ file. Change this line





Create setenv.bat in the $TOMCAT_HOME/bin folder and add these lines:

if exist "%CATALINA_HOME%/[email protected]@/win" (

    if not "%JAVA_HOME%" == "" (

        set JAVA_HOME=

    )

    set "JRE_HOME=%CATALINA_HOME%/[email protected]@/win"

)

set "JAVA_OPTS=%JAVA_OPTS% -Dfile.encoding=UTF8 -Dorg.apache.catalina.loader.WebappClassLoader.ENABLE_CLEAR_REFERENCES=false -Duser.timezone=GMT -Xmx1024m -XX:MaxPermSize=256m"



I am deploying Liferay in the $TOMCAT_HOME/webapps/ROOT folder. So we need to create the directory $TOMCAT_HOME/conf/Catalina/localhost and create a ROOT.xml file in it. Edit this file and populate it with the following contents to set up the portal web application:

<Context path="" crossContext="true">

    <!-- JAAS -->

    <!-- Uncomment the following to disable persistent sessions across reboots. -->
    <!-- <Manager pathname="" /> -->

    <!-- Uncomment the following to not use sessions. See the property
         "session.disabled" in -->
    <!-- <Manager className="" /> -->

</Context>



Now, Deploy Liferay.

If you are manually installing Liferay on a clean Tomcat server, delete the contents of the $TOMCAT_HOME/webapps/ROOT directory. This undeploys the default Tomcat home page. Then extract the liferay-portal-6.1.x-<date>.war file to $TOMCAT_HOME/webapps/ROOT.


Start Tomcat by executing $TOMCAT_HOME/bin/

Congratulations on successfully installing and deploying Liferay on Tomcat!

For any confusion, please get in touch with us at Helical IT Solutions

Dimensional Modeling Process

Dimensional Modeling

Dimensional modeling is a technique, used in data warehouse design, for conceptualizing and visualizing data models as a set of measures that are described by common aspects of the business. It is especially useful for summarizing and rearranging the data and presenting views of the data to support data analysis.

Dimensional Modeling Vocabulary


Fact Table

A fact table is the primary table in a dimensional model where the numerical performance measurements of the business are stored. The term fact represents a business measure that can be used in analyzing the business or business processes. The most useful facts are numeric and additive.

For example: sales amount, quantity ordered.


Dimension Table

Dimension tables are integral companions to a fact table. The dimension tables contain the textual descriptors of the business. Each dimension is defined by its single primary key, which serves as the basis for referential integrity with any given fact table to which it is joined.

Dimensions are the parameters over which we want to perform Online Analytical Processing (OLAP). For example, in a database for analyzing all sales of products, common dimensions could be:

  • Time
  • Location/region
  • Customers
  • Salesperson


Dimensional Modeling Process

Identify business process

In dimensional modeling, the best unit of analysis is the business process in which the organization has the most interest. Select the business process for which the dimensional model will be designed. Based on the selection, the requirements for the business process are gathered.

At this phase we focus on business processes, rather than on business departments, so that we can deliver consistent information more economically throughout the organization. If we establish departmentally bound dimensional models, we’ll inevitably duplicate data with different labels and terminology.

For example, we’d build a single dimensional model to handle orders data rather than building separate models for the sales and marketing departments, which both want to access orders data.

Define grain

The granularity of a fact is the level of detail at which it is recorded. If data is to be analyzed effectively, it must all be at the same level of granularity. As a general rule, data should be kept at the most detailed (lowest) level of granularity.

For example, grain definitions can include the following items:

  • A line item on a grocery receipt
  • A monthly snapshot of a bank account statement

Identify dimensions & facts

Our next step in creating a model is to identify the measures and dimensions within our requirements.

A user typically needs to evaluate, or analyze, some aspect of the organization's business. The requirements that have been collected must represent the two key elements of this analysis: what is being analyzed, and the evaluation criteria for what is being analyzed. We refer to the evaluation criteria as measures, and to what is being analyzed as dimensions.


If we are clear about the grain, then the dimensions typically can be identified quite easily. With the choice of each dimension, we will list all the discrete, text like attributes that will flesh out each dimension table. Examples of common dimensions include date, product, customer, transaction type, and status.


Facts are determined by answering the question, “What are we measuring?” Business users are keenly interested in analyzing these business process performance measures.


Creating a dimension table

Now that we have identified the dimensions, we next need to identify the members, hierarchies, and properties (attributes) of each dimension that we need to store in our tables.

Dimension Members:

A dimension contains many dimension members. A dimension member is a distinct name or identifier used to determine a data item's position. For example, all months, quarters, and years make up a time dimension, and all cities, regions, and countries make up a geography dimension.

Dimension Hierarchies:

We can arrange the members of a dimension into one or more hierarchies, and each hierarchy can have multiple levels. A member of a dimension need not belong to only one hierarchy structure.

Creating a fact table

Together, one set of dimensions and its associated measures make up what we call a fact. Organizing the dimensions and measures into facts is the next step. This is the process of grouping dimensions and measures together in a manner that can address the specified requirements. All candidate facts in a design must be true to the grain defined in step 2. Facts that clearly belong to a different grain must be in a separate fact table.
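Putting the pieces together, a minimal star schema for the orders example above might look like this (all table and column names are illustrative, not taken from the text):

```sql
-- Dimension table: one row per calendar day
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,
    full_date    DATE,
    month_name   VARCHAR(10),
    quarter      SMALLINT,
    year         SMALLINT
);

-- Dimension table: one row per product
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name VARCHAR(100),
    category     VARCHAR(50)
);

-- Fact table: grain = one order line item
CREATE TABLE fact_sales (
    date_key         INTEGER REFERENCES dim_date (date_key),
    product_key      INTEGER REFERENCES dim_product (product_key),
    quantity_ordered INTEGER,        -- additive measure
    sales_amount     DECIMAL(12,2)   -- additive measure
);
```

Each row of fact_sales is true to the line-item grain, and every descriptive attribute lives in a dimension table keyed by a surrogate primary key.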


Archana Verma

Helical IT Solutions