Adding new chart in Helical Dashboard Insight (HDI)

1) Adding the Pie Chart in HDI:-
HDI uses the D3 (Data-Driven Documents) library. D3 allows you to bind arbitrary data to the Document Object Model (DOM) and then apply data-driven transformations to the document. For example, you can use D3 to generate an HTML table from an array of numbers.
To add a pie chart in HDI, the following steps should be followed:-
1) EFW file:- The EFW file contains the title, author, description, template name, and visibility of the dashboard.

2) EFWD File:- The EFWD file contains the data source connection properties, such as the connection id and connection type. It also contains the URL of the database connection, and the username and password for the database.

The DataSource details used in our demo are shown below:-


<DataSources>
        <Connection id="1" type="sql.jdbc">
           <Driver>com.mysql.jdbc.Driver</Driver>
           <Url>jdbc:mysql://192.168.2.9:3306/sampledata</Url>
            <User>devuser</User>
            <Pass>devuser</Pass>
        </Connection>
    </DataSources>

The DataMap contains the map id, connection, and connection type. The map id is the same as that specified in the EFWVF. The query for the DataMap is specified in the Query tag, and the parameters to be used are specified in the Parameters tag.


<DataMap id="2" connection="1" type="sql" >
       <Name>Query for pie chart component - Order Status</Name>
		<Query>
			<![CDATA[
					select STATUS, count(ORDERNUMBER) as totalorders 
					from ORDERS
					where STATUS in (${order_status})
					group by STATUS;  
			]]>
                </Query>

       <Parameters>
          <Parameter name="order_status" type="Collection" default="'Shipped'"/>
       </Parameters>
</DataMap>
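At run time the ${order_status} placeholder in the query is replaced with the values selected in the dashboard. The sketch below is a simplified, hypothetical stand-in for that substitution (the real HDI engine also handles typing and defaults); it only shows what a Collection parameter expands to:

```javascript
// Hypothetical, simplified stand-in for HDI's ${...} parameter substitution
// in the <Query> CDATA. Not the real engine, just the expansion idea.
function bindParameters(query, params) {
  return query.replace(/\$\{(\w+)\}/g, function (match, name) {
    // A Collection parameter becomes a quoted, comma-separated SQL list.
    return params[name].map(function (v) { return "'" + v + "'"; }).join(", ");
  });
}

var query = "select STATUS, count(ORDERNUMBER) as totalorders " +
            "from ORDERS where STATUS in (${order_status}) group by STATUS";

console.log(bindParameters(query, { order_status: ["Shipped", "Cancelled"] }));
// ... where STATUS in ('Shipped', 'Cancelled') ...
```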

3) EFWVF File:-
In the EFWVF file we first set the chart id, and then set the chart properties. For the pie chart we set properties such as the chart name, chart type, and chart data source between the corresponding tags. In the chart script section we specify the chart script.

In the chart script we set the variables below to customize the pie chart.
Setting Up the Chart


var placeHolder = "#chart_1";
var chartHeader = "Orders By status";
var width = 300,
height = 300;

The query returns 2 columns – ‘STATUS’ and ‘totalorders’.

Just as in JasperReports we set a Category Expression, Value Expression, and Tool Tip for the pie chart, the variables below are set accordingly.


var category = "d.data.STATUS";
var values="d.totalorders";
var tooltip = "\"Total orders with status \"+ d.data.STATUS+\" are \"+d.data.totalorders";
var legendValues = "d.STATUS";
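These variables are strings that the chart script evaluates per datum with eval. A small stand-alone sketch (with an assumed sample datum shaped like D3's pie-layout output) shows what they resolve to:

```javascript
// Sample datum shaped like D3's pie layout output (assumed for illustration):
// d.data holds the original record; startAngle/endAngle are set by the layout.
var d = { data: { STATUS: "Shipped", totalorders: 303 },
          startAngle: 0, endAngle: 2.1 };

var category = "d.data.STATUS";
var tooltip = "\"Total orders with status \"+ d.data.STATUS+\" are \"+d.data.totalorders";

console.log(eval(category)); // "Shipped"
console.log(eval(tooltip));  // "Total orders with status Shipped are 303"
```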

You may change the script below directly for further customization.


function angle(d)
{
var a = (d.startAngle + d.endAngle) * 90 / Math.PI - 90;
return a > 90 ? a - 180 : a;
}
$(placeHolder).addClass('panel').addClass('panel-primary');
var radius = Math.min(width, height) / 2;
var color = d3.scale.ordinal().range(["#1f77b4", "#ff7f0e", "#2ca02c", "#d62728", "#9467bd", "#8c564b", "#e377c2", "#7f7f7f", "#bcbd22", "#17becf"]);
var arc = d3.svg.arc().outerRadius(radius - 10).innerRadius(0);
var pie = d3.layout.pie().sort(null).value(function(d) { return eval(values); });
var heading = d3.select(placeHolder).append("h3").attr("class", "panel-heading").style("margin", 0 ).style("clear", "none").text(chartHeader);
Creating The SVG Element
var svg = d3.select(placeHolder).append("div").attr("class", "panel-body").append("svg")
.attr("width", width)
.attr("height", height)
.append("g")
.attr("transform", "translate(" + width / 2 + "," + height / 2 + ")");

Drawing The Pie


var g = svg.selectAll(".arc")
.data(pie(data))
.enter().append("g")
.attr("class", "arc");
g.append("path")
.attr("d", arc)
.style("fill", function(d) { return color(eval(category)); });
g.append("text")
.attr("transform", function(d) {
d.outerRadius = radius -10;
d.innerRadius = (radius -10)/2;
return "translate(" + arc.centroid(d) + ")rotate(" + angle(d) + ")"; })
.attr("dy", ".35em")
.style("text-anchor", "middle")
.style("fill", "White")
.text(function(d) { return eval(category); });

Drawing The Label and Title


g.append("title")
.text(function(d){ return eval(tooltip)});
var legend = d3.select(placeHolder).select('.panel-body').append("svg")
.attr("class", "legend")
.attr("width", 75 * 2)
.attr("height", 75 * 2)
.selectAll("g")
.data(data)
.enter().append("g")
.attr("transform", function(d, i) { return "translate(50," + (i * 20) + ")"; }); legend.append("rect")
.attr("width", 18)
.attr("height", 18)
.style("fill", function(d){return color(eval(legendValues))});
legend.append("text")
.attr("x", 24)
.attr("y", 9)
.attr("dy", ".35em")
.text (function(d) { return eval(legendValues); });

4) HTML File:-
The HTML file name should be the same as that specified in the EFW file under the Template section.
At the top of the HTML file we specify links to external resources, such as:-


<script src="getExternalResource.html?path=UI_Testing/dimple.js"></script>

Below it we create the divisions for proper alignment and the position where the pie chart should be placed.


<div class="row">
    <div class="col-sm-5">
       <div id="supportChartObj4"></div>
    </div>
    <div class="col-sm-5">
       <div id="supportChartObj5"></div>
    </div>
</div>

Below the division section we have the script section, where we specify the parameters and the chart.

Parameter Creation:-


var select =
{
name: "select",
type: "select",
options:{
multiple:true,
value : 'STATUS',
display : 'STATUS'
},
parameters: ["order_status"],
htmlElementId: "#supportChartObj1",
executeAtStart: true,
map:1,
iframe:true
};

Chart Creation:-


var chart = {
name: "chart",
type:"chart",
listeners:["order_status"],
requestParameters :{
order_status :"order_status"
},
vf : {
id: "1",
file: "Test.efwvf"
},
htmlElementId : "#supportChartObj4",
executeAtStart: true
};

All the parameters and charts specified in the HTML file are then passed to dashboard.init.
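A runnable sketch of that wiring, using a minimal stand-in for the HDI dashboard object (the real dashboard.init is provided by the HDI runtime, not defined here) and abbreviated copies of the components above:

```javascript
// Minimal stand-in for the HDI dashboard object, only to illustrate the call
// shape; the real dashboard.init comes from the HDI runtime.
var dashboard = {
  components: [],
  init: function (components) { this.components = components; }
};

// Components as defined above (abbreviated).
var select = { name: "select", type: "select", parameters: ["order_status"] };
var chart  = { name: "chart",  type: "chart",  listeners: ["order_status"] };

dashboard.init([select, chart]);
console.log(dashboard.components.length); // 2
```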

This way the pie chart is created.

pie

2) Adding the Bar Chart in HDI:-

1) EFW file:- The EFW file contains the title, author, description, template name, and visibility of the dashboard.

2) EFWD File:- The EFWD file contains the data source connection properties, such as the connection id and connection type. It also contains the URL of the database connection, and the username and password for the database.

The DataSource details used in our demo are shown below:-


<DataSources>
        <Connection id="1" type="sql.jdbc">
           <Driver>com.mysql.jdbc.Driver</Driver>
           <Url>jdbc:mysql://192.168.2.9:3306/sampledata</Url>
            <User>devuser</User>
            <Pass>devuser</Pass>
        </Connection>
    </DataSources>

The DataMap contains the map id, connection, and connection type. The map id is the same as that specified in the EFWVF. The query for the DataMap is specified in the Query tag, and the parameters to be used are specified in the Parameters tag.
Ex:-


<DataMap id="2" connection="1" type="sql" >
       <Name>Query for pie chart component - Order Status</Name>
		<Query>
			<![CDATA[
					select STATUS, count(ORDERNUMBER) as totalorders 
					from ORDERS
					where STATUS in (${order_status})
					group by STATUS;  
			]]>
                </Query>

       <Parameters>
          <Parameter name="order_status" type="Collection" default="'Shipped'"/>
       </Parameters>
</DataMap>

3) EFWVF File:-
In the EFWVF file we first set the chart id, and then set the chart properties. For the bar chart we set properties such as the chart name, chart type, and chart data source between the corresponding tags. In the chart script section we specify the chart script.

In the chart script we set the variables below to customize the bar chart.

Setting Up the Chart


var placeHolder = "#chart_1";
var chartHeader = "Orders By status";
var margin = {top: 20, right: 30, bottom: 30, left: 70},
width = 500 - margin.left - margin.right,
height = 400 - margin.top - margin.bottom;

The query returns 2 columns – ‘STATUS’ and ‘totalorders’.

Just as in JasperReports we set a Category Expression, Value Expression, and Tool Tip for the chart, the variables below are set accordingly.


var category = "d.STATUS";
var values="d.totalorders";
var tooltip = "\"Total orders with Status \"+ d.STATUS+\" are\"+d.totalorders";

You may change the script below directly for further customization.


$(placeHolder).addClass('panel').addClass('panel-primary');
var x = d3.scale.ordinal().rangeRoundBands([0, width], .1);
var y = d3.scale.linear().range([height, 0]);
var xAxis = d3.svg.axis().scale(x).orient("bottom");
var yAxis = d3.svg.axis().scale(y).orient("left");
var heading = d3.select(placeHolder).append("h3").attr("class", "panel-heading").style("margin", 0 ).style("clear", "none").text(chartHeader);
var chart = d3.select(placeHolder).append("div").attr("class", "panel-body")
.append("svg")
.attr("width", width + margin.left + margin.right)
.attr("height", height + margin.top + margin.bottom)
.append("g")
.attr("transform", "translate(" + margin.left + "," + margin.top + ")");

Drawing The Bar Chart


x.domain(data.map(function(d) { return eval(category); }));
y.domain([0, d3.max(data, function(d) { return eval(values); })]);
        chart.append("g")
        .attr("class", "x axis")
	.attr("transform", "translate(0," + height + ")")
	.call(xAxis);

	chart.append("g")
	.attr("class", "y axis")
	.call(yAxis);

	chart.selectAll(".bar")
	.data(data)
	.enter().append("rect")
	.attr("class", "bar")
	.attr("x", function(d) { return x(eval(category)); })
	.attr("y", function(d) { return y(eval(values)); })
	.attr("height", function(d) { return height - y(eval(values)); })
	.attr("width", x.rangeBand());
	
        chart.selectAll(".bar")
	.append("title")
	.text(function(d){ return eval(tooltip)});
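The y scale above maps data values to inverted pixel positions. As a sanity check, here is a stand-alone re-implementation of the linear-scale math (D3 provides this itself; the height and domain maximum below are assumed values):

```javascript
// What d3.scale.linear().range([height, 0]) does once y.domain([0, max]) is
// set: linear interpolation from the data domain to an inverted pixel range.
function linearScale(domain, range) {
  return function (v) {
    var t = (v - domain[0]) / (domain[1] - domain[0]);
    return range[0] + t * (range[1] - range[0]);
  };
}

var height = 350;                           // chart height, assumed
var y = linearScale([0, 100], [height, 0]); // domain max of 100, assumed

console.log(y(0));   // 350 (bottom of the chart: a zero bar has no height)
console.log(y(100)); // 0   (top of the chart)
console.log(y(50));  // 175
```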

4) HTML File:-
The HTML file name should be the same as that specified in the EFW file under the Template section.
At the top of the HTML file we specify links to external resources, such as:-


<script src="getExternalResource.html?path=UI_Testing/dimple.js"></script>

Below it we create the divisions for proper alignment and the position where the bar chart should be placed.


<div class="row">
    <div class="col-sm-5">
       <div id="supportChartObj4"></div>
    </div>
    <div class="col-sm-5">
       <div id="supportChartObj5"></div>
    </div>
</div>

Below the division section we have the script section, where we specify the parameters and the chart.

Parameter Creation:-


var select =
{
name: "select",
type: "select",
options:{
multiple:true,
value : 'STATUS',
display : 'STATUS'
},
parameters: ["order_status"],
htmlElementId: "#supportChartObj1",
executeAtStart: true,
map:1,
iframe:true
};

Chart Creation:-


var chart = {
name: "chart",
type:"chart",
listeners:["order_status"],
requestParameters :{
order_status :"order_status"
},
vf : {
id: "2",
file: "Test.efwvf"
},
htmlElementId : "#supportChartObj4",
executeAtStart: true
};

All the parameters and charts specified in the HTML file are then passed to dashboard.init.

This way the bar chart is created.

Bar Chart

Steps to migrate Pentaho to Oracle

Step 1:-

Run the script as a DB admin.

The script is available at:- biserver-ce\data\oracle10g.

Modify the configuration files:-

  1. applicationContext-spring-security-hibernate.properties.

Location:-

pentaho-solutions\system\applicationContext-spring-security-hibernate.properties.

original code:-

jdbc.driver=org.hsqldb.jdbcDriver

jdbc.url=jdbc:hsqldb:hsql://localhost:9001/hibernate

jdbc.username=hibuser

jdbc.password=password

hibernate.dialect=org.hibernate.dialect.HSQLDialect

Modified code:-

jdbc.driver=oracle.jdbc.OracleDriver

jdbc.url=jdbc:oracle:thin:@localhost:1521/sysdba

jdbc.username=hibuser

jdbc.password=password

hibernate.dialect=org.hibernate.dialect.Oracle10gDialect

  2. hibernate-settings.xml

Location:- pentaho-solutions\system\hibernate\hibernate-settings.xml.

Original code

<config-file>system/hibernate/hsql.hibernate.cfg.xml</config-file>

Modified code:-

<config-file>system/hibernate/oracle10g.hibernate.cfg.xml</config-file>

oracle10g.hibernate.cfg.xml:-

Location:-

pentaho-solutions\system\hibernate\oracle10g.hibernate.cfg.xml

No code changes are needed in this file; just verify that everything is configured correctly:

 

<!-- Oracle 10g Configuration -->
<property name="connection.driver_class">oracle.jdbc.driver.OracleDriver</property>
<property name="connection.url">jdbc:oracle:thin:@localhost:1521/sysdba</property>
<property name="dialect">org.hibernate.dialect.Oracle10gDialect</property>
<property name="connection.username">hibuser</property>
<property name="connection.password">password</property>
<property name="connection.pool_size">10</property>
<property name="show_sql">false</property>
<property name="hibernate.jdbc.use_streams_for_binary">true</property>
<!-- replaces DefinitionVersionManager -->
<property name="hibernate.hbm2ddl.auto">update</property>
<!-- load resource from classpath -->
<mapping resource="hibernate/oracle10g.hbm.xml" />

 

  3. quartz.properties:-

Location:-

pentaho-solutions\system\quartz\quartz.properties

Original Code

org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate

Modified Code:-

org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.oracle.OracleDelegate

4. context.xml:-

Location:-

tomcat\webapps\pentaho\META-INF\context.xml

Original Code

<Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="hibuser" password="password"
driverClassName="org.hsqldb.jdbcDriver" url="jdbc:hsqldb:hsql://localhost/hibernate"
validationQuery="select count(*) from INFORMATION_SCHEMA.SYSTEM_SEQUENCES" />

<Resource name="jdbc/Quartz" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="pentaho_user" password="password"
driverClassName="org.hsqldb.jdbcDriver" url="jdbc:hsqldb:hsql://localhost/quartz"
validationQuery="select count(*) from INFORMATION_SCHEMA.SYSTEM_SEQUENCES"/>

 

Modified Code:-

<Resource validationQuery="select 1 from dual"
url="jdbc:oracle:thin:@localhost:1521/sysdba"
driverClassName="oracle.jdbc.OracleDriver" password="password"
username="hibuser" maxWait="10000" maxIdle="5" maxActive="20"
factory="org.apache.commons.dbcp.BasicDataSourceFactory"
type="javax.sql.DataSource" auth="Container" name="jdbc/Hibernate"/>

<Resource validationQuery="select 1 from dual"
url="jdbc:oracle:thin:@localhost:1521/sysdba"
driverClassName="oracle.jdbc.OracleDriver" password="password"
username="quartz" maxWait="10000" maxIdle="5" maxActive="20"
factory="org.apache.commons.dbcp.BasicDataSourceFactory"
type="javax.sql.DataSource" auth="Container" name="jdbc/Quartz"/>

5. repository.xml

Location of the file: pentaho-solutions\system\jackrabbit\repository.xml.

To "comment" code means wrapping it in <!-- ... -->.

To "activate" code means removing those comment markers.

i) FileSystem part

Comment this code

<FileSystem class="org.apache.jackrabbit.core.fs.local.LocalFileSystem">
     <param name="path" value="${rep.home}/repository"/>
</FileSystem>

Active this code:-

<FileSystem class="org.apache.jackrabbit.core.fs.db.OracleFileSystem">
   <param name="url" value="jdbc:oracle:thin:@localhost:1521"/>
   <param name="user" value="jcr_user"/>
   <param name="password" value="password"/>
   <param name="schemaObjectPrefix" value="fs_repos_"/>
   <param name="tablespace" value="pentaho_tablespace"/>
</FileSystem>

ii) DataStore part

Comment this code

<DataStore class="org.apache.jackrabbit.core.data.FileDataStore"/>

Active this code:-

<DataStore class="org.apache.jackrabbit.core.data.db.DbDataStore">
<param name="url" value="jdbc:oracle:thin:@localhost:1521/sysdba"/>
<param name="driver" value="oracle.jdbc.OracleDriver"/>
<param name="user" value="jcr_user"/>
<param name="password" value="password"/>
<param name="databaseType" value="oracle"/>
<param name="minRecordLength" value="1024"/>
<param name="maxConnections" value="3"/>
<param name="copyWhenReading" value="true"/>
<param name="tablePrefix" value=""/>
<param name="schemaObjectPrefix" value="ds_repos_"/>
</DataStore>

iii) Security part in the FileSystem Workspace part

Comment this code:-

<FileSystem class="org.apache.jackrabbit.core.fs.local.LocalFileSystem">
<param name="path" value="${wsp.home}"/>
</FileSystem>

Active this code:-

<FileSystem class="org.apache.jackrabbit.core.fs.db.OracleFileSystem">
<param name="url" value="jdbc:oracle:thin:@localhost:1521/sysdba"/>
<param name="user" value="jcr_user"/>
<param name="password" value="password"/>
<param name="schemaObjectPrefix" value="fs_ws_"/>
<param name="tablespace" value="pentaho_tablespace"/>
</FileSystem>

iv) PersistenceManager part

Comment this code:-

<PersistenceManager class="org.apache.jackrabbit.core.persistence.pool.H2PersistenceManager">
<param name="url" value="jdbc:h2:${wsp.home}/db"/>
<param name="schemaObjectPrefix" value="${wsp.name}_"/>
</PersistenceManager>

Active This Code:-

<PersistenceManager class="org.apache.jackrabbit.core.persistence.bundle.OraclePersistenceManager">
<param name="url" value="jdbc:oracle:thin:@localhost:1521/sysdba"/>
<param name="driver" value="oracle.jdbc.OracleDriver"/>
<param name="user" value="jcr_user"/>
<param name="password" value="password"/>
<param name="schema" value="oracle"/>
<param name="schemaObjectPrefix" value="${wsp.name}_pm_ws_"/>
<param name="tablespace" value="pentaho_tablespace"/>
</PersistenceManager>

v) FileSystem Versioning part

Comment This Code:-

<FileSystem class="org.apache.jackrabbit.core.fs.local.LocalFileSystem">
<param name="path" value="${rep.home}/version" />
</FileSystem>

Active This Code:-

<PersistenceManager class="org.apache.jackrabbit.core.persistence.bundle.OraclePersistenceManager">
<param name="url" value="jdbc:oracle:thin:@localhost:1521/sysdba"/>
<param name="driver" value="oracle.jdbc.OracleDriver"/>
<param name="user" value="jcr_user"/>
<param name="password" value="password"/>
<param name="schema" value="oracle"/>
<param name="schemaObjectPrefix" value="pm_ver_"/>
<param name="tablespace" value="pentaho_tablespace"/>
</PersistenceManager>

 

 

Stopping HSQLDB start-up

In the web.xml file

Comment out or delete this code (commenting is preferable):

<!-- [BEGIN HSQLDB DATABASES] -->
<context-param>
<param-name>hsqldb-databases</param-name>
<param-value>sampledata@../../data/hsqldb/sampledata,hibernate@../../data/hsqldb/hibernate,quartz@../../data/hsqldb/quartz</param-value>
</context-param>
<!-- [END HSQLDB DATABASES] -->

 

Also comment this code

<!-- [BEGIN HSQLDB STARTER] -->
<listener>
<listener-class>org.pentaho.platform.web.http.context.HsqldbStartupListener</listener-class>
</listener>
<!-- [END HSQLDB STARTER] -->

You are now done integrating Pentaho 5.0.1 CE with Oracle.

Now log in to the Pentaho server.

URL:  http://localhost:8080/pentaho

Username/Password : Admin/password

Define JNDI for Pentaho BA server 5.0


Note: For illustration I’m showing Oracle 11g configuration. Replace the Resource name, username, password, driverClassName, and url parameters, or any relevant connection settings.

Add Driver

Place the appropriate driver for Oracle 11g, which is ojdbc6-11.2.0.2.jar, into this directory on the BA Server: /pentaho/server/biserver-ee/tomcat/lib/.

Note: There should be only one driver per database in this directory. Ensure that there are no other versions of the same vendor's driver here; if there are, back up the old driver files and remove them to avoid version conflicts.

Specify JNDI connections

  1. Stop the Tomcat server & BA server.
  2. Edit the /tomcat/webapps/pentaho/WEB-INF/web.xml file.
  3. At the end of the <web-app> element, in the same part of the file where you see <!-- insert additional resource-refs -->, add this XML snippet.
  4. <resource-ref>
         <description>myDatasource</description>
         <res-ref-name>jdbc/myDatasource</res-ref-name>
         <res-type>javax.sql.DataSource</res-type>
         <res-auth>Container</res-auth>
     </resource-ref>

  5. Save & Close web.xml
  6. Open & modify this file /tomcat/webapps/pentaho/META-INF/context.xml.
  7. Add this XML snippet anywhere in context.xml
  8. Save & Close context.xml
  9. Open simple jndi
  10. Open & Modify this file biserver-ce/pentaho-solutions/system/simple-jndi/jdbc.properties.
  11. Add these lines
  12. myDatasource/type=javax.sql.DataSource
    myDatasource/driver=oracle.jdbc.OracleDriver
    myDatasource/url=jdbc:oracle:thin:@//localhost:1521/sysdba
    myDatasource/user=dbuser
    myDatasource/password=password

  13. Save & Close jdbc.properties.
  14. Delete the pentaho.xml file located in the /tomcat/conf/Catalina/ directory.
  15. Start tomcat & BA server.

How to create reports using Pentaho Report Designer


The Pentaho Business Intelligence platform provides several tools to design and deploy reports; the easiest is the Pentaho BI web reporting wizard (see pentaho_reporting).

While this approach is useful for basic or self-service reporting, for complex reports with charts, cross-linking, and scripting, a tool like Pentaho Report Designer is more appropriate. Pentaho Report Designer is the reporting engine integrated into Pentaho Business Intelligence.

Other excellent Pentaho-compliant reporting tools are JasperReports and Eclipse BIRT; however, Pentaho Report Designer has the advantage of being more integrated and easily deployable.

Steps :

Connect a database in Pentaho Report Designer

Pentaho Report Designer is an open source reporting tool available at http://sourceforge.net/projects/pentaho/files/

Once unzipped, the application can be run with report-designer.bat (report-designer.sh on Unix). To create a new report press File->New and then Data->Add datasource->JDBC. Pentaho Report Designer connects with several data source providers, for example SQL databases, Mondrian OLAP, and Pentaho Data Integration (Kettle).

In this example we’ll use a common JDBC connection for Postgres databases. To add a new connection press the plus icon on the JDBC dialog.

JDBC Connection

On the new dialog set the database connection credentials, in this example a pre-existing Postgres database named Foodmart, with the following parameters:

Connection name: SugarBI
Connection type: Postgres
Host name: <host>
Database name: <database name>
Port number: 5432
User name: root
Password: <your password>
Access: Native (JDBC)

img-2

Now press Test to check the connection and then press OK. In case of a connection error, check the connection parameters and whether the JDBC driver is present in /report-designer/lib/jdbc.

Now press the plus icon and fill the SQL query text and query name, like in the following picture.

img-3

To check whether the query works, press the Preview button and then OK to close the dialog.

In a real enterprise application a JNDI connection would be a better solution than a direct JDBC connection. A JNDI connection, which is explained at the end of this post, defines a JDBC connection alias and avoids the hassle of updating all the reports in case the database parameters change.

Build a report with Report Designer

After the Postgres connection configuration, the Data tab (on the right) should display the connection name with all its fields. Placing a field in the report is straightforward and can be accomplished by dragging and dropping the field onto the Details band.

img-4

 

To set a page title click on “Ab” button in the left toolbar and drop it on Page Header band and to set a column label simply drag the “Ab” button over the Report Header band.

A page number can be added by right clicking Functions (right Data tab) and selecting Add functions..->Common->Pages of Pages. The new Pages of Pages field can then be dragged on the Page Footer.

img-5

Pentaho Report Designer allows you to test the final result at any time by pressing the play icon and choosing an output format, for example PDF or HTML.

img-6

How to publish a report in Pentaho

Once built, the report can be run directly on the Pentaho Business Intelligence suite. Pentaho Report Designer provides direct access to Pentaho's repository; however, the access needs to be enabled first. To enable repository access, edit biserver\pentaho-solutions\system\publisher_config.xml and set the password in the publisher-password tag:

<publisher-config>
<publisher-password>your password here</publisher-password>
</publisher-config>

img-7

The default credentials are:

URL: http://localhost:8080/pentaho
User: joe
Password: password

Click OK, and if the Pentaho server is running a form like this one should appear:

img-8

Press OK; the report should appear in the Pentaho repository browser of the Pentaho Business Intelligence suite:

img-6


Anonymous Authentication in Pentaho

This blog will be talking about anonymous authentication in Pentaho. You can bypass the built-in security on the BA Server by giving all permissions to anonymous users. An “anonymousUser” is any user, either existing or newly created, that you specify as an all-permissions, no-login user, and to whom you grant the Anonymous role. The procedure below will grant full BA Server access to the Anonymous role and never require a login.

1. Stop the BA Server.
2. Open the /pentaho/server/biserver-ee/pentaho-solutions/system/applicationContext-spring-security.xml file and ensure that a default anonymous role is defined. Match your bean definition and property value to the example below.

<bean id="anonymousProcessingFilter" class="org.springframework.security.providers.anonymous.AnonymousProcessingFilter">

<!-- omitted -->

   <property name="userAttribute" value="anonymousUser,Anonymous" />

</bean>


 

3. Find these two beans in the same file:
o filterSecurityInterceptor
o filterInvocationInterceptorForWS
Locate the objectDefinitionSource properties inside the beans and match the contents to this code example.

<bean id="filterInvocationInterceptor" class="org.springframework.security.intercept.web.FilterSecurityInterceptor">
    <property name="authenticationManager">
        <ref local="authenticationManager" />
    </property>
    <property name="accessDecisionManager">
        <ref local="httpRequestAccessDecisionManager" />
    </property>
    <property name="objectDefinitionSource">
        <value>
            <![CDATA[ CONVERT_URL_TO_LOWERCASE_BEFORE_COMPARISON
\A/.*\Z=Anonymous,Authenticated ]]> </value>
    </property>
</bean>

 

4. Save the file, then open pentaho.xml in the same directory.
5. Find the anonymous-authentication lines of the pentaho-system section, and define the anonymous user and role.

<pentaho-system>
<!-- omitted -->
    <anonymous-authentication>
        <anonymous-user>anonymousUser</anonymous-user>
        <anonymous-role>Anonymous</anonymous-role>
    </anonymous-authentication> <!-- omitted -->
</pentaho-system>

6. Open the repository.spring.properties file in the same directory.

a) Find the singleTenantAdminUserName and replace the value with the anonymousUser name.
b) Find the singleTenantAdminAuthorityName and replace the value with Anonymous.
c) Save the file.

Open the pentahoObjects.spring.xml file.
Find all references to the bean id="Mondrian-UserRoleMapper" and make sure that the only one that is uncommented (active) is this one:

<bean id="Mondrian-UserRoleMapper"
        name="Mondrian-SampleUserSession-UserRoleMapper"
        class="org.pentaho.platform.plugin.action.mondrian.mapper.MondrianUserSessionUserRoleListMapper"
        scope="singleton">
    <property name="sessionProperty" value="MondrianUserRoles" /> </bean>

Save pentahoObjects.spring.xml and close the file.
Restart the BA Server.
Enter http://localhost:8080/pentaho in the browser address field. You will find that the Pentaho home page opens without requiring a login.

Archana Verma
Helical IT Solutions

Creating Cascading Parameters with Pentaho Report Designer 5.0.1 Stable

Here we will be talking about creating cascading parameters with Pentaho Report Designer 5.0.1 Stable.

Cascade parameters provide a way of managing large amounts of data in reports. You can define a set of related parameters so that the list of values for one parameter depends on the value chosen in another parameter.

PRD cascading parameter

Step-1. Build the Parameter Queries :

I will need to build two parameter queries: the first will display a distinct list of countries, and the second will display a distinct list of cities which belong to the selected country.

 

To create a new query click on the Add a new query icon and enter a SQL statement which will retrieve a distinct list of countries. Make sure you don't forget to name your query (in this example my country parameter query is named countryList):

PRD cascading parameter 2

Here is the above query in a format which you can copy and paste:

SELECT
 DISTINCT `customers`.`COUNTRY`
FROM
 `customers`

The next step is to preview the query – as you can see from the screen shot below the countryList parameter query is retrieving a distinct list of countries:

PRD cascading parameter 3

Now we will need to add the second parameter which will display a distinct list of cities based on the value of the country parameter. Close the preview window, click on the Add a new query icon, and enter a SQL statement which will retrieve a distinct list of cities. Make sure you don't forget to name your query (in this example my city parameter query is named cityList):

PRD cascading parameter 4

Here is the above query in a format which you can copy and paste:

SELECT
 DISTINCT `customers`.`CITY`

FROM
 `customers`

Preview this query. At the moment the above query is retrieving a distinct list of every city. I need to make sure that this query retrieves a list of cities based on the country parameter.

PRD cascading parameter 5

For this query to only show a list of cities based on the value of the country parameter I will need to add the country parameter name to the WHERE clause of the cityList query. As I have not yet created the country or city parameters I will need to make note of the country parameter name I will be specifying in the cityList query – I have chosen to call the country parameter sCountryName. The new cityList query now looks like this:

PRD cascading parameter 6

Here is the above query in a format which you can copy and paste:

SELECT
 DISTINCT `customers`.`CITY`
FROM
 `customers`
WHERE
 `customers`.`COUNTRY` = ${sCountryName}
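The two queries together implement the cascade: the country choice constrains the city list. The same logic, sketched in plain JavaScript over assumed sample rows:

```javascript
// Sample rows standing in for the customers table (assumed data).
var customers = [
  { COUNTRY: "USA",    CITY: "NYC" },
  { COUNTRY: "USA",    CITY: "Boston" },
  { COUNTRY: "France", CITY: "Paris" },
  { COUNTRY: "USA",    CITY: "NYC" }
];

// countryList: SELECT DISTINCT COUNTRY FROM customers
var countryList = customers
  .map(function (d) { return d.COUNTRY; })
  .filter(function (c, i, a) { return a.indexOf(c) === i; });

// cityList: SELECT DISTINCT CITY ... WHERE COUNTRY = ${sCountryName}
function cityList(sCountryName) {
  return customers
    .filter(function (d) { return d.COUNTRY === sCountryName; })
    .map(function (d) { return d.CITY; })
    .filter(function (c, i, a) { return a.indexOf(c) === i; });
}

console.log(countryList);     // [ 'USA', 'France' ]
console.log(cityList("USA")); // [ 'NYC', 'Boston' ]
```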

I have now created both parameter queries – I will need to revisit this area later on to modify the report query customerList but for now click on the OK button to close the JDBC Data Source window.

 

Step -2. Create the Parameters

I will need to create two parameters:

  1. A country drop down parameter which is named sCountryName (step 1)
  2. A city drop down parameter which will be named sCityName

Creating the sCountryName Parameter

To create a new parameter make sure you have the Data tab active, right click on the Parameters menu item and select the Add Parameter… option:

PRD cascading parameter 7

An Add Parameter… window will pop up. Before you make any changes to the options expand the connection folder on the left and then select the query which will populate the parameter, in this example it is the countryList query. Below is a screen shot of all the sCountryName parameter options completed:

PRD cascading parameter 8

 

The options above are fairly self-explanatory; however, here are descriptions of the most important ones:

  • Name: This is the name we specified in step 1, this must match the value of this parameter we put in the cityList WHERE clause
  • Type: The type of this parameter is a drop down
  • Query: The query that will populate this parameter is the countryList query which was created in step 1
  • Value and Display Name: I have set these both to the COUNTRY field as the value is the same as the output I would like displayed in the drop down parameter
  • Value Type: The COUNTRY field is a string
  • Mandatory: I have checked this option as a user must select a country before running the report – this also ensures that the city parameter will be populated

Click the OK button to save the sCountryName parameter.

Creating the sCityName Parameter

The second parameter I will need to create is for the city drop down. It differs very little from the sCountryName parameter, so I can copy and paste that parameter and then change a few options.

To copy the parameter right click on the sCountryName parameter under the Data tab and select the Copy option (alternatively you can use the CTRL+C shortcut):

PRD cascading parameter 9

To paste the parameter right click on the Parameters… item and select the Paste option (alternatively you can use the CTRL+V shortcut):

PRD cascading parameter 10

This will create an identical copy of the sCountryName parameter.

PRD cascading parameter 11

 

Double click on the copied sCountryName parameter, which is located at the bottom of the Parameters list; this will open the Edit Parameter… window. Before you make any changes to the options, expand the connection folder on the left and then select the query which will populate this parameter, in this example the cityList query. Below is a screen shot of all the sCityName parameter options completed:

PRD cascading parameter 12

The options above are fairly self-explanatory, but here are descriptions of the most important ones:

  • Name: I have decided to name this parameter sCityName
  • Type: The type of this parameter is a drop down
  • Query: The query that will populate this parameter is the cityList query which was created in step 1
  • Value and Display Name: I have set these both to the CITY field as the value is the same as the output I would like displayed in the drop down parameter
  • Value Type: The CITY field is a string
  • Mandatory: I have checked this option as a user must select a city before running the report

Step 3. Modify the Report Query

Now that the parameters and parameter queries have been created, we are ready to modify the report query (customerList) to use the new parameters. To do so, expand the data source connections under the Data tab and double click on the report query, customerList:

PRD cascading parameter 13

This will open the JDBC Data Source window and automatically highlight the customerList query:

PRD cascading parameter 14

To make the customerList query use the two new parameters created in step 2 you will need to add two new conditions to the WHERE clause, like so:

PRD cascading parameter 15

 

Here is the above query in a format which you can copy and paste:

SELECT
 `customers`.`CUSTOMERNUMBER`,
 `customers`.`CUSTOMERNAME`,
 `customers`.`CITY`,
 `customers`.`COUNTRY`

FROM
 `customers`
WHERE
 `customers`.`COUNTRY` = ${sCountryName}
AND
 `customers`.`CITY`    = ${sCityName}

Now the report query will use the values of the sCountryName and sCityName parameters in its WHERE clause. Click on the OK button to close the JDBC Data Source window.

Step 4. Preview the Report

The last step is to preview the report. To do this, click on the preview icon found in the top left hand corner, or click on the View > Preview menu item.

PRD cascading parameter 16

By default nothing will be displayed in the report and none of the drop down parameters will be populated (if you would like a country to be set when you first run your report set a value for the Default Value option in step 2 for the sCountryName parameter).

To check if the cascading parameters are working pick a country from the first drop down for example Australia:

PRD cascading parameter 17

 

The city drop down parameter should automatically populate with the cities which belong to Australia (even though I’m not sure if you would classify Glen Waverly as a city):

PRD cascading parameter 18

After selecting a city, e.g. Melbourne, click on the UPDATE button; the report is now filtered to customers located in the country Australia and the city Melbourne:

PRD cascading parameter 19
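The cascading behaviour demonstrated above boils down to this: the city result set is a function of the chosen country. Here is a minimal, hypothetical Java sketch of that dependency (the city data is an illustrative subset, not the full sampledata contents):

```java
import java.util.List;
import java.util.Map;

public class CascadingParams {
    // Illustrative subset of the customers table: country -> cities.
    static final Map<String, List<String>> CITIES_BY_COUNTRY = Map.of(
            "Australia", List.of("Melbourne", "North Sydney", "Glen Waverly"),
            "France", List.of("Paris", "Nantes", "Lyon"));

    // Mirrors the cityList query: SELECT DISTINCT CITY ... WHERE COUNTRY = ${sCountryName}
    static List<String> cityList(String sCountryName) {
        return CITIES_BY_COUNTRY.getOrDefault(sCountryName, List.of());
    }

    public static void main(String[] args) {
        // Picking a country repopulates the dependent city parameter.
        System.out.println(cityList("Australia")); // [Melbourne, North Sydney, Glen Waverly]
    }
}
```

PRD re-runs the cityList query every time sCountryName changes, which is exactly the lookup modelled here.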

 

Thanks

Rohit Verma

Helical IT Solutions

Installation of Pentaho 5.0.1 CE with Postgresql db (in Windows 7 / Ubuntu / AWS)

This blog will elaborate on the installation of Pentaho 5.0.1 CE with a PostgreSQL DB (on Windows 7 / Ubuntu / AWS).

1. PuTTY configuration
Step-1: Open PuTTY from the Start menu or from a desktop shortcut.
Step-2: Provide the IP address or host name of the remote machine. See the image below and enter the host name in the red rectangle area.
Putty configuration

Step-3: Then go to the Connection tab > Data and provide the username for the remote machine. You can follow the steps in the image below.

Putty configuration 2

Step-4: Then browse for and supply the .ppk file. The image below shows how to get there and provide the .ppk file; then click the Open button.
Putty configuration 3

2. Java Configuration
PART I: Java (OpenJDK) Installation

Installation of Java (OpenJDK) on an Ubuntu server.
NOTE: You must have superuser (sudo) permissions before you install Java.
Command: sudo su
On the next line, enter the password you used when logging in via PuTTY.

* Install OpenJDK and the JRE using the below commands
1) Java installation command:
apt-get install openjdk-7-jdk

2) Check whether Java is installed:
which java
Output
/usr/bin/java

3) Check which version is installed:
java -version
Output
java version "1.7.0_51"
OpenJDK Runtime Environment (IcedTea 2.3.10) (7u25-2.3.10-1ubuntu0.12.04.2)
OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)

IMP: NOTE-1
You need NOT set JAVA_HOME if you see the above message (when you issue java -version); on an Ubuntu server the Java installation sets JAVA_HOME itself.
This may differ from version to version of Ubuntu Server.

NOTE-2
By default OpenJDK installs in the below location:
/usr/lib/jvm/

NOTE-3 ( Optional – set java_home)

Open the .bashrc file using an editor. If you use vi:
vi ~/.bashrc
and add the following 2 lines to your .bashrc file (adjust the path to match the JDK actually installed under /usr/lib/jvm/):
JAVA_HOME=/usr/lib/jvm/jdk1.7.0_21/
export JAVA_HOME
3. Pentaho BI Server Configuration
PART II: Pentaho 5.0.1 CE installation
1) Create a folder under home:
mkdir pentaho
2) Download Pentaho
Issue the below command to download the Pentaho zip file from the SourceForge site (note the URL-encoded spaces):
wget "http://sourceforge.net/projects/pentaho/files/Business%20Intelligence%20Server/5.0.1-stable/biserver-ce-5.0.1-stable.zip/download"

3) Unzip the file

i) Once the download is complete you will have the "biserver-ce-5.0.1-stable.zip" zip file under pentaho
ii) Unzip the file using the below command:
unzip biserver-ce-5.0.1-stable.zip
NOTE: if unzip is not installed on the Ubuntu server, issue the below command to install the unzip package:
sudo apt-get install unzip
4) biserver-ce folder
i) Once the unzip is complete you will have a folder biserver-ce
ii) Do the following:
root@hostname:/opt/pentaho# cd biserver-ce
root@hostname:/opt/pentaho/biserver-ce# ls
OUTPUT
data               import-export.sh  pentaho-solutions  set-pentaho-env.bat  start-pentaho.bat        start-pentaho-debug.sh  stop-pentaho.bat  third-party-tools
import-export.bat  licenses          promptuser.js      set-pentaho-env.sh   start-pentaho-debug.bat  start-pentaho.sh        stop-pentaho.sh   tomcat
NOTE: You need to give the .sh files permission to run.
Issue the below command from inside the biserver-ce folder: chmod -R 777 *.sh

5) Start the bi server using below command

root@hostname:/opt/pentaho/biserver-ce# ./start-pentaho.sh
OUTPUT
/opt/pentaho/biserver-ce
/opt/pentaho/biserver-ce
DEBUG: _PENTAHO_JAVA_HOME=/usr/lib/jvm/jre
DEBUG: _PENTAHO_JAVA==/usr/lib/jvm/jre/bin/java
Using CATALINA_BASE:   /opt/pentaho/biserver-ce/tomcat
Using CATALINA_HOME:   /opt/pentaho/biserver-ce/tomcat
Using CATALINA_TMPDIR: /opt/pentaho/biserver-ce/tomcat/temp
Using JRE_HOME:        /usr/lib/jvm/jre
Using CLASSPATH:       /opt/biserver-ce/tomcat/bin/bootstrap.jar

NOTE: stop the server by issuing this command:
root@hostname:/opt/pentaho/biserver-ce# ./stop-pentaho.sh
/opt/pentaho/biserver-ce
/opt/pentaho/biserver-ce
DEBUG: _PENTAHO_JAVA_HOME=
DEBUG: _PENTAHO_JAVA=java
Using CATALINA_BASE:   /opt/pentaho/biserver-ce/tomcat
Using CATALINA_HOME:   /opt/pentaho/biserver-ce/tomcat
Using CATALINA_TMPDIR: /opt/pentaho/biserver-ce/tomcat/temp
Using JRE_HOME:        /usr/lib/jvm/jre
Using CLASSPATH:       /opt/biserver-ce/tomcat/bin/bootstrap.jar

6) Go to the web browser and enter the below URL:
http://YourHost:8080/pentaho
Example : 192.168.56.92:8080/pentaho

NOTE: Pentaho uses port number 8080 by default.
Note: The path below is for a Windows machine; on Linux the file is under biserver-ce/tomcat/conf/.
If you want to change the default port number, open D:\Software\biserver-ce-5.0.1-stable\biserver-ce\tomcat\conf\server.xml, find port number 8080, replace it with the port number you require, then save the file and exit.
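For reference, the element to edit in tomcat/conf/server.xml looks roughly like the following (the exact attribute set may vary by Tomcat version; 9090 is an arbitrary example port):

```xml
<!-- Change the HTTP connector port from the default 8080 -->
<Connector URIEncoding="UTF-8" port="9090" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
```

After saving the change, restart the BI server and browse to the new port, e.g. http://YourHost:9090/pentaho.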

4. Pentaho Home Page
The credentials you need to supply once you get the home page are:
User name: Admin
Password: password

Installed Pentaho

7) Then click Home, go to the Marketplace and install plugins according to requirements, such as Community Dashboard Editor (CDE), Community Data Access (CDA), Community Dashboard Framework (CDF), Saiku Analytics, etc.
8) Then stop the BI server and start it again.

Rohit Verma

Helical IT Solutions

Create Organization using REST Webservice in Jasperserver / Jaspersoft

This blog will talk about how to create organizations using REST webservice in JasperServer.

STEP 1:-

Put the following jar files in lib folder:

commons-codec-1.4.jar

commons-io-1.4.jar

httpcore-4.1.jar

httpclient-4.1.1.jar

httpmime-4.1.jar

jakarta-httpcore.jar

restclient-cli.3.2.2-jar-with-dependencies.jar

servlet-api.jar

 

STEP 2:-

Create a jsp page

index.jsp:

<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Insert title here</title>
</head>
<body>
<form action="TestWebService">
<input type="submit" value="submit">
</form>
</body>
</html>

 

STEP 3:-

Create the client:

com.helical.restwebservices -> TestWebService.java

TestWebService.java

package com.helical.restwebservices;

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.List;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.DefaultHttpClient;

public class TestWebService extends HttpServlet {

    private static final long serialVersionUID = 1L;

    public void doGet(HttpServletRequest req, HttpServletResponse res) {
        String jSessionId = jWSLogin("[email protected]", "test", null);
        addOrganization(jSessionId, res);
    }

    // Authenticate against the JasperServer login service and return the
    // JSESSIONID cookie for use in subsequent REST calls.
    public String jWSLogin(String username, String password, String organization) {
        if (username == null || password == null) {
            throw new RuntimeException("Failed: Username or Password can't be null");
        }
        // For multi-tenant logins the username is "user|organization";
        // %7C is the URL-encoded pipe character.
        String j_uname = (organization == null || organization.trim().isEmpty())
                ? username : username + "%7C" + organization;
        String j_password = password;
        String jSessionIdCookie = "";
        try {
            URL url = new URL("http://test.com/login");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            String input = "j_username=" + j_uname + "&j_password=" + j_password;
            OutputStream os = conn.getOutputStream();
            os.write(input.getBytes());
            os.flush();
            if (conn.getResponseCode() != 200) {
                throw new RuntimeException("Failed : HTTP error code : " + conn.getResponseCode());
            } else {
                List<String> cookies = conn.getHeaderFields().get("Set-Cookie");
                for (String cookie : cookies) {
                    jSessionIdCookie = cookie;
                }
            }
            conn.disconnect();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return jSessionIdCookie;
    }

    // POST a JSON organization descriptor to the rest_v2 organizations service.
    public void addOrganization(String jSessionId, HttpServletResponse res) {
        HttpClient httpClient = new DefaultHttpClient();
        try {
            HttpPost request = new HttpPost("http://test.com/rest_v2/organizations?create=true");
            request.setHeader("Cookie", jSessionId);
            StringEntity params = new StringEntity(
                "{\"id\":\"helicaltest123\",\"alias\":\"helicaltest123\",\"parentId\":\"organizations\","
                + "\"tenantName\":\"helicaltest123\",\"tenantDesc\":\"Audit Department of Finance\","
                + "\"theme\":\"default\"}");
            request.addHeader("content-type", "application/json");
            request.setEntity(params);
            HttpResponse response = httpClient.execute(request);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            httpClient.getConnectionManager().shutdown();
        }
    }
}

NOTE:-

We have created two methods in TestWebService.java

  1. jWSLogin()
  2. addOrganization()

jWSLogin() is used to authenticate; it returns the session cookie so that we can reuse the session in subsequent calls.
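The organization handling in jWSLogin() can be isolated into a small helper: multi-tenant JasperServer logins send "user|organization" as the j_username value, with the pipe URL-encoded as %7C. A standalone sketch of just that piece:

```java
public class LoginNameEncoder {
    // Builds the j_username value: append "%7C<organization>" (the
    // URL-encoded '|') only when a non-blank organization is supplied.
    static String jUsername(String username, String organization) {
        if (organization == null || organization.trim().isEmpty()) {
            return username;
        }
        return username + "%7C" + organization;
    }

    public static void main(String[] args) {
        System.out.println(jUsername("jasperadmin", null));             // jasperadmin
        System.out.println(jUsername("jasperadmin", "organization_1")); // jasperadmin%7Corganization_1
    }
}
```

This matches the ternary logic in jWSLogin() while being easier to test in isolation.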

addOrganization() is used to add an organization. The data we want to insert should be in JSON or XML format; in this example I have used JSON.
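Unescaped, the JSON payload built in addOrganization() is:

```json
{
  "id": "helicaltest123",
  "alias": "helicaltest123",
  "parentId": "organizations",
  "tenantName": "helicaltest123",
  "tenantDesc": "Audit Department of Finance",
  "theme": "default"
}
```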

 

STEP 4:-

Create web.xml:

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
         id="WebApp_ID" version="2.5">
  <display-name>RestWebServices-new</display-name>
  <welcome-file-list>
    <welcome-file>index.jsp</welcome-file>
  </welcome-file-list>
  <servlet>
    <servlet-name>TestWebService</servlet-name>
    <servlet-class>com.helical.restwebservices.TestWebService</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>TestWebService</servlet-name>
    <url-pattern>/TestWebService</url-pattern>
  </servlet-mapping>
</web-app>

 

For any other help on Jasper Server, please get in touch with us

Helical IT Solutions

Using Pentaho Schema Workbench (PSW)

Schema:-
A schema defines a multi-dimensional database. It contains a logical model, consisting of cubes, hierarchies, and members, and a mapping of this model onto a physical model.
The logical model consists of the constructs used to write queries in MDX language: cubes, dimensions, hierarchies, levels, and members.
The physical model is the source of the data which is presented through the logical model. It is typically a star schema, which is a set of tables in a relational database; later, we shall see examples of other kinds of mappings.

Schema files :
Mondrian schemas are represented in an XML file. An example schema, containing almost all of the constructs we discuss here, is supplied as demo/FoodMart.xml in the Mondrian distribution. The dataset to populate this schema is also in the distribution.

The structure of the XML document is as follows:
<Schema>
  <Cube>
    <Table>
      <AggName>
        <!-- aggregate table mapping -->
      </AggName>
    </Table>
    <Dimension>
      <Hierarchy>
        <Level/>
      </Hierarchy>
    </Dimension>
    <Measure/>
  </Cube>
</Schema>

You can see an example:
Step-1 Open Schema Workbench.
Step-2 Go to File -> New -> Schema.
Step-3 Click on the schema and set its name, e.g. foodmart.
Step-4 Right click on the schema and select Add Cube.
Step-5 Click on the cube and set its name, e.g. Sales, or whatever you want.
Step-6 Right click on the cube, select Add Table, and set the schema (e.g. public) and the name of the database table, e.g. sales_fact_1997.
Step-7 Right click on the cube again and select Add Dimension:
<Dimension type="StandardDimension" visible="true" foreignKey="store_id" name="Dimension Test">
Double click on the dimension to see its hierarchy, and set the hierarchy: <Hierarchy name="Hierarchy Test" visible="true" hasAll="true">
Step-8 Inside the hierarchy, right click on the hierarchy and select Add Table, then provide the table and schema name, e.g.: <Table name="store" schema="public">
Step-9 In the hierarchy you need to set a level, i.e. right click on the hierarchy and select Add Level: <Level name="Level Test" visible="true" table="store" column="store_country" nameColumn="store_country" uniqueMembers="false">
Step-10 Inside the cube, right click on the cube and select Add Measure: <Measure name="Measure Creation" column="customer_id" datatype="Integer" aggregator="count" visible="true">
Step-11 You can create as many cubes as required; the procedure is the same.
Step-12 Now publish the schema you created to the server.
Step-13 Click Options -> Connection and set:
connection name = foodmart
connection type: whichever you want, e.g. PostgreSQL
Host Name: 192.168.2.9
Database Name: foodmart
Port Number: 5432
Username: postgres
Password: postgres
Access: Native (JDBC)
Click the Test button; if no problem arises, click OK.
Step-14 To verify the local server, type localhost:7080/pentaho in the browser and log in with the username and password provided.
Step-15 Go to File -> Publish and set the server URL: http://localhost:7080/pentaho/
user: Admin
password: password
Pentaho or JNDI Data Source: foodmart, then click Publish.
A message confirming the connection succeeded will be shown on screen.
Step-16 Now go to localhost:7080/pentaho in the browser, click File -> New -> JPivot View, then select the schema and cube you created. Click and check the icons to see all the output types.
Step-17 This is the process to create a schema and deploy it on a local server.
The following schema (Schema1.xml) will execute properly:
<Schema name="foodmart">
  <Dimension type="StandardDimension" visible="true" highCardinality="false" name="Store">
    <Hierarchy visible="true" hasAll="true" primaryKey="store_id">
      <Table name="store" schema="public"/>
      <Level name="Store Country" visible="true" column="store_country" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never"/>
      <Level name="Store State" visible="true" column="store_state" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never"/>
      <Level name="Store City" visible="true" column="store_city" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never"/>
      <Level name="Store Name" visible="true" column="store_name" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never">
        <Property name="Store Type" column="store_type" type="String"/>
        <Property name="Store Manager" column="store_manager" type="String"/>
        <Property name="Store Sqft" column="store_sqft" type="Numeric"/>
      </Level>
    </Hierarchy>
  </Dimension>
  <Cube name="Sales" visible="true" cache="true" enabled="true">
    <Table name="sales_fact_1997" schema="public"/>
    <DimensionUsage source="Store" name="Store" visible="true" foreignKey="store_id" highCardinality="false"/>
    <Measure name="Unit Sales" column="unit_sales" formatString="Standard" aggregator="sum" visible="true"/>
  </Cube>
  <Cube name="Warehouse" visible="true" cache="true" enabled="true">
    <Table name="inventory_fact_1997" schema="public"/>
    <DimensionUsage source="Store" name="Store" visible="true" foreignKey="store_id" highCardinality="false"/>
    <Measure name="Store Invoice" column="store_invoice" aggregator="sum" visible="true"/>
  </Cube>
</Schema>
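Once published, the Sales cube above can be queried with MDX, for example via JPivot. A minimal query (all names taken from the schema above):

```
SELECT {[Measures].[Unit Sales]} ON COLUMNS,
       {[Store].[Store Country].Members} ON ROWS
FROM [Sales]
```

This returns total unit sales broken down by store country.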

For any questions on Pentaho Schema Workbench, please get in touch with us @ Helical IT Solutions

Pentaho BI and Ctools – Introduction

Pentaho Overview:
Pentaho offers a suite of open source Business Intelligence (BI) products. Pentaho Business Analytics provides data integration, OLAP services, reporting, dashboarding, data mining, ETL capabilities and a BI platform. Like Jaspersoft, it is a web application and uses a Tomcat server.

Pentaho BI Platform:
The Pentaho BI Platform supports Pentaho's end-to-end business intelligence capabilities and provides central access to your business information, with back-end security, integration, scheduling, auditing and more. The Pentaho BI Platform has been designed so that it can be scaled to meet the needs of any size of organization.

OLAP:-
OLAP stands for Online Analytical Processing; OLAP is an approach to answering multi-dimensional analytical queries swiftly.

Dash Boarding:-
Pentaho Dashboards give business users the critical information they need to understand and improve organizational performance. They provide immediate insight into individual, departmental, or enterprise performance by delivering key metrics in an attractive and intuitive visual interface.

ETL (Extract Transform Load):-
ETL systems are commonly used to integrate data from multiple applications, typically developed and supported by different vendors or hosted on separate computer hardware.

Data integration:
Data integration involves combining data residing in different sources and providing users with a unified view of these data.

There are many tools within Pentaho; the main ones are elaborated below.

1) PRD – Pentaho Report Designer (Reporting)
Transform all your data into meaningful information tailored to your audience with Reporting, a suite of Open Source tools that allows you to create pixel-perfect reports of your data in PDF, Excel, HTML, Text, Rich-Text-File, XML and CSV. These computer generated reports easily refine data from various sources into a human readable form.

2) PSW – Pentaho Schema Workbench or Mondrian Schema Workbench
The Mondrian Schema Workbench is a designer interface that allows you to create and test Mondrian OLAP cube schemas visually. The Mondrian engine processes MDX requests with the ROLAP (Relational OLAP) schemas. These schema files are XML metadata models that are created in a specific structure used by the Mondrian engine. These XML models can be considered cube-like structures which utilize existing FACT and DIMENSION tables found in your RDBMS. It does not require that an actual physical cube is built or maintained; only that the metadata model is created.

3) PME – Pentaho Metadata Editor
Pentaho Metadata Editor (PME) is a tool that allows you to build Pentaho metadata domains and relational data models. A Pentaho Metadata Model maps the physical structure of your database into a logical business model.

4) PDI – Pentaho Data Integration
Pentaho data integration prepares and blends data to create a complete picture of your business that drives actionable insights.

Inside the server there are many tools, such as:
1) C-Tools -> CDA, CDF, CDE
C-Tools are used to create dashboards.
CDA: Community Data Access
CDA – Community Data Access is a community project that allows you to fetch data in various formats from the Pentaho BI platform. It is accessed by simple URL calls and supports the following data sources:
1)      SQL
2)      MDX
3)      Metadata
4)      Kettle
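For example, once a .cda file is defined, a CDA query can be fetched with a simple URL call. In Pentaho 5 the CDA plugin endpoint has the following shape (the path and dataAccessId below are placeholders for your own dashboard's CDA file, not real repository contents):

```
http://localhost:8080/pentaho/plugin/cda/api/doQuery?path=/public/demo/sample.cda&dataAccessId=1&outputType=json
```

Changing outputType (e.g. to csv or xls) changes the format of the returned data.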

CDF : Community Dashboard Framework
Community Dashboard Framework (CDF) is a project that allows you to create friendly, powerful and fully featured Dashboards on top of the Pentaho Business Intelligence server.
CDE: Community Dashboard Editor
CDE and the technology underneath it (CDF, CDA and CCC) allows the development and deployment of Pentaho Dashboards in a fast and effective way.

2) Saiku Analytics: explore data, visualize, etc.

 

Please get in touch with us for any Pentaho related queries, Helical IT Solutions