Python – Get data from CSV and create chart

Hello everyone, this blog shows how we can import data from a CSV file and plot it as a mesh-grid chart using Python libraries, with the help of the PyCharm IDE (Community Edition).

Prerequisites:

The following are the steps for creating a mesh grid in PyCharm:
1. Installing the NumPy, Pandas and Matplotlib libraries in the PyCharm IDE:

  • 1. Create a scratch file in PyCharm.
  • 2. Click on "File" and go to Settings.
  • 3. Go to "Project Interpreter" and click on the '+' symbol to add the new libraries (NumPy, Pandas, Matplotlib).

2. Writing Python code for importing the CSV data and creating the mesh grid:

  • 1. Create a CSV file containing the values "-5,5,0.01".
  • 2. Use this Python code in your scratch file:

    import csv
    import numpy as np
    import matplotlib.pyplot as plt

    point = ""
    with open('C:/Users/Helical/Desktop/Blog/Python/MeshGrid.csv', 'r') as csvfile:
        meshreader = csv.reader(csvfile, delimiter=',')
        for row in meshreader:
            point = point + ', '.join(row)
    meshpoints = point.split(',')
    points = np.arange(float(meshpoints[0]), float(meshpoints[1]), float(meshpoints[2]))
    dx, dy = np.meshgrid(points, points)
    z = np.sin(dx) + np.sin(dy)
    plt.title('plot for sin(x)+sin(y)')
    plt.pcolormesh(dx, dy, z)
    plt.show()

  • If you are able to successfully run the code, the mesh grid will be generated in a new PyCharm window.
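The same flow can also be wrapped into small reusable functions. The sketch below is a variant of the script above (the CSV path and output file name are placeholders); matplotlib is imported lazily so the numeric part runs even without it:

```python
import csv

import numpy as np


def load_mesh_params(csv_path):
    """Read "start,stop,step" (e.g. "-5,5,0.01") from the first CSV row."""
    with open(csv_path, newline="") as f:
        row = next(csv.reader(f))
    start, stop, step = (float(v) for v in row)
    return start, stop, step


def compute_surface(csv_path):
    """Build the mesh grid and evaluate z = sin(x) + sin(y) on it."""
    start, stop, step = load_mesh_params(csv_path)
    points = np.arange(start, stop, step)
    dx, dy = np.meshgrid(points, points)
    return np.sin(dx) + np.sin(dy)


def plot_surface(csv_path, out_png="meshgrid.png"):
    """Render the surface to a PNG; matplotlib is imported lazily so the
    numeric helpers above stay usable without it."""
    import matplotlib.pyplot as plt  # assumes matplotlib is installed
    z = compute_surface(csv_path)
    plt.title("plot for sin(x)+sin(y)")
    plt.imshow(z, origin="lower", cmap="viridis")
    plt.colorbar()
    plt.savefig(out_png)
    plt.close()
```

With the example CSV "-5,5,0.01" this produces a 1000×1000 surface.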

Hide logout option for externally authenticated users in JasperReport Server


  • An external user is a user outside of an organization who does not directly log in to JasperReports Server.
  • JasperReports Server does not store the passwords of external users; only the login name is stored in an external user account, so the server is not impacted by changes to user passwords or password policies on the external authority.
  • If you want to hide the logout option for externally authenticated users in JasperReports Server, the following are the steps to achieve it:

JasperReports Server version used: 6.4.0

  1. Go to the path: \WEB-INF\decorators\
  2. Open the decorator.jsp file.
  3. Comment out the code below:
  4.         <authz:authorize ifNotGranted="ROLE_ANONYMOUS">
                <li id="main_logOut" class="last" role="menuitem">
                    <a id="main_logOut_link" tabindex="-1">
                        <spring:message code="menu.logout"/>
                    </a>
                </li>
            </authz:authorize>

  5. Before the commented code, add the code below, which checks for an externally authenticated user and hides the logout option for them:

  6. <c:set var="isExternallyDefinedUser" value="<%= ((com.jaspersoft.jasperserver.api.metadata.user.domain.User)SecurityContextHolder.getContext().getAuthentication().getPrincipal()).isExternallyDefined()%>"/>
     <c:if test="${!isExternallyDefinedUser}">
         <authz:authorize ifNotGranted="ROLE_ANONYMOUS">
             <li id="main_logOut" class="last" role="menuitem">
                 <a id="main_logOut_link" tabindex="-1">
                     <spring:message code="menu.logout"/>
                 </a>
             </li>
         </authz:authorize>
     </c:if>

  7. Save the decorator.jsp file.

There are two ways to test the hidden logout option for an externally authenticated user:

  1. Custom authentication.
  2. Manually update the "externallydefined" flag in the database and test.

We will test the hidden logout option using the second way:

  1. Connect to the JasperServer PostgreSQL database.
  2. Check the users present in JasperServer by executing the query below:

Select * from jiuser;

  1. You will get the details of all users present in Jasper server; check the id of the user you want to make an externally defined user.
  2. Let us assume you want to make testuser an externally defined user and its id is 65874. To make it externally defined, execute the query below, which sets the "externallydefined" flag to "t":

update jiuser set externallydefined = true where id = 65874;

Verify the flag value by executing: select * from jiuser;

  1. Now restart the JasperServer and its repository (PostgreSQL).
  2. Log in to Jasper server with superuser access through the browser.
  3. Go to Manage -> Users and select testuser.
  4. Click on the "Login as User" button.


  5. You will be logged in directly as testuser, which does not have the logout option.





Sayali Mahale | Helical IT Solutions

Creating a folder under Functionalities Tree in SpagoBI Server


In general, SpagoBI uses its own functionalities tree, which allows you to better organize documents/reports by grouping them in folders. Although built-in folders are provided for organization (Analytical Documents, Custom Documents and Audit & Monitoring under the functionalities tree), in most cases we want to place our reports under different folders.

These folders are visible to users based on role-based permissions. This structure can be created or modified only by the administrator.

Let’s start creating a folder under Functionalities Tree:

  1. Login as biadmin.
  2. From the left menu, select Functionalities Management under the Profile tab.
  3. Clicking on a node of the functionalities tree shows a set of possible actions:
     • Insert
     • Detail
     • Erase
     • Move up/down
  4. After clicking the Insert option, fill in Label, Name & Description (optional) on the Functionalities Detail page.
  5. Assign roles as preferred and save the page.


Thanks & Regards
Venkatanaveen Dasari™

Introduction to SpagoBI



SpagoBI allows you to produce reports using structured information views (lists, tables, crosstabs, graphs), and it enables multidimensional analysis through OLAP engines, which are more flexible and user-friendly compared to structured reports.

The overall architecture is composed of five modules:

SpagoBI Server:

  • The main module, which provides all the BI capabilities, is the SpagoBI Server. This is the only mandatory module, the one that offers a web-based end-user environment to run all types of analysis.
  • This is the only place / server where all BI documents are stored and used by end users. The end-user GUI is entirely web based and can be accessed via a common web browser


SpagoBI Meta:

  • It is the module of the SpagoBI suite specifically focused on technical metadata management, whereas business metadata are managed by SpagoBI Server.


SpagoBI Studio:

  • Developers use SpagoBI Studio (client application) to develop analytical documents using the business model, to get data and deploy documents into a remote SpagoBI Server.


SpagoBI SDK:

  • It is the standard development kit for Java developers who need to integrate SpagoBI with their applications.
  • It is a collection of web services, tag libraries and APIs that allow an external application to connect to SpagoBI Server, making it possible to use some SpagoBI documents and functionalities in that environment.
  • It is mainly used to authenticate the user and to send or receive metamodels, analytical documents or data sets.


In short, the SDK is the integration layer that allows external tools and applications to interact with SpagoBI Server.

SpagoBI Applications:

  • SpagoBI Applications are ready to use; usually, they don't include code.
    Some pre-built analytical models can be installed into SpagoBI Server and immediately released to end users.


SpagoBI offers many engines for the reporting and analytical areas; the following are a few:

JasperReport, BIRT Report, BO (displaying reports managed by Business Objects 6.5),
JPivot/Mondrian, JPalo/Mondrian, JPivot/XMLA Server.

Overview of Jasper Server Authentication & Authorization


In JasperReports Server, access control is determined by two separate mechanisms, authentication and authorization.

  • Authentication is the process of verifying user credentials to give access to the server.
  • Authorization is the process of checking permissions granted to users and roles to access resources within the server.


Authentication is the verification of a user's identity to allow access to JasperReports Server. By default, anonymous access is disabled, and all users must present valid credentials to log in: a user ID, a password, and in certain cases an organization ID. Authentication is more than gathering the user credentials; it is a process that ensures that every page is secure, either viewed by a verified user or denied access until valid credentials are provided.

Jasper server supports the following two types of authentication mechanisms:

Internal Authentication (default):- database-based.


External Authentication:- the process of gathering and verifying user credentials through a third-party application; for example:

  • LDAP

Lightweight Directory Access Protocol (LDAP) is one of the most popular architectures for enterprise directories. By centralizing all user management in an LDAP directory, applications across the enterprise can share the same user database, and administrators do not need to duplicate user accounts in every application. LDAP authentication does not provide single sign-on (SSO) functionality.


  • CAS

Central Authentication Service (CAS) is an open source, Java-based authentication server that includes a mechanism for single sign-on (SSO) across web applications, including those running on different application servers. When a user requests a page from a CAS-enabled web application, the application redirects the user to the CAS server login page. Thereafter, logged-in users can navigate between all participating applications without needing to log in again. Each application communicates with the CAS server in the background to verify that the user is valid before providing access to its resources.


Authentication happens only once at the beginning of the user’s session. After a user is authenticated, the user’s session is represented by an in-memory instance referred to as the principal object. The existence of the principal object determines that the user is logged on, and the user can access pages throughout the application. The principal object also stores the roles and organization ID of the user, which are required for authorization within JasperReports Server.


Authorization is the verification of a user’s roles and organization ID to access features of the server, resources in the repository, and, in some cases, data. For every page that the user requests, the server determines which menu items, resources, and report contents the user can access, based on the principal object. Authorization happens every time a user accesses a resource.

In the JasperReports Server architecture, which is based on the Spring Framework and Spring Security, authorization is always performed by internal mechanisms. Part of configuring external authentication is to define a mapping of external roles and organization IDs into the principal object so that authorization can proceed internally.

Authorization mechanism:-

  • In JasperReports Server, all resources are stored in the repository in a hierarchical structure of folders.
  • Each resource in the repository has a set of permissions explicitly granting certain kinds of access. Folders also have permissions, and all contents of a folder inherit these permissions if they do not explicitly define their own. Often, many resources need the same permissions, and therefore it is easier to manage permissions on folders.
  • Permissions can grant access to users or to roles. Roles are groups of users created for the purpose of simplifying authorization. Often, it is easier to manage permissions for groups of users, and then manage role membership separately. However, if necessary, permissions can be granted to specific users.
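The folder-inheritance rule described above can be sketched in a few lines of Python. This is a simplified model, not JasperReports Server code; the path and role names are illustrative:

```python
def effective_permissions(path, explicit):
    """Resolve permissions for a repository path.

    A resource uses its own explicit permissions if present; otherwise it
    inherits from the nearest ancestor folder that defines any.
    `explicit` maps repository paths to permission dicts, e.g.
    {"/reports": {"ROLE_USER": "read"}}.
    """
    while True:
        if path in explicit:
            return explicit[path]
        if path in ("", "/"):
            return {}  # nothing set anywhere up the tree: no access granted
        # Walk one level up the folder hierarchy.
        path = path.rsplit("/", 1)[0] or "/"
```

For example, a report at /reports/hr/salary with no explicit permissions of its own picks up whatever is set on /reports/hr, then /reports, then the root.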


– By Archana Verma

BI Solution For Property and Casualty Insurance Domain

A BI solution helps an insurance company get a holistic, 360-degree view of its customers and the issues they face. Helical has immense experience in developing BI solutions in the property and casualty insurance domain. Having built more than 50 reports, 20 dashboards, geographical dashboards, ad hoc reports and OLAP cubes, these BI solutions can be used across many departments such as underwriting, claims, billing and reinsurance.

Below are snapshots of some of the reports and dashboards developed by us for clients.

Loss Stratification


This report includes stratification of total incurred, with the claim count within each stratum. It also allows you to define the bands ($0-$100K, $100K-$250K, etc.) to meet your needs.


This information is useful in reviewing severity patterns. For example, we see in this report that approximately 60-65% of claim count is generated by losses valued at $1,000 or less. Loss stratification provides information that can be used in several ways. It is an important consideration in insurance program design – retentions, limits, etc. And, this report is also useful in setting severity reduction goals and monitoring performance.
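The stratification behind this kind of report boils down to bucketing claims by incurred amount. A minimal sketch (the band edges and field layout are illustrative, not the report's actual implementation):

```python
def stratify(claims, edges):
    """Count claims and sum incurred per band.

    `claims` is a list of incurred amounts; `edges` are ascending band
    upper bounds, with a final open-ended band above the last edge.
    """
    bands = [{"count": 0, "incurred": 0.0} for _ in range(len(edges) + 1)]
    for amount in claims:
        # First band whose upper bound covers the amount, else the open band.
        i = next((k for k, e in enumerate(edges) if amount <= e), len(edges))
        bands[i]["count"] += 1
        bands[i]["incurred"] += amount
    total = len(claims)
    for b in bands:
        b["pct_of_count"] = round(100.0 * b["count"] / total, 1) if total else 0.0
    return bands
```

With edges [1000, 100000] this yields the $0-$1K, $1K-$100K and $100K+ strata, each with its share of the claim count.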

loss stratification report


Loss Triangle


A loss triangle is a table of loss experience showing total losses for a certain period at various regular valuation dates, reflecting the change in amounts as claims mature. Older periods in the table have one more entry than the next-youngest period, giving the data its triangle shape. Loss triangles can be used to determine loss development for a given risk.
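The triangle described above can be assembled from per-period cumulative valuations. A toy sketch (the key layout and period labels are illustrative):

```python
def loss_triangle(valuations):
    """Build a loss development triangle.

    `valuations` maps (origin_period, development_age) -> cumulative loss.
    Returns one row per origin period, ordered oldest first; older periods
    carry more development entries, giving the characteristic triangle shape.
    """
    origins = sorted({o for o, _ in valuations})
    triangle = []
    for origin in origins:
        ages = sorted(a for o, a in valuations if o == origin)
        triangle.append((origin, [valuations[(origin, a)] for a in ages]))
    return triangle
```

For three origin years valued at 12-month intervals, the oldest year contributes three entries, the next two, and the newest one.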

Loss Triangle Report


Large Loss Report


This report provides detailed information on individual claims. It allows selection of claims over a certain threshold.

The amount of detail shown on this type of report can be completely customized to meet each client’s needs.


By isolating claims over a chosen dollar amount, clients are able to focus upon claims making the greatest contribution to total incurred. Uses include claim reviews with administrators, insurance submissions, actuarial analysis and focusing operations staff on opportunities such as lost time reduction.
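Isolating large losses is essentially a threshold filter with a descending sort. Sketched below with illustrative field names:

```python
def large_losses(claims, threshold):
    """Return claims with total incurred above `threshold`, largest first.

    Each claim is a dict with at least an "incurred" key; any other detail
    fields pass through untouched for display on the report.
    """
    hits = [c for c in claims if c["incurred"] > threshold]
    return sorted(hits, key=lambda c: c["incurred"], reverse=True)
```

A $100K threshold then surfaces only the claims driving the bulk of total incurred.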

large loss report



Open Claims


These reports can provide a simple listing with relatively little detail or much more complete information for specific claims.


It is often valuable to focus specifically on open claims. We have worked with clients to develop a number of approaches and reports to address this need. Often the focus is to mitigate lost time and close claims. Most effective claim management programs include reinforcement of this objective to both operations staff and claim administrators.

The examples provided here show all open claims, as well as open claims with their total incurred.

Open Claims Report



Average Claims by Severity


The report displays business KPIs such as Incurred Loss, Paid Amount, Claim Frequency and Average Incurred Loss, summarized by Adjusting Office.
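The roll-up behind this report is a group-by over claims. A minimal sketch (field names are illustrative):

```python
from collections import defaultdict


def kpis_by_office(claims):
    """Summarize incurred loss, paid amount, claim count and average
    incurred per adjusting office. Each claim is a dict with "office",
    "incurred" and "paid" keys."""
    agg = defaultdict(lambda: {"incurred": 0.0, "paid": 0.0, "claims": 0})
    for c in claims:
        row = agg[c["office"]]
        row["incurred"] += c["incurred"]
        row["paid"] += c["paid"]
        row["claims"] += 1
    for row in agg.values():
        row["avg_incurred"] = row["incurred"] / row["claims"]
    return dict(agg)
```

Claim frequency would come from dividing the claim count by an exposure base (policies or premium), which depends on the client's data.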

average claims severity by adjuster



Claims Cause

Description: This report helps identify specific causes of loss.

Claims Cause Report





Claims Registered

Description: The report provides detailed information on all registered claims. End users can filter the data by date (month and year), company, line of business, dealer group, etc. The report is populated based on the selected input parameters.


Claims Registered Report


Claims Schedule – By Company

Description: The report provides detailed information regarding the Payments, Loss Adjusting Expense, Outstanding Loss Reserves, Incurred Loss and Salvage Amount by claims.

Claims Schedule



Claims Recovery Summary

Description: The report displays summary of recovery amount by Country, Company, Agent, Line of Business, Claims, Coverage and Date of Loss.

claims recovery summary



General Premium Summary Report

Description: The report displays summary of premium amounts by Country, Company, LOB, Dealer group and Coverage.

general premium summary report




Inforce & Unearned Premium Summary Report

Description: The report displays a summary of new/renewal in-force premium amounts and new/renewal unearned premium amounts by Dealer Group and Coverage.

Premium summary report




Loss Paid and Reserve by LOB

Description: The report displays summary of Loss Paid (MTD), Loss Paid (YTD), Loss Reserve, Expense Paid (MTD), Expense Paid (YTD) and Incurred Amount by LOB.

loss paid report




Policy Transaction Report

Description: The report gives details of policy transactions in given period.

policy transaction report




Premium Bordereaux

premium bordereaux




Written & Earned Premium Comparison

Description: The report displays a comparison of policy count, written premium and earned premium between the current period and the previous period.

premium comparison report




YTD Loss Listing

Description: The report displays YTD summary of Loss Payments, Recoveries and Incurred Loss by Company, Coverage, Dealer Group, Dealer, Claim Number and Date of Loss.

YTD loss listing




Premium production by agent

Description: The report allows the user to view Written Premium, Average Written Premium, and Policy Count for new and renewal premiums by Agent.

premium production by agent




Profitability – Top 10 Agents

For the selected duration, this report shows the amount earned from different companies.

Profitability top 10 agents




Policy Register

Description: The report provides detailed information of all policies registered for the selected period.

policy register



Executive Dashboard

There can be various dashboards. For example, the dashboard below shows information such as loss ratio, incurred loss amount and change percentage, written and earned premium, and money earned from new business versus renewals.

executive dashboard



Underwriting Dashboard

The dashboard below, for the selected product, shows written versus earned premium and revenue growth over a period of time from new business, renewals and retention. It also shows the loss ratio.

underwriting dashboard




What – If Analysis

A what-if analysis in which there is an input parameter to select the product. Once the product is selected, the end user can adjust parameters such as policy premium, policies per month, renewal retention and claim severity, and accordingly see how they affect the loss ratio, earned premium, written premium, etc.
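The core arithmetic of such a what-if screen can be sketched in a few lines. All the inputs below are hypothetical levers, and the projection model is deliberately simplistic:

```python
def what_if(policy_premium, policies_per_month, renewal_retention,
            claim_frequency, claim_severity, months=12):
    """Project written premium, incurred losses and the resulting loss
    ratio for one simple scenario.

    `claim_frequency` is claims per policy per year, `claim_severity` the
    average cost per claim, `renewal_retention` a 0-1 fraction of the new
    book that also renews during the period.
    """
    new_policies = policies_per_month * months
    policies_in_force = new_policies * (1 + renewal_retention)
    written_premium = policies_in_force * policy_premium
    incurred_losses = policies_in_force * claim_frequency * claim_severity
    return {
        "written_premium": written_premium,
        "incurred_losses": incurred_losses,
        "loss_ratio": incurred_losses / written_premium,
    }
```

Changing any one lever, say claim severity, immediately moves the projected loss ratio, which is exactly the interaction the what-if report exposes.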


For having a demo of the same please get in touch at [email protected]

Nikhilesh Tiwari

Helical IT Solutions

Define JNDI for Pentaho BA server 5.0


Note: For illustration I’m showing Oracle 11g configuration. Replace the Resource name, username, password, driverClassName, and url parameters, or any relevant connection settings.

Add Driver

Place the appropriate driver for Oracle 11g, ojdbc6, in this directory for the BA Server: /pentaho/server/biserver-ee/tomcat/lib/.

Note: There should be only one driver for one database in this directory. Ensure that there are no other versions of the same vendor’s driver in this directory. If there are, back up the old driver files and remove them to avoid version conflicts.

Specify JNDI connections

  1. Stop Tomcat & the BA server.
  2. Edit the /tomcat/webapps/pentaho/WEB-INF/web.xml file.
  3. At the end of the <web-app> element, in the same part of the file where you see <!-- insert additional resource-refs -->, add this XML snippet:
  4. <resource-ref>
         <description>myDatasource</description>
         <res-ref-name>jdbc/myDatasource</res-ref-name>
         <res-type>javax.sql.DataSource</res-type>
         <res-auth>Container</res-auth>
     </resource-ref>

  5. Save & Close web.xml
  6. Open & modify this file /tomcat/webapps/pentaho/META-INF/context.xml.
  7. Add this XML snippet inside the <Context> element of context.xml (adjust the connection values to match your database):

     <Resource name="jdbc/myDatasource" auth="Container" type="javax.sql.DataSource"
               driverClassName="oracle.jdbc.OracleDriver"
               url="jdbc:oracle:thin:@//localhost:1521/sysdba"
               username="dbuser" password="password"
               maxActive="20" maxIdle="5" maxWait="10000"/>
  8. Save & Close context.xml
  9. Open simple jndi
  10. Open & Modify this file biserver-ce/pentaho-solutions/system/simple-jndi/
  11. Add these lines
  12. myDatasource/type=javax.sql.DataSource
      myDatasource/driver=oracle.jdbc.OracleDriver
      myDatasource/url=jdbc:oracle:thin:@//localhost:1521/sysdba
      myDatasource/user=dbuser
      myDatasource/password=password

  13. Save & Close
  14. Delete the pentaho.xml file located in the /tomcat/conf/Catalina/ directory.
  15. Start tomcat & BA server.

Debug Mondrian Cubes

This blog will educate the reader on how to debug Mondrian cubes.

Question: how can we look at the queries that Mondrian generates while the user is navigating the OLAP cube? It is really useful to look at the Mondrian log files because they give us a lot of useful information about how our system is behaving. We can:
  • look at SQL statements and MDX queries,
  • get profiling information on the queries that are executed,
  • get other useful debugging information.

The following steps illustrate how to enable Mondrian debug logging by adding some properties to the Mondrian configuration file. After that, we'll configure two new log4j appenders so the desired log files are written properly to our file system.

Step 1: Enable Mondrian debug log – Mondrian has a big set of configuration settings that can be modified. In our case, to enable Mondrian debug information, open the mondrian.properties file located in <bi-server_home>/pentaho-solutions/system/mondrian and add the required debug property line.

Example: file location.
D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\pentaho-solutions\system\mondrian


Step 2: Update log4j configuration
At this point we're going to modify the log4j configuration file, adding the required appenders and categories so our logging information is displayed properly. Open the log4j.xml file located in <bi-server_home>/tomcat/webapps/pentaho/WEB-INF/classes. Based on what you want to log, add one or both of the following sections to the file. They will create two new RollingFileAppenders; you're free to use whichever kind of appender you prefer.

Example: Location of log4j.xml file

D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\tomcat\webapps\pentaho\WEB-INF\classes

Add the following code.

NOTE: The code is already available within the file; we just need to un-comment it.

<appender name="MDXLOG" class="org.apache.log4j.RollingFileAppender">
  <param name="File" value="../logs/mondrian_mdx.log"/>
  <param name="Append" value="false"/>
  <param name="MaxFileSize" value="500KB"/>
  <param name="MaxBackupIndex" value="1"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
  </layout>
</appender>

<category name="mondrian.mdx">
  <priority value="DEBUG"/>
  <appender-ref ref="MDXLOG"/>
</category>

<!-- ========================================================= -->
<!-- Special Log File specifically for Mondrian SQL Statements -->
<!-- ========================================================= -->

<appender name="SQLLOG" class="org.apache.log4j.RollingFileAppender">
  <param name="File" value="../logs/mondrian_sql.log"/>
  <param name="Append" value="false"/>
  <param name="MaxFileSize" value="500KB"/>
  <param name="MaxBackupIndex" value="1"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
  </layout>
</appender>

<category name="mondrian.sql">
  <priority value="DEBUG"/>
  <appender-ref ref="SQLLOG"/>
</category>


Step 3: Enable the new log settings – to have the new log settings take effect, restart the Pentaho BI server instance.

Log files location
D:\Installation Softwares\Pentaho\biserver-ce-4.8.0-stable\biserver-ce\tomcat\logs
After restarting the server, when you run an OLAP cube you will find the following two files in the above location:

  i) mondrian_mdx.log
  ii) mondrian_sql.log


Now, enjoy analysing the SQL queries generated while performing run actions in various tools like Pentaho Analyzer and Saiku Analysis.

Get in touch with us at Helical IT Solutions

Create Organization using REST Webservice in Jasperserver / Jaspersoft

This blog will talk about how to create organizations using REST webservice in JasperServer.

STEP 1:-

Put the following jar files in the lib folder:










STEP 2:-

Create a JSP page:

    <html>
    <head>
        <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
        <title>Insert title here</title>
    </head>
    <body>
        <form action="TestWebService">
            <input type="submit" value="submit">
        </form>
    </body>
    </html>

STEP 3:-

Create the client (the localhost URLs below are placeholders; replace them with your JasperReports Server host and port):

    package com.helical.restwebservices;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.MalformedURLException;
    import java.net.URL;
    import java.util.List;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.http.HttpResponse;
    import org.apache.http.client.HttpClient;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.entity.StringEntity;
    import org.apache.http.impl.client.DefaultHttpClient;

    public class TestWebService extends HttpServlet {

        private static final long serialVersionUID = 1L;

        public void doGet(HttpServletRequest req, HttpServletResponse res) {
            String jSessionId = jWSLogin("[email protected]", "test", null);
            addOrganization(jSessionId, res);
        }

        public String jWSLogin(String username, String password, String organization) {
            if (username == null || password == null) {
                throw new RuntimeException("Failed: Username or Password can't be null");
            }
            // For multi-tenant logins the username is sent as user%7Corganization.
            String j_uname = (organization == null || organization.trim().length() == 0)
                    ? username : username + "%7C" + organization;
            String j_password = password;
            String jSessionIdCookie = "";
            try {
                // Placeholder URL: replace with your JasperReports Server login endpoint.
                URL url = new URL("http://localhost:8080/jasperserver/rest/login");
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                String input = "j_username=" + j_uname + "&j_password=" + j_password;
                OutputStream os = conn.getOutputStream();
                os.write(input.getBytes());
                os.flush();
                if (conn.getResponseCode() != 200) {
                    throw new RuntimeException("Failed : HTTP error code : " + conn.getResponseCode());
                }
                // The response body can be read from br if needed.
                BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream()));
                List<String> cookies = conn.getHeaderFields().get("Set-Cookie");
                for (String string : cookies) {
                    jSessionIdCookie = string;
                }
                conn.disconnect();
            } catch (MalformedURLException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
            return jSessionIdCookie;
        }

        public void addOrganization(String jSessionId, HttpServletResponse res) {
            HttpClient httpClient = new DefaultHttpClient();
            try {
                // Placeholder URL: replace with your JasperReports Server organizations endpoint.
                HttpPost request = new HttpPost("http://localhost:8080/jasperserver/rest_v2/organizations");
                request.setHeader("Cookie", jSessionId);
                StringEntity params = new StringEntity("{\"id\":\"helicaltest123\",\"alias\":\"helicaltest123\",\"parentId\":\"organizations\",\"tenantName\":\"helicaltest123\",\"tenantDesc\":\"Audit Department of Finance\",\"theme\":\"default\"}");
                request.addHeader("content-type", "application/json");
                request.setEntity(params);
                HttpResponse response = httpClient.execute(request);
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                httpClient.getConnectionManager().shutdown();
            }
        }
    }

We have created two methods in TestWebService:

  1. jWSLogin()
  2. addOrganization()

jWSLogin() is used to authenticate and returns the session cookie so that we can reuse that session for subsequent tasks.

addOrganization() is used to add an organization. The data we want to insert should be in JSON or XML format; in this example I have used JSON.


STEP 4:-

Create web.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xmlns="http://java.sun.com/xml/ns/javaee"
             xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
             id="WebApp_ID" version="2.5">

        <servlet>
            <servlet-name>TestWebService</servlet-name>
            <servlet-class>com.helical.restwebservices.TestWebService</servlet-class>
        </servlet>

        <servlet-mapping>
            <servlet-name>TestWebService</servlet-name>
            <url-pattern>/TestWebService</url-pattern>
        </servlet-mapping>

    </web-app>




For any other help on Jasper Server, please get in touch with us

Helical IT Solutions

Hire Pentaho Consultants / Hire Pentaho Developers


Helical IT Solutions, with its deep expertise in Pentaho and as an open source DW/BI expert, can help in designing and constructing your BI solution. Helical IT Solutions has experience across the end-to-end Pentaho BI suite, right from ETL, data warehousing, C-Tools (CDE, CDF), Pentaho Report Designer and Pentaho Schema Workbench to data mining. Our in-depth knowledge of both Business Intelligence applications and the Pentaho platform ensures successful development and deployment of your BI initiatives.


Helical can help at a number of aspects like

– BI Tool Selection: Based on your requirements and our deep-rooted expertise with a number of BI solutions, we can help you determine which BI tools are best suited for you: whether you actually need to purchase an enterprise edition, or whether your requirement can be fulfilled via a community edition using some tweaks or custom coding, or via a best-of-breed solution. The tool selection is done after a thorough analysis of your requirements, present hardware and software, budget, required speed of solution delivery, etc.

– Pentaho BI POC: If your company is still undecided about whether to go ahead, we can help you develop a POC (proof of concept). We help evaluate the open source BI and ETL solution for your environment, demonstrate how it will work, and recommend how you can leverage the technology, even if you have other BI products.

– Pentaho BI Solution Development: With our technical expertise and domain knowledge of end-to-end BI solution development in a number of different verticals (we have executed BI solutions in energy, healthcare, insurance, supply chain, e-commerce and human resources), we can help you design and develop the perfect BI solution. We will help you with the right KPI parameter selection, report and dashboard development, OLAP cubes, ETL script generation, plugin design, security implementation, application integration, data fetching, etc. For designing your solution, we can use any of the tools in the BI suite mentioned below.

  • Pentaho Business Intelligence Platform
  • Pentaho Data Integration (Kettle)
  • Pentaho Analysis Services (Mondrian)
  • Pentaho Analytics – Agile BI (commercial), Saiku (open source)
  • Pentaho Reporting
  • Pentaho Data Mining (Weka)
  • Pentaho Dashboards

– Pentaho Data Integrator (PDI) – Kettle: Pentaho BI comes with a powerful ETL (Extract, Transform, Load) suite that allows your existing data to be transformed, summarized and aggregated into a form that puts the business information at your fingertips. At Helical, we have extensive ETL knowledge, employing it both for Business Intelligence purposes and to transform and migrate data between systems, software and databases. With a lot of experience in data warehousing, ETL, data modelling, data mart design, query optimization, etc., our dedicated team can also help you with all of your ETL requirements, creating transformations and jobs.

– Training and Documentation: Software is only as good as its end users. To make sure that end users can use the developed software well, we provide all the necessary training. This is supported by documentation such as installation and troubleshooting documents.

– Support: The Helical team provides 24×7 support to the client, ensuring that the solution is always up and running.

Helical provides support and services on the entire Pentaho BI stack, as mentioned below

  • Pentaho Business Intelligence Platform
  • Pentaho Data Integration (Kettle)
  • Pentaho Analysis Services (Mondrian)
  • Pentaho Analytics – Agile BI (commercial), Saiku (open source)
  • Pentaho Reporting
  • Pentaho Data Mining (Weka)
  • Pentaho Dashboards
  • Metadata design and development
  • Performance tuning
  • Application integration

Contact us at [email protected] (+91-7893947676).