Leave the past behind: switch to CSS3

June 28, 2011

After years of saying ‘what if,’ ‘some day,’ and ‘boy, wouldn’t that be nice,’ Pentaho has adopted CSS3 in its Java web application. It took a major release (Pentaho BI 4) with a commitment to refresh our UI and adopt a more modern look. What the Pentaho development team delivered is part Apple iOS and part Windows 7: rounded corners, transparency effects, gradients, drop shadows, text shadows, and inner glows, all created with CSS3 while degrading gracefully in less-capable browsers. This single decision to adopt CSS3 made it possible to implement the new look in weeks instead of months.

[Screenshots: the old UI alongside the new CSS3-based UI]
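To give a sense of how those effects are produced, here is a minimal sketch of the kind of rules involved. This is illustrative only, not Pentaho’s actual stylesheet; the class name and colors are invented, and the plain background-color declaration is the fallback that less-capable browsers keep when they ignore the CSS3 lines (the vendor prefixes reflect the browsers of the day):

    /* Hypothetical panel style illustrating the effects named above.
       A browser that does not understand a declaration simply skips it,
       which is what makes the graceful degradation work. */
    .demo-panel {
      background-color: #3b6ea5;                     /* fallback: flat color */
      background: -moz-linear-gradient(top, #5a8fc4, #3b6ea5);   /* Firefox */
      background: -webkit-gradient(linear, left top, left bottom,
                  from(#5a8fc4), to(#3b6ea5));                   /* WebKit */
      -moz-border-radius: 8px;                       /* rounded corners */
      -webkit-border-radius: 8px;
      border-radius: 8px;
      -moz-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.5); /* drop shadow; an inset
                         variant of the same property gives an inner glow */
      -webkit-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.5);
      box-shadow: 0 2px 6px rgba(0, 0, 0, 0.5);
      text-shadow: 0 1px 1px rgba(0, 0, 0, 0.4);     /* text shadow */
    }

No image slicing is involved; strip the CSS3 declarations and an older browser is left with a plain, perfectly usable solid-color panel.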

A big leap to CSS3

Personal desires aside, browser usage statistics and a number of other less exciting factors drove our corporate decision to move to CSS3. Suffice it to say, the time was right for us: the benefits of transitioning to CSS3 far outweighed continuing the way we’d done things in the past.

With our traditional approach, accomplishing this updated look would have required manual slicing of images and painstaking back-and-forth with developers, designers, and managers. A typical skinning exercise went something like this:

Read the full post on DZone, which continues with ‘Put your HTML on a diet,’ ‘Alpha gradients with RGBA,’ ‘Dealing with the down-side,’ and ‘Is it time for you to switch to CSS3?’

blog by: Nick Baker, Lead Programmer, Pentaho


What does the economic buyer of Agile BI know that you don’t?

May 18, 2011

Agile is a familiar term to product managers and software developers looking to build rapid prototypes and iterate through the software development life cycle quickly and effectively.

With recent market trends, Agile has now made it to the agenda of the economic BI buyer. If you are a CFO, CIO, or CEO, and have been hearing about Agile BI in the industry, you are probably looking to quantify the benefits of Agile BI in terms of direct cost savings.

As a CxO, you know that your Business Intelligence costs are driven mainly by these four areas:

  1. License acquisition costs
  2. Skill development and training
  3. Project deployment duration and man-hours
  4. Ongoing cost of change management once the solution is deployed

The question is whether Agile BI can save you money in any of these categories. While Agile BI most obviously implies faster deployment of the BI solution (#3 above), at Pentaho we add value in all four areas. Here is how:

  1. Consolidation of licenses: Any BI implementation requires some form of Data Integration, Data Warehousing/Data Mart development, and Data Visualization (Reports, Analysis, and Dashboards). Other BI vendors in the market have disparate products for each of these areas, each with its own license acquisition and maintenance cost. Pentaho provides great value here because all of these components are included in one subscription to Pentaho BI Suite Enterprise Edition, at a price tag that is a fraction of the cost of other BI tools on the market.
  2. Collapsing skill sets into one: Each specialized tool mentioned above also requires highly trained staff. A traditional BI project involves a crew of ETL Developers, DBAs, Data Modelers, and BI Developers, each building one piece of the big puzzle. An all-in-one offering such as Pentaho BI Suite EE provides a single integrated tool set for all areas of BI development, letting organizations collapse those diverse skill sets into one. This level of self-sufficiency reduces the number of IT staff needed to build and maintain a successful BI program.
  3. Rapid deployment: Pentaho offers an Agile Development Environment as part of its BI Suite EE. This integrated data integration and business intelligence development environment turns data into decisions in a matter of days instead of months or years. Interactive data explorations and visualizations for slicing and dicing data across multiple sources are auto-generated instantly. Unlike a waterfall approach, this tool lets business and technical teams build quick prototypes and iterate on them, all within a unified workspace that supports sharing, collaboration, and rapid results.
  4. Rapid change management: The need for quick turnaround when adding new business metrics or changing existing ones is a reality in BI deployments. When disparate tools are used, adding a new data source or changing a metric can take a long time. With the Agile BI Development Environment, unique to Pentaho, any change to ETL flows or to the business semantic layer is automatically reflected in the visualization layer (Reporting, Analysis, Dashboards). This helps organizations quickly incorporate changes and adjust their BI solution to current business requirements, without long wait times and IT bottleneck delays.

Ready to start saving? Try the Agile BI functionality of Pentaho BI Suite or Pentaho Data Integration for free with a 30-day supported enterprise evaluation.

Farnaz Erfan
Product Marketing Manager
Pentaho Corporation


“Drilling” in to the detail with Pentaho

November 30, 2010

“Drilling,” with respect to Business Intelligence applications and Information Technology: where did that word come from? What does it mean? What can it mean? I am sure you have heard the phrase “drill down to detail” before, but you may also have heard “drill up,” “drill out,” “drill across,” “drill in,” and “drill through” (and don’t forget “drill anywhere”).

In general, it simply means moving from summary-level information to the underlying detail data, either within the current data set or out in another data set entirely. Its main purpose is to let you view summarized information in the form of a chart, table, or other graphical visualization, with the added ability to click on a value, series, or region and drill in to the next level of detail or out to some other dimension. Drilling allows business users to make informed decisions quickly without having to page through sheets of raw data.

For example, summarized sales revenue for the year 2010 is $200K, but upon drilling down we see that $175K was brought in by 3 of the 4 regions, leaving one region contributing only $25K. This exposes a single region as an outlier, an entity that needs focused attention. That is the power of Business Intelligence applications at work: turning raw data into actionable information.
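In SQL terms, drilling down essentially swaps a coarse aggregate for a finer-grained one. The following sketch assumes a hypothetical sales table with year, region, and revenue columns; the names are invented purely for illustration:

    -- Summary level: total revenue for 2010
    SELECT SUM(revenue) AS total_revenue
    FROM sales
    WHERE year = 2010;

    -- Drill down: the same measure broken out by region, which is
    -- what exposes the one underperforming region
    SELECT region, SUM(revenue) AS region_revenue
    FROM sales
    WHERE year = 2010
    GROUP BY region
    ORDER BY region_revenue DESC;

A drill-enabled report effectively generates the second query for you the moment you click a value produced by the first.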

The Pentaho BI Suite can provide drilling in a number of ways depending on which module you deploy. We explore each of these in the full article at the Pentaho Evaluation Sandbox:

http://sandbox.pentaho.com/2010/11/drilling-in-to-the-details-with-pentaho/

Regards,

Michael Tarallo
Director of Sales Engineering
Pentaho


Data, Data, Data

October 12, 2010

It’s everywhere and expanding exponentially every day. But it might as well be a pile of %#$& unless you can turn all of that data into information, and do so in a timely, efficient, and cost-effective manner. The old-school vendors don’t operate in a timely mode (everything is slow), an efficient mode (everything is over-engineered, over-analyzed, over-staffed, etc.), or a cost-effective mode (the bloated supertanker needs feeding, and the customer gets to pay for those inefficiencies), which means new technologies and business models will drive the innovation that ultimately serves customers and communities.

Back to Data, Data, Data. Enter open source technologies like Hadoop and Pentaho BI/DI to bring next-generation big data analytics to the market. Hadoop and Pentaho have both been around for about five years, are both driven by very active communities, and have both experienced explosive growth over the last 18 months. Our community members are the ones who came up with the original integration points between the two technologies, not as a fun science project but because they had real business pains to solve. It all started in 2009, when we began development; we launched our beta program in June 2010 (and had to cap enrollment at 60), ran a Pentaho for Hadoop roadshow (which was oversubscribed), and are now announcing the official release of Pentaho Data Integration and BI Suite for Hadoop.

I’m in NYC today at Hadoop World and we’re making four announcements:

  1. Pentaho for Hadoop – our Pentaho BI Suite and Pentaho Data Integration are now both integrated with Hadoop
  2. Partnership with Amazon Web Services – Pentaho for Hadoop now supports Amazon Elastic MapReduce (EMR) and S3
  3. Partnership with Cloudera – Pentaho for Hadoop will support certified versions of Cloudera’s Distribution for Hadoop (CDH)
  4. Partnership with Impetus – a major Solutions Provider (over 1,000 employees) with a dedicated Large Data Analytics practice.

Consider this phase I of building out the ecosystem.

We’re all about making Hadoop easy and accessible. Now you can take on those mountains of data and turn them into value. Download Pentaho for Hadoop.

Richard


Part 2: Configuring Server Side Data Connections – even easier!

July 23, 2010

Welcome to Part 2 of the Pentaho Video Tutorial Series. In Part 1, we covered the installation of the Pentaho BI Suite on an Ubuntu Linux O/S using the Pentaho BI Suite Installer. Many of you have already experienced how easy it is to download and install; thanks for your comments. (You can obtain the installer from the Pentaho KB or from http://www.pentaho.com/download/.)

In this part we will cover configuring Pentaho server-side data connections that can access your traditional RDBMS data sources. In this example I demonstrate adding connections to Oracle and Microsoft SQL Server 2008. It is important to note that the Pentaho design tools, such as Report Designer, Schema Workbench, Data Integration, and the Metadata Editor, can publish content using these configured server-side data connections. (We will cover this in Part 3 of the series.) To prepare for this configuration, please have the appropriate vendor-specific JDBC Type 4 driver available for the data source you want to connect to. You can obtain it from the database vendor’s web site, from your DBA, or perhaps from the installation location of the RDBMS.
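If you want to sanity-check a driver before configuring the connection in the Pentaho console, a quick standalone test is useful. This is a sketch using only the standard JDBC API; the host names, ports, database names, and credentials are placeholders to replace with your own:

    // Minimal JDBC smoke test: confirms the Type 4 driver jar is on the
    // classpath and the database is reachable. Run with the vendor's
    // driver jar (e.g. the Oracle or SQL Server jar) on the classpath.
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class JdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            // Oracle example (placeholder host, SID, and credentials):
            Class.forName("oracle.jdbc.OracleDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");

            // The SQL Server 2008 equivalent would be:
            //   Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            //   DriverManager.getConnection(
            //       "jdbc:sqlserver://dbhost:1433;databaseName=mydb",
            //       "user", "password");

            System.out.println("Connected: " + !conn.isClosed());
            conn.close();
        }
    }

If this runs cleanly, the same driver jar and URL details are what you will supply when defining the server-side connection.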

You can view the video inline in this blog, or download the higher quality .wmv (Windows Media Video) to your workstation for local viewing.

To see all related Tutorials in this series posted by me, click on my pic on the bottom right under Authors.

Mike Tarallo
Pre-Sales Director
Pentaho Corporation

To view this post in Spanish, click here
To view this post in French, click here
To view this post in German, click here


Part 1: Installing Pentaho Enterprise Edition – yep, it’s that easy

July 21, 2010

That’s right, I said it. It’s that easy. Hello everyone, I am Michael Tarallo, Pentaho Pre-Sales Director. It is an absolute pleasure to be blogging in the Swamp with the rest of the Pentaho crew. I originally started creating a series of Pentaho tutorial videos published exclusively on my blog, michaeltarallo.blogspot.com. However, I have been invited to the Swamp to share my BI technical musings and knowledge with y’all. (No, really, I am originally from the Northeast; I don’t ever say y’all.) I thought this would be a great way to share this series and create awareness of how easy it is to get started using and evaluating the Pentaho software.

This is Part 1 of an N-part series (there is so much to showcase that I don’t have a definitive number) demonstrating the power, flexibility, and extensibility of the Pentaho BI Suite. My goal is to help you evaluate the Pentaho BI Suite easily and quickly, and to show you tips and techniques that make Pentaho the right choice for current and future needs. In this 15-minute video (available in my blog or as a download here) you will watch me go through the process of downloading, installing, and testing the Pentaho BI Suite Enterprise Edition on an Ubuntu Linux operating system. From start to finish, you will see it truly is just that easy. The next entry will focus on configuring data sources to be used for content creation. Enjoy, and I hope to speak with you in person!

To see all related Tutorials in this series posted by me, click on my pic on the bottom right under Authors.

Mike Tarallo
Pre-Sales Director
Pentaho Corporation

To view this post in Spanish, click here (Google Translation)
To view this post in French, click here (Google Translation)
To view this post in German, click here (Google Translation)

You can download the high quality .wmv (Windows Media) here


Six reasons why Pentaho’s support of Apache Hadoop is great news for ‘big data’

May 19, 2010

Earlier today Pentaho announced support for Apache Hadoop – read about it here.

There are many reasons we are doing this:

  1. Hadoop lacks graphical design tools – Pentaho provides pluggable design tools.
  2. Hadoop is Java – Pentaho’s technologies are Java.
  3. Hadoop needs embedded ETL – Pentaho Data Integration is easy to embed.
  4. Pentaho’s open source model enables us to provide technology with great price/performance.
  5. Hadoop lacks visualization tools – Pentaho has them.
  6. Pentaho provides a full suite of ETL, Reporting, Dashboards, Slice ‘n’ Dice Analysis, and Predictive Analytics/Machine Learning.

Taken in combination, these points matter: Pentaho is the only technology that satisfies all of them.

You can see a few of the upcoming integration points in the demo video (above). The ones shown in the video are only a few of the many integration points we are going to deliver.

Most recently I’ve been working on integrating the Pentaho suite with the Hive database. This enables desktop and web-based reporting, integration with the Pentaho BI platform components, and integration with Pentaho Data Integration. Between these use cases, hundreds of different components and transformation steps can be combined in thousands of different ways with Hive data. I had to make some modifications to the Hive JDBC driver and we’ll be working with the Hive community to get these changes contributed. These changes are the minimal changes required to get some of the Pentaho technologies working with Hive. Currently the changes are in a local branch of the Hive codebase. More specifically they are a ‘Short-term Rapid-Iteration Minimal Patch’ fork – a SHRIMP Fork.

Technically, I think the most interesting Hive-related feature so far is the ability to call an ETL process within a SQL statement (as a Hive UDF). This enables all kinds of complex processing and data manipulation within a Hive SQL statement.
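To make that concrete, here is what the standard Hive UDF mechanism looks like. This is a generic sketch, not Pentaho’s actual integration code; the class name, function name, and table are invented for illustration. Any Java class extending Hive’s UDF base class with a public evaluate() method becomes callable inline in a Hive SQL statement:

    // A trivial Hive UDF: once registered, evaluate() can be called
    // from HiveQL like any built-in function.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class CleanPhone extends UDF {
        // Strips everything but digits from a phone-number column.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().replaceAll("[^0-9]", ""));
        }
    }

Registering and calling it from Hive:

    ADD JAR /path/to/clean-phone.jar;
    CREATE TEMPORARY FUNCTION clean_phone AS 'CleanPhone';
    SELECT clean_phone(phone) FROM customers;

The feature described above exposes an entire ETL process through this same kind of hook, which is what allows complex processing and data manipulation to happen in the middle of a Hive SQL statement.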

There are many more Hadoop-related ETL and BI features and tools to come from Pentaho.  It’s gonna be a big summer.

James Dixon
Chief Geek
Pentaho Corporation

Learn more - watch the demo


