What does the economic buyer of Agile BI know that you don’t?

May 18, 2011

Agile is a familiar term to product managers and software developers looking to build rapid prototypes and iterate through the software development life cycle quickly and effectively.

With recent market trends, Agile has now made it to the agenda of the economic BI buyer. If you are a CFO, CIO, or CEO, and have been hearing about Agile BI in the industry, you are probably looking to quantify the benefits of Agile BI in terms of direct cost savings.

As a CxO you know that your Business Intelligence costs are mainly driven by these four areas:

  1. License acquisition costs
  2. Skill development and training
  3. Project deployment duration and man hours
  4. Ongoing cost of change management once the solution is deployed

The question is whether Agile BI can save you money in any of these categories. While Agile BI most obviously implies faster deployment of the BI solution (#3 above), at Pentaho we add value in all four areas. Here is how:

  1. Consolidation of licenses: Any BI implementation requires some form of Data Integration, Data Warehousing/Data Mart development, and Data Visualization (Reports, Analysis, and Dashboards). Current BI vendors in the market have disparate products for each of these areas, and offer each product at a separate license acquisition and maintenance cost. Pentaho provides great value here by including all of these components in one subscription to Pentaho BI Suite Enterprise Edition, at a price tag that is a fraction of the cost of other BI tools on the market.
  2. Collapsing skill sets into one: Each specialized tool mentioned above also requires highly trained staff. In a traditional BI project, a crew of ETL Developers, DBAs, Data Modelers, and BI Developers is involved, each building one piece of the big puzzle. An all-in-one tool such as Pentaho BI Suite EE offers a single integrated tool set for all areas of BI development, enabling organizations to collapse these diverse skill sets into one. This level of self-sufficiency reduces the amount of IT staff needed to build and maintain a successful BI program.
  3. Rapid deployment: Pentaho offers an Agile Development Environment as part of its BI Suite EE. This integrated data integration and business intelligence development environment turns data into decisions in a matter of days rather than months or years. Interactive data explorations and visualizations for slicing and dicing data across multiple sources are auto-generated instantly. Unlike a waterfall approach, this tool allows business and technical teams to build quick prototypes and iterate on them, all within a unified workspace that supports sharing, collaboration, and rapid results.
  4. Rapid change management: The need for quick turnarounds when adding new business metrics or changing existing ones is a reality in BI deployments. When disparate tools are used, adding a new data source or changing a metric can take a long time. With the Agile BI Development Environment, unique to Pentaho, any change to ETL flows or to the business semantic layer is automatically reflected in the visualization layer (Reporting, Analysis, Dashboards). This helps organizations quickly incorporate changes and adjust their BI solution to current business requirements, without long wait times and IT bottlenecks.

Ready to start saving? How about this: try the Agile BI functionality of Pentaho BI Suite or Pentaho Data Integration for free (30-day supported enterprise evaluation). Ready now?

Farnaz Erfan
Product Marketing Manager
Pentaho Corporation


High availability and scalability with Pentaho Data Integration

March 31, 2011

“Experts often possess more data than judgment.” – Colin Powell. Hmmm, those experts surely are not using a scalable Business Intelligence solution to optimize that data and help them make better decisions. :-)

Data is everywhere! The amount of data being collected by organizations today is experiencing explosive growth. In general, ETL (Extract, Transform, Load) tools have been designed to move, cleanse, integrate, normalize and enrich raw data to make it meaningful and available to knowledge workers and decision support systems. Only once data has been “optimized” can it be turned into “actionable” information using the appropriate business applications or Business Intelligence software. This information can then be used to discover how to increase profits, reduce costs or even write a program that suggests what your next movie on Netflix should be. The capability to pre-process this raw data before making it available to the masses becomes increasingly vital to organizations that must collect, merge and create a centralized repository containing “one version of the truth.” Having an ETL solution that is always available, extensible and highly scalable is an integral part of processing this data.

Pentaho Data Integration

Pentaho Data Integration (PDI) can provide such a solution for many varying ETL needs. Built upon an open Java framework, PDI uses a metadata-driven design approach that eliminates the need to write, compile or maintain code. It provides an intuitive design interface with a rich library of prepackaged, pluggable design components. ETL developers with skill sets ranging from novice to Data Warehouse expert can take advantage of the robust capabilities within PDI immediately, with little to no training.

The PDI Component Stack

Creating a highly available and scalable solution with Pentaho Data Integration begins with understanding the PDI component stack.

● Spoon – IDE for creating Jobs and Transformations, including the semantic layer for the BI platform
● Pan – command line tool for executing Transformations modeled in Spoon
● Kitchen – command line tool for executing Jobs modeled in Spoon
● Carte – lightweight ETL server for remote execution
● Enterprise Data Integration Server – remote execution, version control repository, enterprise security
● Java API – write your own plug-ins or integrate into your own applications

Spoon is used to create the ETL design flow in the form of a Job or Transformation on a developer’s workstation. A Job coordinates and orchestrates the ETL process with components that control file movement, scripting, conditional flow logic and notification, as well as the execution of other Jobs and Transformations. The Transformation is responsible for the extraction, transformation and loading or movement of the data. The flow is then published or scheduled to the Carte or Data Integration Server for remote execution. Kitchen and Pan can be used to call PDI Jobs and Transformations from your external command-line shell scripts or third-party programs. There is also a complete Java SDK available to integrate and embed these processes into your own Java applications.

Figure 1: Sample Transformation that performs some data quality and exception checks before loading the cleansed data
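
To give a feel for the Java SDK mentioned above, here is a minimal sketch of embedding a Transformation in your own application. It assumes the Kettle API as it looked around PDI 4.x and uses a hypothetical file name (clean_and_load.ktr); treat it as an illustration rather than a copy-paste recipe.

    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.trans.Trans;
    import org.pentaho.di.trans.TransMeta;

    public class RunTransformation {
        public static void main(String[] args) throws Exception {
            // Initialize the Kettle environment (registers steps and plug-ins)
            KettleEnvironment.init();

            // Load a transformation designed in Spoon (hypothetical file name)
            TransMeta transMeta = new TransMeta("clean_and_load.ktr");

            // Execute it and wait for completion, much as Pan would from the command line
            Trans trans = new Trans(transMeta);
            trans.execute(null); // no additional command-line arguments
            trans.waitUntilFinished();

            if (trans.getErrors() > 0) {
                System.err.println("Transformation finished with errors.");
            }
        }
    }

The same Job or Transformation file can of course be executed unchanged by Kitchen, Pan or a Carte/Data Integration Server, which is what makes the component stack above so flexible.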

PDI Remote Execution and Clusters

The core of a scalable and highly available PDI ETL solution involves the use of multiple Carte or Data Integration servers defined as “Slaves” in the ETL process. The remote Carte servers are started on different systems in the network infrastructure and listen for further instructions. Within the PDI process, a Cluster Scheme can be defined with one Master and multiple Slave nodes, and used to distribute the ETL workload in parallel across these systems. It is also possible to define Dynamic Clusters, where the Slave servers are only known at run-time. This is very useful in cloud computing scenarios where hosts are added or removed at will. More information on this topic, including load statistics, can be found here in an independent consulting white paper by Nick Goodman of Bayon Technologies, “Scaling Out Large Data Volume Processing in the Cloud or on Premise.”

Figure 2: Cx2 means these steps are executed clustered on two Slave servers; all other steps are executed on the Master server

The Concept of High Availability, Recoverability and Scalability

Building a highly available, scalable and recoverable solution with Pentaho Data Integration can involve a number of different parts, concepts and people. It is not a check box that you simply toggle when you want to enable or disable it. It takes careful design and planning to anticipate the events that may occur during an ETL process. Did the RDBMS go down? Did the Slave node die? Did I lose network connectivity during the load? Was there a data truncation error at the database? How much data will be processed at peak times? The list goes on and on. Fortunately, PDI arms you with a variety of components, including complete ETL metric logging, web services and dynamic variables, that can be used to build recoverability, availability and scalability scenarios into your PDI ETL solution.
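
As a minimal illustration of this kind of logic (not the Watchdog implementation mentioned in the next paragraph), here is a sketch of an embedding application that retries a failed load a few times before escalating. The file name, the LAST_GOOD_BATCH_ID variable and the retry policy are hypothetical; a real solution would drive them from PDI’s logging tables and your own operational requirements.

    import org.pentaho.di.core.KettleEnvironment;
    import org.pentaho.di.trans.Trans;
    import org.pentaho.di.trans.TransMeta;

    public class ResilientLoad {
        public static void main(String[] args) throws Exception {
            KettleEnvironment.init();
            TransMeta transMeta = new TransMeta("load_warehouse.ktr"); // hypothetical transformation

            final int maxAttempts = 3;
            boolean succeeded = false;

            for (int attempt = 1; attempt <= maxAttempts && !succeeded; attempt++) {
                Trans trans = new Trans(transMeta);
                // Hypothetical variable the transformation could read to resume from a known checkpoint
                trans.setVariable("LAST_GOOD_BATCH_ID", readLastGoodBatchId());
                trans.execute(null);
                trans.waitUntilFinished();

                if (trans.getErrors() == 0) {
                    succeeded = true;
                } else {
                    System.err.println("Attempt " + attempt + " failed, retrying...");
                }
            }

            if (!succeeded) {
                // Escalate: notify operators, raise an alert, stop dependent jobs, etc.
                System.err.println("Load failed after " + maxAttempts + " attempts.");
            }
        }

        private static String readLastGoodBatchId() {
            // Placeholder: in practice this would come from PDI's ETL metric/logging tables
            return "0";
        }
    }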

For example, Jens Bleuel, Managing Consultant in EMEA, developed a PDI implementation of the popular Watchdog concept: a solution that monitors its own tasks and events with checks that confirm everything is on track. Visit the link above for more information on this implementation.

Putting it all together – (Sample)

Diethard Steiner, an active Pentaho Community member and contributor, has written an excellent tutorial that explains how to set up PDI ETL remote execution using the Carte server. He also provides a complete tutorial (including sample files provided by Matt Casters, Chief Architect and founder of Kettle) on setting up a simple “available” solution to process files using Pentaho Data Integration. You can get it here. Please note that advanced topics such as this are also covered in greater detail in our training course, designed by Jens Bleuel, our Managing Consultant in EMEA, and available here.

Summary

When attempting to process the vast amounts of data collected on a daily basis, it is critical to have a Data Integration solution that is not only easy to use but also easily extensible. Pentaho Data Integration achieves this with its open architecture, component stack and object library, which can be used to build a scalable and highly available ETL solution without exhaustive training and with no code to write, compile or maintain.

Happy ETLing.

Regards,

Michael Tarallo
Senior Director of Sales Engineering
Pentaho

This blog was originally published on the Pentaho Evaluation Sandbox, a comprehensive resource for evaluating and testing Pentaho BI.


Sex & Sizzle – not without plumbing

November 16, 2010

What sells BI software? Sex and sizzle! What makes BI projects successful? All of the data work done before any grids or graphs are ever produced. It’s the side of the business most BI vendors don’t talk about, as they’d rather just woo and wow people with flashy charts and glossy dashboards. Not that there is anything wrong with that – who doesn’t like great-looking output? But if the backend plumbing is either too complicated or non-existent, then it doesn’t matter how sexy this stuff is.

Today Pentaho announced the Pentaho Enterprise Data Services Suite to help make the “plumbing” as easy and efficient as possible. We’ve enabled people to iteratively get from raw data – from virtually any source – all the way through to metadata and on to visualization in less than an hour. We’ve enabled a new set of users to accomplish this by taking away many of the complexities.

In about 80% of the use cases we encounter, our customers want to quickly create and perform analytics on the fly, do this in an iterative approach, and when satisfied put their projects into production. You shouldn’t need a Ph.D. in Data Warehousing to accomplish this; nevertheless, many tools require extensive knowledge of DW methodologies and practices. It is fine to demand this knowledge for larger Enterprise DWs (EDWs), but why make everyone pay the price, both in terms of software cost and the experience and training required?

Now, it would be one thing to provide data integration with RDBMSs, another thing to integrate with ROLAP, and yet another to integrate with Big Data like Hadoop, but how nice would it be to have a single Data Integration and Business Intelligence platform that works for all of these? Almost as nice as the Florida Gators winning a national championship, but we won’t have to worry about that in 2010… had to digress for a moment.

A big part of our product release today centers on Pentaho for Hadoop integration, including the GA of Pentaho Data Integration and BI Suite for Hadoop. Big Data and the whole “data explosion” trend are just getting started, so if you aren’t there today, give it time and know that Pentaho is already positioned to help in these use cases.

Pentaho allows you to start down an easy path with Agile BI and then scale up to EDW when and if necessary with enterprise data services. Our engineering team and community have spent significant time and effort to bring these services to market, and today is the official release. Please take a few minutes to read up on the new Pentaho Enterprise Data Services Suite and attend the launch webcast. Or, go ahead and download the Pentaho Enterprise Data Services Suite and start making easier, faster, better decisions.

Richard


Data, Data, Data

October 12, 2010

It’s everywhere and expanding exponentially every day. But it might as well be a pile of %#$& unless you can turn all of that data into information. And do so in a timely, efficient and cost-effective manner. The old-school vendors don’t operate in a timely (everything is slow), efficient (everything is over-engineered, over-analyzed, over-staffed, etc.) or cost-effective mode (the bloated supertanker needs feeding and the customer gets to pay for those inefficiencies), which means new technologies and business models will drive the innovation that ultimately serves customers and communities.

Back to Data, Data, Data – enter open source technologies like Hadoop and Pentaho BI/DI to drive next-generation big data analytics to the market. Hadoop and Pentaho have both been around for about 5 years, are both driven by very active communities, and have both experienced explosive growth over the last 18 months. Our community members are the ones who came up with the original integration points between the two technologies, not because it was a fun science project but because they had real business pains they were trying to solve. This all started in 2009: we began development that year, launched our beta program in June 2010 (we had to cap enrollment at 60), ran a Pentaho for Hadoop roadshow (which was oversubscribed) and are now announcing the official release of Pentaho Data Integration and BI Suite for Hadoop.

I’m in NYC today at Hadoop World and we’re making four announcements:

  1. Pentaho for Hadoop – our Pentaho BI Suite and Pentaho Data Integration are now both integrated with Hadoop
  2. Partnership with Amazon Web Services – Pentaho for Hadoop now supports Amazon Elastic Map Reduce (EMR) and S3
  3. Partnership with Cloudera – Pentaho for Hadoop will support certified versions of Cloudera’s Distribution for Hadoop (CDH)
  4. Partnership with Impetus – a major Solutions Provider (over 1,000 employees) with a dedicated Large Data Analytics practice.

Consider this phase I of building out the ecosystem.

We’re all about making Hadoop easy and accessible. Now you can take on those mountains of data and turn them into value. Download Pentaho for Hadoop.

Richard


Now available: Pentaho Kettle Solutions

September 21, 2010

Congrats to Matt Casters, Roland Bouman and Jos van Dongen on their new book, available today: Pentaho Kettle Solutions: Building Open Source ETL Solutions with Pentaho Data Integration. You can buy it on Amazon or from Wiley (they also have free excerpts).

If their names sound familiar, it is because Roland and Jos are also the authors of the Amazon best seller Pentaho Solutions: Business Intelligence and Data Warehousing with Pentaho and MySQL, published in August 2009. Matt is the Kettle project founder and Chief Architect of Pentaho Data Integration.

My copy is ordered and I can’t wait to read it.

Congrats
Doug Moran
Pentaho Community Guy


Pentaho’s BIG, Fast, and Agile August

August 4, 2010

Pentaho is hitting the road this month to show you the world’s first BI integration for Hadoop with our three-city roadshow, ‘Harnessing Hadoop for Big Data’. Next, prepare to see blazing-fast business intelligence when we pair Ingres VectorWise with Pentaho’s Agile BI initiative.

BIG – We’re rolling into town to show you how Pentaho, as the face of Hadoop, can leverage the power of business intelligence and data integration for your Big Data analysis needs.  These live seminars are free but space is limited, so be sure to register now.

  • Harnessing Hadoop for Big Data – Live Seminar Series

FAST & AGILE – See what is possible when you combine the power of Agile BI with Ingres VectorWise, the next generation of analytic database technology, during this live webcast.

  • Blazing Fast, Agile BI with Ingres VectorWise and Pentaho
    • Webcast:  Thursday, August 12th 2010

Want to learn more about Pentaho and meet the team? This month we will be holding classroom training in Buenos Aires, Argentina, and here on the home front in Orlando, Florida.

Where else can you find Pentaho?  This month and every month, we invite you to join the conversation with us on Twitter, Facebook, and LinkedIn.

Visit our Events page for more details and updated events.  Here’s to a BIG, Fast, and Agile August!


Top 10 reasons behind Pentaho’s success

July 20, 2010

Today we announced our Q2 results. In summary Pentaho:

  • More than doubled new Enterprise Edition Subscriptions from Q2 2009 to Q2 2010.
  • Exceeded goals, making Q2 the strongest quarter in company history and our third record quarter in a row.
  • Became the only vendor that lets customers choose the best way to access BI: on-site, in the cloud, or on the go using an iPad.
  • Led the industry with a series of market firsts including delivering on Agile BI.
  • Expanded globally, received many industry recognitions and added several stars to our executive bench.

How did this happen? Mostly because of our laser focus over the past 5 years on building the leading end-to-end open source BI offering. But if we look closely at the last 12-18 months, there are some clear signs pointing to our success (my top ten list):

Top 10 reasons behind Pentaho’s success:

1.     Customer Value – This is the top of my list. Recent analyst reports explain how we surpassed the $2 billion mark during Q2 in terms of cumulative customer savings on business intelligence and data integration license and maintenance costs. In addition, we ranked #1 in terms of value for price paid and quality of consulting services among all Emerging Vendors.

2.     Late 2008-Early 2009 Global Recession – this was completely out of our control, but it helped us significantly by forcing companies to look for lower-cost BI alternatives that could deliver the same or better results than the high-priced mega-vendor BI offerings. This made #1 more attractive to companies worldwide.

3.     Agile BI – we announced our Agile BI initiative in Nov 2009 and received an enormous amount of press and a positive reception from the community, partners, and customers. We showed previews and released RCs in Q1-Q2 2010 and put PDI 4.0 into GA at the end of Q2 2010.

4.     Active Community – A major contributing factor to our massive industry adoption is our growing number of developer stars (the Pentaho army) who continue to introduce Pentaho into new BI and data integration projects. Our community effectively triples the output of our QA team, contributes leading plug-ins like CDA and PAT, writes best-selling books about our technologies and self-organizes to spread the word.

5.     BI Suite 3.5 & 3.6 – 3.5 was a huge release for the company and helped boost adoption and sales in Q3-Q4 2009, bringing our reporting up to and beyond that of our competitors. In Q2 2010 the Pentaho BI Suite 3.6 GA took this to another level, with enhancements and new functionality for enterprise security, content management and team development, as well as the new Enterprise Edition Data Integration Server. The 3.6 GA also includes the new Agile BI integrated ETL, modeling and data visualization environment.

6.     Analyzer – the addition of Pentaho Analyzer to our product lineup in Sept-Oct 2009 was HUGE for our users; it is the best web-based query and reporting product on the market.

7.     Enterprise Edition 30-Day Free Evaluation – we started this “low-touch/hassle-free” approach in March 2009 and it has eliminated the pain that companies used to go through in order to evaluate software.

8.     Sales Leadership – Lars Nordwall officially took over Worldwide Sales in June 2009. By building upon the existing talent and hiring great new team members, he has assembled a world-class team and put best practices in place.

9.     Big Data Analytics – we launched this in May 2010 and have received very strong support and interest in this area. We currently have a Pentaho-Hadoop beta program with over 40 participants. There is a large, unmet need for Data Integration and Analytics solutions in this space.

10.   Whole Product & Team – #1-#9 wouldn’t work unless we had all of the key components necessary to succeed – doc, training, services, partners, finance, qa, dev, vibrant community, IT, happy customers and of course a sarcastic CTO ;-)

Thanks to the Pentaho team, community, partners and customers for this great momentum. Everyone should be extremely proud of the fact that we are making history in the BI market. We have a great foundation on which to continue this rapid growth, and with the right team and passion, we’ll push through our next phase of growth over the next 6-12 months.

A quick story to end the note: I was talking and whiteboarding with one of my sons a few weeks ago (yes, I whiteboard with my kids) and he was asking questions about our business (how do we make money, why are we different from our competitors, etc.). I explained at a high level how we are basically “on par and in many cases better” than the Big Guys (IBM, ORCL, SAP) with regard to product, that we provide superior support and services, yet we cost about 10% of what they do. To which my son replied, “Then why doesn’t everyone buy our product?” Exactly.

Richard
CEO, Pentaho


Where is Pentaho this July?

July 12, 2010

This July, Pentaho continues its Worldwide Techcast Series, demonstrating how Pentaho’s Agile BI initiative will help you speed development of new BI applications and better ensure that these applications meet the needs of your business users. Learn about Pentaho Data Integration 4.0 and Agile BI in six languages: Italian, German, Portuguese, Spanish, French and Norwegian. Sign up for a live techcast this month or watch the series on demand.

We are also holding an executive breakfast in the city where high-tech meets southern hospitality, Raleigh, North Carolina, on July 20th at EvoApp Live. If you are in the ‘Triangle’, make sure to reserve your seat for this interactive panel discussion about how business analytics are driving top-performing companies to make better, faster, more informed decisions.

Pentaho Featured Events

The highlighted events for the month of July are webcasts for those looking to learn more about open source BI (OSBI) and simple ways to get started. These webcasts are free; however, early registration is recommended.

Comparing the Cost of Business Intelligence – Proprietary Vs. Open Source.
With leading analyst Mark Madsen
July 13 at 14:00 EDT (18:00 GMT)
Register Now

* If you cannot attend the webcast on July 13, register for the July 22 webcast.
* For additional background to this webcast download Mark Madsen’s latest white paper, Lowering the cost of Business Intelligence with Open Source.

BI On-Demand – See your data in a dashboard in just 3 days!
Wednesday, July 28, 2010 14:00 EDT (18:00 GMT)
Register Now

Visit our events page for more details and updated events.
Follow us on Twitter or Facebook for event reminders and updates.


Big data should not mean big cost

May 19, 2010

Data is exploding at rates our industry has never seen before, and the huge opportunity to leverage this data is stymied by the archaic licensing practices still in use by the old-school software companies. Currently, the big guys like Oracle, IBM, SAP, Teradata and other proprietary database and data warehouse vendors have a very simple solution to “big data” environments: just keep charging more money, a lot more money. The only “winners” in this scenario are the software sales reps. Our industry (tech) is artificially slowed in order to support these old-school business models; they can’t afford to innovate in licensing and they surely don’t want to kill the golden goose, the perpetual license fee.

A major gaming company, for example, had been using Oracle for its database and BI technology. With traffic reaching 100 million to 1 billion impressions per day, the database giant’s only answer was to sell more expensive licenses. Even then, the best it could do was analyze four days’ worth of information at a time.

Organizations like Mozilla, Facebook, Amazon, Yahoo, RealNetworks and many others are now collecting immense amounts of structured and unstructured data. The size of weblogs alone can be enormous. Management wants to be able to triangulate what people are doing at their sites in order to do a better job of:

a)     Turning prospects into customers
b)     Offering customers what they want in a more timely manner
c)     Spotting trends and reacting to them in real time.

Any company, small or large, that is trying to sift through terabytes of structured and complex data on an hourly, daily or weekly basis for any kind of analytics had better take a long, hard look at what it is really paying for. Just as the worldwide recession of 2008-09 brought tremendous attention to lower-cost, better-value alternatives like Pentaho, the “big data” movement is doing the same thing in the DB/DW space. And where do you find some of the best innovations in the tech space? The answer is open source.

Specifically, an open source technology called Apache Hadoop is addressing the “better value proposition for Big Data.” It is also the only technology capable of handling some of these big data applications. Sounds great, right? Well, not exactly. The issue with Hadoop is that it is a very technical product with a command-line interface. Once that data gets into Hadoop, how do you get it out? How do you analyze that data? If only there were an ETL and BI product tightly integrated with Hadoop, and available with the right licensing terms…

Today I’m proud to announce that Pentaho has done just that. Early on May 19th we announced our plans to deliver the industry’s first complete end-to-end data integration and business intelligence platform to support Apache Hadoop. Over the next few months we’ll be rolling out versions of our Pentaho Data Integration product and our BI Suite products that will provide Hadoop installations with a rich, visual analytical solution. Early feedback from joint Hadoop-Pentaho sites has been extremely positive and the excitement level is high.

Hadoop came out of the Apache open source camp. It is the best technology around for storing monster data sets. Until recently, only a small number of organizations used it, primarily those with deep technical resources. However, as the technology matures, the audience is widening, and now, with a rich ETL and analytical solution, it is about to get even bigger.

Stay tuned to our website and to this blog as I’ll be sharing many success stories over the next 90 days. And most importantly, watch out for the ‘Golden Goose’ licensing schemes from the old-school vendors.

Richard

Visit www.pentaho.com/hadoop to watch a demo of Pentaho Enterprise integration with Hadoop and reserve your place in the beta program.


Former Informatica exec joins Pentaho

May 4, 2010

If you have the world’s most popular open source data integration (DI) offering and want to make it even better, you need the top DI market strategist behind the product. That is why we turned to DI and analytics veteran Joe Nicholson, who brings over 25 years of technology and marketing management experience with companies like Informatica, DecisionPoint and Trintech. I asked Joe the same question I asked Tom Leonard when he joined: “What brought you to Pentaho?” Welcome, Joe!

Guest Blogger: Joe Nicholson

I have been in DI and analytics in one form or another for most of my career. Whether it was marketing the data integration platform at Informatica or doing business development around packaged finance analytics at DecisionPoint, I am really an analytics guy at heart. Part of my passion no doubt comes from tackling the enduring complexity and depth of the BI challenge. A decade ago, our challenges were focused on dealing with the complexity of data sources and providing cost-effective information and analytics to the emerging generation of managers and mobile workers. At that time, data volumes and sources were exploding (or so we thought) and our primary customer prospects were those hand-coding their ETL projects or dumping their finance and other data into Excel, and who wanted a better way. What is particularly intriguing to me is that those challenges haven’t really changed over time. Data volumes and sources continue to grow, now at a rate that was impossible to imagine 10 years ago. And, while the use of BI has certainly increased for the business user, it is nowhere near where I would have expected it to be, given all of the hype around Pervasive BI, BI for the Masses, the Democratization of BI or the other terms that were being used at that time. As a technology, it is still too expensive, too complex and too slow.

What has changed is the mindset of the business user. Frankly, the ROI for BI was oversold in the early and middle years, and the recent economy has focused end users on TCO like never before, as the need for faster, more agile and more diverse projects now drives end-user requirements.

I had followed Pentaho for several years, both from the technology and business model angles. First, from a technology perspective, they are solving many of the issues faced by end users by producing a complete, integrated BI suite that includes purpose-built ETL, data modeling, reporting and analysis, all within a single platform that handles administration, security and the rest of the functions needed for deployment and maintenance of BI applications.

Second, the old way of licensing BI technology (and other software) has run its course. The era of open source and commercial open source has arrived. In today’s cost-conscious, business-driven climate, why would anyone run a long, extensive procurement process, construct costly prototypes and pay $10,000 to $100,000+ in license and maintenance fees before they can actually see if the technology works for their purpose?

Enter Pentaho and the open source development and distribution model. Pentaho, its thousands of community members and its own professional engineering staff have together developed the Pentaho Enterprise BI Suite, which is on par with all of the major players in the market across ETL, data modeling, reporting and analysis. The game has changed with Pentaho’s recent announcements of Pentaho Data Integration 4.0 and the Agile BI integrated development environment, which forever changes how IT developers and business users collaborate to produce relevant BI applications.

It’s an exciting time to be in the DI and BI markets and even more exciting to be at Pentaho.

Joe Nicholson,
Vice President Product Marketing
Pentaho Corporation

Read the press release Pentaho Names Joe Nicholson Vice President of Product Marketing

