Integration Solutions are Key in Unlocking Big Data Benefits

November 6, 2013

We’re tremendously excited that Ventana Research is launching new benchmark research on Big Data Integration.  As it becomes clearer to the market that Big Data has huge potential to deliver business value, identifying the right tools and technologies to manage it in the context of specific organizational goals is crucial.  The new research aims to take a step forward in this realm, promising a hard look at current and planned Big Data deployments and homing in on the solutions for accessing and transforming Big Data across a variety of technical architectures and business use cases. It should provide great insight into ways that organizations can “unlock” Big Data’s potential.  See the press release here – Ventana Research Launches Big Data Integration Benchmark Research

For further context, we recommend reading a recent blog post from Ventana’s VP & Research Director, Tony Cosentino, titled Big Data and Analytics Helps Business Transform and Gain Competitive Advantage. Cosentino highlights the broad applicability of Big Data solutions across industries to empower organizations in the face of accelerating changes in markets, operations and customer preferences.  The blog also cites staffing and training as the two biggest challenges to Big Data analytics, which underlines the importance of the new benchmark research in helping businesses identify which integration approaches will accelerate time to value through usability by the broadest base of team members.

If you are interested in learning more about or participating in this benchmark research, visit ventanaresearch.com/BigDataIntegration. There are several incentives for qualified research participants, such as a $5 Amazon.com gift certificate, a complimentary report valued at $995, and access to a free educational webinar on best practices from the benchmark research.

Finally, the Ventana Research Innovation Summit is next week, November 11-12 in Chicago. Let us know if you will be attending as we would love to meet up!

Ben Hopkins
Product Marketing


Pentaho 5 has arrived with something for everyone!

September 18, 2013

I am tremendously excited to announce that Pentaho Business Analytics 5 is available for download!  This release represents the culmination of over 30 man-years of engineering effort and contains over 250 new features and improvements.  There truly is something for everyone in Pentaho 5.  Whether you are an end user, administrator, executive or developer, here are what I think are the top 3 areas of improvement for you:

  1. Improving productivity for end users and administrators
  2. Empowering organizations to easily and accurately answer questions using blended big data sets
  3. Simplifying the experience for developers integrating with or embedding Pentaho Business Analytics

Improving Productivity for End Users and Administrators


18 months ago, we challenged ourselves to think deeply about the different profiles of users working with the Pentaho suite and identify the top areas where we could significantly improve our ease-of-use.  Based on the feedback from countless customer interviews and usability studies, the first thing you will notice about Pentaho 5 is a dramatically overhauled User Console.  Beyond the fresh, new, modern look and feel, we’ve introduced a new concept called “perspectives” making it easier than ever for end users to:

  • navigate between open documents
  • browse the repository
  • manage scheduled activities

Throughout the User Console, end users will enjoy numerous improvements and better feedback for common workflows such as designing dashboards or scheduling the execution of a parameterized report. Administrators will appreciate that we have consolidated all administration capabilities directly into the User Console, enhanced security with the ability to create more specific role types that control the types of actions users can perform, and bundled a comprehensive audit mart providing out-of-the-box answers to common questions about usage patterns, performance and errors.

Analytics-ready Big Data Blending


At the dawn of the Big Data era, a wide range of new storage and processing technologies flooded the market, each bringing specialized characteristics to help solve the next wave of data challenges.  Pentaho has long been a leader and innovator in delivering an end-to-end platform for designing scalable and easily maintainable Big Data solutions.  Powered by the Pentaho Adaptive Big Data Layer, we’ve dramatically expanded our support for Hadoop with all-new certifications for the latest distributions from Cloudera, Hortonworks, MapR and Intel.  Furthermore, we’ve integrated our complete analytics platform for use with Cloudera Impala.  Other Big Data highlights in Pentaho 5 include new integration with Splunk and dramatic ease-of-use improvements when working with NoSQL platforms such as MongoDB and Cassandra.

As organizations large and small map out their next-generation data architectures, we see best-practice design patterns emerging that help organizations target the appropriate data technology for each use case.  Evident in all of these design patterns is the fact that Big Data technologies are rarely information silos.  Solving common use cases such as optimizing your data warehousing architecture or performing 360-degree analysis of a customer requires that all data be accessible and blended in an accurate way.  Pentaho Data Integration provides the connectivity and design ease-of-use to implement all of these emerging patterns, and with Pentaho 5 I’m excited to announce the world’s first SQL (JDBC) driver for runtime transformations.  This integration empowers data integration designers to accurately design blended data sets from across the enterprise and put them directly in the hands of end users through tools they are already familiar with – reporting, dashboards and visual discovery – as well as predictive analytics.
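To make the idea of “blending” concrete, here is a toy Python sketch – not Pentaho’s implementation, and with invented field names – of joining rows from a warehouse-style source with documents from a NoSQL-style source on a shared key, so the combined records could feed a report or dashboard:

```python
# Toy sketch of "blending": enrich relational-style rows with fields from
# a document-style source, joined on a shared key. All data is made up.

def blend(warehouse_rows, nosql_docs, key="customer_id"):
    """Merge two heterogeneous record sets on a shared key."""
    docs_by_key = {d[key]: d for d in nosql_docs}
    blended = []
    for row in warehouse_rows:
        doc = docs_by_key.get(row[key], {})
        merged = {**row, **doc}  # document fields enrich the warehouse row
        blended.append(merged)
    return blended

warehouse = [{"customer_id": 1, "lifetime_value": 1200.0}]
clickstream = [{"customer_id": 1, "last_page": "/pricing"}]
print(blend(warehouse, clickstream))
# [{'customer_id': 1, 'lifetime_value': 1200.0, 'last_page': '/pricing'}]
```

In practice, Pentaho Data Integration handles the connectivity, lookup and merge steps visually; the point of the sketch is simply that blended output is a single, accurate record set regardless of where each field originated.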

Simplified Platform for OEMs and Embedders


Finally, I’d like to highlight how this release further solidifies the Pentaho suite as the best platform for enterprises and OEMs who want to enrich their applications with better data processing or business analytics.  Pentaho 5 delivers a more customizable User Console providing developers with complete control over the menu bar and toolbar, improvements to the underlying theming engine and an all new plugin layer for adding custom perspectives.  Furthermore, we’ve dramatically simplified our service architecture by introducing a brand new REST-based API along with a rich library of integration samples and documentation to get you started.

These enhancements are just a few of the many great improvements in Pentaho 5. If you want a more in-depth overview and demonstration, register for the Pentaho 5.0 webinar on September 24th – 2 times to choose:  North America/LATAM & EMEA. You can also access great resources from videos to solutions briefs at Pentaho.com/5.0.

Jake Cornelius

SVP Products


Pentaho 5.0 blends right in!

September 12, 2013

Dear Pentaho friends,

Ever since a number of projects joined forces under the Pentaho umbrella (over 7 years ago), we have been looking for ways to create more synergy across this complete software stack.  That is why today I’m exceptionally happy to be able to announce not just version 5.0 of Pentaho Data Integration but also a new way to integrate Data Integration, Reporting, Analysis, Dashboarding and Data Mining through one single interface called Data Blending, available in Pentaho Business Analytics 5.0 Commercial Edition.

Data Blending allows a data integration user to create a transformation capable of delivering data directly to our other Pentaho Business Analytics tools (and even non-Pentaho tools).  Traditionally, data is delivered to these tools through a relational database. However, there are cases where that can be inconvenient, for example when the volume of data is just too high or when you can’t wait until the database tables are updated.  This leads, for example, to a new kind of big data architecture with many moving parts:

Evolving Big Data Architectures

From what we can see in use at major deployments with our customers, mixing Big Data, NoSQL and classical RDBMS technologies is more the rule than the exception.

So, how did we solve this puzzle?

The main problem we faced early on was that the default language used under the covers, in just about any user-facing business intelligence tool, is SQL.  At first glance it seems that the worlds of data integration and SQL are not compatible.  In DI we read from a multitude of data sources, such as databases, spreadsheets, NoSQL and Big Data sources, XML and JSON files, web services and much more.  However, SQL is really a mini-ETL environment of its own, as it selects, filters, counts and aggregates data.  So we figured that it might be easiest if we translated the SQL used by the various BI tools into Pentaho Data Integration transformations.  This way, Pentaho Data Integration does what it does best, directed not by manually designed transformations but by SQL.  This is at the heart of the Pentaho Data Blending solution.

The internals of Data Blending

In other words: we made it possible for you to create a virtual “database” with “tables” where the data actually comes from a transformation step.
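As a purely illustrative sketch – not Pentaho’s actual engine – the translation can be pictured in a few lines of Python: a SQL-style query is decomposed into transformation steps (input, filter, aggregate) that run over rows from an arbitrary source. The data and step names here are invented:

```python
# Toy illustration of "SQL as a mini-ETL": decompose a query into
# pipeline steps over rows produced by an arbitrary input step.

def input_step():
    # In Pentaho this could be any source: CSV, NoSQL, a web service...
    yield {"region": "EMEA", "amount": 10}
    yield {"region": "AMER", "amount": 20}
    yield {"region": "EMEA", "amount": 5}

def filter_step(rows, predicate):                 # the WHERE clause
    return (r for r in rows if predicate(r))

def aggregate_step(rows, group_key, value_key):   # GROUP BY ... SUM(...)
    totals = {}
    for r in rows:
        totals[r[group_key]] = totals.get(r[group_key], 0) + r[value_key]
    return totals

# "SELECT region, SUM(amount) FROM sales WHERE amount > 4 GROUP BY region"
rows = filter_step(input_step(), lambda r: r["amount"] > 4)
print(aggregate_step(rows, "region", "amount"))
# {'EMEA': 15, 'AMER': 20}
```

The BI tool keeps speaking SQL; behind the virtual “table,” a transformation like this one produces the rows.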

To ensure that the “automatic” part of the data chain doesn’t become an impossible-to-figure-out “black box”, we once more made good use of existing PDI technologies.  We’re logging all executed queries on the Data Integration server (or Carte server) so you have a full view of all the work being done:

Data Blending Transparency

In addition to this, the statistics from the queries can be logged and viewed in the operations data mart, giving you insight into which data is queried and how often.

We sincerely hope that you like these new powerful options for Pentaho Business Analytics 5.0!

Enjoy!

Matt

If you want to learn more about the new features in this 5.0 release, Pentaho is hosting a webinar and demonstration on September 24th – two options to register: EMEA and North America time zones.

Matt Casters
Chief of Data Integration, Kettle founder, Author of Pentaho Kettle Solutions (Wiley)


“There is nothing more constant than change” — Heraclitus, c. 535 BC

June 26, 2013


Change and more change. It’s been incredible watching the evolution of, and innovation in, the big data market.  A few years ago we were helping customers understand Hadoop and the value it could bring in analyzing large volumes of unstructured data. Flash-forward to today, as we attend our third Hadoop Summit in San Jose and see the advances customers have made in adopting these technologies in their production big data environments.

It’s the value of a continuum of innovation. As the market matures, we are only limited by what we don’t leave ourselves open to.  Think for a minute about the next “big data,” because there will be one. We can’t anticipate what it will look like, where it will come from or how much of it will be of value, just as we couldn’t predict the advent of Facebook or Twitter.

We do know that innovation is a constant. Today’s big data will be tomorrow’s “traditional” data.

Pentaho’s announcement today of an Adaptive Big Data Layer and Pentaho Labs anticipates just this type of change.  We’ve simplified the ability for Pentaho and our customers to leverage current and new big data technologies like Hadoop, NoSQL and specialized big data stores.

In the spirit of innovation (which stems from our open source history) we’ve established Pentaho Labs – our place for free thinking innovation that leads to new capabilities in our platform in areas like real time and predictive analytics.

Being a leader at the forefront of a disruptive and ever-changing market means embracing change and innovation. That’s the future of analytics.

Donna Prlich
Senior Director, Product Marketing, Pentaho


What Makes Pentaho Hot

September 18, 2012

At Pentaho we’re proud to be named a “hot” vendor in Ventana Research’s new 2012 Value Index for Data Integration. Inclusion in this category assures buyers that the Pentaho Business Analytics platform delivers optimal value, product maturity and superior customer support according to Ventana’s comprehensive evaluation of more than a dozen vendors.

For Pentaho, the research validates that to deliver the future of analytics you need tightly coupled data integration and business analytics. What makes heat grids and other powerful data visualizations in the Pentaho Business Analytics platform “hot” is the data integration behind them. According to Ventana’s Benchmark research, 55% of organizations identify data integration as a critical component of their information management strategies.

Don’t get me wrong, we love being recognized for the ‘brains behind our beauty‘ and the ‘substance behind our sizzle.’  An integrated platform with Data Integration and Business Analytics is the future of analytics…now that’s HOT.

Let us know what you think.

Donna Prlich
Director, Product Marketing
Pentaho


Words of Wisdom

July 20, 2012

We are very lucky to have some words of wisdom today from The Most Interesting Man in the World.

Stay integrated my friends!


Matt Casters on DM Radio – Future of ETL

March 20, 2012

Pentaho’s Matt Casters, Chief Architect, Pentaho Data Integration and Kettle Project Founder, was featured last week on DM Radio in a broadcast titled “On the Move: Why ETL is Here to Stay.”

Listen to Matt’s interview with Hosts Eric Kavanagh and Jim Ericson along with panelist Nimitt Desai of Deloitte, Geoff Malafsky of Phasic Systems and Josh Rogers of Syncsort.

Starting at 13:33, listen to Matt talk about:

  • How Big Data and ETL intersect, and what that means
  • Points to keep in mind when starting to work with data moving in and out of Hadoop
  • How to keep track of changing technologies and architectures
  • Why it’s important not to do data integration just for data integration’s sake
  • Why there’s a lack of best practices
  • What Matt’s seeing: the need for a high level of metadata and modeled ETL generation

Access both Matt’s segment and the full podcast here: http://www.information-management.com/dmradio//-10022068-1.html


Pentaho’s first mover status in the big data space

February 8, 2012

I am very excited to share the news that Pentaho is cited as the only ‘Strong Performer’ in The Forrester Wave™: Enterprise Hadoop Solutions, Q1 2012 (February 2012).

The best way to summarize Pentaho’s position in the market is straight from the report by James G. Kobielus, Senior Analyst, Forrester Research:

“Pentaho is a strong performer with an impressive Hadoop data integration tool. Among data integration vendors that have added Hadoop functionality to their products over the past year, it has the richest functionality and the most extensive integration with open source Apache Hadoop.”

We believe that the inclusion of the Pentaho Kettle data integration product in the first Forrester Wave: Enterprise Hadoop Solutions report is a strong testament to Pentaho’s first-mover status in this space. The scores awarded to Pentaho reflect the fact that Pentaho is helping companies operationalize Big Data by addressing critical pain points associated with Hadoop data integration and easy analysis of Big Data.

I encourage you to access the full report and see why Pentaho was named a strong performer and how we stack up against other vendors.

Richard

About the report
The Forrester Wave™: Enterprise Hadoop Solutions, Q1 2012 assessed 13 enterprise Hadoop solution providers. The firm examined past research and user needs assessments and conducted vendor and expert interviews to formulate a comprehensive set of 15 evaluation criteria, which Forrester grouped into three high-level buckets: current offering, strategy and market presence.


Top 10 Reasons Behind Pentaho’s Success

September 2, 2011

To continue our revival of old blog posts, today we have our #2 most popular blog from last July. Pentaho is now 7 years old, with sales continually moving up and to the right. In a crazy economy, many are asking, “What is the reason behind your growth and success?” Richard Daley reflected on this question after reporting on quarterly results in 2010.

*****Originally posted on July 20, 2010*****

Today we announced our Q2 results. In summary Pentaho:

  • More than doubled new Enterprise Edition Subscriptions from Q2 2009 to Q2 2010.
  • Exceeded goals, making Q2 the strongest quarter in company history and our 3rd record quarter in a row.
  • Became the only vendor that lets customers choose the best way to access BI: on-site, in the cloud, or on the go using an iPad.
  • Led the industry with a series of market firsts including delivering on Agile BI.
  • Expanded globally, received many industry recognitions and added several stars to our executive bench.

How did this happen? Mostly because of our laser focus over the past 5 years to build the leading end-to-end open source BI offering. But if we really look closely over the last 12-18 months there are some clear signs pointing to our success (my top ten list):

Top 10 reasons behind Pentaho’s success:

1.     Customer Value – This is the top of my list. Recent analyst reports explain how we surpassed the $2 billion mark during Q2 in terms of cumulative customer savings on business intelligence and data integration license and maintenance costs. In addition, we ranked #1 in terms of value for price paid and quality of consulting services amongst all Emerging Vendors.

2.     Late 2008-Early 2009 Global Recession – this was completely out of our control, but it helped us significantly by forcing companies to look for lower-cost BI alternatives that could deliver the same or better results than the high-priced mega-vendor BI offerings, making #1 more attractive to companies worldwide.

3.     Agile BI – we announced our Agile BI initiative in Nov 2009 and received an enormous amount of press and a positive reception from the community, partners and customers. We showed previews and released RCs in Q1-Q2 2010 and put PDI 4.0 into GA at the end of Q2 2010.

4.     Active Community – A major contributing factor to our massive industry adoption is our growing number of developer stars (the Pentaho army) that continue to introduce Pentaho into new BI and data integration projects. Our community triples the amount of work of our QA team, contributes leading plug-ins like CDA and PAT, writes best-selling books about our technologies and self-organizes to spread the word.

5.    BI Suite 3.5 & 3.6 – 3.5 was a huge release for the company and helped boost adoption and sales in Q3-Q4 2009, bringing our reporting up to and beyond that of competitors. In Q2 2010, the Pentaho BI Suite 3.6 GA took this to another level, including enhancements and new functionality for enterprise security, content management and team development, as well as the new Enterprise Edition Data Integration Server.  The 3.6 GA also includes the new Agile BI integrated ETL, modeling and data visualization environment.

6.     Analyzer – the addition of Pentaho Analyzer to our product lineup in Sept-Oct 2009 was HUGE for our users – the best web-based query and reporting product on the market.

7.     Enterprise Edition 30-Day Free Evaluation – we started this “low-touch/hassle-free” approach in March 2009 and it has eliminated the pains that companies used to have to go through in order to evaluate software.

8.     Sales Leadership – Lars Nordwall officially took over Worldwide Sales in June 2009 and by a combination of building upon the existing talent and hiring great new team members, he has put together a world-class team and best practices in place.

9.     Big Data Analytics – we launched this in May 2010 and have received very strong support and interest in this area. We currently have a Pentaho-Hadoop beta program with over 40 participants. There is a large and unfulfilled requirement for Data Integration and Analytic solutions in this space.

10.   Whole Product & Team – #1-#9 wouldn’t work unless we had all of the key components necessary to succeed – doc, training, services, partners, finance, qa, dev, vibrant community, IT, happy customers and of course a sarcastic CTO ;-)

Thanks to the Pentaho team, community, partners and customers for this great momentum. Everyone should be extremely proud of the fact that we are making history in the BI market. We have a great foundation on which to continue this rapid growth, and with the right team and passion, we’ll push through our next phase of growth over the next 6-12 months.

Quick story to end the note:  I was talking and white boarding with one of my sons a few weeks ago (yes, I whiteboard with my kids) and he was asking certain questions about our business (how do we make money, why are we different than our competitors, etc.) and I explained at a high level how we are basically “on par and in many cases better” than the Big Guys (IBM, ORCL, SAP) with regards to product, we provide superior support/services, yet we cost about 10% as much as they do. To which my son replied, “Then why doesn’t everyone buy our product?”  Exactly.

Richard
CEO, Pentaho


Facebook and Pentaho Data Integration

July 15, 2011

Social Networking Data

Recently, I have been asked about Pentaho’s product interaction with social network providers such as Twitter and Facebook. The data stored within these “social graphs” can provide its owners with critical metrics around their content. By analyzing trends in user growth and demographics, as well as in the consumption and creation of content, owners and developers are better equipped to improve their business with Facebook and Twitter. Social networking data can already be viewed and analyzed with existing tools such as FB Insights, or even with purchasable 3rd-party software packages created specifically for this purpose.

Now…Pentaho Data Integration in its traditional sense is an ETL (Extract, Transform, Load) tool. It can be used to extract data from these services and merge or consolidate it with other related company data. However, it can also be used to automatically push information about a company’s product or service to the social network platforms. You see this in action today if you have ever “Liked” something a company had to offer on Facebook: at regular intervals, you will sometimes notice unsolicited product offers and advertisements posted to your wall from those companies. A great and cost-effective way to advertise to the masses.

Application Programming Interface

Interacting with these systems is made possible because they provide an API (Application Programming Interface). To keep it simple: a developer can write a program in “some language” to run on one machine that communicates with the social networking system on another machine. The API can leverage a 3GL such as Java or JavaScript or, even simpler, RESTful services. At times, software developers/vendors will write connectors against the native API that can be distributed and used in many software applications. These connectors can offer a quicker and easier approach than writing code alone. It may be that an out-of-the-box Facebook and/or Twitter transformation step is developed in the next release of Pentaho Data Integration – but until then, the RESTful APIs provided work just fine with the simple HTTP POST step.  Using Pentaho Data Integration with this out-of-the-box component allows quick access to social network graph data. It can also provide the ability to push content to applications such as Facebook and Twitter without writing any code or purchasing a separate connector.
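For a feel of what the HTTP POST step assembles under the covers, here is a rough Python sketch of building (not sending) a publish request against a Graph-style endpoint. The page id and token are placeholders, and the exact endpoint path is an assumption for illustration:

```python
# Sketch: construct the kind of HTTP POST a connector or the PDI HTTP POST
# step would issue to publish a message. The request is built, never sent.
import urllib.parse
import urllib.request

def build_publish_request(page_id, message, access_token):
    url = "https://graph.facebook.com/%s/feed" % page_id  # illustrative path
    body = urllib.parse.urlencode(
        {"message": message, "access_token": access_token}
    ).encode("utf-8")
    return urllib.request.Request(url, data=body, method="POST")

req = build_publish_request("YOUR_PAGE_ID", "New product released!", "YOUR_TOKEN")
print(req.get_method(), req.full_url)
# POST https://graph.facebook.com/YOUR_PAGE_ID/feed
```

In PDI the same thing is configured visually: the step takes the URL and form fields as parameters, so no code like this is ever written by hand.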

The Facebook Graph API

Both Facebook and Twitter provide a number of APIs, one worth mentioning is the Facebook Graph API (don’t worry Twitter, I’ll get back to you in my next blog entry).

The Graph API is a RESTful service that returns a JSON response. Simply stated, an HTTP request can initiate a connection with the FB systems and publish or return data that can then be parsed with a programming language – or, better yet, without programming, using Pentaho Data Integration and its JSON Input step.
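Conceptually, what the JSON Input step does is flatten a Graph-style JSON response into rows. This Python sketch shows the idea on a made-up sample in the `{"data": [...]}` shape the Graph API typically uses; the field names are illustrative, not a guaranteed response schema:

```python
# Sketch of what the JSON Input step does conceptually: flatten a
# Graph-style JSON response into rows. Sample payload is invented.
import json

sample_response = """
{"data": [
  {"id": "1", "message": "Hello", "likes": {"count": 3}},
  {"id": "2", "message": "World", "likes": {"count": 7}}
]}
"""

def rows_from_graph_json(payload):
    doc = json.loads(payload)
    return [
        {"id": item["id"],
         "message": item["message"],
         "like_count": item["likes"]["count"]}   # pull nested field up
        for item in doc["data"]
    ]

for row in rows_from_graph_json(sample_response):
    print(row)
# {'id': '1', 'message': 'Hello', 'like_count': 3}
# {'id': '2', 'message': 'World', 'like_count': 7}
```

In PDI you express the same extraction as JSONPath expressions in the step’s field grid rather than as code.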

Since the FB Graph API provides both data access and publish capabilities across a number of objects (photos, events, statuses, people, pages) supported in the FB Social Graph, one can leverage both automated push and pull capabilities.

If you are interested in giving this a try or seeing this in action, take a look at this tutorial available on the Pentaho Evaluation Sandbox.

Kind Regards,

Michael Tarallo
Director of Enterprise Solutions
Pentaho

