The Road to Success with Big Data – A Closer Look at Expectations vs. the Reality

June 5, 2013

Stay on course
Big Data is complex. The technologies in Big Data are rapidly maturing, but are still in many ways in an adolescent phase. While Hadoop dominates the charts for Big Data technologies, in recent years we have seen a variety of technologies born out of the early movers in this space – such as Google, Yahoo, Facebook and Cloudera. To name a few:

  • MapReduce: Programming model in Java for parallel processing of large data sets in Hadoop clusters
  • Pig: A high-level scripting language to create data flows from and to Hadoop
  • Hive: SQL-like access for data in Hadoop
  • Impala: SQL query engine that runs inside Hadoop for faster query response times
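To make the contrast between these layers concrete, here is a minimal, illustrative sketch of the word-count pattern that MapReduce popularized – written in plain Python for readability, whereas a real MapReduce job would be Java code distributed across a Hadoop cluster. The data and function names are invented for the example; the comment shows roughly what the equivalent SQL-like query in Hive or Impala would look like.

```python
# Map phase: emit a (word, 1) pair for every word in every input line.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word, 1)

# Shuffle + reduce phase: group the pairs by word and sum the counts.
def reduce_phase(pairs):
    counts = {}
    for word, n in pairs:
        counts[word] = counts.get(word, 0) + n
    return counts

# Roughly what Hive/Impala would let you write instead:
#   SELECT word, COUNT(*) FROM words GROUP BY word;
docs = ["big data big deal", "big cluster"]
print(reduce_phase(map_phase(docs)))
# {'big': 3, 'data': 1, 'deal': 1, 'cluster': 1}
```

The one-line SQL version is exactly why the abstraction layers gained traction – but, as discussed below, that convenience comes with trade-offs.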

It’s clear that the spectrum of interaction and interfacing with Hadoop has matured beyond pure Java programming into abstraction layers that look and feel like SQL. Much of this is due to the scarcity of big data resources and talent – and hence the mantra of “the more we make Big Data feel like structured data, the better adoption it will gain.”

But wait, not so fast: you can make Hadoop act like a SQL data store. However, there are consequences, as Chris Deptula from OpenBI explains in his blog post, A Cautionary Tale for Becoming too Reliant on Hive. You forgo flexibility and speed if you choose Hive for a more complex query, as opposed to pure programming or a visual interface to MapReduce.

This goes to show that there are numerous advancements in Hadoop that have yet to be achieved – in this case, better performance optimization in Hive. I come from the relational world – namely DB2 – where we spent a tremendous amount of time making a high-performance transactional database that was developed in the 1970s even more powerful in the 2000s, a journey that continues today.

Granted, the rate of innovation is much faster today than it was 10, 20, 30 years ago, but we are not yet at the finish line with Hadoop. We need to understand the realities of what Hadoop can and cannot do today, while we forge ahead with big data innovation.

Here are a few areas of opportunity for innovation in Hadoop and strategies to fill the gap:

  • High-Performance Analytics: Hadoop was never built to be a high-performance data interaction platform. Although there are newer technologies that are cracking the nut on real-time access and interactivity with Hadoop, fast analytics still need multi-dimensional cubes, in-memory and caching technology, analytic databases or a combination of them.
  • Security: There are security risks within Hadoop. It would not be in your best interest to open the gates for all users to access information within Hadoop. Until this gap is closed further, a data access layer can help you extract just the right data out of Hadoop for interaction.
  • APIs: Business applications have lived a long time on relational data sources. However with web, mobile and social applications, there is a need to read, write and update data in NoSQL data stores such as Hadoop. Instead of direct programming, APIs can simplify this effort for millions of developers who are building the next generation of applications.
  • Data Integration, Enrichment, Quality Control and Movement: While Hadoop stands strong in storing massive amounts of unstructured / semi-structured data, it is not the only infrastructure in place in today’s data management environments. Therefore, easy integration with other data sources is critical for long-term success.

The road to success with Hadoop is full of opportunities and obstacles, and it is important to understand what is possible today and what to expect next. With all the hype around big data, it is easy to expect Hadoop to do anything and everything. However, successful companies are those that choose the combination of technologies that works best for them.

What are your Hadoop expectations?

- Farnaz Erfan, Product Marketing, Pentaho


The Tesla vs. NY Times – How Analytics Helped Tesla Win

February 21, 2013


In the last couple of weeks, the feud between New York Times reporter John Broder and Tesla Motors’ CEO, Elon Musk, has played out in the media.

It all started when Broder took a highway trip between Washington, D.C. and Boston, cruising in Tesla’s Model S luxury sedan. The purpose of the trip was to range-test the car between two new supercharging stations. The 200-mile leg was well under the Model S’s 265-mile estimated range, but the trip was nonetheless filled with anxiety for Broder. Fearful of not reaching his charging destination, he turned off battery-draining amenities such as the radio and heater (in 30-degree weather) to finally reach his destination – feet and knuckles “frozen”.

In rebutting Broder’s claims, Tesla’s chief executive, Elon Musk, charged that the story was faked – that Mr. Broder intentionally caused his car to fail. On the Tesla blog, he released graphs and charts, based on driving logs, that contest many of the details of Mr. Broder’s article.

With the logs now published, one thing is clear: Tesla’s use of predictive analytics helped it warn Broder about what lay ahead. By calculating the remaining range from energy consumption, Tesla signaled Broder to charge the vehicle in time. Had Tesla not been able to call its log files as a witness, this futuristic motor tech company could have suffered serious brand damage.

What’s interesting is that Tesla’s story is not unique. Today, virtually anything that we use, an appliance, a mobile phone, an application, generates some sort of data – machine-generated data. And the truth exists behind that data. Such data, when analyzed and mined properly, provides indicators that solve problems, ahead of time.

Having real-time access to machine-generated data to foresee problems and improve performance is exactly why NetApp is using Pentaho. Using Hadoop and Pentaho Business Analytics to process and drive insights from 2-5 TBs of incoming data per week, NetApp has built a solution that sends alerts and notifications ahead of the actual hardware failure. The solution has helped NetApp predict its appliance interruptions for the E-Series storage units, offering new ways to exceed customer SLAs and protect the brand’s image.

Whether you are Tesla, NetApp or any other data-driven business, the more your company can act on that data to improve your application, service or product performance, the better off your customers – and your brand – will be.

Pentaho Business Analytics gives companies fast and easy ways to collect, analyze and predict data patterns. Pentaho’s customers see the value of analytics in many different facets and use cases. NetApp’s use case will be featured at the upcoming Strata conference on Thursday, February 28, 2013.

Join us to find out more.

- Farnaz Erfan, Product and Solution Marketing, Pentaho


Looking to the Future of Business Analytics with Pentaho 4.8

November 12, 2012

Last week Pentaho announced Pentaho 4.8, another milestone in delivering the future of analytics. It has been an exciting ride, and feedback from our partners and customers has kept us energized and ready to push further into the future.

Pentaho 4.8 is a true testament to what the future of analytics needs. The future of analytics is driven by the data problems that businesses face every day – and by information users and their expectations for solving those problems.

Let me give you a good example. I recently had the pleasure of meeting with one of our customers, BeachMint. BeachMint is a fashion and style ecommerce company that uses celebrities and celebrity stylists to promote its retail business.

This rapidly growing online retailer needed to keep tabs on its large Twitter and Facebook communities to track customer sentiment and social influence. It then uses the social data to define customer cohorts and design marketing campaigns that best target each cohort.

For BeachMint, insight into data is extremely important. On one hand, the volume and variety of data – in this case unstructured social data and click-through ad feeds – have increased its complexity. On the other, the speed at which it gets created has accelerated rapidly. For example, in addition to analyzing the impact of customer sentiment on purchasing behavior, BeachMint also needed up-to-the-minute information on the activity of key promotional codes – to immediately identify those that leak out.

Pentaho understands these data challenges and user expectations. In this release Pentaho takes full advantage of its tightly coupled Data Integration and Business Analytics platform – to simplify data exploration, discovery and visualization for all users and all data types – and to deliver this information to users immediately – sometimes even at a micro-second level. In this release Pentaho delivers:

- Pentaho Mobile – the only Mobile BI application with the power to instantly create new analysis on the go.

- Pentaho Instaview – the industry’s first instant and interactive big data visualization application.

Want to find out more? Register for Pentaho 4.8 webinar and see for yourself.

- Farnaz Erfan, Product Marketing, Pentaho


Is Your Big Data Hot or Not?

July 23, 2012

Data is the most strategic asset of any business. However, the massive volume and variety of data have made it trickier these days to catch it at the right time and place – discovering what’s hot and needs more attention, and what’s not.

Heat grids are ideal for seeing a range of values in data, as they provide a gradient scale that shows changes in data intensity through color. For example, you can see what’s hot in red, what’s normal in green, and everything else in various shades in between. Let me give you two examples of how companies have used heat grids to see whether their data is hot or not:

Example #1 – A retailer is looking at week-by-week sales of a new fashion line to understand how each product line is performing as items get continually discounted throughout the season. Data is gathered from thousands of stores across the country and then entered into a heat grid graph that includes:

  • X axis – week 1 through 12, beginning from the launch of a new campaign (e.g. Nordstrom’s Summer Looks)
  • Y axis – product line (e.g. shoes, dresses, skirts, tops, accessories)
  • Color of the squares – % of discount (e.g. dark red = 70%, red = 60%, orange = 50%, yellow = 30%, green = full price)
  • Size of the squares – # of units sold
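The encoding above boils down to two simple mappings. Here is a minimal sketch of it in Python; the threshold function mirrors the legend in the text, and the field names and sample numbers are invented for illustration:

```python
# Map a discount percentage to the color bins from the legend above.
def discount_color(pct):
    if pct >= 70: return "dark red"
    if pct >= 60: return "red"
    if pct >= 50: return "orange"
    if pct >= 30: return "yellow"
    return "green"  # at or near full price

# One heat-grid cell: position from (week, product line),
# color from discount level, size from units sold.
def cell(week, product_line, discount_pct, units_sold):
    return {"x": week, "y": product_line,
            "color": discount_color(discount_pct),
            "size": units_sold}

print(cell(1, "shoes", 0, 1200))
# {'x': 1, 'y': 'shoes', 'color': 'green', 'size': 1200}
```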

Looking at this graph, the retailer can easily see that most shoes sell at the beginning of the season – even without heavy discounts. This helps the retailer predict inventory levels to keep up with the demand for shoes.

It also shows that accessories almost never sell at regular prices, nor do they sell well when discounts exceed 70%. Knowing this, the retailer can control its capital spending by not overstocking this item. The retailer can also increase profit per square foot by marking down accessories earlier in the season, avoiding steep markdowns and inventory overstocks at the end of the season.

Example #2 – A digital music streaming service provider is using analytics to assess the performance of its sales channels (direct vs. sales through social media sites such as Facebook and Twitter) to guide future marketing and development spend. For that, the company uses a heat grid to map out:

  • X axis – various devices (iPhone, iPad, Android Smartphone, Android Tablet, Blackberry)
  • Y axis – various channels (direct site, Facebook, Twitter, …)
  • Color of the circles – # of downloads (0-100 = red, 100-1000=orange, 1000-10000 = yellow, 10000+ = green)
  • Size of the circles – app usage hours per day – the bigger the size, the more usage
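Before any of those circles can be drawn, the millions of raw records have to be rolled up into one cell per (device, channel) pair. Here is a minimal sketch of that aggregation step; the record layout and sample values are invented for illustration:

```python
from collections import defaultdict

# Each raw record: (device, channel, usage hours for that download).
records = [
    ("iPhone", "Facebook", 2.5),
    ("iPhone", "Facebook", 1.0),
    ("Android Tablet", "direct site", 4.0),
]

# One heat-grid cell per (device, channel): download count drives
# the color bin, total usage hours drives the circle size.
cells = defaultdict(lambda: {"downloads": 0, "usage_hours": 0.0})
for device, channel, hours in records:
    cells[(device, channel)]["downloads"] += 1
    cells[(device, channel)]["usage_hours"] += hours

print(cells[("iPhone", "Facebook")])
# {'downloads': 2, 'usage_hours': 3.5}
```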

This graph helps the music service provider analyze data from millions of records to quickly understand the popularity and usage patterns of their application on different devices, sold through different channels.

Heat grids can be used in a variety of other ways, such as survey scales, product rating analysis, customer satisfaction studies, risk analysis and more. Are you ready to find out whether your big data is hot or not? Check out this 3-minute video to learn how heat grids can help you.

Understanding buyers/users and their behavior is helping many companies – including ideeli, one of the most popular online retailers, and Travian Games, a top German MMO (massively multiplayer online) game publisher – gain better insight from their hottest asset: their big data!

What is your hottest business asset?

- Farnaz Erfan, Product Marketing, Pentaho

This post originally appeared on Smart Data Collective on July 13, 2012.


4 Questions to Ask Before You Define Your Cloud BI Strategy

June 25, 2012

These days, when it comes to enterprise software, it seems that it is all about the cloud. Some software applications such as Salesforce, Marketo, and Workday, have made quite a name for themselves in this space. Can Business Intelligence follow the same path to success? Does it make sense to house your BI in the cloud? I believe that it depends. Let’s explore why.

There are four criteria that impact the decision for a cloud vs. on-premise BI strategy.  Let’s take a look at how they affect your approach.

Question 1: Where is the data located?

Your BI strategy should vary depending on the location of the data. If your data is distributed, some may already be in the cloud (e.g. web data and clickstreams) and some on-premise, such as corporate data. For real-time or near real-time analytics, you need to deploy your BI as close to the source as possible. For example, when analyzing supply chain data out of an on-premise SAP system, where your database, application and infrastructure all sit on-premise, it is expensive and frankly impractical to move the data to the cloud before you start analyzing it.

Your data can also be geographically distributed. Unless your cloud infrastructure is co-located with your data geo zones, your BI experience can suffer from data latency and long refresh intervals.

Question 2: What are the security levels of data?

It’s important to acknowledge that data security levels are different in the cloud. You may not be able to put all your analytics outside of the company firewall. According to Cisco’s 2012 Global Cloud Networking survey, 72% of respondents cited data protection security as the top obstacle to a successful implementation of cloud services.

Question 3: What are the choice preferences of your users?

Customer preference is extremely important today. The balance of power has shifted, and users and customers are now the ones who decide whether an on-premise or a cloud deployment is suitable for them. What’s more, each customer’s maturity model is different. As an application provider or business process automation provider, you need to cater to your individual customers’ business needs.

Question 4: What operational SLAs does your Cloud BI vendor oblige you to?

Your operational SLAs can depend on cloud infrastructure providers, obliging you to service quality levels different from what you need. Pure cloud BI vendors provide their BI software over the public Internet through a utility pricing and delivery scheme. As much as this model provides an attractive alternative when resources are limited, it’s not for everyone. In most cases, the SaaS BI vendor depends on IaaS vendors (such as Amazon, Savvis, OpSource, etc.) for storage, hardware, and networks. As a result, the SaaS BI vendors’ operational processes have to align with the infrastructure vendors’ for housing, running, and backup/recovery of the BI software. Depending on your BI strategy, these nested and complex SLAs may or may not be the right choice.

Large enterprises, or even mid-market companies inspired by growth, typically develop an IT strategy that is provider-agnostic and has the flexibility to be hosted on-premise or in the cloud. This strategy helps companies avoid lock-in and inflexibility down the road.

As cloud technology remains one of the hottest trends in IT today, it is important to assess whether cloud is the right choice for BI. The reality is that it depends. The center of gravity for BI is still on-premise; however, it will move to the cloud over time, mostly through the embedded BI capabilities of enterprise SaaS applications. Successful organizations will be the ones that can navigate the boundary between the two strategies and provide greater flexibility and choice by offering a product that can be deployed on-premise, in the cloud, or a hybrid of both.

What is your Business Intelligence Cloud strategy?

- Farnaz Erfan, Product Marketing, Pentaho

Originally posted on SmartData Collective on June 21, 2012


The Diary of a Construction Manager in Love with His BI Tool

June 7, 2012

Hi, my name is Bob and I am a construction manager. I oversee all aspects of managing the operations of a construction project, including budgets, staffing, and the compliance of the entire construction project.

In my 10+ years of experience, I have never had a Business Intelligence (BI) tool. I had to create spreadsheets to track daily activities, calculate risks and build formulas to measure impact. Given the size of the projects I worked on, this was extremely complex. As a result, I spent a lot of my time putting out fires that I knew could have been prevented if I had the right information.

Recently my company introduced BI to our team. Since I’m using BI for the first time, I decided to create an activity log similar to a diary of my project.

Let me share some highlights with you:

October 28, 2011

We are 4 weeks into the project. We have the crew working on the ground. The foundation is done. The structural engineer has finished his design. We are ready to roll.

January 11, 2012

This morning I received an alert about my Preventative vs. Corrective Maintenance. My monthly work mix by type looks like this: preventative 36%; repair 24%; rebuild 5%; and modify 35%. My preventative costs have gone down from an optimal 40% to 36% and my repair costs have increased correspondingly.

When I drilled down into the repairs, I saw that we were responding to a higher-than-normal number of heating and insulation work items. I am going to talk to Edward – my HVAC contractor – about it.

February 29, 2012

I have been monitoring our electrical work. Our average cost per square foot is 13% less than the industry average. This is a breakthrough, thanks to the changes I have made since I started monitoring the project with BI and making data-driven decisions. BI lets me track these costs on an ongoing basis, so I can take preventative action to stay below the industry average, protect our funding and even justify additional headcount.

March 16, 2012

Productivity rate is one of my favorite indicators, because it gives me truly real-time information about the performance of my team. On average, our productivity rate stays at optimal levels. However, the plumbing trade group’s actual cost is exceeding its estimated cost. This will affect my cost-to-complete and margins, as I have to pay overtime for this contractor.

But I don’t have to worry… my BI tool lets me drill into this indicator to see whether the reason is labor- or supply-related. Drill-through is something a spreadsheet could never let me do.

March 30, 2012

Two weeks have passed since I shifted resources for plumbing. Our productivity rates have improved since then and the project is looking on time and on budget.

With 40 more days to go, I want to make sure we deliver on time and meet our SLA with the building owners. I see no bottlenecks. Cycle time – the average time to complete an activity – shows me that we are actually 4 days ahead of schedule.

May 21, 2012

I’m very happy to report that we are done with the construction. The ROI on this project was greater than we expected and my client is very happy. Next weekend is the Memorial Day weekend. I have the time and money I need to take a nice vacation with my wife and son.

-As told by Bob, a fictional construction manager.

Even though the story is fictional, it’s based on reality. Business users and project managers – facility managers, supply chain logistics specialists, even dairy farmers – use Pentaho business intelligence to make their jobs easier and to make smarter, data-driven decisions – just like our fictional friend, Bob.

Who knew BI could be so handy for construction managers?

What is your secret BI story? Drop me a line.

- Farnaz Erfan, Product Marketing, Pentaho


Big Data Bubbles Up Trouble!

May 16, 2012

Today Big Data has made the impossible possible. Collecting and analyzing unstructured data types such as social media data, web clickstreams, and network and data center logs is no longer a daunting task. While Hadoop and MapReduce are the technologies behind the scenes that crunch massive volumes of data, advanced visualizations have become the art that shows us the best (and worst) parts of our data.

Of all the different visualization styles, bubble charts are unique in that they let you show hundreds of individual values at once! They are the perfect visualization for data sets whose values are widely distributed. Let me give you a couple of examples:

Example #1 – A call center is looking at hourly tickets, week by week, to understand which issues cause the most service calls. Optimizing call center performance requires analyzing average call duration as well as call wait times for each service issue. That can be thousands of calls every hour of every day. How do you graph this? With a bubble chart that shows:

  • X axis – days of the week
  • Y axis – hours of the day
  • Color of the bubbles – reason for the call / service issue
  • Size of the bubbles – quantity of calls or call duration or call wait time
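Feeding that chart means collapsing the raw call log into one bubble per (day, hour, reason) key, with the count as the bubble size. Here is a minimal sketch; the log rows are invented for illustration:

```python
from collections import Counter

# Raw call log: one (day, hour, reason) tuple per call.
calls = [
    ("Mon", 9, "billing"),
    ("Mon", 9, "billing"),
    ("Mon", 9, "outage"),
    ("Tue", 14, "billing"),
]

# One bubble per distinct key; the count drives the bubble size.
bubbles = Counter(calls)

print(bubbles[("Mon", 9, "billing")])  # 2
```

The same Counter pass would work for call durations or wait times by summing those fields per key instead of counting rows.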

Example #2 – A marketing organization is looking to improve its branding and customer sentiment. What is the best way to visualize its online presence, such as tweets and other social media sentiment? A bubble chart that maps:

  • X axis – previous week vs. current week, for a week over week analysis
  • Y axis – their company and their major competitors
  • Color of the bubbles – keywords such as “bad quality” or “crash”
  • Size of the bubbles – volumes (# of tweets) or the intensity of the sentiment (e.g. “hate”, “worst”, “like”, “love”, etc.)

Want to see this in action? Check out this 3-minute video to find out how a bubble chart can help you visualize your data.

What are your data troubles? Do you have the Big Data technology and the advanced visualizations required to see them? I would love to hear more use cases for this visualization, so drop me a line at @farnazerfan or leave a comment below.

Farnaz Erfan
Product Marketing
Pentaho


The Power of Location in Your Data

May 10, 2012

Did you know that 70 percent of all business data contains a location component? With this increasing amount of location-based data, geo-mapping visualizations can help you detect geographic trends, such as customer clusters or outliers.

For marketers, sales executives and product managers, these types of visualizations are critical in understanding customer demographics. Determining where your best markets are concentrated or analyzing your sales performance against incoming demand, keeps you one step ahead of the competition.

Pentaho’s new geographic data visualizations can help you answer fundamental questions, such as:

  • Where are my customers located?
  • Which countries visit my website the most?
  • Which regional marketing campaigns are working?
  • Are my sales territories and client clusters aligned?
  • Which stores are carrying the most shipping costs?
  • Where is my mobile app used the most?

Powerful, isn’t it? For a quick peek at Pentaho’s new geo-mapping visualization capability, watch this three minute video.

What location-based data are you tracking? Let me know by leaving a comment.

- Farnaz Erfan, Product Marketing, Pentaho


Finding Wheelchairs in 1s and 0s: The Power of Location in Data

March 22, 2012

RTLS (real-time location systems) have long been embraced by retailers to monitor store foot traffic and secure merchandise. Today, hospitals are also making use of the technology. RTLS is used to track and identify the location and status of objects in real time, using sensors that monitor physical or environmental conditions such as temperature, sound, vibration, pressure, or motion.

For healthcare providers, RTLS means hard-dollar savings! With thousands of assets in constant motion every day, it becomes very difficult to know what is used where, when, and why. These assets are core to providing care; therefore, dirty, in-use, or broken equipment can completely break the processes that take place in healthcare facilities. Simple activities like finding a piece of equipment can consume much of a caregiver’s time, slowing patient flow, adding costs, and even impacting patient care.

How can a healthcare organization overcome this issue and put its location data to real use? By using powerful analytics. Let’s explore. There are two types of analytics:

1. Historical analysis. By understanding the actual utilization rates of equipment, hospitals can better estimate the inventory levels they need to have on hand, tailoring future purchases to maintain optimum inventory levels.

2. Real-time analysis. Monitoring equipment usage in real time – and alerting when rental equipment sits idle, when a recalled piece of medical equipment enters a patient room, or when par levels of clean and available equipment are not maintained – boosts the performance of the organization, improves staff efficiency, increases patient satisfaction, and improves patient safety and quality of care.
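One of those real-time rules – flagging rental equipment that sits idle too long – can be sketched in a few lines. The asset fields, threshold, and sample fleet below are invented for illustration, not drawn from any vendor’s actual data model:

```python
# Flag rental assets that have been idle past a threshold.
IDLE_ALERT_HOURS = 48

def idle_alerts(assets, now_hours):
    """Return IDs of rental assets idle longer than the threshold."""
    return [a["id"] for a in assets
            if a["rental"] and now_hours - a["last_used"] > IDLE_ALERT_HOURS]

fleet = [
    {"id": "pump-17", "rental": True,  "last_used": 10},   # idle 90 h -> alert
    {"id": "pump-18", "rental": True,  "last_used": 90},   # idle 10 h -> ok
    {"id": "chair-3", "rental": False, "last_used": 0},    # owned, ignored
]
print(idle_alerts(fleet, now_hours=100))  # ['pump-17']
```

In a real deployment this check would run continuously against the RTLS event stream rather than a static list.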

A great example of applied analytics in healthcare is what Intelligent InSites has implemented within their enterprise RTLS Asset Management software solution. Using this tool, some of their customers save up to $30,000 a month by monitoring real-time information on rental equipment and eliminating unnecessary expenses, such as paying for unused equipment. Intelligent InSites embeds Pentaho Business Analytics as part of their RTLS software solution. Their RTLS healthcare platform enables hospitals and healthcare facilities to analyze data from RTLS and RFID tags on medical equipment, such as wheelchairs or IV infusion pumps, gaining visibility into the location or status of these assets, identifying operational bottlenecks, and ultimately improving their patients’ safety and satisfaction.

Great use case, great story! But what are some things to look for when you are searching for business analytics software?

1. Big Data Support. Sensor and wireless data are new and emerging sources of information. Data feeds from RFID/RTLS tags are typically stored in NoSQL databases, such as Hadoop HBase, MongoDB and CouchDB, or in XML data stores. While transactional sources, such as point-of-sale data, will continue to use relational formats, the value of an analytics platform lies in the visibility it provides across all sources of data, comparing and contrasting one data set with another. Be sure to look for a business analytics solution with a broad spectrum of data source connectivity, covering both unstructured and structured data sets.

2. Embedded Analytics. Aberdeen research shows that the greatest benefit of business intelligence lies in the value of embedded analytics within an enterprise app. Rather than asking your end users—namely doctors, nurses, administrative staff, and knowledge workers—to switch back and forth between their business processes and the analytical application to drive insight, you can cut the latency and deliver analytics in real time.

A great example of this is Intelligent InSites’ embedded analytics from Pentaho that provides data on asset locations, status, usage, utilization and availability, directly from the end user’s RTLS Asset Management application. At a glance, hospital staff can locate the nearest available wheelchair or stretcher, saving valuable time.

3. Power to the User. Given that most users in healthcare are doctors, nurses, and administrative staff, ease of use and an intuitive user interface is one of the most crucial selection criteria. These users should not only be able to easily read and understand packaged reports, but also have interactive design tools to build their own analysis and dashboards.

Selecting the right Business Analytics software for your location data requires some level of due diligence. Know that you are not alone: location-based intelligence and analysis is applied across all types of industries. Whether you are a retailer looking to understand your customer preferences, a hospital tracking your equipment and resources, or even a horse race sponsor connecting your race track data to betting shops and TV screens, analyzing real-time location data unlocks immediate value.

What location data are you analyzing? Drop me a comment.

Farnaz Erfan
Product Marketing
Pentaho


Powered by Pentaho 101

March 14, 2012

This week we announced a new program for ISV and SaaS providers called “Powered by Pentaho.” I received several questions from clients and press so I thought I would share them with you to help explain the details behind this great new offer.

What is Powered by Pentaho?

Powered by Pentaho enables Pentaho OEM partners to deliver market-leading analytics capabilities in as little as eight weeks. The new OEM program is a response to the rapid rise in Pentaho’s 2011 OEM sales bookings, which grew more than 130 percent over the same period in 2010.

What does this 8-week program entail?

Pentaho provides the training, support and integration recommendations that best fit your solution objectives. You do the development and quality assurance. Keep in mind that all throughout your development cycle and thereafter, you have access to Pentaho experts who are intimately familiar with the Pentaho architecture and APIs. The best way to picture this is to think of Pentaho’s engineering team as an extension of your own engineering team. We want you to become successful, go to market fast, and build market leadership using our business analytics.

What about Pentaho makes this possible in eight weeks?

Pentaho technology - We provide embedding options that require little to no development. All you need are basic HTML skills to change the look and feel of our product to match your style and branding. We refer to these options as ‘Bundled’ or ‘Mashup.’ Pentaho also offers a deeper level of integration for OEM partners that require extensions and customization. We often see our OEM partners start with a re-branding and single sign-on approach and later move to a deeper integration.

Pentaho support and training - Pentaho has built services specific to every phase of an OEM’s software development lifecycle. You can not only go to market faster, but also build your future releases, changes and modifications much easier. These services include:

  • Architecture Workshop – Learn the best practices and best integration strategies for your development approach;
  • Tailored Training – Get your engineers and support staff a solid foundation for developing and troubleshooting your solution;
  • Development Support – Get your engineering staff access to Pentaho Java developers with in-depth knowledge of Pentaho architecture to get you to market faster.

Am I the right candidate?

This program is ideal for companies with information-centric software or packaged applications that want to go to market faster with attractive and sophisticated business intelligence and data visualization capabilities. All our customers who have successfully done this in eight weeks or less have a set of common characteristics. They typically have:

  • A phased approach, usually starting with a Bundled / Mashup type embedding option;
  • Data sources that have been prepared, cleansed, and put into a business analytics / reporting format. Pentaho has tools to help you do that;
  • At least one developer on staff – with HTML and some Java skills – who has taken part in our training and architecture workshop classes.

Does Pentaho have proof points?

To date, hundreds of ISVs and SaaS providers have become Pentaho OEM partners. Marketo is a great example. Marketo was looking for both a modern, flexible technology and a true partner to help them build a brand new business analytics product. With Pentaho they were able to go to market in just eight weeks, delivering a feature-rich product that became a new source of revenue.

We have several great resources such as white papers, webinars, OEM Partner success stories and more. Visit pentaho.com/explore/embedded-bi/ for more information.

Farnaz Erfan
Product Marketing
Pentaho

