Tesla vs. The NY Times – How Analytics Helped Tesla Win

February 21, 2013


Over the last couple of weeks, the feud between New York Times reporter John Broder and Tesla Motors’ CEO, Elon Musk, has played out in the media.

It all started when Broder took a highway trip between Washington, D.C. and Boston, cruising in Tesla’s Model S luxury sedan. The purpose of the trip was to range-test the car between two new supercharging stations. The roughly 200-mile leg was well under the Model S’s 265-mile estimated range, but the trip was nonetheless filled with anxiety for Broder. Fearful of not reaching his charging destination, he turned off battery-draining amenities such as the radio and the heater (in 30-degree weather) to finally reach his destination, feet and knuckles “frozen”.

In rebutting Broder’s claims, Tesla’s chief executive, Elon Musk, charged that the story was faked and that Mr. Broder intentionally caused his car to fail. On the Tesla blog, he released graphs and charts, based on the car’s driving logs, that contest many of the details of Mr. Broder’s article.

With the logs now published, one thing is clear: Tesla’s use of predictive analytics helped it warn Broder about what lay ahead. By calculating the remaining range from the car’s energy consumption, Tesla signaled Broder to charge the vehicle in time. Had Tesla not been able to call its log files as a witness, this futuristic motor-tech company could have suffered serious brand damage.
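The arithmetic behind such a warning is simple enough to sketch. The following is a hypothetical illustration of projecting range from logged consumption; the function names, numbers, and safety margin are my own assumptions, not Tesla’s actual telemetry logic.

```python
# Hypothetical sketch of a log-driven range warning.
# All names and numbers are illustrative, not Tesla's real telemetry.

def estimate_range_miles(battery_kwh_remaining, recent_wh_per_mile):
    """Project remaining range from the battery state of charge and a
    rolling average of energy consumption (watt-hours per mile)."""
    if recent_wh_per_mile <= 0:
        raise ValueError("consumption rate must be positive")
    return battery_kwh_remaining * 1000 / recent_wh_per_mile

def should_warn(battery_kwh_remaining, recent_wh_per_mile,
                miles_to_next_charger, safety_margin=1.2):
    """Warn the driver when the projected range falls below the distance
    to the next charging station plus a safety margin."""
    projected = estimate_range_miles(battery_kwh_remaining, recent_wh_per_mile)
    return projected < miles_to_next_charger * safety_margin

# A cold day with the heater on drives consumption up and range down:
print(estimate_range_miles(30, 380))                   # ~78.9 miles left
print(should_warn(30, 380, miles_to_next_charger=70))  # True: charge now
```

The point is not the formula, which is trivial, but that the log data makes the warning objective and auditable after the fact.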

What’s interesting is that Tesla’s story is not unique. Today, virtually everything we use (an appliance, a mobile phone, an application) produces some sort of machine-generated data. And the truth lives in that data. Such data, when analyzed and mined properly, provides indicators that solve problems ahead of time.

Having real-time access to machine-generated data to foresee problems and improve performance is exactly why NetApp uses Pentaho. Using Hadoop and Pentaho Business Analytics to process and derive insights from 2-5 TB of incoming data per week, NetApp has built a solution that sends alerts and notifications ahead of actual hardware failures. The solution has helped NetApp predict appliance interruptions for its E-Series storage units, offering new ways to exceed customer SLAs and protect the brand’s image.
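As a rough illustration of this kind of predict-before-failure alerting, the sketch below fits a linear trend to a rising error count and warns when that trend is projected to cross a failure threshold within a few intervals. The metric, threshold, and horizon are illustrative assumptions, not NetApp’s actual solution.

```python
# Minimal predictive-maintenance sketch: alert before a monitored error
# count crosses a hard failure limit. Thresholds and field meanings are
# illustrative assumptions, not NetApp's real schema.

def trend_slope(samples):
    """Least-squares slope of equally spaced samples: how fast the
    metric is growing per measurement interval."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(samples))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return num / den

def alert_before_failure(error_counts, hard_limit=100, horizon=10):
    """Alert if the current trend would cross the failure threshold
    within the next `horizon` intervals."""
    slope = trend_slope(error_counts)
    projected = error_counts[-1] + slope * horizon
    return projected >= hard_limit

print(alert_before_failure([2, 5, 11, 20, 34]))  # True: failing soon
print(alert_before_failure([2, 2, 3, 2, 3]))     # False: stable unit
```

In a real deployment the same idea runs at scale, with the trend fitting pushed into the Hadoop pipeline rather than a single Python loop.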

Whether you are Tesla, NetApp, or any other data-driven business, the more your company can act on its data to improve application, service, or product performance, the better off your customers, and your brand, will be.

Pentaho Business Analytics gives companies fast and easy ways to collect and analyze data and to predict patterns in it. Pentaho’s customers see the value of analytics in many different facets and use cases. NetApp’s use case will be featured at the upcoming Strata conference on Thursday, February 28, 2013.

Join us to find out more.

- Farnaz Erfan, Product and Solution Marketing, Pentaho


4 Questions to Ask Before You Define Your Cloud BI Strategy

June 25, 2012

These days, when it comes to enterprise software, it seems that it is all about the cloud. Some software applications such as Salesforce, Marketo, and Workday, have made quite a name for themselves in this space. Can Business Intelligence follow the same path to success? Does it make sense to house your BI in the cloud? I believe that it depends. Let’s explore why.

There are four criteria that impact the decision for a cloud vs. on-premise BI strategy.  Let’s take a look at how they affect your approach.

Question 1: Where is the data located?

Your BI strategy should vary depending on the location of your data. If your data is distributed, some may already be in the cloud (e.g. web data and clickstreams) and some on-premise, such as corporate data. For real-time or near-real-time analytics, you need to deploy your BI as close to the source as possible. For example, when analyzing supply chain data out of an on-premise SAP system, where your database, application, and infrastructure all sit on-premise, it is expensive and frankly impractical to move the data to the cloud before you start analyzing it.

Your data can also be geographically distributed. Unless your cloud infrastructure is co-located with your data geo zones, your BI experience can suffer from data latency and long refresh intervals.

Question 2: What are the security levels of data?

It’s important to acknowledge that data security levels are different in the cloud. You may not be able to put all your analytics outside of the company firewall. According to Cisco’s 2012 Global Cloud Networking survey, 72% of respondents cited data protection security as the top obstacle to a successful implementation of cloud services.

Question 3: What are the choice preferences of your users?

Customer preference is extremely important today. The balance of power has shifted, and users and customers are now the ones who decide whether an on-premise or a cloud deployment is suitable for them. What’s more, each customer’s maturity model is different. As an application provider or business process automation provider, you need to cater to your individual customers’ business needs.

Question 4: What operational SLAs does your Cloud BI vendor oblige you to?

Your operational SLAs can depend on cloud infrastructure providers, binding you to service-quality levels different from what you need. Pure cloud BI vendors provide their BI software over the public Internet through a utility pricing and delivery scheme. As much as this model provides an attractive alternative when resources are limited, it’s not for everyone. In most cases, the SaaS BI vendor depends on IaaS vendors (such as Amazon, Savvis, OpSource, etc.) for storage, hardware, and networks. As a result, the SaaS BI vendor’s operational processes have to align with the infrastructure vendor’s for housing, running, and backup/recovery of the BI software. Depending on your BI strategy, these nested and complex SLAs may or may not be the right choice.

Large enterprises, or even mid-market companies inspired by growth, typically develop an IT strategy that is provider-agnostic and has the flexibility to be hosted on-premise or in the cloud. This strategy helps companies avoid lock-in and inflexibility down the road.

As cloud technology remains one of the hottest trends in IT today, it is important to assess whether cloud is the right choice for BI. The reality is that it depends. The center of gravity for BI is still on premise; however, it will move to the cloud over time, mostly through the embedded BI capabilities of enterprise SaaS applications. Successful organizations will be the ones that can navigate the boundary between the two strategies and provide greater flexibility and choice by offering a product that can be deployed on-premise, in the cloud, or as a hybrid of both.

What is your Business Intelligence Cloud strategy?

- Farnaz Erfan, Product Marketing, Pentaho

Originally posted on SmartData Collective on June 21, 2012


The Diary of a Construction Manager in Love with His BI Tool

June 7, 2012

Hi, my name is Bob and I am a construction manager. I oversee all aspects of managing the operations of a construction project, including budgets, staffing, and the compliance of the entire construction project.

In my 10+ years of experience, I had never had a Business Intelligence (BI) tool. I had to create spreadsheets to track daily activities, calculate risks, and build formulas to measure impact. Given the size of the projects I worked on, this was extremely complex. As a result, I spent a lot of my time putting out fires, fighting problems that I knew could have been prevented if I had the right information.

Recently my company introduced BI to our team. Since I’m using BI for the first time, I decided to create an activity log similar to a diary of my project.

Let me share some highlights with you:

October 28, 2011

We are 4 weeks into the project. We have the crew working on the ground. The foundation is done. The structural engineer has finished his design. We are ready to roll.

January 11, 2012

This morning I received an alert about my preventative vs. corrective maintenance. My monthly work mix by type looks like this: preventative 36%; repair 24%; rebuild 5%; and modify 35%. My preventative share has dropped from the optimal 40% to 36%, and my repair share has increased correspondingly.

When I drilled down into the repairs, I saw that we were responding to a higher-than-normal number of heating and insulation work items. I am going to talk to Edward, my HVAC contractor, about it.
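The work-mix indicator from this entry is easy to reproduce. Below is a minimal sketch that computes the mix percentages and raises the preventative-share alert; the 40% optimum comes from the diary, while the alerting logic and tolerance are illustrative assumptions.

```python
# Sketch of the maintenance work-mix indicator. The categories and the
# 40% preventative target come from the diary entry; the tolerance and
# alert rule are illustrative assumptions.

def work_mix(work_orders):
    """Percentage of work orders by maintenance type."""
    total = len(work_orders)
    mix = {}
    for kind in work_orders:
        mix[kind] = mix.get(kind, 0) + 1
    return {kind: 100 * count / total for kind, count in mix.items()}

def preventative_alert(work_orders, optimal_pct=40, tolerance=2):
    """Alert when the preventative share drifts below the optimum."""
    share = work_mix(work_orders).get("preventative", 0)
    return share < optimal_pct - tolerance

# January's mix: preventative 36%, repair 24%, rebuild 5%, modify 35%
orders = (["preventative"] * 36 + ["repair"] * 24
          + ["rebuild"] * 5 + ["modify"] * 35)
print(work_mix(orders)["preventative"])  # 36.0
print(preventative_alert(orders))        # True: below the 40% optimum
```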

February 29, 2012

I have been monitoring our electrical work. Our average cost per square foot is 13% below the industry average. This is a breakthrough, thanks to the changes I have made since monitoring the project with BI and making data-driven decisions. BI lets me monitor these costs on an ongoing basis, so I can take preventative action to stay below the industry average, protect our funding, and even justify additional headcount.

March 16, 2012

Productivity rate is one of my favorite indicators, because it truly gives me real-time information about the performance of my team. On average, our productivity rate stays at optimal levels. However, the plumbing trade group’s actual cost is exceeding its estimated cost. This will affect my cost-to-complete and margins, as I have to pay overtime for this contractor.

But I don’t have to worry… my BI tool lets me drill into this indicator to see whether the cause is labor- or supply-related. Drill-through is something a spreadsheet could never let me do.
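A drill-through like this one can be sketched as a simple aggregation from one indicator down into its components; the data and field names below are hypothetical.

```python
# Hypothetical drill-through: break a trade group's cost overrun down
# into labor vs. supply. All figures and field names are made up for
# illustration.

plumbing_costs = [
    {"category": "labor",  "estimated": 12000, "actual": 15500},
    {"category": "labor",  "estimated": 8000,  "actual": 9800},
    {"category": "supply", "estimated": 6000,  "actual": 6100},
]

def variance_by_category(rows):
    """Drill from the total overrun into per-category actual-minus-estimated."""
    out = {}
    for row in rows:
        diff = row["actual"] - row["estimated"]
        out[row["category"]] = out.get(row["category"], 0) + diff
    return out

print(variance_by_category(plumbing_costs))
# {'labor': 5300, 'supply': 100} -> the overrun is labor-related
```

In a BI tool the same breakdown happens interactively with a click, but the underlying operation is exactly this group-and-sum.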

March 30, 2012

Two weeks have passed since I shifted resources for plumbing. Our productivity rates have improved since then and the project is looking on time and on budget.

With 40 more days to go, I want to make sure we deliver on time and meet our SLA with the building owners. I see no bottlenecks. Cycle times – the average time to complete an activity – show me that we are actually 4 days ahead of schedule.

May 21, 2012

I’m very happy to report that we are done with the construction. The ROI on this project was greater than we expected and my client is very happy. Next weekend is the Memorial Day weekend. I have the time and money I need to take a nice vacation with my wife and son.

-As told by Bob, a fictional construction manager.

Even though the story is fictional, it’s based on reality. Business users and project managers – facility managers, supply chain logistics specialists, even dairy farmers – use Pentaho business intelligence to make their jobs easier and to make smarter, data-driven decisions, just like our fictional friend, Bob.

Who knew BI could be so handy for construction managers?

What is your secret BI story? Drop me a line.

- Farnaz Erfan, Product Marketing, Pentaho


Happy Holidays from Pentaho

December 21, 2011

On behalf of the Pentaho team around the globe, I wish you and your families a very Happy Holidays and a healthy and prosperous New Year.

Although we would never claim that Business Analytics is ‘child’s play’, we invited some of our team’s kids to star in our latest marketing video to remind us all of what’s really important as we wind things down for the holiday season and turn our attention to family and friends.

Thank you very much for your continued support and we look forward to making big things happen with you in 2012!

Cheers,

Quentin


Building the Future of Analytics

December 1, 2011

It has been a terrific first two months at Pentaho; the excitement from our employees, customers and partners for helping Pentaho build the future of analytics has been profound.

We believe the future of analytics is all about building a modern platform that gives business users and technologists the ability to interact with their data in an easy, visual way. The consumerization of IT requires that the traditional analytics functions – data extraction and transformation, metrics creation, ad hoc analysis, and reporting – be done in a “point and click” and “drag and drop” fashion. The latest Pentaho Business Analytics release delivers on that premise and provides “Power to the user.”

We’ve had great feedback on how IT professionals love the visual, non-programmatic way Pentaho integrates any data source: from traditional relational sources such as Oracle, SAP and MySQL, to SaaS applications like Salesforce, to non-relational sources including Hadoop, NoSQL and MPP, to analytic platforms including EMC, Teradata and IBM. Plus, all ETL functions are accomplished with a simple point and click.

Business users are free to create measures, metrics and on-the-fly reports. Deep ad hoc analysis and predictive analytical features are a drag and drop away.

Another design tenet Pentaho has built into the modern analytics platform is an open architecture and plug-in framework. We understand that our customers require freedom of choice in selecting databases, plug-in end-user components and web services. Pentaho’s open source heritage and extensible plug-in framework allow business users and IT professionals to easily integrate their preferred components to fulfill their analytical needs.

The future of analytics is ensuring that our customers can easily gain insight from emerging technology trends including big data, Cloud computing and mobility.

Pentaho has released robust big data analytics solutions, which allow customers to truly “operationalize” big data applications. Pentaho is focused on helping customers address the biggest pain points around big data as well as the challenges of getting non-relational data in and out of Hadoop and into the hands of business users who can gain insight from reporting and ad hoc analysis. We already have several customers in the media and retail verticals who have tapped into big data for market analysis, customer sentiment and website analytics using Pentaho solutions.

Our modern technology platform was designed with Cloud computing in mind. Pentaho’s service oriented architecture allows SaaS and Cloud companies to easily embed rich analytics into their applications. Customers like Marketo, ExactTarget and Cipal have enhanced their service offerings with advanced analytics for their customers. Additionally, the Pentaho suite is integrated with common platform as a service (PaaS) and infrastructure as a service (IaaS) offerings, including Amazon Web Services and RackSpace for customers who would prefer to run their analytic applications in the Cloud.

We constantly strive to make our products more relevant. As our customers look to empower remote employees with better insight and analytics, Pentaho has released a rich mobile solution for tablets including the iPad.

The big idea for the future of analytics is allowing customers to freely move and easily integrate analytical applications between the traditional world of transactional, relational data and the new world of web, social and device data. The ultimate goal is to provide deep business insight by enabling users to easily mash-up data from traditional relational applications, such as Oracle, SAP and Microsoft, with web and social data sitting in Hadoop and NoSQL data stores.

I would be remiss not to mention that a major business design goal for the future of analytics is price performance and value. Building on our foundation of being easy to do business with from evaluation to renewal, Pentaho is continually improving its support and services. In the past year, we have added the Customer Success Team to provide additional customer service to our direct and OEM customers worldwide, created a specialized OEM services team and enhanced our support platform by migrating to Zendesk. All of this while still delivering the future of analytics at 20 percent of the cost of traditional analytics solutions from the mega vendors!

The future of analytics is here and now with Pentaho. I look forward to the journey!

Quentin

