Developing Predictive Analytical Applications with Big Data
How to Quench Your Thirst for Knowledge from Big Data
- Daily Point of Sale Data from 3,000 Stores for Tens of Thousands of SKUs
- Daily RX Data for Hundreds of Thousands of Individual Patients
- Real-Time Demand for Electricity at Each Point Along a 250,000 Square Mile Electric Grid
We all thirst for information and learning from these huge data sets. But in the time it takes to read this post, terabyte upon terabyte of transaction data will have piled up in corporate databases around the world. How can we harness the power of Big Data without drowning in it?

Identify the Strategic Drivers of Your Business

By identifying what is really important to know in your business and focusing your analysis on just those issues, you can cut down the real-time and daily data requirements for your business. Key Performance Indicators, or KPIs, have been around for a generation. It is especially important to maintain a subclass of these indicators that are real-time KPIs. This lets you whittle your Big Data set down to, ideally, a manageable one. These KPIs can be as unique as the industries they serve:
- Mobile Phones: Pre-orders for Soon to be Launched Phones
- Fashion Apparel: Color Mix Across all Styles and Lines
- Pharmaceuticals: List of Patients More than 7 Days Overdue for a Refill
- Utilities: Hourly Sunshine and Temperature Forecasts by City
- On-line Retailer: Recommendation Engine
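To make the idea concrete, here is a minimal sketch of one of the KPIs above: the list of patients more than 7 days overdue for a refill. The field names (`patient_id`, `last_fill`, `days_supply`) are illustrative assumptions, not a real schema; the point is that a focused KPI reduces a huge Rx data set to a short, actionable list.

```python
from datetime import date, timedelta

def overdue_refills(patients, today, grace_days=7):
    """Return IDs of patients whose expected refill date is more than
    grace_days in the past. A real-time KPI like this whittles a
    massive Rx history down to the handful of records that matter."""
    overdue = []
    for p in patients:
        # Expected refill date: last fill plus the days of supply dispensed.
        due = p["last_fill"] + timedelta(days=p["days_supply"])
        if (today - due).days > grace_days:
            overdue.append(p["patient_id"])
    return overdue

patients = [
    {"patient_id": "A1", "last_fill": date(2024, 1, 1), "days_supply": 30},
    {"patient_id": "B2", "last_fill": date(2024, 2, 1), "days_supply": 30},
]
print(overdue_refills(patients, today=date(2024, 2, 20)))  # ['A1']
```

A nightly (or streaming) job running this filter is far cheaper than re-analyzing the full transaction history on every decision.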
Using Automated Tools to Analyze the Data

As Yogi Berra once said, "you can't think and hit at the same time," meaning that a ball player does not have time to deliberate over whether to swing and what type of swing to take; it must be a trained instinct. In the same way, predictive analytics on Big Data must run automatically. In other words, if it takes you a full day to come up with a forecast of tomorrow's sales, you will never produce that forecast in time to make a decision based on it. RoadMap's forecasting algorithms make extensive use of re-entrant code and in-memory data caching, which allows for fast, efficient automated processing.

Look for a Total Solution, Not an ERP Suite or a Single Package

Depending on the complexity of the predictive model, your existing suite of software may simply not have enough horsepower. That is why open source projects such as the R Project for Statistical Computing can be such a valuable part of the total solution. R, an open source project supported by numerous corporations, universities, and hospitals, offers over 5,000 add-on libraries, called packages, many of which implement statistical models. So if your Big Data analysis engine needs to be turbocharged, it is a good bet that some predictive model in R will fill the bill.

Your analytic needs may also exceed your IT infrastructure. That is why cloud-based solutions work so well with Big Data. IT departments usually work off an 18-month backlog of application requests. Strategic projects can't wait 18 months; they have to be done immediately, with whatever resources are available.

In summary, corporations that can cut their predictive analytics down to size by focusing on strategic imperatives, developing automated systems, and using multi-vendor approaches, which may even include open source or cloud solutions, stand the best chance of gaining business insight.
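As a sketch of what "automated" means here, the forecast itself can be a routine so cheap that it runs unattended for every SKU and store combination. The example below uses simple exponential smoothing, a standard one-step-ahead method chosen for illustration; it is not RoadMap's actual algorithm, and the smoothing constant `alpha` is an assumed default.

```python
def ses_forecast(history, alpha=0.3):
    """Simple exponential smoothing: produce a one-step-ahead forecast
    of tomorrow's value from a daily history. Each observation updates
    the running level, so the whole pass is a single cheap loop."""
    level = history[0]
    for y in history[1:]:
        # New level blends today's observation with the prior estimate.
        level = alpha * y + (1 - alpha) * level
    return level

daily_sales = [100, 102, 98, 105, 110, 107]
print(round(ses_forecast(daily_sales), 1))  # 104.9
```

Because the update is incremental, the same logic works in a streaming setting: keep the latest `level` in memory and fold each new day's sales into it as it arrives, so the forecast is always ready when the decision must be made.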