Think of all the data you possess that is not being fully utilized. Not just data generated by standard customer transactions, but data collected from nontraditional sources such as social media, the web, voice files, and image documents. Today you can harness all this data—structured and unstructured alike—to gain competitive advantage. One key to obtaining these insights is analytics—in particular, predictive analytics.
Predictive analytics is the process of using all the different kinds of data that your organization creates and collects to gain insight into potential future outcomes. Note the word potential. Predictive analytics helps forecast what has a reasonable chance of occurring in the future, based on running what-if scenarios and assessing probabilities using existing data.
Predictive analytics is considered an advanced analytics technique. Advanced analytics methods differ from traditional ones in that they help businesses gaze forward instead of backward. So instead of asking "What happened?" you ask, "What is likely to happen?" For example, will this elevator require repair in 50 more hours of service? Or even, as your experience and expertise in analytics grow, "What are we going to do about it?" Or, most excitingly, as you master advanced analytics, you simply trust the system: analyze the data and do what is best for my business.
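To make the "what-if scenarios and probabilities" idea concrete, here is a minimal sketch of the elevator example. It assumes a small set of hypothetical hours-to-failure observations (the numbers are illustrative, not from any real dataset) and uses a simple Monte Carlo resampling approach to estimate the chance that a unit still running today fails within the next 50 service hours:

```python
import random

# Hypothetical historical data: observed hours-to-failure for similar
# elevator motors (illustrative numbers only).
hours_to_failure = [820, 1150, 990, 1400, 760, 1210, 1050, 930, 1320, 880]

def prob_repair_within(current_hours, horizon, samples=100_000, seed=42):
    """Monte Carlo what-if: resample historical lifetimes and count how
    often a unit that has already survived `current_hours` fails within
    the next `horizon` hours."""
    rng = random.Random(seed)
    survived = 0
    failed_in_window = 0
    for _ in range(samples):
        lifetime = rng.choice(hours_to_failure)
        if lifetime > current_hours:  # unit is still running today
            survived += 1
            if lifetime <= current_hours + horizon:
                failed_in_window += 1
    return failed_in_window / survived if survived else 0.0

# Scenario: the elevator has run 900 hours; what is the chance it needs
# repair in the next 50 hours of service?
p = prob_repair_within(current_hours=900, horizon=50)
print(f"Estimated probability of repair in next 50 hours: {p:.2f}")
```

A production system would replace this toy resampler with a fitted survival or machine-learning model, but the underlying question it answers is the same.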
Analytics deployments often map to a five-stage maturity curve. Traditional analytics comprise the first two stages on this maturity curve, advanced analytics the remaining three.
Many enterprises today are eager to move beyond traditional business intelligence (BI) to advanced analytics such as predictive analytics.
But what can predictive analytics do for your business?
Many things. You’ll finally be able to harness your rapidly expanding volumes of data—both structured and unstructured—in real time to answer business questions about staffing, pricing, and inventory management, not to mention operational issues such as data center uptime and SLAs.
Each type of analytics comes with a unique set of infrastructure requirements. Your first step is to look at your existing infrastructure. Analyze where your compute, network, and storage capabilities are aging and holding you back.
For predictive analytics, you may be required to modernize your infrastructure to deliver the performance, security, and memory or storage required. Your infrastructure must be flexible enough to run both commercial and open source predictive analytics solutions, and provide ample room to grow. It's no longer enough to grow in a linear fashion; your infrastructure might have to scale beyond your normal expectations to accomplish what you need.
The infrastructure should be capable of running a range of analytics workloads—from real-time, in-memory SAP HANA* or Oracle* Exadata* databases to streaming analytics (Storm*, Flink*) and big data Hadoop* deployments. Today you might want to build a Hadoop* data lake, but tomorrow, a standalone Spark* environment. Your infrastructure must be flexible enough to do both.
Don’t forget to consider cloud. Cloud compute and storage capabilities can augment your infrastructure when your goal is large-scale predictive analytics, and allow you to grow both on premises and off premises where appropriate. Cloud can also speed the deployment of infrastructure or platform solutions that you don’t currently have in your plan.
To achieve all this, choose infrastructure components that adhere to industry standards, but don’t stop there. That’s just the minimum. Your components should also be tested on and optimized for predictive analytics workloads: not just general-purpose processors, but also customizable FPGAs for accelerating targeted analytics workloads; memory, storage, Ethernet, and interconnect; and platforms optimized for deep learning.
You can deploy predictive analytics without evaluating open source tools, but you’d be missing out. The open source analytics community is broad and deep, and has produced a stellar portfolio of advanced analytics tools, from Hadoop* to Spark* to Hive*, plus a multitude of others that are continuously improved by the global open source community.
Open source predictive analytics tools are—on the surface—significantly less costly to deploy than proprietary analytics platforms. This makes them attractive to companies just dipping into the predictive analytics waters. They are also extremely flexible, providing you with a multitude of deployment options for a wide range of analytics workloads.
Yet this very flexibility makes them challenging to use. You need people with emerging and advanced skills—data scientists, data engineers, and data analysts—if you go the open source route. You can hire them or you can grow the expertise in house, which takes time. Many companies end up hiring consultants, which adds to the cost of open source initiatives.
Some companies choose to do proof-of-concept (POC) tests of predictive analytics systems using open source, then switch to proprietary solutions for production. But increasingly, open source is playing a key role in production solutions because of the opportunities it offers companies to leverage all their data—structured and unstructured—and to test exciting new analytics concepts. This often leads to a mix of open and proprietary technologies, which allows you to pick the best solutions for different jobs, and combine them for optimal results.
One of the biggest barriers to adopting predictive analytics is estimating the value a proposed initiative will offer your business.
You know the value of your BI solution today because you can't live without it. But establishing the value of predictive analytics is harder. First, you must justify the upfront expense of building out a new infrastructure, hiring or growing the skills, and purchasing the analytics platforms or tools. You have to prove that the investment will provide your business with more than the rear-view-mirror perspective that traditional analytics has been giving you for the last decade.
The overarching first rule for ensuring value: engage the business. Predictive analytics doesn’t exist in a vacuum. You use it to solve business problems. Ask your business users to identify their pain points that can be solved with predictive analytics. Pick a challenge that they’ve been struggling to solve, but which has been beyond the capabilities of your current data sources and analytics systems. Or choose a new problem that they’ve never considered trying to solve before because the data sources were new, untested, or unstructured.
The second rule is to start small. Predictive analytics can be overwhelming. It is, after all, a highly complex area that changes every day. New solutions and new tools are always coming onto the market, particularly in the open source world, and it’s not clear how they all integrate with each other. There are also data security issues to consider.
Ask yourself some basic questions: What exactly am I trying to achieve for the business? How will predictive analytics give me more insight than traditional analytics? Is the data that I need available—and will it provide competitive business advantages? What’s the expected ROI?
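The ROI question can be framed as a simple back-of-envelope calculation before any detailed business case. The sketch below uses purely hypothetical cost and benefit figures (none come from the article) to show the shape of the estimate:

```python
# Hypothetical back-of-envelope ROI sketch for a predictive analytics
# pilot; every figure below is an illustrative assumption.
infrastructure_cost = 250_000  # servers, storage, network upgrades
staffing_cost       = 180_000  # data scientist / engineer time
software_cost       =  70_000  # platforms, tools, support

annual_benefit      = 400_000  # e.g. reduced downtime + inventory savings
years               = 3

total_cost = infrastructure_cost + staffing_cost + software_cost
total_benefit = annual_benefit * years
roi = (total_benefit - total_cost) / total_cost

print(f"Total cost:    ${total_cost:,}")
print(f"Total benefit: ${total_benefit:,}")
print(f"ROI over {years} years: {roi:.0%}")
```

Even a rough model like this forces the conversation with business stakeholders about which benefits are real and measurable, which is exactly the point of the questions above.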
Intel is driving innovation to help you thrive in predictive analytics. Intel® technologies span every aspect of infrastructure to help businesses harness predictive analytics for competitive advantage.
Specifically, Intel defines and drives the standards that underpin compute, network, and storage for the world. Intel innovations have been tested on and optimized for the broadest ecosystem of predictive analytics solutions in the industry and support a predictive analytics-capable infrastructure for a broad range of workloads, whether running on open source or proprietary platforms. Because the new predictive analytics platforms are all based on Intel architecture, you have the opportunity to do analytics everywhere, opening up possibilities for distributed analytics as part of every deployment.
When it comes to compute, Intel® processors cover the full range of predictive analytics needs. The portfolio extends beyond general-purpose Intel® Xeon® processors to critical ancillary technologies such as customizable FPGAs for acceleration of analytics workloads, memory, storage, Ethernet, and interconnect.
When modernizing your storage for predictive analytics, Intel technology delivers breakthrough application performance and faster time to insights. Designed to work seamlessly with Intel® processors, chipsets, firmware, software, and drivers, Intel storage solutions provide high performance at accessible price points.
Intel also helps with the networking aspects of predictive analytics. Delays in moving your data from your data lake or warehouse to your compute infrastructure can be costly in real-time operations. To help your organization avoid these delays, Intel provides one of the fastest fabrics available to speed predictive analytics workloads across network pipes.
And don’t forget security: Intel hardware and software security tools help protect access and safeguard data both at rest and in motion.
Click here to learn more about how Intel can support your organization’s predictive analytics strategy.