Rutgers Big Data Certificate Program

How to start using big data SLAs

3/26/2017

Big data leaders, when you're asking IT pros for service level agreements, there's no need to reinvent the wheel: start the process with three basic steps.
Many companies have moved past the experimental stage of big data and turned their attention to implementing big data and analytics processing in production; they're even making some of these applications mission critical. Moving these applications to mission-critical status requires them to be timely and readily accessible to the decision makers who need them.
Given these circumstances, the time has come for big data service level agreements (SLAs).
The purpose of an SLA is to guarantee business users get certain minimum performance and service levels on their IT investment. SLAs are most commonly used for transactional systems, such as the ability to process x millions of hotel reservation transactions an hour or a commitment to 24/7 computer uptime for an airline reservation system.
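To make the idea concrete, an SLA can be treated as data that measured performance is checked against. The sketch below is purely illustrative: the class name, thresholds, and transaction figures are invented for the example, not taken from any real agreement.

```python
from dataclasses import dataclass

@dataclass
class TransactionSla:
    """Illustrative minimum-service terms for a transactional system."""
    min_tx_per_hour: int    # e.g. hotel reservations processed per hour
    min_uptime_pct: float   # e.g. 99.9 as a stand-in for "24/7" uptime

def sla_met(sla: TransactionSla, tx_per_hour: int, uptime_pct: float) -> bool:
    """Return True when both measured figures meet the agreed minimums."""
    return tx_per_hour >= sla.min_tx_per_hour and uptime_pct >= sla.min_uptime_pct

sla = TransactionSla(min_tx_per_hour=2_000_000, min_uptime_pct=99.9)
print(sla_met(sla, tx_per_hour=2_500_000, uptime_pct=99.95))  # True: both minimums met
print(sla_met(sla, tx_per_hour=1_800_000, uptime_pct=99.95))  # False: throughput short
```

The same pattern — agreed minimums expressed as data, measurements checked against them — carries over directly to big data report delivery.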
Because big data and analytics have been largely experimental for organizations, users have yet to demand SLAs for big data from IT, and IT has not volunteered to offer them, either. It's time for this to change.
SEE: How to build a successful data scientist career (free PDF) (TechRepublic)
First, let's look at the big data user side.
If users are utilizing analytics reports and they expect IT to deliver these reports in a timely way to achieve business impact, requirements have to be defined for report delivery. In some cases, such as the Internet of Things (IoT), users will want real-time status reporting with to-the-minute alerts that are actionable. In other cases, it might be sufficient to get analytics reporting on a daily, weekly, monthly, quarterly, or yearly cycle.
A second area of user concern is the time to market for new big data applications that they want for the business. Users want these applications as quickly as possible so they can start getting business value from them.

Now, let's look at the services on the IT operational side that must perform well enough to meet business users' needs.
As new big data applications are developed, the underlying technical goals have to be 1) speeding up the time it takes to develop, debug, and place new applications into production; and 2) speeding up system efficiencies and processing so that more developers can use development resources concurrently.
On the systems side, this could translate into SLAs for system performance, the ability to handle a specific number of application development users concurrently, or tools that reduce the time it takes to debug applications through the automation they offer.
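As an illustrative sketch of a system-side concurrency target (the ceiling of 40 developers is an invented SLA figure), admission to the development cluster can be gated against the agreed capacity:

```python
# Hypothetical SLA term: the dev cluster must support 40 concurrent developers.
SLA_CONCURRENT_DEVELOPERS = 40

def admit_session(active_sessions: int) -> bool:
    """Allow a new development session only while SLA headroom remains."""
    return active_sessions < SLA_CONCURRENT_DEVELOPERS

print(admit_session(12))  # True: plenty of headroom
print(admit_session(40))  # False: at the ceiling, refuse rather than degrade everyone
```

Refusing a session at the ceiling is one design choice; queuing it is another. Either way, the SLA number is what makes the behavior testable.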
On the network side, there might be some quality of service (QoS) minimum service levels that must be met in order to facilitate a given level of concurrent big data development and testing activity.
Finally, there are the big data deliverables. For the analytics reports that must get into the hands of users instantaneously, system uptime must be guaranteed. For the analytics reports that are delivered daily, weekly, monthly, quarterly, and annually, analytics batch jobs must be written, scheduled, and monitored to ensure that all delivery targets are met.
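A minimal sketch of the monitoring side of those batch targets — the report names and deadlines below are hypothetical — compares each job's actual finish time against its agreed deadline and flags misses:

```python
from datetime import time

# Hypothetical deliverable targets: each report must finish by its deadline.
DEADLINES = {
    "daily_sales_report": time(6, 0),    # ready before the 6:00 a.m. business day
    "weekly_churn_report": time(7, 30),
}

def missed_deadlines(finish_times: dict) -> list:
    """Return the jobs that finished after their agreed deadline."""
    return [job for job, finished in finish_times.items()
            if finished > DEADLINES[job]]

print(missed_deadlines({
    "daily_sales_report": time(5, 45),   # on time
    "weekly_churn_report": time(8, 10),  # late: an SLA miss to report
}))  # ['weekly_churn_report']
```

In production this check would run against the job scheduler's completion log, but the SLA logic itself stays this simple.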
SEE: Quick glossary: Big data (Tech Pro Research)
How to start using big data SLAs

Few companies have well-orchestrated application and report delivery SLAs on the analytics side that can match what they have in transactional IT, so what is the best way to get started? This three-step process should help.
First, borrow a page from the guarantees that you provide business users for transactional reporting. You should sit down with users and determine which analytics they need in real time (with system uptime performance guarantees), and which analytics reports they need on the batch side (i.e., daily, weekly, monthly, quarterly, annual reports) so you can schedule the production of these reports and plan big data computing resources to produce them in the required timeframes.
Second, review your big data processing and application development approach. Most big data and analytics systems are highly distributed; they don't have all of the storage and processing "under one hood," like a mainframe or a single monolithic server that processes incoming hotel or airline reservations. Instead, big data and analytics systems feature multiple servers that all run in parallel. It is more challenging to manage workload execution in an IT environment that is spread across multiple servers instead of just one.
Third, don't forget about the network. It doesn't matter how well you tune your systems or organize your workloads if your internal network can't provide the bandwidth needed to support big data processing and data transport.
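The network check is back-of-the-envelope arithmetic. With invented figures — say a nightly load of 10 TB that must move within a 4-hour batch window — the required sustained bandwidth falls out directly, and the network SLA has to clear it:

```python
def required_gbps(data_tb: float, window_hours: float) -> float:
    """Minimum sustained bandwidth (Gbit/s) to move data_tb within the window."""
    bits = data_tb * 8 * 1000**4          # terabytes -> bits (decimal units)
    return bits / (window_hours * 3600) / 1e9

# Hypothetical nightly load: 10 TB in a 4-hour batch window.
need = required_gbps(10, 4)
print(f"{need:.2f} Gbit/s needed")        # about 5.56 Gbit/s sustained
print(need <= 10.0)                        # True: fits under a 10 Gbit/s guarantee
```

Real links rarely sustain their rated speed, so a prudent SLA would add headroom on top of this floor.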
The good news

You don't need to reinvent the wheel when you define metrics, service, and deliverable targets for big data analytics because you can borrow the mechanics from the transactional side of your IT work. For many companies, it's simply a matter of looking at the service level processes already in place for their transactional applications, applying those processes to big data, and making the adjustments needed to address unique elements of the big data environment, such as parallel processing and the handling of many more types and forms of data.

Source:  ​http://www.techrepublic.com/article/how-to-start-using-big-data-slas/
