How to Cool the Big Data Effect

Big Data was a much-discussed concept across IT management during 2012. In fact, the effect of big data and its non-stop, massive growth has been a very real challenge for communications service provider (CSP) data infrastructure and technology teams for quite some time. Global tier 1 service providers have been processing billions of call data records per day for several years, to the point where custom database technologies are now developed and modified every year. Petabyte-scale database systems holding years of Call Data Records (CDRs) and Internet Protocol Detail Records (IPDRs) are now considered the norm at many tier 1 providers. When you add the increasingly stringent compliance regulations being imposed, the Big Data effect becomes even more complex and challenging.

A single global provider now faces a situation where its data centers and database solutions must proactively configure, manage and retain specific customer records for a designated time period, while each country's regulations impose different timeframes, requiring different data types, in different formats, as well as specific response times for accessing that data. In the past year alone, Japan and India have enforced government-led initiatives on providers regardless of size and subscriber level. Fulfilling lawful intercept requires substantial investment by providers in storage, database and search technologies and, of course, trained resources with the right skill set to deploy such technology solutions. Lawful intercept involves both real-time monitoring and access to xDRs: for example, law enforcement will obtain a judicial warrant to tap the landlines, cell phones and email accounts of suspects, as well as to receive copies of their call, SMS/MMS and other communications records. Having this data readily available in the correct format and delivered in a timely manner creates ongoing pressure to run the data center efficiently and cost-effectively, not to mention the risk of legal fines.

Today, CSPs must think innovatively, not only by harnessing interoperability with global partners, providers and suppliers, but also by using technology, network architecture and standardization to achieve scale and better IT economics.

A recent Gartner industry report, Predicts 2011: CSPs Must Rethink Business Paradigms to Meet Market Challenges, noted that in developed markets the margins of CSPs in the mobile, fixed and combined services sectors are falling, with very few exceptions. This is due to Internet Protocol (IP) substitution, lower average revenue per user (ARPU) from new customers, increasing competition, regulators gaining a better view of noncompetitive price elements, and the need to invest in expensive broadband networks with long payback cycles. Improving the customer experience and service level is absolutely necessary to stay ahead, and partnerships are formed every day to deliver new applications and mobile capabilities that drive new revenue channels. However, all the front-end, customer-facing products and services that are rolled out will only be possible, and sustainable, if the back-office and data infrastructure can be managed and scaled over time.

Let’s consider the data problem for today’s IT team. Examine more closely the lifecycle of a single customer’s communication “transactions” in, say, a given month or billing cycle. The core data (essentially Call Data Records and Internet Protocol Detail Records) that is captured, collated and stored is immediately historical in nature: once the “transaction” takes place, it undergoes no further insert, update or delete. This is quite different from a traditional transaction in an online transaction processing (OLTP) database system, such as a retail purchase.

The customer communication data is then moved through a series of applications to enable various IT and business functions such as network capacity planning, traffic analysis, user trending, revenue assurance, billing, digital advertising and more. Some of these OSS and BSS applications leverage the same database system; in other cases, the data is moved around with varying degrees of latency. Data subsets are formed depending on application function and requirements, and data warehouses are fed with customer activity in order to perform trend analysis and determine the overall value of a customer at any point in time. Many CSPs today retain much of this data in a traditional database such as Oracle, yet for many functions across the lifecycle of the transaction the data never changes and can therefore be stored in a database whose primary purpose is long-term retention.
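The write-once, read-many nature of this data can be sketched with a minimal example. The table and field names below are illustrative only, not taken from any particular CSP schema, and SQLite stands in for whatever retention store a provider actually runs:

```python
import sqlite3

# In-memory database standing in for a long-term retention store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cdr (
        call_id     TEXT PRIMARY KEY,
        caller      TEXT,
        callee      TEXT,
        started_at  TEXT,
        duration_s  INTEGER
    )
""")

# CDRs are inserted once, as calls complete; unlike OLTP rows,
# they are never updated or deleted afterwards.
records = [
    ("c1", "+14155550100", "+14155550101", "2012-06-01T09:00:00", 120),
    ("c2", "+14155550100", "+14155550102", "2012-06-01T09:05:00", 45),
]
conn.executemany("INSERT INTO cdr VALUES (?, ?, ?, ?, ?)", records)

# Downstream functions (billing, traffic analysis, lawful intercept)
# read the same immutable rows with ordinary SQL.
total = conn.execute(
    "SELECT SUM(duration_s) FROM cdr WHERE caller = ?",
    ("+14155550100",),
).fetchone()[0]
print(total)  # 165
```

Because no application in the chain ever rewrites these rows, the store can be optimized purely for ingest rate, compression and query, which is exactly the trade-off the retention databases discussed below make.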
Given the growth in data volume and complexity across diverse systems and applications, organizations are pushed to retain data for as little time as possible, largely because systems have bloated to the point where retention negatively impacts performance. For a large provider, even the critical function of replicating data from a complex billing application to provide continuous uptime can pose significant technology challenges due to the sheer volume of the data sets. However, new database technologies are now coming to the forefront to solve this problem directly. What is now available, and being deployed by providers, is a specialized database with the ability to significantly de-duplicate and reduce this data to a much smaller footprint.
When you consider a 100-terabyte system reduced to a storage footprint of 5 terabytes, a 20:1 reduction, you can begin to imagine the cost savings not only in storage but also in hardware capacity. Data in this specialized database, built for the purpose of keeping data online for long periods (i.e., years), can be queried and retrieved using standard SQL or various business intelligence and analytics solutions. With a database solution focused on big data retention that enables significant cost reduction, CSPs can store critical customer activity data for much longer timeframes, beyond what regulators require. By having access to larger historical data sets, CSPs can gain greater insight into customer behavior patterns and even discover new opportunities to further segment the market and create new services with a wider range of price points.
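As a rough illustration of why repetitive, machine-generated records reduce so well, the sketch below dictionary-encodes the highly repeated fields of a batch of synthetic CDR-like rows. The data, the two-byte reference size and the resulting ratio are all invented for the example; the 20:1 figure above comes from purpose-built databases combining several such techniques, and this toy shows only the principle:

```python
# Synthetic CDR-like rows: a handful of cell towers and rate plans
# repeat across thousands of records, which is what field-level
# dictionary encoding exploits.
rows = [("tower-%d" % (i % 5), "plan-%d" % (i % 3), i) for i in range(10_000)]

# Raw footprint: every string value stored verbatim in every row.
raw_values = sum(len(tower) + len(plan) for tower, plan, _ in rows)

# Dictionary-encoded footprint: each distinct string stored once,
# rows keep only small integer references (assume 2 bytes each here).
towers = {t for t, _, _ in rows}
plans = {p for _, p, _ in rows}
dictionary = sum(len(s) for s in towers | plans)
references = len(rows) * 2 * 2  # two encoded columns, 2 bytes per reference
encoded = dictionary + references

print(raw_values, encoded, round(raw_values / encoded, 1))
```

Even this naive single-pass encoding shrinks the repeated columns severalfold; real retention systems add column-level compression and pattern de-duplication on top, which is where the much larger reductions come from.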

In order to gain competitive advantage in ever-changing markets, CSPs need to look inward as much as outward and discover new technologies and new approaches to managing their “core”: essentially, the customer and transaction data. By applying new data infrastructure capabilities to achieve greater cost efficiencies, CSPs can discover new service offerings and even new markets. The ocean of Big Data faced by today’s CSP does not have to reach boiling point: applying specialized database capabilities across the infrastructure will significantly reduce cost, manage scale and complexity, and will certainly have a cooling effect.

© Copyright 2012. rootshellinc.com. All Rights Reserved