Saturday, June 17, 2017

CSI: DB2 Historical Data Forensics On Demand for Audit Defense

Imagine that you get audited by the IRS for claiming a large business loss after your online retail business faced unforeseen competition. You try to recall the details of all your business expenses, such as the times you used your car and home for business purposes. You wish you had kept log records of all your business activities neatly organized and indexed on your computer for quick analysis. Instead, you attempt to cobble the details together to put forth some semblance of proof. Every detail that you cannot prove costs you money.

Now imagine you are the Risk Officer at a $30 billion-a-year enterprise that services some of the most sensitive transactional data in the world: Social Security numbers, medical records and lab results, credit card numbers, account balances. Changes to this data are under constant scrutiny by regulatory bodies in each industry sector. Many organizations devote significant financial and technical resources to risk management. For example, internal governance rules may require housing 20+ years of historical records in case of a lawsuit. Audits related to government regulations (HIPAA, SEC Rule 17a-4) may require not only maintenance of historical data, but also a view of all data changes. To do this, organizations may:
  • Transform all transactional ‘Update’ operations into ‘Insert’ and ‘Delete’ pairs to retain before-and-after images of records. 
  • Employ procedural code (e.g., triggers) to keep track of changes (a sketch of this approach follows this list). 
  • Create copies of the historical data on external systems, which may increase the risk of a data breach and add costs for data copying, transformation, storage, and maintenance.
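As a rough illustration of the trigger-based item above, the sketch below creates a hand-maintained history table and an AFTER UPDATE trigger that copies the before-image of every changed row. The ACCOUNT and ACCOUNT_HIST names, their columns, and the connection string are hypothetical, and the Python ibm_db driver is used only as one convenient way to submit the DDL.

```python
# Sketch only: the "do it yourself" audit trail that temporal tables replace.
# Table names, columns, and connection details are hypothetical.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=DSNDB;HOSTNAME=zhost.example.com;PORT=446;PROTOCOL=TCPIP;"
    "UID=audituser;PWD=secret;", "", "")

history_ddl = """
CREATE TABLE ACCOUNT_HIST
  (ACCT_ID     INTEGER       NOT NULL,
   BALANCE     DECIMAL(15,2) NOT NULL,
   CHANGED_AT  TIMESTAMP     NOT NULL,
   CHANGED_BY  VARCHAR(128)  NOT NULL)
"""

# Row trigger that writes the pre-update image of each row to the history table.
trigger_ddl = """
CREATE TRIGGER ACCOUNT_AUDIT
  AFTER UPDATE ON ACCOUNT
  REFERENCING OLD AS O
  FOR EACH ROW MODE DB2SQL
  INSERT INTO ACCOUNT_HIST (ACCT_ID, BALANCE, CHANGED_AT, CHANGED_BY)
  VALUES (O.ACCT_ID, O.BALANCE, CURRENT TIMESTAMP, CURRENT SQLID)
"""

for ddl in (history_ddl, trigger_ddl):
    ibm_db.exec_immediate(conn, ddl)
```

Every audited table needs its own history table and trigger, and the trigger fires on every update; that ongoing procedural cost is exactly what the next paragraph prices out.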

Performing these tasks can cost millions of dollars per year in additional transactions, additional procedural computing, increased storage, copying data to external environments, and so on. But what if there were a way to:
  • Keep an entire history of changes to the data without manually changing the transactions themselves (i.e. without requiring code to transform updates into insert/delete pairs)
  • Automatically maintain beginning and end timestamps for each row of data where the timestamps indicate the “life” of the data (i.e. without requiring procedural code) 
  • Access and analyze this data via the transactional systems (without impacting resources on these transactional systems)
  • Create a snapshot of the data as it existed at any point in time or range(s) of time, with massive parallelism (without creating separate data connections and credentials)
All of these "data forensic" enabling features are made possible on System z through two technologies. The first is a capability within DB2 for z/OS called temporal tables. The second is the IBM DB2 Analytics Accelerator (the Accelerator). Please see the forthcoming paper for details on using temporal tables and the Accelerator for 'historical data forensics' on demand!
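For orientation, here is a minimal sketch of what the temporal-table approach can look like, using a hypothetical POLICY table and the Python ibm_db driver purely as a convenient way to submit the statements (the forthcoming paper covers the real configuration details). With system-period versioning, DB2 generates the row-begin and row-end timestamps itself and moves superseded row versions into the associated history table, so applications keep issuing ordinary UPDATEs while 'as of' queries reconstruct the data as it existed at any point in time.

```python
# Sketch only: table, column, and connection details below are hypothetical.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=DSNDB;HOSTNAME=zhost.example.com;PORT=446;PROTOCOL=TCPIP;"
    "UID=audituser;PWD=secret;", "", "")

setup = [
    # Base table: DB2 maintains SYS_START/SYS_END for the SYSTEM_TIME period.
    """CREATE TABLE POLICY
         (POLICY_ID  INTEGER       NOT NULL,
          COVERAGE   DECIMAL(15,2) NOT NULL,
          SYS_START  TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW BEGIN,
          SYS_END    TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW END,
          TRANS_ID   TIMESTAMP(12) GENERATED ALWAYS AS TRANSACTION START ID,
          PERIOD SYSTEM_TIME (SYS_START, SYS_END))""",
    # History table: same columns, without the generated attributes.
    """CREATE TABLE POLICY_HIST
         (POLICY_ID  INTEGER       NOT NULL,
          COVERAGE   DECIMAL(15,2) NOT NULL,
          SYS_START  TIMESTAMP(12) NOT NULL,
          SYS_END    TIMESTAMP(12) NOT NULL,
          TRANS_ID   TIMESTAMP(12))""",
    # Switch on automatic versioning; no triggers or insert/delete pairs needed.
    "ALTER TABLE POLICY ADD VERSIONING USE HISTORY TABLE POLICY_HIST",
]
for stmt in setup:
    ibm_db.exec_immediate(conn, stmt)

# Forensic view: the data exactly as it existed at a chosen point in time.
as_of = ibm_db.exec_immediate(
    conn,
    "SELECT POLICY_ID, COVERAGE FROM POLICY "
    "FOR SYSTEM_TIME AS OF TIMESTAMP('2016-06-30-00.00.00')")
row = ibm_db.fetch_assoc(as_of)
while row:
    print(row)
    row = ibm_db.fetch_assoc(as_of)
```

A FOR SYSTEM_TIME FROM ... TO ... clause can similarly return every version of a row that was current during a time range, and, per the post above, the paper discusses pairing these history-bearing tables with the Accelerator so the forensic queries run with massive parallelism without an external copy of the data.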

Thursday, June 15, 2017

New opportunities to drive analytics value into business operations: IBM DB2 Analytics Accelerator

Today, many System z clients are using the IBM DB2 Analytics Accelerator (the Accelerator) to help their organizations gain even greater insight and value from their data. Organizations can offload data-intensive and complex DB2 for z/OS queries to the Accelerator in order to support data warehousing, business intelligence and analytic workloads. The Accelerator executes these queries quickly, without requiring CPU utilization by DB2 for z/OS. The Accelerator is a logical extension of DB2 for z/OS, so DB2 manages and regulates all access to the Accelerator. DB2 for z/OS directly processes relevant workloads, such as OLTP queries and operational analytics. Queries that run more efficiently in a massively parallel processing (MPP) environment are seamlessly rerouted by DB2 for z/OS to the Accelerator. There is one set of credentials that is governed by RACF security, and all access flows through DB2 for z/OS. Users often first see the business value of the Accelerator in handling long-running queries, but many are also finding that the Accelerator can drive cost savings in areas such as administration, storage and consolidation as well as delivering real-time analytics.
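As a concrete illustration of the rerouting behavior described above, the sketch below shows one way a reporting application can make its work eligible for offload. The CURRENT QUERY ACCELERATION special register tells DB2 for z/OS whether a dynamic query may be sent to the Accelerator, and DB2 then decides where the query actually runs. The connection string and the SALES_FACT query are hypothetical placeholders.

```python
# Sketch only: connection details and the SALES_FACT schema are hypothetical.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=DSNDB;HOSTNAME=zhost.example.com;PORT=446;PROTOCOL=TCPIP;"
    "UID=rptuser;PWD=secret;", "", "")

# Allow acceleration for this session; fall back to native DB2 for z/OS
# processing if the Accelerator cannot run a given query.
ibm_db.exec_immediate(
    conn, "SET CURRENT QUERY ACCELERATION = ENABLE WITH FAILBACK")

# A long-running analytic query: DB2 for z/OS evaluates it and, given the
# register above, routes it to the Accelerator when that is the better fit.
stmt = ibm_db.exec_immediate(conn, """
    SELECT STORE_ID, SUM(SALE_AMT) AS TOTAL_SALES
    FROM   SALES_FACT
    WHERE  SALE_DATE BETWEEN '2016-01-01' AND '2016-12-31'
    GROUP BY STORE_ID
    ORDER BY TOTAL_SALES DESC
""")
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)
    row = ibm_db.fetch_assoc(stmt)
```

Because everything still flows through the DB2 for z/OS connection, the application keeps a single set of RACF-governed credentials whether a query runs natively or on the Accelerator.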

This white paper discusses how organizations can improve analytic insight with the IBM DB2 Analytics Accelerator. It offers guidance to help organizations more quickly uncover new opportunity areas where the Accelerator can have the greatest impact. The paper covers topic areas including:

    •    Accessing enterprise data in place
    •    Gaining advocates from IT, application teams and Lines of Business
    •    Uncovering and expanding opportunities for the DB2 Analytics Accelerator
    •    Measuring the business value of the DB2 Analytics Accelerator
    •    Case studies
    •    The potential for the DB2 Analytics Accelerator to provide even greater ROI

Monday, September 14, 2015

z Analytics Business Value Validation Methodology

Are you considering an investment in a z Systems analytics solution? How will you evaluate the Return on Investment (ROI) that will be realized using this solution? Does the measurement of 'return' align with your business objectives? The z Systems 'Business Value Validation Workshop' offered by IBM will validate, both technically and financially, whether and how a z Systems-centric solution can help you meet your key business objectives. Typical business objectives include cost savings, cost avoidance, new customer value, customer satisfaction, reduced liability, and increased security.

For example, suppose a fictional company, 'Acme Systec', is focused on reducing costs and reducing data sprawl. The assessment would explore the savings, efficiencies, and new value that can be gained by reducing data sprawl within Acme Systec's IT infrastructure, through use-case definition, requirement gathering, technical validation, and a cost-benefit analysis. The workshop would focus on Acme Systec's specific environment and business requirements, forging a partnership between the application teams, infrastructure teams, and key decision makers. The application teams provide relevant insight into use cases and business usage, while the infrastructure teams provide insight into current costs and technical configurations. The workshop recommendations provide a holistic approach to both technical architecture improvement and financial cost reduction.

The following link contains a sample offering focused on determining the cost savings that can be realized through DB2 for z/OS plus the IBM DB2 Analytics Accelerator: IDAA Cost Benefit Analysis Link. For more information about the Business Value Validation Methodology for z Systems, please contact your local IBM z Systems sales specialist.

-Shantan

Friday, September 11, 2015

Could your analytics strategy cost your business USD 100 million?

How new technologies can help protect your analytics data and your bottom line


Technology trends and forces such as cloud, mobile and big data can represent big opportunities to bring analytic insight to the enterprise. They can also represent big risks if proper data security and governance controls are not in place. In 2015, one of the largest health benefits companies in the United States reported that its systems were the target of a massive data breach. This exposed millions of records containing sensitive consumer information such as Social Security numbers, medical IDs and income information. Various sources, including The Insurance Insider, suggest that this company's USD 100 million cyber-insurance policy would be depleted by the costs of notifying consumers of the breach and providing credit monitoring services, and that does not include other significant costs associated with a breach such as lost business, regulatory fines and lawsuits.

Data is now so important that it has a value on the balance sheet. Cyber criminals know this. Without exception, every industry has been under attack and has suffered data breaches: healthcare, government, banking, insurance, retail, telco. Once a company has been breached, hackers focus on other companies in that same industry to exploit similar vulnerabilities. In 2015, the average cost of a data breach was USD 3.79 million, causing long-term damage to the brand, loss of customer trust and customer churn.

As you think about the impacts of this and other data security breaches occurring at organizations worldwide, consider this question: how exposed is your business to a similar type of breach? To answer it, you must first ask, “Where does the data that feeds our analytics processes originate?”

See my full paper here

-Shantan