Saturday, June 17, 2017

CSI: DB2 Historical Data Forensics On Demand for Audit Defense

Imagine that the IRS audits you for claiming a large business loss after your online retail business faced unforeseen competition. You try to recall the details of all your business expenses, such as the times you used your car and home for business purposes. You wish you had kept log records of all your business activities, neatly organized and indexed on your computer for quick analysis. Instead, you attempt to cobble the details together into some semblance of proof. Every detail that you cannot prove costs you money.

Now imagine you are the Risk Officer at a $30 billion-a-year enterprise that handles some of the most sensitive transactional data in the world: Social Security numbers, medical records and lab results, credit card numbers, account balances. Changes to this data are under constant scrutiny by regulatory bodies in each industry sector, and many organizations devote significant financial and technical resources to risk management. For example, internal governance rules may require housing 20+ years of historical records in case of a lawsuit. Audits related to government regulations (HIPAA, SEC Rule 17a-4) may require not only retention of historical data but also a view of every change to that data. To meet these requirements, organizations may:
  • Transform all transactional ‘Update’ operations into ‘Insert’ and ‘Delete’ pairs to retain before-and-after images of records. 
  • Employ procedural code (e.g., triggers) to keep track of changes (a sketch of this approach follows this list). 
  • Create copies of the historical data on external systems, which may increase the risk of a data breach and lead to additional costs for data copying, transformation, storage, and maintenance.   
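
To make the trigger-based approach concrete, here is a minimal sketch in DB2 SQL. The ACCOUNT and ACCOUNT_AUDIT tables and their columns are hypothetical, invented purely for illustration; the point is that every tracked table needs its own hand-written audit table and trigger, maintained by the application team.

    -- Hypothetical audit table that holds before-images of updated rows
    CREATE TABLE ACCOUNT_AUDIT (
      ACCT_ID     INTEGER        NOT NULL,
      BALANCE     DECIMAL(15,2),
      CHANGED_AT  TIMESTAMP      NOT NULL
    );

    -- Hand-written trigger that captures the old row on every UPDATE
    CREATE TRIGGER ACCOUNT_UPD_AUD
      AFTER UPDATE ON ACCOUNT
      REFERENCING OLD AS O
      FOR EACH ROW MODE DB2SQL
      INSERT INTO ACCOUNT_AUDIT (ACCT_ID, BALANCE, CHANGED_AT)
      VALUES (O.ACCT_ID, O.BALANCE, CURRENT TIMESTAMP);

Every additional audited table or column means more procedural code to write, test, and keep in sync, which is exactly the overhead described above.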

Performing these tasks may add millions of dollars in yearly costs: additional transactions, additional procedural computing, increased storage, copying data to external environments, and so on. But what if there were a way to:
  • Keep an entire history of changes to the data without manually changing the transactions themselves (i.e. without requiring code to transform updates into insert/delete pairs)
  • Automatically maintain beginning and end timestamps for each row of data where the timestamps indicate the “life” of the data (i.e. without requiring procedural code) 
  • Access and analyze this data via the transactional systems (without impacting resources on these transactional systems)
  • Create a snapshot of the data as it existed at any point in time, or over any range(s) of time, with massive parallelism (without creating separate data connections and credentials)
All of these "data forensic" enabling features are made possible on System z through two technologies. The first is a capability within DB2 for z/OS called temporal tables. The second is the IBM DB2 Analytics Accelerator (the Accelerator). Please see the forthcoming paper for details on using temporal tables and the Accelerator for 'Historical Data Forensic' capabilities On Demand!
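
To show how the temporal-table half of this works, here is a minimal sketch in DB2 for z/OS SQL. The ACCOUNT and ACCOUNT_HIST tables and their columns are hypothetical; the mechanics (the SYSTEM_TIME period, the generated row-begin/row-end columns, and the FOR SYSTEM_TIME clause) are the standard system-period temporal-table features of DB2 for z/OS.

    -- Base table with system-period timestamp columns that DB2
    -- maintains automatically (no procedural code required)
    CREATE TABLE ACCOUNT (
      ACCT_ID    INTEGER        NOT NULL,
      BALANCE    DECIMAL(15,2),
      SYS_START  TIMESTAMP(12)  NOT NULL GENERATED ALWAYS AS ROW BEGIN,
      SYS_END    TIMESTAMP(12)  NOT NULL GENERATED ALWAYS AS ROW END,
      TRANS_ID   TIMESTAMP(12)  GENERATED ALWAYS AS TRANSACTION START ID,
      PERIOD SYSTEM_TIME (SYS_START, SYS_END)
    );

    -- History table with the same structure to hold before-images
    CREATE TABLE ACCOUNT_HIST LIKE ACCOUNT;

    -- From here on, DB2 writes the prior version of every updated or
    -- deleted row to ACCOUNT_HIST automatically
    ALTER TABLE ACCOUNT ADD VERSIONING USE HISTORY TABLE ACCOUNT_HIST;

    -- Snapshot of the data as it existed at a single point in time
    SELECT ACCT_ID, BALANCE
      FROM ACCOUNT
           FOR SYSTEM_TIME AS OF TIMESTAMP('2016-12-31-23.59.59')
     WHERE ACCT_ID = 1001;

    -- Every version of a row that was current at any time during 2016
    SELECT ACCT_ID, BALANCE, SYS_START, SYS_END
      FROM ACCOUNT
           FOR SYSTEM_TIME FROM TIMESTAMP('2016-01-01-00.00.00')
                           TO   TIMESTAMP('2017-01-01-00.00.00')
     WHERE ACCT_ID = 1001;

With versioning enabled, application INSERT, UPDATE and DELETE statements stay exactly as they are; DB2 itself preserves the before-images and the timestamps that bound the "life" of each row.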

Thursday, June 15, 2017

New opportunities to drive analytics value into business operations: IBM DB2 Analytics Accelerator

Today, many System z clients are using the IBM DB2 Analytics Accelerator (the Accelerator) to help their organizations gain even greater insight and value from their data. Organizations can offload data-intensive and complex DB2 for z/OS queries to the Accelerator in order to support data warehousing, business intelligence and analytic workloads. The Accelerator executes these queries quickly, without requiring CPU utilization by DB2 for z/OS. The Accelerator is a logical extension of DB2 for z/OS, so DB2 manages and regulates all access to the Accelerator. DB2 for z/OS directly processes relevant workloads, such as OLTP queries and operational analytics. Queries that run more efficiently in a massively parallel processing (MPP) environment are seamlessly rerouted by DB2 for z/OS to the Accelerator. There is one set of credentials that is governed by RACF security, and all access flows through DB2 for z/OS. Users often first see the business value of the Accelerator in handling long-running queries, but many are also finding that the Accelerator can drive cost savings in areas such as administration, storage and consolidation as well as delivering real-time analytics.
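
As a small illustration of how that routing is typically influenced from the application side (the SALES_HISTORY table and its columns are hypothetical), a session can set the CURRENT QUERY ACCELERATION special register so that DB2 for z/OS considers sending eligible queries to the Accelerator, while all access still flows through DB2 and the existing RACF credentials:

    -- Ask DB2 for z/OS to route eligible queries to the Accelerator,
    -- falling back to native DB2 execution if acceleration fails
    SET CURRENT QUERY ACCELERATION = ENABLE WITH FAILBACK;

    -- A long-running analytic query (hypothetical table) that DB2 may
    -- now offload transparently to the Accelerator's MPP engine
    SELECT REGION, SUM(SALE_AMT) AS TOTAL_SALES
      FROM SALES_HISTORY
     GROUP BY REGION
     ORDER BY TOTAL_SALES DESC;

The application keeps its single DB2 connection and credentials; whether a given query runs natively in DB2 or on the Accelerator is decided by DB2 based on eligibility and cost.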

This white paper discusses how organizations can improve analytic insight with the IBM DB2 Analytics Accelerator. It offers guidance to help organizations more quickly uncover new opportunity areas where the Accelerator can have the greatest impact. The paper covers topic areas including:

    •    Accessing enterprise data in place
    •    Gaining advocates from IT, application teams and Lines of Business
    •    Uncovering and expanding opportunities for the DB2 Analytics Accelerator
    •    Measuring the business value of the DB2 Analytics Accelerator
    •    Case studies
    •    The potential for the DB2 Analytics Accelerator to provide even greater ROI