Each BIAnalytix Module is made up of five architectural components.
Architecture Components
The Extractor
The first step of the data flow is the installation of a source-system extractor purpose-built for each source transactional system. This is the initial step in liberating the data from the transactional systems for the data warehouse and the resulting analytics. Data is extracted from the transactional system at a very granular level and is moved into a standard, pre-defined staging schema. Each BIAnalytix Module's staging schema supports one class of source data (sales, traffic, web, audience research, accounting, etc.). This staging store normally runs on a separate server within the corporation's IT infrastructure which, for performance reasons, is typically located close to the source transactional system.
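The BIAnalytix staging schemas themselves are pre-defined by Decentrix, but the pattern they describe — a flat staging table that an extractor bulk-loads — can be sketched generically. The table and column names below are hypothetical illustrations, not the actual BIAnalytix schema:

```python
import sqlite3

# Hypothetical flat staging table for a "sales" class of source data;
# the real BIAnalytix staging schemas are pre-defined by Decentrix.
STAGING_DDL = """
CREATE TABLE IF NOT EXISTS stg_sales (
    source_system  TEXT,   -- which transactional system the row came from
    order_id       TEXT,
    order_date     TEXT,
    advertiser     TEXT,
    product        TEXT,
    gross_amount   REAL
)
"""

def load_batch(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Bulk-insert one extracted batch into the flat staging table."""
    conn.execute(STAGING_DDL)
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stands in for the staging server
load_batch(conn, [
    ("traffic_sys_a", "ORD-1001", "2024-03-01", "Acme Co", "Spot 30s", 1250.0),
])
```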
Media corporations use a variety of transactional systems from many different vendors across their businesses. Data can be extracted from some of these systems in real time or near real time (typically in the digital world), and from others only daily (typically linear media). The extraction process maps the data from the source system and loads each batch into the staging schema. For sources that can be uploaded in real time or near real time, the data can be processed quickly through the next stages of the process, giving users access to it during the day.
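One common way an extractor can serve both cadences is watermark-based incremental extraction: each run pulls only the rows changed since the previous run, whether that run happens daily or every few minutes. This is a minimal sketch of the idea; the `orders` table and its `updated_at` change-tracking column are hypothetical, not part of any specific source system:

```python
import sqlite3

def extract_incremental(source: sqlite3.Connection, last_watermark: str) -> list[tuple]:
    """Pull only the rows changed since the previous extraction run.

    A linear-media source might be extracted once per day; a digital source
    could be polled every few minutes for near-real-time loading.
    """
    return source.execute(
        "SELECT order_id, advertiser, gross_amount, updated_at "
        "FROM orders WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()

# Demo with an in-memory stand-in for the transactional system
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (order_id TEXT, advertiser TEXT, "
            "gross_amount REAL, updated_at TEXT)")
src.execute("INSERT INTO orders VALUES "
            "('ORD-1001', 'Acme Co', 1250.0, '2024-03-01T09:15:00')")
rows = extract_incremental(src, "2024-03-01T00:00:00")
new_watermark = rows[-1][-1] if rows else "2024-03-01T00:00:00"  # advance the watermark
```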
Because the staging schema is an open standard and essentially a flat file, the extractors are simple to implement and can be developed by the vendor of the source system, your internal IT department, or Decentrix. Decentrix has pre-built extractor programs for many specific source systems. A number of developed and tested extractors are also available as APIs from operational system vendors, particularly in the web and new-media space. With any source system, however, the process of mapping the extractions into the staging schemas is straightforward.
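In practice, much of a new extractor reduces to declaring how the source system's field names map onto the standard staging columns. The mapping below is a hypothetical sketch for one imaginary vendor's field names:

```python
# Hypothetical mapping from one vendor's field names to the staging columns.
FIELD_MAP = {
    "ContractNo":   "order_id",
    "AirDate":      "order_date",
    "ClientName":   "advertiser",
    "GrossDollars": "gross_amount",
}

def to_staging_row(source_record: dict) -> dict:
    """Rename source-system fields to the standard staging column names."""
    return {staging: source_record[src] for src, staging in FIELD_MAP.items()}

print(to_staging_row({
    "ContractNo": "ORD-1001",
    "AirDate": "2024-03-01",
    "ClientName": "Acme Co",
    "GrossDollars": 1250.0,
}))
```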
The Media Extract Transform and Load - METL™
Step 2 of the data flow is the Media Extract Transform and Load process. METL™ is the component of each BIAnalytix Module used to validate, normalize, cleanse, and transform the staging store data into the enterprise data warehouse's predefined media star schema (note that the media star schema must be installed first before the ETL process can be executed and tuned). This process may reside on dedicated servers, either in-house or co-located, or may reside in the cloud.
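The METL™ internals are proprietary, but the core transform pattern — reject invalid rows, resolve each dimension value to a surrogate key, and emit a fact row keyed by those surrogates — can be illustrated generically. All table and field names below are hypothetical:

```python
import sqlite3

def get_or_create_dim_key(conn: sqlite3.Connection, table: str, name: str) -> int:
    """Return the surrogate key for a dimension member, inserting it if new."""
    row = conn.execute(f"SELECT id FROM {table} WHERE name = ?", (name,)).fetchone()
    if row:
        return row[0]
    return conn.execute(f"INSERT INTO {table} (name) VALUES (?)", (name,)).lastrowid

def transform_staging_row(conn: sqlite3.Connection, stg: dict) -> None:
    """Validate, cleanse, and load one staging row into the star schema."""
    if stg["gross_amount"] is None or stg["gross_amount"] < 0:
        return  # cleansing step: reject invalid rows
    adv_key = get_or_create_dim_key(conn, "dim_advertiser", stg["advertiser"].strip())
    conn.execute(
        "INSERT INTO fact_sales (date_key, advertiser_key, gross_amount) "
        "VALUES (?, ?, ?)",
        (stg["order_date"], adv_key, stg["gross_amount"]),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_advertiser (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE fact_sales (date_key TEXT, advertiser_key INTEGER, "
             "gross_amount REAL)")
transform_staging_row(conn, {"order_date": "2024-03-01",
                             "advertiser": " Acme Co ",
                             "gross_amount": 1250.0})
```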
The Media Star Schema - MStar™
In Step 3 of the data flow, the enterprise media data warehouse star schema, called MStar™, accumulates the media dimension and fact records ingested through the ETL process from each source system. The MStar™ data warehouse is designed to be modular, allowing each new source system to have its own repository within a single enterprise BI model. The complex media-specific data relationships are constructed here and are purpose-built for the analytic needs of media decision makers. This process may reside on dedicated servers, either in-house or co-located, or may reside in the cloud.
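Generically, a star schema places additive measures in a central fact table surrounded by dimension tables; per-source-system fact tables can then share conformed dimensions across the enterprise model. The DDL below is a minimal, hypothetical illustration of that pattern, not the actual MStar™ model:

```python
import sqlite3

# Minimal illustrative star schema: one fact table per source-system module,
# sharing conformed dimension tables across the enterprise model.
STAR_DDL = [
    "CREATE TABLE dim_date (date_key TEXT PRIMARY KEY, year INTEGER, month INTEGER)",
    "CREATE TABLE dim_advertiser (advertiser_key INTEGER PRIMARY KEY, name TEXT)",
    "CREATE TABLE dim_channel (channel_key INTEGER PRIMARY KEY, name TEXT)",
    """CREATE TABLE fact_sales (
        date_key       TEXT    REFERENCES dim_date(date_key),
        advertiser_key INTEGER REFERENCES dim_advertiser(advertiser_key),
        channel_key    INTEGER REFERENCES dim_channel(channel_key),
        gross_amount   REAL
    )""",
]

conn = sqlite3.connect(":memory:")
for ddl in STAR_DDL:
    conn.execute(ddl)
```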
The Media Online Analytical Processing Cube - MCube™
Step 4 of the data flow is the Online Analytical Processing cube (or cubes), called MCube™, which houses the many millions of calculated totals for rapid access to the information by end users. This process may reside on dedicated servers, either in-house or co-located, or may reside in the cloud.
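The general idea behind an OLAP cube is that totals are pre-computed across combinations of dimensions, so a query becomes a lookup rather than a scan of fact records. This toy sketch shows the principle in plain Python; the real MCube™ would of course be a dedicated OLAP engine, and the dimensions here are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

def build_cube(fact_rows):
    """Pre-aggregate gross_amount over every subset of the dimensions.

    fact_rows: iterable of (month, channel, advertiser, gross_amount).
    Returns a dict mapping a dimension coordinate to its pre-computed total,
    so "total for (month=2024-03, channel=TV)" is a single lookup.
    """
    dims = ("month", "channel", "advertiser")
    cube = defaultdict(float)
    for month, channel, advertiser, amount in fact_rows:
        coords = dict(zip(dims, (month, channel, advertiser)))
        for r in range(len(dims) + 1):          # every roll-up level
            for subset in combinations(dims, r):
                key = tuple((d, coords[d]) for d in subset)
                cube[key] += amount
    return cube

cube = build_cube([
    ("2024-03", "TV",  "Acme Co", 1250.0),
    ("2024-03", "Web", "Acme Co",  300.0),
])
print(cube[(("month", "2024-03"), ("channel", "TV"))])  # 1250.0
print(cube[()])                                         # grand total: 1550.0
```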
The Media Portal - MPort™
The portal, MPort™, is the presentation layer that delivers the analysis to end users for their evidence-based decision making. The portal provides the authentication layer for users so that they can run enterprise-wide reports while only "seeing" the data slice they are approved to access. Each BIAnalytix Module includes a family of pre-built analysis samples on which an array of custom reporting can be built. Self-service BI is a critical aspect of this presentation layer, putting power in the hands of decision makers so that every analysis need does not have to go through IT and the potential bottlenecks of competing priorities. The presentation of dashboards and analysis reports can be implemented in an existing portal architecture within the corporation.
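The "run enterprise-wide reports but only see your approved slice" behavior is classically implemented as row-level security: every query a user runs is silently filtered by their entitlements. The sketch below illustrates the concept with hypothetical users and entitlements, not the actual MPort™ mechanism:

```python
import sqlite3

# Hypothetical entitlements: which sales channels each user may see.
USER_SLICES = {
    "alice": ("TV",),        # regional TV manager
    "bob":   ("TV", "Web"),  # enterprise analyst
}

def run_report(conn: sqlite3.Connection, user: str) -> list[tuple]:
    """Run the same enterprise report for any user, filtered to their slice."""
    channels = USER_SLICES.get(user, ())
    placeholders = ",".join("?" * len(channels))
    return conn.execute(
        f"SELECT channel, SUM(gross_amount) FROM fact_sales "
        f"WHERE channel IN ({placeholders}) GROUP BY channel",
        channels,
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (channel TEXT, gross_amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [("TV", 1250.0), ("Web", 300.0)])
print(run_report(conn, "alice"))  # [('TV', 1250.0)]
print(run_report(conn, "bob"))    # [('TV', 1250.0), ('Web', 300.0)]
```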