I am trying to get a better understanding of how the Analytics Database itself is populated. There is a staging table and a target table for each dimension and fact (excluding aggregates).
Source to Stage
From what I have been able to determine about the source-to-stage ETL processing, the Java application appears to have an AnalyticsListener configured.
Question: Is the listener tracking all changes as they occur in real time in the application and writing them to the staging tables?
Question: Is the real-time data capture process above the only way the staging tables are populated, or does the ETL process ever work off of timestamps in the application database to retrieve changes?
Question: How often are the staging tables truncated?
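To make sure I'm describing the same mechanism, here is a minimal sketch of how I picture the listener feeding the staging tables. Everything here (class, method, and table names, using a Python dict in place of SQL INSERTs) is my assumption for illustration, not the actual AnalyticsListener API.

```python
# Hypothetical sketch of a change listener appending rows to staging tables.
# All names here are illustrative assumptions, not the real API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class ChangeEvent:
    entity: str              # e.g. "defect", "user_story"
    payload: dict[str, Any]  # changed fields from the application transaction
    ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AnalyticsListenerSketch:
    """Receives change events as they happen and appends them to a
    per-entity staging 'table' (a list here, standing in for SQL INSERTs)."""
    def __init__(self) -> None:
        self.staging: dict[str, list[dict]] = {}

    def on_change(self, event: ChangeEvent) -> None:
        rows = self.staging.setdefault(f"stg_{event.entity}", [])
        rows.append({**event.payload, "event_ts": event.ts})

listener = AnalyticsListenerSketch()
listener.on_change(ChangeEvent("defect", {"id": 101, "status": "Open"}))
listener.on_change(ChangeEvent("defect", {"id": 101, "status": "Closed"}))
print(len(listener.staging["stg_defect"]))  # two staged rows, one per change
```

If this is roughly right, each application change lands in staging as its own row, which is why I'm asking how often those tables are truncated.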
Stage to Data Warehouse
From some of the documentation I've read, it appears that the push from the staging tables to the target tables works by reading the staging rows and performing an upsert: any row that already exists in the target is updated with the new information (and its modification_ts is refreshed); if the record doesn't exist in the target, the entire staged record is inserted as new.
Question: Is this generally correct? And how are deletes handled?
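The upsert behavior I'm describing can be sketched like this. The column names (a natural key "id" and a "modification_ts" column) are assumptions on my part; I'm only trying to pin down the update-or-insert logic I think is happening.

```python
# Sketch of the stage-to-target upsert described above. Column names
# ("id", "modification_ts") are illustrative assumptions.
from datetime import datetime, timezone

def upsert(target: dict[int, dict], stage_rows: list[dict]) -> None:
    """For each staged row: update the matching target row (refreshing its
    modification_ts), or insert the whole row if the key is absent."""
    now = datetime.now(timezone.utc)
    for row in stage_rows:
        key = row["id"]
        if key in target:
            target[key].update(row)              # overwrite with new values
            target[key]["modification_ts"] = now
        else:
            target[key] = {**row, "modification_ts": now}  # brand-new record

target = {1: {"id": 1, "status": "Open", "modification_ts": None}}
stage = [{"id": 1, "status": "Closed"}, {"id": 2, "status": "Open"}]
upsert(target, stage)
print(sorted(target))       # [1, 2]
print(target[1]["status"])  # Closed
```

Note that a pure upsert like this never removes anything from the target, which is exactly why I'm unsure how deletes in the application make it into the warehouse.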