CDW110 Exam Questions with Verified Answers
(General Reporting Tips) If a query refers to more than one table, all columns
should be prefixed by a descriptor (table name or alias) - -Using descriptors
ensures you have unambiguous column references, preventing issues that
can occur when two tables contain columns with the same name.
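As a minimal sketch of why descriptors matter, the following uses SQLite (Caboodle itself runs on SQL Server, and the table and column names here are hypothetical stand-ins, not the real Caboodle schema). Both toy tables contain a column called Name, so every reference must be prefixed with its alias:

```python
import sqlite3

# Toy illustration of the aliasing tip. PatientDim, DepartmentDim, and the
# Name columns are hypothetical examples, not the actual Caboodle model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim    (DurableKey INTEGER, Name TEXT);
    CREATE TABLE DepartmentDim (DepartmentKey INTEGER, Name TEXT);
    INSERT INTO PatientDim    VALUES (1, 'Ada Lovelace');
    INSERT INTO DepartmentDim VALUES (10, 'Cardiology');
""")

# Both tables have a Name column; prefixing each column with a table alias
# (p, d) keeps every reference unambiguous.
row = conn.execute("""
    SELECT p.Name AS PatientName,
           d.Name AS DepartmentName
    FROM PatientDim          AS p
    CROSS JOIN DepartmentDim AS d
""").fetchone()
print(row)  # ('Ada Lovelace', 'Cardiology')
```

Without the `p.`/`d.` prefixes, selecting a bare `Name` would be ambiguous the moment both tables appear in the FROM clause.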
-Chapter 1. (Study Checklist) Caboodle Console - -The Caboodle Console is a
web application housed on the Caboodle server. It includes the following:
Dictionary
Dictionary Editor
Executions
Work Queue
Configuration
-Chapter 1. (Study Checklist) Data Warehouse - -In a data warehouse,
multiple sources may load data pertaining to a single entity. This means that
more than one package may populate a given row in a Caboodle table. As a
result, there may be multiple business key values associated with a single
entity in a Caboodle table.
-Chapter 1. (Study Checklist) SSIS Package - -The architecture of Caboodle
includes a staging database and a reporting database. Data is extracted
from source systems (like Clarity), transformed in the staging database, and
presented for users in the reporting database. This movement of data is
realized via a set of SQL Server Integration Services (SSIS) packages.
-Chapter 1. (Study Checklist) Data Lineage - -Generally, data lineage refers
to the process of identifying the source of a specific piece of information. In
Caboodle, data lineage is defined at the package level.
-Chapter 1. (Study Checklist) Star Schema - -The standard schema for a
dimensional data model. The name refers to the image of a fact table
surrounded by many linked dimension tables, which loosely resembles a
star.
The Caboodle data model structure is based on a "star schema" ‐ where one
central fact table will join to many associated lookup or dimension tables.
This structure provides the foundation of the Caboodle data model.
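The star shape described above can be sketched as one fact table joining out to its dimensions. This uses SQLite for runnability (Caboodle is SQL Server), and EncounterFact, PatientDim, DepartmentDim, and their key columns are illustrative names, not the exact Caboodle model:

```python
import sqlite3

# Minimal star-schema sketch: one central fact table joined to two
# dimension tables. All names are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim    (DurableKey INTEGER, Name TEXT);
    CREATE TABLE DepartmentDim (DepartmentKey INTEGER, Name TEXT);
    CREATE TABLE EncounterFact (EncounterKey INTEGER,
                                PatientDurableKey INTEGER,
                                DepartmentKey INTEGER);
    INSERT INTO PatientDim    VALUES (1, 'Ada Lovelace');
    INSERT INTO DepartmentDim VALUES (10, 'Cardiology');
    INSERT INTO EncounterFact VALUES (100, 1, 10);
""")

# The fact table sits in the middle of the "star"; each of its foreign
# keys joins out to one dimension table.
row = conn.execute("""
    SELECT f.EncounterKey, p.Name, d.Name
    FROM EncounterFact AS f
    JOIN PatientDim    AS p ON p.DurableKey    = f.PatientDurableKey
    JOIN DepartmentDim AS d ON d.DepartmentKey = f.DepartmentKey
""").fetchone()
print(row)  # (100, 'Ada Lovelace', 'Cardiology')
```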
-Chapter 1. (Study Checklist) DMC - -DATA MODEL COMPONENT
No table in Caboodle "stands alone." Each is considered part of a Data Model
Component, which refers to the collection of metadata tables that support
the ETL process and reporting views stored in the FullAccess schema.
Each DMC gets a type. Strict table naming conventions are followed in
Caboodle, so that a table's suffix provides information about its structure and
purpose.
These suffixes are:
· Dim for dimensions (e.g. PatientDim)
· Fact for facts (e.g. EncounterFact)
· Bridge for bridges (e.g. DiagnosisBridge)
· DataMart for data marts (e.g. HospitalReadmissionDataMart)
· AttributeValueDim for EAV tables (e.g. PatientAttributeValueDim)
· X for custom tables (e.g. CustomFactX)
-Chapter 1. (Study Checklist) Staging Database - -The Caboodle database
into which records are loaded by SSIS packages and stored procedures.
-Chapter 1. (Study Checklist) Reporting Database - -The architecture of
Caboodle includes a staging database and a reporting database. Data is
extracted from source systems (like Clarity), transformed in the staging
database, and presented for users in the reporting database. This movement
of data is realized via a set of SQL Server Integration Services (SSIS)
packages.
-Chapter 1. (Study Checklist) Dbo Schema - -STAGING DATABASE
Import tables and Mapping tables live here. This is
primarily used by administrators for moving data into Caboodle.
REPORTING DATABASE
The dbo schema stores reporting data and acts as the
data source for SlicerDicer. The Caboodle Dictionary reflects the contents of
the dbo schema.
-Chapter 1. (Study Checklist) FullAccess Schema - -STAGING DATABASE
The FullAccess schema does not exist on the Staging database.
REPORTING DATABASE
The FullAccess schema houses views that simplify reporting. FullAccess
should be your default schema when reporting.
-(ETL Terms) Execution - -An execution is the process that extracts data
from a source system using packages, transforms the data in the staging
database, and loads it to Caboodle for reporting. You create and run
executions in the Caboodle Console.
-(ETL Terms) Extract - -Extracts to Caboodle from Clarity can be either
backfill or incremental. Backfill extracts load or reload every row in a table
from Clarity, whereas incremental extracts load only changed rows. Existing
data is available while extracts are in progress.
-(ETL Terms) Package - -A package is a definition of an extract of data from
one specific source to a specific import table. For example, a fact might have
packages for Epic inpatient data, Epic outpatient data, and several non-Epic
data sources. Packages are defined in SSIS .dtsx files.
-Chapter 1. (Study Checklist) Identify key characteristics of the dimensional
data model. - -Made for report writers:
· Simpler and more intuitive.
· Easily extensible.
· More performant.
-Chapter 1. (Study Checklist) Identify documentation resources for reporting
out of Caboodle - -Caboodle Dictionary
Reporting with Caboodle document
Caboodle ER diagram
-Chapter 1. (Study Checklist) Identify reporting needs that best fit Caboodle
- -Custom data packages can be written by Caboodle developers to
accommodate your organization's reporting needs.
-(General Reporting Tips) Add a filter to most queries to exclude Caboodle's
special rows for unspecified, not applicable, and deleted records, which have
surrogate keys of -1, -2, and -3 - -Include only rows where the key is greater
than 0.
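A quick sketch of that filter, again in SQLite with hypothetical names (the real special rows and key columns vary by table in Caboodle). The WHERE clause keeps only real entities by requiring the surrogate key to be greater than 0:

```python
import sqlite3

# Sketch of excluding Caboodle's special rows (surrogate keys -1, -2, -3
# for unspecified, not applicable, and deleted). Toy data, illustrative
# names; the actual special-row labels differ by table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PatientDim (DurableKey INTEGER, Name TEXT);
    INSERT INTO PatientDim VALUES (-1, '*Unspecified'),
                                  (-2, '*Not Applicable'),
                                  (-3, '*Deleted'),
                                  ( 1, 'Ada Lovelace');
""")

# DurableKey > 0 drops all three special rows in one condition.
rows = conn.execute(
    "SELECT Name FROM PatientDim WHERE DurableKey > 0"
).fetchall()
print(rows)  # [('Ada Lovelace',)]
```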
-(General Reporting Tips) Caboodle has a numbers table, NumbersDim, that
you can use as needed in your reports - -NumbersDim contains the integers
from 1 to 1,000,000, which you can reference to help manipulate strings and
complete other processes. If you need more than 1,000,000 rows to
accomplish a task, you can refer to NumbersDim multiple times in your
query.
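The "refer to NumbersDim multiple times" trick can be sketched with a cross join. This sketch scales the table down to 1,000 rows (instead of 1,000,000) so it runs quickly; the principle is the same, and NumberValue is an assumed column name for illustration:

```python
import sqlite3

# NumbersDim sketch: a table of sequential integers, cross-joined with
# itself to yield more rows than the table holds. Scaled to 1..1000 here;
# Caboodle's actual table holds 1..1,000,000.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE NumbersDim (NumberValue INTEGER)")
conn.executemany("INSERT INTO NumbersDim VALUES (?)",
                 [(i,) for i in range(1, 1001)])

# Referencing the table twice produces 1000 * 1000 = 1,000,000 rows,
# just as two references to the real table would give 10^12.
total = conn.execute("""
    SELECT COUNT(*)
    FROM NumbersDim AS a
    CROSS JOIN NumbersDim AS b
""").fetchone()[0]
print(total)  # 1000000
```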
-Chapter 1. (Study Checklist) How does Epic data flow into Caboodle - -Epic
data moves between several databases before it gets to Caboodle.
CHRONICLES flows into CLARITY via ETL. After transformation, the data is
stored in a relational database on a separate server. Even though the
structures of the Chronicles and Clarity databases differ significantly, the
ETL process preserves the relationships mapped in Chronicles.
CLARITY flows into Caboodle via SSIS packages: data is extracted from
Clarity, transformed in the Caboodle staging database, and presented for
users in the reporting database.