The ETL and Data Warehousing tutorial is organized into lessons representing various business intelligence scenarios, each of which describes a typical data warehousing challenge.
This guide can be considered an ETL and Data Warehousing knowledge base, with a series of examples illustrating how to design and implement the ETL process in a data warehouse environment.
The purpose of this tutorial is to outline and analyze the most widely encountered real-life data warehousing problems and challenges that need to be addressed during the design and architecture phases of a successful data warehouse project deployment.
The DW tutorial shows how to feed data warehouses in organizations operating in a wide range of industries.
Each topic is thoroughly analyzed and discussed, and a recommended approach is presented to help the reader understand how to implement the ETL process.
Working through the sample implementations of the business scenarios is also a good way to compare BI and ETL tools and to get to know the different approaches to designing the data integration process. It also helps identify the strong and weak points of various ETL and data warehousing applications.
This tutorial shows how to use the following ETL and data warehousing tools: Datastage, SAS, Pentaho, Cognos and Teradata.
Data Warehousing & ETL Tutorial lessons
- [Surrogate key generation example](http://etl-tools.info/en/examples/generate-surrogate-key.htm) - includes information on business keys and surrogate keys and shows how to design an ETL process to manage surrogate keys in a data warehouse environment. Sample design in Pentaho Data Integration.
- [Header and trailer processing](http://etl-tools.info/en/examples/header-trailer-processing.htm) - considerations on processing files arranged in blocks consisting of a header record, body items and a trailer record. Files of this type usually come from mainframes; the pattern also applies to EDI and EPIC files. Solution examples in Datastage, SAS and Pentaho Data Integration.
- [Loading customers](http://etl-tools.info/en/examples/ftp-load-extract.htm) - a data extract is placed on an FTP server, copied to an ETL server and loaded into the data warehouse. Sample loading in Teradata MultiLoad.
- [Data allocation](http://etl-tools.info/en/examples/data-allocation.htm) - an ETL process case study on allocating data. Sample Cognos implementation.
- [Data masking](http://etl-tools.info/en/examples/data-masking.htm) - data masking and scrambling algorithms and ETL deployments. Sample Kettle implementation.
- [Site traffic analysis](http://etl-tools.info/en/examples/site-traffic-dw.htm) - a guide to creating a data warehouse with data marts for website traffic analysis and reporting. Sample design in Pentaho Kettle.
- [Data Quality Tests](http://etl-tools.info/en/examples/data-quality.htm) - an ETL process designed to test and cleanse data in a Data Warehouse. Sample outline in PDI.
- XML ETL processing
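To give a flavor of the first lesson, the core of surrogate key management is a lookup from the business (natural) key to a generated integer key, assigning a new surrogate on first sight. Below is a minimal Python sketch of that pattern, not the tutorial's Pentaho design; the `customer_id`/`customer_sk` field names and sample rows are illustrative assumptions.

```python
# Minimal surrogate-key assignment: map each business (natural) key to a
# stable integer surrogate key, generating a new one on first sight.
# Field names and sample data are illustrative, not from the tutorial.

def assign_surrogate_keys(rows, key_map=None, key_field="customer_id"):
    """Yield rows enriched with a surrogate key; key_map persists lookups."""
    key_map = {} if key_map is None else key_map
    next_key = max(key_map.values(), default=0) + 1
    for row in rows:
        business_key = row[key_field]
        if business_key not in key_map:  # new dimension member: assign key
            key_map[business_key] = next_key
            next_key += 1
        yield {**row, "customer_sk": key_map[business_key]}

extract = [{"customer_id": "A17", "name": "Acme"},
           {"customer_id": "B02", "name": "Borg"},
           {"customer_id": "A17", "name": "Acme Ltd"}]  # repeated business key
loaded = list(assign_surrogate_keys(extract))
```

In a real warehouse the `key_map` lookup would be the dimension table itself, queried and updated by the ETL tool, but the logic is the same: the repeated business key `A17` receives the same surrogate key both times.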
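Similarly, the header and trailer lesson revolves around splitting a flat file into header, body and trailer records and validating the trailer's record count against the body. The Python sketch below assumes a simplified layout where the first character of each line is a record-type flag (`H`, `D`, `T`) and the trailer carries the body item count; real mainframe, EDI or EPIC layouts differ.

```python
# Parse one header/body/trailer block and validate the trailer record count.
# The record layout (H/D/T flag in column one, trailer = item count) is an
# assumed, simplified format for illustration.

def parse_block(lines):
    header, body, trailer = None, [], None
    for line in lines:
        tag, payload = line[0], line[1:].rstrip("\n")
        if tag == "H":
            header = payload
        elif tag == "D":
            body.append(payload)
        elif tag == "T":
            trailer = payload
    declared = int(trailer)  # trailer declares how many body items to expect
    if declared != len(body):
        raise ValueError(f"trailer count {declared} != body items {len(body)}")
    return {"header": header, "items": body}

block = ["H20240101BATCH01", "D1001;5", "D1002;3", "T2"]
parsed = parse_block(block)
```

The count check is the key step: a mismatch usually means a truncated transfer, so the ETL job should reject the whole block rather than load a partial batch.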