Overview:
This course integrates instructor-led discussion with interactive workshops to illustrate the capabilities of the BusinessObjects XI Data Integrator tool. The seminar focuses on data warehousing concepts, the role of Data Integrator, defining source and target metadata, creating batch jobs, using built-in transforms, creating data filters, using built-in functions, optimizing data flows, performing data cleansing, the role of variables and parameters, handling errors and exceptions, migrating projects, using the Web Administrator, and managing metadata.
Audience:
This course is designed for individuals responsible for implementing, administering and managing projects that involve Data Integrator.
Prerequisites:
Each student should have a basic understanding of data warehousing concepts, experience with an RDBMS, and knowledge of the SQL language, and should have completed Introduction to Data Warehousing or otherwise understand the basic role of data warehouse structures.
Objectives
- Understand Data Warehousing Concepts
- Illustrate the role of Data Integrator
- Defining Source and Target Metadata
- Creating a Batch Job
- Depict the validation, execution and debugging of jobs
- Demonstrate the use of built-in Transforms
- Illustrate the usage of built-in Functions
- Define the optimization of a Data Flow
- Illustrate how to easily manage change and metadata from disparate systems
- Depict how to audit data throughout the extraction, transformation, and load (ETL) process
- Depict the ability to cleanse corporate data
- Demonstrate the use of variables, parameters and scripting
- Understand the capturing of changes in data
- Improve data integrity through demonstrated data lineage
- Handling Errors and Exceptions
- Supporting a multi-user environment
- Migrating Projects between Design, Test and Production Phases
- Using the Web Administrator
- Understand how to manage metadata
Class Format:
Lecture and Lab
Course duration:
3 days
Course outline:
Data Integrator Overview
- Illustrate the role of data integration
- Depict the usage of the Data Integrator tool
- Discuss the BO Data Integrator toolset
- Examine the layout of the IDE tool
- Discuss the role of the local and central repositories
Data Integrator IDE
- Initiating Data Integrator
- Connecting to a local repository
- Role of the local repository
- Using Data Integrator Designer
- Illustrate the roles of:
- Designer
- Job Server
- Access Server
- Source/Target tables
- System configuration
- Components of Designer panel:
- Projects
- Workspace
- Local object library
- Tool palette
- Single and reusable objects
- Object Hierarchy
- Jobs
- Workflows
- Data Flows
- Transforms
- Data stores
Defining Datastores
- Illustrate object library Datastores
- Create database connection
- Defining source and targets
- Importing schema metadata
- Supported database types
Defining File Formats
- Illustrate the role of files in the object library
- Use of File format editor
- Supported file formats
- Flat files
- Comma delimited
- XML
- Excel
- File metadata
- Formatting
- Defining data types
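A file format in Data Integrator pairs a delimited file with column metadata (names and data types). As a rough illustration of the same idea, here is a minimal Python sketch using the standard csv module; the file layout, column names, and types are invented for this example:

```python
import csv
import io
from datetime import datetime

# Hypothetical file-format metadata: column name -> data-type parser
SCHEMA = {
    "customer_id": int,
    "signup_date": lambda s: datetime.strptime(s, "%Y-%m-%d").date(),
    "balance": float,
}

def read_flat_file(text):
    """Parse a comma-delimited file against the declared schema."""
    rows = []
    for raw in csv.DictReader(io.StringIO(text)):
        rows.append({col: parse(raw[col]) for col, parse in SCHEMA.items()})
    return rows

sample = "customer_id,signup_date,balance\n42,2007-03-15,100.50\n"
print(read_flat_file(sample))
```

The point is that the format definition (the schema), not the file itself, carries the typing information, which is how the File Format editor separates metadata from data.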
Data Integrator Job Components
- Role of Projects
- Creating Batch Job
- Defining a Workflow
- Illustrate Job components
- Scripts
- Data Flows
- Other Workflows
- Conditionals
- Error Handling
- Looping
- Annotations
- Illustrate job output
- Trace variables
- Generated output
Using Data Flows
- Role of Data Flows
Defining a Data Flow
- Flow components
- Source
- Targets
- Transforms
- Reusable object
- Define transform types
- Query
- Many to one
- One to many
- Transform mapping
- Input and output
- Column mapping
- Validation
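A query transform maps input columns to output columns, optionally applying an expression per column. The following Python sketch mimics that column-mapping idea; the table layout, column names, and expressions are assumptions made up for illustration:

```python
# Hypothetical query-transform mapping: each output column is defined
# by an expression (a function) over an input row.
MAPPING = {
    "full_name": lambda r: r["first_name"] + " " + r["last_name"],
    "region":    lambda r: r["region"].upper(),
}

def query_transform(rows):
    """Apply the column mapping to every input row."""
    return [{out: fn(r) for out, fn in MAPPING.items()} for r in rows]

source = [{"first_name": "Ada", "last_name": "Lovelace", "region": "emea"}]
print(query_transform(source))
```

Note that `full_name` is a many-to-one mapping (two input columns feed one output column), while unmapped input columns simply do not appear in the output.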
Using the Interactive Debugger
- Panel layout
- Data Flow
- Input source
- Output target
- Traces
- Debug variables
- Defining Breakpoints
Managing XML Data
- Defining XML files
- Use XML Pipeline transform
- Illustrate XML DTD validation
- Perform XML transformation
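The XML Pipeline transform flattens nested XML into relational rows. A small sketch of that flattening with Python's standard xml.etree module; the element and attribute names are invented for illustration:

```python
import xml.etree.ElementTree as ET

def xml_to_rows(xml_text):
    """Flatten <order><item/></order> nesting into one row per item."""
    root = ET.fromstring(xml_text)
    rows = []
    for order in root.findall("order"):
        for item in order.findall("item"):
            rows.append({
                "order_id": order.get("id"),   # parent key repeated per row
                "sku": item.get("sku"),
                "qty": int(item.get("qty")),
            })
    return rows

doc = '<orders><order id="1"><item sku="A" qty="2"/><item sku="B" qty="1"/></order></orders>'
print(xml_to_rows(doc))
```

Each nested item becomes its own row carrying the parent order's key, which is the essence of turning hierarchical XML into tabular data.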
Metadata Reporting
- Using Data Management Console
- Defining Repositories
- Impact and Lineage Analysis
- Select repository
- Choosing Job Server
- Displaying table lineage
Using Global Variables
- Role of variables
- Defining global variables
- Naming convention
- Properties
- Creating Scripts
- Passing variables to Workflows
- Utilization in script code
- Usage for Delta load jobs
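In a delta-load job, a global variable such as a last-load timestamp is typically set by an initialization script and then used to filter the source. A simplified Python sketch of that pattern; the variable name, column names, and dates are assumptions:

```python
from datetime import date

# Hypothetical global variable, set by an initialization script
G_LAST_LOAD = date(2007, 1, 1)

def delta_load(source_rows, last_load):
    """Keep only rows changed since the last successful load."""
    return [r for r in source_rows if r["modified"] > last_load]

source = [
    {"id": 1, "modified": date(2006, 12, 31)},
    {"id": 2, "modified": date(2007, 3, 1)},
]
print(delta_load(source, G_LAST_LOAD))  # only row id=2 qualifies
```

After a successful run the variable would be advanced to the current load time, so each execution picks up only the rows changed since the previous one.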
Transform Validation
- Role in Data Flow
- Use of Validate Transform
- Creating validation rules
- Success action
- Failure action
- Data flow auditing
- View audit output in Dashboard
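A Validate transform applies rules to each row and routes it to a pass or a fail output, with separate actions for each. A minimal Python sketch of that routing; the validation rule itself is an invented example:

```python
# Hypothetical validation rule: a row passes if its email contains "@".
def is_valid(row):
    return "@" in row.get("email", "")

def validate(rows):
    """Split rows into pass/fail outputs, as a Validate transform does."""
    passed = [r for r in rows if is_valid(r)]
    failed = [r for r in rows if not is_valid(r)]
    return passed, failed

rows = [{"email": "a@example.com"}, {"email": "bad-address"}]
ok, bad = validate(rows)
print(len(ok), len(bad))  # 1 1
```

In practice the pass output continues to the target while the fail output is sent to an error table or report, and the pass/fail counts are what auditing surfaces in the dashboard.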
Recoverable Jobs
- Job Execution properties
- Define recovery Data Flow
- Create conditional
- Use local object
- Define if statement
Central Repository Management
- Define Central Repository
- Adding repository objects
- Workflows
- Data stores
- Data Flows
- Object Check out
- With dependents
- Without dependents
- Undoing Checkout
- Checkout without replacement
- Object versioning
- Filtering
- Deleting objects