



Tek Block's Informatica training will make you proficient in Informatica PowerCenter. Designed by industry experts, this course covers PowerCenter's core components from the fundamentals to an advanced level. Along with basic SQL programming, it provides in-depth knowledge of data integration, advanced transformations, the Informatica architecture, data migration, and performance tuning. All of these topics are taught through real-time projects and practical examples.



- What is Business Intelligence?

- Online Transaction Processing System (OLTP)

- OLTP Vs OLAP

- The architecture of Data warehouse



- Top-Down (Inmon) Vs Bottom-Up (Kimball)

- ROLAP, MOLAP, and HOLAP


- What is the Data Model?

- E-R Modeling

- Dimensional Modeling

- Dimension table

- Fact Table

- SURROGATE KEY

- SCD Type 1

- SCD Type 2

- SCD Type 3

- Types of Measures (Facts or Metrics)

- What is Schema?

- STAR Schema

- SNOWFLAKE SCHEMA

- GALAXY SCHEMA

- Conformed Dimension

- Junk Dimension

- Degenerated Dimension

- Types of Fact Tables

- Factless Fact Table



- Select statement

- Arithmetic Operators

- NULL Values

- Concatenation Operator

- Restricting and Sorting data

- Comparison Conditions

- Logical Conditions

- Order by clause

- Single Row Functions

- Multiple Row Functions

- Date Functions

- Data type conversion

- NVL Functions

- Join, UNION

- Aggregate Functions

- Sub-queries



- What is ETL?

- ETL Tools

- Different Informatica Products



- PowerCenter Repository database

- PowerCenter Repository Service

- PowerCenter Integration Service

- Domain, Node

- PowerCenter Client Tools

- PowerCenter Repository Manager

- PowerCenter Designer

- PowerCenter Workflow Manager

- PowerCenter Workflow Monitor

- Informatica Administrator Home Page

- Informatica Analyst

- Developer Client

- Informatica Developer

- ETL Project Life Cycle

- Understanding Real-Time Case Study Architecture

- OLTP Source Tables

- Staging Tables

- Dimension and Fact Tables



- How to Create Straight Load mapping (One To One Mapping)

- Concatenating FIRST_NAME and LAST_NAME using the Expression Transformation

- How to filter the data from Flat File using Filter Transformation and Expression

- Fixed width Flat Files

- Delimited Flat Files

- Different Row-level Functions

- Populating the Source File Name into the Target Table



- How to divide the data into multiple target tables using the Router Transformation

- The need for Router Transformation

- Filter Vs Router

- How to generate the Target File Name with Timestamp using Expression Transformation

- How to write the data to Target file

- How to generate the Header in Flat file with UNIX Command

- How to generate the Footer in Flat file with UNIX Command

- Joiner, Sorter, and Aggregator Transformations using a Flat File and a Relational Table

- Heterogeneous Joins

- Homogeneous Joins

- Incremental Aggregation

- Populating Unique Records into one Target and Duplicates into another Target

- Eliminating Duplicate Records from Flat File
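The header and footer topics above can be sketched with plain UNIX commands, as they might run in a post-session command task. This is a minimal illustration; the file names and column layout are hypothetical:

```shell
# Hypothetical detail file produced by an Informatica session
printf 'A,10\nB,20\nC,30\n' > /tmp/target_data.csv

# Prepend a header record carrying the column names
echo 'NAME,AMOUNT' > /tmp/target_final.csv
cat /tmp/target_data.csv >> /tmp/target_final.csv

# Append a footer (trailer) record carrying the detail row count
echo "TRAILER,$(wc -l < /tmp/target_data.csv)" >> /tmp/target_final.csv

cat /tmp/target_final.csv
```

In practice the same commands would be placed in the session's pre- or post-session command properties, with the session's actual target file path substituted.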



- Usage of Lookup Transformation

- Different Types of Caches

- Static Vs Dynamic Vs Persistent

- Usage of Unconnected Lookup Transformation

- Difference between Connected and Unconnected Lookup

- Update Strategy Transformation

- Insert, Update, Delete and Reject at mapping Level

- Update Else Insert at Session Level

- Updating a Target Table that does not have a Primary Key



- Converting Columns into Rows by using Normalizer Transformation

- Processing Multiple Flat Files into Target using Indirect Method

- Direct Vs Indirect Method

- Creating the List File

- Converting Rows into Columns using Expression and Aggregator Transformation

- Finding TOP and BOTTOM ranked products by revenue using UNION and RANK Transformation

- Combining Multiple Pipelines using Union Transformation

- Combining Heterogeneous Sources
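For the indirect method above, the list file is just a text file with one source-file path per line; the session's source filetype is set to Indirect and pointed at it. A minimal sketch, with hypothetical directory and file names:

```shell
# Hypothetical landing directory with three identically structured source files
mkdir -p /tmp/landing
printf '1,east\n'  > /tmp/landing/sales_east.csv
printf '2,west\n'  > /tmp/landing/sales_west.csv
printf '3,north\n' > /tmp/landing/sales_north.csv

# Build the list file: one absolute path per line
ls /tmp/landing/sales_*.csv > /tmp/landing/sales.lst

cat /tmp/landing/sales.lst
```

The same one-liner is typically wrapped in a pre-session command or a scheduler script so the list file always reflects whatever files landed for that run.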


- Implementing Slowly Changing Dimension - Type 1

- Implementing Slowly Changing Dimension - Type 2

- Implementing Slowly Changing Dimension - Type 3

- Loading data into STAR schema Dimensions

- How to create Shortcuts for Reusable Mappings

- What is Versioning?

- Create Reusable Sessions

- Sequential Workflow

- Creating Link Task Conditions

- Parallel Workflow

- Error Logging at Session level

- Error Logging Types

- Tracing level



- Incremental Load Using a Parameter File (FACT Load)

- Historical Load

- Defining Mapping Parameters

- Different ways of Creating the Parameter File

- Incremental Load Using Mapping Variables (FACT Load)

- Incremental Load Using a JOB_CONTROL Table (FACT Load)

- Event Wait and Event Raise task

- Predefined (File watcher)

- User-Defined

- Email task

- Session failure

- Session Success

- Workflow Failure

- Workflow Success

- Attaching a file to an email

- Notifying support team with Success Rows, Rejected Rows, Failed Rows

- Assignment task, Decision task, Control Task, Timer Task

- Defining Workflow/Worklet Variables

- Worklet

- Command task



- Eliminating the Duplicate Records by Lookup Transformation

- Usage of Dynamic Lookup Cache

- Defining Update Lookup Cache Condition

- Generating Sequence Numbers without using sequence generator Transformation

- Reusable Transformations

- Transformation Developer

- How to Use the Variable Ports in Expression

- Populating the Source's First Record into the First Target, Second Record into the Second Target, and Third Record into the Third Target

- With Expression (MOD Function) and Router Transformation

- How to use the Variable Port

- With Sequence Generator Transformation

- Finding a Cumulative Sum of Values using Expression Transformation



- Splitting the target file dynamically based on the Content

- Transaction Control Transformation

- Source and Target Based Commit Interval

- User-Defined Commit Interval

- Invoking the Stored Procedure Transformation from Informatica PowerCenter

- Connected Vs Unconnected Stored Procedure

- Drop indexes using Stored Procedure

- Create Indexes using Stored Procedure

- Analyze Fact Tables Using a Stored Procedure

- Loading the First Half of the Records into One Target and the Other Half into Another Target

- Generating the Row Based on the Column Value

- Reversing the Contents of File

- How to Skip the Header and Footer Record in Source Files
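Skipping a header and footer record in a source file, as listed above, is often done with a UNIX command before the session reads the file. A minimal sketch with a hypothetical file name:

```shell
# Hypothetical source file with one header and one trailer record
printf 'HEADER\nrow1\nrow2\nTRAILER\n' > /tmp/source_with_hf.txt

# Delete the first and last line, leaving only the detail records
sed '1d;$d' /tmp/source_with_hf.txt > /tmp/source_clean.txt

cat /tmp/source_clean.txt
```

Inside PowerCenter the header row alone can also be skipped declaratively via the flat-file source's "skip initial rows" property; the sed approach handles both header and trailer in one pass.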



- Scheduling Workflows

- Informatica Scheduler

- Third-party Scheduling Tools

- PMCMD

- PMREP

- INFACMD

- How to Create the List file using the UNIX Shell script

- How to Archive Files Using a UNIX Shell Script

- Reading the data from XML File

- Preparing Unit Test cases, Unit Test Scripts (SQL Queries)

- Performance Tuning in Informatica

- Partitioning in Informatica

- Range Partition

- Pass-Through Partition

- Hash Auto Key Partition

- Hash User Key Partition

- Dynamic Partition

- Pushdown Optimization

- Migration of the Informatica Code Using Repository Manager

- Database Migration

- Export/Import of Informatica Objects

- Deployment Groups
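A third-party scheduler usually starts a workflow through a thin wrapper around pmcmd. The sketch below only builds and prints the command line; the service, domain, folder, and workflow names are hypothetical placeholders, and the password option is omitted (in practice it is supplied via pmcmd's environment-variable options rather than hard-coded):

```shell
# Hypothetical connection and object names
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_Dev"
INFA_USER="infa_user"
FOLDER="SALES_DW"
WORKFLOW="wf_load_sales_fact"

# -wait makes pmcmd block until the workflow finishes, so the
# scheduler can use the exit code (0 = success) for dependencies
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER -wait $WORKFLOW"
echo "$CMD"

# In a real environment the script would execute the command and
# propagate its exit status back to the scheduler:
#   $CMD
#   exit $?
```

The same pattern applies to pmrep and infacmd wrappers used for migrations and administration tasks.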



- Project Overview

- Project Implementation

- Overview of other ETL & BI Tools

- ETL & BI Testing Overview

- Use of Quality Center /Test Director



Q: What skills does an Informatica/ETL Developer need?

A: As an Informatica/ETL Developer, a candidate should have basic knowledge of databases, SQL, different operating systems such as Windows and UNIX, the basics of SDLC, STLC, and test management tools, and in-depth knowledge of the Informatica PowerCenter components.



Q: Do you provide job placement assistance?

A: Yes, we provide 100% guaranteed job placement assistance for our graduates.



- Industry Experienced Trainers

- Assessments and Assignments

- Hands-on training with real-life use cases

- On-campus and online training in small batches


Q: Who can join this course?

A: Since our Informatica training course starts with the basics of every concept, anyone can join; however, our regular attendees are college graduates (local or international), QA Analysts, Project Managers, Software Developers, etc.


Q: What are the prerequisites for this course?

A: There are no prerequisites for the Informatica PowerCenter training since we start from scratch; however, basic knowledge of SQL programming, databases, UNIX, and the software development process will be helpful.



Q: What are the objectives of the training program?

A: The major objective of our Informatica training program is to provide in-depth knowledge of Informatica PowerCenter's core components from the fundamentals to an advanced level, including data integration, advanced transformations, the Informatica architecture, data migration, and performance tuning.



Q: What topics will the course focus on?

A: Along with the additional topics and tools we always cover, the major areas we focus on are SQL, UNIX basics, and Informatica PowerCenter.



Q: How do you prepare candidates for the job market?

A: With our extensive training, we prepare our candidates in such a way that they are ready for the job market. We help them with resume preparation, conduct a number of mock interview sessions, provide them with up-to-date, frequently asked interview questions, and back them up until they land a job.