Informatica Interview Questions

  •  

What are dimensions, and what are the various types of dimensions?

A dimension is a set of level properties that describes a specific aspect of a business and is used for analyzing the factual measures of one or more cubes that use that dimension. Examples: geography, time, customer, and product.


    Editorial / Best Answer

    ManishTewatia  

    • Member Since Jul-2010 | Jul 22nd, 2010


What Md.Rehman is trying to describe is the categorization of SCDs (Slowly Changing Dimensions), which are used to maintain historical data.

Dimension: A dimension is an organized hierarchy of categories, known as levels, that describes data in data warehouse fact tables.

The various types of dimensions are:

    1) Shared and Private Dimensions: Describes the basic differences between shared and private dimensions and their uses

    2) Regular Dimensions: Provides information about regular dimensions and their variations

    3) Parent-Child Dimensions: Describes the creation of parent-child dimensions and identifies their advantages and restrictions

    4) Data Mining Dimensions: Describes the creation of data mining dimensions and identifies advantages and restrictions to their use

5) Virtual Dimensions: Describes the creation of virtual dimensions and their advantages and restrictions

    6) Dependent Dimensions: Describes the creation of dependent dimensions and identifies their advantages and restrictions

    7) Write-Enabled Dimensions: Describes the creation of write-enabled dimensions and identifies their advantages and restrictions

    VIKRAM KUMAR

    • Mar 1st, 2018

Dimensions are of several types, but a few are commonly used: 1. Conformed dimension: a dimension shared by all fact tables, e.g. the time dimension. 2. Slowly changing dimension: of 3 types, SCD1, SCD2, SCD3. 3. Role ...

  •  

How to join the data of two tables which do not have common columns

How to join the data of two tables which do not have a common column? I mean, how to perform a non-equi join in Informatica? E.g., just like getting data from the EMP and SALGRADE tables of Oracle, where SAL is between LOSAL and HISAL. Thanks, Venu

    milind

    • Aug 6th, 2015

Yes, a dummy column will work.
The other way is to use a mapping variable. :)

    ashish

    • Sep 27th, 2014

You can create a dummy port in each source, assign 1 to both ports, drag both sources into a Joiner transformation, and give the join condition on the dummy ports. After that you can apply the non-equi condition in a Filter transformation.
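A minimal sketch of this dummy-port approach for the EMP/SALGRADE example (the port names here are illustrative):

    In an Expression transformation on each pipeline, add an output port:
      DUMMY_EMP = 1    (EMP pipeline)
      DUMMY_GRD = 1    (SALGRADE pipeline)

    Joiner join condition (normal join):
      DUMMY_EMP = DUMMY_GRD

    Filter condition after the Joiner (the actual non-equi logic):
      SAL >= LOSAL AND SAL <= HISAL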

  •  

Could anyone please help in finding the solution for this workflow-related question?

Suppose I have one source which is linked to 3 targets. When the workflow runs for the first time, only the first target should be populated and the other two (second and last) should not be populated. When the workflow runs for the second time, only the second target should be populated and the other two (first and last) should not be populated. When the workflow runs for the third time, only the third target...

    sum

    • Oct 8th, 2015

Define one variable $$CNT in the mapping and assign its initial value as 1 in the parameter file. Take an Expression transformation and do the coding below:

    COUNT = $$CNT
    INC_COUNT = $$CNT + 1
    O_INC_CNT = SETVARIABLE($$CNT, INC_COUNT)

Ad...

    Rakesh

    • Feb 9th, 2015

Your above logic is perfect, but instead of resetting the variable at the workflow level you can add these conditions: first router group => MOD(o_Flag, 3) = 1, 2nd router group => MOD(o_Flag, 3) = 2, 3rd router g...
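Putting the two answers together, a minimal sketch of the routing side (the names are illustrative); here $$RUN_CNT is assumed to hold the run number, persisted between runs as described above:

    Router groups, one per target:
      Group FIRST:  MOD($$RUN_CNT, 3) = 1  -> first target
      Group SECOND: MOD($$RUN_CNT, 3) = 2  -> second target
      Group THIRD:  MOD($$RUN_CNT, 3) = 0  -> third target

One caveat: SETVARIABLE executes for every row it sees, so if you increment $$RUN_CNT inside the Expression, guard it (or feed it a single-row pipeline) so the persisted value advances exactly once per run.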

  •  

    $ & $$ in Mapping or Parameter File

    What is the difference between $ & $$ in mapping or parameter file? In which cases they are generally used?


    Editorial / Best Answer

    anshu.gangwar  

    • Member Since Sep-2008 | Sep 30th, 2008


A $ prefix is used to denote session parameters and variables, and a $$ prefix is used to denote mapping parameters and variables.

    SEHAJ

    • Aug 18th, 2011

$ - This is the symbol for server or built-in variables.

$$ - This is the symbol for the variables or parameters which we create.
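For illustration, a minimal parameter file sketch; the folder, workflow, session, connection, and parameter names below are all made up. $DBConnection_Src is a session-level connection parameter (single $), while $$LOAD_DATE and $$CNT are mapping-level (double $$):

    [MyFolder.WF:wf_load_sales.ST:s_m_load_sales]
    $DBConnection_Src=Oracle_Src
    $$LOAD_DATE=2020-01-01
    $$CNT=1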

  •  

    What is a Source Qualifier?

    It represents all data queried from the source.


    Editorial / Best Answer

    sprajarajan  

    • Member Since Mar-2008 | Aug 8th, 2008


Source Qualifier is the default transformation.
Through the Source Qualifier transformation, Informatica reads the data.
We can filter the data.
We can sort the data.
It is also used to join homogeneous source systems.
We can join any number of sources in a single Source Qualifier.
We can't join flat files in a Source Qualifier, because its filter, sort, and join options are pushed into SQL against the source database; when we open a flat file's Source Qualifier, all of those options are disabled.

    joyfun23

    • Jan 26th, 2010

1. Source Qualifier is the most important transformation; it converts the source data types into the compatible native datatypes of a mapping. 2. Without a SQ a mapping cannot be created; after extractio...
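As a sketch of the filter/sort/join properties mentioned above, a Source Qualifier over Oracle's sample EMP and DEPT tables might be configured like this (the tables and columns are assumptions, not from the question):

    User Defined Join:       EMP.DEPTNO = DEPT.DEPTNO
    Source Filter:           EMP.SAL > 1000
    Number of Sorted Ports:  1   (sorts by the first port, e.g. EMPNO)

    Equivalent SQL Query override:
      SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL, DEPT.DNAME
      FROM EMP, DEPT
      WHERE EMP.DEPTNO = DEPT.DEPTNO AND EMP.SAL > 1000
      ORDER BY EMP.EMPNO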

  •  

Can we revert a reusable transformation back to a normal transformation?

    venkat2ram

    • Jul 14th, 2014

Yes, we can!

Open the folder where the reusable transformation exists.

Then open the Mapping Designer and drag the reusable transformation into the mapping designer area while holding the Ctrl key, then drop it.

Now you have a non-reusable version of your reusable transformation.

    Thanks

    Bibhu

    • Jul 4th, 2014

    Yes, we can. It can be done in the transformation developer.

  •  

In which conditions can we not use a Joiner transformation (limitations of the Joiner transformation)?

• Both pipelines begin with the same original data source.
• Both input pipelines originate from the same Source Qualifier transformation.
• Both input pipelines originate from the same Normalizer transformation.
• Both input pipelines originate from the same Joiner transformation.
• Either input pipeline contains an Update Strategy transformation.
• Either input pipeline contains a connected or unconnected...

    Tejas

    • Jun 7th, 2019

We cannot use an Update Strategy in a pipeline feeding a Joiner because the Update Strategy's "Treat row as" property (insert, update, delete) is resolved only at runtime; the Joiner cannot know those row types until runtime, and so cannot proceed.

  •  

    Transformation to Load 5 Flat files

What is the method of loading 5 flat files having the same structure into a single target, and which transformations will you use?


    Editorial / Best Answer

    sarun5  

    • Member Since Feb-2008 | Mar 13th, 2008


Guys, I have got the answer to what I asked. Here you go:

This can be handled by using a file list in Informatica. If we have 5 files in different locations on the server and we need to load them into a single target table, in the session properties we need to change the file type to Indirect.
(Choose Direct if the source file contains the source data; choose Indirect if the source file contains a list of files.
When you select Indirect, the PowerCenter Server finds the file list and then reads each listed file when it executes the session.)
I take a notepad, put the following paths and file names in it, and save it as emp_source.txt in the directory /ftp_data/webrep/:

    /ftp_data/webrep/SrcFiles/abc.txt
    /ftp_data/webrep/bcd.txt
    /ftp_data/webrep/srcfilesforsessions/xyz.txt
    /ftp_data/webrep/SrcFiles/uvw.txt
    /ftp_data/webrep/pqr.txt

In the session properties I give /ftp_data/webrep/ as the directory path, emp_source.txt as the file name, and Indirect as the file type.

    Anurag

    • Nov 27th, 2017

You should use an indirect file (which contains the path and name of all 5 files) as the source. The transformations depend on the business logic.

    Ashok Gulagond

    • Dec 16th, 2016

When we have flat files with the same structure, why not simply use a Union transformation to combine them into a single stream and then load the target, instead of such complication? Or does Union not work on flat files?

  •  

    Joiner Transformation Master Detail

Suppose you have 2,000 records in one table and 12,000 in another; which one will you consider as the master and which as the detail?


    Editorial / Best Answer

    ghola  

    • Member Since Jul-2008 | Aug 15th, 2008


The Joiner transformation compares each row of the master source against the detail source. Hence, fewer rows in the master means fewer iterations of the join comparison.

Secondly, it is easier to cache the table with fewer rows.

Hence, using the table with fewer rows as the master improves performance.

    Raju R

    • Oct 27th, 2015

@ponnana: The performance of the Joiner depends on the rows, not on the columns. Also, you set up the condition, which may consist of any number of columns; the rows are matched based on the condition. So we need not worry about the columns.

    Rashmi

    • Oct 18th, 2015

The join happens based on some condition, so the match process depends only on the condition columns and not on how many columns are present in total. The table you drag into the Joiner first is considered the master.

  •  

    Scenario based

If I have a source as below:

    EmployeeId, FamilyId, Contribution
    1, A, 10000
    2, A, 20000
    3, A, 10000
    4, B, 20
    5, B, 20

and my desired target is as below:

    EmployeeId, Contribution (In Percent)
    1, 25%
    2, 50%
    3, 25%
    4, 50%
    5, 50%

Explanation: The contribution field in the target is the individual employee's share of the family's contribution. Say...

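One way to build this mapping (a sketch only; the transformation and port names are illustrative): sort the rows by FamilyId, branch the pipeline, aggregate one branch to get each family's total, join the branches back on FamilyId, and compute the percentage in an Expression transformation. Note that joining two branches of the same pipeline requires the Joiner's sorted-input option.

    Aggregator (group by FAMILY_ID):
      FAMILY_TOTAL = SUM(CONTRIBUTION)

    Joiner (sorted input; detail = original rows, master = aggregated rows):
      FAMILY_ID = FAMILY_ID1

    Expression output port:
      O_CONTRIB_PCT = TO_CHAR(ROUND(CONTRIBUTION * 100 / FAMILY_TOTAL)) || '%'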

  •  

We can insert or update rows without using the Update Strategy. Then what is the necessity of the Update Strategy?

    Mohan

    • Aug 12th, 2018

1. You cannot perform DD_DELETE unless you have an Update Strategy.
2. While updating a record, ideally the created date should not be changed; for that we definitely have to create two targets (one for insert and one for update).
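For illustration, the row-level control an Update Strategy gives you typically looks like this (a sketch; lkp_key and chg_flag are made-up port names):

    IIF(ISNULL(lkp_key), DD_INSERT,
        IIF(chg_flag = 1, DD_UPDATE, DD_REJECT))

Without an Update Strategy, all rows in a session are treated the same way (the session-level "Treat source rows as" setting); with it, each row decides its own fate.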

  •  

How to load only the last 100 records from a flat file?

Through Informatica, how can we load only the last 100 records?

    Prasad

    • Mar 8th, 2016

SQ -> EXP -> Filter -> Target

In the Expression we need to calculate a sequence number, then filter records in the Filter transformation as column >= 50 and column

    Naveen

    • Dec 11th, 2015

Use SQ --> EXP --> RANK --> TGT

In the EXP, initialize v_count = v_count + 1.
In the Rank transformation, rank on v_count and select Bottom with 100 ranks.

I think this will work.
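A sketch of that approach (the v_/o_ port names are illustrative; note that the last 100 records carry the highest sequence numbers, so the Rank should keep the Top 100 on o_count rather than the Bottom):

    Expression transformation:
      v_count (variable port) = v_count + 1
      o_count (output port)   = v_count

    Rank transformation:
      Rank port:       o_count
      Top/Bottom:      Top
      Number of ranks: 100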

  •  

    Mapplet Transformations

What are the transformations not used in a mapplet, and why?


    Editorial / Best Answer

    gazulas  

    • Member Since Jan-2006 | Jul 2nd, 2008


The following should not be included in a mapplet:

    • Normalizer transformations
    • Cobol sources
    • XML Source Qualifier transformations
    • XML sources
    • Target definitions
    • Pre- and post- session stored procedures
    • Other mapplets

The exact reason why these should not be used, I don't know.

    sahan

    • Jul 16th, 2009

Normalizer transformation, XML Source Qualifier, XML files, other targets. If you need to use a Sequence Generator transformation, use a reusable Sequence Generator. If you need to use a Stored Procedure transformation, make the stored procedure type Normal.

  •  

    What are Target Types on the Server?

    Target Types are File, Relational and ERP.


    Editorial / Best Answer

    manojkumar_dwh  

    • Member Since Apr-2007 | Apr 14th, 2007


    PowerCenter can load data into the following targets:

    • Relational. Oracle, Sybase, Sybase IQ, Informix, IBM DB2, Microsoft SQL Server, and Teradata.
    • File. Fixed and delimited flat file and XML.
    • Application. You can purchase additional PowerCenter Connect products to load data into SAP BW. You can also load data into IBM MQSeries message queues and TIBCO.
    • Other. Microsoft Access.

    You can load data into targets using ODBC or native drivers, FTP, or external loaders.

  •  

How do you identify existing rows of data in the target table using a Lookup transformation?

We can identify existing rows of data using an unconnected Lookup transformation.


    Editorial / Best Answer

    Answered by: SK

    • Aug 30th, 2007


There are two ways to look up the target table to verify whether a row exists or not:

1. Use a connected dynamic-cache Lookup and then check the value of the NewLookupRow output port to decide whether the incoming record already exists in the table/cache.

2. Use an unconnected Lookup, call it from an Expression transformation, and check the lookup condition port value (null / not null) to decide whether the incoming record already exists in the table.

    doppalpaudi

    • Jul 9th, 2010

Lookup transformation is used to check whether the data is present in the target or not. This transformation is of 2 types: 1. Connected lookup transformation, 2. Unconnected lookup transformation. Using both of the above we can check the target for data.
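A sketch of the unconnected variant described in the editorial answer (the lookup name and ports are illustrative):

    Expression transformation:
      v_tgt_key (variable) = :LKP.LKP_TARGET_TABLE(IN_EMP_ID)
      o_row_type (output)  = IIF(ISNULL(v_tgt_key), 'INSERT', 'UPDATE')

If the lookup returns null, the row does not yet exist in the target; route it for insert, otherwise for update.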

  •  

How to delete duplicate rows in a flat file source? Is there any option in Informatica?

    Chinna

    • May 20th, 2013

Using a Sorter transformation it will delete duplicate records, and using an Expression transformation... S -> SQ -> EXP -> Router -> TGT.... Exp logic: V_Prod_id = Prod_id, V_Dup_...

    Madhuri

    • Jun 22nd, 2012

Use a Sorter transformation after the SQ and check the Distinct check box of the Sorter; finally connect to the target.
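For the Expression/Router variant Chinna hints at, a sketch (the v_/o_ port names are illustrative, and the input must already be sorted on PROD_ID):

    Expression transformation (variable ports evaluate in order, so the
    comparison against the previous row happens before the update):
      v_is_dup  (variable) = IIF(PROD_ID = v_prev_id, 1, 0)
      v_prev_id (variable) = PROD_ID
      o_is_dup  (output)   = v_is_dup

    Router group UNIQUE: o_is_dup = 0  -> target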

  •  

    What is Session and Batches?

Session - A session is a set of instructions that tells the Informatica Server how and when to move data from sources to targets. After creating the session, we can use either the Server Manager or the command-line program pmcmd to start or stop the session. Batches - A batch provides a way to group sessions for either serial or parallel execution by the Informatica Server. There are two types of batches...


    Editorial / Best Answer

    sanghala  

    • Member Since Apr-2006 | May 14th, 2007


Session: A session is a set of instructions that directs the server to move data to the target.

Batch: A batch is a set of one or more tasks (sessions, event-wait, email, command, etc.).

There are two types of batches in Informatica:

1. Sequential: data moves from source to target one session after another.

2. Concurrent: all sessions move data from source to target simultaneously.

    ravindra

    • Jul 15th, 2011

A session is a repository object which instructs the Informatica server (Integration Service) to execute the mapping with the given database connections...

  •  

Why do we use Lookup transformations?

Lookup transformations can access data from relational tables that are not sources in the mapping. With a Lookup transformation, we can accomplish the following tasks:
• Get a related value - get the employee name from the Employee table based on the employee ID.
• Perform a calculation.
• Update slowly changing dimension tables - we can use an unconnected lookup transformation to determine whether the records already exist...


    Editorial / Best Answer

    Answered by: prodyot Sarkar

    • Jul 31st, 2007


The following are reasons for using lookups:

1) We use Lookup transformations that query the largest amounts of data to improve overall performance; by doing that we can reduce the number of lookups on the same table.

2) If a mapping contains Lookup transformations, we enable lookup caching if this option is not already enabled.
We use a persistent cache to improve lookup performance whenever possible.
We explore the possibility of using concurrent caches to improve session performance.
We use the Lookup SQL Override option to add a WHERE clause to the default SQL statement if one is not defined.
We add an ORDER BY clause to the lookup SQL statement if no ORDER BY is defined.
We use the SQL override to suppress the default ORDER BY statement and enter an override ORDER BY with fewer columns.

Indexing the lookup table, we can improve performance for the following types of lookups:
For cached lookups, index the lookup table using the columns in the lookup ORDER BY statement.
For uncached lookups, index the lookup table using the columns in the lookup WHERE condition.

3) In some cases we use a lookup instead of a Joiner, as a lookup is faster than a Joiner when the lookup contains only the master data.

4) Lookups also help in performance tuning of the mappings.
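As an illustration of point 2, a Lookup SQL Override might look like this (the table and columns are made-up names):

    SELECT CUST_ID, CUST_NAME
    FROM CUSTOMER_DIM
    WHERE ACTIVE_FLAG = 'Y'
    ORDER BY CUST_ID

The WHERE clause shrinks the cache, and the explicit ORDER BY on just the condition column replaces the default ORDER BY over all lookup ports.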

    Satya

    • Aug 9th, 2014

Use a lookup transformation in a mapping to:
--get related values
--perform complex calculations
--handle slowly changing dimensions.

    gazulas

    • Apr 22nd, 2009

That's a good question. Suppose you have 40 transformations in your mapping which involve complex aggregations and functions, and in the middle of the mapping there is a requirement to get the data from some x t...

  •  

What is the way to add the total number of records that have been read from the SRC as the last line of the TGT file?

What is the way to add the total number of records that have been read from the SRC as the last line of the TGT file? Let me clarify: my TGT is a flat file and I want to add the total number of records written to the TGT as the last line, together with the sysdate.

    Surbhit

    • Apr 1st, 2014

Use an Aggregator and do not group by any key; with no group-by ports it returns a single row with the overall count.

    Ankit Kansal

    • Feb 15th, 2014

The easy way of achieving this functionality is with Unix commands; however, if you want to achieve it using Informatica only, then just before dumping your data to your target use an Aggregator COUNT() f...
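A sketch of the Informatica-only route (RECORD_ID is an illustrative non-null port): branch the pipeline into an Aggregator with no group-by ports, so it emits exactly one summary row, and format the footer there:

    Aggregator output port:
      O_FOOTER = 'Total records: ' || TO_CHAR(COUNT(RECORD_ID))
                 || ' ' || TO_CHAR(SYSDATE, 'MM/DD/YYYY')

Writing that row after the data rows typically takes a second target instance (or a second session with the flat-file target's "Append if Exists" property).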

  •  

What is an Aggregator transformation?

    Aggregator transformation allows you to perform aggregate calculations, such as averages and sums.


    Editorial / Best Answer

    Answered by: Praveen vasudev

    • Sep 12th, 2005


The Aggregator transformation is much like the GROUP BY clause in traditional SQL.

This particular transformation is a connected/active transformation which takes the incoming data from the mapping pipeline, groups it based on the group-by ports specified, and can calculate aggregate functions like AVG, SUM, COUNT, STDDEV, etc. for each of those groups.

From a performance perspective, if your mapping has an Aggregator transformation, use filters and sorters very early in the pipeline if there is any need for them.

veepee
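For illustration, a minimal Aggregator sketch mirroring a SQL GROUP BY (the EMP-style port names are assumptions):

    Group-by port: DEPTNO
    Output ports:
      O_AVG_SAL = AVG(SAL)
      O_SUM_SAL = SUM(SAL)
      O_EMP_CNT = COUNT(EMPNO)

    Roughly equivalent SQL:
      SELECT DEPTNO, AVG(SAL), SUM(SAL), COUNT(EMPNO)
      FROM EMP GROUP BY DEPTNO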

    sivakp

    • Mar 13th, 2011

1. The Aggregator transformation allows you to perform aggregate calculations such as SUM, MAX, MIN, FIRST, LAST. 2. The Aggregator transformation allows you to perform aggregate calculations per group.

    shr_4

    • Oct 27th, 2010

To perform group-by calculations we use the Aggregator transformation. It performs calculations similar to the Expression transformation, but the difference between the two is that the Aggregator Transform...
