
Friday, May 30, 2014

ETL Testing Interview Questions & Answers - Part 4

Advanced ETL Testing Interview Questions & Answers 




Q1. How can you create BPT test scripts?

A1. Follow the steps below to create a BPT.
First, go to the Business Components section of ALM:
  1. Create a new component there.
  2. Create the component steps with parameters.
  3. Enclose parameter variables in brackets, like <<< input1 >>>.
  4. Check that they are reflected in the Parameters tab.
  5. Add the details for the business component.
  6. Set the status to Ready once all the steps are written.
Now go to the Test Plan module in ALM:
  1. Create a new test script; its type will be Business Process.
  2. Fill in all the required details for the script.
  3. In the Test Script tab, pull in the component you created (you can pull more than one component as required and integrate them into a particular test script).
  4. You can now see the parameters you defined earlier for the components (if you defined 4 parameters for the component, you will see 4 I/O parameters; the iteration count defaults to 1).
  5. If you want to check the same parameters over different iterations, increase the iteration count here.
  6. The BPT is now ready (indicated by a green box).

Q2. How can you run BPT test scripts?

A2. Follow the steps below to run BPT test scripts.

First, go to the Test Plan module in ALM:

  1. Go to the Test Script tab of the BPT test script and click on Iterations.
  2. One iteration is listed by default; clicking it shows the parameters list.
  3. Define the value you want to pass for each parameter.
  4. If you want to add one more iteration, click Add Iteration above.
  5. Define parameter values for the added iterations as well.
  6. After defining all the values, pull the script into the Test Lab.
Now go to the Test Lab module in ALM:
  1. Go to the BPT you want to run and click Run with Manual Runner.
  2. All the defined parameter values will be substituted.
  3. Follow the steps now.
  4. Check that all iterations are covered for the mentioned steps and pass them.
  5. The BPT will be marked Passed.


Q3. How can we link existing defects to a particular test case?

A3. Go to the test case in the Test Lab module of ALM and double-click it; a window will appear. Click the Linked Defects tab, then click Link Existing Defect, enter the defect ID, and click Link. The defect will now be linked to that particular test case.


Q4. Suppose that, to make table loading faster, the index is disabled as a pre-load activity, and duplicates get loaded. When the index is re-enabled as a post-load activity, it throws an error. How do you resolve this issue?

A4. You can follow the steps below to mitigate this issue:
  1. First, disable the primary key constraint.
  2. Drop the index.
  3. Remove the duplicates using rowid.
  4. Recreate the index on the table.
  5. Finally, re-enable the primary key constraint.


Q5. How can we pass a variable value through the command box in Teradata?

A5. We can use the below query for this (the ? placeholder prompts for the value at run time):
SELECT COUNT(*)
FROM <TABLE_NAME>
WHERE <date_field> = CAST('?yyyymmdd' AS DATE FORMAT 'yyyymmdd')


Q6. How can we search for a particular job in Control-M?

A6. You can follow the steps below to find a job in Control-M:
  1. Log in to Control-M.
  2. Click Customizable Filter in the Viewpoint tab.
  3. Filter based on folder name and server name (wildcard characters, *, can be used here).
  4. After filtering, you can see all the job instances coming under that folder.
  5. All the jobs under the particular folder will be shown.

Q7. How can we check the roles assigned to a particular table/view in Teradata?

A7. We can query the dbc.allrolerights view to find the roles assigned to a table/view:
Select * from dbc.allrolerights where tablename = '<table_name>';
We can see the role names, access rights, and grantor names in the result.


Q8. In Control-M, how can we instantly run a job that is waiting due to dependencies or its schedule?

A8. Right-click on the job, free it, and then click Run Now.



Q9. In Control-M, how can we order a new job that is not scheduled?

A9. Click the Order button on the Home tab, then select the Control-M server, the folder, and the particular job (if All Jobs is selected, all the jobs in that folder will be submitted to run). Also check the boxes Ignore Scheduling Criteria and Order as Independent Flow.


Q10. How can we modify the command for a particular job in Control-M?

A10. Right-click on the job, go to Properties, and click Modify Job; now we can change the command and other settings, such as the schedule, for that job.



Q11. In Tivoli, if a job is stuck due to dependencies and resource constraints, how can we run it?

A11. Right-click on the job stream, increase the priority to 101, release all dependencies, and remove the time restrictions; the job should then run. If it is still stuck, do the same at the job level.



Q12. In Tivoli, how can we submit a job stream that is not scheduled?

A12. Go to the default database job streams, search for the particular job stream, and submit it from there. It will then be reflected among the scheduled job streams.

Note: If the job stream is in draft mode, we can't submit it. The job stream must be active before we can submit it to the plan.








Thursday, March 20, 2014

Basic ETL Testing Interview Questions and Answers - Part 2

Hi guys, this post covers some basic ETL testing questions that every ETL tester who has worked in the field will know. I believe this post will be helpful to experienced candidates in interviews. These questions are closely related to the real-time tasks and issues faced by an ETL QA during day-to-day work. I have also written down answers for them, so it should be very informative for you. Please post your comments and suggestions, which will make learning better for all. Please let me know if you like the post, so you can see more posts on ETL!
Keep sharing, keep learning, guys :)

Data Warehouse QA
ETL TESTING INTERVIEW
21 Basic Data Warehouse Testing Interview Questions and Answers:
Q1. Which basic tasks are primarily done by an ETL tester?
A1. An ETL tester primarily tests source data extraction, business transformation logic, and target table loading. Many checks are involved in doing this, as given below:
1. Stage table / SFS or MFS file created from the source upstream system — the following checks come under this:
a) Record count check
b) Reconcile records with source data
c) No junk data loaded
d) Key or mandatory fields not missing
e) No duplicate data present
f) Data type and size check
2. Business transformation logic applied — the following checks come under this:
a) Business data checks, e.g. a telephone number can't be more than 10 digits or contain character data
b) Record count check after active and passive transformation logic is applied
c) Derived fields from the source data are correct
d) Data flow from stage to intermediate tables is correct
e) Surrogate key generation check, if any
3. Target table loading from the stage file or table after applying transformations — the following checks come under this:
a) Record count check from the intermediate table or file to the target table
b) Mandatory or key field data not missing or null
c) Aggregate or derived values correctly loaded into the fact table
d) Check views created based on the target table
e) Truncate-and-load table check
f) CDC applied on incremental load tables
g) Dimension table and history table checks
h) Business rule validation on the loaded table
i) Check reports based on the loaded fact and dimension tables
========================================================================
Q2. Generally, how are environment variables and paths set in Unix?
A2. In the .profile file. It is normally executed at login, or we can execute it manually with the dot command, as: . .profile
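As a minimal sketch (the variable and path names here are illustrative, not from the post), adding exports to .profile and executing it with the dot command looks like this:

```shell
# Append hypothetical environment settings to .profile
cat >> "$HOME/.profile" <<'EOF'
export ETL_HOME=/opt/etl            # illustrative tool location
export PATH="$PATH:$ETL_HOME/bin"   # expose its binaries on PATH
EOF
# Execute the file in the current shell (the "dot" command), as happens at login:
. "$HOME/.profile"
echo "$ETL_HOME"
```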
========================================================================
Q3. If a column is added to a table, what test cases will you write for it?
A3. You can write the following test cases for this:
1. Check that the column's data type and size are as per the data model.
2. Check that data is loaded into the column as per the DEM (data element mapping).
3. Check valid values, null checks, and boundary value checks for the column.
========================================================================
Q4. Let's suppose you are working on a project where the requirement keeps changing. How would you tackle this?
A4. If the requirement changes frequently, we need to do a lot of regression testing on functionality that has already been tested. So you need to be ready with all your input test data and expected results; after checking the changed part, you can run all the test cases and check the results in no time.
========================================================================
Q5. How do you modify your test data while doing the testing?
A5. If the input test data is an ASCII file, you can easily prepare it in Notepad++ based on the interface and then FTP it to the Unix server; if it's a table, you can insert rows into the table as per the data model. If the file is in a format other than ASCII, we can use an Ab Initio graph to convert an Excel sheet into the required format, or use other tools available for the same.
========================================================================
Q6. A table is partitioned by range on data_dt. Suppose it already has a monthly partition PTN_01 (VALUES LESS THAN (TO_DATE('01-Feb-2014', 'dd-mon-yyyy'))) for January 2014 data only, and we try to load data for February 2014. What will happen? If you get an error, how do you solve it?
A6. It will throw the error "Inserted partition key does not map to any partition" (ORA-14400). It means there is no partition for the February data we are trying to load, so add a new partition to the table for February, as below:
ALTER TABLE table_name ADD PARTITION partition_name VALUES LESS THAN (TO_DATE('01-Mar-2014', 'dd-mon-yyyy'));
Note: Remember, we can only create a new partition for a higher value than the previously created partition (which means here we can't add a partition for December 2013, as the highest boundary is already February 2014).
========================================================================
Q7. How will you connect to an Oracle database from a Unix server?
A7. sqlplus username/password@dbserver
========================================================================
Q8. If an Oracle procedure throws the error "No data found", what is the issue?
A8. Somewhere in that procedure we are retrieving data from a table into a variable (a SELECT ... INTO). If that SELECT statement does not return any row, we get this error (ORA-01403: no data found).
========================================================================
Q9. If one of your Unix wrapper scripts fails with a "not enough memory" error, what will you do?
A9. First check the disk usage with the command df -h (a full filesystem is a common cause when scripts write temporary files), then clean up accordingly and run the script again.
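A few commands that help track down what is consuming the space (the /tmp paths are just examples):

```shell
df -h                                            # usage of each mounted filesystem; look for 100% use
du -sh /tmp/* 2>/dev/null | sort -rh | head -5   # the five largest items under /tmp
find /tmp -name '*.log' -mtime +7 2>/dev/null    # week-old logs, candidates for cleanup
```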
========================================================================
Q10. Let's suppose we have two tables: item (primary key: item_id) and order (primary key: order_id, foreign key: item_id). If we try to delete or truncate rows in the parent table item while the order table still references them, will we be able to? If not, how can we do it?
A10. If we attempt to truncate a table whose unique or primary keys are referenced by enabled foreign keys in another table, we get the error "ORA-02266: unique/primary keys in table referenced by enabled foreign keys"; a delete of a referenced parent row fails with "ORA-02292: integrity constraint violated - child record found".
So, before deleting or truncating the parent table, either disable the foreign key constraint in the child table, or delete the data from the child table order first and then from the parent table item.
========================================================================
Q11. Why do we create an index on a table? Please explain.
A11. In a nutshell: we use indexes for faster retrieval of data. Suppose I create an order table that will contain billions of rows, and I know that most of the time I will query it by order_id; then I should create an index on order_id for faster results.
========================================================================
Q12. What will be the default permissions of a file created in Unix? How can we give everyone full access?
A12. When a file is created, its permission flags are set according to the file mode creation mask, which can be set using the "umask" command. If umask is set to 002, new files get permissions 664 (-rw-rw-r--). We can change a file's permissions as below:
chmod 777 filename (4: read, 2: write, 1: execute)
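A quick demonstration in a scratch directory:

```shell
cd "$(mktemp -d)"       # work in a throwaway directory
umask 002
touch demo.txt
ls -l demo.txt          # -rw-rw-r-- : default 666 minus mask 002 = 664
chmod 777 demo.txt      # 7 = 4 (read) + 2 (write) + 1 (execute), for user/group/other
ls -l demo.txt          # now -rwxrwxrwx
```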
========================================================================
Q13. How can we link a defect to a test script in QC?
A13. First fail the test case step in the Test Lab, then click New Defect (the red symbol), enter the defect details, and raise it. The defect is then linked to that particular step of the test case. One more thing: the issues we mention in the actual result are copied into the defect description automatically (no need to enter the issue details again).
========================================================================
Q14. What are the different methods to load tables from files in Oracle? Also tell me the methods for Teradata.
A14. SQL*Loader, external table loading, and loading through JDBC drivers. In Teradata, we usually use MultiLoad or FastLoad.
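As a minimal sketch of the SQL*Loader route (the table, file, and credential names here are hypothetical), a control file describes the input layout and sqlldr performs the load:

```shell
# Hypothetical control file for loading orders.csv into an ORDERS table.
cat > orders.ctl <<'EOF'
LOAD DATA
INFILE 'orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ','
(order_id, item_id, order_dt DATE "YYYY-MM-DD")
EOF
# Then, on a host with an Oracle client installed (placeholder credentials):
# sqlldr userid=scott/tiger@dbserver control=orders.ctl log=orders.log
```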
========================================================================
Q15. What things will you check before you start testing? What will be your deliverables?
A15. Before starting testing, at least the requirement document, functional spec, technical spec, interface, DEM, and unit test results should be available. My deliverables will be the test plan, test spec, test scripts, a defect summary with root cause analysis, the test execution/result report, and automation scripts (if created).
========================================================================


Q16. What do you understand by active and passive transformations in Informatica?
A16. Active transformation -
No. of records in input != No. of records in output (e.g. Filter, Router, Source Qualifier)
Passive transformation -
No. of records in input = No. of records in output (e.g. Expression, Lookup, Stored Procedure)



========================================================================

Q17. Let's suppose we have an order table containing duplicate order_id values. How can we delete the duplicate rows from the table? Tell me at least two methods.
A17. First method:
create table order_new as select distinct * from order;
drop table order;
rename order_new to order;
Note: This method is faster, but we need to recreate the indexes, partitions, constraints, etc.
Second method:
delete from order a where rowid > (select min(rowid) from order b where a.order_id = b.order_id);
Note: Here we delete the duplicate rows based on rowid, which is uniquely assigned to each row by Oracle.
========================================================================
Q18. How will you find the second-highest salary in the employee table? Tell me at least two methods.
A18. First method - we can use a subquery, as below:
select max(sal)
from emp
where sal not in (select max(sal) from emp);
Note: The subquery finds the highest salary; the maximum of what remains is the second-highest.
Second method - we can use ROW_NUMBER for the same, as below:
SELECT empno, sal
FROM
(
select empno, sal, ROW_NUMBER() OVER (order by sal desc) RN
from emp
)
WHERE RN = 2;
========================================================================
Q19. How can we find the last two modified files for a particular file mask abc* in Unix?
A19. We can do this using a very simple command: ls -lrt abc* | tail -2
Note: To check whether the last command was successful, we can use echo $?
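A quick demonstration with throwaway files (dropping -l so the output is just the file names):

```shell
cd "$(mktemp -d)"
touch -t 202401010900 abc1   # oldest
touch -t 202401020900 abc2
touch -t 202401030900 abc3   # newest
# -t sorts newest-first; -r reverses that to oldest-first,
# so tail -2 leaves the two most recently modified matches.
ls -rt abc* | tail -2        # prints abc2 then abc3
echo $?                      # prints 0
```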
========================================================================
Q20. How will you delete the last line of a file? Tell me at least two methods.
A20. First, we can do it in place using sed: sed -i '$d' file_name
Second method -
cp file_main file_bkp
sed '$d' file_bkp > file_main
rm -f file_bkp
Note: The > operator overwrites the target file with the output, while >> appends the output to the file.
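A quick check of the in-place sed method (the file name is illustrative):

```shell
cd "$(mktemp -d)"
printf 'line1\nline2\nline3\n' > file_main
sed -i '$d' file_main     # '$' addresses the last line; 'd' deletes it
cat file_main             # prints line1 and line2
```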
========================================================================
Q21. Let's suppose we are migrating data from a legacy file system to Oracle database tables. How would you validate that the data is migrated properly? Tell me the important test scenarios and test cases you would write to test this requirement, and how you would test each scenario.
A21. The following scenarios can be written for this:
1) Check that all table DDL matches the data model (desc table_name)
2) Check that all records are loaded from source to target (match record counts)
3) Check that no null values or junk data are loaded into the table (check record counts with NOT NULL conditions)
4) Check valid values for columns (using GROUP BY, find the counts for particular values of a field)
5) Check that the same data is loaded (join the source table (if available), the stage table, and the target table, and check the result)
6) Check key fields (sum the number fields for target and source and match them)
7) Check business logic (create a sheet with inputs and expected outputs)
8) After the initial load, check incremental loading (insert/delete/update - check all the CDC cases)
9) Check the output file layout (if any)
                                                                                COPYRIGHT ©  2014 TECH BRAINS

Wednesday, February 19, 2014

Basic ETL Testing Interview Questions - Part 1

Hi guys, hope you are all doing well. I have come up with a new post on ETL / data warehouse testing questions that I came across during my interviews with various companies. I believe this post will be helpful to experienced candidates in interviews. These questions are mostly straightforward; I will come up with trickier questions later. Please post your comments and suggestions, which will make learning better for all. I am not posting the answers here; if required, I will post them later. Please let me know if you like the post, so you can see more posts on ETL.

Keep Sharing,Keep Learning guys :)

21 Basic ETL Testing Interview Questions:
1. What's the difference between OLTP and OLAP systems? Please provide one example of each.
2. What's the advantage of a snowflake schema compared to a star schema?
3. What is dimensional modelling?
4. Let's suppose you are working on a tech refresh project (e.g. from VB code to .NET code). Which type of testing will you be doing in this project?
5. Why are OLAP systems not normalized while OLTP systems are highly normalized?
6. Which SCD type is used to store all historical data along with current data?
7. What's the difference between active and passive transformations?
8. How is ETL testing different from conventional testing?
9. What do you understand by ETL? Why do we need ETL?
10. What is the difference between a data mart and a data warehouse?
11. What are the different architectures of data warehouse systems?
12. Let's suppose in a test scenario one column is added to a table. What kind of test cases will you write?
13. If a requirement keeps changing, how will you test?
14. What do you understand by business process testing?
15. Which important steps are carried out in ETL after extracting the data from the source system and before transforming the data?
16. What do you understand by fact and dimension tables?
17. What do you understand by slowly changing dimensions?
18. How would you validate that junk data is not loaded into the target table?
19. How do you check the logs if a workflow (Informatica) fails while executing?
20. Have you worked on multi-file systems? How does an Ab Initio graph process those files?
21. Let's suppose we are migrating data from a legacy file system to Oracle database tables. How would you validate that the data is migrated properly? Tell me the test scenarios and test cases you would write to test this requirement.
                                COPYRIGHT ©  2013 TECH BRAINS
