What is the role of an ETL tester in scheduling Autosys jobs? Why does an ETL tester need to know about this scheduling tool? How can an ETL tester check the various statuses of jobs in Autosys?
Scheduling tools control the sequence in which data is loaded into the tables. So, as a tester, you should know in which sequence the jobs run, which packages or procedures they call, and what kinds of failures you can get. Ideally the tester has full rights on the scheduling tool in the QA environment, where he is free to introduce error scenarios for his negative testing.
QA will have privileges on the scheduling tool. They can schedule the jobs, check the behavior of the system and the sequence in which jobs run as per the schedule, and monitor the timestamps and errors of the jobs.
As a QA engineer you should know the ETL process end to end. Running a job plays a big role in it: transferring data from staging to ingestion, loading a table, and so on, which you handle with a tool like Autosys, TWS, or Control-M.
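To answer the last part of the question: in Autosys, job status is checked with the autorep command, and events are fired with sendevent. A few common calls (a sketch; the job name STG_LOAD_JOB is a placeholder):

autorep -J STG_LOAD_JOB                        # current status of the job
autorep -J STG_LOAD_JOB -r -1                  # report for the previous run
autorep -J STG_LOAD_JOB -q                     # print the job's JIL definition
sendevent -E FORCE_STARTJOB -J STG_LOAD_JOB    # force-start the job, e.g. to rerun after a failure
sendevent -E JOB_ON_HOLD -J STG_LOAD_JOB       # put the job on hold, e.g. to simulate a stalled dependency

Typical status values shown by autorep are SU (success), FA (failure), RU (running), ST (starting), AC (activated), IN (inactive), OH (on hold), OI (on ice), and TE (terminated).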
http://www.sqaforums.com/showflat.php?Number=574666
https://www.udemy.com/blog/etl-testing-interview-questions/
Google: etl testing tutorial for beginners
http://www.youtube.com/watch?v=QhWT-H0kqbM&list=PLnPjVeHkcF0-VhvIGD5qU5ZrRsSGsLpMB
ETL tester responsibilities:
a) Writing SQL queries for various scenarios such as the count test, primary key test, duplicate test, attribute test, default check, technical data quality, and business data quality (see the sample queries after this list)
b) Maintaining test case results with fields such as test performed on, test case ID, and performed by
c) Maintaining the defect log
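For instance, the count, primary key, and duplicate tests usually reduce to short SQL queries like these (a sketch; the src_customer/stg_customer tables and the customer_id key are hypothetical):

-- Count test: source and target row counts should match
SELECT COUNT(*) FROM src_customer;  -- hypothetical source table
SELECT COUNT(*) FROM stg_customer;  -- hypothetical stage table

-- Primary key test: the key column must never be NULL
SELECT COUNT(*) FROM stg_customer WHERE customer_id IS NULL;

-- Duplicate test: no key should appear more than once
SELECT customer_id, COUNT(*) AS cnt
FROM stg_customer
GROUP BY customer_id
HAVING COUNT(*) > 1;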
An ETL tester primarily tests source data extraction, business transformation logic, and target table loading. The tasks involved in doing this are given below:
1. Stage table / SFS or MFS file created from the upstream source system - the below checks come under this:
a) Record count check
b) Reconcile records with the source data (see the sample query after this list)
c) No junk data loaded
d) Key or mandatory fields not missing
e) No duplicate data loaded
f) Data type and size check
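For check b), a set-difference query is the usual way to reconcile stage data with the source (a sketch; src_orders and stg_orders are hypothetical, and Oracle uses MINUS instead of EXCEPT):

-- Rows present in the source but missing or altered in stage; expect zero rows
SELECT order_id, order_amt FROM src_orders
EXCEPT
SELECT order_id, order_amt FROM stg_orders;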
2. Business transformation logic applied - the below checks come under this:
a) Business data checks, e.g. a telephone number cannot be more than 10 digits or contain character data
b) Record count check for records passing the applied transformation logic
c) Derived fields computed correctly from the source data (see the sample query after this list)
d) Check the data flow from the stage table to the intermediate table
e) Surrogate key generation check, if any
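For check c), a derived field can be validated by recomputing it from the stage columns and comparing (a sketch; the stg_orders/int_orders tables and the net_amt = gross_amt - discount_amt rule are hypothetical):

-- Expect zero rows: every derived net_amt must match the recomputed value
SELECT s.order_id, s.gross_amt, s.discount_amt, i.net_amt
FROM stg_orders s
JOIN int_orders i ON i.order_id = s.order_id
WHERE i.net_amt <> s.gross_amt - s.discount_amt;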
3. Target table loading from the stage file or table after applying transformations - the below checks come under this:
a) Record count check from the intermediate table or file to the target table
b) Mandatory or key field data not missing or NULL
c) Aggregated or derived values loaded correctly in the fact table (see the sample query after this list)
d) Check views created based on the target table
e) Truncate-and-load table check
f) CDC applied correctly on incremental-load tables
g) Dimension table check and history table check
h) Business rule validation on the loaded table
i) Check reports based on the loaded fact and dimension tables
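For check c), aggregate values in the fact table are commonly verified by re-aggregating the detail data (a sketch; fact_sales and int_sales are hypothetical):

-- Expect zero rows: each product's fact total must equal the sum
-- recomputed from the intermediate detail table
SELECT f.product_id, f.total_sale_amt, d.amt
FROM fact_sales f
JOIN (SELECT product_id, SUM(sale_amt) AS amt
      FROM int_sales
      GROUP BY product_id) d
  ON d.product_id = f.product_id
WHERE f.total_sale_amt <> d.amt;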