Author: aswinir
Posted: Mon Dec 12, 2016 4:03 pm (GMT 5.5)
I am sorry about that. The existing program logic was driven by the 10M-record file, reading the 70K and 90M files in random mode based on keys.
When I joined the 10M file with the 70K file (also removing duplicates), I got fewer than 3K matching records (out of the 70K). So I changed the program logic to read the 3K file as the driver and, based on its keys, read the 10M and 90M files sequentially; hence there was no need to load them into a COBOL table.
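For reference, the join/duplicate-removal step could be sketched as a DFSORT JOINKEYS job. This is only a minimal sketch: the dataset names, and the key position and length (columns 1-10 here), are assumptions to be adjusted to the actual record layouts.

```
//JOINSTEP EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTJNF1 DD DSN=YOUR.FILE10M,DISP=SHR          <- 10M-record file (assumed name)
//SORTJNF2 DD DSN=YOUR.FILE70K,DISP=SHR          <- 70K-record file (assumed name)
//SORTOUT  DD DSN=YOUR.KEYS3K,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(5,5))
//SYSIN    DD *
* Inner join on the key in columns 1-10 (assumed position/length)
  JOINKEYS FILE=F1,FIELDS=(1,10,A)
  JOINKEYS FILE=F2,FIELDS=(1,10,A)
* Write out only the key from the F1 side
  REFORMAT FIELDS=(F1:1,10)
* SUM FIELDS=NONE removes records with duplicate keys
  SORT FIELDS=(1,10,CH,A)
  SUM FIELDS=NONE
/*
```

The output dataset then holds the under-3K unique matched keys that drive the sequential reads of the 10M and 90M files.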
Thanks