Shankarganesh_gopal Beginner
Joined: 24 May 2005 Posts: 36 Topics: 20 Location: chennai
Posted: Tue Nov 07, 2006 1:17 am    Post subject: Issue in FTP job
Hi all,
We have a GDG that is FTPed to us by the client, and our jobs are designed to run against the current (latest) generation of that GDG. The problem is, when the client does not send the file, our job still picks up the latest generation (created yesterday and already processed) and runs with stale data. What we need is a way to check, before the job executes, whether a new generation was created today. Is there any way to achieve this?
Thanks a ton in advance.
Shankar
rizi Beginner
Joined: 02 Nov 2006 Posts: 1 Topics: 0
Posted: Tue Nov 07, 2006 2:21 am
A lot depends on how the FTP job is invoked. If a single JCL stream runs the FTP in one step and your processing job in the next, you can use suitable return-code checks to decide whether the processing step runs. That does not seem to be the case with your job, which makes it more difficult. I am not sure there is a way to check creation dates in JCL alone; there are ways to do it in COBOL, which you could use.
The best option I can suggest is to write the full name of the GDG generation used in the last run to a PDS member, then execute a step that calls a program to compare the file names and set an error code if they are the same. It looks complex, but that's my best suggestion.
I would be more than happy if someone proves how stupid I am by giving a simpler solution.
rizi
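One way that idea could be sketched is in TSO/REXX rather than COBOL. This is a minimal, untested sketch, assuming a one-line tracking member MY.CNTL(LASTGDG) and a GDG base MY.FTP.GDG (both placeholder names), and that LISTC reports each generation on a NONVSAM line:

```rexx
/* REXX - hedged sketch of the compare-names idea above.            */
/* MY.FTP.GDG and MY.CNTL(LASTGDG) are placeholder dataset names.   */

/* Read the generation name saved by the last successful run */
"ALLOC F(LASTRUN) DA('MY.CNTL(LASTGDG)') SHR REUSE"
"EXECIO 1 DISKR LASTRUN (STEM last. FINIS"
"FREE F(LASTRUN)"
lastused = STRIP(last.1)

/* Trap LISTCAT output and pick out the newest GxxxxVyy entry */
x = OUTTRAP('out.')
"LISTC ENT('MY.FTP.GDG')"
x = OUTTRAP('OFF')
newest = ''
do i = 1 to out.0
  if WORD(out.i, 1) = 'NONVSAM' then
    newest = WORD(out.i, 3)     /* e.g. MY.FTP.GDG.G0012V00 */
end

if newest = '' | newest = lastused then
  exit 8                        /* no new generation arrived */
else
  exit 0                        /* new file - let the job run */
```

A later step would then rewrite MY.CNTL(LASTGDG) with the name just processed, and JCL COND/IF logic on the RC=8 could skip the processing steps. The exact parsing of the LISTC output may need adjusting to your catalog's layout.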
taltyman JCL Forum Moderator

Joined: 02 Dec 2002 Posts: 310 Topics: 8 Location: Texas
Posted: Tue Nov 07, 2006 8:44 am
You could run a step that does a LISTCAT on the dataset and use REXX to parse the output for the creation date. You may have to use the absolute G0000V00-style name instead of the relative GDG dsname. You could still have problems with this technique if the dataset is created near midnight.
superk Advanced

Joined: 19 Dec 2002 Posts: 684 Topics: 5
Posted: Tue Nov 07, 2006 11:43 am
Why aren't you deleting/renaming/moving the GDG immediately after it's been processed?
sriramla Beginner
Joined: 22 Feb 2003 Posts: 74 Topics: 1
Posted: Tue Nov 07, 2006 3:43 pm
I second superk's suggestion. We had the same scenario you describe (in our case the FTP is done by an external system, so the mainframe job cannot check the status of the FTP process).
What we did is this:
1. Create a 'working' GDG that the FTP adds new generation(s) to.
2. Run the job against the 'working' GDG.
3. If the job completes successfully, copy the dataset to a 'history' GDG for archival purposes and delete the 'working' GDG generations (not the base, though).
Note that when we do not get a file through FTP, the 'working' GDG is empty and the job fails with a 'dataset not found' error.
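The three steps above might look something like this in JCL. This is only a hedged sketch, not a tested job: all dataset and program names are placeholders, and the COND parameters assume the main step ends with RC=0 on success:

```jcl
//* Hedged sketch of the working/history GDG flow.
//* MYPGM, MY.WORK.GDG and MY.HIST.GDG are placeholder names.
//*
//* Main step - reads the latest working generation
//STEP010  EXEC PGM=MYPGM
//INPUT    DD  DSN=MY.WORK.GDG(0),DISP=SHR
//*
//* Archive the processed generation only if STEP010 got RC=0
//STEP020  EXEC PGM=IEBGENER,COND=(0,NE,STEP010)
//SYSUT1   DD  DSN=MY.WORK.GDG(0),DISP=SHR
//SYSUT2   DD  DSN=MY.HIST.GDG(+1),DISP=(NEW,CATLG,DELETE),
//             LIKE=MY.WORK.GDG(0)
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  DUMMY
//*
//* Delete the working generation (the GDG base is untouched)
//STEP030  EXEC PGM=IEFBR14,COND=(0,NE,STEP010)
//DEL1     DD  DSN=MY.WORK.GDG(0),DISP=(OLD,DELETE,DELETE)
```

With this layout, the next run's MY.WORK.GDG(0) only resolves to a dataset if the client actually sent a new file, which is what produces the 'dataset not found' failure described above.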