Author |
Message |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Thu Jan 24, 2008 2:48 pm Post subject: Contention Issues: FTP vs File Creation (GDG) |
|
|
I have a CICS program that calls a PROC which has the following steps:
Code: |
//STEP030 EXEC PGM=CREATGDG,COND=(0,NE),PARM='&PARMDAT'
//SYSOUT DD SYSOUT=&UTLMSG
//SYSPRINT DD SYSOUT=&UTLMSG
//SYSUDUMP DD SYSOUT=&DUMP
//FILEGDG DD DSN=DATASET.GDG.FILE(+1),
// DISP=(NEW,CATLG,DELETE),
// UNIT=&SYST.DA,
// SPACE=(TRK,(15,15),RLSE),
// DCB=(MODELGDG,RECFM=FB,LRECL=501,BLKSIZE=0)
//***************
//STEP040 EXEC PGM=FTPPRG1,COND=(0,NE)
//SYSOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSUDUMP DD SYSOUT=D
//* (The FTP input script goes here; this step FTPs the file created in the previous step)
|
The problem is that the program can be called simultaneously by different users, and when two users execute it at the same time this leads to contention: one user's job is trying to create a new GDG generation while the other's is trying to FTP the latest generation that its own job created. Is there a way around this issue?
Thanks. |
|
Back to top |
|
 |
Terry_Heinze Supermod
Joined: 31 May 2004 Posts: 391 Topics: 4 Location: Richfield, MN, USA
|
Posted: Thu Jan 24, 2008 3:19 pm Post subject: |
|
|
If CREATGDG does a lot of processing, one way of reducing contention would be to create a cataloged data set (instead of directly creating the +1 generation of the GDG), then copy that data set to the GDG base. The base would still be enqueued for the duration of the copy, but for a shorter period of time than with your current method. _________________ ....Terry |
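A sketch of that approach (step, program, and data set names here are hypothetical, not from the original PROC): the program writes to an ordinary cataloged data set, and a separate IEBGENER step copies it to the +1 generation, so the GDG base is enqueued only during the copy.
Code: |
//STEP030  EXEC PGM=CREATGDG,COND=(0,NE),PARM='&PARMDAT'
//FILEOUT  DD DSN=DATASET.WORK.FILE,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(15,15),RLSE),
//            DCB=(RECFM=FB,LRECL=501,BLKSIZE=0)
//*  COPY THE WORK FILE TO THE GDG - BASE ENQUEUED ONLY IN THIS STEP
//STEP035  EXEC PGM=IEBGENER,COND=(0,NE)
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DSN=DATASET.WORK.FILE,DISP=SHR
//SYSUT2   DD DSN=DATASET.GDG.FILE(+1),
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(15,15),RLSE),
//            DCB=(MODELGDG,RECFM=FB,LRECL=501,BLKSIZE=0)
//SYSIN    DD DUMMY
|
Note that if two jobs can run concurrently, the work data set name itself must be unique per job, or it simply becomes the new point of contention.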
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Thu Jan 24, 2008 3:46 pm Post subject: |
|
|
Thanks for the reply. In that case, wouldn't it also lead to contention if two users accessed the program at the same time, so that two PROCs tried to create the cataloged data set simultaneously? |
|
Back to top |
|
 |
kolusu Site Admin

Joined: 26 Nov 2002 Posts: 12378 Topics: 75 Location: San Jose
|
Posted: Thu Jan 24, 2008 6:22 pm Post subject: |
|
|
davinski.bby,
Change your GDG creation to a later step: let program CREATGDG create a temporary sequential data set, FTP it, and then copy the temp file to a new GDG generation. By doing so you avoid the pitfall of FTP'ing the wrong file.
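A sketch of that step ordering (step and data set names are hypothetical): the file is created and FTP'd as a passed temporary (&&) data set, and only the final step cuts the new GDG generation.
Code: |
//STEP030  EXEC PGM=CREATGDG,COND=(0,NE),PARM='&PARMDAT'
//FILEGDG  DD DSN=&&FTPTEMP,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(TRK,(15,15),RLSE),
//            DCB=(RECFM=FB,LRECL=501,BLKSIZE=0)
//*  FTP THE TEMP FILE (THE SCRIPT MUST REFER TO THE TEMP DATA SET)
//STEP040  EXEC PGM=FTPPRG1,COND=(0,NE)
//*  LAST: COPY THE TEMP FILE TO THE NEW GDG GENERATION
//STEP050  EXEC PGM=IEBGENER,COND=(0,NE)
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DSN=&&FTPTEMP,DISP=(OLD,DELETE)
//SYSUT2   DD DSN=DATASET.GDG.FILE(+1),
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(15,15),RLSE),
//            DCB=(MODELGDG,RECFM=FB,LRECL=501,BLKSIZE=0)
//SYSIN    DD DUMMY
|
Because the system generates a unique name for each job's && data set, the create and FTP steps no longer contend between jobs; only the short final copy touches the GDG base.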
Hope this helps...
Cheers _________________ Kolusu
www.linkedin.com/in/kolusu |
|
Back to top |
|
 |
Terry_Heinze Supermod
Joined: 31 May 2004 Posts: 391 Topics: 4 Location: Richfield, MN, USA
|
Posted: Fri Jan 25, 2008 3:46 pm Post subject: |
|
|
Either create a temporary data set as Kolusu suggests, or, unless you have a large number of users submitting the FTP job, each user could create a unique data set name because they would have unique jobs. I prefer cataloged over temp data sets because of recoverability. _________________ ....Terry |
|
Back to top |
|
 |
CraigG Intermediate
Joined: 02 May 2007 Posts: 202 Topics: 0 Location: Viginia, USA
|
Posted: Fri Jan 25, 2008 3:50 pm Post subject: |
|
|
Create the cataloged data sets with the user ID as one of the nodes of the DSN. |
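For example, if your system supports the &SYSUID JCL symbol (a sketch; the qualifiers are hypothetical and would have to fit your site's naming and security rules):
Code: |
//FILEOUT  DD DSN=FILE.&SYSUID..FTPFILE,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(15,15),RLSE)
|
Since each user's jobs then allocate a different data set name, two users never enqueue on the same data set.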
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Fri Jan 25, 2008 5:32 pm Post subject: |
|
|
Thanks guys for the help!
Kolusu, I did what you suggested and it seems to be working now. I ran two simultaneous jobs and there was no contention.
Can you explain why contention happens when I am using GDGs but not with flat files?
Thanks again. |
|
Back to top |
|
 |
Terry_Heinze Supermod
Joined: 31 May 2004 Posts: 391 Topics: 4 Location: Richfield, MN, USA
|
Posted: Fri Jan 25, 2008 10:38 pm Post subject: |
|
|
I don't have access to a mainframe, but my guess is that you might still get contention if the 2 jobs that created temp data sets try to add a +1 generation at the same time; it's just that the likelihood of contention is lower. I think you just got lucky with "no contention". Try your 2 jobs with very large data sets and run them concurrently. I think you'll find that the 2nd job gets a "waiting for data sets" message and waits for the 1st one to release the enqueue on the GDG base. _________________ ....Terry |
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Mon Jan 28, 2008 5:50 pm Post subject: |
|
|
Hi Terry, you are also correct. I tried to execute simultaneous jobs and got the following messages in my log:
Code: |
IEF861I FOLLOWING RESERVED DATA SET NAMES UNAVAILABLE TO JOB00002
IEF863I DSN = FILE.TEMP01.FLATFILA JOB00002 RC = 04
IEF863I DSN = FILE.TEMP01.FLATFILB JOB00002 RC = 04
ENQMPF01 : DATASET CONTENTION DETECTED
ENQMPF02 : DSNAME=FILE.TEMP01.FLATFILB ..
ENQMPF03: JOB JOB00001 ON SYSTEM SYS1 HAS IT EXC
ENQMPF04: JOB JOB00002 ON SYSTEM SYS1 WANTS IT EXC
ENQMPF01 : DATASET CONTENTION DETECTED
ENQMPF02 : DSNAME=FILE.TEMP01.FLATFILA..
ENQMPF03: JOB JOB00001 ON SYSTEM SYS1 HAS IT EXC
ENQMPF04: JOB JOB00002 ON SYSTEM SYS1 WANTS IT EXC
*IEF099I JOB JOB00002 WAITING FOR DATA SETS
|
My questions now are:
How long will the other jobs wait before they abend?
Is there a specific time limit for this?
If the job were submitted 20 times simultaneously, would they line up in a queue?
Thanks. |
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Mon Jan 28, 2008 5:54 pm Post subject: |
|
|
The jobs JOB00001 and JOB00002 both completed successfully. JOB00002 waited for JOB00001 to release the files, then ran to completion. Would this still work if at least 50 jobs ran at once?
Thanks. |
|
Back to top |
|
 |
kolusu Site Admin

Joined: 26 Nov 2002 Posts: 12378 Topics: 75 Location: San Jose
|
Posted: Mon Jan 28, 2008 6:22 pm Post subject: |
|
|
davinski.bby,
I see contention only on the flat files, not on the GDG. I thought you used a temp data set to create the file in step 1 and then copied it to the output file.
The advantage of temp files is that the OS decides their names, and they are unique to each job.
Show us the JCL you ran _________________ Kolusu
www.linkedin.com/in/kolusu |
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Mon Jan 28, 2008 10:29 pm Post subject: |
|
|
Hi Kolusu, the issue I am having with temp data sets is that the users who will be submitting the jobs have no access rights to create data sets; they are only authorized for certain data sets. I get "Create access not granted" when creating the temp data sets under a TEST user ID. |
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Mon Jan 28, 2008 10:36 pm Post subject: |
|
|
Hi Kolusu, I think I see what you mean; please disregard my last post. I will try the &&TEMP file and let you know if it works. Thank you very much. |
|
Back to top |
|
 |
davinski.bby Beginner
Joined: 30 Jul 2007 Posts: 31 Topics: 10
|
Posted: Tue Jan 29, 2008 12:59 am Post subject: |
|
|
Hi Kolusu, I now have a different concern. If I use a temp data set and the OS decides its name, I cannot know which data set to FTP, since we use a standard FTP script:
Code: |
host=HOSTNAME1
*
* LOGIN ID
domain\user1
* PASSWORD
password1
* GO TO THE DIRECTORY
cd /path/ftp
* FTP COMMANDS
put '<dataset name>' ftpfile.dat
quit
|
Can I use a generic name or a parameter for <dataset name>?
Are there other ways around this?
Thanks. |
|
Back to top |
|
 |
taltyman JCL Forum Moderator

Joined: 02 Dec 2002 Posts: 310 Topics: 8 Location: Texas
|
Posted: Tue Jan 29, 2008 8:41 am Post subject: |
|
|
Use the FTP //DD: token support in your job. See this manual: http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/F1A1B950/4.2.4?SHELF=F1A1BK61&DT=20050708142126
Here is the sample code from that manual.
Code: | Following is a sample job that shows usage of the //DD: token. In the sample job there are two data sets that use the local file specification with the //DD: token. One is a data set that is created as a new GDG data set in STEP01 (see the OUTSET DD statement). Note that STEP02 (the FTP step) uses a backward reference with the DD02 DD statement to locate the data set. Since the referenced DD statement contains explicit DCB attributes, FTP can access the attributes prior to opening the data set. The second data set is an old data set that existed before the job was executed.
//USER33J JOB MSGLEVEL=1,MSGCLASS=H,USER=USER33,PASSWORD=**pw**
//STEP01 EXEC PGM=IEBDG
//SYSPRINT DD SYSOUT=A
//OUTSET DD DSNAME=USER33.MYGDG(+1),DISP=(NEW,CATLG,CATLG),
// VOLUME=SER=CPDLB1,SPACE=(TRK,(5,5)),UNIT=SYSDA,
// DCB=(RECFM=FB,LRECL=80,BLKSIZE=800)
//SYSIN DD *
< create statements >
//STEP02 EXEC PGM=FTP,REGION=2048K,PARM='(TCP TCPCS TRACE'
//STEPLIB DD DSN=USER33.LINKLIB,DISP=SHR
//SYSPRINT DD SYSOUT=*
//DD01 DD DSNAME=USER33.TEST.S.A,DISP=OLD
//DD02 DD DSNAME=*.STEP01.OUTSET,DISP=SHR
//OUTPUT DD SYSOUT=*
//INPUT DD *
9.67.113.57 6321
USER33 **pw**
put //DD:DD02 data
get data //DD:DD01
quit
/*
Following are short descriptions of the key statements in the example:
- The OUTSET DD statement in STEP01 allocates a new generation of a GDG data set.
- The DD01 DD statement describes an existing data set.
- The DD02 DD statement is a backward reference to the new data set created in STEP01.
- The put subcommand uses the //DD: token for the new data set created in STEP01.
- The get subcommand uses the //DD: token for the existing data set. |
|
|
Back to top |