MVSFORUMS.com Forum Index
A Community of and for MVS Professionals
 

Read file in multiple times

 
MVSFORUMS.com Forum Index -> Application Programming
js01
Beginner


Joined: 13 Oct 2005
Posts: 84
Topics: 32
Location: INDIA

Posted: Wed Aug 06, 2008 4:03 pm    Post subject: Read file in multiple times

Hello Friends,

I have a requirement to read a file multiple times, from the first record to EOF each time. Is there any way I can achieve this?

I know we can do this by closing and reopening the file to reset the pointer to the first record, but performance may be poor, as I would read it around 10,000 times (closing and opening each time).

Please advise.

thank you
semigeezer
Supermod


Joined: 03 Jan 2003
Posts: 1014
Topics: 13
Location: Atlantis

Posted: Wed Aug 06, 2008 4:25 pm    Post subject:

I'd STRONGLY recommend redesigning the application to read the file once. There are several strategies: sort both the file and the data you are looking for, so each search only has to read up to the matching record (or just past where it would be, if it does not exist); cache the records in storage if the file is fairly small (100,000 records or so); move the data to a database; or keep an index of record locations so you can skip around in the file. Having to close, reopen, and reread a file every time you want to look for something is a waste of resources and will be unnecessarily slow.
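The "index of record locations" idea can be sketched off the mainframe. A minimal Python illustration (the sample records and the in-memory file are invented for the demo): scan the file once, noting each record's byte offset, then seek straight to any record instead of rereading from the top.

```python
import io

def build_offset_index(f):
    """Scan the file once, recording the byte offset of each record."""
    offsets = []
    f.seek(0)
    while True:
        pos = f.tell()
        line = f.readline()
        if not line:          # end of file
            break
        offsets.append(pos)
    return offsets

def read_record(f, offsets, n):
    """Jump straight to record n (0-based) without rereading the file."""
    f.seek(offsets[n])
    return f.readline().rstrip("\n")

# Demo with an in-memory file standing in for the dataset
f = io.StringIO("REC-A\nREC-B\nREC-C\n")
idx = build_offset_index(f)
print(read_record(f, idx, 2))  # REC-C
```

Each subsequent "pass" then costs one seek per record of interest rather than a full sequential read.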

If you absolutely MUST read the file multiple times, see if you can find a way to read mostly from memory. That could mean caching frequently used records (depending on the incoming data, that could be as simple as keeping the most recently used ones), or you might cache it with system facilities (VLF? DLF? I forget), or you could, with your systems people's approval, move the whole file to VIO and read that instead. If you are programming in assembler, you could even move it to a dataspace and, with the right data structures, have almost instant access to any part of the file.
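The "most recently used" caching idea, sketched in Python (the fetch function, keys, and capacity are stand-ins for an actual file read, not anything from the thread):

```python
from collections import OrderedDict

class RecordCache:
    """Keep the most recently used records in memory (a simple LRU cache)."""
    def __init__(self, fetch, capacity=1000):
        self.fetch = fetch          # callback that reads a record from the file
        self.capacity = capacity
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)     # mark as most recently used
            self.hits += 1
            return self.cache[key]
        value = self.fetch(key)             # cache miss: go to the file
        self.misses += 1
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return value

# Demo: a dict lookup stands in for an actual file read
data = {"K1": "REC-1", "K2": "REC-2"}
cache = RecordCache(data.__getitem__, capacity=1)
cache.get("K1"); cache.get("K1")   # second call is a cache hit
print(cache.hits, cache.misses)    # 1 1
```

If the incoming lookups cluster on a small working set, most reads never touch the file at all.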
_________________
New members are encouraged to read the How To Ask Questions The Smart Way FAQ at http://www.catb.org/~esr/faqs/smart-questions.html.
Terry_Heinze
Supermod


Joined: 31 May 2004
Posts: 391
Topics: 4
Location: Richfield, MN, USA

Posted: Thu Aug 07, 2008 11:03 pm    Post subject:

If you give us more details about your requirement, we might be able to suggest several alternate approaches, as semigeezer has done. If you need only a small portion of each record, sorting the file and loading it into a COBOL table (array) might be the way to go. You certainly do NOT want to close and reopen this file 10,000 times.
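A rough Python analogue of the sorted-table approach (in COBOL, SEARCH ALL performs the equivalent binary search on a table defined with an ASCENDING KEY); the sample keys and records here are invented:

```python
import bisect

# Load the file once into a sorted in-memory table of (key, record) pairs,
# standing in for a COBOL working-storage table loaded at start-up.
records = sorted([("C", "REC-C"), ("A", "REC-A"), ("B", "REC-B")])
keys = [k for k, _ in records]

def lookup(key):
    """Binary search: ~log2(N) comparisons instead of a full file pass."""
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return records[i][1]
    return None                     # key not present

print(lookup("B"))   # REC-B
print(lookup("Z"))   # None
```

With 10,000 lookups against a table of N records, that is roughly 10,000 × log2(N) comparisons, versus 10,000 full passes over the file.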
_________________
....Terry
jsharon1248
Intermediate


Joined: 08 Aug 2007
Posts: 291
Topics: 2
Location: Chicago

Posted: Fri Aug 08, 2008 1:08 pm    Post subject:

Also consider using a VSAM file. You could reposition within the file without the overhead of an open/close. If the number of records is relatively small, and you allocate enough buffers, the increased run cost could be minimal compared to a COBOL working-storage table.


MVSFORUMS
Powered by phpBB © 2001, 2005 phpBB Group