FTP directory listing


#1

I’ve got an FTP server that I access with the following code:


   // Request a directory listing; returns the number of files found
   nFTP_FileCount = SendCommunicationHandleCommand( chFTP_Server, "dir" );

   if (nFTP_FileCount > 0) then
     // Terminate each received string at the null character
     SetEndOfMessageTerminator( chFTP_Server, 0x00 );
     // Read the listing into the string table, starting at index 0
     nStatus_FTP = ReceiveStrTable( -1, 0, stFTP_List, chFTP_Server );
   endif

nFTP_FileCount correctly reports 77 files in the directory. However, the string table fills only 73 elements. The four files left out of the list are the last four alphabetically; nothing else about those files or their names is different. The returned status is always 0.

Each element is around 70 characters, which seems to exceed the 40-character limit suggested in the manual, but since the entries are not rolling into the next element, I think it is OK.

Any thoughts on how to get the full directory listing with this command?


#2

Hi sensij,

I’m seeing the same problem; it looks like a bug to me. I’ll submit a ticket. Works okay for 73 files or fewer (weird!).

In the meantime, I’m wondering where you saw this “40 character limit suggested in the manual”?

Thanks,
-OptoMary


#3

Page 95 of the PAC Control Command Reference (#1701)

The string table width is limited to 40 characters. If a string is longer than 40 characters (up to 80), the extra characters are placed in the next index.

I’ve played with this some more… if I make the file names one character long, I can get them all in. There might be a transfer limit in there, somewhere around 5 kB if I’m counting right.

I wonder if there is a way to make the FTP server’s DIR command return less data. I don’t need all the attributes, file size, and date; dropping them would free up a lot more characters for file names.
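For what it’s worth, the numbers line up with a fixed receive buffer of roughly 5 kB (the exact size is my assumption). A quick back-of-the-envelope check in Python, also showing the headroom a names-only listing (what standard FTP calls NLST, if the command set exposes anything like it) would buy:

```python
# Rough check of the suspected transfer limit (buffer size is a guess)
BUFFER_BYTES = 5120   # assumed ~5 kB receive buffer
DIR_ENTRY_LEN = 70    # observed length of one "dir" line
NAME_ONLY_LEN = 20    # a bare file name, if the attributes could be dropped

print(BUFFER_BYTES // DIR_ENTRY_LEN)   # 73, matching the observed cutoff
print(BUFFER_BYTES // NAME_ONLY_LEN)   # 256, so names-only fits easily
```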


#4

I’m thinking that I might be trying to ask too much of FTP (to manage recipes), and it is time to bite the bullet and figure out how to use OptoDataLink. A recipe database instead of discrete files probably makes more sense.


#5

You’re right, there is an FTP dir transfer limit you’re hitting. Also, not sure where that “40 characters” came from; we’ll strike it from the manual.

But now I’m confused about why you mention [I]OptoDataLink[/I] (typically used to move data to/from a database from a strategy) for managing recipes, vs. PAC Display. You might want to check out this webinar re: [U][B]Recipes in PAC Display[/B][/U].

What are you doing with these files, anyway? If we knew more about what you’re up to, we might have more/better suggestions for how you could/should do it!

-OptoMary


#6

PAC Display is the operator interface of a piece of manufacturing equipment that uses an R1 controller. There are about a dozen machine settings that are product specific, which I have been using the recipe feature in PAC Display to store and recall. The recipes are stored in the file system of the computer running PAC Display, not locally on the controller, and FTP is used to manipulate them.

The recipe file names contain logical components so that they can be matched to the product being run. The recipe chart would look at the DIR response, and offer the operator any of the recipes that might apply to the product being run. A fully defined product would return one possible recipe (the most recent save of that product), but a partial product ID might return more than one recipe the operator could choose from.

This has been working well for close to a year, but I noticed when I was making some tweaks to the code that the DIR list was no longer offering all of the recipes that had been saved, which led to the creation of this thread (and, in parallel, a request to optosupport).

It is likely that with another year of operation, there could be a few hundred recipe files created. While the PAC Display recipe functionality could probably handle this if I abandon FTP and store the recipes on the controller locally (or on an SD Card), I’m not excited about that path. Stepping back, I’m thinking that I will be better off storing all of the settings information in a database instead of individual recipe files. It will make it easier for me to perform maintenance on the settings, especially if a global recipe template change becomes necessary as machine components are added or changed.

I’ll need to think about database structure a bit more before heading down this path… since DataLink simply transmits table information back and forth, I should try to build the tables in a way that can return something close to the right set of information without relying on a query to filter the results. Any filtering, if I understand correctly, would need to be done within the logic of the chart.

Edit: While figuring out the database piece of it, maybe I should consider a short term fix with the recipes stored locally and then backed up to the computer via FTP.


#7

The recipe filtering is both the interesting and tricky bit.
I like the concept. It’s really interesting to give the operator the ability to pick a recipe based on a product number.
We only ever chose the recipe based on time. (Backing up the recipe twice a day, ~2am and ~2pm).

I have to say I am ‘uncomfortable’ with using the controller to store, filter and select so many files down the track.
Think you are on the right track using a database. It sounds a little more involved to set up, but it just sounds like a better approach.
The controller simply has to send the product ‘number’ to the DB which then does the filtering and presents the solution to the operator.
The computer is the right place to do that sort of storage and filtering.
BTW, the red flag to me about using the controller to do this was your comment ‘few hundred recipe files’.
That’s a lot for it to sift through each time.
A. Lot.

Sounds like the classic frog-in-the-water situation: at first it all works great, but as you add more and more recipe files over time, it just, unnoticeably, starts to slow down. Then one day, ‘out of the blue’, things break down when you notice you are waiting ‘too long’ for a filtered result.
Better to nip it in the bud now.

Just my 2 cents worth.


#8

Thanks, I appreciate the comments; it sounds like you are seeing it the same way I am. As I understand it, the only interaction PAC Control can have with a database is transferring full tables. If I want to push the list filtering into the database, it would mean something like:

  1. Controller writes the part number (or partial part number) to a table in the database
  2. Database has some kind of internal macro that executes based on the content of that table, running a query and producing a filtered table as output
  3. Filtered table is transferred back to the controller.

Alternatively, maybe I can make this easier by using SoftPAC, which still lets the computer processor do the filtering work but reduces the database programming required.

  1. Controller tells SoftPAC what part number (or partial part number) to process
  2. SoftPAC transfers the entire recipe table from the database, and does the filtering
  3. SoftPAC makes the filtered tables available to the controller.
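For step 2, the filtering itself amounts to prefix matching on the file names. A minimal Python sketch of the idea (the recipe-name format `PRODUCTID_YYYYMMDD` and the function name are hypothetical):

```python
def filter_recipes(recipes, partial_id):
    """Return recipe names matching a full or partial product ID,
    most recent first, assuming names like 'PRODUCTID_YYYYMMDD'."""
    matches = [name for name in recipes if name.startswith(partial_id)]
    # Descending sort puts the most recent save of each product first
    return sorted(matches, reverse=True)

recipes = [
    "AB100_20150103",
    "AB100_20150610",
    "AB200_20150212",
    "XY300_20150101",
]

print(filter_recipes(recipes, "AB100"))  # the two AB100 saves, newest first
print(filter_recipes(recipes, "AB"))     # a partial ID matches three recipes
```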

SoftPAC’s filtering capabilities are sure to be less efficient than what SQL Server can do with a well-designed table, but for a table of a few hundred items, I’m not sure the additional processor work really amounts to much. The learning curve for me to get a database query macro working is sure to be steeper than some OptoScript to filter a set of tables.


#9

Your plan looks good. From the brief experience I had with OptoDataLink, it looked like it doesn’t support WHERE clauses, so your plan is the same thing I was thinking of. Take a look at triggers in your database.

It shouldn’t really be a big deal for your database to filter your data on the scale of hundreds of rows. I use PostgreSQL for my data storage, and it can return a row (using a timestamp index) out of a 15-million-plus-row table within 10 milliseconds, even without the row in cache! It might be a little slower if you have to index on the recipe name, but I suspect not by much, as long as your recipe names aren’t thousands of characters long.

I think you can set it up with a two-flag system to initiate the filter and transfer the recipe:

  1. Controller sends a recipe name & timestamp row to your database.
  2. A trigger acts on the insert and executes a query with the recipe name in the WHERE clause, first clearing, then inserting the resulting records into a transfer table.
  3. A trigger on the transfer table acts on the insert and sets a flag (in another table just for this purpose, maybe?) that is used as the condition for the data transfer in OptoDataLink. Probably set the “one data exchange” and “reset database value” boxes true for your data link condition.
  4. Recipe data gets pulled in by controller and you do your thing with it.
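As a concrete illustration of steps 1–3, here is a minimal sketch using SQLite from Python (table names, column names, and the recipe-name format are all made up, and steps 2 and 3 are condensed into a single trigger for brevity; a production setup would use your actual database, with a separate trigger on the transfer table if desired):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE recipes  (name TEXT, data TEXT);
CREATE TABLE request  (partial_id TEXT);
CREATE TABLE transfer (name TEXT, data TEXT);
CREATE TABLE flags    (ready INTEGER);
INSERT INTO flags VALUES (0);

-- Steps 2 and 3 condensed: on a new request, refill the transfer
-- table with matching recipes and raise the ready flag.
CREATE TRIGGER on_request AFTER INSERT ON request
BEGIN
    DELETE FROM transfer;
    INSERT INTO transfer
        SELECT name, data FROM recipes
        WHERE name LIKE NEW.partial_id || '%';
    UPDATE flags SET ready = 1;
END;
""")

con.executemany("INSERT INTO recipes VALUES (?, ?)",
                [("AB100_20150103", "..."),
                 ("AB100_20150610", "..."),
                 ("XY300_20150101", "...")])

# Step 1: the controller writes the (partial) product ID
con.execute("INSERT INTO request VALUES ('AB100')")

# Step 4 would be OptoDataLink moving these rows back to the controller
rows = con.execute("SELECT name FROM transfer ORDER BY name DESC").fetchall()
print(rows)                                               # the two AB100 saves
print(con.execute("SELECT ready FROM flags").fetchone())  # (1,)
```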