Import Text File

Greetings! I’ve been using Opto for 10+ years now in the oil and gas industry, and this is my first forum post (I’ve been helped by reading many!)

I am trying to write certain information from a .txt file into float variables within PAC Control Pro. I’ve been stuck on this problem and am looking for any kind of assistance. Here is an example of the .txt file:

More information that will help:
The data is always in this order and format.
I’m only trying to pull the “Component” (gas name) and the “Raw Mole %”

I’m hoping for some tips to pull this data into PAC Control.
Thanks!


Hi Nathan! 'Bout time you joined us! (Big grin). Welcome.

What hardware are you running this on?
Looks like the perfect job for Node-RED.
Even if there is a PC in the mix, you could run Node-RED and SoftPAC on it and then pass the information down to the PAC controller, for example… Just looking for ideas on what hardware options are in the mix.

We have a large network of PCs running PAC Pro 10.3 hooked up to various S-1 controllers. Each S-1 then daisy-chains to 3 EB-1s.

If Node-RED would be the best path, I have a PC that can run Node-RED. We use Node-RED for some Modbus communication, but my Node-RED programming is very lacking compared to my PAC Control skills, to say the least.

Would this be simpler if I converted it to a comma-delimited (CSV) file first?

Yes, having the file as a CSV for starters would be amazing, but… it depends on whether that can be automated or not. If not, then we can probably deal with it as it is (but for sure a CSV will be a LOT better, as you will see).

So, here is how I would attack it… Node-RED has a file-in node. It can read that file directly from the PC hard drive. It would need to be fired by some sort of inject node: a timer, a cron schedule, etc.
There is another node, a file watch node, that you can add to the project to check when that file is created or updated, so that takes care of that part of it.

(image: screenshot of the Node-RED flow)

In this flow, I am using both: I watch for the marked file, then read the file when it’s been updated.

Next, we need to parse the file.
For the current text file, I would probably use the split node.

As it is, I would try splitting on a " " (a space), or perhaps getting each line by splitting on \r\n; in the split node, remove the \n and put a space in there.
I’m also tempted to try splitting on ", since a lot of your fields are delimited by that.
While not every field in your original text file has quotes, that’s not a huge problem, because Node-RED is great at getting substrings. Sure, the quoted words would carry some extra characters, but we can pull the fluff out as needed.
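To make the split-and-substring idea concrete, here is a rough sketch of what the parsing could look like inside a Node-RED function node. The line layout is an assumption on my part (I’m guessing each component line has the gas name in quotes followed by the Raw Mole % as the first number after it), so adjust the pattern to match your actual report:

```javascript
// Hypothetical parser for one line of the report.
// ASSUMPTION: a component line looks roughly like:
//   "Nitrogen"    0.5432   ...other columns...
// i.e. quoted gas name, whitespace, then the Raw Mole % value.
function parseComponentLine(line) {
  // Grab the quoted gas name and the first number after it.
  const m = line.match(/"([^"]+)"\s+([-+]?\d*\.?\d+)/);
  if (m === null) {
    return null; // header line, blank line, etc. -- not a component line
  }
  return { component: m[1], rawMolePct: parseFloat(m[2]) };
}

// In a function node, msg.payload would hold the whole file, so:
//   const readings = msg.payload.split(/\r?\n/)
//     .map(parseComponentLine)
//     .filter(r => r !== null);
```

The regex approach saves you from counting character positions by hand, which matters if the column widths in the report ever shift.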

Now, if you can do a CSV, then I would use the CSV to JSON node…

With this node you get to name the JSON keys from the top line, so you can address the data as exact key:value pairs later in the flow.
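For example, if the CSV header row named the columns Component and RawMolePct (those names are just placeholders here; use whatever you set in the CSV node config), a function node after the CSV node could pick fields out directly:

```javascript
// After the CSV node, each row arrives as a plain object keyed by the
// header names. Component / RawMolePct are hypothetical column names.
function extractReading(row) {
  return {
    component: row.Component,
    // depending on the CSV node options, values may arrive as strings
    rawMolePct: parseFloat(row.RawMolePct)
  };
}

// In a function node: msg.payload = extractReading(msg.payload);
```

This is why the CSV route is so much nicer: no substring surgery, just key lookups.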

Once you have extracted your gas name and raw mole %, you can then send them to the controller using the PAC nodes… You want the PAC write node:

I would probably just deal with them as variables, but as you can see, you could also build a table and send them that way.
Note, you need to be using 9.x firmware to use these nodes.

Hope that gives some ideas.
I’m more than sure there are other ways to do the same thing, but yeah, text file reading is very hard with a PAC, and while you could parse the raw string, it would need to be sliced up a lot due to the PAC’s 1024-character string limit…

Ok great, I think I follow you on this. This definitely gives me a place to start! I’ll start working on this soon and I’ll have some more follow up questions I’m sure.
Thanks so much for your time

Hi Nathan,
I patched together some pieces from some subroutines I have that should help get you going if you want to use PAC Control to parse the CSV file.
This code will extract the 3rd field from each row of the CSV file and stuff it into the string table called ParsedFieldResultsTable. It could be pretty close to working for you as-is, but I would suggest you modify it so it handles the line preceding the Component lines gracefully and loads all the columns you want in one read, rather than just one column at a time.

You may want to strip the carriage-return / line-feed from the end of the string containing the line (row) from the file. Typically I have a subroutine that strips all that junk, but I tossed in the TrimString function in this code example.
This code does not, but you may also want to strip the quotes off the parsed strings, so you have Nitrogen instead of “Nitrogen” .

Position = 3;   // 3rd field in line
Delimiter = 44; // comma
PassedFilename = "c:\temp\test.csv";

// reset the error result and string
ReturnStatus=0;
ReturnMsg="";


// create the read only filespec
sFilespec = "file:r," + PassedFilename;
SetCommunicationHandleValue(sFilespec, ch_File);
iResults = OpenOutgoingCommunication( ch_File ); // open the file handle 

if (iResults <> 0) then
  NumberToString(iResults, tempstring);
  ReturnMsg = "ERROR OPENING FILE: Error Code :" + tempstring + " " + PassedFilename ;
  ReturnStatus=1;   // file open error : file not found
else 

  // Initialize some values
  nLine=0;
  nCharsWaiting = GetNumCharsWaiting( ch_File );
  // Set the EOM to LF since each line ends with CRLF
  SetEndOfMessageTerminator( ch_File, 0xA );

  // Loop through all the entire file one line at a time
  while ( (nCharsWaiting > 0)  and (iResults == 0) )
    sLineInFile = "";
    iResults = ReceiveString(sLineInFile, ch_File);
    nLine = nLine + 1;                    // bump line number and save as string
    NumberToString(nLine, sLineNumber);
    // trim the ends of the string of non printable characters
    TrimString(sLineInFile, 3);  // remove spaces off both ends
    // skip line if there is nothing printable on the line
    if (sLineInFile=="") then
      // skip this line
    else
      if (iResults<0) then
        ReturnMsg = "ERROR READING FILE. Line # " + sLineNumber + " " + PassedFilename ;
        ReturnStatus=2;   // general line error
      else
        // ** PARSE THE LINE


        if(Position>0) then

          nCharPosition=-1;
          nDelimiterCount=0;
          nLastDelimiterPosition=-1;

          repeat
            IncrementVariable(nCharPosition);
            nCharPosition = FindCharacterInString(Delimiter, nCharPosition, sLineInFile);
            if(nCharPosition>=0) then
              IncrementVariable(nDelimiterCount);
              if(nDelimiterCount<Position) then
                nLastDelimiterPosition=nCharPosition;
              endif
            endif
          until(nCharPosition<0 or nDelimiterCount==Position);

          if(nCharPosition<0) then
            nCharPosition = GetStringLength(sLineInFile);
          endif

          if(nDelimiterCount+1<Position) then
            ParsedItem="";
          else
            GetSubstring(sLineInFile, nLastDelimiterPosition+1, nCharPosition-nLastDelimiterPosition-1, ParsedItem);
            TrimString(ParsedItem, 3);  // remove spaces off both ends
          endif
        else
          ParsedItem="";
        endif
        ParsedFieldResultsTable[nLine] = ParsedItem;
      endif
    endif
    // refresh the count so the loop exits cleanly at end of file
    nCharsWaiting = GetNumCharsWaiting( ch_File );
  wend
endif
CloseCommunication(ch_File);

@dp_engsberg Farout!!! This is very nice work.
Just to be clear, this is intended to be run on SoftPAC so it can access the file on the PC on the local hard drive?
From a quick review, the code seems to handle the limit of 1024 characters per string, correct?

Correct - I used SoftPAC for my test.

On an EPIC, with the filename coming from a string table, it might look more like: prFilename = "~/unsecured/" + Recipe_Filename_Table[id];

On a SNAP PAC using the microSD card, maybe something like this: prFilename = "/sdcard0/" + Recipe_Filename_Table[id];

The original code was a subroutine intended to pull a particular column out of a csv file. What we are looking for here is almost that. I think what is useful about it is once you have a piece of code that does the main things; like the file opening and error handling, and the parsing of a line - then you can modify it for your particular specific needs.
