I set up an FTP server that the controller communicates with to send and receive run data. When the data is loaded for viewing (moved from the FTP server to local storage), it gets duplicated: a table with 50 cells saved into the file becomes 100 cells (each cell gets a copy).
You are correct; the reason I am clearing only 50 is that I set the table length to 50.
The 100 cells only show up in local storage. I have attached an example showing how the cells get duplicated; it is as if the table was transmitted twice when running the get: command.
How are you creating the Postcopy file? The logic you have posted only reads the data in, and since you only have a 50-length table, there will never be more than 50 values, yet you have 100 in your Postcopy file.
This is exactly the same thing that I am stuck on.
The Postcopy file is created by simply using FTP to get a copy of that file and save it to storage, and for some unknown reason, running the code above makes the Postcopy file have 100 cells.
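For what it's worth, here is a minimal sketch of what that "get a copy and save it to storage" step amounts to, written in Python with ftplib just to illustrate; the host, credentials, and file names are hypothetical, not from the actual setup:

```python
from ftplib import FTP

# Hypothetical host, credentials, and file names -- only to illustrate
# the "FTP get a copy of the file and save it to local storage" step.
with FTP("192.168.1.10") as ftp:
    ftp.login("user", "password")
    with open("Postcopy.csv", "wb") as local_file:
        # RETR streams the remote file once into the local copy.
        ftp.retrbinary("RETR rundata.csv", local_file.write)
```

A transfer like this copies the file byte for byte, so if the local copy ends up with twice the cells, the doubling is more likely happening on the side that reads/parses the file than in the transfer itself.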
I have tried that; every file command is now followed by a close of the communication, and it still duplicates the data.
It seems there is just a bug somewhere. I will try to work around it. Thank you for the input.
I have partly figured out the cause of the data duplication. Since I am using "," as the end-of-message terminator, the last value always has a trailing , with an empty space after it, which causes the controller to believe there is an additional cell and to keep looking for the next ,, resulting in it recycling through the file. However, I am not sure why it does this only twice and no more than that.
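For anyone hitting the same thing, here is a minimal sketch of the parsing pitfall, in Python purely for illustration (the values are made up; the controller's own parser is not Python, but the delimiter logic is the same):

```python
# Every value, including the last, is written with a trailing ","
# terminator, e.g. "10,20,30,".
raw = "10,20,30,"

# Naive split: the trailing "," yields a phantom empty cell at the end,
# which a reader that keeps scanning for the next "," can misinterpret.
print(raw.split(","))   # ['10', '20', '30', ''] -- 4 "cells", not 3

# Safer: strip whitespace and the trailing terminator before splitting.
cells = raw.strip().rstrip(",").split(",")
print(cells)            # ['10', '20', '30'] -- exactly 3 cells
```

Trimming the trailing terminator (or writing the file with "," as a separator between values rather than after each value) removes the phantom cell that sends the parser back through the file.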