Copy CSV file to PAC R1 String Table and then back

Ok, I am stuck…been a while since I used NR.

I am trying to do a simple read of a CSV file, then write it to an R1 string table, and after that I'll try to write the string table back out as a CSV file.

As you can see, I have the copy-file-to-sTable flow set up. It runs without error if I use write: value,
and the debug window looks like all the data is being read and written.

Here are the other sub-windows of the read file and PAC Write nodes:



What am I doing wrong? I have the API key installed in the PAC R1, and it is not writing to the table at all.

Two tips for working with Node-RED.

  1. Don't eat the elephant. Break the flow into chunks, even if it's just two nodes at a time.
  2. Debug nodes. Debug nodes on everything. Make sure to select 'complete msg object' on each debug node so you can really see what's going on.

In that vein: do you really want the CSV to go line by line into the R1 string table?
If so, OK, what you have looks roughly right for that part.

Don’t proceed until you are sure that all of the table is being read correctly and written to the R1.
You may need to use a split node that splits on \n, or you might just be trying to input a very, very long string (the whole file) into the PAC Write node and that's why it's not writing it… you're overflowing the write node buffer.
The read file node's 'a msg per line' option may not be splitting it at the \n for each line. It depends on how the file is written (Windows newline vs Node-RED newline vs PAC R1 newline).

Put a debug node at the output of the read file node and see what’s actually coming out.
Click on the red string in the debug tab and see if it's getting split or if it really is just one long massive CSV string.
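
If it does turn out to be one long string, a function node along these lines (just a rough sketch, not tested against your file) will split it into one array element per line, whichever newline style the file uses:

// Sketch: turn the whole-file string into an array with one entry per line.
// Assumes the file-in node is set to output a single utf8 string in msg.payload.
var lines = msg.payload.split(/\r?\n/).filter(function (line) {
    return line.length > 0;   // drop any trailing blank line
});
msg.payload = lines;
return msg;

A debug node right after that should then show an array with one entry per CSV row.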


Beno, thanks for the help.

Yes, unless there is a better way to do this. I just need to transmit the CSV file to the string table. It always errors if I use message payload; Value seems to be the only way.

I am not getting errors, so my impression is that the R1 is not accepting the transmission. Will I see any error codes in Node-RED related to the connection? I don't see any errors on the R1 side, so how do I troubleshoot the connection itself?

Also, does the PAC Write node need to see the incoming message in JSON, or does it expect only raw data?

OK, I just tried changing the 'A' to 'a' in Admin (name), and I see that it errors in Node-RED.

So that tells me that the Node-RED side thinks everything is OK and that the R1 is accepting the write, but I'm still not getting anything in the string table.

I think the issue is that you are not telling the PAC Node what values to use.

If you want to really dive into the subject, this is the post to read:

Thanks so much for the help, Ben.

It's not sinking in very fast, so bear with me. Also, I've avoided JSON all these years because it looks so cryptic… although I'm sure you are having a good laugh, but that's OK…

After days of stumbling around in the dark even with the flashlight you gave me, I finally managed to get a write, albeit by typing it into the function block. It did write the following strings into the table.

msg.payload = {
    tagName : "P_Parameters_sTable",
    tablestartindex : 0,
    values : [ "string 1",
               "string 2",
               "string 3",
               "string 4" ]
};
return msg;

So, I get what you’re trying to tell me, but now I am trying to figure out how I tell it to put all the csv messages into each index in place of the “string 1”, etc.

You cannot have the write node set to “Value”; what that does is insert a constant value from that field in the node config — which is currently an empty string, and that explains why you’re seeing nothing written.

Try using a change node before the PAC write node to move the string into a subproperty of payload, something like msg.payload.data:

Then, in the PAC write node, change the “Value” field to take that msg.payload.data:
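
If the change node feels opaque, the equivalent written as a function node is roughly this (just a sketch of the same idea, dropped in where the change node would go):

// Move the incoming string under a subproperty of payload.
var line = msg.payload;        // the string coming out of the file node
msg.payload = { data: line };  // now msg.payload.data holds that string
return msg;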

I can expand more on this, but can you just check if that gets you over the initial hurdle of getting something into the table?

Last note is to reiterate @Beno's comment on having complete msg object debugs at every stage so you can see what data is going IN to each node as well as what comes out. This really helps both you and us understand what's happening to the data throughout the flow:


EDIT:
Just re-read your comment about JSON appearing cryptic. Have you seen this video? Might be worth a watch to help demystify what's going on with the JSON structure.


I did as you said, but apparently the read file node is pushing the strings just the way they need to look in the table, and the change node is putting each message into a data object, but it looks like there is no indexing per object.

The write node has a problem with the message strings because they are not in JSON, so it is trying to parse the string and errors on the second column in the string.

So it looks like I need more rules in the move node.

These debug nodes are not set to display the complete msg object; you can see from debug 6 that there’s just “msg.payload” with an empty value in it, so you’re not starting off with what we expected. Can you double check the config of the debug nodes and the file node?


Just so we are clear, I am only trying to transmit each string record, or line, AS IS, not trying to format it, which is why the CSV node doesn't work: it is trying to format the data.

The CSV node is most frequently used with a join node, but I was trying to skip that part for you.
I understand what you are trying to do. I have done it a few times myself.
Just trying to ease your process.
Pop the debug nodes into complete message object and run it again with the screenshots, and Terry and I should be able to jump-start you.

EDIT: Don't forget to click on the red string in the debug tab to be sure to see where it's splitting the CSV.

OK. I see that the object in debug 1 is null. Rather, the payload is null.

The payload is not null; it's an object with a property that's an empty string. "null" will either not appear or will appear as undefined.

Can you confirm the first debug in the chain, called "debug 6", is set to display the complete msg object? It still seems the file output isn't giving the expected output.

Thanks for that correction, and sorry about the debug; here it is now with the file read node set to a single utf8 string.




Right now I have two identical strings in the file, and the data now contains the entire file, so I gather it is time to separate each line.

Terry is going to build an example flow here so you can just import it (or copy it).
One last question to help build that flow…
How is each row delimited in the CSV file?
\n or a lack of a , or…?

CR and LF at the end, which I assume is noted as \r\n?

So the delimiter is the end of line for each string.

Ok, I confirmed, it is 0D 0A in the file.


It might be easier to just output the entire file, split it into parts, and use the split node's "parts" property to get the exact index of each line, using the dynamic write settings described in the node's help tab (looks like you were using these properties earlier today?):

Here’s an example of what I have in mind:
With this config in the change node:

Import code for you to use as a starting point:
[{"id":"d559b214ef591178","type":"debug","z":"24fe2b4cae981e74","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":530,"y":580,"wires":[]},{"id":"1198fe96ba53c4ef","type":"split","z":"24fe2b4cae981e74","name":"","splt":"\\n","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","x":530,"y":620,"wires":[["0aa43bacf59c325b","eb39a5fa5f6be293"]]},{"id":"0aa43bacf59c325b","type":"change","z":"24fe2b4cae981e74","name":"","rules":[{"t":"set","p":"payload.values","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload.tableStartIndex","pt":"msg","to":"parts.index","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":700,"y":620,"wires":[["bc9d360274ab03db"]]},{"id":"eb39a5fa5f6be293","type":"debug","z":"24fe2b4cae981e74","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":690,"y":580,"wires":[]}]


Here are the results.

The split node is processing what we are sending, so what are these errors?

Cannot set property of non-object type: payload.values and

Cannot set property of non-object type: payload.tableStartIndex

Ah… it must be that it doesn't like that msg.payload is being implicitly redefined from a string to an object. That's my bad. It should be possible to fix that by shifting payload to a different property, defining payload as an object, THEN setting the properties…

Can you replace the change node with this and test again?

[{"id":"0aa43bacf59c325b","type":"change","z":"24fe2b4cae981e74","name":"","rules":[{"t":"set","p":"data","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"{}","tot":"json"},{"t":"set","p":"payload.values","pt":"msg","to":"data","tot":"msg"},{"t":"set","p":"payload.tableStartIndex","pt":"msg","to":"parts.index","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":700,"y":620,"wires":[["bc9d360274ab03db"]]}]

So I got to here with:

[{"id":"0eef93d3778166a3","type":"pac-write","z":"fa325f40e2e7e519","device":"cd198602403d769b","dataType":"string-table","tagName":"P_Parameters_sTable","tableStartIndex":"","value":"","valueType":"value","name":"Write to sTable","x":580,"y":380,"wires":[["2ebf8c6a8b7fdba2"]]},{"id":"bee10f1ff147ae74","type":"inject","z":"fa325f40e2e7e519","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":200,"y":220,"wires":[["09e8e5e3b4cd082a"]]},{"id":"28144f3a443d0343","type":"debug","z":"fa325f40e2e7e519","name":"debug 1","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":340,"y":120,"wires":[]},{"id":"2ebf8c6a8b7fdba2","type":"debug","z":"fa325f40e2e7e519","name":"debug 4","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":780,"y":380,"wires":[]},{"id":"8f082f7bc82d7153","type":"debug","z":"fa325f40e2e7e519","name":"debug 3","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":860,"y":120,"wires":[]},{"id":"82063e23bc14830d","type":"split","z":"fa325f40e2e7e519","name":"","splt":"\\n","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","property":"payload","x":630,"y":220,"wires":[["573245e292ce71ae","e260c8b67b8e9335"]]},{"id":"09e8e5e3b4cd082a","type":"file in","z":"fa325f40e2e7e519","name":"Read Parameters File","filename":"C:\\Sensia Data\\Parameters.csv","filenameType":"str","format":"utf8","chunk":false,"sendError":false,"encoding":"utf8","allProps":false,"x":400,"y":220,"wires":[["28144f3a443d0343","82063e23bc14830d"]]},{"id":"573245e292ce71ae","type":"debug","z":"fa325f40e2e7e519","name":"Debug 2","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":640,"y":120,"wires":[]},{"id":"e260c8b67b8e9335","type":"change","z":"fa325f40e2e7e519","name":"","rules":[{"t":"set","p":"data","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"{}","tot":"json"},{"t":"set","p":"payload.values","pt":"msg","to":"data","tot":"msg"},{"t":"set","p":"payload.tableStartIndex","pt":"msg","to":"parts.index","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":860,"y":220,"wires":[["8f082f7bc82d7153","0eef93d3778166a3"]]},{"id":"cd198602403d769b","type":"pac-device","address":"10.1.1.15:80","protocol":"http","msgQueueFullBehavior":"REJECT_NEW"}]