[MindConnect LIB]: Why does MCL have a data source upload size limitation?

Experimenter

Hi, 

I'm working on a data acquisition project for a customer, using MCL to upload to MindSphere. The field device in the factory has about 500 variables. I have run into two data source upload issues:

1. I have about 500 data points. But when I put the data source configuration into the store, the HTTP handler can only send about 100 data points at a time. Worse, once the store content is sent successfully, the communication handler automatically releases it, including the created data source configuration resource, so I can't reuse the same data source configuration to send the remaining 400 data points. Does this mean one MCL client only supports 100 data points?

2. I also tried creating another data source configuration resource after the last successful send, copying the same data source configuration ID into the new one. I then divided the points into data sources of 90 points each, each with a different data source name, and sent them one by one. Strangely, Asset Manager/Configuration only shows one data source: each newly uploaded data source configuration overrides the existing one, even though the data sources have different names. (A sketch of what I expected to work follows after this list.)
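For reference, here is roughly the structure I would expect to work: build ONE data source configuration containing all data sources and data points, then upload it with a single exchange. This is only a sketch; apart from mcl_communication_exchange and mcl_store_destroy (which the documentation confirms), the function names and signatures below are my best reading of the MCL samples and need to be checked against the library headers.

#include <stdio.h>
#include "mcl/mcl.h"

#define TOTAL_POINTS      500
#define POINTS_PER_SOURCE 90

/* Sketch only: builds one data source configuration holding every data
 * source and data point, then uploads it with a single exchange. All
 * signatures except mcl_communication_exchange/mcl_store_destroy are
 * assumptions based on the MCL samples. */
E_MCL_ERROR_CODE upload_whole_configuration(mcl_communication_t *communication)
{
    mcl_store_t *store = MCL_NULL;
    mcl_data_source_configuration_t *configuration = MCL_NULL;
    mcl_data_source_t *data_source = MCL_NULL;
    E_MCL_ERROR_CODE code;

    code = mcl_store_initialize(MCL_FALSE, &store);  /* assumed signature */

    if (MCL_OK == code)
    {
        code = mcl_store_new_data_source_configuration(store, "1.0", &configuration);  /* assumed */
    }

    for (int i = 0; (MCL_OK == code) && (i < TOTAL_POINTS); ++i)
    {
        /* Open a new data source inside the SAME configuration for every block of 90 points. */
        if (0 == (i % POINTS_PER_SOURCE))
        {
            char source_name[32];
            snprintf(source_name, sizeof(source_name), "source_%d", i / POINTS_PER_SOURCE);
            code = mcl_data_source_configuration_add_data_source(configuration, source_name,
                                                                 "field device block",
                                                                 MCL_NULL, &data_source);  /* assumed */
        }

        if (MCL_OK == code)
        {
            char point_id[32];
            char point_name[32];
            snprintf(point_id, sizeof(point_id), "dp_%d", i);
            snprintf(point_name, sizeof(point_name), "variable_%d", i);
            code = mcl_data_source_configuration_add_data_point(configuration, data_source,
                                                                point_id, point_name,
                                                                "field variable", "DOUBLE",
                                                                "unit", MCL_NULL);  /* assumed */
        }
    }

    /* One exchange uploads the whole configuration; on MCL_OK it is
     * automatically removed from the store (per the documentation). */
    if (MCL_OK == code)
    {
        code = mcl_communication_exchange(communication, store, MCL_NULL);
    }

    mcl_store_destroy(&store);
    return code;
}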

Could the development team give me some advice?

Thanks!

1 REPLY

Re: [MindConnect LIB]: Why does MCL have a data source upload size limitation?

Experimenter
I have the same problem. I uploaded about 300 data points and MCL returned MCL_OK, but the 300 data points cannot be seen on the MindSphere cloud platform. When I reduced the upload to 200 data points, it worked and I can see the 200 data points in MindSphere.
The documentation says that once "mcl_communication_exchange" is called, the uploaded content is removed from the store:
"• Call the mcl_communication_exchange function to upload the store containing the data source configuration.
Make sure the data source configuration is uploaded before the first time series upload. The data source configuration is uploaded only once per agent as long as it does not change.
If the exchange is successful (i.e. the function returns MCL_OK), the data source configuration is automatically deleted from the store. You are expected to destroy the store using the mcl_store_destroy function if you don't intend to use it for further exchange operations.
After the data source configuration upload, you need to do data point mapping using MindSphere Launchpad before time series data can be uploaded."
Documentation link: https://developer.mindsphere.io/resources/mindconnect-lib/resources-mclib-samples.html
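In other words, the documented happy path is just: exchange, check the return code, destroy the store. A minimal sketch (the communication and store objects are assumed to be initialized elsewhere):

E_MCL_ERROR_CODE code = mcl_communication_exchange(communication, store, MCL_NULL);

if (MCL_OK == code)
{
    /* The data source configuration is now uploaded and has been
     * removed from the store automatically (per the docs above). */
    mcl_store_destroy(&store);
}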

I have more than 1000 data points to upload, and the MCL library apparently cannot upload 1000 data points in one exchange. It seems the HTTP request payload is limited to about 1 MB.
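Until someone confirms the actual limit, the only workaround I can think of is to split the upload into batches so that each HTTP request stays small. A rough sketch: add_batch_to_store() is a hypothetical placeholder for your own code that fills the store with one batch of values (e.g. via mcl_store_new_time_series), and the batch size of 200 is simply the value that worked for me above.

#include "mcl/mcl.h"

#define BATCH_SIZE 200

/* Hypothetical helper: fills the store with data points [offset, offset + count). */
E_MCL_ERROR_CODE add_batch_to_store(mcl_store_t *store, int offset, int count);

E_MCL_ERROR_CODE upload_in_batches(mcl_communication_t *communication, int total_points)
{
    E_MCL_ERROR_CODE code = MCL_OK;

    for (int offset = 0; (MCL_OK == code) && (offset < total_points); offset += BATCH_SIZE)
    {
        mcl_store_t *store = MCL_NULL;
        int remaining = total_points - offset;
        int count = (remaining < BATCH_SIZE) ? remaining : BATCH_SIZE;

        code = mcl_store_initialize(MCL_FALSE, &store);  /* assumed signature */

        if (MCL_OK == code)
        {
            code = add_batch_to_store(store, offset, count);  /* hypothetical */
        }

        if (MCL_OK == code)
        {
            /* One HTTP exchange per batch keeps each request under the apparent limit. */
            code = mcl_communication_exchange(communication, store, MCL_NULL);
        }

        mcl_store_destroy(&store);
    }

    return code;
}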

Can anyone help us? Thanks.