Optimizing Serialization for Large Data Sets

Posted By Pedram . 11 Years Ago
Nevron Support
Posted 11 Years Ago

Hi Pedram,

We could not replicate this issue. Can you send a zipped state file, serialized with PersistencyFormat.CustomXML, to support@nevron.com for review? Also, does XML serialization exhibit this problem as well?



Best Regards,
Nevron Support Team



Pedram .
Posted 11 Years Ago
We are trying to serialize large ChartControl files to disk by *clearing* all data points first and then saving the state, using the following code:

**********************
// Remove all data points from every series, keeping the series themselves.
foreach (NSeries series in Chart.Series)
{
    series.ClearDataPoints();
}

ChartControl.Document.Calculate();
ChartControl.Refresh();

// Save the control state in binary format, filtering out data arrays.
ChartControl.Serializer.SaveControlStateToFile(
    FileName, PersistencyFormat.Binary, new NDataSerializationFilter());
**********************

However, when we do this, the file size still varies with the number of data points. These files can be quite large (10-30 MB), and loading them again takes 30 seconds to 2 minutes, even though there should be no data points in the file. We want to keep all the series in the ChartControl but remove all their points before saving, so the file is smaller. Any thoughts on how we can do this?
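To illustrate the expectation here with a generic serializer (Python's pickle as a stand-in analogy, not Nevron's binary format or API): clearing the point collections while keeping the series objects themselves should shrink the serialized output dramatically, which is what makes the unchanged file size in the report above surprising.

```python
import pickle


class Series:
    """Stand-in for a chart series: a name plus a list of data points."""

    def __init__(self, name, points):
        self.name = name
        self.points = points


# Five series with 100,000 points each.
series = [Series(f"s{i}", list(range(100_000))) for i in range(5)]
full = pickle.dumps(series)

# Clear the points but keep the series objects themselves.
for s in series:
    s.points.clear()
empty = pickle.dumps(series)

# The cleared state serializes to a tiny fraction of the original size.
print(f"full: {len(full):,} bytes, cleared: {len(empty):,} bytes")
```

If a real serializer still produces large files after such clearing, the usual suspects are cached or derived data that the clear call does not touch, which is presumably what the data-point filter above is meant to address.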



