Calculation for log database
Per a customer requirement in one project, we need to store data such as daily production and power consumption (1-2 daily sample values) in the form of a data log. Regarding this, we have the following queries:
1. Are logs reliable for storing our data for years (5-10 years)?
2. How do we calculate the required storage size?
3. Which is better, logs or SQL?
We are using computers with a RAID configuration.
Voted best answer
A lab data log in Basic History can hold a user-defined number of samples; each sample consumes about 24 bytes of disk space. The maximum size is not defined, but it can be compared with regular logs: for example, a log with a 1-second minimum time will store 86,400 samples per day (~2 MB of disk space).
Basic History before 5.1 RevD RU5 is limited to 32-bit addressing, and total storage should not exceed 10,000 files or 50 GB (this is a rule of thumb; there are cases where less is recommended, e.g. if many secondary logs are deployed). After that rollup, Basic History can be trusted with somewhat more data.
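To answer query 2, the sizing arithmetic from the figures above can be sketched in a few lines. This is a rough estimate only (the 24 bytes/sample figure is approximate, and file-system overhead is ignored); the function names are my own, not part of any product:

```python
# Rough Basic History storage estimate: ~24 bytes per sample on disk.
BYTES_PER_SAMPLE = 24

def daily_bytes(samples_per_day: int) -> int:
    """Approximate disk space consumed per day by one log."""
    return samples_per_day * BYTES_PER_SAMPLE

def retention_bytes(samples_per_day: int, years: float) -> int:
    """Approximate total space for the given retention period."""
    return int(daily_bytes(samples_per_day) * 365 * years)

# A 1-second log stores 86,400 samples/day -> ~2 MB/day, ~7.5 GB over 10 years.
print(daily_bytes(86_400))          # 2073600 bytes (~2 MB per day)
print(retention_bytes(86_400, 10))  # 7568640000 bytes (~7.5 GB over 10 years)

# The lab case in the question: 2 samples/day for 10 years is negligible.
print(retention_bytes(2, 10))       # 175200 bytes (~171 KB)
```

As the last line shows, 1-2 daily values over a decade amount to well under a megabyte, far below the limits discussed above.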
You can also deploy a Basic History service group solely for your lab needs and place it on the aspect servers. The History Source aspect controls which Service Group caters for the Log Configurations below it, so you can create a dedicated Basic History service purely for lab data.
An asynchronous Oracle-based log in IM History can hold up to 50,000 samples; each sample allocates roughly 100 bytes from the Oracle tablespace (datafile).
(Page 249 in 3BUF001092-600 B lists the sizes allocated by IM logs).
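The same kind of estimate applies to the IM History alternative. A minimal sketch, assuming the ~100 bytes/sample and 50,000-sample figures above (the function and constant names are illustrative, not from the product):

```python
IM_BYTES_PER_SAMPLE = 100  # rough Oracle tablespace cost per sample
IM_MAX_SAMPLES = 50_000    # capacity of one asynchronous Oracle-based IM log

def im_log_bytes(samples: int) -> int:
    """Approximate tablespace allocated by one IM log holding `samples` samples."""
    if samples > IM_MAX_SAMPLES:
        raise ValueError("exceeds the 50,000-sample capacity of a single IM log")
    return samples * IM_BYTES_PER_SAMPLE

# 2 samples/day for 10 years = 7,300 samples -> ~0.7 MB, well within one log.
print(im_log_bytes(2 * 365 * 10))  # 730000 bytes
```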
With as few as 1-2 daily sample values, a Basic History lab data log will take you a very long way.
I have posted examples here on AKS in the past showing how to use the Calculation service to "inject" data into a lab log. I suggest trying the Calculation/Basic History solution as a starter.