ClearSCADA

ClearSCADA is not designed to handle individual historic granules larger than approximately 1.3 MB for Historic Data or 5 MB for Alarm and Event Data (see the online help article "Operational Limitations - Server").
Exceeding these limits may cause extended read locking when the historic granules are loaded.

Where possible, such granules should be identified and the cause of the abnormally high historic storage rate rectified. It may also be advisable to remove the affected granules from the system, or to alter the stream size, to prevent performance degradation.

This article covers a few approaches to identifying large historic granules.

SQL Query

These queries list granule sizes and approximate file locations, ordered by size, including an approximate "% Rated Maximum Size" column.
They select data from the historic index tables CDBHISTORICFILE and CDBEVENTFILE without having to load the granules themselves.
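
The "% Rated Maximum Size" column in these queries is derived from fixed per-record sizes (the FILESIZE expressions assume roughly 32 bytes per raw historic record and 768 bytes per event record) measured against rated maximum record counts of 40320 and 6825 respectively. A minimal Python sketch of that arithmetic:

```python
# Approximate per-record sizes, taken from the FILESIZE expressions in the
# queries below (assumptions, not documented constants):
HISTORIC_RECORD_BYTES = 32   # raw historic record
EVENT_RECORD_BYTES = 768     # alarm/event record

def file_size_kb(record_count, record_bytes):
    """Estimate a granule's file size in KB from its record count."""
    return record_count * record_bytes / 1024.0

def percent_rated_max(record_count, rated_max_records):
    """Record count as a percentage of the rated maximum."""
    return record_count / rated_max_records * 100.0

print(file_size_kb(40320, HISTORIC_RECORD_BYTES))  # 1260.0 (KB)
print(percent_rated_max(20160, 40320))             # 50.0
```

At the rated maximum of 40320 records, a raw historic granule works out at 1260 KB, which is in line with the 1.3 MB limit noted above.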

Find large historic granules in the Raw Historic stream.

SELECT TOP( 50 )
    RECORDCOUNT AS "_RECORDCOUNT"
, FORMATVALUE( RECORDCOUNT / 40320.0 * 100.0 USING '0.00' ) || '%' AS PERCENTRATEDMAXSIZE
, 'Hd' || 'I' || FORMATVALUE( ( OBJECTID / 256 ) USING '00000' ) || '\' || FORMATVALUE( OBJECTID USING '000000' ) || '\WK' || FORMATVALUE( GRANULE USING '0000000' ) || '.HRD' AS RELATIVEHISTORICPATH
, OBJECTID AS OBJECTID
, STARTTIME AS PERIODSTARTING
, FORMATVALUE( ( ( RECORDCOUNT * 32 ) / 1024.0 ) USING '0' ) || ' KB' AS FILESIZE
, CDBPOINT.FULLNAME
, CDBPOINT.ID
FROM
    CDBHISTORICFILE LEFT JOIN CDBPOINT ON CDBHISTORICFILE.OBJECTID = CDBPOINT.ID
ORDER BY
    "_RECORDCOUNT" DESC

Find large historic granules in the Event stream (assuming 256 objects per stream).

SELECT TOP( 50 )
    RECORDCOUNT AS "_RECORDCOUNT"
, FORMATVALUE( RECORDCOUNT / 6825.0 * 100.0 USING '0.00' ) || '%' AS PERCENTRATEDMAXSIZE
, 'Jn' || 'I' || FORMATVALUE( STREAM USING '00000' ) || '\H' || FORMATVALUE( GRANULE USING '0000000' ) || '.HRD' AS RELATIVEHISTORICPATH
, CAST( ( STREAM * 256 ) AS STRING ) || ' - ' || CAST( ( ( STREAM + 1 ) * 256 - 1 ) AS STRING ) AS OBJECTIDRANGE
, STARTTIME AS PERIODSTARTING
, FORMATVALUE( ( ( RECORDCOUNT * 768 ) / 1024.0 ) USING '0' ) || ' KB' AS FILESIZE
FROM
    CDBEVENTFILE
ORDER BY
    "_RECORDCOUNT" DESC
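
The OBJECTIDRANGE column in the event query maps a stream number to the object IDs that stream covers. The same calculation in Python, again assuming 256 objects per stream:

```python
def objectid_range(stream, objects_per_stream=256):
    """First and last object ID covered by an event stream, mirroring the
    OBJECTIDRANGE expression above (the stream size is an assumption and
    may differ on a given server)."""
    first = stream * objects_per_stream
    last = (stream + 1) * objects_per_stream - 1
    return first, last

print(objectid_range(0))  # (0, 255)
print(objectid_range(3))  # (768, 1023)
```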

To analyse the distribution of event files per stream:

SELECT
  MIN( STREAM ) AS "STREAM No."
, COUNT( RECORDCOUNT ) AS "TotalFiles"
, SUM( RECORDCOUNT ) AS "TotalRecords"
, SUM( CASE  WHEN RECORDCOUNT > 7500 THEN 1 ELSE 0 END ) AS "7500 plus"
, SUM( CASE  WHEN RECORDCOUNT > 15000 THEN 1 ELSE 0 END ) AS "15000 plus"
, SUM( CASE  WHEN RECORDCOUNT > 30000 THEN 1 ELSE 0 END ) AS "30000 plus"
, SUM( CASE  WHEN RECORDCOUNT > 60000 THEN 1 ELSE 0 END ) AS "60000 plus"
FROM CDBEVENTFILE
GROUP BY STREAM
ORDER BY "STREAM No." ASC

To analyse the distribution of event files over time (grouped by day):

SELECT
  MIN( FORMATVALUE( STARTTIME USING 'YYYY-MM-dd' ) ) AS "Date"
, COUNT( RECORDCOUNT ) AS "TotalFiles"
, SUM( RECORDCOUNT ) AS "TotalRecords"
, SUM( CASE  WHEN RECORDCOUNT > 7500 THEN 1 ELSE 0 END ) AS ">7500"
, SUM( CASE  WHEN RECORDCOUNT > 15000 THEN 1 ELSE 0 END ) AS ">15000"
, SUM( CASE  WHEN RECORDCOUNT > 30000 THEN 1 ELSE 0 END ) AS ">30000"
, SUM( CASE  WHEN RECORDCOUNT > 60000 THEN 1 ELSE 0 END ) AS ">60000"
FROM CDBEVENTFILE
GROUP BY FORMATVALUE( STARTTIME USING 'YYYY-MM-dd' )
ORDER BY "Date" ASC

Batch Program

The following batch program can be used to find large historic granules.
It searches a specified directory and all of its subdirectories for files larger than a specified size.

:bof
@echo off & setlocal enableextensions

REM
REM Recursively finds all files larger than a specified number of bytes within a specified path.
REM
REM Usage:
REM   findlarge.bat "<path>" <size in bytes>
REM
REM Example 1:
REM   findlarge.bat "c:\Database\Historic" 2097152
REM Example 2:
REM   findlarge.bat "c:\Database\Historic" 2097152 > output.txt
REM
REM

:init

  for /f "tokens=*" %%f in ('dir "%~1" /b /s /o-d') do (
    if %%~zf GEQ %~2 echo %%~zf - %%~ff
  )
  endlocal & goto :end

:end

This is an example of the batch file being run.

C:\utilities\findlarge>findlarge.bat "c:\Users\All Users\Control Microsystems\ClearSCADA\Database\History" 2097152
5047040 - c:\Users\All Users\Control Microsystems\ClearSCADA\Database\History\Historic\HdI00001
7325184 - c:\Users\All Users\Control Microsystems\ClearSCADA\Database\History\Historic\HdI00000
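
Where Python is available, the same search can be done without a batch file. This is a minimal sketch, not ClearSCADA-specific, that walks a directory tree and reports files at or above a size threshold:

```python
import os

def find_large_files(root, min_bytes):
    """Yield (size, path) for every file under root that is at least min_bytes."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # file removed or inaccessible mid-scan
            if size >= min_bytes:
                yield size, path

# Example, matching the batch invocation above (2097152 bytes = 2 MB):
# for size, path in sorted(find_large_files(r"C:\ProgramData\Schneider Electric\ClearSCADA\Database\History", 2097152), reverse=True):
#     print(size, "-", path)
```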

"DIR /s"

"DIR /s" can be used to achieve similar results. Run the "DIR /s" command from the root of the historic folders and pipe the output to a file. The output looks something like this:

Directory of C:\ProgramData\Schneider Electric\ClearSCADA\Database\History\Historic\HdI00000\000040

03/05/2015  01:44 PM    <DIR>          .
03/05/2015  01:44 PM    <DIR>          ..
10/08/2014  10:27 AM               160 WK021589.HRD
03/05/2015  01:43 PM                32 WK021610.HRD
               2 File(s)            192 bytes

Directory of C:\ProgramData\Schneider Electric\ClearSCADA\Database\History\Historic\HdI00000\000049

10/08/2014  10:18 AM    <DIR>          .
10/08/2014  10:18 AM    <DIR>          ..
10/08/2014  10:17 AM                96 WK021589.HRD
               1 File(s)             96 bytes

This output could then be searched to identify large files.
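
The piped listing can also be filtered programmatically. A Python sketch that parses the DIR output format shown above (it assumes the US date style and plain byte sizes seen in the sample; adjust the pattern for other locales):

```python
import re

# Matches file lines in the DIR listing shown above, e.g.
# "10/08/2014  10:27 AM               160 WK021589.HRD"
FILE_LINE = re.compile(
    r"^\d{2}/\d{2}/\d{4}\s+\d{2}:\d{2} [AP]M\s+([\d,]+)\s+(\S+\.HRD)$"
)

def large_hrd_files(dir_listing, min_bytes):
    """Return (size, filename, directory) tuples for .HRD files over min_bytes."""
    results = []
    current_dir = None
    for line in dir_listing.splitlines():
        line = line.strip()
        if line.startswith("Directory of "):
            current_dir = line[len("Directory of "):]
            continue
        m = FILE_LINE.match(line)
        if m:
            size = int(m.group(1).replace(",", ""))
            if size >= min_bytes:
                results.append((size, m.group(2), current_dir))
    return results
```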

"FORFILES"

"Forfiles" can be used to list files modified before or after a certain date. Run the "Forfiles" command from the root of the historic folders and pipe the output to a file.

C:\>  forfiles /P "ProgramData\Schneider Electric\ClearSCADA\Database\Journal" /S /D +01/03/2015 /C "cmd /c echo @path,@fsize" > JournalFilesSince01032015.csv

This approach can be used when a SQL query is not an option.
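
Where "Forfiles" is not available, a similar date-filtered listing can be produced in Python. In this sketch the function name and output layout are illustrative, not part of ClearSCADA:

```python
import os
from datetime import datetime

def files_modified_since(root, since):
    """Yield (path, size) for files under root modified on or after `since`."""
    cutoff = since.timestamp()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file removed or inaccessible mid-scan
            if st.st_mtime >= cutoff:
                yield path, st.st_size

# Example, mirroring the forfiles command above (path and date are from
# that example):
# for path, size in files_modified_since(r"C:\ProgramData\Schneider Electric\ClearSCADA\Database\Journal", datetime(2015, 3, 1)):
#     print(f"{path},{size}")
```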