Problem
You may get the following error while reading a large JSON string with the JSON Source when the output is configured as a RAW document, or with any other component that outputs BLOB data types (e.g. DT_TEXT, DT_NTEXT, DT_IMAGE).
Error: System.Exception: BufferException: Error: There is not enough space on the disk.
When this issue occurs, you will notice that during execution the free disk space slowly shrinks until the above error is raised.
Possible Cause
In most cases this happens because the data flow buffers overflow available memory. When BLOB data types are used, the SSIS engine may spool data to temporary BLOB storage on disk, and that storage can fill up.
Possible Solutions
There is a lot of performance tuning you can do, as described in this article, but first let's look at some obvious solutions you can try (see below).
Solution-1
Use the workaround below to reduce pressure on the disk by flushing buffers more often.
Go to the Data Flow designer > right-click and choose Properties > change DefaultBufferMaxRows from 10000 (the default) to 100 (or any value below 10000) and run the package again. You can keep reducing this limit until the disk space error no longer appears. If you run the package from the command line, the same property can also be overridden at run time, as shown in the sketch below.
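The following is a minimal sketch (not part of the product) showing how the override could be applied with dtexec's /SET option instead of editing the package. The package path C:\SSIS\Package.dtsx and the task name "Data Flow Task" are assumptions; adjust them to match your environment.

import subprocess

# Hypothetical package path and Data Flow Task name -- change to match your package.
package_path = r"C:\SSIS\Package.dtsx"
property_path = r"\Package\Data Flow Task.Properties[DefaultBufferMaxRows]"

# Override DefaultBufferMaxRows (here: 1000 rows) at run time, then execute the package.
# Lower the value further if the disk space error persists.
cmd = [
    "dtexec",
    "/FILE", package_path,
    "/SET", f"{property_path};1000",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
print(result.stderr)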
Solution-2
Clear some temp files to free up disk space, then run the data flow again and see whether the error goes away. A quick way to find the biggest space consumers is shown below.
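The short script below is only a helper sketch: it lists the largest files under the system TEMP directory so you can decide what is safe to delete before re-running the data flow. It does not delete anything.

import os
import tempfile

# Collect file sizes under the system TEMP directory.
temp_dir = tempfile.gettempdir()
files = []
for root, _, names in os.walk(temp_dir):
    for name in names:
        path = os.path.join(root, name)
        try:
            files.append((os.path.getsize(path), path))
        except OSError:
            pass  # skip files that are locked or already gone

# Print the 20 largest files, biggest first.
for size, path in sorted(files, reverse=True)[:20]:
    print(f"{size / (1024 * 1024):8.1f} MB  {path}")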
Solution-3
Change BLOBTempStoragePath and BufferTempStoragePath to something other than the default (i.e. the system TEMP folder, usually on the C: drive). This gives the SSIS engine more space for BLOB temp storage. A sketch of overriding these properties at run time follows.
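Both properties live on the Data Flow Task and can also be overridden at execution time. The sketch below assumes a hypothetical folder D:\SSIS_Temp on a drive with plenty of free space and the same hypothetical package path and task name as above; it verifies the available space before pointing both properties at the new folder.

import os
import shutil
import subprocess

# Hypothetical paths -- point these at a drive with plenty of free space.
blob_temp = r"D:\SSIS_Temp"
package_path = r"C:\SSIS\Package.dtsx"

# Make sure the folder exists and report how much space is available on that drive.
os.makedirs(blob_temp, exist_ok=True)
free_gb = shutil.disk_usage(blob_temp).free / (1024 ** 3)
print(f"Free space on BLOB temp drive: {free_gb:.1f} GB")

# Redirect both BLOB and buffer spooling away from the system TEMP drive, then run the package.
cmd = [
    "dtexec",
    "/FILE", package_path,
    "/SET", rf"\Package\Data Flow Task.Properties[BLOBTempStoragePath];{blob_temp}",
    "/SET", rf"\Package\Data Flow Task.Properties[BufferTempStoragePath];{blob_temp}",
]
subprocess.run(cmd)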
Solution-4
Avoid BLOB data types in the output if possible (i.e. DT_NTEXT, DT_TEXT, DT_IMAGE).
Solution-5
Try the workaround explained at the following link:
The DefaultBufferMaxRows and DefaultBufferSize Properties in SSIS
Contact Us
If you have more questions, feel free to contact us via live chat or email at support@zappysys.com.