You might get the following error while reading a large JSON string with the JSON Source when the output is set to a raw document (a BLOB data type such as DT_TEXT, DT_NTEXT, or DT_IMAGE), or with any other component that outputs BLOB data types.
Error: System.Exception: BufferException: Error: There is not enough space on the disk.
When this error occurs, you will notice that free disk space slowly shrinks during execution until the error above is raised.
In most cases this is caused by buffer memory overflow: when BLOB data types are used, the SSIS engine may spill data to temporary BLOB storage on disk.
There are many performance-tuning options, as described in this article, but first try the obvious solutions below. They reduce pressure on the disk and flush buffers more often.
Go to the Data Flow designer > right-click and choose Properties > change DefaultBufferMaxRows from its default of 10000 to 100 (or any value below 10000) and run the package again. Keep reducing this value until the disk space error no longer appears.
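If you run packages from the command line, the same property can be overridden at execution time with dtexec's /SET option, without editing the package. This is a sketch: the package path and the task name "Data Flow Task" are examples; substitute your own.

```shell
REM Override DefaultBufferMaxRows at run time.
REM "LoadJson.dtsx" and "Data Flow Task" are example names.
dtexec /FILE "C:\Packages\LoadJson.dtsx" ^
  /SET "\Package\Data Flow Task.Properties[DefaultBufferMaxRows];100"
```

The property path must match the name of your data flow task exactly as it appears in the package.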
Clear some temp files to free up disk space and run the data flow again to see whether the error goes away.
Change BLOBTempStoragePath and BufferTempStoragePath to a location other than the default (the system TEMP folder, usually on the C: drive). This gives the SSIS engine more room for BLOB temp storage.
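These two properties can also be overridden from the command line with dtexec's /SET option. The package path, task name, and the D:\SSISTemp location below are examples; the directory must already exist and the account running the package needs write permission to it.

```shell
REM Redirect buffer spill locations to a drive with more free space.
REM "LoadJson.dtsx", "Data Flow Task", and D:\SSISTemp are example names.
dtexec /FILE "C:\Packages\LoadJson.dtsx" ^
  /SET "\Package\Data Flow Task.Properties[BLOBTempStoragePath];D:\SSISTemp" ^
  /SET "\Package\Data Flow Task.Properties[BufferTempStoragePath];D:\SSISTemp"
```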
Avoid BLOB data types in the output if possible (i.e. DT_NTEXT, DT_TEXT, DT_IMAGE).
Try the workaround explained at this link:
The DefaultBufferMaxRows and DefaultBufferSize Properties in SSIS
If you have any questions, feel free to contact us via live chat or email at email@example.com