Sunday, 18 August 2013

Datomic: robust settings to avoid timeouts and out-of-memory errors

I'm running the same Datomic system on a variety of architectures with
varying amounts of memory (1 GB to 16 GB). When I do bulk imports of data, I
frequently run into timeouts or out-of-memory errors.
After looking at the documentation, I happened upon this helpful document
(and this one), which seem to outline best practices for getting good
performance under heavy imports.
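
For context, the knobs those documents talk about live in the transactor
properties file. The values below are only illustrative for a small box
(roughly a 1 GB heap); they're my assumptions, not figures taken from the
docs:

    # transactor.properties -- illustrative values for a ~1 GB heap
    # (property names come from the stock Datomic transactor properties
    #  file; the specific numbers are assumptions, not recommendations)
    protocol=dev
    host=localhost
    port=4334

    # Keep the live memory index small so the transactor indexes to
    # storage frequently instead of accumulating novelty in RAM.
    memory-index-threshold=32m
    memory-index-max=256m

    # Bound the object cache so it can't crowd out the memory index.
    object-cache-max=128m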
I'm not as interested in performance as I am in making imports "just
work." This leads to my main question:
What is the minimum complexity configuration to ensure that an arbitrarily
large import process terminates on a given machine?
I understand that this configuration may be a function of my available
memory; that's fine. I also understand that it may not be maximally
performant; that's also fine. But I do need to know that it will terminate.
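
Independent of the transactor settings, one thing that seems to keep memory
bounded on the peer side is transacting in small batches and blocking on
each transaction before sending the next, so novelty never piles up faster
than the transactor can index it. A rough sketch of what I mean (the URI
and batch size are placeholders for my setup):

    ;; Batched, back-pressured import sketch.
    (require '[datomic.api :as d])

    (def uri "datomic:dev://localhost:4334/import-test") ; placeholder
    (def conn (d/connect uri))

    (defn import-all!
      "Transact tx-data (a seq of entity maps or list forms) in
       fixed-size batches, blocking on each transaction so work never
       piles up on the transactor."
      [conn tx-data batch-size]
      (doseq [batch (partition-all batch-size tx-data)]
        ;; deref the future returned by d/transact so the next batch is
        ;; only sent once this one has been durably applied
        @(d/transact conn (vec batch))))

    ;; usage: (import-all! conn my-tx-data 1000)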
