From: Tim Churches
Subject: Re: [Gnumed-devel] Gnumed on Windows: the pain of pyPgSQL
Date: Mon, 12 Sep 2005 07:38:58 +1000
User-agent: Mozilla Thunderbird 1.0.6-1.1.fc3 (X11/20050720)
Karsten Hilbert wrote:
> For me the one reason to switch would be to gain performance.
> Psycopg is said to be the one that's fastest. So, *if at all*,
> I'd consider that one.

Is DB-API adaptor performance really an issue for GNUmed, given that you only load and update the records for one patient at a time? I doubt that the speed of the database adaptor has any measurable impact on that. If the database feels slow, it is more likely due to the schema design or inadequate indexing. The adaptor speed *does* matter when you run large aggregate queries over hundreds of thousands of records at a time - in which case I agree, pyPgSQL is a bit slow. But even in our public health surveillance application we get update speeds of several thousand records per second within a transaction, and we only have several thousand records to update at a time, so switching to a faster adaptor might save us perhaps a second per update run (every hour). In GNUmed's case, you might save a few milliseconds per transaction. The best thing to do is to measure the speed difference in some real-life transactions.
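To measure that, a minimal timing harness along these lines would do; the `run_txn` callable is a hypothetical stand-in for whatever real-life transaction you want to compare across adaptors (e.g. fetching one patient's records via pyPgSQL versus psycopg):

```python
import time

def time_transactions(run_txn, n=100):
    """Run the given transaction callable n times and return the
    mean wall-clock seconds per call."""
    start = time.perf_counter()
    for _ in range(n):
        run_txn()
    return (time.perf_counter() - start) / n

# Hypothetical usage: wrap the same query once per adaptor and compare.
#   mean_pypgsql = time_transactions(fetch_patient_via_pypgsql)
#   mean_psycopg = time_transactions(fetch_patient_via_psycopg)
# If the difference is a few milliseconds per call, the adaptor is
# not your bottleneck.
```

Running each candidate against the same database and the same workload keeps the comparison honest; anything else mostly measures network and server-side cost.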
Tim C