ruby on rails - Is it better to do direct table loads in a high performance application?
I'm using PostgreSQL in a Rails 3.2 application that receives updates from a 3rd party all day long. The 3rd party throws around 2,000 requests a minute at the application, each update consisting of a big XML file.
Right now I'm storing basic info from each XML file in a table. Then a background process picks up big chunks of the info in that table and copies them to another table using PostgreSQL's COPY feature.
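A minimal sketch of that pipeline, under my assumptions: a hypothetical raw_updates staging table and updates target table, and the pg gem's copy_data API to stream the drained chunk through COPY FROM STDIN:

    require "pg"

    conn = ActiveRecord::Base.connection.raw_connection

    conn.exec("BEGIN")
    # Drain a chunk from the staging table; RETURNING hands back the deleted rows.
    rows = conn.exec("DELETE FROM raw_updates RETURNING source_id, payload")

    # Stream the chunk into the target table in a single COPY
    # instead of one INSERT per record.
    conn.copy_data("COPY updates (source_id, payload) FROM STDIN WITH (FORMAT csv)") do
      rows.each do |row|
        payload = row["payload"].gsub('"', '""')  # CSV-escape embedded quotes
        conn.put_copy_data(%(#{row["source_id"]},"#{payload}"\n))
      end
    end
    conn.exec("COMMIT")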
Am I doing the right thing or the wrong thing here? The table the load targets is also a major CRUD target of the UI. Does the COPY feature lock the entire table while the load happens, and should I be doing a bunch of INSERTs instead? I thought inserts were expensive, but if the direct load locks the whole table, that's going to be a problem.
COPY is the lowest-level way to mass-insert records into PostgreSQL. Your solution of post-processing the records in a background job is sound.
Alternatively, if you need that performance but want to keep Rails/Ruby functionality, consider the activerecord-import gem. It performs mass insertions and allows ActiveRecord callbacks and validations to be used where needed. If you use it for the post-processing of mass-copied records, you may see a significant performance increase.
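For example, a sketch using activerecord-import, assuming a hypothetical Update model with source_id and payload columns and reusing the drained rows from the sketch above:

    require "activerecord-import"

    columns = [:source_id, :payload]
    values  = rows.map { |row| [row["source_id"], row["payload"]] }

    # import builds a single multi-row INSERT rather than one INSERT per record;
    # validate: true runs ActiveRecord validations on each row before insertion.
    Update.import columns, values, validate: true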
Here's an article on using activerecord-import: http://ruby-journal.com/how-to-import-millions-records-via-activerecord-within-minutes-not-hours/
And this is what the Postgres team recommends for optimal import performance: http://www.postgresql.org/docs/current/interactive/populate.html
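As an illustration only, here are a couple of those tips applied in Ruby; note that dropping indexes suits offline bulk loads, not a table the UI is actively reading, and the index and table names here are hypothetical:

    conn = ActiveRecord::Base.connection

    # Load without the index, then rebuild it once; a single index build is
    # cheaper than maintaining the index incrementally across millions of rows.
    conn.execute("DROP INDEX IF EXISTS index_updates_on_source_id")
    # ... run the COPY / mass insert here ...
    conn.execute("CREATE INDEX index_updates_on_source_id ON updates (source_id)")

    # Refresh planner statistics after the bulk load.
    conn.execute("ANALYZE updates")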
ruby-on-rails ruby-on-rails-3 rails-postgresql postgresql-9.3