Ruby on Rails performance on lots of requests and DB updates per second
I'm developing a polling application that will deal with an average of 1000-2000 votes per second coming from different users. In other words, it will receive 1k to 2k requests per second, with each request making a DB insert into the table that stores the voting data.

I'm using Rails 4 with MySQL, and planning to deploy to Heroku or AWS.

What performance issues related to the database and the application should I be aware of?

How can I handle that amount of inserts per second into the database?
Edit:

I was thinking of not inserting into the DB on each request, but instead writing the insert data to a memory stream, and having a scheduled job run every second that reads the memory stream and generates one mass insert, avoiding making each insert atomically. But I can't think of a nice way to implement this.
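A minimal sketch of that buffering idea, assuming the `redis` gem as the shared in-memory store and a `votes` table with `poll_id`, `choice`, and `created_at` columns (all of these names are hypothetical). Rails 4 has no bulk-insert helper, so the flush step builds one multi-row INSERT by hand:

```ruby
require "redis"
require "json"

REDIS = Redis.new

# Controller side: push the raw vote onto a Redis list instead of hitting MySQL.
def record_vote(poll_id, choice)
  REDIS.rpush("votes:buffer", { poll_id: poll_id, choice: choice }.to_json)
end

# Scheduled job (run every second): drain the buffer and issue one multi-row INSERT.
def flush_votes(batch_size = 5000)
  rows = []
  batch_size.times do
    raw = REDIS.lpop("votes:buffer")
    break unless raw
    rows << JSON.parse(raw)
  end
  return if rows.empty?

  conn = ActiveRecord::Base.connection
  values = rows.map do |r|
    "(#{conn.quote(r['poll_id'])}, #{conn.quote(r['choice'])}, NOW())"
  end.join(", ")

  conn.execute("INSERT INTO votes (poll_id, choice, created_at) VALUES #{values}")
end
```

Using a Redis list rather than a per-process in-memory array means multiple web dynos can share one buffer, and RPUSH/LPOP are atomic. The trade-off: votes popped but not yet inserted can still be lost if the job crashes mid-flush.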
While you can do what you need in AWS, that high level of I/O will probably cost you. RDS can support up to 30,000 IOPS; you can also use multiple EBS volumes in different configurations to support high I/O if you want to run the database yourself.
Depending on your planned usage patterns, I would look at pushing this into an in-memory data store, such as Memcached or Redis, and processing the requests from there. I would also look at DynamoDB, which might work well depending on how your data is structured.
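If you go the DynamoDB route, a write with the `aws-sdk-dynamodb` gem looks roughly like this; the table name, key schema, and region below are assumptions for illustration, not anything from the question:

```ruby
require "aws-sdk-dynamodb"
require "time"

client = Aws::DynamoDB::Client.new(region: "us-east-1") # region is an assumption

# Each vote becomes one item; write throughput is provisioned on the table
# itself, so there is no single MySQL instance to saturate.
client.put_item(
  table_name: "votes",                  # hypothetical table
  item: {
    "poll_id"  => "42",                 # assumed partition key
    "voted_at" => Time.now.utc.iso8601, # assumed sort key
    "choice"   => "option_a"
  }
)
```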
Are you going to have this level of sustained throughput consistently, or will it come in bursts? Do you absolutely have to preserve every single vote, or do you only need summary data? How far do you need to scale - i.e. will you ever see 20,000 votes per second? 200,000?

These types of questions will help determine the proper architecture.
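For example, if summary data turns out to be enough, the write path can shrink to a single atomic counter increment with no insert at all; a sketch, again assuming Redis and hypothetical key names:

```ruby
require "redis"

REDIS = Redis.new

# Per request: one atomic hash-field increment, no SQL at all.
def count_vote(poll_id, choice)
  REDIS.hincrby("poll:#{poll_id}:totals", choice, 1)
end

# Read (or periodically persist) the running totals.
def totals(poll_id)
  REDIS.hgetall("poll:#{poll_id}:totals") # => {"option_a"=>"1234", ...}
end
```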
ruby-on-rails performance heroku amazon-web-services