Rails/Ruby... 50MB string performance
Added at: 2016-12-19 04:12
I'm trying to send 1MM (one million) rows' worth of data to the browser.
Unfortunately this is a situation where I can't use pagination.
The actual raw PG query takes only 3 seconds. I'm interacting with
I originally was calling
The problem is that now the
Is there a better way I can send this much data in a response with a quick conversion to a string?
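One direction I've been sketching (not sure it's the right one): instead of materializing a single 50MB string, build the body lazily, one CSV line at a time. Rack accepts any object that responds to `#each` as a response body, so an `Enumerator` can be handed back directly. The method name `csv_body` and the sample rows here are just placeholders for however the rows actually come out of Postgres:

```ruby
require "csv"

# Build the response body lazily: each call to the enumerator yields one
# CSV line, so at no point does the whole 50MB exist as a single string.
def csv_body(rows, headers)
  Enumerator.new do |yielder|
    yielder << CSV.generate_line(headers)
    rows.each { |row| yielder << CSV.generate_line(row) }
  end
end

# Each yielded chunk is one line; memory stays flat regardless of row count.
body = csv_body([[1, "a"], [2, "b"]], %w[id name])
body.each { |chunk| print chunk }
```

In a Rails controller this would pair with batched fetching (e.g. `find_each` or a raw cursor) so the rows themselves are never all in memory at once either.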
The other option I'm going to look into is using a job to push this data to S3 and then streaming the raw data file to the browser from S3, but that still doesn't change the fact that I'll have to convert it all to a string before sending it to S3, and I assume that will be extremely slow.
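On the S3 option: the string conversion might be avoidable by spooling the CSV to disk in chunks and handing the file handle to the S3 client (e.g. `put_object(body: file)` in aws-sdk-s3 accepts an IO), so the 50MB never lives in the Ruby heap as one string. A minimal sketch, with the row source again a placeholder:

```ruby
require "csv"
require "tempfile"

# Spool rows to a temp file one line at a time instead of concatenating
# one giant string; the OS buffers the writes, and the resulting IO can
# be streamed to S3 without a full in-memory copy.
def spool_csv(rows, headers)
  file = Tempfile.new(["export", ".csv"])
  file.write(CSV.generate_line(headers))
  rows.each { |row| file.write(CSV.generate_line(row)) }
  file.rewind
  file
end

file = spool_csv([[1, "a"], [2, "b"]], %w[id name])
puts file.read
```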
I'm a bit confused, honestly, because it's a lot of records, but it's still only 50MB. I don't get why this requires something like 400MB of memory (my Heroku dynos are marked as "vastly exceeding bandwidth limit"). How come Rails/Ruby/the dyno isn't intelligent about its memory when trying to stream a response this big? It literally just crashes itself...
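For what it's worth, part of the 50MB-becomes-400MB gap is probably just Ruby's per-object overhead: each row deserializes into a hash or array of individual String objects, each with its own object header, so for short values the bookkeeping can dwarf the payload (intermediate copies during string concatenation add more on top). The stdlib `objspace` extension makes this visible; the sample row below is made up:

```ruby
require "objspace"

# Compare the raw payload bytes of one row against what the same row
# actually occupies on the Ruby heap as a hash of short strings.
row = { "id" => "12345", "name" => "alice", "email" => "alice@example.com" }

payload_bytes = row.sum { |k, v| k.bytesize + v.bytesize }
heap_bytes    = ObjectSpace.memsize_of(row) +
                row.sum { |k, v| ObjectSpace.memsize_of(k) + ObjectSpace.memsize_of(v) }

puts "payload: #{payload_bytes} bytes, on-heap: #{heap_bytes} bytes"
```

Multiply that per-row overhead by a million rows and a 4-8x blowup over the raw data size stops looking mysterious.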