Unicorn + Ruby on Rails

I have a Rails 4.2.1 app running with Unicorn as the app server. I need to give users the ability to download CSV data. I'm trying to stream the data, but when generating the file takes longer than the Unicorn timeout, Unicorn kills the worker mid-response.

Is there any way to solve this problem? My streaming code:

```ruby
private

def render_csv(data)
  set_file_headers
  set_streaming_headers

  response.status = 200
  self.response_body = csv_lines(data)
  Rails.logger.debug("end")
end

def set_file_headers
  file_name = "transactions.csv"
  headers["Content-Type"] = "text/csv"
  headers["Content-Disposition"] = "attachment; filename=\"#{file_name}\""
end

def set_streaming_headers
  # nginx doc: setting this to "no" allows unbuffered responses,
  # suitable for Comet and HTTP streaming applications
  headers["X-Accel-Buffering"] = "no"

  headers["Cache-Control"] ||= "no-cache"
  headers.delete("Content-Length")
end

def csv_lines(data)
  Enumerator.new do |y|
    # ideally you'd validate the params, skipping here for brevity
    data.find_each(batch_size: 2000) do |row|
      # placeholder output for testing; yield the real CSV columns here
      y << "jhjj" + "\n"
    end
  end
end
```
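
For context, the timeout in question is Unicorn's worker timeout. A minimal sketch of how it might be raised, assuming a `config/unicorn.rb` file (the path and values are assumptions, not part of the original setup):

```ruby
# config/unicorn.rb -- assumed location; adjust to your deployment
worker_processes 2

# Unicorn's worker timeout: the master kills any worker that spends longer
# than this (in seconds) on a single request. Raising it is a blunt
# workaround for long-running CSV streams.
timeout 300
```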

I think you chose the wrong app server for this kind of use case.
Never use a plain Rack server in production, especially if you work with big files.

Memory Growth

When a worker is using too much memory, god or monit can send it a QUIT signal. This tells the worker to die after finishing the current request. As soon as the worker dies, the master forks a new one which is instantly able to serve requests. In this way we don’t have to kill your connection mid-request or take a startup penalty.
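
A minimal sketch of the QUIT behaviour described above (the PID is an assumption; in practice a monitor such as god or monit issues the signal):

```ruby
# A Unicorn worker that receives SIGQUIT finishes its current request and
# exits; the master then forks a replacement worker to keep serving traffic.
worker_pid = 12345  # hypothetical worker PID, normally tracked by the monitor
Process.kill("QUIT", worker_pid)
```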

You should use Passenger.

I cleaned up your code.
You need to use triple backticks to post code to the forum.
See this post for details.
