Channel: Active questions tagged redis+ruby-on-rails - Stack Overflow

Ruby on Rails - Bulk insert of data with logging


My application is an analytics application that receives real-time data every second from several sites, all of which is added to the insights table.

Currently, each incoming data request is sent to the log_insight action and appended to a log file:

class Insight < ApplicationRecord
  def self.write_log(log_file, log_line)
    File.open(log_file, "a") { |file| file.puts(log_line.gsub("\n", " ")) }
  end
end

class InsightsController < ApplicationController
  def log_insight
    Insight.write_log('log/insights.log', data.to_json)
  end
end

Every 30 minutes these logs are bulk-inserted into the insights table (I'm using the activerecord-import gem), a fresh insights.log is created, and the process repeats.
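For context, the 30-minute step could look something like the sketch below. This is an assumption about how the import runs, not the asker's actual code: the rotation helper, constant names, and the Insight.import call (from activerecord-import) are illustrative.

```ruby
require "json"
require "fileutils"

LOG_PATH = "log/insights.log"

# Move the current log aside so the app keeps appending to a fresh
# file while the rotated one is imported.
def rotate_log(path)
  rotated = "#{path}.#{Time.now.to_i}"
  FileUtils.mv(path, rotated)
  rotated
end

# Turn the JSON lines written by write_log back into attribute hashes.
def parse_log_lines(lines)
  lines.map { |line| JSON.parse(line) }
end

# Run every 30 minutes (cron, whenever, sidekiq-cron, etc.):
#   rotated = rotate_log(LOG_PATH)
#   rows    = parse_log_lines(File.readlines(rotated))
#   Insight.import(rows.map { |attrs| Insight.new(attrs) }, validate: false)
#   File.delete(rotated)
```

Rotating before reading avoids losing entries that arrive while the import is running.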

PS: Around a million records are processed and added to the insights table over the course of a day.

I feel this is neither an efficient nor a robust way to do it.

One idea I've been considering is to use Redis: write the log entries into memory instead of a file, and every 30 minutes drain everything from Redis and flush it into the table.
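That Redis idea could be sketched roughly as follows. This is a hedged sketch, not a tested solution: it assumes the redis gem (~> 4.4), an arbitrary key name "insights:buffer", and Insight.import from activerecord-import.

```ruby
require "json"
# require "redis"   # provided by the redis gem in the Rails app

BUFFER_KEY = "insights:buffer".freeze

# Called from the controller instead of appending to a file.
def buffer_insight(redis, data)
  redis.rpush(BUFFER_KEY, data.to_json)
end

# RENAME atomically swaps the buffer aside, so entries pushed while
# the drain is running land in a fresh list and are not lost.
def drain_insights(redis)
  staging = "#{BUFFER_KEY}:draining"
  begin
    redis.rename(BUFFER_KEY, staging)
  rescue Redis::CommandError
    return [] # RENAME raises when the source key does not exist (empty buffer)
  end
  rows = redis.lrange(staging, 0, -1).map { |json| JSON.parse(json) }
  redis.del(staging)
  rows
end

# Every 30 minutes:
#   rows = drain_insights(Redis.new)
#   Insight.import(rows.map { |attrs| Insight.new(attrs) }, validate: false)
```

One caveat: Redis holds the buffer in memory, so entries written between flushes are lost if Redis restarts without persistence (AOF/RDB) enabled.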

Rails 6.1.4
Ruby 2.5.8
Redis 4.4

Any suggestion is much appreciated!
