Quickly adding multiple items (1000/sec) to a sidekiq queue?

I realize Sidekiq has a push_bulk option, but I'm currently limited by latency to Redis, so passing multiple items via push_bulk still isn't going quickly enough (only about 50/sec).

I've tried to increase the number of Redis connections like so:

    redis_conn = proc {
      Redis.new({ :url => Rails.configuration.redis.url })
    }

    Sidekiq.configure_client do |config|
      Sidekiq.configure_client do |config|
        config.redis = ConnectionPool.new(size: 50, &redis_conn)
      end
      config.client_middleware do |chain|
        chain.add Sidekiq::Status::ClientMiddleware
      end
    end

I then fire off separate threads (Thread.new) that each call perform_async on the various objects. What's interesting is that any thread other than the first NEVER gets its jobs into the Sidekiq queue; it's like they're ignored entirely.
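
Roughly, the threaded version looks like this (a sketch only: the slice size of 1000 is arbitrary, and ScrapeUser is the worker referenced in the edit below):

    # Sketch of the threaded approach: each thread enqueues its own batch
    # of ids via perform_async. The slice size is arbitrary.
    user_ids = User.need_scraping.pluck(:id)

    threads = user_ids.each_slice(1000).map do |batch|
      Thread.new do
        batch.each { |user_id| ScrapeUser.perform_async(user_id) }
      end
    end

    threads.each(&:join)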

Does anyone know of a better way to do this?

Edit: Here is the push_bulk version I was trying, which is actually slower:

    user_ids = User.need_scraping.pluck(:id)
    bar = ProgressBar.new(user_ids.count)

    user_ids.in_groups_of(10000, false).each do |user_id_group|
      Sidekiq::Client.push_bulk(
        'args'  => user_id_group.map { |user_id| [user_id] },
        'class' => ScrapeUser,
        'queue' => 'scrape_user',
        'retry' => true
      )
    end
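
Each inner array in 'args' becomes the argument list for one job, so the ScrapeUser worker is assumed to look roughly like this (the perform body below is a placeholder; only the class name, queue name, and single user_id argument come from the snippet above):

    class ScrapeUser
      include Sidekiq::Worker
      sidekiq_options queue: 'scrape_user', retry: true

      def perform(user_id)
        # Placeholder: load the user and do the actual scraping work here.
        user = User.find(user_id)
        # ...
      end
    end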

Thanks!

