
Rails 7 - slow store time Carrierwave S3 AWS fog upload ONLY in localhost #2605

Open
robertoboldi opened this issue Apr 29, 2022 · 21 comments

@robertoboldi

robertoboldi commented Apr 29, 2022

I have switched to Rails 7 and found some issues when uploading images from my localhost.
It is mandatory for me to manage, import, rework and upload images in a local environment, so images (and attachments) need to be uploaded from localhost.
The system idles for 10-30 seconds even for 5 KB images, although the picture is eventually loaded correctly.

Furthermore, the server is also very slow when deleting images.

I underline that this happens only in localhost, not in production.

My system is a latest-generation M1 MacBook Pro.

Here is my code:
carrierwave.rb

CarrierWave.configure do |config|
  config.fog_provider = "fog/aws"
  config.fog_credentials = {
    :provider               => 'AWS',                             # required
    :aws_access_key_id      => ENV['S3_ACCESS_KEY'],
    :aws_secret_access_key  => ENV['S3_SECRET_ACCESS_KEY'],
    :region                 => ENV['S3_REGION']                   # optional, defaults to 'us-east-1'
  }
  config.fog_directory  = ENV['S3_BUCKET']                        # required
  # config.fog_public     = true                                  # optional, defaults to true
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'}  # optional, defaults to {}
  config.storage = :fog
end

image_uploader.rb

class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick

  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  version :thumb do
    process resize_to_fit: [300, 300]
  end

  version :default do
    process resize_to_fit: [800, 800]
  end

  version :big do
    process resize_to_limit: [1200, nil]
  end
end

The issue shows up during uploads in my Active Admin page with the standard form uploader, in the console, and also in my seeds.rb file.

The gems I'm using for upload are:

  • carrierwave (2.2.2)
  • fog-aws (3.13.0)

What I have tried:

  • removing versions generation (nothing changed)
  • using RMagick instead of MiniMagick (nothing changed)
  • using storage :file (the upload was really fast)
  • installed ImageMagick version 6 instead of the latest 7.1 (nothing changed)

It seems the issue is related to the effective upload time, maybe to fog?

EDIT:
After some analysis of the logs I am pretty sure that the issue is due to the store time, which is very long.
Processing time is really fast, while cache time could be improved, but we can survive with it.
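A quick way to confirm where the time goes is to time the call directly. A minimal, self-contained sketch; the time_step helper and the sleep stand-in are hypothetical, and in a Rails console you would wrap the actual uploader.store!(file) call instead:

```ruby
require "benchmark"

# Hypothetical helper: report how long a block takes.
def time_step(label)
  elapsed = Benchmark.realtime { yield }
  puts format("%s took %.3fs", label, elapsed)
  elapsed
end

# Stand-in for the slow call; in a Rails console this would be e.g.
#   time_step("store") { uploader.store!(File.open("test.jpg")) }
elapsed = time_step("store") { sleep 0.05 }
```

Comparing the cache, process, and store numbers this way makes it clear which phase is the bottleneck.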

@courtsimas
Contributor

Is this a copy of #2615? Do you need both open?

@robertoboldi
Author

Sorry for the duplication, I had some formatting issues with the question and posted it twice. Do you have any ideas about the cause?

@robertoboldi
Author

Does anyone have ideas about the cause of this?
I also tried with attachments other than images, and the issue persists for all kinds of files.

@YSavir

YSavir commented Oct 18, 2022

I've been experiencing this as well recently. Additionally, any automated tests that interact with file uploads fail because of timeouts, so this is more than an inconvenience; it actually impacts the development process.

By logging the various steps I could see that it happens during the store action. I haven't experimented further yet. I'll see what more I can discover, but if anyone else has any insights, they are very welcome.

@robertoboldi
Author

Thank you @YSavir

Yes, the same for me. Even while watching the console, I see the database update quickly while the server idles for a long time.
If I reload the page during this idle period, it reloads correctly (but without the image) and I get a 404 only for the image path in the browser console. This means the database has been correctly updated with all the image info, but the image has not finished uploading to the cloud.

After your confirmation it seems clear that it is due to the store action.

I am working on a MacBook Pro M1; maybe that is useful information. Unfortunately I can't do any testing on other machines.

@YSavir

YSavir commented Oct 18, 2022

I solved the problem: all I had to do was post about it here. Just like that, months' worth of timeouts and sluggish uploads went away. No code changes, no dependency updates, nothing changed at all... but it's working smoothly now, at least in what I've tested so far.

No idea what happened there, but 🤷🏼. It's been a problem for the past few months and now it's fine. Wish I had further input. The only thing I can think of is that I added logging before and after each uploader hook and putsed out the time.

@robertoboldi
Author

@YSavir, wow, this may really change development times.
Can you post your code so I can test it in my development env? Where exactly did you put the logging?
I would really be thankful for it.

@YSavir

YSavir commented Oct 18, 2022

The various callbacks are found here in the documentation.

For each callback hook I had a log_starting_<hook> and a log_ending_<hook> that basically just printed the time. I saw the difference after adding methods for the remove hook. No idea if any of this is relevant; the snag could be from something else entirely and it was just coincidental timing. And maybe it'll start again 30 minutes from now. 🤷🏼
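For reference, the per-hook timing pattern looks roughly like this. CarrierWave wires hooks with class-level before/after declarations (for :cache, :store, :remove, etc.); the stub base class below only stands in for CarrierWave::Uploader::Base so the sketch runs without the gem, and the hook method names are illustrative:

```ruby
# Stub that mimics CarrierWave's before/after callback wiring so this
# sketch runs standalone; in the app, inherit from
# CarrierWave::Uploader::Base, which provides equivalent class methods.
class StubUploaderBase
  def self.callbacks
    @callbacks ||= Hash.new { |h, k| h[k] = [] }
  end

  def self.before(hook, method_name)
    callbacks[[:before, hook]] << method_name
  end

  def self.after(hook, method_name)
    callbacks[[:after, hook]] << method_name
  end

  # Runs the registered before-hooks, the work itself, then the after-hooks.
  def run_hook(hook)
    self.class.callbacks[[:before, hook]].each { |m| send(m) }
    yield
    self.class.callbacks[[:after, hook]].each { |m| send(m) }
  end
end

class TimedUploader < StubUploaderBase
  before :store, :log_starting_store
  after  :store, :log_ending_store

  attr_reader :store_elapsed

  def log_starting_store
    @store_t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  end

  def log_ending_store
    @store_elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - @store_t0
    puts format("store took %.3fs", @store_elapsed)
  end
end

uploader = TimedUploader.new
uploader.run_hook(:store) { sleep 0.05 } # stand-in for the real upload
```

The same pair of methods can be repeated for the cache and remove hooks to see which phase dominates.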

@robertoboldi
Author

OK, I have tested all the callbacks as you suggested, @YSavir.
My store time is still really looooong, and the cache time is not really fast either, but we can still survive with that.

I added the logs at the remove hook and everything is still the same :-(

@robertoboldi robertoboldi changed the title Rails 7 - superslow Carrierwave S3 AWS fog upload ONLY in localhost Rails 7 - slow store time Carrierwave S3 AWS fog upload ONLY in localhost Oct 18, 2022
@robertoboldi
Author

Still getting this issue; does anyone have any news?

@patrickemuller

I have been following the thread, but since I can't update our project to Rails 7 yet, I can't contribute.
Considering it's only local, is there any chance this is related to I/O on M1 Macs only?
Does anyone know if Rails 7 has changed (in any way) how it handles file saving?
@robertoboldi, which version of Ruby are you using?

@emilebosch

Yeah, I have a similar issue here. It's unbearable. I tried digging into the root cause with fog-aws, but as soon as directory.files is called it just stalls forever. It was absolutely unworkable. Also, the fog gem is hard to debug. I'm on an M1 on Rails 7.0.5.

Also, I was a bit shocked by the huge amount of files and layers of indirection needed to upload a file to S3 with aws/fog, but that is maybe a story for another day.

@batalla3692

I'm also on an M2 MacBook, and along with the slow store time, I was also getting Excon::Error::Timeout errors when uploading some really small files to S3 from my local environment.

After reading this issue on the fog-aws gem, which is somewhat related, I found a way to bypass the timeouts by adding the following settings in the CarrierWave initializer of my Rails project:

  CarrierWave.configure do |config|
    ...
    config.fog_credentials = {
      ...,
      connection_options: {
        write_timeout: 5
      }
    }
  end

Of course, this does not resolve the slow uploads themselves, but at least I don't receive timeouts anymore, so I leave my solution here in case someone else has the same issue.
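Along the same lines, Excon (the HTTP client fog uses under the hood) accepts an :instrumentor in its connection options; passing Excon::StandardInstrumentor logs every request and response to stderr, which can show where the time actually goes. A sketch combining it with the write_timeout workaround above; this is a config fragment only, and the env var names mirror the ones used earlier in the thread:

```ruby
# Config-only sketch: assumes the same fog_credentials shape as above.
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: "AWS",
    aws_access_key_id: ENV["S3_ACCESS_KEY"],
    aws_secret_access_key: ENV["S3_SECRET_ACCESS_KEY"],
    region: ENV["S3_REGION"],
    connection_options: {
      write_timeout: 5,                          # the timeout workaround above
      instrumentor: Excon::StandardInstrumentor  # log each HTTP call to stderr
    }
  }
end
```

With the instrumentor enabled, a long gap between the request log line and its response line during store would point at the network layer rather than at CarrierWave processing.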

@robertoboldi
Author

Has anybody found a solution?
I have tried every Rails version up to the current one (7.1.3).

This also happens when removing images, in case that isn't clear.

@emilebosch

@robertoboldi I just rolled my own upload and skipped all of this.

@robertoboldi
Author

Yes, that seems a good idea.
I may try to do this. Do you have any good references to suggest?
I have never done this before.

@emilebosch

Something dumb like this would get you started.

Some lib thing

class Uploader
  def initialize(model)
    @model = model
  end

  def get_url(variant)
    "https://xxx.ams3.digitaloceanspaces.com/#{get_key(variant)}"
  end

  def magick(upload)
    return unless upload
    @image = MiniMagick::Image.read(upload.tempfile)
    yield
    @image = nil
  end

  def variant(name, size)
    Tempfile.open do |xx|
      # Note: resize mutates @image, so each variant is resized from the
      # previous one rather than from the original.
      @image.resize(size)
      @image.write(xx.path)

      s3 = Aws::S3::Client.new(endpoint: "https://ams3.digitaloceanspaces.com")
      File.open(xx.path, "rb") do |file|
        s3.put_object(
          acl: "public-read",
          body: file,
          bucket: "xx",
          key: get_key(name)
        )
      end
    end
  end
end

class SpotUploader < Uploader
  def get_key(variant)
    "#{Rails.env}/spots/#{@model.id}/#{variant}.jpg"
  end

  def upload(upload)
    magick(upload) do
      variant(:thumb, "612x612")
      variant(:tiny, "50x50")
    end
  end
end

class SpotUploader < Uploader
  def get_key(variant)
    "#{Rails.env}/spots/#{@model.id}/#{variant}.jpg"
  end

  def upload(upload)
    magick(upload) do
      variant(:thumb, "612x612")
      variant(:tiny, "50x50")
    end
  end
end

In the controller

class PageController < Controller
  def create
    @spot = Spot.new(spot_params)
    @spot.user = User.first
    @spot.category = :building
    @spot.difficulty = "5"
    @spot.save

    loader = SpotUploader.new(@spot)
    loader.upload(spot_params[:file])
    @spot.file = loader.get_url(:thumb)
    @spot.save

    render json: merge_cover_photo(@spot)
  end
end

Gems

gem "mini_magick"
gem "aws-sdk-s3"

@robertoboldi
Author

@emilebosch Wow, the full code!
thank you so much and have a nice day

@emilebosch

No worries. Remember this is crap and needs love and attention and MIME types and security checks and all that jazz.

@Xeej

Xeej commented Apr 4, 2024

In a production environment, after updating carrierwave to the latest version 3.0.5, saving documents began to take a long time. I studied the gem for a while, added logging around the fog save, and noticed that the move_to / copy_to calls and the caching now happen in the fog storage. Try adding logging here: https://github.com/fog/fog-core/blob/5560d230d90fcca70ec8f82d4f3d4950af7ed31b/lib/fog/core/connection.rb#L78 and you will see requests with X-Copy-From.
The only solution for now was to roll back to an older version; copying inside a remote storage is a very expensive operation (perhaps it is faster in newer versions of OpenStack, but not for me).
In version 1.3.4, files were cached in local storage:
[screenshot of the 1.3.4 logs, censored]
In 3.0.5, they are cached in the remote OpenStack storage:
[screenshot of the 3.0.5 logs, censored]
Sorry for the censoring.
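The logging described above can be added non-invasively with Module#prepend. The sketch below demonstrates the pattern on a stub class so it runs standalone; in the app you would prepend the module to fog-core's Fog::Core::Connection instead (an assumption about fog internals based on the linked connection.rb), and then watch for requests carrying X-Copy-From:

```ruby
# Stand-in for Fog::Core::Connection; only the #request shape matters here.
class StubConnection
  def request(params)
    "response for #{params[:path]}"
  end
end

# Logs each request's method, path, and header names before delegating
# to the original implementation via super.
module RequestLogging
  def request(params)
    header_names = (params[:headers] || {}).keys
    puts "[fog] #{params[:method]} #{params[:path]} headers=#{header_names.inspect}"
    super
  end
end

StubConnection.prepend(RequestLogging)

resp = StubConnection.new.request(
  method: :put,
  path: "/bucket/key",
  headers: { "X-Copy-From" => "/bucket/source" }
)
```

Because prepend inserts the module ahead of the class in the ancestor chain, the original request still runs unchanged; the log line simply reveals whether a slow store is spending its time in server-side copy requests.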

@Xeej

Xeej commented Apr 4, 2024

I'm trying to change X-Copy-From to Destination with a monkey patch; it works faster, but sometimes it still takes very long. On the graph we can see how much longer it took to load documents:
[screenshot of the document load-time graph]

7 participants