Zip Up All Paperclip Attachments Stored on S3

You almost certainly want to use e.abstract.to_file.path instead of e.abstract.url(...).

See:

  • Paperclip::Storage::S3#to_file (should return a Tempfile)
  • Tempfile#path

UPDATE

From the changelog:

New in 3.0.1:

  • API CHANGE: #to_file has been removed. Use the #copy_to_local_file method instead.
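
So in Paperclip 3.0.1+ you download the S3 object explicitly instead. A minimal sketch, assuming the :abstract attachment from the snippet above (treat the tempfile handling as illustrative, not part of the original answer):

require "tempfile"

# Copy the S3-backed attachment to a local tempfile, then hand its path
# to whatever needs a file on disk (e.g. the zip code further down).
tmp = Tempfile.new(["abstract", File.extname(e.abstract.original_filename)])
tmp.binmode
e.abstract.copy_to_local_file(:original, tmp.path) # copy_to_local_file(style, local_path)
# ... use tmp.path here ...
tmp.close
tmp.unlink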

Paperclip + S3 massive zipping

Note: some of the related questions on Stack Overflow are outdated; several of the Paperclip methods they rely on no longer exist.

Let's say we have a User model with has_many :user_attachments.

GC.disable # keep the GC from reaping Paperclip's tempfiles before they are zipped

@user = User.find(params[:user_id])
zip_filename = "User attachments - #{@user.id}.zip" # the file name
tmp_filename = "#{Rails.root}/tmp/#{zip_filename}"  # the path

Zip::ZipFile.open(tmp_filename, Zip::ZipFile::CREATE) do |zip|
  @user.user_attachments.each do |e|
    # has_attached_file :attachment (, ...)
    attachment = Paperclip.io_adapters.for(e.attachment)
    zip.add(e.attachment.original_filename, attachment.path)
  end
end

send_data(File.binread(tmp_filename),
          :type => 'application/zip',
          :disposition => 'attachment',
          :filename => zip_filename)

File.delete tmp_filename
GC.enable
GC.start

The trick is to disable the GC in order to avoid an Errno::ENOENT exception: otherwise the GC can finalize (and delete) the tempfile holding the attachment downloaded from S3 before it gets added to the zip.
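
If you would rather not toggle the GC at all, a hedged alternative is to download each attachment to an explicit local path with #copy_to_local_file, so the files' lifetime is under your control (same models as above; filename collisions in the temp directory are not handled):

require "tmpdir"

local_paths = []
Zip::ZipFile.open(tmp_filename, Zip::ZipFile::CREATE) do |zip|
  @user.user_attachments.each do |e|
    # Explicitly download the S3 attachment instead of relying on Paperclip's tempfile.
    local_path = File.join(Dir.tmpdir, e.attachment.original_filename)
    e.attachment.copy_to_local_file(:original, local_path)
    local_paths << local_path
    zip.add(e.attachment.original_filename, local_path)
  end
end
local_paths.each { |p| File.delete(p) } # the zip is written out when the block closes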

Sources:

  • to_file broken in master?
  • io_adapters.for(object.attachment).path failing randomly
  • Paperclip multiple storage

Maybe you'd benefit from this:

http://airbladesoftware.com/notes/asynchronous-s3/

What you'll have to do is first upload to your local storage, and then asynchronously upload to S3.

This is typically done with something like Resque or DelayedJob (as the tutorial demonstrates), which usually means running a backing service on your server (Resque needs Redis; Delayed::Job stores its queue in your database).

From the tutorial:

### Models ###

class Person < ActiveRecord::Base
  # Fast local upload; the user sees this immediately.
  has_attached_file :local_image,
                    path: ":rails_root/public/system/:attachment/:id/:style/:basename.:extension",
                    url: "/system/:attachment/:id/:style/:basename.:extension"

  # Final S3-backed attachment, filled in asynchronously by the job below.
  has_attached_file :image,
                    styles: {large: '500x500#', medium: '200x200#', small: '70x70#'},
                    convert_options: {all: '-strip'},
                    storage: :s3,
                    s3_credentials: "#{Rails.root}/config/s3.yml",
                    s3_permissions: :private,
                    s3_host_name: 's3-eu-west-1.amazonaws.com',
                    s3_headers: {'Expires' => 1.year.from_now.httpdate,
                                 'Content-Disposition' => 'attachment'},
                    path: "images/:id/:style/:filename"

  after_save :queue_upload_to_s3

  def queue_upload_to_s3
    Delayed::Job.enqueue ImageJob.new(id) if local_image? && local_image_updated_at_changed?
  end

  def upload_to_s3
    # Note: #to_file was removed in Paperclip 3.0.1+ (see the changelog note above);
    # on newer versions use #copy_to_local_file instead.
    self.image = local_image.to_file
    save!
  end
end

class ImageJob < Struct.new(:person_id)
  def perform
    person = Person.find person_id
    person.upload_to_s3
    person.local_image.destroy
  end
end

### Views ###

-# app/views/people/edit.html.haml
-# ...
= f.file_field :local_image

-# app/views/people/show.html.haml
- if @person.image?
  = image_tag @person.image.expiring_url(20, :small)
- else
  = image_tag @person.local_image.url, size: '70x70'

403 AccessDenied for Paperclip attachments uploaded to Amazon S3 using s3cmd tool

It looks like your S3 bucket policy is not correctly granting read access to public (anonymous) users. Try something like:

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-brand-new-bucket/*"
      ]
    }
  ]
}

The fact that the files only become accessible to a public user after you manually apply public-read permissions to them confirms that your bucket policy is not granting read access correctly.

When you use a public S3 URL to access the files, there is no authenticated user.
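
If you prefer to apply the policy from code rather than the S3 console, here is a rough sketch using the aws-sdk-s3 gem (the region and the policy.json filename are assumptions):

require "aws-sdk-s3"

# Attach the policy above (saved locally as policy.json) to the bucket.
s3 = Aws::S3::Client.new(region: "us-east-1") # region is an assumption
s3.put_bucket_policy(
  bucket: "my-brand-new-bucket",
  policy: File.read("policy.json")
)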

Rails - Paperclip not working with AWS

It's working now.

Here is the code I used:

# production.rb
config.paperclip_defaults = {
  storage: :s3,
  s3_region: ENV["AWS_REGION"],
  s3_host_name: "s3-us-west-2.amazonaws.com",
  s3_credentials: {
    # s3_host_name: ENV["AWS_HOST_NAME"],
    bucket: ENV["S3_BUCKET_NAME"],
    access_key_id: ENV["AWS_ACCESS_KEY_ID"],
    secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
  }
}

@meagar pointed out that I needed to combine the two config statements. I also uploaded an image to the S3 bucket manually to double-check the bucket name and region, and I recreated the IAM credentials.

CKEditor gem with Paperclip and Amazon S3

One way is to see what the ckeditor install generator does.
For example, if you use ActiveRecord as your ORM, take a look at the generator templates for the models that use Paperclip.

The generator copies these templates into your app/models/ckeditor folder. You can edit them and configure them so Paperclip uses S3, as sketched below.

For ActiveRecord, the models are:

/app/models/ckeditor/attachment_file.rb
/app/models/ckeditor/picture.rb

Keep in mind that this approach could give you extra work in the future if the ckeditor gem is updated and the update process needs to overwrite these models.
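
For illustration, a hedged sketch of what app/models/ckeditor/picture.rb could look like once pointed at S3 (the :data attachment name and the styles follow the gem's Paperclip template; the S3 options are assumptions to adapt):

class Ckeditor::Picture < Ckeditor::Asset
  has_attached_file :data,
                    styles: { content: '800>', thumb: '118x100#' },
                    storage: :s3,
                    s3_credentials: "#{Rails.root}/config/s3.yml", # assumed credentials file
                    s3_protocol: 'https',
                    path: "ckeditor_assets/pictures/:id/:style_:basename.:extension"

  validates_attachment_presence :data
  validates_attachment_size :data, less_than: 2.megabytes
  validates_attachment_content_type :data, content_type: /\Aimage/

  def url_content
    url(:content)
  end
end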

Otherwise, you can use Paperclip's default options. In your Paperclip initializer (/config/initializers/paperclip.rb) use:

Paperclip::Attachment.default_options.merge!(
  # YOUR OPTIONS FOR S3 HERE
)
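
For example, the merged options could look something like this (bucket, region, and environment variable names are assumptions):

# config/initializers/paperclip.rb
Paperclip::Attachment.default_options.merge!(
  storage: :s3,
  s3_region: ENV["AWS_REGION"],
  s3_credentials: {
    bucket: ENV["S3_BUCKET_NAME"],
    access_key_id: ENV["AWS_ACCESS_KEY_ID"],
    secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
  },
  path: "ckeditor_assets/:class/:id/:style/:filename"
)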

