Module: Paperclip::Storage::S3
- Defined in:
- lib/paperclip/storage.rb
Overview
Amazon’s S3 file hosting service is a scalable, easy place to store files for distribution. You can find out more about it at aws.amazon.com/s3. There are a few S3-specific options for has_attached_file (a combined example follows this list):
- s3_credentials: Takes a path, a File, or a Hash. The path (or File) must point to a YAML file containing the access_key_id and secret_access_key that Amazon gives you. You can ‘environment-space’ this just like you do with your database.yml file, so different environments can use different accounts:

    development:
      access_key_id: 123...
      secret_access_key: 123...
    test:
      access_key_id: abc...
      secret_access_key: abc...
    production:
      access_key_id: 456...
      secret_access_key: 456...

  This is not required, however, and the file may simply look like this:

    access_key_id: 456...
    secret_access_key: 456...

  in which case those access keys will be used in all environments. You can also put your bucket name in this file instead of adding it to the code directly. This is useful when you want the same account but a different bucket for development versus production.
- s3_permissions: One of the “canned” access policies that S3 provides (more information can be found at docs.amazonwebservices.com/AmazonS3/2006-03-01/RESTAccessPolicy.html#RESTCannedAccessPolicies). The default for Paperclip is :public_read.
- s3_protocol: The protocol for the URLs generated to your S3 assets. Can be either ‘http’ or ‘https’. Defaults to ‘http’ when your :s3_permissions are :public_read (the default), and ‘https’ when your :s3_permissions are anything else.
- s3_headers: A hash of headers, such as {'Expires' => 1.year.from_now.httpdate}.
- bucket: The name of the S3 bucket that will store your files. Remember that the bucket name must be unique across all of Amazon S3. If the bucket does not exist, Paperclip will attempt to create it. The bucket name will not be interpolated. You can define the bucket as a Proc if you want to determine its name at runtime; Paperclip will call that Proc with the attachment as its only argument.
- s3_host_alias: The fully-qualified domain name (FQDN) that is an alias to the S3 domain of your bucket. Used with the :s3_alias_url url interpolation. See the link in the url entry for more information about S3 domains and buckets.
- url: There are three options for the S3 URL. You can choose to have the bucket’s name placed domain-style (bucket.s3.amazonaws.com) or path-style (s3.amazonaws.com/bucket). Lastly, you can specify a CNAME (which requires the url to be specified as :s3_alias_url). You can read more about CNAMEs and S3 at docs.amazonwebservices.com/AmazonS3/latest/index.html?VirtualHosting.html. Normally this won’t matter in the slightest and you can leave the default (which is path-style, or :s3_path_url), but in some cases paths don’t work and you need to use the domain-style (:s3_domain_url). Anything else here will be treated like path-style. NOTE: If you use a CNAME with CloudFront, you can NOT specify https as your :s3_protocol; this is *not supported* by S3/CloudFront. Finally, when using the host alias, the :bucket parameter is ignored, as the hostname is used as the bucket name by S3.
- path: The key under the bucket in which the file will be stored. The URL will be constructed from the bucket and the path. This is what you will want to interpolate. Keys should be unique, like filenames, and although S3 (strictly speaking) does not support directories, you can still use a / to separate parts of your file name.
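Putting these options together, a minimal sketch of an S3-backed attachment might look like the following. The model, bucket name, credentials path, and path pattern here are placeholders for illustration, not part of this module's API:

    class User < ActiveRecord::Base
      # Hypothetical model; the bucket name, credentials file, and path pattern
      # below are placeholders and should be adapted to your application.
      has_attached_file :avatar,
        :storage        => :s3,
        :s3_credentials => "#{Rails.root}/config/s3.yml",
        :bucket         => "my-app-assets",   # may instead be set in s3.yml
        :s3_permissions => :public_read,
        :s3_headers     => { 'Expires' => 1.year.from_now.httpdate },
        :path           => ":attachment/:id/:style/:filename",
        :url            => ":s3_domain_url"
    end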
Class Method Summary
- .extended(base) ⇒ Object
Instance Method Summary
- #bucket_name ⇒ Object
- #exists?(style = default_style) ⇒ Boolean
- #expiring_url(time = 3600) ⇒ Object
- #flush_deletes ⇒ Object (:nodoc:)
- #flush_writes ⇒ Object (:nodoc:)
- #parse_credentials(creds) ⇒ Object
- #s3_host_alias ⇒ Object
- #s3_protocol ⇒ Object
- #to_file(style = default_style) ⇒ Object
  Returns a representation of the data of the file assigned to the given style, in the format most representative of the current storage.
Class Method Details
.extended(base) ⇒ Object
# File 'lib/paperclip/storage.rb', line 133

def self.extended base
  begin
    require 'aws/s3'
  rescue LoadError => e
    e.message << " (You may need to install the aws-s3 gem)"
    raise e
  end

  base.instance_eval do
    @s3_credentials = parse_credentials(@options[:s3_credentials])
    @bucket         = @options[:bucket] || @s3_credentials[:bucket]
    @bucket         = @bucket.call(self) if @bucket.is_a?(Proc)
    @s3_options     = @options[:s3_options] || {}
    @s3_permissions = @options[:s3_permissions] || :public_read
    @s3_protocol    = @options[:s3_protocol] ||
                      (@s3_permissions == :public_read ? 'http' : 'https')
    @s3_headers     = @options[:s3_headers] || {}
    @s3_host_alias  = @options[:s3_host_alias]
    @url            = ":s3_path_url" unless @url.to_s.match(/^:s3.*url$/)
    AWS::S3::Base.establish_connection!(@s3_options.merge(
      :access_key_id     => @s3_credentials[:access_key_id],
      :secret_access_key => @s3_credentials[:secret_access_key]
    ))
  end
  Paperclip.interpolates(:s3_alias_url) do |attachment, style|
    "#{attachment.s3_protocol}://#{attachment.s3_host_alias}/#{attachment.path(style).gsub(%r{^/}, "")}"
  end
  Paperclip.interpolates(:s3_path_url) do |attachment, style|
    "#{attachment.s3_protocol}://s3.amazonaws.com/#{attachment.bucket_name}/#{attachment.path(style).gsub(%r{^/}, "")}"
  end
  Paperclip.interpolates(:s3_domain_url) do |attachment, style|
    "#{attachment.s3_protocol}://#{attachment.bucket_name}.s3.amazonaws.com/#{attachment.path(style).gsub(%r{^/}, "")}"
  end
  Paperclip.interpolates(:s3_simple_url) do |attachment, style|
    "/#{attachment.path(style).gsub(%r{^/}, "")}"
  end
end
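As a rough illustration of the interpolations registered above, assume the hypothetical User model from the overview example, a bucket named "my-app-assets", and an s3_host_alias of "assets.example.com" (all placeholders). Ignoring any cache-busting query string Paperclip may append, the :url option selects the URL shape roughly as follows:

    user = User.find(1)
    user.avatar.url(:original)
    # with :url => ":s3_path_url"   => "http://s3.amazonaws.com/my-app-assets/avatars/1/original/me.png"
    # with :url => ":s3_domain_url" => "http://my-app-assets.s3.amazonaws.com/avatars/1/original/me.png"
    # with :url => ":s3_alias_url"  => "http://assets.example.com/avatars/1/original/me.png"
    # with :url => ":s3_simple_url" => "/avatars/1/original/me.png"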
Instance Method Details
#bucket_name ⇒ Object
# File 'lib/paperclip/storage.rb', line 174

def bucket_name
  @bucket
end
#exists?(style = default_style) ⇒ Boolean
# File 'lib/paperclip/storage.rb', line 187

def exists?(style = default_style)
  if original_filename
    AWS::S3::S3Object.exists?(path(style), bucket_name)
  else
    false
  end
end
#expiring_url(time = 3600) ⇒ Object
# File 'lib/paperclip/storage.rb', line 170

def expiring_url(time = 3600)
  AWS::S3::S3Object.url_for(path, bucket_name, :expires_in => time)
end
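A usage sketch, assuming user.avatar is an S3-backed attachment as in the earlier placeholder example: expiring_url takes a lifetime in seconds (3600 by default) and returns a signed URL that stops working after that time.

    # Generate a URL that expires in 10 minutes (time is given in seconds).
    signed_url = user.avatar.expiring_url(600)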
#flush_deletes ⇒ Object
:nodoc:
# File 'lib/paperclip/storage.rb', line 226

def flush_deletes #:nodoc:
  unless delete_files?
    @queued_for_delete = []
    return
  end

  @queued_for_delete.each do |path|
    begin
      log("deleting #{path}")
      AWS::S3::S3Object.delete(path, bucket_name)
    rescue AWS::S3::ResponseError
      # Ignore this.
    end
  end
  @queued_for_delete = []
end
#flush_writes ⇒ Object
:nodoc:
# File 'lib/paperclip/storage.rb', line 209

def flush_writes #:nodoc:
  @queued_for_write.each do |style, file|
    begin
      log("saving #{path(style)}")
      AWS::S3::S3Object.store(path(style),
                              file,
                              bucket_name,
                              {:content_type => instance_read(:content_type),
                               :access       => @s3_permissions,
                              }.merge(@s3_headers))
    rescue AWS::S3::ResponseError => e
      raise
    end
  end
  @queued_for_write = {}
end
#parse_credentials(creds) ⇒ Object
# File 'lib/paperclip/storage.rb', line 182

def parse_credentials creds
  creds = find_credentials(creds).stringify_keys
  (creds[Rails.env] || creds).symbolize_keys
end
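A sketch of how credentials resolve, assuming a Hash is passed directly (a YAML path works the same way) and that Rails.env is "test"; attachment is a placeholder for an S3-backed attachment instance:

    creds = {
      'development' => { 'access_key_id' => '123', 'secret_access_key' => '123' },
      'test'        => { 'access_key_id' => 'abc', 'secret_access_key' => 'abc' }
    }
    attachment.parse_credentials(creds)
    # => { :access_key_id => "abc", :secret_access_key => "abc" }
    # A flat hash with no environment keys would be used for every environment.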
#s3_host_alias ⇒ Object
# File 'lib/paperclip/storage.rb', line 178

def s3_host_alias
  @s3_host_alias
end
#s3_protocol ⇒ Object
# File 'lib/paperclip/storage.rb', line 195

def s3_protocol
  @s3_protocol
end
#to_file(style = default_style) ⇒ Object
Returns a representation of the data of the file assigned to the given style, in the format most representative of the current storage.
# File 'lib/paperclip/storage.rb', line 201

def to_file style = default_style
  return @queued_for_write[style] if @queued_for_write[style]
  file = Tempfile.new(path(style))
  file.write(AWS::S3::S3Object.value(path(style), bucket_name))
  file.rewind
  return file
end