Module: Paperclip::Storage::S3
Defined in: lib/paperclip/storage/s3.rb
Overview
Amazon’s S3 file hosting service is a scalable, easy place to store files for distribution. You can find out more about it at aws.amazon.com/s3
To use Paperclip with S3, include the aws-sdk gem in your Gemfile:

  gem 'aws-sdk'
There are a few S3-specific options for has_attached_file:
- s3_credentials: Takes a path, a File, or a Hash. The path (or File) must point to a YAML file containing the access_key_id and secret_access_key that Amazon gives you. You can 'environment-space' this just as you do your database.yml file, so different environments can use different accounts:

    development:
      access_key_id: 123...
      secret_access_key: 123...
    test:
      access_key_id: abc...
      secret_access_key: abc...
    production:
      access_key_id: 456...
      secret_access_key: 456...

  This is not required, however, and the file may simply look like this:

    access_key_id: 456...
    secret_access_key: 456...

  In which case, those access keys will be used in all environments. You can also put your bucket name in this file, instead of adding it to the code directly. This is useful when you want the same account but a different bucket for development versus production.
- s3_permissions: A String that should be one of the "canned" access policies that S3 provides (more information can be found at docs.amazonwebservices.com/AmazonS3/latest/dev/index.html?RESTAccessPolicy.html). The default for Paperclip is :public_read. You can set permissions on a per-style basis:

    :s3_permissions => { :original => :private }

  Or globally:

    :s3_permissions => :private
- s3_protocol: The protocol for the URLs generated to your S3 assets. Can be either 'http', 'https', or an empty string to generate scheme-less URLs. Defaults to 'http' when your :s3_permissions are :public_read (the default), and 'https' when your :s3_permissions are anything else.
- s3_headers: A hash of headers or a Proc. You may specify a hash such as {'Expires' => 1.year.from_now.httpdate}. If you use a Proc, headers are determined at runtime. Paperclip will call that Proc with attachment as the only argument.
- bucket: The name of the S3 bucket that will store your files. Remember that the bucket must be unique across all of Amazon S3. If the bucket does not exist, Paperclip will attempt to create it. The bucket name will not be interpolated. You can define the bucket as a Proc if you want to determine its name at runtime. Paperclip will call that Proc with attachment as the only argument.
- s3_host_alias: The fully-qualified domain name (FQDN) that is the alias to the S3 domain of your bucket. Used with the :s3_alias_url url interpolation. See the link in the url entry for more information about S3 domains and buckets.
- url: There are four options for the S3 url. You can choose to have the bucket's name placed domain-style (bucket.s3.amazonaws.com) or path-style (s3.amazonaws.com/bucket). You can also specify a CNAME (which requires the CNAME to be specified as :s3_alias_url). You can read more about CNAMEs and S3 at docs.amazonwebservices.com/AmazonS3/latest/index.html?VirtualHosting.html. Normally this won't matter in the slightest and you can leave the default (which is path-style, or :s3_path_url). But in some cases paths don't work and you need to use the domain-style (:s3_domain_url). Anything else here will be treated like path-style.
  Notes:
  - The value of this option is a string, not a symbol. Right: ":s3_domain_url". Wrong: :s3_domain_url.
  - If you use a CNAME for use with CloudFront, you can NOT specify https as your :s3_protocol; this is *not supported* by S3/CloudFront. When using the host alias, the :bucket parameter is ignored, as the hostname is used as the bucket name by S3.
  - The fourth option for the S3 url is :asset_host, which uses Rails' built-in asset_host settings.
  - To get the full url from a paperclip'd object, use the image_path helper; this is what image_tag uses to generate the url for an img tag.
- path: This is the key under the bucket in which the file will be stored. The URL will be constructed from the bucket and the path. This is what you will want to interpolate. Keys should be unique, like filenames, and despite the fact that S3 (strictly speaking) does not support directories, you can still use a / to separate parts of your file name.
- s3_host_name: If your bucket is in a non-default region (for example, Tokyo), set this to that region's S3 host name.
- s3_metadata: These key/value pairs will be stored with the object. This option works by prefixing each key with "x-amz-meta-" before sending it as a header on the object upload request.
- s3_storage_class: If this option is set to :reduced_redundancy, the object will be stored using Reduced Redundancy Storage. RRS enables customers to reduce their costs by storing non-critical, reproducible data at lower levels of redundancy than Amazon S3's standard storage.
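Putting several of these options together, a declaration might look like the following. This is an illustrative sketch only: the model, attachment name, bucket, path pattern, and credentials location are all hypothetical placeholders, not values from this library's documentation.

```ruby
class User < ActiveRecord::Base
  # All S3-specific options shown here are described above; the values
  # (bucket name, path pattern, credentials file) are placeholders.
  has_attached_file :avatar,
    :storage        => :s3,
    :s3_credentials => "#{Rails.root}/config/s3.yml",
    :bucket         => "my-app-assets",
    :path           => "/:class/:attachment/:id/:style/:filename",
    :s3_permissions => { :original => :private, :thumb => :public_read },
    :s3_headers     => { 'Cache-Control' => 'max-age=315576000' }
end
```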
Class Method Summary
Instance Method Summary
- #bucket_name ⇒ Object
- #copy_to_local_file(style, local_dest_path) ⇒ Object
- #create_bucket ⇒ Object
- #exists?(style = default_style) ⇒ Boolean
- #expiring_url(time = 3600, style_name = default_style) ⇒ Object
- #flush_deletes ⇒ Object :nodoc:
- #flush_writes ⇒ Object :nodoc:
- #http_proxy_host ⇒ Object
- #http_proxy_password ⇒ Object
- #http_proxy_port ⇒ Object
- #http_proxy_user ⇒ Object
- #parse_credentials(creds) ⇒ Object
- #s3_bucket ⇒ Object
- #s3_credentials ⇒ Object
- #s3_host_alias ⇒ Object
- #s3_host_name ⇒ Object
- #s3_interface ⇒ Object
- #s3_object(style_name = default_style) ⇒ Object
- #s3_permissions(style = default_style) ⇒ Object
- #s3_protocol(style = default_style) ⇒ Object
- #s3_url_options ⇒ Object
- #set_permissions(permissions) ⇒ Object
- #using_http_proxy? ⇒ Boolean
Class Method Details
.extended(base) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 93

def self.extended base
  begin
    require 'aws-sdk'
  rescue LoadError => e
    e.message << " (You may need to install the aws-sdk gem)"
    raise e
  end unless defined?(AWS::Core)

  # Overriding AWS::Core::LogFormatter to make sure it returns a UTF-8 string
  if AWS::VERSION >= "1.3.9"
    AWS::Core::LogFormatter.class_eval do
      def summarize_hash(hash)
        hash.map { |key, value| ":#{key}=>#{summarize_value(value)}".force_encoding('UTF-8') }.sort.join(',')
      end
    end
  else
    AWS::Core::ClientLogging.class_eval do
      def sanitize_hash(hash)
        hash.map { |key, value| "#{sanitize_value(key)}=>#{sanitize_value(value)}".force_encoding('UTF-8') }.sort.join(',')
      end
    end
  end

  base.instance_eval do
    @s3_options     = @options[:s3_options] || {}
    @s3_permissions = set_permissions(@options[:s3_permissions])
    @s3_protocol    = @options[:s3_protocol] ||
      Proc.new do |style, attachment|
        permission = (@s3_permissions[style.to_s.to_sym] || @s3_permissions[:default])
        permission = permission.call(attachment, style) if permission.is_a?(Proc)
        (permission == :public_read) ? 'http' : 'https'
      end
    @s3_metadata = @options[:s3_metadata] || {}
    @s3_headers = @options[:s3_headers] || {}
    @s3_headers = @s3_headers.call(instance) if @s3_headers.is_a?(Proc)
    @s3_headers = (@s3_headers).inject({}) do |headers,(name,value)|
      case name.to_s
      when /^x-amz-meta-(.*)/i
        @s3_metadata[$1.downcase] = value
      else
        name = name.to_s.downcase.sub(/^x-amz-/,'').tr("-","_").to_sym
        headers[name] = value
      end
      headers
    end

    @s3_headers[:storage_class] = @options[:s3_storage_class] if @options[:s3_storage_class]
    @s3_server_side_encryption = @options[:s3_server_side_encryption]

    unless @options[:url].to_s.match(/^:s3.*url$/) || @options[:url] == ":asset_host"
      @options[:path] = @options[:path].gsub(/:url/, @options[:url]).gsub(/^:rails_root\/public\/system/, '')
      @options[:url]  = ":s3_path_url"
    end
    @options[:url] = @options[:url].inspect if @options[:url].is_a?(Symbol)

    @http_proxy = @options[:http_proxy] || nil
  end

  Paperclip.interpolates(:s3_alias_url) do |attachment, style|
    "#{attachment.s3_protocol(style)}//#{attachment.s3_host_alias}/#{attachment.path(style).gsub(%r{^/}, "")}"
  end unless Paperclip::Interpolations.respond_to? :s3_alias_url
  Paperclip.interpolates(:s3_path_url) do |attachment, style|
    "#{attachment.s3_protocol(style)}//#{attachment.s3_host_name}/#{attachment.bucket_name}/#{attachment.path(style).gsub(%r{^/}, "")}"
  end unless Paperclip::Interpolations.respond_to? :s3_path_url
  Paperclip.interpolates(:s3_domain_url) do |attachment, style|
    "#{attachment.s3_protocol(style)}//#{attachment.bucket_name}.#{attachment.s3_host_name}/#{attachment.path(style).gsub(%r{^/}, "")}"
  end unless Paperclip::Interpolations.respond_to? :s3_domain_url
  Paperclip.interpolates(:asset_host) do |attachment, style|
    "#{attachment.path(style).gsub(%r{^/}, "")}"
  end unless Paperclip::Interpolations.respond_to? :asset_host
end
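The header-splitting step above routes any x-amz-meta-* header into the metadata hash and normalizes every other header name into an aws-sdk option key. A standalone sketch of that loop, assuming plain Ruby and a hypothetical split_headers helper (no Paperclip required):

```ruby
# Mimics the @s3_headers inject loop from self.extended: headers whose
# names start with "x-amz-meta-" become metadata entries; other names have
# any "x-amz-" prefix stripped and are converted to symbol option keys.
def split_headers(raw_headers)
  metadata = {}
  headers = raw_headers.inject({}) do |acc, (name, value)|
    case name.to_s
    when /^x-amz-meta-(.*)/i
      metadata[$1.downcase] = value
    else
      key = name.to_s.downcase.sub(/^x-amz-/, '').tr('-', '_').to_sym
      acc[key] = value
    end
    acc
  end
  [headers, metadata]
end

headers, metadata = split_headers(
  'x-amz-meta-Owner'    => 'alice',
  'X-Amz-Storage-Class' => 'REDUCED_REDUNDANCY',
  'Cache-Control'       => 'max-age=3600'
)
# headers  => { :storage_class => 'REDUCED_REDUNDANCY', :cache_control => 'max-age=3600' }
# metadata => { 'owner' => 'alice' }
```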
Instance Method Details
#bucket_name ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 193

def bucket_name
  @bucket = @options[:bucket] || s3_credentials[:bucket]
  @bucket = @bucket.call(self) if @bucket.is_a?(Proc)
  @bucket or raise ArgumentError, "missing required :bucket option"
end
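The Proc form mentioned in the overview resolves as follows. This is a standalone sketch: resolve_bucket is a hypothetical helper mirroring bucket_name's logic, not part of the library.

```ruby
# Mirrors bucket_name: a Proc bucket option is called with the attachment;
# a missing bucket raises ArgumentError.
def resolve_bucket(option, attachment = nil)
  bucket = option
  bucket = bucket.call(attachment) if bucket.is_a?(Proc)
  bucket or raise ArgumentError, "missing required :bucket option"
end

resolve_bucket("my-static-bucket")                          # => "my-static-bucket"
resolve_bucket(lambda { |a| "bucket-for-#{a}" }, "avatars") # => "bucket-for-avatars"
```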
#copy_to_local_file(style, local_dest_path) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 337

def copy_to_local_file(style, local_dest_path)
  log("copying #{path(style)} to local file #{local_dest_path}")
  local_file = ::File.open(local_dest_path, 'wb')
  file = s3_object(style)
  local_file.write(file.read)
  local_file.close
rescue AWS::Errors::Base => e
  warn("#{e} - cannot copy #{path(style)} to local file #{local_dest_path}")
  false
end
#create_bucket ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 294

def create_bucket
  s3_interface.buckets.create(bucket_name)
end
#exists?(style = default_style) ⇒ Boolean
# File 'lib/paperclip/storage/s3.rb', line 267

def exists?(style = default_style)
  if original_filename
    s3_object(style).exists?
  else
    false
  end
rescue AWS::Errors::Base => e
  false
end
#expiring_url(time = 3600, style_name = default_style) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 166

def expiring_url(time = 3600, style_name = default_style)
  if path
    base_options = { :expires => time, :secure => use_secure_protocol?(style_name) }
    s3_object(style_name).url_for(:read, base_options.merge(s3_url_options)).to_s
  end
end
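A typical call looks like the following. This is an illustrative usage fragment only: user and avatar are hypothetical names, not part of the library.

```ruby
# Generates a signed, time-limited S3 URL for the attachment.
user.avatar.expiring_url(600)           # URL valid for 10 minutes, default style
user.avatar.expiring_url(600, :thumb)   # same, for the :thumb style
```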
#flush_deletes ⇒ Object :nodoc:
# File 'lib/paperclip/storage/s3.rb', line 325

def flush_deletes #:nodoc:
  @queued_for_delete.each do |path|
    begin
      log("deleting #{path}")
      s3_bucket.objects[path.sub(%r{^/},'')].delete
    rescue AWS::Errors::Base => e
      # Ignore this.
    end
  end
  @queued_for_delete = []
end
#flush_writes ⇒ Object :nodoc:
# File 'lib/paperclip/storage/s3.rb', line 298

def flush_writes #:nodoc:
  @queued_for_write.each do |style, file|
    begin
      log("saving #{path(style)}")
      acl = @s3_permissions[style] || @s3_permissions[:default]
      acl = acl.call(self, style) if acl.respond_to?(:call)
      write_options = {
        :content_type => file.content_type,
        :acl => acl
      }
      write_options[:metadata] = @s3_metadata unless @s3_metadata.empty?
      unless @s3_server_side_encryption.blank?
        write_options[:server_side_encryption] = @s3_server_side_encryption
      end
      write_options.merge!(@s3_headers)
      s3_object(style).write(file, write_options)
    rescue AWS::S3::Errors::NoSuchBucket => e
      create_bucket
      retry
    end
  end
  after_flush_writes # allows attachment to clean up temp files
  @queued_for_write = {}
end
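The options hash assembled in flush_writes merges content type, ACL, metadata, and any extra headers before the upload. A standalone sketch of that assembly (plain Ruby; the values are placeholders and the local names only mirror the instance variables):

```ruby
# Placeholder stand-ins for @s3_metadata and @s3_headers.
s3_metadata = { 'owner' => 'alice' }
s3_headers  = { :cache_control => 'max-age=315576000' }

# Base options, then conditional metadata, then header overrides.
write_options = {
  :content_type => 'image/png',
  :acl          => :public_read
}
write_options[:metadata] = s3_metadata unless s3_metadata.empty?
write_options.merge!(s3_headers)
# write_options now carries everything passed to s3_object(style).write
```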
#http_proxy_host ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 235

def http_proxy_host
  using_http_proxy? ? @http_proxy[:host] : nil
end
#http_proxy_password ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 247

def http_proxy_password
  using_http_proxy? ? @http_proxy[:password] : nil
end
#http_proxy_port ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 239

def http_proxy_port
  using_http_proxy? ? @http_proxy[:port] : nil
end
#http_proxy_user ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 243

def http_proxy_user
  using_http_proxy? ? @http_proxy[:user] : nil
end
#parse_credentials(creds) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 260

def parse_credentials creds
  creds = creds.respond_to?('call') ? creds.call(self) : creds
  creds = find_credentials(creds).stringify_keys
  env = Object.const_defined?(:Rails) ? Rails.env : nil
  (creds[env] || creds).symbolize_keys
end
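The environment lookup works because a credentials hash keyed by environment name is indexed by the current Rails.env, while a flat hash falls through unchanged. A standalone sketch of that selection, assuming plain Ruby and a hypothetical pick_credentials helper (the real method also handles file paths via find_credentials):

```ruby
# Mirrors the selection in parse_credentials: Proc form is called first,
# keys are stringified, the env section is picked if present, and the
# chosen hash is returned with symbol keys.
def pick_credentials(creds, env)
  creds = creds.call if creds.respond_to?(:call)
  creds = creds.map { |k, v| [k.to_s, v] }.to_h   # stringify_keys
  chosen = creds[env] || creds
  chosen.map { |k, v| [k.to_sym, v] }.to_h        # symbolize_keys
end

nested = {
  'development' => { 'access_key_id' => 'abc' },
  'production'  => { 'access_key_id' => '456' }
}
flat = { 'access_key_id' => '456' }

pick_credentials(nested, 'production')  # => { :access_key_id => '456' }
pick_credentials(flat, 'production')    # => { :access_key_id => '456' }
```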
#s3_bucket ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 223

def s3_bucket
  @s3_bucket ||= s3_interface.buckets[bucket_name]
end
#s3_credentials ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 173

def s3_credentials
  @s3_credentials ||= parse_credentials(@options[:s3_credentials])
end
#s3_host_alias ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 181

def s3_host_alias
  @s3_host_alias = @options[:s3_host_alias]
  @s3_host_alias = @s3_host_alias.call(self) if @s3_host_alias.is_a?(Proc)
  @s3_host_alias
end
#s3_host_name ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 177

def s3_host_name
  @options[:s3_host_name] || s3_credentials[:s3_host_name] || "s3.amazonaws.com"
end
#s3_interface ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 199

def s3_interface
  @s3_interface ||= begin
    config = { :s3_endpoint => s3_host_name }

    if using_http_proxy?
      proxy_opts = { :host => http_proxy_host }
      proxy_opts[:port] = http_proxy_port if http_proxy_port
      if http_proxy_user
        userinfo = http_proxy_user.to_s
        userinfo += ":#{http_proxy_password}" if http_proxy_password
        proxy_opts[:userinfo] = userinfo
      end
      config[:proxy_uri] = URI::HTTP.build(proxy_opts)
    end

    [:access_key_id, :secret_access_key].each do |opt|
      config[opt] = s3_credentials[opt] if s3_credentials[opt]
    end

    AWS::S3.new(config.merge(@s3_options))
  end
end
#s3_object(style_name = default_style) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 227

def s3_object style_name = default_style
  s3_bucket.objects[path(style_name).sub(%r{^/},'')]
end
#s3_permissions(style = default_style) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 277

def s3_permissions(style = default_style)
  s3_permissions = @s3_permissions[style] || @s3_permissions[:default]
  s3_permissions = s3_permissions.call(self, style) if s3_permissions.is_a?(Proc)
  s3_permissions
end
#s3_protocol(style = default_style) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 283

def s3_protocol(style = default_style)
  protocol = if @s3_protocol.is_a?(Proc)
    @s3_protocol.call(style, self)
  else
    @s3_protocol
  end

  protocol = protocol.split(":").first + ":" unless protocol.empty?

  protocol
end
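The trailing normalization means a value like "https://" or "https" collapses to "https:", while an empty string stays empty for scheme-less URLs. A sketch of just that step, using a hypothetical normalize_protocol name:

```ruby
# Mirrors the last step of s3_protocol: keep only the scheme, re-add the
# colon, and leave an empty string untouched.
def normalize_protocol(protocol)
  protocol = protocol.split(":").first + ":" unless protocol.empty?
  protocol
end

normalize_protocol("https")    # => "https:"
normalize_protocol("http://")  # => "http:"
normalize_protocol("")         # => ""
```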
#s3_url_options ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 187

def s3_url_options
  s3_url_options = @options[:s3_url_options] || {}
  s3_url_options = s3_url_options.call(instance) if s3_url_options.is_a?(Proc)
  s3_url_options
end
#set_permissions(permissions) ⇒ Object
# File 'lib/paperclip/storage/s3.rb', line 251

def set_permissions(permissions)
  if permissions.is_a?(Hash)
    permissions[:default] = permissions[:default] || :public_read
  else
    permissions = { :default => permissions || :public_read }
  end
  permissions
end
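set_permissions normalizes whatever was given for :s3_permissions into a hash with a :default key, which is why both the per-style and global forms from the overview work. A standalone sketch with a hypothetical normalize_permissions name and the same logic:

```ruby
# nil becomes the :public_read default; a bare symbol becomes the global
# default; a hash gains a :default entry if it lacks one.
def normalize_permissions(permissions)
  if permissions.is_a?(Hash)
    permissions[:default] = permissions[:default] || :public_read
  else
    permissions = { :default => permissions || :public_read }
  end
  permissions
end

normalize_permissions(nil)       # => { :default => :public_read }
normalize_permissions(:private)  # => { :default => :private }
normalize_permissions(:original => :private)
# => { :original => :private, :default => :public_read }
```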
#using_http_proxy? ⇒ Boolean
# File 'lib/paperclip/storage/s3.rb', line 231

def using_http_proxy?
  !!@http_proxy
end