Class: Rds::S3::Backup::MyS3

Inherits: Object
Defined in: lib/rds-s3-backup/mys3.rb

Instance Method Summary
- #backup_bucket ⇒ Object
- #destroy ⇒ Object
  Nothing really to do here…
- #get_bucket(bucket) ⇒ Object
  Retrieve a pointer to an AWS S3 bucket.
- #get_storage(o = {}) ⇒ Object
  Make a connection to AWS S3 using Fog::Storage.
- #initialize(options) ⇒ MyS3 (constructor)
  A new instance of MyS3.
- #prune_files(o = {}) ⇒ Object
  Remove older files from S3.
- #s3 ⇒ Object
  Lazy loaders.
- #s3_bucket ⇒ Object
- #save(bucket, file_path, o = {}) ⇒ Object
  Perform the actual save from a local file_path to AWS S3.
- #save_clean(file_path) ⇒ Object
  Save the cleaned (obfuscated) version of the database.
- #save_production(file_path) ⇒ Object
  Save the real version of the production database.
Constructor Details

#initialize(options) ⇒ MyS3

Returns a new instance of MyS3.

  # File 'lib/rds-s3-backup/mys3.rb', line 19
  def initialize(options)
    @options = options
  end
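The constructor simply stores the options hash. Judging from the keys the other methods of this class read from `@options`, a minimal hash might look like the sketch below; every value is a hypothetical placeholder, not a real credential or bucket name.

```ruby
# Hypothetical options hash for MyS3.new. The keys are the ones this
# class reads elsewhere; all values here are placeholders.
options = {
  'aws_access_key_id'     => 'AKIAEXAMPLE',     # used by #s3
  'aws_secret_access_key' => 'example-secret',  # used by #s3
  'aws_s3_region'         => 'us-east-1',       # used by #s3 (falls back to 'aws_region')
  's3_bucket'             => 'prod-backups',    # used by #s3_bucket
  'backup_bucket'         => 'clean-backups',   # used by #backup_bucket
  's3_prefix'             => 'db-dumps',        # used by #save to build the object key
}
```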
Instance Method Details
#backup_bucket ⇒ Object
  # File 'lib/rds-s3-backup/mys3.rb', line 34
  def backup_bucket
    @backup_bucket ||= get_bucket(@options['backup_bucket'])
  end
#destroy ⇒ Object
nothing really to do here…
  # File 'lib/rds-s3-backup/mys3.rb', line 158
  def destroy
  end
#get_bucket(bucket) ⇒ Object
Retrieve a pointer to an AWS S3 Bucket
  # File 'lib/rds-s3-backup/mys3.rb', line 116
  def get_bucket(bucket)
    begin
      bucket = self.s3.directories.get(bucket)
    rescue Exception => e
      raise MyS3Exception.new "Error getting bucket #{bucket} in S3: #{e.class}: #{e}"
    end

    raise MyS3Exception.new "In #{self.class}#get_bucket: bucket is nil!" if bucket.nil?

    bucket
  end
#get_storage(o = {}) ⇒ Object
Make a connection to AWS S3 using Fog::Storage
  # File 'lib/rds-s3-backup/mys3.rb', line 94
  def get_storage(o = {})
    options = { :aws_access_key_id     => nil,
                :aws_secret_access_key => nil,
                :region                => nil,
                :provider              => 'AWS',
                :scheme                => 'https' }.merge(o)

    begin
      storage = Fog::Storage.new(options)
    rescue Exception => e
      raise MyS3Exception.new "Error establishing storage connection: #{e.class}: #{e}"
    end

    $logger.debug "What is storage? #{storage.class}:#{storage.inspect}"

    raise MyS3Exception.new "In #{self.class}#get_storage: storage is nil!" if storage.nil?

    storage
  end
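The defaults-then-merge idiom used here is worth noting: keys supplied by the caller in `o` override the hard-coded defaults, while untouched defaults survive. A standalone sketch of just that pattern:

```ruby
# Defaults-then-merge, as used by #get_storage: the caller's keys win,
# defaults fill in everything the caller did not specify.
defaults = { :provider => 'AWS', :scheme => 'https', :region => nil }
merged   = defaults.merge(:region => 'eu-west-1')

merged[:provider] # => "AWS" (default kept)
merged[:region]   # => "eu-west-1" (caller override)
```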
#prune_files(o = {}) ⇒ Object
Remove older files from S3
Input

o – an options hash, expecting the following keys:
- :prefix – the prefix to use with the S3 bucket
- :keep – the number of files to keep. Must keep at least one.
  # File 'lib/rds-s3-backup/mys3.rb', line 70
  def prune_files(o = {})
    options = { :prefix => '', :keep => 1 }.merge(o)

    # Must keep at least one, the last one!
    raise MyS3Exception.new "Must keep at least one file. options[:keep] = #{options[:keep]}" if options[:keep] < 1

    my_files = s3_bucket.files.all('prefix' => options[:prefix])
    return if my_files.nil?

    if my_files.count > options[:keep]
      my_files.
        sort { |x, y| x.last_modified <=> y.last_modified }.
        take(my_files.count - options[:keep]).
        each do |f|
          $logger.info "Deleting #{f.name}"
          f.destroy
        end
    end
  end
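The pruning rule above can be exercised without S3 at all: sort by `last_modified` ascending, then take everything except the newest `keep` entries. A pure-Ruby sketch, where `FakeFile` is a hypothetical stand-in for a Fog file object:

```ruby
# Pure-Ruby sketch of the rule in #prune_files: sort oldest-first,
# then select all but the newest `keep` files for deletion.
FakeFile = Struct.new(:name, :last_modified)

files = [
  FakeFile.new('dump-2013-01-01', Time.utc(2013, 1, 1)),
  FakeFile.new('dump-2013-01-03', Time.utc(2013, 1, 3)),
  FakeFile.new('dump-2013-01-02', Time.utc(2013, 1, 2)),
]

keep   = 1
doomed = files.sort { |x, y| x.last_modified <=> y.last_modified }.
               take(files.count - keep)

doomed.map(&:name) # => ["dump-2013-01-01", "dump-2013-01-02"]
```

Only the two oldest dumps are selected; the newest one survives, matching the "must keep at least one, the last one" rule.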
#s3 ⇒ Object
Lazy loaders
  # File 'lib/rds-s3-backup/mys3.rb', line 24
  def s3
    @s3 ||= get_storage(:aws_access_key_id     => @options['aws_access_key_id'],
                        :aws_secret_access_key => @options['aws_secret_access_key'],
                        :region                => @options['aws_s3_region'] ||= @options['aws_region'])
  end
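The `||=` idiom behind `#s3`, `#s3_bucket` and `#backup_bucket` means the costly connection or bucket lookup runs only on the first call; later calls reuse the cached value. A minimal sketch, where `Connection` is a hypothetical stand-in for the real Fog storage object:

```ruby
# The ||= memoization idiom used by the lazy loaders: the expensive
# call runs once, subsequent calls return the cached result.
class Connection
  attr_reader :opens

  def initialize
    @opens = 0
  end

  def storage
    @storage ||= begin
      @opens += 1 # pretend this is the costly Fog::Storage.new call
      :connected
    end
  end
end

conn = Connection.new
3.times { conn.storage }
conn.opens # => 1: the connection was only opened once
```

One caveat of `||=`: if the memoized expression can legitimately return `nil` or `false`, it will be re-evaluated on every call.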
#s3_bucket ⇒ Object
  # File 'lib/rds-s3-backup/mys3.rb', line 30
  def s3_bucket
    @s3_bucket ||= get_bucket(@options['s3_bucket'])
  end
#save(bucket, file_path, o = {}) ⇒ Object
Perform the actual save from a local file_path to AWS S3
  # File 'lib/rds-s3-backup/mys3.rb', line 130
  def save(bucket, file_path, o = {})
    raise MyS3Exception.new "bucket is nil!!" if bucket.nil?

    options = { :key          => File.join(@options['s3_prefix'], File.basename(file_path)),
                :body         => File.open(file_path),
                :acl          => 'authenticated-read',
                :encryption   => 'AES256',
                :content_type => 'application/x-gzip' }.merge(o)

    tries = 0
    begin
      bucket.files.new(options).save
    rescue Exception => e
      if tries < 3
        $logger.info "Retrying S3 upload after #{tries} tries"
        tries += 1
        sleep tries * 60 # progressive back-off
        retry
      else
        raise MyS3Exception.new "Could not save #{File.basename(file_path)} to S3 after 3 tries: #{e.class}: #{e}"
      end
    end
  end
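The retry loop above uses Ruby's `begin`/`rescue`/`retry` with a progressively longer sleep between attempts. A runnable sketch of the same pattern, with the S3 upload replaced by a stub that fails twice and then succeeds, and the 60-second back-off shortened so it runs instantly:

```ruby
# Sketch of the retry-with-progressive-back-off loop in #save.
attempts = 0
tries = 0
result =
  begin
    attempts += 1
    # Stub for bucket.files.new(options).save: fail on the first
    # two attempts, succeed on the third.
    raise 'transient S3 error' if attempts < 3
    :saved
  rescue => e
    if tries < 3
      tries += 1
      sleep tries * 0.001 # stands in for `sleep tries * 60`
      retry               # re-runs the begin block from the top
    else
      raise "Could not save after 3 tries: #{e}"
    end
  end

result # => :saved, after two failed attempts
```

Note that `retry` re-executes the whole `begin` body, which is why the back-off counter has to live outside it.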
#save_clean(file_path) ⇒ Object

Save the cleaned (obfuscated) version of the database.

Input

- file_path – the local path of the file to upload to S3

  # File 'lib/rds-s3-backup/mys3.rb', line 57
  def save_clean(file_path)
    save(backup_bucket, file_path, :acl => 'authenticated-read')
  end
#save_production(file_path) ⇒ Object

Save the real version of the production database.

Input

- file_path – the local path of the production database dump to upload to S3

  # File 'lib/rds-s3-backup/mys3.rb', line 46
  def save_production(file_path)
    save(s3_bucket, file_path, :acl => 'private')
  end