fog
The Ruby cloud services library.
I get the following warning while querying Amazon S3 via the Fog gem:
[WARNING] fog: followed redirect to my-bucket.s3-external-3.amazonaws.com, connecting to the matching region will be more performant
How exactly do I "connect to the matching region"?
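For context, fog picks the S3 endpoint from the region you hand it when opening the connection, so "connecting to the matching region" appears to just mean passing :region explicitly; a minimal sketch (the region value here is an assumption, use whichever region my-bucket actually lives in):

require 'fog'

# Connecting straight to the bucket's home region avoids the redirect.
storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
  :region                => 'eu-west-1' # assumed; match your bucket's region
)

files = storage.directories.get('my-bucket').files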
Source: (StackOverflow)
Every Rails command (rails server, rails console, rake db:migrate, etc.) has raised this warning since my last bundle update:
[fog][WARNING] Unable to load the 'unf' gem. Your AWS strings may not be properly encoded.
I'm sure I didn't change anything in the AWS strings which are in my application.rb file:
# Amazon S3 credentials
ENV["AWS_ACCESS_KEY_ID"] = "AWS_ACCESS_KEY_ID"
ENV["AWS_SECRET_ACCESS_KEY"] = "AWS_SECRET_ACCESS_KEY"
ENV["AWS_S3_BUCKET"] = "my-bucket"
I don't have this "unf" gem in my Gemfile. Should I add it?
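For reference, the warning appears to go away once the gem can be loaded; a minimal Gemfile addition (left unpinned on purpose):

gem 'unf' # Unicode normalization library fog wants for AWS strings

followed by a bundle install.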
Source: (StackOverflow)
I am creating a rails app that lets an administrator upload photos that are optionally publicly displayed. For the upload / storage process I am using the Carrierwave gem along with the Fog gem and S3. The issue is that in order to get this all working, I have to make every file uploaded to the s3 bucket public. Is there a way to make files public / private on a file-by-file basis? Also, if this file-by-file granularity is possible, can it extend down to versions of images (created by automatic Carrierwave resizing)?
Currently, I have the following line in my carrierwave initializer:
config.fog_public = true
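For what it's worth, fog_public is read off the uploader, so a commonly suggested pattern is a second uploader class that overrides it; a sketch (the class name is my invention):

class PrivateImageUploader < CarrierWave::Uploader::Base
  storage :fog

  # Files stored through this uploader get a private ACL on S3.
  def fog_public
    false
  end
end

With fog_public false, url should return a signed, expiring URL rather than a public one; whether the flag can also vary per version likely depends on the CarrierWave release.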
Source: (StackOverflow)
I'm trying to find documentation on how to set up Paperclip to use fog.io and fog.io to use Rackspace Cloud Files, but I wasn't able to find any good reference (and I consider myself a fluent speaker of the Google language :D). The ideal scenario would be a setup where I could use local storage for the development environment and Rackspace for production.
Could anyone point to a good doc or use this space to document this approach?
Thanks!
Update:
Paperclip to fog.io
https://github.com/thoughtbot/paperclip/blob/master/lib/paperclip/storage/fog.rb
fog.io to Rackspace Cloud File
http://fog.io/1.1.1/storage/
... still trying to figure out how to put these together.
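In case it helps, a rough sketch of how the pieces might fit together (the model, attachment name, container, and env var names are all my assumptions):

# app/models/photo.rb
class Photo < ActiveRecord::Base
  has_attached_file :image,
    # local storage in development, Cloud Files in production
    :storage         => Rails.env.production? ? :fog : :filesystem,
    :fog_credentials => {
      :provider           => 'Rackspace',
      :rackspace_username => ENV['RACKSPACE_USERNAME'],
      :rackspace_api_key  => ENV['RACKSPACE_API_KEY']
    },
    :fog_directory   => 'my-container', # Cloud Files container name
    :fog_public      => true
end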
Source: (StackOverflow)
I am trying to use CarrierWave with Amazon S3. When I try to upload a file, through a rake task, I get this error:
rake aborted!
Expected(200) <=> Actual(403 Forbidden)
My CarrierWave initializer looks like this:
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: MY_AWS_ACCESS_KEY_ID,
    aws_secret_access_key: MY_AWS_SECRET_ACCESS_KEY
  }
  config.fog_directory = MY_BUCKET
  config.fog_public = true
end
I do have real, hard-coded key/secret/bucket values set while I'm debugging this.
The rake task looks like this, and it succeeds when the CarrierWave uploader is set to upload locally with storage :file:
Photo.create({
  image: File.new('lib/dummy_files/image.jpg')
})
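One way to rule the credentials in or out is to hit the bucket with Fog directly, outside CarrierWave; a debugging sketch reusing the same constants:

require 'fog'

storage = Fog::Storage.new(
  provider:              'AWS',
  aws_access_key_id:     MY_AWS_ACCESS_KEY_ID,
  aws_secret_access_key: MY_AWS_SECRET_ACCESS_KEY
)

# A 403 here as well would point at the keys or the bucket policy
# rather than at CarrierWave.
puts storage.directories.get(MY_BUCKET).files.all.inspect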
Any help is much appreciated. Thanks!
Source: (StackOverflow)
I am a little lost with Heroku and the CarrierWave gem. I have read the wiki and readme and searched the net, and I admit I need help. Everything works locally, but Heroku crashes the application.
///ERROR MESSAGE FROM HEROKU LOGS
2012-01-03T17:33:26+00:00 app[web.1]: /app/vendor/bundle/ruby/1.9.1/gems/carrierwave-0.5.8/lib/carrierwave/uploader/configuration.rb:91:in `eval': uninitialized constant CarrierWave::Storage::Fog (NameError)
///GEM FILE
gem "fog"
gem 'carrierwave'
/app/uploaders/avatar_uploader.rb
storage :fog
/config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'XXXX',
    :aws_secret_access_key => 'XXXX',
    :region => 'eu-west-1' # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'site_images' # required
  config.fog_public = true # optional, defaults to true
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'} # optional, defaults to {}
end
When I change the storage to :file instead of :fog, I do not get errors. Are there any other fog settings I am skipping or missing? Any help greatly appreciated. Do I need to create a separate document with fog settings?
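One guess, not a confirmed fix: CarrierWave::Storage::Fog only gets defined if the fog gem has been loaded by the time storage :fog is evaluated, so forcing the load order at the top of the initializer may be worth a try:

# config/initializers/carrierwave.rb
require 'fog'         # make sure Fog is loaded before CarrierWave resolves storage :fog
require 'carrierwave'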
Source: (StackOverflow)
I'm currently getting the following error: Excon::Errors::SocketError - Broken pipe (Errno::EPIPE)
when uploading images bigger than about 150kb. Images under 150kb work correctly. Research indicates that others have also experienced this problem, but I've yet to find a solution.
Error message
Excon::Errors::SocketError at /photos
Message Broken pipe (Errno::EPIPE)
File /Users/thmsmxwll/.rvm/rubies/ruby-1.9.3-p194/lib/ruby/1.9.1/openssl/buffering.rb
Line 375
image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::RMagick

  storage :fog

  include CarrierWave::MimeTypes
  process :set_content_type

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  version :large do
    process :resize_to_limit => [800, 600]
  end
end
carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region => 'us-east-1'
  }
  config.fog_directory = 'abcd'
  config.fog_public = true
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'}
end
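To separate CarrierWave from the connection itself, the same upload can be attempted with Fog alone; a sketch reusing the config above (the local file path is a placeholder):

require 'fog'

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
  :region                => 'us-east-1'
)

# If a body over ~150kb raises EPIPE here too, the problem sits below
# CarrierWave (SSL buffering / network) rather than in the uploader.
storage.directories.get('abcd').files.create(
  :key    => 'epipe-test.jpg',
  :body   => File.open('/path/to/big_image.jpg'),
  :public => true
)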
Source: (StackOverflow)
I am trying to set up my rails app to upload its assets to Amazon's AWS S3 using the asset_sync gem, following these instructions. I know I've got my S3 stuff configured right because my app is otherwise able to upload images to S3. I'm pretty sure I've got all the settings correct:
FOG_DIRECTORY => mybucketname
FOG_PROVIDER => AWS
FOG_REGION => s3-us-west-2
Yet I keep getting an error:
-bash> heroku run rake assets:precompile --remote staging
Running rake assets:precompile attached to terminal... up, run.1
AssetSync: using default configuration from built-in initializer
mkdir -p /app/public/assets
...
mkdir -p /app/public/assets
AssetSync: Syncing.
rake aborted!
getaddrinfo: Name or service not known # <-- error
Compiling locally produces a slightly different error:
-bash> bundle exec rake assets:precompile
AssetSync: using default configuration from built-in initializer
mkdir -p /Users/bart/Dev/MyApp/myapp/public/assets
...
mkdir -p /Users/bart/Dev/MyApp/myapp/public/assets
AssetSync: Syncing.
rake aborted!
getaddrinfo: nodename nor servname provided, or not known # <-- error
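One observation, offered as a guess: fog builds the S3 hostname from the region value, so it expects the bare region identifier, and "s3-us-west-2" looks like the endpoint prefix rather than the region name, which would produce exactly this kind of DNS (getaddrinfo) failure. What I believe the settings should look like:

FOG_DIRECTORY => mybucketname
FOG_PROVIDER  => AWS
FOG_REGION    => us-west-2   # bare region id, not the s3-... endpoint prefix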
Source: (StackOverflow)
I'm using the excellent Fog gem to access just the Rackspace Cloud Files service. My challenge is that I'm trying to keep the service that accesses Cloud Files lightweight, and it seems that Fog, through its flexibility, has a lot of dependencies and code I'll never need.
Has anybody tried to build a slimmed down copy of Fog, just to include a subset of providers, and therefore limit the dependencies? For example, for the Rackspace Cloud Files API exclusively, I'd expect to be able to handle everything without net-ssh, net-scp, nokogiri gems, and all the unused code for Amazon, Rackspace and 20 other providers that are not being used. I'm hoping to avoid upgrading the gem every time one of those unused providers notices a bug, while keeping my memory footprint down.
I'd appreciate any experience anybody may have in doing this, or advice from anybody familiar with building Fog about what I can and can't rip out.
If I'm just using the wrong gem, then that's equally fine. I'll move to something more focused.
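For completeness, one partial measure I'm aware of (it trims what gets loaded, though not what gets installed): Bundler can require just the Rackspace storage piece instead of all of fog — assuming the require path matches the fog version in use:

# Gemfile
gem 'fog', :require => 'fog/rackspace/storage' # load only the Cloud Files service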
Source: (StackOverflow)
I have a bunch of files on s3. I have fog set up with a .fog config file so I can fire up fog
and get a prompt. Now how do I access and edit a file on s3, if I know its path?
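For the record, this is roughly the shape it should take from the prompt, assuming credentials come from the .fog file (bucket and key are placeholders):

directory = Fog::Storage[:aws].directories.get('my-bucket')
file      = directory.files.get('path/to/file.txt')

file.body                  # read the current contents
file.body = 'new contents' # stage a replacement body
file.save                  # PUT it back to S3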
Source: (StackOverflow)
I've been searching for the reason for this error for a long time and can't seem to find anything...
So I have a Rails app, and I use CarrierWave for picture uploads. I also want to use Amazon S3 for file upload storage in my app.
Initially, while developing the app, I set file uploads to use :file storage, i.e.
image_uploader.rb
# Choose what kind of storage to use for this uploader:
storage :file
# storage :fog
Now, upon finishing development and putting it live (I use Heroku), I decided to change the CarrierWave storage to S3 to test it locally.
image_uploader.rb
# Choose what kind of storage to use for this uploader:
# storage :file
storage :fog
However, now when I try to upload a picture (be it a user avatar, etc.) I get this error:
Excon::Errors::Forbidden in UsersController#update
Expected(200) <=> Actual(403 Forbidden)
request => {:connect_timeout=>60, :headers=>{"Content-Length"=>74577, "x-amz- acl"=>"private", "Content-Type"=>"image/png", "Date"=>"Sun, 26 Feb 2012 10:00:43 +0000", "Authorization"=>"AWS AKIAJOCDPFOU7UTT4HOQ:8ZnOy7X71nQAM87yraSI24Y5bSw=", "Host"=>"s3.amazonaws.com:443"}, :instrumentor_name=>"excon", :mock=>false, :read_timeout=>60, :retry_limit=>4, :ssl_verify_peer=>true, :write_timeout=>60, :host=>"s3.amazonaws.com", :path=>"/uploads//uploads%2Fuser%2Favatar%2F1%2Fjeffportraitmedium.png", :port=>"443", :query=>nil, :scheme=>"https", :body=>"\x89PNG\r\n\x1A\n\x00\x00\x00\rIHDR\x00\x00\x00\xC2\x00\x00\x00\xC3\b\x06\x00\x00\x00\xD0\xBD\xCE\x94\x00\x00\nCiCCPICC Profile\x00\x00x\x01\x9D\x96wTSY\x13\xC0\xEF{/\xBD\xD0\x12B\x91\x12z\rMJ\x00\x91\x12z\x91^E%$\
...
# The code you see above to the far right repeats itself a LOT
...
1@\x85\xB5\t\xFC_y~\xA6=:\xB2\xD0^\xBB~i\xBB\x82\x8F\x9B\xAF\xE7\x04m\xB2i\xFF\x17O\x94S\xF7l\x87\xA8&\x00\x00\x00\x00IEND\xAEB`\x82", :expects=>200, :idempotent=>true, :method=>"PUT"}
response => #<Excon::Response:0x007fc88ca9f3d8 @body="<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>8EFA56C0DDDC8878</RequestId><HostId>1OxWXppSSUq1MFjQwvnFptuCM3gKOuKdlQQyVSEgvzzv4Aj+r2hSFM2UUw2NYyrR</HostId></Error>", @headers={"x-amz-request-id"=>"8EFA56C0DDDC8878", "x-amz-id-2"=>"1OxWXppSSUq1MFjQwvnFptuCM3gKOuKdlQQyVSEgvzzv4Aj+r2hSFM2UUw2NYyrR", "Content-Type"=>"application/xml", "Transfer-Encoding"=>"chunked", "Date"=>"Sun, 26 Feb 2012 10:00:47 GMT", "Connection"=>"close", "Server"=>"AmazonS3"}, @status=403>
And then it says this as well for my application trace:
app/controllers/users_controller.rb:39:in `update'
And my REQUEST parameters:
{"utf8"=>"✓",
"_method"=>"put",
"authenticity_token"=>"DvADD1vYpCLcghq+EIOwVSjsfmAWCHhtA3VI5VGD/q8=",
"user"=>{"avatar"=>#<ActionDispatch::Http::UploadedFile:0x007fc88cde76f8
@original_filename="JeffPortraitMedium.png",
@content_type="image/png",
@headers="Content-Disposition: form-data; name=\"user[avatar]\";
filename=\"JeffPortraitMedium.png\"\r\nContent-Type: image/png\r\n",
@tempfile=#<File:/var/folders/vg/98nv58ss4v7gcbf8px_8dyqc0000gq/T/RackMultipart20120226- 19096-1ppu2sr>>,
"remote_avatar_url"=>"",
"name"=>"Jeff Lam ",
"email"=>"email@gmail.com",
"user_bio"=>"Tester Hello",
"shop"=>"1"},
"commit"=>"Update Changes",
"id"=>"1"}
Here's my users_controller.rb partial code:
def update
  @user = User.find(params[:id])
  if @user.update_attributes(params[:user])
    redirect_back_or root_path
    flash[:success] = "You have updated your settings successfully."
  else
    flash.now[:error] = "Sorry! We are unable to update your settings. Please check your fields and try again."
    render 'edit'
  end
end
My image_uploader.rb code
# encoding: utf-8
class ImageUploader < CarrierWave::Uploader::Base
  # Include RMagick or MiniMagick support:
  # include CarrierWave::RMagick
  include CarrierWave::MiniMagick

  # Choose what kind of storage to use for this uploader:
  # storage :file
  storage :fog

  # Override the directory where uploaded files will be stored.
  # This is a sensible default for uploaders that are meant to be mounted:
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  # Provide a default URL as a default if there hasn't been a file uploaded:
  # def default_url
  #   "/images/fallback/" + [version_name, "default.png"].compact.join('_')
  # end

  # Process files as they are uploaded:
  # process :scale => [200, 300]
  #
  # def scale(width, height)
  #   # do something
  # end

  # Create different versions of your uploaded files:
  version :thumb do
    process resize_to_fill: [360, 250]
  end

  version :cover_photo_thumb do
    process resize_to_fill: [1170, 400]
  end

  version :event do
    process resize_to_fill: [550, 382]
  end

  version :product do
    process resize_to_fit: [226, 316]
  end

  # Add a white list of extensions which are allowed to be uploaded.
  # For images you might use something like this:
  def extension_white_list
    %w(jpg jpeg gif png)
  end

  # Override the filename of the uploaded files:
  # Avoid using model.id or version_name here, see uploader/store.rb for details.
  # def filename
  #   "something.jpg" if original_filename
  # end

  # Fix for Heroku; unfortunately, it disables caching.
  # See: https://github.com/jnicklas/carrierwave/wiki/How-to%3A-Make-Carrierwave-work-on-Heroku
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end
Finally, my fog.rb file in config/initializers:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS', # required
    :aws_access_key_id => 'ACCESS_KEY', # required
    :aws_secret_access_key => 'SECRET_ACCESS_KEY/ZN5SkOUtOEHd61/Cglq9', # required
    :region => 'Singapore' # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'ruuva/' # required
  config.fog_public = false # optional, defaults to true
end
I'm actually quite confused about some of the things in my fog.rb. Firstly, should I change my region to Singapore, given that I created a bucket called "ruuva" with region "Singapore" on my Amazon S3 account?
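For what it's worth, fog expects the AWS region identifier rather than the console's location name, and the double slash in the failing request path (/uploads//uploads...) suggests the trailing slash on the bucket name is also getting in the way; what I assume was intended:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'ACCESS_KEY',
    :aws_secret_access_key => 'SECRET_ACCESS_KEY',
    :region                => 'ap-southeast-1' # AWS identifier for the Singapore region
  }
  config.fog_directory = 'ruuva' # bucket name, without the trailing slash
  config.fog_public    = false
end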
Thank you to anyone that can help in advance!
Source: (StackOverflow)
I'm uploading my images with Carrierwave and Fog to S3. On the upload I also create a thumbnail version of the image:
version :thumb do
  process :resize_to_limit => [90, 80], if: :is_resizable?
end
Now I need a way to check whether the thumbnail version exists. The documentation lists the exists? method, and it works when I check the existence of the original version:
asset.file.exists? # => true
But when I use the "thumb" version like this:
asset.url(:thumb).file.exists?
I get:
undefined method 'exists?' for #<String:0x007fcd9f9d9620>
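Since url(:thumb) returns a plain String, the check presumably has to go through the version object itself, which is an uploader in its own right; a sketch, assuming asset is the mounted uploader:

asset.thumb.file.exists? # version objects respond to file just like the original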
Source: (StackOverflow)
I'm trying to figure out how to setup CarrierWave to work with Fog and Amazon S3. On S3, I have a bucket, "bucket1" with folder "images". Uploads work fine. For example, an image might get uploaded to something of the form https://s3.amazonaws.com/bucket1/images/picture/pic1.jpg. However, in the show view, when I call the image_url helper, I get https://s3.amazonaws.com/images/picture/pic1.jpg. What am I missing here?
#config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'aws_key',
    :aws_secret_access_key => 'aws_secret'
  }
  config.fog_directory = 'bucket1'
  config.fog_host = 'https://s3.amazonaws.com'
  config.fog_public = true
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'}
end
#app/uploader/image_uploader.rb
def store_dir
  "images/#{model.class.to_s.underscore}"
end
#app/views/pictures/show.html.erb
<%= image_tag @picture.image_url if @picture.image? %>
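My reading of CarrierWave's fog storage, offered as an assumption: when fog_host is set, the public URL is built as fog_host plus the store path and the bucket is never inserted, so the host has to carry the bucket itself:

config.fog_host = 'https://s3.amazonaws.com/bucket1' # or drop fog_host and let fog build the full URL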
Source: (StackOverflow)
Here is the Fog walkthrough of creating a file (an S3 object) in a directory (an S3 bucket):
connection = Fog::Storage.new({
  :provider => 'AWS',
  :aws_access_key_id => YOUR_AWS_ACCESS_KEY_ID,
  :aws_secret_access_key => YOUR_AWS_SECRET_ACCESS_KEY
})

directory = connection.directories.create(
  :key => "fog-demo-#{Time.now.to_i}", # globally unique name
  :public => true
)

file = directory.files.create(
  :key => 'resume.html',
  :body => File.open("/path/to/my/resume.html"),
  :public => true
)
But it looks to me as though this requires 2 API calls:
connection.directories.create
directory.files.create
If I already have the directory (an S3 bucket) created, how do I create a file (an S3 object) with only one Fog call?
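One approach: collections respond to new as well as create, and new only builds a local proxy without any HTTP request, so only the file creation touches the API; a sketch with an assumed bucket name:

# No request happens here; this just references the existing bucket.
directory = connection.directories.new(:key => 'my-existing-bucket')

# This is the single API call (one PUT).
file = directory.files.create(
  :key    => 'resume.html',
  :body   => File.open('/path/to/my/resume.html'),
  :public => true
)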
Source: (StackOverflow)
I'm having this problem trying to use S3 services with fog and the jQuery File Upload plugin (https://github.com/blueimp/jQuery-File-Upload).
The error
Excon::Errors::SocketError (getaddrinfo: nodename nor servname provided, or not known (SocketError)):
This occurs when I try to call the "save" method in the controller. I'm setting up CarrierWave as follows:
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'access_key_here',
    :aws_secret_access_key => 'secret_key_here',
    :region => 'eu-east-1'
  }
  config.fog_directory = 'folder_name_here'
  config.fog_public = false
  config.fog_attributes = {'Cache-Control'=>'max-age=315576000'}
  config.storage = :fog
end
My uploader has just the "config.storage :fog" and the "store_dir" settings.
I have already created my bucket. Am I missing some configuration? It doesn't even work in my dev environment.
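One thing that stands out: "eu-east-1" is not an AWS region ("eu-west-1" is the EU region), and since fog derives the S3 hostname from the region, a non-existent region would produce exactly this getaddrinfo error; the credentials hash I would try (keys are placeholders as above):

config.fog_credentials = {
  :provider              => 'AWS',
  :aws_access_key_id     => 'access_key_here',
  :aws_secret_access_key => 'secret_key_here',
  :region                => 'eu-west-1' # 'eu-east-1' does not exist
}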
Any help appreciated. Thanks in advance.
Source: (StackOverflow)