fakeweb
Ruby test helper for injecting fake responses to web requests
I never liked writing mocks, and a while ago someone here recommended FakeWeb. I immediately fell in love with it. However, I have to wonder whether there is a downside to using FakeWeb. Mocks still seem to be much more common, so what am I missing? Is there a certain kind of error you can't cover with FakeWeb, or is it something about the TDD or BDD process?
Source: (StackOverflow)
I am writing a gem and I want to check that it performs an HTTP request with the parameters, headers, and content it's supposed to pass. How do I write a unit test for that?
I am using HTTParty to make the request, and FakeWeb to test behavior after the response.
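One stdlib-only way to sketch this (no gems; FakeWeb itself also records the outgoing request as FakeWeb.last_request, which you can assert against): replace Net::HTTP#request in the test so it captures the request object instead of sending it. The endpoint, method, and payload below are made up for illustration.

```ruby
require "net/http"
require "uri"

# Capture the outgoing request instead of sending it; a hand-rolled
# stand-in for what FakeWeb.last_request gives you.
$captured_request = nil

class Net::HTTP
  def request(req, body = nil)
    $captured_request = req
    Net::HTTPOK.new("1.1", "200", "OK")   # canned empty response
  end
end

# Code under test would build a request like this (hypothetical endpoint):
uri = URI("http://api.example.com/widgets")
req = Net::HTTP::Post.new(uri.path, "Content-Type" => "application/json")
req.body = '{"name":"sprocket"}'
Net::HTTP.new(uri.host, uri.port).request(req)

# Now assert on exactly what would have gone over the wire:
$captured_request.path               # "/widgets"
$captured_request["Content-Type"]    # "application/json"
$captured_request.body               # '{"name":"sprocket"}'
```

The same assertions work against FakeWeb.last_request when the gem is loaded, since it hands back the Net::HTTPRequest object.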
Source: (StackOverflow)
I am trying to test my app's remote calls. I use Rails 3.2 and the latest versions of VCR and FakeWeb. I have watched the RailsCast, and now I want to write an "anti test" (a negative test), but I cannot.
My request test file;
require "spec_helper"

describe "ZipCodeLookup" do
  it "show Beverly Hills given 90210", :vcr do
    visit root_path
    fill_in "zip_code", with: "90210"
    click_on "Lookup"
    page.should have_content("Beverly Hills")
  end

  it "any errors" do
    visit root_path
    fill_in "zip_code", with: "9"
    click_on "Lookup"
    page.should have_content("Nomethods")
  end
end
Don't get stuck on the page.should line; I was just trying out how to write an anti test.
And my model:
class ZipCode
  attr_reader :state, :city, :area_code, :time_zone

  def initialize(zip)
    client = Savon::Client.new("http://www.webservicex.net/uszip.asmx?WSDL")
    response = client.request :web, :get_info_by_zip, body: { "USZip" => zip }
    data = response.to_hash[:get_info_by_zip_response][:get_info_by_zip_result][:new_data_set][:table]
    @state     = data[:state]
    @city      = data[:city]
    @area_code = data[:area_code]
    @time_zone = data[:time_zone]
  end
end
And this is the error when I run bundle exec rspec spec/requests/zip_code_lookup_spec.rb:
Ender-iMac:zip_coder-after ender$ bundle exec rspec spec/requests/zip_code_lookup_spec.rb
HTTPI executes HTTP GET using the net_http adapter
SOAP request: http://www.webservicex.net/uszip.asmx
SOAPAction: "http://www.webserviceX.NET/GetInfoByZIP", Content-Type: text/xml;charset=UTF-8, Content-Length: 389
<?xml version="1.0" encoding="UTF-8"?><env:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:web="http://www.webserviceX.NET" xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ins0="http://www.webserviceX.NET"><env:Body><ins0:GetInfoByZIP><ins0:USZip>90210</ins0:USZip></ins0:GetInfoByZIP></env:Body></env:Envelope>
HTTPI executes HTTP POST using the net_http adapter
SOAP response (status 200):
<?xml version="1.0" encoding="utf-8"?><soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><soap:Body><GetInfoByZIPResponse xmlns="http://www.webserviceX.NET"><GetInfoByZIPResult><NewDataSet xmlns=""><Table><CITY>Beverly Hills</CITY><STATE>CA</STATE><ZIP>90210</ZIP><AREA_CODE>310</AREA_CODE><TIME_ZONE>P</TIME_ZONE></Table></NewDataSet></GetInfoByZIPResult></GetInfoByZIPResponse></soap:Body></soap:Envelope>
.HTTPI executes HTTP GET using the net_http adapter
F
Failures:
1) ZipCodeLookup any errors
Failure/Error: click_on "Lookup"
FakeWeb::NetConnectNotAllowedError:
Real HTTP connections are disabled. Unregistered request: GET http://www.webservicex.net/uszip.asmx?WSDL. You can use VCR to automatically record this request and replay it later. For more details, visit the VCR documentation at: http://relishapp.com/myronmarston/vcr/v/1-11-3
# ./app/models/zip_code.rb:6:in `initialize'
# ./app/controllers/zip_code_lookup_controller.rb:3:in `new'
# ./app/controllers/zip_code_lookup_controller.rb:3:in `index'
# (eval):2:in `click_on'
# ./spec/requests/zip_code_lookup_spec.rb:15:in `block (2 levels) in <top (required)>'
Finished in 0.17138 seconds
2 examples, 1 failure
Failed examples:
rspec ./spec/requests/zip_code_lookup_spec.rb:12 # ZipCodeLookup any errors
I do not understand why VCR cannot replay the same HTTP request in different examples.
How can I write this as an anti test?
SOLUTION
Sorry, it was my mistake :( I forgot to add the :vcr tag to the second example's "it" line. It should look like this:
it "any errors", :vcr do
  visit root_path
  fill_in "zip_code", with: "9"
  click_on "Lookup"
  page.should have_content("Nomethods")
end
Source: (StackOverflow)
What is the easiest way to test that an HTTP request is not made?
Say I want my application NOT to send an API call to a remote server if the user doesn't supply a username.
What RSpec test should I write in my Rails app to verify that the HTTP request is never made?
Can fakeweb help me somehow?
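FakeWeb can help: with FakeWeb.allow_net_connect = false and no URI registered, any request that escapes raises FakeWeb::NetConnectNotAllowedError, and the example can assert that nothing is raised. A gem-free sketch of the same idea follows; the fetch_profile method and URL are hypothetical.

```ruby
require "net/http"
require "uri"

# Make any attempted HTTP connection raise, the way FakeWeb does with
# allow_net_connect = false and no registered URIs.
class UnexpectedRequestError < StandardError; end

class Net::HTTP
  def start(*)
    raise UnexpectedRequestError, "real HTTP connection attempted"
  end
end

# Hypothetical code under test: skip the API call when no username is given.
def fetch_profile(username)
  return nil if username.nil? || username.empty?
  Net::HTTP.get(URI("https://api.example.com/users/#{username}"))
end

fetch_profile(nil)        # returns nil, and crucially raises nothing

begin
  fetch_profile("alice")  # this one does try to connect...
rescue UnexpectedRequestError
  # ...and the guard catches it before a real socket is opened
end
```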
Source: (StackOverflow)
I really like the way fakeweb in Ruby can be used to fake http requests when testing. Is there a similar library or an alternative for Python?
Source: (StackOverflow)
When I stub a request with nock, it returns a String result instead of an Object, even with 'Content-Type': 'application/json':
var response = {
  success: true,
  statusCode: 200,
  body: {
    "status": "OK",
    "id": "05056b27b82"
  }
};
Test.BuildRequest();
Test.SendRequest(done);
nock('https://someapi.com')
  // also tried
  // .defaultReplyHeaders({
  //   'Content-Type': 'application/json',
  //   'Accept': 'application/json'
  // })
  .post('/order')
  .reply(200, response.body, {
    'Content-Type': 'application/json',
    'Accept': 'application/json'
  });
checking:
console.log(put.response.body);
console.log(put.response.body.id);
output:
{"status":"OK","id":"05056b27b82"}
undefined
In my code I use the request module, which returns an Object with the same data. I also tried sinon (it didn't work for me) and fakeweb, but hit the same issue.
My code, which I'm trying to test:
var request = require('request');
// ...
request(section.request, function (err, response, body) {
if (err || _.isEmpty(response))
return result(err, curSyndication);
//if (_.isString(body))
// body = JSON.parse(body);
section.response.body = body;
console.log(body.id); // => undefined (if uncomment previous code - 05056b27b82)
_this.handleResponse(section, response, body, result);
});
And it returns an object in real requests.
P.S. I could add the following to my response handler:
if (_.isString(body))
    body = JSON.parse(body);
But some queries return an XML string, so I don't want to parse unconditionally there.
Fakeweb:
fakeweb.registerUri({
  uri: 'https://someapi.com/order',
  body: JSON.stringify({
    status: "OK",
    id: "05056b27b82"
  }),
  statusCode: 200,
  headers: {
    'User-Agent': 'My requestor',
    'Content-Type': 'application/json',
    'Accept': 'application/json'
  }
});
Test.SendRequest(done);
Same results.
Updated:
I read a couple of articles that pass a JSON object to nock without stringifying it, and nock should then return a JSON object, the same way the request library does.
Source: (StackOverflow)
I'm currently using python_flickr_api to upload photos for my app: it uses httplib to perform a multipart POST request.
Problem: I want to verify in an integration test that the upload really is issued, by intercepting the POST request and returning a pre-canned success response, so that my tests can run completely offline and not depend on Flickr (I don't want to upload the same test image 100 times, either!).
To this end, I've tried two incredible libraries: VCRPy and HTTPretty. Neither solves my problem, because neither supports httplib (HTTPretty comes closest, with support for httplib2 only), and I get an error that looks something like this:
Failure/Error: [Errno 32] Broken pipe
Traceback:
...
File "/usr/local/lib/python2.7/site-packages/flickr_api/upload.py", line 92, in upload
r = post(UPLOAD_URL,auth.AUTH_HANDLER,args,photo_file)
File "/usr/local/lib/python2.7/site-packages/flickr_api/upload.py", line 52, in post
r = multipart.posturl(url,fields,files)
File "/usr/local/lib/python2.7/site-packages/flickr_api/multipart.py", line 19, in posturl
return post_multipart(urlparts[1], urlparts[2], fields,files)
File "/usr/local/lib/python2.7/site-packages/flickr_api/multipart.py", line 33, in post_multipart
h.send(body)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py", line 805, in send
self.sock.sendall(data)
File "/usr/local/lib/python2.7/site-packages/httpretty/core.py", line 243, in sendall
return self._true_sendall(data)
File "/usr/local/lib/python2.7/site-packages/httpretty/core.py", line 216, in _true_sendall
self.truesock.sendall(data, *args, **kw)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 224, in meth
return getattr(self._sock,name)(*args)
So httpretty is clearly intercepting the request, but it causes a broken pipe.
How can I fix this?
Source: (StackOverflow)
I get this error when I try to use FakeWeb, and I do not understand the problem.
Scenario:
After the user has filled in all the information, the system uses the "fbid" attribute for validation; only on success is a new company created, otherwise the process fails.
Failures:
1) Companies new company create with valid information correct facebook id validates facebook id
Failure/Error: it "should create a company" do
NoMethodError:
undefined method `it' for #<RSpec::Core::ExampleGroup::Nested_2::Nested_3::Nested_1::Nested_1:0x00000102fb84e0>
# ./spec/requests/companies_spec.rb:40:in `block (5 levels) in <top (required)>'
companies_spec.rb
describe "correct facebook id" do
  # validate fbid
  it "validates facebook id" do
    FakeWeb.register_uri(:head, "http://graph.facebook.com/examplecompany", :username => 'examplecompany')
    Company.new(:url => "http://graph.facebook.com/examplecompany").fb_id_string.should eq('examplecompany')
    it "should create a company" do
      expect { click_button submit }.to change(Company, :count).by(1)
    end
model/company.rb
def fb_id_string
  uri = URI.parse(url)
  response = Net::HTTP.start(uri.host, uri.port) { |http| http.request_head(uri.path) }
  response["username"].to_str
end
end
Source: (StackOverflow)
Is there a way to get a list of registered URIs in FakeWeb? When I register one like:
FakeWeb.register_uri(:get, url, body: expected_response)
it seems like the list should be available somewhere, since FakeWeb keeps track of it internally, but I can't find it exposed externally. Something like FakeWeb.registered_uris would be ideal, but that method doesn't exist.
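As far as I know FakeWeb has no public accessor for this; registrations live in an internal singleton (FakeWeb::Registry.instance, with a uri_map hash, in the versions I've read; that's internal API, so it may change without notice). The pattern is easy to see in a self-contained stand-in with hypothetical names:

```ruby
require "singleton"

# A toy registry mirroring how a stubbing library might track
# registered URIs internally (names here are hypothetical).
class StubRegistry
  include Singleton

  def initialize
    @uri_map = {}   # { [method, uri] => response options }
  end

  def register(method, uri, options = {})
    @uri_map[[method, uri.to_s]] = options
  end

  def registered_uris
    @uri_map.keys
  end
end

StubRegistry.instance.register(:get, "http://example.com/", body: "hello")
StubRegistry.instance.registered_uris   # [[:get, "http://example.com/"]]
```

Poking at the real registry the same way works, but since it is not public API it is safest behind a small helper you can fix when the gem's internals move.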
Source: (StackOverflow)
To test my program against webpages that change quite often, I need to mock the responses. I've found FakeWeb for Ruby, which would be a good starting point; unfortunately, there seems to be no library providing similar functionality for Java.
So, my question is: how can I "record" request/response pairs and "replay" them later, so that my application always receives the same webpages? To make things even more difficult, it should work for both PUT and GET methods.
Thanks in advance for any answer.
Regards, Daniel
Source: (StackOverflow)
I'm using the fakeweb module, which overwrites Node's http.request with the following function:
var old_request = http.request;

http.request = function(options, callback) {
  var rule = match_rule(options);
  if (rule) {
    var res = new events.EventEmitter();
    res.headers = rule.headers || {'Content-Type': 'text/html'};
    return {
      end: function() {
        callback(res);
        res.emit('data', rule.body || '');
        res.emit('end');
      }
    };
  } else {
    return old_request.call(http, options, callback);
  }
};
My issue is that I get the error TypeError: Object #<Object> has no method 'on' from the following code in another file:
var req = proto.request(options, function (res) { ... });
req.on('error', function (err) {
  err.success = false;
  settings.complete(err);
});
I think this happens because the object returned by the fake http.request is no longer an EventEmitter, though I may be wrong. How can I overwrite http.request without breaking this?
Background: I'm using Node version 0.8.2. The request NPM is version 2.12.0
Update (Feb 11th, 2013): I want to provide some more info on where the http.request is being called so I can be more specific about what's needed and what causes bugs. Here is where it's called:
var proto = (options.protocol === 'https:') ? https : http;
var req = proto.request(options, function (res) {
  var output = new StringStream();
  switch (res.headers['content-encoding']) {
    case 'gzip':
      res.pipe(zlib.createGunzip()).pipe(output);
      break;
    case 'deflate':
      res.pipe(zlib.createInflate()).pipe(output);
      break;
    default:
      // Treat as uncompressed string
      res.pipe(output);
      break;
  }
  output.on('end', function() {
    if (settings.cookieJar && res.headers['set-cookie']) {
      var cookies = res.headers['set-cookie'];
      for (var i = 0; i < cookies.length; i++) {
        settings.cookieJar.set(cookies[i]);
      }
    }
    if (res.statusCode >= 300 && res.statusCode < 400) {
      if (settings.maxRedirects > settings.nRedirects++) {
        // Follow redirect
        var baseUrl = options.protocol + '//' + ((proxy) ? options.headers['host'] : options.host),
            location = url.resolve(baseUrl, res.headers['location']);
        request(location, settings);
      } else {
        var err = new Error('Max redirects reached.');
        err.success = false;
        settings.complete(err);
      }
    } else if (res.statusCode >= 400) {
      var err = new Error(output.caught);
      err.success = false;
      err.code = res.statusCode;
      settings.complete(err);
    } else {
      settings.complete(null, output.caught, res);
    }
  });
});
req.on('error', function (err) {
  err.success = false;
  settings.complete(err);
});
if (data) { req.write(data); }
req.end();
Source: (StackOverflow)
I'm using Mechanize to spider some websites. While spidering I save pages to files that I use later with Fakeweb to do tests.
My Mechanize agent is created this way:
Mechanize.new do |a|
  a.read_timeout = 20   # doesn't work with FakeWeb?
  a.max_history = 1
end
When I run my app with FakeWeb enabled, fetching the saved files instead of hitting the Internet, my log prints this message for every URI I try:
W, [2011-08-20T18:49:45.764749 #14526] WARN -- : undefined method `read_timeout=' for #<FakeWeb::StubSocket:0xb72c150c>
If I comment out the second line above (a.read_timeout = 20), it works perfectly, no problem at all. Any idea how to enable read_timeout and still have FakeWeb work?
TIA
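One workaround I'd try (an assumption on my part, not an official fix): the warning says FakeWeb's stub socket simply has no read_timeout= method, and Ruby lets you reopen a class and add a no-op accessor so Mechanize's call stops failing. Illustrated with a stand-in class, since the real one is FakeWeb::StubSocket:

```ruby
# Stand-in for FakeWeb::StubSocket, which has no read_timeout= method.
class StubSocket
end

sock = StubSocket.new
sock.respond_to?(:read_timeout=)   # false; the source of the warning

# Reopen the class and add an accessor. Nothing ever times out on a
# stubbed socket, so just storing the value is enough.
class StubSocket
  attr_accessor :read_timeout
end

sock.read_timeout = 20
sock.read_timeout   # 20
```

With the gem loaded, the same two-line reopen of FakeWeb::StubSocket in a test helper should silence the warning; whether the timeout is honored is moot, since no real I/O happens.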
Source: (StackOverflow)
I'm trying to mock out a geocoding request using FakeWeb (in Cucumber/Rails).
When I block all HTTP requests, I get this message:
Real HTTP connections are disabled. Unregistered request: GET
http://maps.google.com/maps/api/geocode/json?..... (FakeWeb::NetConnectNotAllowedError)
So I registered the URL by trying:
FakeWeb.register_uri(:any, %r|http://maps\.google\.com/maps/|,
  :json => {
    "status": "OK",
    ....})
I get the error "A JSON text must at least contain two octets!" (MultiJson::DecodeError).
I'm not sure what data to return, or how FakeWeb can return JSON at all.
Does someone have a solution for stubbing out server requests to the google maps api?
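The MultiJson error suggests the client received an empty or non-string body: as far as I know FakeWeb has no :json option, so the hash above was likely ignored. The usual route is to serialize the hash yourself and hand it to :body (the geocoder response below is trimmed to a couple of fields; check it against a real response):

```ruby
require "json"

# Build the stub body as a JSON string; FakeWeb expects a string body,
# not a Ruby hash.
stub_body = JSON.generate(
  "status"  => "OK",
  "results" => []   # trimmed; the real Geocoding API returns much more
)

stub_body   # '{"status":"OK","results":[]}'

# Then, with the fakeweb gem loaded, register it, e.g.:
#   FakeWeb.register_uri(:get, %r|http://maps\.google\.com/maps/|,
#                        :body => stub_body,
#                        :content_type => "application/json")
```

Recording one real response to a fixture file and passing its contents as :body works the same way and keeps the stub faithful to the actual API.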
Source: (StackOverflow)
I want to use the VCR gem to block content from external webpages in popup windows. My problem is that after configuring VCR and allowing localhost connections, I get an internal server error when trying to reach localhost. When running specs, this line gets called:
@app.call(env)
It was trying to check whether the socket was closed:
@socket.closed?
and resulted in:
undefined method `closed?' for nil:NilClass
When configuring, I have followed this description.
These are the relevant parts of my spec_helper.rb file:
VCR.configure do |config|
  config.cassette_library_dir = 'fixtures/vcr_cassettes'
  config.hook_into :fakeweb
  config.ignore_localhost = true
end

config.before(:each) do
  FakeWeb.allow_net_connect = true
end

config.after(:each) do
  FakeWeb.allow_net_connect = false
end
Could you please give me a hint what should be done to fix this issue?
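A hedged guess at the cause: toggling FakeWeb.allow_net_connect by hand in before/after hooks fights with VCR, which manages FakeWeb's state itself once hooked in. Letting VCR own the hook, the spec_helper configuration would reduce to something like:

```ruby
VCR.configure do |config|
  config.cassette_library_dir = 'fixtures/vcr_cassettes'
  config.hook_into :fakeweb
  config.ignore_localhost = true
end

# No manual FakeWeb.allow_net_connect toggling in before/after hooks;
# VCR enables and disables FakeWeb around each cassette itself, and
# ignore_localhost keeps localhost requests out of its way.
```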
Source: (StackOverflow)