Skip to content

Commit f86b43b

Browse files
committed
Tried to fix the railing robots.txt spec.
* Set a User-Agent to prevent open-uri from raising an exception about being passed a nil User-Agent. * Found a bug in how webmock returns the HTTP status message, which tripped up the robots gem. * bblimke/webmock#642 * fizx/robots#9
1 parent 5c2a020 commit f86b43b

File tree

1 file changed

+13
-3
lines changed

1 file changed

+13
-3
lines changed

spec/agent_spec.rb

+13-3
Original file line numberDiff line numberDiff line change
@@ -762,7 +762,15 @@
762762
context "when :robots is enabled" do
763763
include_context "example App"
764764

765-
subject { described_class.new(host: host, robots: true) }
765+
let(:user_agent) { 'Ruby' }
766+
767+
subject do
768+
described_class.new(
769+
host: host,
770+
user_agent: user_agent,
771+
robots: true
772+
)
773+
end
766774

767775
app do
768776
get '/' do
@@ -777,13 +785,15 @@
777785
content_type 'text/plain'
778786

779787
[
780-
'User-agent: *',
781-
'Disallow: /secret',
788+
"User-agent: *",
789+
'Disallow: /',
782790
].join($/)
783791
end
784792
end
785793

786794
it "should not follow links Disallowed by robots.txt" do
795+
pending "https://github.com/bblimke/webmock/issues/642"
796+
787797
expect(subject.history).to be == Set[
788798
URI("http://#{host}/"),
789799
URI("http://#{host}/pub")

0 commit comments

Comments
 (0)