update dependencies and fix some bugs #24

Open · wants to merge 1 commit into master

Conversation

bjtulynn

To run the project with recent stable dependencies, we need to modify a few points, listed below (a rough code sketch of these changes follows the list):

  1. update the spider base class (BaseSpider -> Spider)
  2. update the pipeline base class (MediaPipeline -> ImagesPipeline)
  3. update the MongoDB import (pymongo.connection -> pymongo)
  4. fix a bug in the SpiderInfo usage (spiderinfo[spider] -> spiderinfo(spider))
  5. fix a bug in the FilePipeline __init__ (add one missing parameter)
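
A rough sketch of what these changes look like in code (old forms shown in comments; the exact module paths and names in this project may differ slightly):

# 1. BaseSpider was removed in Scrapy 1.0; subclass Spider instead.
from scrapy import Spider                    # was: from scrapy.spider import BaseSpider

# 2. Image pipelines now subclass ImagesPipeline (see scrapy/pipelines/images.py).
from scrapy.pipelines.images import ImagesPipeline   # was: MediaPipeline

# 3. pymongo.connection was dropped in pymongo 3.x; import from the top-level package.
from pymongo import MongoClient              # was: from pymongo.connection import Connection
client = MongoClient('localhost', 27017)

# 4. A SpiderInfo instance is created by calling, not by dict-style indexing:
#        info = self.spiderinfo(spider)      # was: self.spiderinfo[spider]

# 5. FilePipeline.__init__ needs an extra parameter; see the sketch in the
#    discussion below.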

@Bajie-cigarette

I modified the code following your method, and got the following error:
Unhandled error in Deferred:
Unhandled Error
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
self.crawler_process.crawl(spname, **opts.spargs)
File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 163, in crawl
return self._crawl(crawler, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 167, in _crawl
d = crawler.crawl(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1237, in unwindGenerator
return _inlineCallbacks(None, gen, Deferred())
--- ---
File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1099, in _inlineCallbacks
result = g.send(result)
File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 90, in crawl
six.reraise(*exc_info)
File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 72, in crawl
self.engine = self._create_engine()
File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 97, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 70, in init
self.scraper = Scraper(crawler)
File "/usr/local/lib/python2.7/dist-packages/scrapy/core/scraper.py", line 71, in init
self.itemproc = itemproc_cls.from_crawler(crawler)
File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 58, in from_crawler
return cls.from_settings(crawler.settings, crawler)
File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 36, in from_settings
mw = mwcls.from_crawler(crawler)
File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/media.py", line 51, in from_crawler
pipe = cls.from_settings(crawler.settings)
File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/images.py", line 95, in from_settings
return cls(store_uri, settings=settings)
exceptions.TypeError: __init__() got an unexpected keyword argument 'settings'

I think the error comes from this line in file.py:
super(FilePipeline, self).__init__(store_uri, download_func=download_func)

What is going on here?
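
For context: the traceback shows ImagesPipeline.from_settings() instantiating the pipeline with cls(store_uri, settings=settings), so the subclass's __init__ must accept that settings keyword as well. A minimal sketch of the fix, assuming FilePipeline subclasses ImagesPipeline and a Scrapy version whose ImagesPipeline.__init__ takes settings (which the from_settings() call in the traceback implies):

from scrapy.pipelines.images import ImagesPipeline

class FilePipeline(ImagesPipeline):
    def __init__(self, store_uri, download_func=None, settings=None):
        # Accept the `settings` keyword that newer Scrapy passes in via
        # from_settings(), and forward it to the parent class.
        super(FilePipeline, self).__init__(store_uri,
                                           download_func=download_func,
                                           settings=settings)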
