webb

An all-in-one Web Crawler, Web Parser and Web Scraping library!


Keywords
scraper, crawler, spider, webb, crawl-pages, python-library
License
Apache-2.0
Install
pip install webb==0.9.2.5

Documentation

Webb - A Complete Web Scraper and Crawler Library

An all-in-one Python library to scrape, parse and crawl web pages

Gist

This is a lightweight, dynamic and highly flexible Python library. It can be used to crawl, download, index, parse, scrape and analyze web pages in a systematic manner, or to perform any of those tasks individually. It can also clean and normalize web pages, store web data, extract server-side information and import/export relevant components from the web. Other features include downloading the images on a web page, downloading Google Images results and spidering Wikipedia articles.
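A minimal usage sketch is shown below. The function names (download_page, download_images) are illustrative assumptions rather than a confirmed API; consult the Official Documentation for the exact calls.

```python
# Illustrative sketch only -- the function names below are assumptions,
# not a confirmed API; see the Official Documentation for the real calls.
import webb

url = "http://example.com"

# Hypothetical: fetch the raw HTML of a single page
html = webb.download_page(url)

# Hypothetical: save every image referenced on that page
webb.download_images(url)
```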

Usage and Instructions

For usage and instructions please visit the Official Documentation

For issues and discussion visit the Issue Tracker

For sample code and examples, please visit the Example Codes

Compatibility

This library is compatible with both Python 2 (2.x) and Python 3 (3.x). It is designed to be downloaded, imported and run with little or no modification by users.
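One common way a standard-library-only crawler stays compatible with both interpreter lines is to fall back between the renamed urllib modules at import time. The snippet below is a general illustration of that pattern, not a claim about webb's internal implementation.

```python
# General Python 2/3 compatibility pattern for standard-library HTTP access;
# shown for illustration only, not taken from webb's source.
try:
    # Python 3
    from urllib.request import urlopen
except ImportError:
    # Python 2
    from urllib2 import urlopen

def fetch(url):
    """Return the decoded body of a page using only the standard library."""
    response = urlopen(url)
    return response.read().decode("utf-8", "replace")
```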

Dependencies

This project has no external dependencies. Hurray! It relies entirely on Python's standard ('built-in') library and does not require any additional installations. Just download and run!
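For reference, the kind of scraping described above is possible with the standard library alone. The sketch below uses urllib.request and html.parser (Python 3 module names) to collect the links on a page; it is a generic illustration, not webb's own code.

```python
# Generic standard-library example (Python 3) of fetching a page and
# extracting its links; illustrative only, not webb's implementation.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href attribute of every anchor tag encountered
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = urlopen("http://example.com").read().decode("utf-8", "replace")
collector = LinkCollector()
collector.feed(html)
print(collector.links)
```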

Status

This is a standalone, ready-to-run Python script that is still under development. Many more features will be added shortly.

Disclaimer

The crawler functionality lets you download and crawl large numbers of web pages. Please do not download or crawl pages of a domain without first reading that domain's 'robots.txt' file.

Violating robots.txt is inappropriate and strongly discouraged; it may lead to the domain blocking and blacklisting your crawler entirely. It is also bad practice to crawl pages at a high rate, as doing so can put significant load on the target server.
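A minimal sketch of the polite-crawling checks described above, using only the standard library (Python 3 module names assumed); this is general guidance rather than part of webb's API.

```python
# Checking robots.txt and rate-limiting requests with the standard library.
# This is a general illustration of the disclaimer above, not webb's own code.
import time
from urllib.robotparser import RobotFileParser
from urllib.request import urlopen

robots = RobotFileParser()
robots.set_url("http://example.com/robots.txt")
robots.read()

urls = ["http://example.com/", "http://example.com/about"]

for url in urls:
    if not robots.can_fetch("*", url):   # respect the domain's crawl rules
        continue
    page = urlopen(url).read()
    # ... process the page ...
    time.sleep(2)  # crawl slowly to avoid loading the target server
```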