GanadiniAkshay/web_crawler
WEB CRAWLER BUILT IN PYTHON

This is a web crawler built in Python. Because it is expected to run on a single system, users should provide a depth up to which crawling should happen. This is done by setting the second argument of crawl to True (True by default) and the third argument to the desired depth. In this mode it returns a list of all the links found.

However, if you are feeling more adventurous, there is another way of calling the crawl function: crawling the entire web. For this, supply the second argument as False instead of True. In this case, instead of returning all the links at the end, it prints individual links as and when they are found.
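The behaviour described above can be sketched as a breadth-first traversal. This is a minimal illustration, not the repository's actual implementation: the real crawler would fetch each page over the network and parse its anchor tags, whereas here a hypothetical get_links callable is injected so the sketch runs without network access.

```python
from collections import deque

def crawl(start_url, bounded=True, depth=2, get_links=None):
    # get_links(url) -> list of URLs found on that page. In the real
    # crawler this step would download the page and extract links; it
    # is a parameter here purely so the sketch is self-contained.
    seen = {start_url}
    frontier = deque([(start_url, 0)])
    found = []
    while frontier:
        url, level = frontier.popleft()
        for link in get_links(url):
            if link in seen:
                continue
            seen.add(link)
            if bounded:
                found.append(link)
                if level + 1 < depth:          # stop expanding at the depth limit
                    frontier.append((link, level + 1))
            else:
                print(link)                    # unbounded mode: report links as found
                frontier.append((link, level + 1))
    return found if bounded else None
```

With a small fake link graph, a bounded crawl of depth 2 collects links reached within two hops and returns them as a list:

```python
pages = {"a": ["b", "c"], "b": ["d"], "c": [], "d": ["e"]}
links = crawl("a", True, 2, get_links=lambda u: pages.get(u, []))
# "e" is three hops from "a", so it is not collected at depth 2
```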
