Question

Crawler is a pilot project designed to crawl BD sites only. First, the Initiate crawler module sends the current status of the URL queue to the Schedule policy module and receives a policy map and a crawling threshold number in return. The Crawler then repeatedly receives a link from the Fetch URL module until that module returns a threshold-reached flag. For each link, the Crawler retrieves a new page by passing the page link to the Fetch site page module. The page is then sent to the Extract URL module, which returns a set of newly extracted links to the Crawler. To do this, the Extract URL module first generates raw links using the Parse raw links module and sends them to the Filter valid links module, which returns only the URLs that the Crawler should crawl in the future. After receiving the filtered links, the Extract URL module formats them using a library module called Format link and finally returns these filtered, formatted links to the Crawler. Based on the policy map, the Crawler either adds the links to the Update URL queue module, which is an off-page connector, or dispatches them to the Reschedule policy module, which is an on-page connector.

Design a structure chart based on the above information.
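One way to check your reading of the module hierarchy before drawing the chart is to sketch the call structure as code. Below is a minimal Python sketch of the control flow described above. Every function name, signature, and stubbed body is an assumption made for illustration (the question names the modules but not their interfaces), so treat it as a reading aid rather than the project's implementation.

```python
from typing import Dict, List, Tuple

# --- Control modules (hypothetical interfaces) -------------------------------

def schedule_policy(queue_status: Dict[str, int]) -> Tuple[Dict[str, str], int]:
    """Schedule policy module: takes the URL-queue status and returns a
    policy map plus the crawling threshold number."""
    return {"dispatch": "queue"}, 100

def fetch_url(queue: List[str], fetched: int, threshold: int) -> Tuple[str, bool]:
    """Fetch URL module: hands the Crawler the next link, or raises the
    threshold-reached flag once the threshold (or the queue) is exhausted."""
    if fetched >= threshold or not queue:
        return "", True
    return queue.pop(0), False

def fetch_site_page(link: str) -> str:
    """Fetch site page module: retrieves the page behind a link (stubbed)."""
    return f"<html> {link} links to http://example.com.bd/next </html>"

# --- Extract URL module and its subordinates ---------------------------------

def parse_raw_links(page: str) -> List[str]:
    """Parse raw links module: pulls raw URL tokens out of the page text."""
    return [token for token in page.split() if token.startswith("http")]

def filter_valid_links(links: List[str]) -> List[str]:
    """Filter valid links module: keeps only URLs the Crawler should crawl
    in the future (BD sites only, matching the project scope)."""
    return [link for link in links if ".bd" in link]

def format_link(link: str) -> str:
    """Format link (library module): normalizes a URL."""
    return link.rstrip("/").lower()

def extract_url(page: str) -> List[str]:
    """Extract URL module: parse -> filter -> format, then return the
    filtered, formatted links to the Crawler."""
    return [format_link(l) for l in filter_valid_links(parse_raw_links(page))]

# --- Dispatch targets ---------------------------------------------------------

def update_url_queue(queue: List[str], links: List[str]) -> None:
    """Update URL queue module (drawn as an off-page connector)."""
    queue.extend(links)

def reschedule_policy(links: List[str]) -> None:
    """Reschedule policy module (drawn as an on-page connector)."""
    print("rescheduled:", links)

# --- Crawler: the top box of the structure chart ------------------------------

def crawler(url_queue: List[str]) -> None:
    # Initiate crawler: report queue status, receive policy map and threshold.
    policy_map, threshold = schedule_policy({"pending": len(url_queue)})

    fetched = 0
    while True:
        link, threshold_reached = fetch_url(url_queue, fetched, threshold)
        if threshold_reached:            # flag from Fetch URL ends the loop
            break
        page = fetch_site_page(link)     # Fetch site page module
        new_links = extract_url(page)    # Extract URL module
        fetched += 1
        # The policy map decides where newly found links are dispatched:
        if policy_map["dispatch"] == "queue":
            update_url_queue(url_queue, new_links)
        else:
            reschedule_policy(new_links)

if __name__ == "__main__":
    crawler(["http://example.com.bd"])
```

Each function corresponds to one box in the structure chart: Crawler sits at the top; Fetch URL, Fetch site page, and Extract URL hang off it; Parse raw links, Filter valid links, and Format link hang off Extract URL; and the two dispatch branches map to the off-page (Update URL queue) and on-page (Reschedule policy) connectors.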
