
1 Introduction

Welcome to the fourth lab! In this lab we will create our own code for crawling websites. The lab instructions set minimum expectations, and you are always encouraged to go beyond them. Challenge yourself, have fun, and learn more!

Remember that when you have finished your lab activities, you need to show your final results to your TA. They will provide feedback if necessary and record your mark.

 

2 Setting Up

Make sure a properly working copy of the queue module is in the folder you will use for this lab. Then create a module called crawling.py; all of your functions for this lab will be written in this module.
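As a starting point, the top of crawling.py might look like the sketch below. The requests and NetworkX imports are assumptions about your toolchain, and the Queue class name is only a guess at your earlier lab's API, so rename it to match your own module.

    # crawling.py -- module set-up (a sketch; adjust names to your own code)
    import requests                  # fetches webpages over HTTP
    from bs4 import BeautifulSoup    # parses the HTML of each page
    import networkx as nx            # stores the crawl as a directed graph

    # Assumption: your queue module from the earlier lab sits in this folder
    # as queue.py and exposes a Queue class; rename to match your actual API.
    from queue import Queue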

 

3 Tasks

Familiarize yourself with your breadth-first search and Beautiful Soup code from the previous two labs. With this information in hand, you are tasked with creating your own website crawler. This crawler will create a directed graph showing the connections between all of the webpages you have crawled. Your crawler should stop once it has crawled 100,000 webpages or once it has run out of webpages to crawl.
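One possible shape for such a crawler is sketched below. For brevity it uses a plain Python list as the BFS frontier (your own queue module can stand in for it), and the requests library is an assumption; URL filtering is kept minimal here because the notes that follow cover it.

    import requests
    from bs4 import BeautifulSoup
    import networkx as nx

    def crawl(start_url, limit=100_000):
        """Breadth-first crawl from start_url; returns a directed link graph."""
        graph = nx.DiGraph()
        frontier = [start_url]        # pages waiting to be visited (FIFO = BFS)
        seen = {start_url}            # pages already added to the frontier
        crawled = 0
        while frontier and crawled < limit:
            url = frontier.pop(0)     # dequeue the oldest page
            print(url)                # show progress while the crawler runs
            try:
                response = requests.get(url, timeout=5)
            except requests.RequestException:
                continue              # skip pages that fail to load
            crawled += 1
            soup = BeautifulSoup(response.text, "html.parser")
            for anchor in soup.find_all("a", href=True):
                link = anchor["href"]
                if "http" not in link:        # see the filtering notes below
                    continue
                graph.add_edge(url, link)     # edge for every link, seen or not
                if link not in seen:
                    seen.add(link)
                    frontier.append(link)
        return graph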

There are a few things to note when crawling webpages:

      The URLs of webpages contain the string ‘http’

      Webpages do not have image or PDF suffixes

      An edge should be added for every linked webpage, whether it has been seen previously or not. (A small helper reflecting the first two notes is sketched after this list.)
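The first two notes can be captured in a helper such as the one below; the suffix list is an assumption, so extend it to cover whatever file types your crawler actually encounters.

    # Suffixes marking a URL as an image or a PDF rather than a webpage
    # (an assumed list -- extend as needed).
    SKIP_SUFFIXES = (".png", ".jpg", ".jpeg", ".gif", ".svg", ".pdf")

    def is_crawlable(url):
        """Return True if url contains 'http' and is not an image or PDF."""
        return "http" in url and not url.lower().endswith(SKIP_SUFFIXES)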

It is recommended that you add a section to your code to save your graph every 500 crawled webpages (HINT: look into write_gexf in NetworkX). This will give you graph data to work with in the event that your crawler crashes.
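Inside the crawl loop, the periodic save could look like the sketch below; the filename pattern is an assumption.

    # After incrementing the crawled-page counter inside the loop:
    if crawled % 500 == 0:
        # write_gexf stores the graph in GEXF format, which can be
        # reloaded with nx.read_gexf if the crawler later crashes
        nx.write_gexf(graph, f"crawl_{crawled}.gexf")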

It is also recommended that you print every URL you crawl so you can watch your code working as it runs.

Deliverable: Submit all your code to Canvas. Your code should have internal documentation (comments) explaining each line of code and instructions.