Applied Statistics

INSTRUCTIONS TO CANDIDATES
ANSWER ALL QUESTIONS

Exercise 1 For the 2D data below, find and draw the decision boundaries for the best separation with

- a line,

- the nearest neighbor method and

- a decision tree of your choice. 

Exercise 2 Using the data table below, we want to construct a decision tree that classifies Y as True or False from the binary variables A, B, C. Construct the decision tree using the entropy gain criterion for these data and draw it.
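As an illustration of the entropy-gain computation used in Exercise 2, here is a minimal sketch in Python. The data table is a hypothetical stand-in (the original table is not reproduced here), and the function names are illustrative assumptions.

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def info_gain(rows, labels, attr):
    """Entropy gain of splitting on the binary attribute `attr`."""
    n = len(labels)
    gain = entropy(labels)
    for value in (True, False):
        subset = [y for r, y in zip(rows, labels) if r[attr] == value]
        if subset:
            gain -= len(subset) / n * entropy(subset)
    return gain

# hypothetical data table: binary variables A, B, C and the class Y
rows = [
    {"A": True,  "B": True,  "C": False},
    {"A": True,  "B": False, "C": True},
    {"A": False, "B": True,  "C": True},
    {"A": False, "B": False, "C": False},
]
labels = [True, True, False, False]

gains = {a: info_gain(rows, labels, a) for a in ("A", "B", "C")}
print(gains)  # here A predicts Y perfectly, so its gain is 1 bit
```

The tree is then built greedily: split on the attribute with the highest gain, and recurse on each branch with the remaining attributes.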

Exercise 3 ADABoost is a sequential aggregation method that chooses classifiers from a predefined family and combines them linearly. After T steps, we obtain the classifier H_T(x) = sign( alpha_1 h_1(x) + ... + alpha_T h_T(x) ), where the h_t's have been chosen successively from the predefined family and the alpha_t's are their weights.

In this question, we will use as the predefined family decision trees of a given depth, which classify a point into the categories {1, -1} according to a sequence of thresholds on its coordinates, denoted x and y.

1. In this question we consider trees of depth 1.

(a) Propose an optimal depth-1 tree for the data in Figure 2. 

(b) Give an optimal tree for the second iteration of ADABoost and the H2 classifier composed of these first two trees.

(c) What is the error rate of H2?

(d) Draw the decision boundary of the H2 classifier.

2. Repeat the previous questions for trees of depth 2.
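The sequential scheme of Exercise 3 — pick the classifier with the lowest weighted error, reweight the misclassified points, repeat — can be sketched as below. The 1-D points and the family of depth-1 threshold stumps are hypothetical stand-ins for the data of Figure 2.

```python
import math

def adaboost(points, labels, stumps, T):
    """ADABoost sketch: at each step pick the classifier with the lowest
    weighted error, then increase the weight of the points it misclassifies."""
    n = len(points)
    w = [1.0 / n] * n                      # uniform weights at the start
    chosen = []
    for _ in range(T):
        errs = [sum(wi for wi, x, y in zip(w, points, labels) if h(x) != y)
                for h in stumps]
        t = min(range(len(stumps)), key=errs.__getitem__)
        eps = errs[t]
        alpha = 0.5 * math.log((1 - eps) / max(eps, 1e-12))
        chosen.append((alpha, stumps[t]))
        # correctly classified points shrink, misclassified points grow
        w = [wi * math.exp(-alpha * y * stumps[t](x))
             for wi, x, y in zip(w, points, labels)]
        tot = sum(w)
        w = [wi / tot for wi in w]
    # H_T(x) = sign( alpha_1 h_1(x) + ... + alpha_T h_T(x) )
    return lambda x: 1 if sum(a * h(x) for a, h in chosen) >= 0 else -1

# hypothetical 1-D data and a family of depth-1 threshold stumps
points = [0.0, 1.0, 2.0, 3.0]
labels = [1, 1, -1, -1]
stumps = [lambda x, s=s: 1 if x < s else -1 for s in (0.5, 1.5, 2.5)]
H = adaboost(points, labels, stumps, T=2)   # T = 2 gives the H2 classifier
print([H(x) for x in points])  # -> [1, 1, -1, -1]
```

The error rate asked for in question (c) is simply the fraction of points where H disagrees with the label.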

Exercise 4 In this exercise we will analyze the behavior of ADABoost with K-nearest-neighbor classifiers. We consider the dataset of Figure 3 below. We restrict ourselves to K-nearest-neighbor classifiers with K ≤ 5, i.e. our family of classifiers contains only 5 elements.

1. Give the first three steps of ADABoost and draw the decision boundary of the resulting classifier.

2. Give the decision boundary of the classifier with the highest weight among those obtained during the first 4 iterations.
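The first boosting step of Exercise 4 — scoring each K in {1, ..., 5} by its weighted error and keeping the best — can be sketched as below. The 1-D training set is a hypothetical stand-in for Figure 3; note that 1-NN always attains zero error on its own training points, which is worth keeping in mind when running ADABoost over this family.

```python
def knn_classifier(train_x, train_y, k):
    """1-D K-nearest-neighbor classifier with labels in {1, -1}."""
    def h(x):
        nearest = sorted(range(len(train_x)),
                         key=lambda i: abs(train_x[i] - x))[:k]
        return 1 if sum(train_y[i] for i in nearest) >= 0 else -1
    return h

# hypothetical 1-D training set standing in for Figure 3
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1, 1, -1, 1, -1, -1]
family = {k: knn_classifier(xs, ys, k) for k in range(1, 6)}  # the 5 classifiers

w = [1.0 / len(xs)] * len(xs)          # uniform weights at step 1
err = {k: sum(wi for wi, x, y in zip(w, xs, ys) if h(x) != y)
       for k, h in family.items()}
best = min(err, key=err.get)
print(best, err[best])  # 1-NN reproduces the training labels exactly
```

Subsequent steps reweight the points as in Exercise 3 and rescore the same 5 classifiers with the new weights.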

Exercise 5 Propose a way to extend ADABoost to the case of regression and show how it works on an example.
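One common way to extend boosting to regression is in the spirit of AdaBoost.R2 (Drucker, 1997): normalize each weak learner's absolute errors into [0, 1], down-weight the points it fits well, and aggregate by a weighted median. The sketch below is one such construction; the function names and toy data are illustrative assumptions, not part of the original exercise.

```python
import math

def stump_regressor(xs, ys, w, s):
    """Regression stump: predicts the weighted mean of y on each side of s."""
    def wmean(side):
        pairs = [(wi, y) for wi, x, y in zip(w, xs, ys) if side(x)]
        tot = sum(wi for wi, _ in pairs)
        return sum(wi * y for wi, y in pairs) / tot if tot else 0.0
    ml, mr = wmean(lambda x: x < s), wmean(lambda x: x >= s)
    return lambda x: ml if x < s else mr

def adaboost_r2(xs, ys, thresholds, T):
    n = len(xs)
    w = [1.0 / n] * n
    models = []
    for _ in range(T):
        # fit one stump per threshold to the current weights, keep the best
        best = None
        for s in thresholds:
            h = stump_regressor(xs, ys, w, s)
            errs = [abs(h(x) - y) for x, y in zip(xs, ys)]
            d = max(errs) or 1.0
            losses = [e / d for e in errs]           # normalized losses in [0, 1]
            L = sum(wi * li for wi, li in zip(w, losses))
            if best is None or L < best[0]:
                best = (L, losses, h)
        L, losses, h = best
        beta = max(L, 1e-12) / (1 - L)
        models.append((math.log(1 / beta), h))       # confidence of this round
        # points with small loss get down-weighted, hard points keep their weight
        w = [wi * beta ** (1 - li) for wi, li in zip(w, losses)]
        tot = sum(w)
        w = [wi / tot for wi in w]

    def H(x):
        # final prediction: weighted median of the weak predictions
        preds = sorted((h(x), a) for a, h in models)
        half = sum(a for _, a in preds) / 2
        acc = 0.0
        for p, a in preds:
            acc += a
            if acc >= half:
                return p
        return preds[-1][0]
    return H

# hypothetical 1-D regression data: a step function to recover
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.0, 1.0, 1.0]
H = adaboost_r2(xs, ys, thresholds=(0.5, 1.5, 2.5), T=3)
print(H(0.0), H(3.0))  # the weighted median recovers the step
```

The weighted median makes the aggregate robust to a few bad weak learners, which is the regression analogue of the sign of the weighted vote in classification.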
