Monthly Archives: September 2018

Exam #1 Review

A reminder that we will have Exam #1 this Wednesday (Oct 3). The exam will cover Sections 1.1-1.6.

See below for a list of HW exercises from those sections to review, as well as some additional exercises you can work through for extra practice.  You should also review Quizzes #1 & #2 (solutions have been posted to Files).

  • Sec 1.1 (Propositional Logic):
    • #10, 14, 32
    • in addition, review “exclusive-or”: pp. 5-6, #32(c)
  • Sec 1.3 (Propositional Equivalences):
    • #9, 10 (tautologies using truth tables; see the Python sketch after this list)
    • in addition, review “satisfiability”: pp. 30-31, #61
  • Sec 1.4 (Predicates & Quantifiers):
    • #10, 18
    • Quiz #2
  • Sec 1.5 (Nested Quantifiers):
    • from HW#3: #4(b)(d), 10(a)-(c), 30(a)-(d)
  • Sec 1.6 (Rules of Inference):
    • read pp. 72-73
    • from HW#3: #5, 6 (similar to Examples 3-6)
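
For the truth-table topics above (tautologies in Sec 1.3, and satisfiability), here is a minimal Python sketch of how you could brute-force a truth table to double-check your by-hand work. The specific formulas used (De Morgan’s law and p ∧ ¬p) are just illustrative examples, not assigned exercises.

```python
from itertools import product

def truth_table_rows(num_vars):
    # All 2**num_vars rows of a truth table, e.g. (True, False) for (p, q).
    return product([True, False], repeat=num_vars)

def is_tautology(formula, num_vars):
    # A tautology is true in every row of its truth table.
    return all(formula(*row) for row in truth_table_rows(num_vars))

def is_satisfiable(formula, num_vars):
    # A formula is satisfiable if at least one row makes it true.
    return any(formula(*row) for row in truth_table_rows(num_vars))

# De Morgan's law: not(p and q) is equivalent to (not p) or (not q),
# so the biconditional (written here with ==) is a tautology.
demorgan = lambda p, q: (not (p and q)) == ((not p) or (not q))
print(is_tautology(demorgan, 2))                 # True

# p and (not p) is a contradiction, hence not satisfiable.
print(is_satisfiable(lambda p: p and not p, 1))  # False
```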


Math Club (Thurs Sept 20): Decision Trees for Machine Learning

The CityTech Math Club hosts a speaker most Thursdays. The first talk of this semester will be this Thursday (Sept 20) at 12:50pm in N720.  This week’s talk will be given by one of our math faculty, on a topic that involves discrete math and has applications to computer science (specifically the branch of artificial intelligence called machine learning).  In fact, the talk will include some Python programming (Python is the programming language we will use later in the semester!).

Since this topic is so relevant to our course, you can earn a “participation point” by attending this talk.  I will be at the talk to verify attendance. 

(Recall that 5% of your final grade will come from attendance & participation; you can earn 1% by attending this talk. I will announce numerous other ways to earn participation points in class; by earning at least 5 such points you will get the full 5%.)

Here is the talk abstract:


Title: “Decision Trees”
Speaker: Dr. Johann Thiel
Date/Room: Thursday September 20, 2018, 12:50-2:00pm, Namm 720


Abstract: Suppose you have 5 pieces of fruit whose size, color, and label (the type of fruit it is) are known. Given the size and color of an unlabeled piece of fruit, is it possible to classify it based on our previous observations? How sure can we be that we are right? This is a kind of classification problem.

Decision trees, a tool used in machine learning, can help with these types of problems. In this talk we will discuss how to construct a decision tree by hand and how to use the scikit-learn Python module to create decision trees for larger data sets.
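
Since the talk will use scikit-learn, here is a minimal sketch of the kind of classification the abstract describes. The fruit measurements, the numeric color encoding, and the labels below are made up for illustration (they are not the speaker’s data), and the example assumes scikit-learn is installed.

```python
from sklearn.tree import DecisionTreeClassifier

# Known fruit: each row is [size_cm, color_code],
# with color encoded as 0 = green, 1 = yellow, 2 = red (invented for this toy example).
X = [[7.0, 2],   # apple
     [7.5, 0],   # apple
     [12.0, 1],  # banana
     [11.5, 1],  # banana
     [2.0, 0]]   # grape
y = ["apple", "apple", "banana", "banana", "grape"]

# Fit a decision tree to the labeled observations.
clf = DecisionTreeClassifier()
clf.fit(X, y)

# Classify an unlabeled piece of fruit from its size and color.
print(clf.predict([[11.0, 1]]))  # likely ['banana'] for this toy data
```

Each known fruit is a row of features (size, color) paired with a label; `predict` classifies a new (size, color) pair by following the splits the fitted tree learned from those five observations.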