Lecture 4: Feasibility of Learning

1. Learning is Impossible? -- absolutely no free lunch outside D
   Learning from D (to infer something outside D) is doomed if any 'unknown' target f can happen.

2. Probability to the Rescue -- probably approximately correct outside D
   If N is large, we can probably infer the unknown mu (out-of-sample) by the known nu (in-sample).

3. Connection to Learning -- verification possible if E_in(h) is small for a fixed h
   So, if E_in(h) ≈ E_out(h) and E_in(h) is small ==> E_out(h) is small ==> h ≈ f with respect to P.
   Now we can use 'historical records' (the data) to verify 'one candidate formula' h.

4. Connection to Real Learning -- learning possible if |H| is finite and E_in(g) is small
   But in real learning we also have to deal with BAD samples, where E_in and E_out are far apart -- and this can only get worse once 'choice' among hypotheses is involved.
   If |H| = M is finite and N is large enough, then by the union bound P[|E_in(g) - E_out(g)| > eps] <= 2 M exp(-2 eps^2 N), so for whatever g is picked by A, E_out(g) ≈ E_in(g).
   If A finds one g with E_in(g) ≈ 0, the PAC guarantee gives E_out(g) ≈ 0 ==> learning is possible.
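The argument in points 2-4 rests on the Hoeffding inequality for one fixed hypothesis and on the union bound over a finite hypothesis set. The sketch below is a small Monte-Carlo illustration of both; the concrete values of mu, N, eps, M and the number of trials are arbitrary choices for the demo, not part of the lecture.

```python
import numpy as np

# Illustrative check (hypothetical parameter values):
#   fixed h (one bin): P[|nu - mu| > eps]            <= 2 exp(-2 eps^2 N)      (Hoeffding)
#   finite H (M bins): P[BAD for at least one bin]   <= 2 M exp(-2 eps^2 N)    (union bound)

rng = np.random.default_rng(0)
mu, N, eps, M, trials = 0.4, 200, 0.1, 50, 5000

# --- fixed h: how often does the sample estimate nu stray far from mu? ---
samples = rng.random((trials, N)) < mu           # each row: one dataset of size N
nu = samples.mean(axis=1)                        # in-sample estimates
bad_single = np.mean(np.abs(nu - mu) > eps)      # empirical BAD-sample probability
hoeffding = 2 * np.exp(-2 * eps**2 * N)
print(f"fixed h : P[BAD] ~ {bad_single:.4f}  <=  Hoeffding bound {hoeffding:.4f}")

# --- M hypotheses: a dataset is BAD if it is BAD for *any* of the M bins ---
samples = rng.random((trials, M, N)) < mu        # same mu for every bin, for simplicity
nu = samples.mean(axis=2)                        # (trials, M) in-sample estimates
bad_any = np.mean(np.any(np.abs(nu - mu) > eps, axis=1))
union_bound = min(2 * M * np.exp(-2 * eps**2 * N), 1.0)
print(f"M bins  : P[BAD] ~ {bad_any:.4f}  <=  union bound {union_bound:.4f}")
```

Increasing N shrinks both bounds exponentially, while increasing M loosens the union bound linearly; this is exactly why point 4 needs |H| finite and N large enough.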