Does this all seem Greek to you? Turn your mind back to last Thursday's lecture. I could stop here, but I'm going to subject you to a bunch of math. ...and this thing [here]-- god, everything's erased! ...but the thing wriggles like crazy, and goes off to infinity in a very rude way. Statisticians are usually working with sample sizes of 27 or 34. Neural nets usually have sizes in the thousands, and you can party with the central limit theorem when you're up there. It's a great place to be. This is sufficient, but not necessarily necessary. I'm banking on the fact that this [bound] is *really* small. Don't try this -- kids, don't try this at home -- unless your first bound is really small.
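The central-limit-theorem remark above can be made concrete with a small sketch (my illustration, not from the lecture): the spread of a sample mean shrinks like 1/sqrt(n), so at neural-net-scale n (thousands) the mean is pinned down far more tightly than at statistician-scale n (a few dozen). All function names here are hypothetical.

```python
import random
import statistics

random.seed(0)

def sample_mean(n):
    # Mean of n draws from a skewed distribution (exponential, rate 1,
    # population mean 1.0) -- deliberately non-normal.
    return statistics.fmean(random.expovariate(1.0) for _ in range(n))

def spread_of_means(n, trials=500):
    # Empirical standard deviation of the sample mean over many trials;
    # the CLT predicts roughly 1/sqrt(n).
    means = [sample_mean(n) for _ in range(trials)]
    return statistics.stdev(means)

small = spread_of_means(30)    # a "statistician" sample size
large = spread_of_means(5000)  # a "neural net" sample size

# large is far smaller than small, matching the 1/sqrt(n) scaling:
# sqrt(5000/30) is about 13x tighter concentration around the mean.
print(small, large)
```

The point of the quip survives the sketch: at n in the thousands, averages of even very skewed quantities behave like nicely concentrated Gaussians, which is why the bounds discussed in lecture bite there.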