I tried to get ChatGPT to answer some probability questions. It did atrociously. I asked it some trick questions, like: if you have 10 cards, numbered 1 to 10, and draw three cards without replacement (without putting them back), what is the chance you draw 2 identical cards?
And it just kept giving answers like 0.22%. Whenever I told it it was wrong, it agreed and tried to fix its answer. I tried a number of other probability questions, some not as tricky. It got them ALL wrong. I even tried with a regular card deck, since that might be a more common scenario in its training data.
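For the record, the trick is that the question is impossible: all ten cards are distinct, so there's no way to draw two identical ones, and the answer is exactly 0%, not 0.22%. Here's a quick Python sketch I'd use to sanity-check it (my own brute force over the 120 possible hands, not anything ChatGPT produced):

```python
from itertools import combinations

cards = range(1, 11)                   # ten distinct cards, numbered 1 to 10
hands = list(combinations(cards, 3))   # every 3-card draw without replacement

# A hand contains "2 identical cards" only if a value repeats, which can't
# happen when every card is distinct, so this count is 0 and P = 0%.
duplicates = sum(len(set(hand)) < 3 for hand in hands)
print(duplicates, "of", len(hands), "hands contain a repeated card")
# -> 0 of 120 hands contain a repeated card
```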
And then I tried a new experiment. Since it kept agreeing and apologizing whenever I told it it was wrong, I wanted to see if I could make a question so simple that it would get it right (what is the chance of drawing a 5 when drawing 1 card?). It said 10% (correct). I told it it was wrong, that the answer was 25%, and asked it to explain where it went wrong. And it proceeded to apologize and explain why it's actually 25%.
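Just to be clear about the simple one: assuming the same ten-card setup, drawing a 5 on a single draw is 1 favorable card out of 10, so 10%. A tiny simulation (again my own check, not ChatGPT's) lands right on it:

```python
import random

# One card numbered 1-10 drawn at random: P(card == 5) = 1/10 = 10%, not 25%.
trials = 100_000
hits = sum(random.randint(1, 10) == 5 for _ in range(trials))
print(hits / trials)   # ~0.10
```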