KF Math Thread - Discuss Math

Anti Snigger

True & Honest Fan
kiwifarms.net
Joined
Mar 28, 2023
I searched the catalogue and didn't find a general math thread, so I figured one could be a nice addition. I&T was my best guess for where to post this, so I hope I made the right choice.
I would like this to be friendly to people who aren't necessarily experienced or good at math. Math is beautiful and I want to share that feeling with others.


I had an interesting thought while out having a smoke earlier. I was thinking about "quasi-random" algorithms (algorithms that deliberately reduce randomness to play toward humans' perception of what random is), like the shuffle features on music apps, and I started to wonder how I could iteratively build an ordered sequence where, at each step, the sequence is "as random" as it could be. My initial thought was to use something like information entropy, but I realized that information entropy doesn't really capture the "structural" randomness that humans are particularly sensitive to. I'm wondering if anyone has any insights.
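To make that concrete, here's a toy sketch of the kind of thing I mean (the window size and item pool are made-up parameters, not any real app's algorithm): sample each next item uniformly from everything except the most recent picks, so the short streaks that read as "non-random" to a human can't occur.

```python
import random

def quasi_random_sequence(items, n, window=2):
    """Build a sequence that *looks* random to a human: at each step,
    sample uniformly from the items NOT seen in the last `window` picks.
    This trades true randomness for fewer of the streaks people notice."""
    seq = []
    for _ in range(n):
        recent = set(seq[-window:])
        # Fall back to the full pool if the window excludes everything
        candidates = [x for x in items if x not in recent] or items
        seq.append(random.choice(candidates))
    return seq

print(quasi_random_sequence(list("ABCD"), 12))
```

The trade-off is exactly the one with information entropy: the constrained sequence carries less entropy per pick than a truly uniform one, yet humans tend to rate it as "more random".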
Cheers
 
What's a good place to start when it comes to learning the math involved in machine learning?
 
What's a good place to start when it comes to learning the math involved in machine learning?
You are going to want to get extremely comfortable with the chain rule from calculus; it's essentially the basis for everything else. Some vector calculus is also going to be helpful, since networks are essentially giant functions with some number of input parameters and some number of weights as variables, and a codomain corresponding to the output shape.
This is my favorite resource for calculus and diffeqs; it saved my ass at multiple points. You'll want to focus mainly on the calc track, with some of the diffeq material (fair warning: diffeq math can get weird if you're not used to it). The vector calculus material is mostly in the calc 3 section. You can probably skip much of the calc 2 section, though I wouldn't advise ignoring it entirely.

The single best way to learn backpropagation is by deriving it yourself. Design some toy network, 2 or 3 layers, write out the total function, and simply grind at the board for a while. The common explanations are not very helpful for understanding it mathematically.

Mathematicians fucking HATE this, but a good trick from physics with the chain rule is to write out your derivatives in Leibniz notation. You end up with fractions of differentials, which are not actually fractions; however, as a sort of syntactic sugar, you can treat them as if they were.

[Image: the chain rule written out in Leibniz notation]

This image I found illustrates it well: essentially, you find the derivative with respect to some intermediate variable, then continue backward by taking the derivative of that variable, in a chain.
Just please don't get carried away with the differential terms; you will embarrass yourself. They only make sense in certain contexts.
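If it helps, here's a minimal sketch of that derivation exercise in Python, for a made-up two-layer toy network (the shape, loss, and numbers are all invented for illustration). It applies the chain rule in exactly the Leibniz style described above, then sanity-checks the hand-derived gradient against a finite difference:

```python
import math

# Toy 2-layer "network": y = w2 * tanh(w1 * x), with h = w1 * x, a = tanh(h).
# Leibniz-style chain: dL/dw1 = dL/dy * dy/da * da/dh * dh/dw1.
def forward(x, w1, w2):
    h = w1 * x
    a = math.tanh(h)
    y = w2 * a
    return h, a, y

def grads(x, w1, w2, target):
    h, a, y = forward(x, w1, w2)
    L = 0.5 * (y - target) ** 2       # squared-error loss
    dL_dy = y - target                # dL/dy
    dL_dw2 = dL_dy * a                # dy/dw2 = a
    dL_da = dL_dy * w2                # dy/da = w2
    dL_dh = dL_da * (1 - a ** 2)      # d tanh(h)/dh = 1 - tanh(h)^2
    dL_dw1 = dL_dh * x                # dh/dw1 = x
    return L, dL_dw1, dL_dw2

# Sanity check against a central finite-difference gradient
x, w1, w2, t, eps = 0.7, 0.3, -1.2, 0.5, 1e-6
L, g1, g2 = grads(x, w1, w2, t)
num_g1 = (grads(x, w1 + eps, w2, t)[0] - grads(x, w1 - eps, w2, t)[0]) / (2 * eps)
print(g1, num_g1)  # the two should agree to roughly six decimal places
```

The numeric check is the "grind on the board" step automated: if your hand-derived chain of factors is wrong anywhere, the two numbers diverge immediately.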
 
What's a good place to start when it comes to learning the math involved in machine learning?
Your local college or university: attend the lectures on calculus (called "analysis" in some languages) and linear algebra. This is first- and second-semester material in a math degree; it's not crazy difficult or complicated. Then check the computer science department for courses on machine learning to see how that math is applied to build artificial neural networks.
 
I had an interesting thought while out having a smoke earlier. I was thinking about "quasi-random" algorithms (algorithms that deliberately reduce randomness to play toward humans' perception of what random is), like the shuffle features on music apps, and I started to wonder how I could iteratively build an ordered sequence where, at each step, the sequence is "as random" as it could be. My initial thought was to use something like information entropy, but I realized that information entropy doesn't really capture the "structural" randomness that humans are particularly sensitive to. I'm wondering if anyone has any insights.
Dota 2 already implements pseudo-randomness in many of its chance mechanics, so that the distribution doesn't have as many long streaks as you would inevitably get with true randomness. To my knowledge, it first rolls at a lower chance than the nominal percentage, then progressively increases the probability after each failed roll, resetting on success.
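For illustration, here is a rough sketch of that mechanism in Python. The linear-growth rule and the 25% figure are assumptions for the example, not Dota 2's actual constants:

```python
import random

def prd_trial(c, rng=random.random):
    """One 'proc' under a pseudo-random distribution: the success chance
    starts at c and grows by another c after every failed roll, resetting
    on success. Returns the number of rolls it took to succeed."""
    n = 1
    while rng() >= c * n:
        n += 1
    return n

# The longest failure streak is capped: once c * n reaches 1 the roll
# cannot fail, so with c = 0.25 a proc needs at most 4 rolls.
random.seed(0)
trials = [prd_trial(0.25) for _ in range(10_000)]
print(max(trials))  # <= 4 by construction
```

Compare that with a true 25% Bernoulli roll, where streaks of 10+ misses are rare but inevitable; the PRD version trades exact independence for a distribution that feels fairer to players.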
 
Does OP mean something like this?


What's a good place to start when it comes to learning the math involved in machine learning?

It's mostly calculus and linear algebra, plus probability and statistics. I kind of suck at things like integral calculus, so while I'm not at all opposed to learning as much as you can or want to, it's worth noting that you can still get a lot done with data science tools without being an expert at all of that. Frankly, much of data science is really just the "unsexy" task of wrangling the data into a format where you have what you need and can feed it to the learning algorithm, which can be quite opaque even to subject matter experts. I have a curated Icedrive with several thousand books on a huge array of topics, and data science and its underpinnings get a lot of attention. You or anyone else can PM me for a public link.
 
What's a good place to start when it comes to learning the math involved in machine learning?
How AI Works by Ronald Kneusel
Alternative Link
I have enjoyed reading this book; I believe it explains the mathematical concepts as they come up, but it would still be good to brush up on calculus and linear algebra as others have suggested.

Love numbers, hate algebra. I cannot recall a single instance where I've used it.
It seems to me that, in higher level mathematics, algebra is more of a stumbling block than the main material.
 
How AI Works by Ronald Kneusel
Alternative Link
I have enjoyed reading this book; I believe it explains the mathematical concepts as they come up, but it would still be good to brush up on calculus and linear algebra as others have suggested.
Engineering Mathematics by Bird is in my collection, and of course you can also find it on LibGen and other such sites. The author's paycheck clearly isn't based on the number of pages they write; the approach is very straightforward and uncluttered. Coding the Matrix goes into even further depth on linear algebra than the former, through hands-on computer science applications. It is well-reviewed and on my "eventually" list.
 
When I was a younger TorBo, I always struggled in math class.

And math classes always gave the most homework, which sucked.
 
What's a good place to start when it comes to learning the math involved in machine learning?
Calculus -> differential equations -> linear systems -> Fourier transforms -> control systems. Control systems are very important for feedback, but they take a bit of background in math and engineering.

Probability and mathematical statistics.

Then probably some programming and ML stuff.

As for general math stuff, I'm worried about the future of mathematics education. It seems they're trying to dumb it down, since for equity's sake it's easier to make everyone bad at math than to make everyone good at it. I worked at a California college when a new law passed under which schools couldn't place students in remedial math (or English) and instead had to put them in a college-transferable class (intro to stats, calculus/business calc, or liberal studies math). It was a fucking train wreck, since these two subjects build on previous knowledge; it's just a way to artificially inflate our number of educated individuals.
 
Calculus -> differential equations -> linear systems -> Fourier transforms -> control systems. Control systems are very important for feedback, but they take a bit of background in math and engineering.

Probability and mathematical statistics.

Then probably some programming and ML stuff.

As for general math stuff, I'm worried about the future of mathematics education. It seems they're trying to dumb it down, since for equity's sake it's easier to make everyone bad at math than to make everyone good at it. I worked at a California college when a new law passed under which schools couldn't place students in remedial math (or English) and instead had to put them in a college-transferable class (intro to stats, calculus/business calc, or liberal studies math). It was a fucking train wreck, since these two subjects build on previous knowledge; it's just a way to artificially inflate our number of educated individuals.
It's important to find the balance point between dumb equity bullshit and being able to use certain tools without total comprehension of the underpinnings. Like I said earlier, I'm aware enough to know certain things in math that I'm not really good at. But I also know that when I do something like train a machine learning model in R to distinguish spam SMS messages from regular SMS messages, and the sensitivity and specificity are both through the roof on the test set, that means something. Computing them is just algebra, and since these two statistics (the true positive and true negative rates, respectively) tend to militate against each other, having both close to 100% out-of-sample is a very good sign. Basically, as far as human talent is concerned, I'm just opposed to throwing the baby out with the bathwater.
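To spell out the "just algebra" part, here's a minimal sketch in Python with invented confusion-matrix counts (the spam/ham numbers are made up for illustration):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = true positive rate = TP / (TP + FN);
    specificity = true negative rate = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical test-set counts for a spam classifier
tp, fn, tn, fp = 180, 7, 950, 12
sens, spec = sensitivity_specificity(tp, fn, tn, fp)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")
```

The tension between the two is easy to see from the formulas: flagging more messages as spam raises TP at the cost of FP, pushing sensitivity up and specificity down, which is why both being near 1 on held-out data is the reassuring case.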
 
It's important to find the balance point between dumb equity bullshit and being able to use certain tools without total comprehension of the underpinnings. Like I said earlier, I'm aware enough to know certain things in math that I'm not really good at. But I also know that when I do something like train a machine learning model in R to distinguish spam SMS messages from regular SMS messages, and the sensitivity and specificity are both through the roof on the test set, that means something. Computing them is just algebra, and since these two statistics (the true positive and true negative rates, respectively) tend to militate against each other, having both close to 100% out-of-sample is a very good sign. Basically, as far as human talent is concerned, I'm just opposed to throwing the baby out with the bathwater.
Something I never got exposed to in my courses, and had to figure out largely on my own, was designing custom loss functions. It's pretty difficult.
 
Do you mean stuff like implementing regularization, or am I really lowballing it?
If some of your outputs are more critical, you may want them to contribute more to the loss. That's a simple example, but it's basically another way to shape the training, and it isn't discussed enough.
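A minimal sketch of that idea in plain Python (the weights and output values here are made up for the example):

```python
def weighted_squared_loss(pred, target, weights):
    """Per-output weighted squared error: outputs with a larger weight
    contribute more to the total loss, so the optimizer is pushed to
    get those outputs right first."""
    assert len(pred) == len(target) == len(weights)
    return sum(w * (p - t) ** 2 for w, p, t in zip(weights, pred, target))

# Same size of error on every output, but output 0 is weighted 10x,
# so its mistake dominates the total.
pred, target = [0.9, 0.1, 0.4], [1.0, 0.0, 0.5]
print(weighted_squared_loss(pred, target, [10.0, 1.0, 1.0]))
print(weighted_squared_loss(pred, target, [1.0, 1.0, 1.0]))
```

Since the weights multiply straight through the gradient as well, this is one of the simplest levers for shaping training without touching the architecture.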
 