I was thinking that if this thread keeps growing, it'll get harder to pick out the good bits of discussion around particular problems like this. Some really good nuggets of wisdom could get buried. What if we had a sub-forum for it, perhaps similar in function to Q&A, and called it KiwiOverflow or some shit? Though, this would probably heavily diminish the number of conversation topics specific to this thread. Just thinking aloud.
I agree. I guess what I meant by "counterpart" is the contrast in how they improve/evolve at a fundamental level, where one approach is closer to random spontaneous mutations vs a more directed approach based on recognizing and adapting to patterns. Regardless, adaptive and self-improving systems need to incorporate these sorts of general concepts in one form or another. I did a lot of AI math in my work in the time leading up to me quitting out of a sense that I was helping bring about our doom lol.

They're not mutually exclusive. My understanding is that early recurrent neural networks used an evolutionary computing approach because traditional backpropagation no worky, and also that NEAT (NeuroEvolution of Augmenting Topologies), where the entire topology of the network is permitted to evolve, is a popular approach to evolutionary computing in neural networks today.
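To make the "random mutation vs gradient descent" contrast concrete, here's a minimal sketch of evolving a tiny fixed-topology network's weights by mutation and selection instead of backprop. This is not actual NEAT (no topology evolution, no speciation); the network size, mutation scale, and population numbers are all just illustrative assumptions.

```python
# Toy illustration (not real NEAT): evolve the weights of a small
# fixed 2-4-1 network on XOR using mutation + truncation selection,
# with no gradients involved. All parameters here are made up.
import numpy as np

rng = np.random.default_rng(0)

# XOR task the population is scored on.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(weights, x):
    # 2-4-1 network flattened into a single 17-element weight vector.
    w1 = weights[:8].reshape(2, 4)
    b1 = weights[8:12]
    w2 = weights[12:16].reshape(4, 1)
    b2 = weights[16]
    h = np.tanh(x @ w1 + b1)
    return np.tanh(h @ w2 + b2).ravel()

def fitness(weights):
    # Higher is better: negative mean squared error on XOR.
    return -np.mean((forward(weights, X) - y) ** 2)

# Random initial population of weight vectors ("genomes").
pop = [rng.normal(size=17) for _ in range(50)]

for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]  # keep the fittest 10 (truncation selection)
    # Each survivor spawns 4 mutated children (random Gaussian noise).
    children = [p + rng.normal(scale=0.1, size=17)
                for p in survivors for _ in range(4)]
    pop = survivors + children

best = max(pop, key=fitness)
print("best fitness:", fitness(best))
print("outputs:", np.round(forward(best, X), 2))
```

The "directed" counterpart would compute gradients of that same error and step the weights downhill; here the only search signal is which random perturbations happen to score better. NEAT layers topology mutations and speciation on top of this basic loop.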
How does Dart stack up as a first language?

Never tried it, but it looks interesting as a concept. In a practical learning sense, I would just pick a different language with that familiar ALGOL-inspired syntax but more widespread, mainstream use. Depending on your level of comfort, consider picking something that handles memory management for you.