Programming thread

Rust annoyed me so much with soyboys and trannies constantly posting LOOK I REWROTE LS! IN RUST!!1! that I never took a serious look at the language whatsoever and I cannot believe I am the only one.
My favourite part is the AWESOME SELF HOSTED COMPILER that requires a giant monolith of C++ to do all the real work.
I thought C++ was evil and obsolete???

Even Google can self-host their shitty meme language.
 
While true, that wouldn't stop the tech industry (the wokest of the woke) from adopting it. There has to be some technical reason that just makes it a bad choice for professionals. I've considered switching to it before just to try it out, but after racking my brain, I could not come up with a compelling reason why I wouldn't just be wasting my time.

Well, Rust is something I am poking with a stick right now, to see if it can bring me the speed of C with some of the convenience of C++. Not that I believe it will do a lot for the speed of my programs, but I want to test it out.
 
Besides Rust being peak troonclownery, the package system is a travesty. This is a general complaint on my part, not Rust-specific.

Have they (meaning: whoever might be responsible for this) finally devised a blessed crate set for POSIX support? Or is it still "which one of these three crates will give me select() and still be supported a year down the road"?
 
DRINK!
My favourite part is the AWESOME SELF HOSTED COMPILER that requires a giant monolith of C++ to do all the real work.
I thought C++ was evil and obsolete???

Even Google can self-host their shitty meme language.

Obsolete? The company I work for (a $20 billion company) does everything in C++ or C and has no intention of changing.
 
Random thought I had about Rust (NO MY NAME IS NOT A REFERENCE TO THAT THANK YOU!)
I think one of the big reasons it's failed to take off is that the whole core design of the language is about memory safety, right? But it's a language meant for systems programming, where you routinely need to do exactly the kind of unsafe things the language is designed to forbid.
Rust annoyed me so much with soyboys and trannies constantly posting LOOK I REWROTE LS! IN RUST!!1! that I never took a serious look at the language whatsoever and I cannot believe I am the only one.
Every time I get the inkling to look into Rust, I see the Reddit/Hacker News circlejerking everywhere and it puts me right off. This cult that they've built around the language is just the most bizarre and insufferable thing ever.
(Is this how us Lisp weens come across to the normal people?)
 
Every time I get the inkling to look into Rust, I see the Reddit/Hacker News circlejerking everywhere and it puts me right off. This cult that they've built around the language is just the most bizarre and insufferable thing ever.
(Is this how us Lisp weens come across to the normal people?)
Pretty much

I can bracket the sideline cultists. My main pain was the Rust Book. What the hell is it with the massive overuse of exclamation marks? It's really jarring, and a crap way to feign enthusiasm. I don't mind one or two for the real surprises, but the fact is that Rust doesn't have anywhere near as many innovations as some of its proponents make out.

It still ticks all the boxes I want of a modern language "close to the metal": Simula-style OOP was a mistake that no new language should adopt. Algebraic data types and pattern matching are a must.
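For the unfamiliar, this is the sort of thing I mean. A quick sketch in Python 3.10+ (which bolted on structural pattern matching late in life), with made-up Shape types:

Python:
from dataclasses import dataclass

# An algebraic data type: a shape is either a Circle or a Rect, nothing else
@dataclass
class Circle:
    radius: float

@dataclass
class Rect:
    width: float
    height: float

def area(shape):
    # Pattern matching: dispatch on the variant and bind its fields in one step
    match shape:
        case Circle(radius=r):
            return 3.14159 * r * r
        case Rect(width=w, height=h):
            return w * h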
 
My main pain was the Rust Book.
You're not the first person with credible complaints about the book. Mozilla hired one of the worst SJWs in tech for their Rust documentation, Steve Klabnik, a classic entryist who only knew Ruby on Rails before that job. The sort of guy who fervently wishes there was a violent "tech antifa" movement.
 
You're not the first person with credible complaints about the book. Mozilla hired one of the worst SJWs in tech for their Rust documentation, Steve Klabnik, a classic entryist who only knew Ruby on Rails before that job. The sort of guy who fervently wishes there was a violent "tech antifa" movement.
I think the image speaks for itself
[attached image]
 
Obsolete? The company I work for (a $20 billion company) does everything in C++ or C and has no intention of changing.
Duh. Only a nignog like pic related would ever believe that horseshit. C/C++ Obsolete? What sort of fucking retard would EVER think that?
spez.png
This is what Rust trannies actually believe, and their entire ecosystem is fully dependent on it with no plans to change.
 
Python:
import random

def create_list():
    nums = [0]          # "list" would shadow the builtin, so name it something else
    attempt_list = []   # how many failed draws happened before each hit
    y = 0

    while len(nums) != 1001:
        x = random.randint(0, 1000)
        if x != (nums[-1] + 1):
            # Not the next consecutive number: count the wasted attempt
            y += 1
        else:
            # Got the next number: record the attempt count and keep going
            attempt_list.append(y)
            y = 0
            nums.append(x)
    return attempt_list

I tried to come up with the most pointless piece of code I could, and this is the best I can do at my current level. Though I actually think that making an array of attempt lists could yield some interesting results.
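e.g. now that create_list hands back attempt_list, something like this (just a sketch) would let you compare the average attempts-per-hit across a few runs:

Python:
# Collect the attempt lists from several runs and compare them
runs = [create_list() for _ in range(10)]
for i, attempts in enumerate(runs):
    print(i, sum(attempts) / len(attempts))  # average attempts per hit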


***
I'm currently trying to understand how to work with classes in Python. I figured out how to make and instantiate them, and there was a helpful post on Reddit (shockingly) that finally made me understand WHEN to use them...
but I'm still having trouble actually using them in code. After too many layers of "self" calls, I start to lose track of what is actually being done.

I think part of it may be not always knowing when my code is (or should be) dealing with the class itself, vs. when it is acting on an instance of that class.

I have a physical card game I designed that I’m thinking I’ll make into a PC adaptation. I feel like that will require me to make and use several classes, which I can then refer back to when I get confused.
 
I think part of it may be not always knowing when my code is (or should be) dealing with the class itself
Answer: Usually never.

vs. when it is acting on an instance of that class.
Answer: 99.99% of the time.

One of the primary uses of a class is encapsulation: you separate code off into its own little thing, and only the class itself ever needs to care about the internals. So when you say that you're losing track of all the "self" calls and cannot follow what's actually happening, that's a red flag to me that something iffy is going on. Objects are literally designed to solve exactly this problem of code complexity.

e.g. Let's use your card game idea as an example (which I'll imagine works like Yugioh or something, just to make this post extra gay). So you'll probably have a Card class or something:
Python:
class Card:
    def __init__(self, attack):
        self.attack = attack

    def battle(self, other_card):
        # Insert complicated battle logic here
        return (self.attack - other_card.attack)
In the constructor __init__ you take in the attack points as a parameter and assign them to the attribute self.attack so that they're stored on the instance. Then you have a method battle that lets you battle another card with this card.

Now watch this:
Python:
card1 = Card(3000)
card2 = Card(2400)

card1.battle(card2)
That's what the 'main' code that uses our class looks like: we make two instances of the Card class and make them battle. Notice some things:
1. No self appears anywhere in this main code: self only ever appears in the class.
2. We only ever reference the Card class itself when we make the instances: after that point, we're strictly dealing with the instance objects card1 and card2. (Now you can reference the class itself for other things, and as you get more familiar with OOP you'll come across class methods/attributes where you do exactly that. But for now, we only reference the class itself when using it to instantiate the instance objects.)
3. All the main code needs to know is that these Card objects have a battle method that can be used to make them fight.

Point 3 is the crux. All of our code for implementing the battle logic can now sit comfortably inside our Card class: the main program doesn't need to know about any of it or how it works. You can work on the battle code completely separately from the main logic of your game, because it's all inside the Card class and your main program only ever accesses that Card class through its interface, i.e. its constructor and the battle method.

This means that you're free to change the battle logic all you want without breaking anything: all the main code knows is that Card objects have a battle method that it calls by passing in another Card object and (in this case) that method returns a number. Your main code doesn't know or care about any of the internals, so you're free to modify them however much you want at any point.
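e.g. here's a made-up tweak (the dice roll is pure illustration): the internals change completely, but the interface doesn't, so the main code above keeps working untouched:

Python:
import random

class Card:
    def __init__(self, attack):
        self.attack = attack

    def battle(self, other_card):
        # Reworked internals: add a random swing to every battle.
        # card1.battle(card2) in the main code doesn't change at all.
        roll = random.randint(-500, 500)
        return (self.attack - other_card.attack) + roll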

This also means you only need to worry about the battle logic inside that Card class. When you're working on the main code, you (or, for example, another programmer on your team) only need to worry about the interface. "Yeah, if you want these two cards to battle each other just do first_card.battle(second_card) and I'll take care of it all". That's the promise that your class is making to your main code (or less abstractly, the promise you'd be making to the programmer writing the main code if you were the programmer in charge of writing the Card class and implementing the battle logic).

This idea is awesome and very powerful, and it's why Object-Oriented Programming dominates the software development landscape. It empowers you to have programmers working on different parts of a program without having to know every little bit of it themselves: they just need to know the interfaces. Or even if it's just you, it allows you to separate concerns: if ever you wanted to change the battle logic in your card game, you know that all you need to do is modify the Card class: none of the other code would ever need to change.

Hopefully that was useful. :heart-full:
 
So when you say that you're losing track of all the "self" calls and cannot follow what's actually happening, that's a red flag to me that something iffy is going on. Objects are literally designed to solve exactly this problem of code complexity.
Sounds like I need to find some different tutorials, then.

Thanks. That whole explanation actually makes a lot of sense, and it explains the thing that was annoying me most: why I couldn't just set, say, self.x directly, but had to declare self.x = x and then make a method declaring that self.x was a random number between 1 and 3.

But, if I'm understanding you correctly, that's because (to carry the card analogy one step further) the __init__ function acts like a "deck" and instantiating your class is basically "dealing a card". So if I said self.x = "Ace through K", then every card would literally have the face value "Ace through K". Rather, when you call the constructor and create an instance, that's the point at which you actually see which "card" it is and it's assigned its face value.
 
Also, the idea of functional programming has merit, but I think in its 'pure form' it's entirely impractical. I tried to use that during college and some incredibly simple things just became preposterously complicated.
For certain sets of problems it can produce extremely dense and elegant solutions. When combined with decent modern libraries that make simple things more readable (i.e. none of Scheme's (+ 1 x)), it can be extremely powerful. Erlang is one of my favorite functional languages: it has a great use case and purpose for its programming philosophy.

Functional + OOP combined seems to be my favorite. Scala is a decent mix IMO. You can pick and choose which paradigm you want and which libraries to pull in to suit your needs. And it keeps your work practical, and steers away all the Haskell fags who think it's god's gift to algorithms.
 
For certain sets of problems it can produce extremely dense and elegant solutions. When combined with decent modern libraries that make simple things more readable (i.e. none of Scheme's (+ 1 x)), it can be extremely powerful. Erlang is one of my favorite functional languages: it has a great use case and purpose for its programming philosophy.

Functional + OOP combined seems to be my favorite. Scala is a decent mix IMO. You can pick and choose which paradigm you want and which libraries to pull in to suit your needs. And it keeps your work practical, and steers away all the Haskell fags who think it's god's gift to algorithms.
Scala attracts Haskell people like a plague. They wrote libraries like scalaz and cats, and then the book titled Functional Programming in Scala turns out to be entirely Haskell-infected.

The sad thing is that neither community benefits at all from this interaction. Scala is such a horrible platform for Haskell style functional programming that a lot of these scalaz types decided to start from scratch and try to write their own Haskell for the JVM, while those in Scala who just wanted a better Java have to live in an ecosystem where people are trying to turn their language into a pretzel.
 
Sounds like I need to find some different tutorials, then.

Thanks. That whole explanation actually makes a lot of sense, and it explains the thing that was annoying me most: why I couldn't just set, say, self.x directly, but had to declare self.x = x and then make a method declaring that self.x was a random number between 1 and 3.

But, if I'm understanding you correctly, that's because (to carry the card analogy one step further) the __init__ function acts like a "deck" and instantiating your class is basically "dealing a card". So if I said self.x = "Ace through K", then every card would literally have the face value "Ace through K". Rather, when you call the constructor and create an instance, that's the point at which you actually see which "card" it is and it's assigned its face value.
Yep. But you don't need the separate method. e.g. you can do it like this:

Python:
import random

class Card:
    def __init__(self):
        self.attack = random.randint(0, 10000)
So in the __init__ constructor we pick a random integer between 0 and 10000 and assign it immediately as the attack value for the Card.

And then in your main code every Card you make will be assigned a different random attack, because that __init__ constructor gets freshly called every time you make a new card.
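e.g. something like this (your numbers will of course differ, since they're random):

Python:
card1 = Card()
card2 = Card()
card3 = Card()

print(card1.attack)  # e.g. 7216
print(card2.attack)  # e.g. 304
print(card3.attack)  # e.g. 9941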


So e.g. you could generate a hand of five cards just by looping and creating a new card each time (so you store the cards unnamed in a list, instead of giving them a name like card1 or card2). Here I do just that, and then use the list(map(lambda ...)) stuff just to print out the attack value of each card so we can see them (again, your random values will differ):
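Python:
hand = [Card() for _ in range(5)]
print(list(map(lambda card: card.attack, hand)))  # e.g. [4821, 992, 7305, 66, 3110]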


And you can 'draw' a new card to your hand by calling the constructor the same way and appending the new card to the list:
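Python:
hand.append(Card())
print(list(map(lambda card: card.attack, hand)))  # same five as before, plus one new attack value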


Etc etc.
 
Don't mistake those that spend 24/7 playing with libs or masturbating on message boards about monads for those who work and actually have deadlines to meet.
You're ignoring the fraction of us who are retired after a life of work, when we indeed had deadlines to meet, who once benefited from those more experienced than us and have been paying that forward for a long time.
 
Don't mistake those that spend 24/7 playing with libs or masturbating on message boards about monads for those who work and actually have deadlines to meet.
Yeah, this is what college programming students and academics almost can't understand. Programming in industry is wildly different from programming in school or on your own time. That 4-month-long project you did in your senior year? Yeah, I'm gonna need that by 3 pm.
 
Yeah, this is what college programming students and academics almost can't understand. Programming in industry is wildly different from programming in school or on your own time. That 4-month-long project you did in your senior year? Yeah, I'm gonna need that by 3 pm.
Not only that, but those that beat off to algorithmic elegance have never had to develop a real-world, multidevice, multiplatform solution.

The code in those applications can be horrifying, because the real world is messy. Irreducible complexity is a fact in the digital world, just as in nature.

My favorite example of this is the original Netscape code. It was full of dragons, almost incomprehensible, with hack layered on hack to fix other hacks. But guess what: what junior developers don't understand is that those 'hacks' are business value. Each one fixes a bug or handles some strange set of circumstances that would otherwise cause instability.

Then some jerk-off tech lead comes onto the project and convinces management to do an overhaul or major rewrite, only to sink their product, because the competition wasn't rewriting code that already worked (warts and all).
 