Programming thread

Pike then decided to add 5 different ways to allocate an object in Go.
Ooh, don't even fucking get me started lol.

For those who don't know Go, here's an example:
C-like:
type SomeType struct {
    // struct fields here
}

C-like:
func NewSomeType() *SomeType {
    x := SomeType{}
    return &x
}
C-like:
func NewSomeType() *SomeType {
    x := &SomeType{}
    return x
}
C-like:
func NewSomeType() *SomeType {
    x := new(SomeType) // returns pointer, like &SomeType{}
    return x
}
I'm binding to an unnecessary variable instead of returning directly, just to help highlight the differences in (de)referencing.

An important thing to note here is that Go zero-initializes type/struct fields automatically upon allocation. Effective Go (a very old official Go document) says that new(SomeType) and &SomeType{} are equivalent; the composite-literal form just also lets you set non-zero field values right in the declaration.
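
Quick sketch of what that looks like in practice (the field names here are made up purely for illustration):

C-like:
package main

import "fmt"

// Made-up fields, just for illustration.
type SomeType struct {
    Count int
    Name  string
}

func main() {
    a := new(SomeType)       // pointer to a zeroed struct: {0 ""}
    b := &SomeType{}         // identical: pointer to a zeroed struct
    c := &SomeType{Count: 7} // the composite literal can also set non-zero fields inline
    fmt.Println(*a, *b, *c)  // {0 } {0 } {7 }
}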

There's no clear distinction between stack and heap allocations when returning, like you would normally have in C. So whether you should use the 1st or 2nd/3rd form to return a pointer to a type is a matter of intense debate to this day. As much as I love Go, I have no fucking clue why they don't have a clear distinction between the first two forms, at the very least.
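
For what it's worth, the thing actually deciding stack vs heap is the compiler's escape analysis, and you can ask it what it chose. Rough sketch (a made-up file, and the exact -m output wording varies between Go versions):

C-like:
// escape.go
package main

type SomeType struct {
    N int
}

// The returned pointer outlives the call, so escape analysis moves x
// to the heap no matter which of the forms above you write.
func NewSomeType() *SomeType {
    x := SomeType{}
    return &x
}

// Here the address never leaves the function, so x can live on the stack.
func sum() int {
    x := SomeType{N: 1}
    return x.N + 1
}

func main() {
    _ = NewSomeType()
    _ = sum()
}

Running go build -gcflags=-m escape.go should print something like "moved to heap: x" for NewSomeType and nothing (or "does not escape") for sum, depending on your Go version.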
 
You forgot the best 300 IQ decision: whether parameters are passed by value or by reference is almost entirely up to the compiler. Because missed optimization opportunities for copy elision are a Very Big Deal That Must Be Addressed By A Modern C Replacement
There's no clear distinction between stack and heap allocations when returning, like you would normally have in C. So whether you should use the 1st or 2nd/3rd form to return a pointer to a type is a matter of intense debate to this day.
In order for these nu-langs (Zig & Go here) to improve upon the classical languages, they have to make things easier for the end users. This often means letting the compiler choose what is happening on the lower level instead of forcing the programmer. In most cases it's fine, except in the few cases where it's not. Just using C here is the best and correct option.

Go does not deprecate C, just like C does not deprecate assembly. The Zig and Go designers know this, which is why they (try to) make compatibility with C libraries easy.

My dislike for the systems programming nu-langs (Rust, Zig, Go) is based on how they're floating between low and high level. The designers want their language to dominate, so it has to be able to do everything, essentially becoming both low and high level. These languages are so incredibly complicated, not only in their language features but also internally on the compiler side, that only one compiler implementation can ever exist.

The combination of C and a true high level language (Scheme, JavaScript, Ruby, etc.), embracing Ousterhout's dichotomy, feels enlightening and freeing. Choose the right tool for the job, not some abomination.
 
This often means letting the compiler choose what is happening on the lower level instead of forcing the programmer. In most cases it's fine, except in the few cases where it's not.
That's the nature of most language-related abstractions. I see no reason why they couldn't achieve this in a much less confusing and ambiguous way.

Something I really like about Go (compared to other modern langs) is that it lets you take advantage of the power of C-style pointers without introducing a shitload of memory allocation complexity. Its fancy scheduler and runtime allow for some relatively efficient concurrent GC, which is kinda the core idea of Go: retard-friendly concurrent programming for big performance gains.
Something I really don't like is a lack of justification for seemingly retarded design choices like the first two allocation examples I showed above.
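
Minimal sketch of what I mean (made-up names, nothing fancy): hand raw pointers around, fan the work out to goroutines, and never think about who frees what:

C-like:
package main

import (
    "fmt"
    "sync"
)

// Made-up job type, just to have a pointer to pass around.
type Job struct {
    ID     int
    Result int
}

func main() {
    jobs := make([]*Job, 8) // pointers everywhere, but no free() anywhere: the GC deals with it
    var wg sync.WaitGroup
    for i := range jobs {
        jobs[i] = &Job{ID: i}
        wg.Add(1)
        go func(j *Job) { // one goroutine per job; the runtime scheduler multiplexes them onto threads
            defer wg.Done()
            j.Result = j.ID * j.ID
        }(jobs[i])
    }
    wg.Wait()
    fmt.Println(jobs[3].Result) // 9
}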

Just using C here is the best and correct option.
:thinking:

The combination of C and a true high level language (Scheme, JavaScript, Ruby, etc.)
true high level language (Scheme, JavaScript, Ruby, etc.)
Nigga...
 
The designers want their language to dominate, so it has to be able to do everything, essentially becoming both low and high level.
the more of them come out, the harder they make each other's lives and the less likely each one is to succeed

>some guy makes rust, hopes it will compete with C
>another guy makes zig, hopes it will compete with C
>yet another guy makes odin, hopes it will compete with C

but before actually competing with C they first have to compete with each other for that "the new lang trying to compete with C" spot. so far rust has gotten the furthest (probably because it's the oldest of these nulangs) but who knows what the future will look like?
 
These languages are so incredibly complicated, not only in their language features but also internally on the compiler side, that only one compiler implementation can ever exist.
Rust and Go both have GCC front-ends as well as their self-hosted compilers. Go, as well, has like 15 other implementations, from an LLVM-hosted compiler to the embedded-programming-focused TinyGo. Zig is the only one you mentioned that only has one implementation, and that can be chalked up to it just not being as popular as the other two, nor around for as long.
 
the more of them come out, the harder they make each other's lives and the less likely each one is to succeed

>some guy makes rust, hopes it will compete with C
>another guy makes zig, hopes it will compete with C
>yet another guy makes odin, hopes it will compete with C

but before actually competing with C they first have to compete with each other for that "the new lang trying to compete with C" spot. so far rust has gotten the furthest (probably because it's the oldest of these nulangs) but who knows what the future will look like?
You forgot Fil-C, which is C but with extra checks inserted into the compiled code to halt the program if memory safety violations are detected. While not particularly performant, it has a number of advantages over the others:
  • Everyone can just keep doing their own C/C++ projects as usual, they never even need to take notice of Fil-C.
  • People don't spam their shitty "strcmp-fil-c" projects everywhere because anyone can literally just compile an existing C/C++ project as Fil-C and the memory safety magic is just added on top of the compiled code.
  • You don't have to deal with "coreutils-but-its-in-rust-and-developed-by-retards-and-fucks-everything-up"
 
These languages are so incredibly complicated, not only in their language features but also internally on the compiler side, that only one compiler implementation can ever exist.
This is so incredibly laughable in the face of the C++ standard and the myriad C++ implementations.
 
Fuck it, any good sources to learn Python that aren't a meme? I can't take another array/list and print() tutorial. I want to learn how to do things more "generally" without ending up in spaghetti code that I have to troubleshoot for a whole day just to not fuck up a project or assignment. My strategy of doing "for i in range(...)" for everything is too messy; there has to be a smarter way to do this.

I'm not talking about numerical shit like "c = a + b, print(c)" or how to import a "def function(...)" from another file in the same folder. I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks (with numpy it's fast enough).
 
I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks (with numpy it's fast enough).
Those are all C/C++ libraries wrapped with a Python API.
Regarding the rest, as far as I know everyone just fucks around until it clicks + a few words of generic wisdom you pick up from here and there.
 
Those are all C/C++ libraries wrapped with a Python API.
I'm aware of that; that's why I stick with it, it just works. I don't want to touch any Fortran libraries.

Regarding the rest, as far as I know everyone just fucks around until it clicks + a few words of generic wisdom you pick up from here and there.
So vibe coding until it sticks? God have mercy on zoomer coders!
 
So vibe coding until it sticks? God have mercy on zoomer coders!
Be thankful you have tools that can analyze and provide you with insight into your code. Back in My Day(tm) I had a single source: the QBASIC documentation. And then there was my period doing pro SharePoint work where half the features I used on a daily basis were basically undocumented, even though everything was online.
 
Fuck it, any good sources to learn Python that aren't a meme? I can't take another array/list and print() tutorial. I want to learn how to do things more "generally" without ending up in spaghetti code that I have to troubleshoot for a whole day just to not fuck up a project or assignment. My strategy of doing "for i in range(...)" for everything is too messy; there has to be a smarter way to do this.

I'm not talking about numerical shit like "c = a + b, print(c)" or how to import a "def function(...)" from another file in the same folder. I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks (with numpy it's fast enough).
Agreed. I'm going to start reading Automate the Boring Stuff with Python so I can review a truly up-to-date guide for learning Python (as opposed to what I started with years ago) and comment on it for anyone who wants to learn Python here.
have fun
 
The fact that there are multiple web rendering engines in existence, with a few more on the way, should put to bed the notion that any piece of software is too complex to be replicated.
or just look at vulkan lol
someone really saw the absolute clusterfuck that is OpenGL and thought "we should make something similar, but make everything twice as complicated to get better performance!"
 
Time for some hot takes:
https://youtube.com/watch?v=7fGB-hjc2Gc


(((JewNU)))
As someone who programs in C++, I 100% agree with this video.
Fuck it, any good sources to learn Python that aren't a meme? I can't take another array/list and print() tutorial. I want to learn how to do things more "generally" without ending up in spaghetti code that I have to troubleshoot for a whole day just to not fuck up a project or assignment. My strategy of doing "for i in range(...)" for everything is too messy; there has to be a smarter way to do this.

I'm not talking about numerical shit like "c = a + b, print(c)" or how to import a "def function(...)" from another file in the same folder. I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks (with numpy it's fast enough).
This is going to sound harsh, but here we go.

Then fucking do it, and learn. Define a project and work on it, or fail and quit. It is the only way you will learn. The 4 hours you spend debugging your messy code teach you how to do it better the next time. Now, how messy your code looks is likely because you lack experience, which will come in time. Also, there is a good rule of thumb: if you need to do something more than once, make it a function; if you need the same data again and again with some modifications, use a class.

Now, for the issue of your for loop, the simple dirty secret is that almost all coding is a set of loops and conditions. Yes, we use hashmaps (dicts in Python), search trees, A*, and all kinds of algorithms, but they are the filler in the loops and conditions. When it comes to optimisation, it is more about the innovative application of algorithms to reduce the need for nested loops and other constructs.
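
To make that concrete, here's a tiny sketch (in Go, to match the earlier examples in the thread, but the exact same idea applies to Python dicts): swapping an inner scan for a hashmap lookup turns an O(n*m) nested loop into roughly O(n+m).

C-like:
package main

import "fmt"

// Naive version: for every wanted ID, scan the whole list (nested loops, O(n*m)).
func findNaive(ids, wanted []int) []int {
    var found []int
    for _, w := range wanted {
        for _, id := range ids {
            if id == w {
                found = append(found, w)
                break
            }
        }
    }
    return found
}

// Map version: build a set once, then every lookup is ~O(1).
func findWithMap(ids, wanted []int) []int {
    seen := make(map[int]bool, len(ids))
    for _, id := range ids {
        seen[id] = true
    }
    var found []int
    for _, w := range wanted {
        if seen[w] {
            found = append(found, w)
        }
    }
    return found
}

func main() {
    ids := []int{3, 1, 4, 1, 5, 9}
    wanted := []int{4, 2, 9}
    fmt.Println(findNaive(ids, wanted))   // [4 9]
    fmt.Println(findWithMap(ids, wanted)) // [4 9]
}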

The hard part of coding is NOT writing the code or making it fast enough, but defining the scope of your code in a way that makes it easy to improve, fix, and use with other programs.
 
Time for some hot takes
i agree with most of this video
i didnt really like the hate for header files but i didnt know c++ was this retarded about them (ive never written templated code)
coming from C i just treat header files as more of a table of contents or summary of a given implementation file, very useful for libraries where i dont care what each specific function does but i only care what functions there are
so yeah if templated functions are required to be implemented in the header that makes them cringe and gay
 
i agree with most of this video
i didnt really like the hate for header files but i didnt know c++ was this retarded about them (ive never written templated code)
coming from C i just treat header files as more of a table of contents or summary of a given implementation file, very useful for libraries where i dont care what each specific function does but i only care what functions there are
My main beef with C++ classes is they're a "solution" to an already solved problem. C has structs for objects and the headers largely solve the problem of namespaces and scoping.

Edit: improved clarity.

so yeah if templated functions are required to be implemented in the header that makes them cringe and gay
This fact bugs me more than it probably should lol. Sure, it's all concatenated C++ code after the preprocessor, but this is just gross.
 
I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks
There's no optimizing anything in Python. Python isn't used in simulations and neural networks. Fortran, C and C++ are used in simulations and neural networks, and they make a tard interface for Python programmers that is pretty hard to fuck up using.
 
I'm not talking about numerical shit like "c = a + b, print(c)" or how to import a "def function(...)" from another file in the same folder. I want to learn how it actually works and how to optimize the computer stuff. I know it's a meme that Python is slow, but it's what is used in simulations and neural networks (with numpy it's fast enough).
What you really want to learn is the math/physics behind it, not the programming language. You can then use numpy to do calculations. Or C++ or whatever. The thing about programming is that the syntax is easy; translating ideas into working solutions is hard.
So the best thing you can do is just program what you want. Since you want neural networks you can look up how to do them conceptually using math, and do your own implementation.

i honestly wonder what would be a solution to removing header files while still keeping the summary thing
Documentation?
Header files are terribly antiquated tbh. When I was working mainly with C++ I was also downplaying how bad they are, but after playing with newer languages I see that they just clutter everything, require you to change code in two places, might introduce subtle differences, etc.

My main beef with C++ classes is they're a "solution" to an already solved problem. C has structs for objects and the headers largely solve the problem of namespaces and scoping.
Classes also add polymorphism and inheritance. Although inheritance is disliked nowadays, I think it has its uses.

Also, how do headers solve the problem of namespaces? It's the opposite: there is no namespacing whatsoever. Whatever int foo() is declared in header1.h, if it also appears in header2.h it has to be the same function; otherwise you will get linker errors.
 