Programming thread

Is there a supervisor or anyone to rein in the pajeet incarnate? This isn't even specifically Python fuckery; knowing not to raise "errors" that aren't errors isn't language-specific. A 404 isn't even an error class in commonly used Python HTTP packages, it's an attribute value on a successful response. Errors are things like Internet Lumberjack activity, or AIDS in the pool. (I'd make something to catch the "expected" 404, too, to better tell it apart from when whatever it is you're scraping fucks with their URLs.)
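Something like this is all it takes, roughly (a sketch with requests; the URL and handling are made up):

```python
import requests

# Hypothetical URL; the point is what a 404 is and isn't.
resp = requests.get("https://example.com/things/42")

if resp.status_code == 404:
    # Expected outcome: the thing just isn't there. Not an exception,
    # and definitely not something to page anyone over.
    data = None
elif resp.ok:
    data = resp.json()
else:
    # 5xx, auth failures, etc. are the ones actually worth raising.
    resp.raise_for_status()
```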
Unfortunately, this is sorta his wheelhouse for the time being.

We started off with a Django project that he was responsible for, and it used Celery to launch jobs that I wrote and was responsible for.

Now gradually my jobs are being folded into his project and I'm having to work with all the nonsense that he wrote already.

At some point I'll propose some big architectural PRs to clean up his existing code, but I'm just really not in the mood for it this afternoon, I guess.

Like our storage module is really sloppy and there's a lot of automagical assumptions in it. I'm sure it's convenient if you're both the author of the module and its sole client but that's changing now.
My "junior" coworker sometimes errs on the opposite end of the scale and adds retard insurance against crashes where it's not needed. For example, I would not be adding a runtime check for or forcing a datetime to be timezone-aware. The one thing I do check for is whether a reusable function that's meant to run in a database transaction (but will "successfully" run and corrupt data outside of one) does in fact run in a database transaction.
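Concretely, the check is about one line (a sketch assuming Django; the function itself is made up):

```python
from django.db import transaction

def _transfer_balance(src, dst, amount):
    # Hypothetical helper that's only safe inside an open transaction:
    # outside of one it would "work" and quietly corrupt data.
    if not transaction.get_connection().in_atomic_block:
        raise RuntimeError("_transfer_balance must run inside transaction.atomic()")
    ...  # the actual multi-row updates go here
```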
Yeah, I have a similar attitude. I've written a lot of code that is superficially sloppy and looks like it's completely devoid of checks, but I've already done all the checks up front in the main entrypoint.

It's like that joke:
“The patient says, "Doctor, it hurts when I do this."
The doctor says, "Then don't do that!”

I make it productive by insisting (even if just to myself) that there's some really obvious visual identifier that indicates something's special about the function. Extra underscores in the name or some special keyword in the name, things like that.

Since I've been doing more Python at work for the past few months, I prefer to import the module and use it as the prefix, and I avoid doing from foobar import blah_blah_blah, unless blah_blah_blah has a very clear name and it's still understandable thousands of lines away from the top of the file.
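Concretely, I mean the difference between these two (trivial sketch with a stdlib module):

```python
# Preferred: keep the module as the prefix, so the origin is still obvious
# thousands of lines away from the import.
import os.path
out = os.path.join("data", "report.csv")

# Avoided, unless the bare name is unambiguous on its own:
from os.path import join
out = join("data", "report.csv")  # join of what, exactly?
```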
 
  • Informative
Reactions: Safir
Apparently it still pays pretty well.
So does COBOL xd

Ok, that's not a fair comparison. It's mostly that I have a visceral disgust reaction when I need to look into any Java, even something heavily templated with a lot of boilerplate already bundled into an annotation or whatever (I struggle with even a basic Spring-based codebase). I'm not exactly at a point where I'm allowed to have an opinion on OOP design patterns and jump on the hipster hate wagon for it, but I just cannot wrap my head around Java and how you are forced to encapsulate and nest everything into classes and dependencies.

What kind of stuff was particularly valuable for your papers? Beyond general university-level math skills, were there specific tools or language libraries or whatever, or was your work focused on the math and theoretical implementations of ideas?
 
  • Feels
Reactions: Kiwi & Cow
So does COBOL xd

Ok, that's not a fair comparison. It's mostly that I have a visceral disgust reaction when I need to look into any Java, even something heavily templated with a lot of boilerplate already bundled into an annotation or whatever (I struggle with even a basic Spring-based codebase). I'm not exactly at a point where I'm allowed to have an opinion on OOP design patterns and jump on the hipster hate wagon for it, but I just cannot wrap my head around Java and how you are forced to encapsulate and nest everything into classes and dependencies.

What kind of stuff was particularly valuable for your papers? Beyond general university-level math skills, were there specific tools or language libraries or whatever, or was your work focused on the math and theoretical implementations of ideas?
Data structures, algorithms, and discrete mathematics.
 
Also, apparently Python's type annotations are 100% meaningless? Like they're essentially comments? Fucking wonderful.
Python's type system is a crime against humanity, and type hints are a clear example of why.
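They are literally just metadata; this runs without a peep (a minimal sketch):

```python
def double(x: int) -> int:
    return x * 2

# CPython stores the hints but never checks them at runtime.
print(double("lol"))            # prints "lollol", no error
print(double.__annotations__)   # {'x': <class 'int'>, 'return': <class 'int'>}
```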

What kind of stuff was particularly valuable for your papers? Beyond general university-level math skills, were there specific tools or language libraries or whatever, or was your work focused on the math and theoretical implementations of ideas?
Without doxing myself, a lot of my work was intricate graph theory trickery to optimize data transmission in mesh networks, like in many parts of telecommunications. Calculus helps a lot here.

My interest in cryptography also got me into consensus and methods for Byzantine fault tolerance.
 
Since I've been doing more Python at work for the past few months, I prefer to import the module and use it as the prefix, and I avoid doing from foobar import blah_blah_blah, unless blah_blah_blah has a very clear name and it's still understandable thousands of lines away from the top of the file.
This is how every competently managed Python project I’ve been on has done it. The problem, of course, is “competently managed.”

The “type annotations” Python added are pretty much indefensibly dumb, IMO. To get any use out of them, you need to actually run one of the type checking libraries like mypy, and the only reliable way to remember to do that, IME, is to include linting commands that run mypy, black, etc. in your Makefile, and run those in your CI or whatever. This works fine, but if you don’t set up your repo for that early on, good luck ever doing it.

I haven’t seen anyone seriously try to use them with Python’s data science/ML libraries either, because knowing a function returns an ndarray or DataFrame or whatever isn’t very useful: what’s the shape of that array? What columns and data types does the DataFrame have?

Also the whole thing where the typing module exists, but you can do list[str] or typing.List[str], is really fucking stupid. What happened to “there should be one obvious way to do it?”
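The duplication looks like this, for anyone who hasn’t hit it (quick sketch; both forms pass mypy today):

```python
import typing

def shout(words: list[str]) -> list[str]:                      # fine on Python 3.9+
    return [w.upper() for w in words]

def shout_again(words: typing.List[str]) -> typing.List[str]:  # also fine
    return [w.upper() for w in words]
```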

I like Python a lot, but I went from a team that did their best to define and adhere to best practices, to one that didn’t give a fuck, and that made me way more understanding of the criticisms of Python. At least the new company pays more.
 
if you add type annotations to your dynamic language at least have the common fucking decency to have some fucking brains and cause a runtime error when the types mismatch because then you can have more well-behaved code and the jit can optimize with the extra static information, literal win/win

oh wait python is fucked and it can't do anything fast to save its life because it chose the slowest possible design decision 9 times out of 10

static typing is the white man's typing fuck dynamic niggers they can enjoy javascript type system prison for eternity
gradual typing is ok but only if the language itself knows about it and it's not some gay linter with extra steps (fuck you)
 
Python's type annotations are 100% meaningless?
Holy shit I actually get to give you a bit of advice about something I know about.

Use Numpy for everything you can. It forces the elements in arrays to be strongly typed. Numpy isn't the best for processing strings, but even then it's not any slower than plain Python.

Numpy is basically all written in C, so it is usually faster too.

I wouldn't use Numpy to replace F-strings either but it is pretty good for just about everything else.
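Quick demo of what I mean by strongly typed (minimal sketch):

```python
import numpy as np

arr = np.array([1, 2, 3])      # dtype is inferred as an integer type, e.g. int64
print(arr.dtype)

arr[0] = 7.9                   # cast down to 7 so the dtype stays intact
print(arr)                     # [7 2 3]

try:
    arr[1] = "not a number"    # can't be coerced to an integer
except ValueError as e:
    print("strongly typed:", e)
```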
Although I don't think I need to use lists here, that's how I started doing it while experimenting.
Same advice if you are trying to learn programming, or you just want to do anything. Numpy functions are actually a really great way to think about procedural operations.

Numpy is usually faster, too.

Feel free to ask questions. The programming thread here is just perfect, not too big, not too small. Just active enough to get replies, but not so active that people here say things they don't believe for attention.
Should I go back to college for linear algebra/mathematics or is college a scam even for math-related degrees? I'm 25
I don't think college is a scam, though don't go somewhere expensive. I don't know what the job market is like for SWEs who don't get degrees, but if you want to learn math on your own, I suggest finding something you find interesting that requires math to understand, figuring out how it works, and doing that for every math concept you need to understand.

If you need some pointers (pun intended) for stuff like differential equations, basic linear algebra, or even basic calculus, I can point you in the right direction.

Know though that you need to be good at algebra and the way to be good at algebra is just to do a whole bunch of it.
I have been learning how to code for less than a year and I'm honestly not really that fucking good at it yet but I've been getting somewhat competent and I already have a client who is paying me to make a website for their business.
Awesome. Real world experience is great. Building something real is great.
I have really bad fucking anxiety after seeing a shooting in the workplace
Anxiety is something shy women get when they occasionally need to talk to people. You got Shell-Shocked from a downright insane and crazy event. Don't feel bad about it.
What is the quickest and simplest way to set up React in VS Code? Create React App is fucking gay and pozzed. Hell, I don't even like React, I'm just learning it to help my job prospects. I learned the basics of React on freeCodeCamp, and as helpful as that website has been for learning the basics of webdev languages, none of the lessons actually teach you how to set anything up in VS Code.
I'll say learning React is kind of eh. I'd advise against learning React if you are doing it to try and make your resume look better.

This goes for resume building and general advice. If you have free time, try and build something you love, and if you can't, try and build something you think is kind of neat or cool.

Learning stuff is a pain in the ass and is real work, and if there isn't a light at the end of the tunnel it can lead to burnout.
 
An actual "linear" distance between two generic vectors is computationally expensive because it involves the square root.
However, what we had were unit vectors, each with a length of 1. A much cheaper proxy for distance in that case is the scalar product (multiply dimensions pairwise then sum up).
Incredibly pedantically splitting hairs, but can you really call it a proxy when it is practically the distance itself lol ( |a-b|^2 = |a|^2 + |b|^2 - 2*a.b = 2 - 2*a.b for |a| = |b| = 1)
 
That's such a FAANG mindset. Just because Big Tech only cares about scaling doesn't mean that raw speed isn't important for the rest of us.

I was responding directly to a statement about cloud scalability.
I don't know exactly what you refer to as "raw speed", but I think what I say still applies. In real world situations, the speed of your business logic is very rarely the bottleneck.
From a company's point of view, there are really only two main phases in development. When you are starting up, development speed is the most important variable. It doesn't matter how slow your code is, or how much tech debt you accrue in the process of building your MVP. You need to push out a product with the amount of funding you have. At some point, if you are successful, you need to scale your product to meet the demands of your new customer base. At no point in this process does the speed difference between JS and C++ factor in. It's not a FAANG mindset. It's the reality of growing your business from startup to success.
 
  • Agree
Reactions: Marvin
Incredibly pedantically splitting hairs, but can you really call it a proxy when it is practically the distance itself lol ( |a-b|^2 = |a|^2 + |b|^2 - 2*a.b = 2 - 2*a.b for |a| = |b| = 1)
I know. I call it a proxy because there's an expensive square root involved, and the whole massdebate was about (not) calculating the square root.
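For anyone following along at home, the two quantities really are joined at the hip (a small NumPy sketch with made-up unit vectors):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=8)
b = rng.normal(size=8)
a /= np.linalg.norm(a)          # make both unit vectors
b /= np.linalg.norm(b)

dist_sq = np.sum((a - b) ** 2)  # |a-b|^2, no square root taken
dot = a @ b                     # the cheap scalar product

# |a-b|^2 = 2 - 2*a.b for unit vectors, so ranking by dot product is the
# same as ranking by distance; the dot product just skips the square root.
print(dist_sq, 2 - 2 * dot)     # the two values match
```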
 
Use Numpy for everything you can. It forces the elements in arrays to be strongly typed. Numpy isn't the best for processing strings, but even then it's not any slower than plain Python.

Numpy is basically all written in C, so it is usually faster too.
I like that the only good way to use numpy is as a wrapper for C. It's a culmination of dumbing down coding and standards once enough processing power was reached and it was time to fire good engineers and replace them with the shittiest of Pajeets.
 
Two hours later, I have confirmed that the fix is working just fine. The 404 is actually because my coworker is caching stuff on a file storage REST API (sorta like s3), and when we read from the cache, if it's a cache miss, we ignore it. And that's fine, that's the point of a cache. Except he logs the error as an error, without distinguishing between "cache miss" and "something is broken, investigate this".
Wait, so someone asking for something he doesn't have is an error?

Boy it sure would suck (for him) if the internet started DDoSing his log file by requesting random crap he doesn't have.

404 is the definition of a "this is a you problem, not a me problem".
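All it would take is something like this (rough sketch; the storage client is hypothetical, assumed requests-style):

```python
import logging

log = logging.getLogger("cache")

def read_cached(storage, key):
    # Hypothetical read against the file-storage REST API used as a cache.
    resp = storage.get(key)
    if resp.status_code == 404:
        # Cache miss: routine, expected, a debug line at most.
        log.debug("cache miss for %s", key)
        return None
    if not resp.ok:
        # This is the only case that deserves an error-level log.
        log.error("cache backend returned %s for %s", resp.status_code, key)
        resp.raise_for_status()
    return resp.content
```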
 
Any windows driver fags here? Looks like Microsoft are pushing for Rust in the kernel. Link

I have absolutely no idea how things like kernel synchronization primitives will translate into Rust though.

Honestly, I really don't see an awful amount of value in Rust drivers. A good driver should be thoroughly tested through all routes with Driver Verifier, and there are plenty of established tools to detect leaks, tagging, etc.

I also find that with my drivers, C++ is more than adequate to handle resource lifetimes for me in ctor/dtor.


So what value am I really getting from creating new drivers in a language that is hacked around C anyway, and that is also developed by pajeets and troons?

Edit: lol at supporting shit everyone uses like this Link
 
Dave really makes me reconsider Total Boomer Death. What is it with pretty much every tech Gray Beard being lovable, humble, and generally appreciative of the opportunities they were given through the dotcom boom or the starting point of monolithic companies like Microsoft, completely the polar opposite of the greedy meritless boomer stereotypes? Is it just because you couldn't lie and BS your way to the top of that field?
 
Dave really makes me reconsider Total Boomer Death. What is it with pretty much every tech Gray Beard being lovable, humble, and generally appreciative of the opportunities they were given through the dotcom boom or the starting point of monolithic companies like Microsoft, completely the polar opposite of the greedy meritless boomer stereotypes? Is it just because you couldn't lie and BS your way to the top of that field?
I love his Twitter account, he's spending his retirement just sperging out over old hardware. Guy's got a PDP-11 in his garage.
 
  • Feels
Reactions: Marvin and y a t s
Sounds like a recipe for massive driver binary sizes and bloat :|

I hope I'm wrong.
The reason for Rust binary size is that it statically compiles dependencies into the binary, which is great for portability (in userland) and not great if you are concerned about binary size. The reason C binaries in userland are small is that they are dynamically linked to most of their dependencies. In a #[no_std] environment such as a kernel driver, you don't have the luxury of a standard library, so that at least will not be compiled into the driver, but all the other dependencies will be. So you will get larger kernel drivers all packaging different versions of the same serde code, introducing more risk to the supply chain of privileged execution modes within Windows.
I can't say this is surprising, given that Microsoft took Developer Studio from being imperceptibly fast on 20-year-old hardware to taking whole seconds to do exactly the same thing in the latest Visual Studio IDE on a modern workstation, or shipping the new file manager with high input delay in Windows 11, or delivering niggercattle ideology updates via the little graphic in the search box on the taskbar, or shipping spyware AI features that screenshot your desktop constantly. MS has a track record of being niggerlicious with Windows, so this is expected when implementing Rust support for kernel drivers as well, which should essentially just be API and struct FFI wrappers.
 