Programming thread

Fuck Electron and everything ever built with it.

There aren't really any easy answers to multi-platform development for such a wide range of platforms (unless you count game frameworks like Unity, but games aren't really expected to behave like standard applications so much). But that's kind of the point. Windows applications behave quite differently than iPhone applications; they have different input methods, different UI/UX expectations, and so on. If you try to make an application which works the same on both, you end up with something that's both a shitty Windows application and a shitty iPhone application. You know, like every single Electron application.

The "correct" approach is to abstract the core functionality of the application into a separate "module," and then write custom "wrappers" around it for each target OS, which interact with the core module and present its input and output in a way optimized for that OS. But developers are lazy and too many of them don't care about their users, so now we have shit like Visual Studio Code, where you can run basically a whole other web browser instance for your code editor with a shitty UI… what the fuck?
My favourite was when my job chastised me for using Notepad++ instead of VS Code (both installed) for development, despite them being identical in functionality except that one takes up 600 MB of memory with no tabs open.

As for UI, I'm trying to write my own OpenGL UI system so that it works seamlessly across all platforms. Sucks if you don't have a graphics card though...
 
Trying to learn R is like pulling my own fucking teeth out with pliers. Are all programming languages this bullshit? In the space of four weeks we've gone from making R add 2 and 2 to something called a GLM, and an ANOVA. Anyone got any tips on learning R? Every time I try and google something on it I start out again with basic stuff like 'this is how R is a calculator' and then it zips by almost instantly to 'and this is R plotting the moon rocket trajectory'.

Like a literal 'I am retarded' flipbook-style tutorial with a step-by-step breakdown.
 
It sounds like you may have only limited experience with statistics prior to this? If so, 'learning R' or 'learning SAS' or whatever is kind of a waste of time; you need statistical knowledge that at least matches what you're trying to achieve in the programming environment to do anything useful, even if this is a class you never intend to make use of in the real world.

I believe a troon who teaches at an Australian university published their course notes for a course that actually kind of does this, as 'Learning Statistics with R'. It's freely available and I've heard it's at least somewhat useful. Would be worth having a look at that.
 
Thanks for the recommendation on the course notes.

I'm doing a degree that requires statistics (genetics), and what was supposed to be covered in 3rd term just wasn't, because of the wu flu. So I've been scrambling to try and crash-learn the needed information, as we've just had a bunch of worksheets about some (to me) advanced statistical analyses dumped on us and we're expected to just do them week by week in R. Which is fun.
 
There has to be some international agreement to teach R like shit. We had to learn it too, because we had a mandatory statistics class, and all we got was "here's an article with code samples, translate it, run the code samples, and take screenshots." That was our project for the semester. In a university we had to pay for.

As for all programming languages: there's a wide scale from zero bullshit to absolute bullshit, because there are as many languages as stars in the sky. I don't know where I would put R, since all I did was copy-paste code, but having its own studio which generates all the output on the fly is handy.
 
I had to teach R to an older relative last year lmao! It's a shit language from a language point of view, but it roughly accomplishes its job as a stats tool. Honestly R makes me wish people would just use Python instead, which is entirely out of character for me.
The worst part is the "magic" shit done in the name of convenience, like how an operation on a vector and a scalar automatically converts the scalar to a vector. Obviously you could try to explain that as function overloads, but R has a pretty loose type system, so that doesn't really suffice. Another issue is that most APIs don't seem to do any input validation, so if something is subtly wrong with your data, or even just with the type of your data, you'll get an incomprehensible error from somewhere deep in the algorithm.

@Johan Schmidt If you feel like you're just piecing things together without really understanding why, don't feel too badly, from what I can tell that's how most R end users operate anyway. In order to feel a little more comfortable with the language, the most important things to understand are basic concepts like values, functions, and expressions. Honestly you don't need to understand how the specific analysis techniques work in order to use them, but if you can understand how data flows between functions on a conceptual level then you'll at least be able to reason about what your program does.
And no, not all programming is this shitty lol
 
@Johan Schmidt If you feel like you're just piecing things together without really understanding why, don't feel too badly, from what I can tell that's how most R end users operate anyway.
This is 100% how most statisticians operate in practice. Even the ones that are decent at programming still seem to turn their code into monolithic messes of dplyr pipes.

Anyone know a good resource to learn Matlab?
The book I used as a 'guide' when I learned Matlab was "Matlab: A Practical Introduction to Programming and Problem Solving". But honestly I think I learned more by just coding things up during the uni classes that used it. Which is a crapshoot, since Matlab rivals R in terms of how badly it's typically taught, and in terms of how many professional mathematicians/statisticians who "know Matlab/R" seem to treat it as just a StackOverflow code parser.
 
What's the opinion on Electron? What are the other options for cross-platform app development that allows deployment on windows, linux, mac, iphone, android, etc.
It definitely doesn't replace native development for desktop and mobile apps, but it has some good use cases where a web app is the primary product but some extra (usually hardware-interfacing) functionality is needed.

An example would be if a bakery wanted their website totally redone with the ability to order, set pickup times, and manage all their other e-commerce shit through the site. But they also wanted the site to generate the barcode labels to print for the orders created, and the ability for the cashier scanning the barcode when someone picked up the order to mark the order as completed. Being able to just quickly port the existing code to Electron saves a web development company a ton of time, and making the desktop app gives an infinitely easier way to connect and interface with the scanning and printing hardware.

Is it ultimately a shittier solution than building something natively? Absolutely. But its usefulness comes in either when a codebase is so fucking massive that maintaining individual separate builds for each platform is impractical (like Slack, etc.), or for smaller, primarily web-development-focused projects/agencies that would otherwise just probably not do it, due to development time or the impracticality of hiring/training a desktop app developer when the vast majority of their work is web focused.

In terms of the way of the future for cross-platform app development, the next big thing (IMO) is going to be Flutter. Google has dumped a dickload of effort and development into it, and it shows. It's fast as hell to develop with, but it doesn't paint you into the corner that React Native kind of does in terms of matching native capabilities on mobile. Also, Dart is straightforward enough that it's pretty simple to pick up regardless of what stack you're coming from.

But if this is the thread where we just dunk on Web Development, then I'll say they're all shit and anyone that doesn't do all their development in assembly has room temp IQ.
 
Flutter is also important because it's deeply integrated into Google's future effort to overthrow Windows: Fuchsia.
I think Windows will always have a place for businesses with horrible legacy systems, but people are getting fed up with it, and I think Google is gonna capitalize on that so people move to their infrastructure rather than a *nix. Google is already on everyone's phones, so I don't think many people would think twice about it.

I don't like it because I hate google and all that they represent (and their tendency to randomly drop shit when they get bored) but a big business I have a line into is currently developing a new version of their services in flutter.
 
So I recently figured out how to elegantly use partial orderings to order asynchronous tasks. Basic partial orderings are binary relationships in the set {Unordered, Before, After}, and with this you can do a fair bit, but it doesn't really let you specify any asynchronous relationships. Basically I want to be able to specify things like "A runs while B runs", and to this end I added the relationships Within and Without. Within specifies that one task runs while another runs, in the sense that the first starts after the second starts and ends before the second ends; Without specifies the opposite.

You can kind of imagine this model like stacking blocks: some blocks must come before or after others, and some must come atop or below others. The ones on top of others have to be either smaller than or the same size as the ones below, and can't overhang.

Now, it's important to be able to check that your ordering doesn't have any loops, as a loop could cause the program to hang. For the simple model with just the three relationships this is easy: you build a graph and follow the edges of one of the relationships through it, looking for loops. What are the implications of "A is within B", however? Well, it's solvable, but the solution is kind of absurd and inelegant. Furthermore, it'd be nice to be able to specify more specific things like "A starts after B starts" (half of the Within relationship), or "A ends after B ends" (better thought of as specifying that B must finish first).

A week ago I had an epiphany and realized that if I redefine the problem I can write a better, faster, and more flexible algorithm! What you need to do is separate the blocks into their ends, recognizing that the beginning of a block is implicitly before its end, and vice versa.

At this point you can just remove the Within/Without relationships, as they are redundant, and then you can use the naive loop detection algorithm without having to worry about them, so long as you account for the implicit relationship between a block's start and its end. This also enables more interesting relationships like "A starts before B ends"; the only unfortunate side effect is that you now have to specify which ends of the blocks you are referring to, as well as before/after.

It's also pretty easy to actually iterate through the graph in partial order. All you have to do is loop through all the blocks and count the relationships: first how many things must happen before this block starts, then how many things may start after this block starts, how many things must happen before this block ends, and how many things may start after this block ends. Keep a queue of things that may currently execute, and initialize it with the things that have zero things before them. Every time you take something out of the queue, you notify everything that is waiting for it to start; then, after you're done with the item and with everything that must happen before its end, you notify everything that comes after it. Once a block has nothing left that must happen before it, you put it in the queue. It's all a matter of incrementing and decrementing a couple of integers if you do it right.


I'm doing this all in service of creating a game engine whose main loop is entirely asynchronous, and to that end you could just have numbered steps in the loop like "Physics Prepare", "Render Scene Synch", "Early Logic", "Late Logic", etc. But in truth this is a cumbersome and inaccurate way of doing things (what if something is in early logic, but you need to execute something else first?), so it makes much more sense to me to instead order things relatively: "A starts after Logic starts, A ends before Logic ends", "B starts after Logic starts, B ends before A starts".
 
I'm doing this all in service of creating a game engine whose main loop is entirely asynchronous,
Neat, I like it. It reminds me a bit of a self-organising version of AWS Simple Workflow or Cadence.

In those systems, you have a central decider (aka scheduler) process which yields the next step to execute, and something else is responsible for being notified of that and executing it. You can do other things like start delay timers before the next step or waiting for a signal of some kind.

It works very well in ecommerce for example, in which you want to make sure you have charged the user's credit card before sending the order to the warehouse, and then when someone has picked the item that's a signal that yields another decision and so on.

I've never considered it for something like a game engine, where you've got a limited capacity to do your processing in between frames, but it could be an elegant solution for that as well.

You're making me want to spend my weekend making a game engine instead of doing anything productive now.
 
I recently started playing DOOM after all these years and I found out about the entire world of Source ports and yada yada. It piqued my interest and I'm thinking of making my own, just so I have a project under my belt.

Now my main reason for bringing this up is because I really need and want to learn C, but I don't feel confident using tutorialpoints specifically for my cause, so is there anywhere else that isn't autistic that I can learn the basics and then branch out from there?
 
A good way to do both at once is to find a raycasting tutorial, although the game will be more like Wolfenstein 3D than Doom. There was a good course for free on Udemy a while ago.
 
Thanks man, I'll look further into it and build everything up from Raycasting if I can.
 
I have not read it myself but "The C Programming Language" book is regarded very highly when it comes up in discussions and it's pretty short too, ~270 pages. Granted the 2nd edition is 30 years old but C isn't a highly evolving language.
 
Hey, got some programming employment related questions, and I figured this is probably the best place to ask:

1. Anybody have experience with job offers contingent on passing a background screen (like through HireRight)? I submitted all the background check info at the beginning of last week, but still haven't heard anything back yet. There are no massive skeletons in my closet, but it's a bit nerve-wracking.

2. I've never done a panel interview before, and I have one coming up next week and I feel a bit nervous. Any tips or ways I should prep? First half of the panel interview will be with senior management, second half will be with some senior SWEs.
 
I've interviewed with 3 people at least twice (one time, they also had two people WATCHING), so here are my insights. They're probably obvious, but there's not much to say really; it's not that different from a regular interview. Possibly easier, as it gives you more opportunities to leave a good impression.

1: Obvious but ENGRAVE THEIR NAMES IN YOUR MEMORY. Address them by name in the next part (But not too frequently. Make it natural)
2: You gotta bounce off of ALL of them. Repeatedly. (Although generally they'll tag themselves in and out) If you find someone's being left out, find a natural way to include them in the conversation. Remember: They're assessing you on whether you can communicate effectively and initiate difficult conversations.
 
Remember: They're assessing you on whether you can communicate effectively and initiate difficult conversations.
I think that's more than a bit optimistic, but your advice to include everyone in the conversation, and by name, can make it true.

Feel free to avoid this paragraph of blackpilling: that assumes the interview will be a fair and legitimate one. You can't know that ahead of time, and should accept that some fraction of your attempts to get a job never had a chance of winning. To take one example especially relevant in this plague year: maybe you'll be dealing with sincere people, but a hiring freeze will be declared before they can get you through the system. Don't take it personally as a judgement of yourself; move on to future opportunities. Never get too sold on your prospects for a particular job until you have an offer letter (or are they all emails now?), and keep going full bore with your job search.

For general strategy, I like the one from Ask the Headhunter. In this narrower context of an upcoming interview, the salient point is to demonstrate how you can add value to the company, which entails research into it and, if you know who's interviewing you, into them. Check the site for more on how to choose and pursue specific opportunities; the TL;DR is to always research the company, and to do your very best to get the ear of a hiring manager, someone who can actually make the decision. Playing games with HR, shotgunning resumes, and getting them subjected to automated scoring systems using machine learning/AI etc. are not productive (if you do the latter, at minimum write a custom and focused cover letter; that's pre-ubiquitous-Internet advice, so adjust as appropriate, and no doubt the link provided will have updated advice on that and other things to avoid). In general, avoid scrums where you're competing with hundreds or thousands of people, most of whom can't actually program.
 
Someone on this thread recommended this to me and it's my responsibility to pass it on to you: c tutorial
 
With C, the most important things are thus:
Function pointers (For polymorphism)
Prototyping structs so they can reference themselves in internal declarations
The void pointer which can be cast to anything (polymorphism 2)

Also, memory is very important to learn. Learn malloc, memset and the dereferencing rules and you pretty much know all the important stuff.
 