Programming thread

5. Shotgun your resume anywhere you think you'll even be a semi-fit. Prereqs are just a suggestion in this field and with the labor market the way it is, companies are already going to be hiring you with the expectation that you'll need to be trained (most of the time anyway).

@Piss Chemist This is more important than people realize. Some of the best jobs I've gotten were from listings that I didn't fit the prereqs for. Cast as wide a net as possible when looking for work! Don't be discouraged if you get rejected fifty times, it's worth it for that one place you'll really click in!
 
Spent most of Thursday at work trying to solve an issue where I need to record 8 hi-def cameras at 90+ FPS using ffmpeg. The main bottleneck was encoding.

I managed to do it on an "old" computer with two GeForce 2080s by encoding most of the cameras on the GPUs (which was bullshit by itself, since Nvidia caps the total number of encoding sessions regardless of the number of GPUs), only to then transfer the code to a PC with two 3090s and a beefier CPU, where it doesn't fucking hit the target FPS.

I tried installing a newer version of ffmpeg, nothing. So either Nvidia fucked around with the GPU encoders (not likely, since things only slow down once I use any CPU encoder), ffmpeg doesn't support the new GPUs (more than likely), there's some magic Ubuntu line I need to call to remove some arbitrary hard-coded memory transfer limit, or something's wrong with how the PC was built. And before anyone asks, both PCs should have the same Ubuntu version and the same version of every piece of software.

Cases where something works on one PC only to fail on another are the most frustrating shit ever.
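For context, a single-camera version of this kind of capture looks roughly like the sketch below. The device path, resolution, framerate, and bitrate are all placeholders, not the actual setup; the point is just that the `-c:v h264_nvenc` selection is what keeps encoding off the CPU.

```shell
# Sketch only: capture one V4L2 camera and encode on the GPU via NVENC.
# /dev/video0, 1920x1080, 90 fps, and 8M are placeholder values.
ffmpeg -f v4l2 -framerate 90 -video_size 1920x1080 -i /dev/video0 \
       -c:v h264_nvenc -preset p1 -b:v 8M cam0.mp4
```

Multiply by eight cameras and the consumer-driver NVENC session cap becomes exactly the wall described above (datacenter cards are generally exempt from it). Running `ffmpeg -encoders | grep nvenc` at least confirms whether the build sees the GPU encoders at all.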
 
Yeah, by not being a wagie I meant more pay-wise; I don't imagine it'll be somehow more glamorous, just hopefully better-paid misery. Thanks for the in-depth answers. I probably should have assumed the more languages you know the better anyway, and that plan really helps. I'm assuming a couple years minimum before anyone would even look at my resume, so I'm not in a rush or anything, just wanna try and lay good groundwork, I guess. I have no idea what field I wanna end up in, so for now I'll just start learning and figure out what I like.
 
Yeah, by not being a wagie I meant more pay-wise; I don't imagine it'll be somehow more glamorous, just hopefully better-paid misery.
The work is generally interesting and rewarding, so it doesn't feel as dead-end as your typical wagie job. But having to run at near 100% of your mental capacity almost every day gets exhausting after a year or two. That's why a lot of old hands will tell you to undersell yourself and not tryhard during your first few months at a job - set expectations low so you have the ability to check out mentally when you need to.
 
Makes sense. I'm used to checking out hard as soon as I clock in, so that would be an adjustment.
 
One of the biggest problems I've seen as a computer forensics student is how outdated the course material is. In my C class we're expected to use outdated-as-fuck crap like graphics.h. From my self-study there's really no reason to teach in C when C++ is just as relevant and more usable, and Java and Python are just as easy; from what I've found it's mostly due to the book publishers catering to Indian students.
It's dumb, but don't expect to be taught up-to-date stuff; the main takeaway is teaching yourself to teach yourself and being flexible. Everywhere and everyone does stuff their own way, so you'd have to be flexible anyway.
Teach yourself the stuff you care about. At least with computers you can do that pretty easily for low or no cost.
I'm looking to not be a wagie anymore and get a programming job someday. Any advice on where to start? I'm starting from pretty much zero and don't really know where to begin language-wise, or what to aim for to get a job without a degree.
It sounds like you want a white-collar job more than anything. I'm a "literally who," but I've been in a few areas; I can only speak from an American perspective, but cyber security is on the rise, cloud stuff is increasingly important (security, engineering, management, etc.), analysis with a technical lean, database everything, and the list goes on for tech-y white-collar jobs.
With that said, it's good to understand C/C++ at a basic level for nearly everything, but practically speaking you'll probably never use them in most jobs.

Java is extremely common in enterprise, Python increasingly so (definitely learn it, just not as a first language imo), JavaScript and all the webdev shit, and Perl will be around forever.
Query languages (SQL, KQL, Lucene, etc.) and their respective databases are worth learning and playing with. Start with SQL.
Git (the utility) basics are a must.
Stack Overflow surveys aren't actually that representative as far as I'm concerned, but they can introduce you to stuff you may want to explore: https://survey.stackoverflow.co/2022/
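To make "Git basics" concrete, the entire beginner loop is a handful of commands. A minimal local sketch (repo name, identity, and file are all placeholders):

```shell
# Minimal local git workflow: new repo, one tracked file, one commit.
git init demo-repo && cd demo-repo
git config user.name  "Your Name"        # commit identity (placeholder)
git config user.email "you@example.com"
echo "hello" > notes.txt
git add notes.txt                        # stage the file
git commit -m "first commit"             # record the snapshot
git log --oneline                        # one line: the commit you just made
```

Branching, merging, and remotes come later; being fluent in just init/add/commit/log already covers most day-one expectations.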

There's no reason to pay for anything starting off unless you prefer physical books, though it's worth checking your library's site for good reference material. Official documentation is all online and libgen lets you pirate books if you want to.


As for a first language, I may get shit for this, but start with C and either The C Programming Language by K&R (the classic) or C Programming: A Modern Approach by King. Make your own data structures and algorithms from scratch and get comfy with pointers, then move on to Java, C++, or something.
If you end up hating actual programming, the experience is still valid for exploring other areas.

ps: autistic, but Linux familiarity is also often good in every domain mentioned.
 
Going from the other posts I was gonna look into C#, JS, and Python in the beginning, but I'll give C a look as well. Never really messed around with Linux; it always seemed interesting, though.
 
As someone who does training for entry-level people: telling someone to start with C is the dumbest shit ever.

The key is an observable feedback loop. A new programmer needs to see how their changes are reflected in the output of the program. That's one reason web dev is a traditional entry point: you make a change to the HTML or whatever and can see the result immediately.

Only the most concentrated autists can handle going straight into "backend" languages. There's this huge cliff where, after learning how to make it say hello world on the console, they have no idea what the real-world application is.

Java is still big. Anyone who tells you Python is taking over for it doesn't crank out microservices on the regular for a Fortune 30.
 
Java is still big. Anyone who tells you Python is taking over for it doesn't crank out microservices on the regular for a Fortune 30.
Sure, but "not working for a Fortune 30" describes the vast majority of programmers, dude.

I hate it as much as you do (I'm not a fan of Python) but pretending that Python isn't everywhere and still proliferating is pure cope.
 
Sure, but "not working for a Fortune 30" describes the vast majority of programmers, dude.

I hate it as much as you do (I'm not a fan of Python) but pretending that Python isn't everywhere and still proliferating is pure cope.
I'm not saying don't learn Python. I'm saying the use cases for when you want to use Python and when to use Java don't really align, and telling me Python is on track to replace it would be a huge red flag coming from anyone I was interviewing. (There are other languages that are better set up to be Java replacements.)
 
As someone who does training for entry-level people: telling someone to start with C is the dumbest shit ever.

The key is an observable feedback loop. A new programmer needs to see how their changes are reflected in the output of the program. That's one reason web dev is a traditional entry point: you make a change to the HTML or whatever and can see the result immediately.
Sure, and webdevs are hated the world over; I'm actually not aware of it being a "traditional" entry point. Python, Java, C, and C++ are common 101 languages in US universities as far as I know. The expectation isn't "you need to know this specific language for a job," it's "we're going to use this to teach fundamentals."
Only the most concentrated autists can handle going straight into "backend" languages. There's this huge cliff where, after learning how to make it say hello world on the console, they have no idea what the real-world application is.
Respectfully disagree but still love you or whatever.
If you can get through the first six chapters of K&R before moving on to other languages, you will be fine in any common language. It's foundational.

That said, I'm not gonna get into a slapfight over anything; anyone can look into what people say itt and make decisions based on that.

edit: @Piss Chemist You ultimately can't go wrong by just picking a language and running with it. After your first language, the rest will come a lot easier.
 
Sure, and webdevs are hated the world over; I'm actually not aware of it being a "traditional" entry point. Python, Java, C, and C++ are common 101 languages in US universities as far as I know. The expectation isn't "you need to know this specific language for a job," it's "we're going to use this to teach fundamentals."

Respectfully disagree but still love you or whatever.
If you can get through the first six chapters of K&R before moving on to other languages, you will be fine in any common language. It's foundational.

That said, I'm not gonna get into a slapfight over anything; anyone can look into what people say itt and make decisions based on that.

edit: @Piss Chemist You ultimately can't go wrong by just picking a language and running with it. After your first language, the rest will come a lot easier.
Enjoying the discussion.

I'm legit curious what you think most people's first job out of school is these days. Seems like it's usually full stack doing CRUD stuff (so, web dev). I don't see many entry-level people doing backend stuff. Those who do avoid it are still usually exposing APIs that are consumed primarily by web devs, and would do well to know their world so you can call them out when they're being dumbasses.

Also it sounds like you don't deal with many bootcamp-style people. Lucky you. Also, the current state of CS grads is pretty dire. I see plenty who get confused by basic data structures. ("You should use a map here." "What's a map? I don't know what that is." "It says you graduated with a 3.0+ in Computer Science from UGA.....")

I would say the big decision point for your first language is whether you want to go more functional or OO-directed. It will change your style a lot. (Speaking as someone who started OO and was required by work to shift to functional paradigms.)
 
Enjoying the discussion.

I'm legit curious what you think most people's first job out of school is these days. Seems like it's usually full stack doing CRUD stuff (so, web dev). I don't see many entry-level people doing backend stuff. Those who do avoid it are still usually exposing APIs that are consumed primarily by web devs, and would do well to know their world so you can call them out when they're being dumbasses.
Front-end is many people's first job, but they'll call it "full stack." I don't know what most people do, though.
It's beside the point - I don't think beginners should start with C or C++ so they can use those languages professionally; they just force you to learn fundamentals, and they have well-established documentation in every medium. The goal would be to move on to more practical languages while better understanding why things are the way they are.

A good and simple example: someone starting with Python who then moved to C would treat arrays as lists. In the opposite direction, someone coming from C would go "what is a list??" and be more aware of its pros and cons after looking into it. Someone who starts with C or C++ would know why you'd use numpy (or something similar) purely for the data structures even if you're not doing any serious calculations - not just "it's faster" or "it's better."
Also it sounds like you don't deal with many bootcamp-style people. Lucky you. Also, the current state of CS grads is pretty dire. I see plenty who get confused by basic data structures. ("You should use a map here." "What's a map? I don't know what that is." "It says you graduated with a 3.0+ in Computer Science from UGA.....")
I don't, and I'm glad. I think degree requirements are a better-than-nothing filter for a lot of roles. Even then, you can tell apart those who just got the grade and those who are genuinely interested.
I would say the big decision point for your first language is whether you want to go more functional or OO-directed. It will change your style a lot. (Speaking as someone who started OO and was required by work to shift to functional paradigms.)
People should be able to adapt but sure. I don't think it's that big of a deal if someone can teach themselves.
 
What's the difference between functional and OO-directed?
In heavily object-oriented languages like Java, pretty much everything is an object, and functions don't exist on their own, but rather as methods on objects.

In functional languages like C, functions exist on their own and there aren't objects. There may be data collections like C structs, which can hold associated bundles of information, but they can't have methods as one of the elements of that collection/structure.

There's a little more to it than just that, but that's the basics of it.

Some languages like Python and PHP support coding with either approach, and so which one you use or even how you mix them together becomes a matter of convention (or lack thereof).
 
In functional languages like C
I wouldn't call C a functional programming language. I feel like the bare minimum you'd demand of a programming language for it to be considered functional is that functions are emphatically first-class (i.e. you can store them in variables/data structures, pass them as parameters to other functions, and return them as results from other functions). Sure, you can sort of get that behavior happening with function pointers in C, but it definitely doesn't feel as obvious as doing it in a classical functional language like Haskell or ML or something.
 
In heavily object-oriented languages like Java, pretty much everything is an object, and functions don't exist on their own, but rather as methods on objects.

In functional languages like C, functions exist on their own and there aren't objects. There may be data collections like C structs, which can hold associated bundles of information, but they can't have methods as one of the elements of that collection/structure.

There's a little more to it than just that, but that's the basics of it.

Some languages like Python and PHP support coding with either approach, and so which one you use or even how you mix them together becomes a matter of convention (or lack thereof).
I never understood how functional programming is supposed to be preferable beyond very specific cases where your code is clearly some state machine. Like, if you need to start carrying structs around, then just fucking use objects.

Edit: Also, learn fucking C, you lazyasses. Nobody is saying you should spend too much time on it, but it will explain both what happens behind the abstractions of a high-level language and why a lot of modern features exist.
 