Programming thread

what are people complaining about though
i guess there was that chromium crash bug but surely they've fixed that, right?
if so they are retarded
people: "google format le bad i hate google format"
also people: *use chromium even though it's trash*

also webm is just a subset of matroska so if you want to blame somebody blame matroska
i cannot believe how uninformed people here can be
I am not complaining about the Chromium crash; I am complaining about how WebM has cursed features such as dynamic sizing. It's a mutant file format. Matroska is a fine video format, nothing bad about it.

And suggesting that I am another user of a Chromium-based browser, while an educated guess given its prominence, is really in bad faith when it's still a Google-controlled project. I just hate big tech chicanery as an absolute.
 
I am not complaining about the Chromium crash; I am complaining about how WebM has cursed features such as dynamic sizing. It's a mutant file format. Matroska is a fine video format, nothing bad about it.
i don't particularly mind the concept of a video changing size mid-stream, since image sequences (the purest form of video there is) are perfectly capable of doing such a thing
for all you know, somebody might want to art-direct the resolution and crank it up to 1080p for showing something with text and back down to 720 when they switch to the b-roll
or they might want to losslessly encode a clip montage recorded on different resolutions for some reason, and scaling would make that incredibly hard and hurt the already-weak lossless encoding
doesn't matroska allow such a thing though, being the superset of webm that webm was defined as a profile of? or is this an extension that libwebm has or something?
what's a good platform to learn Go without spending money?
Thanks in advance
you probably saw the extremely prominent links to the official docs when you were downloading the toolchain or whatever, have you tried looking at them?
the fact that you even asked this seems to indicate you probably won't go too far
if you do get far, however, i will congratulate you
 
you probably saw the extremely prominent links to the official docs when you were downloading the toolchain or whatever, have you tried looking at them?
the fact that you even asked this seems to indicate you probably won't go too far
if you do get far, however, i will congratulate you
Dude chill, the other person responded with a good link.
If you indeed chill, however, i will congratulate you
I found https://gobyexample.com/ really helpful. Other than that, reading the docs helps a lot.

And of course there's always the excellent Pipelines article.
Thanks, that's exactly what I'm looking for!
 
Dude chill, the other person responded with a good link.
If you indeed chill, however, i will congratulate you
yes i will chill, i just pointed out (in a rather abrasive manner) that resources for learning go are so liberally distributed everywhere that it's hard to go anywhere near the language without basically tripping on a high-quality tutorial every other step
i guess you couldn't know beforehand that go is one of those languages where you can literally search up "how to $language" and get a high-quality tutorial on the best way to use the language, so i can understand
 
what are people complaining about though

I wouldn't call it a disaster, but I don't really see the point of it to be honest. It seems like one of those things that was invented to solve some "big tech" problem that I just don't have.

I don't know much about video encoding, but I do have a few bird-box cameras. The video files are quite large as they come out, presumably uncompressed, so I run them through ffmpeg (literally just ffmpeg -i source.mov out.mkv) before archiving. Converting to a .webm takes 7-8x as long and the files are usually a little larger than the .mkv (I tried .webm because I kept hearing "reduced file size" as a selling point).
 
I wouldn't call it a disaster, but I don't really see the point of it to be honest. It seems like one of those things that was invented to solve some "big tech" problem that I just don't have.
i think webm was made as a royalty-free alternative to .mp4 that shitty phone companies would add to their operating systems
I don't know much about video encoding, but I do have a few bird-box cameras. The video files are quite large as they come out, presumably uncompressed, so I run them through ffmpeg (literally just ffmpeg -i source.mov out.mkv) before archiving. Converting to a .webm takes 7-8x as long and the files are usually a little larger than the .mkv (I tried .webm because I kept hearing "reduced file size" as a selling point).
webm and mkv are container formats. you did not specify anything about which codec to use so ffmpeg probably used something different in both
you didn't even specify the encoding quality, so who knows what defaults may have been used
try looking up codecs and redo your tests with a known codec on known settings, i think your results may be different

also i think .mov is usually not uncompressed so much as using an ancient proprietary shit format that anything remotely recent does better than
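to make the comparison apples-to-apples, something like this would pin the codec and the quality on both sides (a sketch: the lavfi line just synthesises a short test clip so the commands are self-contained, and all the filenames here are placeholders, not the poster's actual files)

```shell
# Synthesize a short test clip so the commands below have an input.
ffmpeg -y -hide_banner -f lavfi -i testsrc=duration=2:size=640x360:rate=24 source.mp4

# H.264 into mkv, constant-quality mode (lower CRF = higher quality).
ffmpeg -y -hide_banner -i source.mp4 -c:v libx264 -crf 23 -preset medium out_h264.mkv

# VP9 into webm, constant-quality mode (VP9 needs -b:v 0 alongside -crf).
ffmpeg -y -hide_banner -i source.mp4 -c:v libvpx-vp9 -crf 31 -b:v 0 out_vp9.webm

# Now the size/time comparison is codec vs codec, not default vs default.
ls -l out_h264.mkv out_vp9.webm
```

note the CRF scales aren't comparable between codecs (x264's 23 and vp9's 31 are both roughly "default quality"), so compare by eye at matched file sizes rather than matched CRF numbers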
 
webm and mkv are container formats. you did not specify anything about which codec to use so ffmpeg probably used something different in both
you didn't even specify the encoding quality, so who knows what defaults may have been used
try looking up codecs and redo your tests with a known codec on known settings, i think your results may be different

I think I would need to spend a weekend or two on this one, tweaking settings. I don't know how many times I've watched the same clip in different formats/settings, but I should probably stop.

The webm is vp9, everything else (the original, mov, mp4, mkv) is h264 (all with similar file sizes and encoding times). Forcing ffmpeg to put vp9 into a mkv results in a similar time/size to putting vp9 into a webm, but webm does not support h264. Imagine that instead of "mkv" and "webm" I had said "h264" and "vp9". I don't suppose the container really matters much for a simple video. Encoding vp9 seems to use less RAM than h264; maybe if I were Google, encoding an obscene amount of video, then using less RAM might be worth the longer encode? It seems that the longer the clip is, the smaller the difference in file size becomes, but I haven't seen the "up to 50% smaller" yet.

also i think .mov is usually not uncompressed so much as using an ancient proprietary shit format that anything remotely recent does better than

If I'm reading ffmpeg correctly, it's just re-encoding h264 at a different bitrate? 21948 kb/s to 4597 kb/s is about the same as the reduction in filesize (21MB to 4.1MB). Uncompressed might not be the right word for it, but I suspect that the camera is just dumping the data into the file as fast as it can with minimal processing so they could cheap out on the CPU.
 
I think I would need to spend a weekend or two on this one, tweaking settings. I don't know how many times I've watched the same clip in different formats/settings, but I should probably stop.

The webm is vp9, everything else (the original, mov, mp4, mkv) is h264 (all with similar file sizes and encoding times). Forcing ffmpeg to put vp9 into a mkv results in a similar time/size to putting vp9 into a webm, but webm does not support h264. Imagine that instead of "mkv" and "webm" I had said "h264" and "vp9". I don't suppose the container really matters much for a simple video. Encoding vp9 seems to use less RAM than h264; maybe if I were Google, encoding an obscene amount of video, then using less RAM might be worth the longer encode? It seems that the longer the clip is, the smaller the difference in file size becomes, but I haven't seen the "up to 50% smaller" yet.
there are a lot of settings here and vp9 could probably encode faster than h264 if you wanted it to
the main benefit of vp9 over h264 is that it's not patent-encumbered
you should try av1, you can even put it in webms (it's slow but has a very good quality/size ratio)
If I'm reading ffmpeg correctly, it's just re-encoding h264 at a different bitrate? 21948 kb/s to 4597 kb/s is about the same as the reduction in filesize (21MB to 4.1MB). Uncompressed might not be the right word for it, but I suspect that the camera is just dumping the data into the file as fast as it can with minimal processing so they could cheap out on the CPU.
i assume so, also h264 is an ancient proprietary shit format
 
the camera is just dumping the data into the file as fast as it can with minimal processing so they could cheap out on the CPU
Yes, this is how they tend to work. They often use weird ASICs that cut as many corners as they can. FFmpeg is pretty clever about picking sane quality settings, so trust its defaults for a start.
 
Actually it's pretty good tbh
No
[attached image: codec comparison plot]
 
Well you are plotting it against stuff that pretty much all had the specific goal of improving on h264 (except av1, which was arguably taking aim at vp9), and as a result it's basically the yardstick these things were all measured against to decide if they were worth adopting or not.

It's generally still quite good in terms of encoding time and compatibility, and adequate in terms of bitrate. When sharing a video I still tend to encode to h264 as it plays on more devices, and that is more noticeable than shaving 6.5MB down to 4MB or whatever.
 
Well you are plotting it against stuff that pretty much all had the specific goal of improving on h264 (except av1, which was arguably taking aim at vp9), and as a result it's basically the yardstick these things were all measured against to decide if they were worth adopting or not.

It's generally still quite good in terms of encoding time and compatibility, and adequate in terms of bitrate. When sharing a video I still tend to encode to h264 as it plays on more devices, and that is more noticeable than shaving 6.5MB down to 4MB or whatever.
The only advantage of H264 is its compatibility; in every other regard, including encoding speed, it's worse than its successors, especially now that hardware encoders for modern codecs are easily and cheaply available.
 
I really feel like quitting IT and going back to university.
If student loans are interest-free in your country, a second field of expertise is a decent idea since most STEM fields pair well with programming.
yeah python and javascript are incredibly :cryblood:
thankfully we also talk about normal languages (c and lisp) a lot
Since I have designed and implemented a bunch of my own programming languages, I have come to appreciate the syntactic design of python far more than I did previously. I am no longer a python hater because of it. I feel a similar way with lisp, but that is more related to my renewed appreciation of lambda calculus.
C is still incredibly cursed, blows my mind that we just let that exist unchecked for so many decades. Rust should have existed in 1990.
So is Rust astroturfed and if so, why do you think that is? I don't understand why Rust has so much buzz around it from the top but the only people I have seen be interested in using it personally are doing it to virtue signal and they're always wanting to rewrite something that already exists. I haven't seen people use it as their preferred language for writing their own programs, usually that's C++, C#, Java, or Python anecdotally. It's peculiar.
I use it for all of my personal programming and most of my professional programming, and I have for years. It's simply unparalleled in usability and expressiveness. I'm able to lean on the tooling and standard library to move faster using rust than anything else, and be confident that my programs won't exhibit unexpected behaviour that violates the memory model and shared resource access of an OS thread. I am very rarely held back by the compiler, and when I am, I'm glad I was. Previously, I used C++ full time for everything. When I rewrote my old C++ programs in rust years ago, I found subtle memory corruption bugs in them, just like uutils has found in the GNU coreutils through their effort to produce output identical to the GNU counterparts. It would be much better to be GPL than MIT.
Most of the bugs and exploits are bad memory management in big projects, and you gotta ask why is it like that? I'd say it's laziness or over crunched workers, that's mostly why.
Fundamentally, a software bug is a discrepancy between the expectation of software's behaviour and software's actual behaviour. When you are programming complex software systems, even if you are a highly skilled programmer and super duper careful, it's still hard to reason about the lifetimes of allocated memory*, so eventually even the best programmers will write code under the expectation of certain memory conditions, where unexpected conditions can still occur, and that leads to memory corruption bugs.

John Carmack's related opinion (source):
John Carmack said:
And I reached the conclusion that anything that can be syntactically allowed in your language, it's gonna show up eventually in a large enough codebase, good intentions aren't going to keep it from happening. You need automated tools and guardrails for things. And those start with things like static types and even type hints in the more dynamic languages.
But the people that rebel against that basically say, "that slows me down doing that". There's something to that I get that I've written, I've cobbled things together in a notebook. I'm like, "Wow, this is great that it just happened," but yeah, that's kind of sketchy, but it's working fine. I don't care. It does come back to that value analysis where sometimes it's right to not care, but when you do care, if it's going to be something that's going to live for years and it's gonna have other people working on it and it's gonna be deployed to millions of people, then you want to use all of these tools you want to be told: "No, you've screwed up here, here and here." And that does require kind of an ego check about things where you have to be open to the fact that everything that you're doing is just littered with flaws.
++ on the ego point.

The reason memory corruption bugs are particularly bad is that the code execution machine can be manipulated into executing code outside the behavioural intentions of the written code, purely through precise overwriting of memory addresses and code within the buggy program's existing data structures. When a program allows itself to be re-programmed at runtime like this, it generally leads to very bad security problems, especially when said program can be re-programmed by inputs sourced from over a network.

The root cause is that memory lifetimes and shared resource accesses are just hard to reason about in complex software projects that don't employ radical memory management techniques or other fixes like GC. Even relatively simple software projects suffer from the curse of expectation vs implemented reality + changing requirements, constraints, expectations, and adjacent code.

Rust is exceptional because it eliminates the class of bugs that is the worst offender regarding software security, while still allowing programmers to operate at the level that C can operate at. As a bonus, it makes it easy to express high level concepts which get compiled down to tight and optimised machine code, where the C++ equivalents (iterators, lambdas, smart pointers) are clunky, inconsistent, or optional. So the hype from the people who actually use it is natural, because it is genuinely a technical improvement. Online it's a political weapon, and the 1-dimensional troon psyop has convinced tech chuds that it's actually useless and shit and that you should just git gud at programming C++ like a real programmer, when even the best programmers in the world agree that's an impossible task because of the fundamental design of said "based and redpilled" programming languages.

*unless you employ radical memory management techniques such as strict arena allocation, where the lifetimes and bounds of allocations are made stupidly easy to reason about by fundamentally restructuring the code, compared to what is traditionally taught and/or learned in the classic ad-hoc fashion.
it's the hot new language that's going to solve all the problems in computing caused by those stinky old languages nobody likes, ever since they stopped being the hot new language
Except rust actually is solving the problem of memory corruption bugs caused by C and C++, and I can't help but admit the programming language imperialism is kind of based for a troon op.
Maybe I'm retarded, but I'm 100% sure he's talking about the Rust lobby, not Rust. The argument for Rust's existence is pretty clear, even if the execution is naff. Faggots lobbying for it to the government doesn't make a lick of sense considering it's not even standardized yet.
Rust standard just dropped, it's official: https://rust-lang.github.io/rust-project-goals/2025h1/spec-fls-publish.html
I’m in the early stages of a (real) engineering career and I’ve been learning C++ to fulfil a childhood aspiration. It seems like something I wouldn’t hate working in, so I'm wondering in idle curiosity if it’s (financially) worth the hassle of trying to switch, considering I don’t have a relevant degree or experience and the job market seems to have changed with all the Indians + AI.
If your existing career path is already paying well, going into programming generally won't be worth it. If you can find a niche combo with your existing career path and programming, there could be good earning potential since most programmers are not experts in programming AND something else. Like, if you're a physicist and you become an expert at C++ programming (doesn't require a degree), you're gonna be in front of all the fools with CS degrees if you apply to be a programmer in the aerospace industry. Same shit in biomed, engineering etc, you automatically beat the CSfags because your skills are simply worth more than theirs, all you need is proof that you have programming chops and that you can operate version control software without too much training. Good luck, null's broken penis.
wordswordswords
idk dude, sounds like you have a serious vril issue to me
Name one programming-language-specific package manager that is not dogshit.
Cargo
Here are the two best package managers that work with any language:
1. git clone
2. wget + unzip
Take the vendoring pill.
Cargo supports these: the --git and --path flags for cargo add. Yw.
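in Cargo.toml terms that looks something like this (the crate names and URL are placeholders, not real packages)

```toml
[dependencies]
# Pulled straight from a git repo, pinned to a tag:
some-crate = { git = "https://github.com/example/some-crate", tag = "v1.2.0" }

# Pulled from a local checkout, i.e. the vendoring pill:
local-crate = { path = "../local-crate" }
```

so "git clone" and "wget + unzip" are already first-class citizens, you just get lockfile resolution on top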
[more vril issues]
Solved problem: https://github.com/crev-dev/cargo-crev + SBOM
blah blah blah
angery and mad
Sorry zhang/vlad, you're fired because your refusal to use rust makes you and your work a national security threat. You are being replaced by a zoomer rust programmer and your accidental backdoors are being patched out in the rust rewrite.
 
...I can't help but admit the programming language imperialism is kind of based for a troon op.
It's counter productive in the grand scheme if the goal is widespread Rust adoption. I think they latched on early, as they do with many things, to try to control it. It isn't about secure code with them, it's about control.
 
It's counter productive in the grand scheme if the goal is widespread Rust adoption. I think they latched on early, as they do with many things, to try to control it. It isn't about secure code with them, it's about control.
Trannies need to stop sticking their mangled dicks in things.

I've been writing a fair bit of Odin recently, and I enjoy it, but it's almost impressive how easy it is to footgun myself, even with things like string slices. Especially with string slices.
 