Open Source Software Community - it's about ethics in Codes of Conduct

Which is substantially faster than manually writing boilerplate.
it might be, but if you spend so much time writing boilerplate all day that it makes a huge difference in your productivity, there is undoubtedly something deeply wrong going on somewhere
coding often isn't prestigious, and it's mostly writing the same patterns over and over again, which is something you can train an LLM to do.
writing a regular program or an enhanced library to abstract over repeated patterns is pretty much always better, if it's possible
"the human compiler" is the greatest antipattern known to man, even if you use talmudic statistical magic to greatly optimize the process

They work really well as parsers for super huge outputs or as proofreading tools in my experience. If you can put up with double-checking for hallucinations (especially if you are researching a niche subject), they can find some really neat things. Personally I use them to sniff out obscure Guix System GitHub repos with like 3 views that I can dig interesting scripts & configs from.
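for the curious, the rough shape of the "parser for huge outputs" trick (sketch only; the endpoint URL and model name are placeholders, assuming a local OpenAI-compatible API like the ones Ollama or llama.cpp's server expose):

```python
# Chunk a big log/dump and ask a local model to flag the interesting bits.
import requests

ENDPOINT = "http://localhost:11434/v1/chat/completions"  # placeholder URL
MODEL = "some-local-model"                               # placeholder name

def chunks(text: str, size: int = 4000):
    # Split the big output into model-sized pieces.
    for i in range(0, len(text), size):
        yield text[i:i + size]

def skim(big_output: str, question: str) -> list[str]:
    notes = []
    for chunk in chunks(big_output):
        resp = requests.post(ENDPOINT, json={
            "model": MODEL,
            "messages": [
                {"role": "system",
                 "content": "Answer tersely. Reply NOTHING if the text is irrelevant."},
                {"role": "user", "content": f"{question}\n\n{chunk}"},
            ],
        }, timeout=120)
        answer = resp.json()["choices"][0]["message"]["content"]
        if "NOTHING" not in answer:
            notes.append(answer)
    return notes  # still needs a human pass for hallucinations

# skim(open("huge_build_log.txt").read(), "Any linker errors in here?")
```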
how are you doing that fully self-hosted llm repository searching though
if it's not self hosted who the fuck are you and what did you do to @Ferryman
 
how are you doing that fully self-hosted llm repository searching though
if it's not self hosted who the fuck are you and what did you do to @Ferryman
It's self-hosted* with an asterisk. An irl friend of mine has a model running off a former crypto mining rig that we use as our search engine gimp via VPN + web GUI. I'd be lying if I told you I know how it works; I'm pretty much in the dark when it comes to local LLM deployment or LLMs in general. All I know is that it runs off Rocky and a lot of Docker autism goes into making it work.
figured it's worth reposting this in this thread. I heard about this drama in prep for a stream because I have grok set up to send me a weekly chronicle of random stuff.

not using ai is gimping yourself.
"NOOOOOOO YOU CAN'T USE AI BECAUSE.... BECAUSE YOU JUST CAN'T, OKAY?! You have to focus on team-based, collabor-ACK!"

Right or wrong, telling people to "fork or fuck off" is a very refreshing thing to see nowadays. Tis what Varyx should have done when the trannies came knocking.
 
It's self-hosted* with an asterisk. An irl friend of mine
the "cloud" might be somebody else's computer, but somebody else's computer is not always """the cloud"""
it's never cringe to use somebody else's computer if you trust them

"NOOOOOOO YOU CAN'T USE AI BECAUSE.... BECAUSE YOU JUST CAN'T, OKAY?! You have to focus on team-based, collabor-ACK!"
i like that stance linus took on it: if it makes acceptable code and is submitted by somebody who knows what they're doing then there's no problem
how the fuck are they so sure he's using it in the first place, anyway? if you properly use an llm and tardwrangle it and doctor its output a bit, it should be virtually indistinguishable from something you wrote yourself, should it not?
 
if you properly use an llm and tardwrangle it and doctor its output a bit, it should be virtually indistinguishable from something you wrote yourself, should it not?
That's the theory. But I'd like to see the numbers on time spent by someone doing it all by hand (including troubleshooting and debugging) vs using an LLM to assist them and then having to correct syntax cockups, hallucinations, tweaking to get the desired output, etc. This is totally an ass figure, but I'd wager at best the gains are negligible, and more likely it takes longer.

Which tracks with the same philosophy as outsourcing: sure, you may spend less because you can get 3 programmers for the price of one competent non-jeet, but when that one has to do two or three times the work un-fucking the output of the three jeets, it becomes wildly inefficient.
 
Which is substantially faster than manually writing boilerplate.

This is just one study, and there are some uses of AI that are probably more practical than others, but I think there is definitely a perception issue going on. It feels faster when you have an AI versus doing it all yourself, but you're trading writing code for reviewing code, and a lot of the time, when looking over code, I would wager it's very easy to let things slip.

From what I have seen pretty unanimously from people who were willing to adopt AI and heavily used it, at least the ones who already knew how to code well beforehand, using AI has a few effects. When you aren't actually writing the code, particularly in complex projects, your understanding of what the code is actually doing isn't what it would have been if you had written it yourself; basically it's easy to be a bit lazier. The other is that over time people pretty consistently say they lose their ability to write code themselves without it.

Those two combined seem like a pretty bad combo. I do think there can be some potentially decent uses of AI for coding; maybe AI agents could be a good choice. But the normal approach, where the AI writes the code for people, vibe coded, and you're just doing a code review on your own work, really seems like it's not the full answer.

You know what, I have to post it, since I'm listening to it now. I can't help but think some of the issues talked about in the beginning of the stream are related to this topic.
 
how the fuck are they so sure he's using it in the first place, anyway? if you properly use an llm and tardwrangle it and doctor its output a bit, it should be virtually indistinguishable from something you wrote yourself, should it not?
The push comment said "This is what ChatGPT told me". In his own words:
The AI code is gone. I excised the function from the repo,
But don't ever think this will solve the AI problem. I know far too many developers that swear by AI tools and would not develop without them anymore. And rest assured, most won't be honest about it and it won't be easy identifying such code - it's not always obvious slop.
For example, if I had said that this code was from an old project of mine, nobody would have raised an eyebrow.
Unfortunately that project only had Windows and Mac versions for this check... :(
He's partly correct in saying that people wouldn't "raise an eyebrow"; the Doom community in general is very gay, and being anti-AI is one of the things that crowd does. But he did force-push code to master that failed to compile, which could very well have resulted in the same outcome even if he had insisted "Oh, ja! I took that from my old project and I guess it didn't work". The real issue here is that people really don't like Graf Zahl (this has been the case for 15 years), and I can guarantee that the people behind UZDoom have wanted a ZDoom without Graf for a long time. If people liked Graf, they would have been much more willing to work with him despite his supposed accident and tried to convince him more diplomatically not to use AI. It should also be mentioned that Graf has not really been the lead maintainer of GZDoom for about a year, with the team behind UZDoom being the ones doing the development for the past year, which you can see from the GitHub contributor page and this post from Xaser.
Oh, also Graf allegedly briefly ragequit development 8 years ago because of lilith.pk3
 
DEIsoft & Friends made sure to kneecap Stallman's efforts. It didn't fail; people have to push harder against their psyops
What kneecapped it the most was the emergence of the Linux Foundation once Linux became dominant and corporate interests took over.

The Linux Foundation, due to its importance, now has an outsized influence over things like how to interpret the GPL, and their stance on the GPL is essentially "no enforcement ever, no matter what".
They actively try to prevent enforcement, to the point that if you contribute to the kernel and start enforcing the license against deliberate violations, they will actively remove and replace your contributions.

EDIT: So, if you deliberately never enforce the license, in what way is it then meaningfully different from public domain and how can it be considered free software?
 
What kneecapped it the most was the emergence of the Linux Foundation once Linux became dominant and corporate interests took over.

The Linux Foundation, due to its importance, now has an outsized influence over things like how to interpret the GPL, and their stance on the GPL is essentially "no enforcement ever, no matter what".
They actively try to prevent enforcement, to the point that if you contribute to the kernel and start enforcing the license against deliberate violations, they will actively remove and replace your contributions.
year of the hurd desktop 2030
i want to believe
lynch that nigger penguin
EDIT: So, if you deliberately never enforce the license, in what way is it then meaningfully different from public domain and how can it be considered free software?
you don't need to consider these details from the stance of a user of the software wishing to exercise their four freedoms
it only affects people who are trying to violate the license (in the case of linux's gpl2, by redistributing it as proprietary software)
Public domain software is free if the source code is preserved.
as long as the license terms do not infringe on the four freedoms, it is free software. this includes public domain software distributed with sources, gpl software, 3-clause bsd software, mit software, and unenforced gpl2 software
in all of these cases, you have your freedom. the only thing that varies is whether you are allowed to infringe on others' rights; the fsf doesn't want people doing this so they have copyleft and the gpl
 
what major laptop manufacturer is likely the easiest to persuade into trialling Linux Mint installed by default instead of Ubuntu on any model?
I feel like it doesn't really matter, because the only people buying laptops that come with any non-Chrome Linux distro preinstalled are ones who already know, or want to know, how to slap their own preferred flavor of Linux on it.
 
I feel like it doesn't really matter, because the only people buying laptops that come with any non-Chrome Linux distro preinstalled are ones who already know, or want to know, how to slap their own preferred flavor of Linux on it.
what about the people who don't know how to do that?
 
You mean normgroids? The normgroids who you physically cannot convince to use a laptop or desktop that doesn't run Windows or MacOS?
if you put a computer with Linux Mint on it in front of them and they are at least willing to try Linux, they will do well. But if it has Ubuntu and they get told to learn how to deal with ISOs and reinstalling, they will be overwhelmed.
 
You are assuming NPCs make decisions based on practicality or ease of use. They do not. They make their decisions based entirely around what's "normal", and using Linux is not "normal". You physically CANNOT convince NPCs to use Linux, because they WANT to be slaves because being slaves is the norm.
 
You are assuming NPCs make decisions based on practicality or ease of use. They do not. They make their decisions based entirely around what's "normal", and using Linux is not "normal". You physically CANNOT convince NPCs to use Linux, because they WANT to be slaves because being slaves is the norm.
It's a very small cross section in the Venn diagram, but there is a spot where people not good with computers will be willing to learn Linux Mint. It's really little different than going from Windows 7 to Windows 11
 
Public domain software is arguably more free than with GPL or even MIT.
 
Public domain software is arguably more free than with GPL or even MIT.
The question is always "free like what?" and the thing about the GPL is that "free to steal your software to use in proprietary contexts to fuck you in the ass" is not a freedom it provides.
 
Public domain software is arguably more free than with GPL or even MIT.
No, for one simple reason: the GPL requires the source to be made available. Software can be in the public domain without the source being available. With the source, anyone can continue to build against it; with closed source you need to reverse engineer it or, even worse, build it again from scratch, which comes with a number of flaws.
 
your understanding of what the code is actually doing isn't what it would have been if you had written it yourself; basically it's easy to be a bit lazier. The other is that over time people pretty consistently say they lose their ability to write code themselves without it.
Yeah. Those are the real issues. I take issue with that study, as it seems they focused on large code bases a decade or more old, which inherently will have issues due to token limitations. For a FastAPI project, it can potentially save time if you already have the docs available or need an overview. I do feel like my skills aren't progressing as fast as they used to because of using these tools, but the pressure at the workplace to rush things was overwhelming until I started using it as an assistant.

Converting an API from deprecated Java + Tomcat to Python took less than 10 minutes with Claude on auto-accept, and that included writing tests, testing, writing the Dockerfile, and writing the GitHub Actions to test, build, and push. There was only one bug, regarding a format conversion in the JSON response with an undocumented format. Auto-accept is a bad idea for a project you care about, but we were experimenting with it, and it was surprising how easy and accurate it was. That API has been in production for months now with no issues.
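For flavor, a toy version of that one bug class (none of this is the actual project code; the endpoint and date format are made up): the old Java side emitted an undocumented date format, and a pinning test is the kind of thing that catches the port serializing it differently.

```python
# Toy FastAPI endpoint (not the real project) illustrating the bug class:
# the legacy API used an undocumented date format, and a port can silently
# emit ISO 8601 instead. A pinning test locks in what clients expect.
from datetime import datetime
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

LEGACY_FORMAT = "%d.%m.%Y %H:%M"  # hypothetical undocumented format

@app.get("/status")
def status():
    ts = datetime(2025, 1, 2, 3, 4)  # fixed timestamp so the test is stable
    return {"updated": ts.strftime(LEGACY_FORMAT)}

def test_legacy_date_format():
    body = TestClient(app).get("/status").json()
    assert body["updated"] == "02.01.2025 03:04"  # what legacy clients expect
```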
 