Programming thread

Do you have any recommendations when it comes to intermediate projects in Python? I'm currently building a music player in Python and want some other projects to keep me occupied after I finish it. Preferably don't recommend any projects that have to do with compilers and hardware design.
This is kind of hard for me, because a music player is already more complex than like 95% of the things people use Python for. A lot of people use Python for data science, mathematical models, and machine learning, but those things are fairly easy to do Python-wise; only the concepts are really that hard.

So I'm going to throw a bunch of stuff at you, and I have no idea whether you'll find it too hard or too easy.

Implement a Fourier or fast Fourier transform in a general way.
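If you go the Fourier route, a naive O(n²) DFT is a good first milestone before attempting the fast version. A minimal sketch in plain Python (the test signal and sizes are just illustrative):

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform: O(n^2), but easy to verify by hand."""
    n = len(samples)
    return [
        sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

# A pure cosine at frequency 1 should put all its energy in bins 1 and n-1.
signal = [math.cos(2 * math.pi * t / 8) for t in range(8)]
spectrum = dft(signal)
print(round(abs(spectrum[1]), 6))  # 4.0, i.e. n/2
```

Once this matches a reference implementation on small inputs, turning it into a recursive radix-2 FFT is the real project.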

Do something with implementing rotation and reflection matrices for a 3D graphical object.
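The rotation-matrix idea is small enough to sketch without any graphics library; here's one possible starting point (pure Python, no numpy):

```python
import math

def rotation_z(theta):
    """3x3 matrix rotating points about the z-axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def apply(matrix, point):
    """Plain matrix-vector multiply."""
    return [sum(matrix[i][j] * point[j] for j in range(3)) for i in range(3)]

# Rotating the x unit vector by 90 degrees should land it on the y axis.
x, y, z = apply(rotation_z(math.pi / 2), [1.0, 0.0, 0.0])
```

From there you can add the x- and y-axis rotations and reflections, then feed the transformed vertices to whatever drawing layer you like.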

Implement a basic compression algorithm for something like images.
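Run-length encoding is the classic gentle entry point for the compression idea; a toy sketch of the round trip:

```python
from itertools import groupby

def rle_encode(pixels):
    """Run-length encoding: collapse each run of equal values to (value, count)."""
    return [(value, len(list(run))) for value, run in groupby(pixels)]

def rle_decode(pairs):
    """Inverse of rle_encode."""
    return [value for value, count in pairs for _ in range(count)]

row = [255, 255, 255, 0, 0, 255]
encoded = rle_encode(row)
assert encoded == [(255, 3), (0, 2), (255, 1)]
assert rle_decode(encoded) == row
```

It works great on flat images and terribly on noisy ones, which is itself a good lesson; Huffman coding is the natural next step up.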

If you are finding this stuff easy, do fluid simulations.

These might be too hard; I have no idea.

All of these are good because you can make something visual out of them, which gives you something nice to mentally attach your code to.
but I see it quickly being used at a rapidly accelerating rate as financial and political interests start focusing on it. I got very blackpilled over time about the powers that be and what kind of research actually gets interest and funding.
This might start a massive argument but I have to say it. I don't think "AI" has fundamentally developed in the last 7-8 years or so.

Oh jeez people might not like this one.

But my understanding, from what I have done and from people who have done a lot of this stuff and exist within that ecosystem, is that the "AI" isn't getting any more accurate. Most of the developments have made the scope of what the "AI" tries to do larger without actually improving its accuracy. I wasn't surprised by LLMs because I had seen stuff like this five or so years before they became popular. Even the name shows my point: Large Language Models.

To discuss accuracy, we have to think of this as a problem of diminishing returns. The standard easy response to improve accuracy is just MOAR data. This doesn't work, because you need orders of magnitude more data to get a very small improvement in accuracy. To put numbers on it: it is very easy to make something that is right 70-80% of the time. It becomes more difficult to get to 90%, and it is hard but very possible to get to 95%. I got to about 98.5% accuracy before I started overfitting, on a model trying to beat the stock market. Getting any further than that is basically impossible without a fundamentally new development in the technology.

We can refine the data, and this is usually what AI people do to make things work better, but refining the data is always going to get you fairly small improvements. Refinement is not a massive leap forward but a slow process of small gains.

Where are the self-driving cars? The answer is that they are not coming, and they can't work with current technology. Everyone conveniently forgets about that little debacle, but "AI" failed to deliver on it.

We can literally take the heart out of one person and put it in another with fairly minimal long-term complications.
A lifetime of immunosuppressants is not pleasant. My main point in mentioning this is that things are actually really difficult, and over a long time period we have built a lot of stuff, but it is always worth remembering that the limitations have not gone away; we have just found clever ways to make them matter less.
 
Here: https://docs.celeryq.dev/projects/k....transport.redis.Channel.queue_order_strategy

I think what you want is

priority (priority_cycle).


This is Redis, but I assume switching message brokers shouldn't be a big deal? Or you could run Redis alongside RabbitMQ; it's super lightweight anyway.
Whoa, their documentation is a mess.

This is still not quite what I'm looking for (I want to pick a queue at runtime), but I might be able to look at the class and see where and how it's used and plug in my own class in its place.

But here's the weird part: this is in the Redis section because, they say, it emulates AMQP's in-queue priorities when using the Redis transport.

An AMQP queue supports message priorities up to a predetermined per-queue maximum. Higher-priority messages are pushed to the front of the queue.

Their implementation of a queue on Redis does not have this because it predates AMQP's priorities, so what they do under the hood is create several real Redis-based queues for each declared Celery queue name and draw from those queues in order of priority.
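The trick is easy to picture in plain Python. This is a hypothetical toy to show the idea, not Kombu's actual code; the bucket levels and names are made up for illustration:

```python
from collections import deque

# Hypothetical toy, not Kombu's implementation: one logical queue backed by
# several real queues (buckets), drained in priority order.
PRIORITY_LEVELS = (0, 3, 6, 9)  # lower number = higher priority
buckets = {level: deque() for level in PRIORITY_LEVELS}

def push(message, priority):
    # Snap the requested priority to the nearest configured bucket.
    level = min(PRIORITY_LEVELS, key=lambda p: abs(p - priority))
    buckets[level].append(message)

def pop():
    # Always drain higher-priority buckets before lower ones.
    for level in PRIORITY_LEVELS:
        if buckets[level]:
            return buckets[level].popleft()
    return None

push("routine report", 9)
push("vip request", 1)   # snaps to bucket 0
assert pop() == "vip request"
assert pop() == "routine report"
```

The obvious caveat, which bit me here, is that "priority" gets quantized to a handful of levels, and a busy high-priority bucket can starve the low ones.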

This led me to a temporary solution, which is to make more queues for more important customers; then "fair" round-robin workers subscribed to all queues will clear them out faster.
Lol. Lmao, even.
 
That's the great thing about C. It doesn't do memory management, at all. It's all on you. It will either make you a better programmer or let you write code that will be hacked within 30 seconds of your first release or both.
C very much does memory management, and it's not simple either. malloc-free is a tradeoff just like any other scheme.
 
You guys know any good, simple and engaging tutorial sets for getting the hang of Python and Lua my autistic ADHD ass won't drop in a few minutes? I kinda want to learn both, Python for writing up quick dumb shit as an alternative or addition to Batch and AutoHotkey and Lua for mainly getting the hang of why ComputerCraft was such a big deal, and maybe making some cool shit with it, like automating factories and whatnot.
 
You guys know any good, simple and engaging tutorial sets for getting the hang of Python and Lua my autistic ADHD ass won't drop in a few minutes? I kinda want to learn both, Python for writing up quick dumb shit as an alternative or addition to Batch and AutoHotkey and Lua for mainly getting the hang of why ComputerCraft was such a big deal, and maybe making some cool shit with it, like automating factories and whatnot.
Based computercraft enjoyer.
As far as Python goes, most of the tutorials are pretty middling. I rarely see type annotations covered, and the like. Python suffers from a similar problem to Java, where it's incredibly easy to learn but difficult to master, and it's flexible enough to accommodate bad decisions.
I guess the best answer I can give is to read the Python standard library source on GitHub alongside other materials; it'll give you a decent idea of how to properly leverage the lang.
 
Based computercraft enjoyer.
I'd like to enjoy it but since I don't know how to write Lua code I never took real leverage of it, just fucked around with premade code for it. Plus, maybe if I start making cool shit with ComputerCraft, maybe something like an energy grid deficit alert, I might learn Lua enough that I can make cool shit somewhere else that also uses Lua scripts, like STALKER.
 
I'd like to enjoy it but since I don't know how to write Lua code I never took real leverage of it, just fucked around with premade code for it. Plus, maybe if I start making cool shit with ComputerCraft, maybe something like an energy grid deficit alert, I might learn Lua enough that I can make cool shit somewhere else that also uses Lua scripts, like STALKER.
I smell a gregtech enjoyer behind this post
 
Go is really enjoyable to write. It's a C-like systems language made by a group of C++ haters, and it has a lot of the flexibility of things like Python with its garbage collection. I like how the language design still largely retains the spirit of the C language, and it makes it easy to reason about what you're writing if you're a seasoned C developer. At the same time, it's not super inaccessible to people used to languages like Python.

Based on the way you described OOP and wanting to work on things in small chunks, I think you would like the quasi OOP it has. It sticks more with the Unix philosophy of doing one thing and doing it well, and encourages breaking up your problem into those modular bits. There are some exciting projects for ML stuff with Go. I haven't looked into them much, but I want to. I assume they make heavy use of C bindings, similar to the Python ML projects.
I've always felt this way about Go and I'm surprised that I've only first seen someone else share this take now. It's sort of like the missing link between C and Python, IMO. It's almost as easy to start with as Python or JS and avoids a lot of their pitfalls. I'm not 100% sure it's an ideal first language (though it'd probably work fine) because IMO some of the design choices only make sense if you've dealt with big codebases in other languages before.

That said, like 95% of my work is in Python. I've begun writing small executables for work in Go because it's really nice for my use case: sometimes I want to write and share a small utility with my coworkers, but I don't want to deal with Bash's eccentricities and if I use Python instead, I need to either distribute a fuckoff huge Docker image, restrict myself to the standard library, or pray my colleagues care/know about virtual environments because Python's whole environment and packaging shit is stupid. I got a little pushback on that at the start because Go isn't used much in my part of the company, until I made my coworkers actually read the source of one of those Go utilities and agree that it's pretty straightforward.

Legacy libraries needing to be updated for new minor versions of Python because they didn't really make await and async keywords until like 3.5 :story:
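For reference, the syntax in question, as a minimal sketch (the keywords landed in 3.5, though `asyncio.run` itself only arrived in 3.7):

```python
import asyncio

async def fetch(n):
    await asyncio.sleep(0)  # stand-in for real network or disk I/O
    return n * 2

async def main():
    # gather runs the coroutines concurrently on one event loop
    return await asyncio.gather(*(fetch(i) for i in range(3)))

results = asyncio.run(main())
print(results)  # [0, 2, 4]
```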
 
I smell a gregtech enjoyer behind this post
Yes I am playing GregTech Community Pack right now, my first experience with GregTech, and unfortunately I'm enjoying it, I'm near the end of HV era with an ME system in place. It's a painful slog but a fun kind of slog.

But anyway, the topic is not GregTech but how to learn Lua, that's a bit more ubiquitous than ComputerCraft and GregTech, I brought up CC just as a practical example.
 
Memory management techniques all have trade-offs between efficiency, security, and ease of use. Languages that use garbage collection basically handle freeing allocated memory for you when it goes out of scope, but they are inefficient compared to well written C code where everything is done manually but in a hopefully sane way. Each is better for certain tasks, depending on resource requirements and typical use case.
I wanted to expand on this to add that efficiency is relative. As yats was getting at, it entirely depends on what you're doing. If you're writing something where performance matters, then you don't want automatic garbage collection, because it will steal performance from you and there's little you can do about it. Games are the canonical example here with Minecraft as the "exception that makes the rule" (yes, I know they later rewrote it). On the other hand, if performance doesn't matter, then you do want automatic garbage collection, because debugging the manual memory management code you wrote is inefficient in that case (because you're wasting your time).

When does performance matter? You don't know. You think you know, but you don't. You have to get good at imagining how you will refactor your project once you find out what's slowing it down (if anything is). Languages like C++ and Python make refactoring for performance easier. In C++, for example, assuming you planned your project well, you can rewrite the slow parts with lower-level, C-like memory management to get better performance. A lot of times it's like 5% or 10% of the project that needs to be rewritten. Python, as a higher-level language, has terrible performance, but because you usually only need to rewrite 5% or 10% of the project to get acceptable performance, you can (again, assuming you planned ahead) rewrite that 5% or 10% in C and then call that new C code from your Python code.

Again, you shouldn't care about performance as someone just getting into programming, because you won't be able to tell when efficiency is going to matter. Even experienced guys like me can't tell in advance a lot of the time. But, because you will care about performance at some point, you should start with languages that won't hold you back. C, C++, and Python are good, though C++ is a bear and might make you want to kill yourself. Go may be good if you can disable its garbage collection. I don't know because I've only ever written like 30 lines of Go and that was years ago. In any case, I know it has easy interoperability with C, which is all you really need at the end of the day.
 
See Also: Knuth. "Premature optimization is the root of all evil." From "Structured Programming with go to Statements"
There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail. After working with such tools for seven years, I've become convinced that all compilers written from now on should be designed to provide all programmers with feedback indicating what parts of their programs are costing the most; indeed, this feedback should be supplied automatically unless it has been specifically turned off.
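The measurement tools Knuth wanted are built into Python these days. A quick sketch of finding the hot spot with `cProfile` (the `slow_part`/`fast_part` functions are stand-ins):

```python
import cProfile
import io
import pstats

def slow_part():
    return sum(i * i for i in range(200_000))

def fast_part():
    return 42

def program():
    fast_part()
    return slow_part()

profiler = cProfile.Profile()
profiler.enable()
program()
profiler.disable()

# Sort by cumulative time: the critical 3% shows up at the top of the table.
report = io.StringIO()
pstats.Stats(profiler, stream=report).sort_stats("cumulative").print_stats(5)
print(report.getvalue())
```

Profile first, then rewrite only what the table says is slow.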
 
You guys know any good, simple and engaging tutorial sets for getting the hang of Python and Lua my autistic ADHD ass won't drop in a few minutes? I kinda want to learn both, Python for writing up quick dumb shit as an alternative or addition to Batch and AutoHotkey and Lua for mainly getting the hang of why ComputerCraft was such a big deal, and maybe making some cool shit with it, like automating factories and whatnot.
I can't think of any good tutorials for Python. I know a bit about Lua but not that much.

If you are completely new to programming, I'd say to watch an hour or two of the Harvard course on Python programming and then just build something. I've never watched it, but it is probably pretty decent, and since it's video it might be better than reading.
You need to know about strings, variables, lists/arrays, functions, user-defined functions, f-strings, and finally for loops. That should be enough for scripting stuff, if you are fine with using f-strings and then eval/exec in a for loop.
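That short list goes a long way. A tiny sketch touching all of it (variables, a list, a user-defined function, a for loop, an f-string):

```python
names = ["ada", "grace", "linus"]  # a list

def shout(name):  # a user-defined function
    return name.upper()

lines = []
for i in range(len(names)):  # a for loop
    lines.append(f"{i}: {shout(names[i])}")  # an f-string

print("\n".join(lines))
```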

If you are not new to programming then Python is incredibly simple to learn and all you have to do is just find a tutorial of something you want to build online and it will be quite easy.

I'd skip the OOP stuff like classes, and I'd also skip recursion, until you have a decent understanding of what is going on and are somewhat familiar with the language.

Once you have the basics of programming (what is a variable? what is a string? what is a type? what is a function? what is a for loop? what is a list? what is an f-string?) you can also do Lua.

Lua is exactly the same as Python in that it is also really easy to learn once you have just a bit of programming knowledge.

Also, this is the cursed advice, and it is incredibly dangerous, but it will also work well if you use it right: ChatGPT is actually somewhat decent to ask questions of. DON'T get it to write your code for you; ask it for an example of something similar in order to teach you. It can also straight up hallucinate, so be careful about completely trusting everything it says. You should treat it like a search engine that is usually right but not always.

That's basically how I use it.

Don't use Copilot; you don't want something that will write code for you, since it writes really shitty code. ChatGPT will actually explain shit to you.
I still think it's useful for novice programmers to get a sense of O(n) notation. You can ignore all the math behind it and just try to grasp it intuitively.
That stuff is eventually useful, yeah, but for a new guy that is way too much theory. I think a lot of the advice here is bad for that reason. If someone is new, don't send them to the docs or something; they need to learn the very basics. I don't think CS students learn about O(n) notation until later in their college career, and they start with something like Python or Java for a reason.
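For anyone who does eventually want the intuitive version, one timing comparison gets the idea across with zero math (a sketch; exact numbers depend on your machine):

```python
import timeit

data_list = list(range(100_000))
data_set = set(data_list)

# Checking membership in a list scans every element: O(n).
list_time = timeit.timeit(lambda: 99_999 in data_list, number=200)
# Checking membership in a set is a hash lookup: O(1) on average.
set_time = timeit.timeit(lambda: 99_999 in data_set, number=200)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
```

The gap gets worse as the data grows, and that's the whole point of the notation.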
Yes I am playing GregTech Community Pack right now, my first experience with GregTech, and unfortunately I'm enjoying it, I'm near the end of HV era with an ME system in place. It's a painful slog but a fun kind of slog.

But anyway, the topic is not GregTech but how to learn Lua, that's a bit more ubiquitous than ComputerCraft and GregTech, I brought up CC just as a practical example.
Noice. I used to play Gregblock and I made it until the fusion reactor. I was sad because the second silicon recipe broke and I had to cheat that item in. I could never figure out how the next versions of the Fusion reactors were able to work as I had Plasma and plasma turbines but the next tiers required certain types of Superconductors that you could only make with the higher tier stuff.

Minecraft is the only reason for Java to exist.
Legacy libraries needing to be updated for new minor versions of Python because they didn't really make await and async keywords until like 3.5 :story:
This is actually the worst thing about Python. It is also why I like PyCharm so much despite having to pay for it: it is plug and play, I don't have to go through the hassle of setting something up, and it just works. If what I am working on needs to run in production, I can set something up in Linux and install the right Python version once I know it works.

People complain about it being slow, but that is fine for nearly all cases where you'd want to use it. The Python 2/3 thing, though, with like 10 different versions and the clusterfuck of trying to get different stuff to work together, is actually the worst part.
 
Also a "Technical Q&A" for getting help with problems and requesting code reviews, using those tag thingies you see in certain other boards? I don't know.

The forum already has a Q&A board with answer-voting functionality like StackOverflow, but it's mostly used for shitposting. Null could theoretically make a copy named "Programming questions" or KiwiOverflow, and have it going pretty quickly. I don't know how many people would use it though, and considering who we share the site with and how the main Q&A board is used, it might get quickly trolled into uselessness.

Major upgrades to the site are probably on hold until Null gets more free time and/or gets Sneedforo running. The above idea could be a useful test run, since all the other functionality like code formatting and thread tags already exist.
 
I wanted to expand on this to add that efficiency is relative. As yats was getting at, it entirely depends on what you're doing. If you're writing something where performance matters, then you don't want automatic garbage collection, because it will steal performance from you and there's little you can do about it. Games are the canonical example here with Minecraft as the "exception that makes the rule" (yes, I know they later rewrote it). On the other hand, if performance doesn't matter, then you do want automatic garbage collection, because debugging the manual memory management code you wrote is inefficient in that case (because you're wasting your time).
I don't know about games being a canonical example of performance requirements. They require low latency, whereas most high performance software requires high throughput. GCs typically focus on the latter, and lots of high performance software doesn't perform any allocations during performance-critical parts in the first place so the distinction is moot. On a semi-related note, being garbage-collected and allowing you to write allocationless code are orthogonal properties, even if many language implementations with GC don't allow it.
 
You guys know any good, simple and engaging tutorial sets for getting the hang of Python and Lua my autistic ADHD ass won't drop in a few minutes?
I learned Python so long ago that I can't really recommend any tutorial material I've used directly, but Automate the Boring Stuff with Python is supposed to be pretty good.
 
In C++, for example, assuming you planned your project well, you can rewrite the slow parts with lower-level, C-like memory management to get better performance.

Even if you didn't, it's usually pretty obvious by using a profiler where the problems are and how to fix them. Underlying data structures aren't opaque at all. Unless your problem came from using std::pair, or a million other esoteric C++ things that compilers can't optimize due to type-trait bullshit.
 
You guys know any good, simple and engaging tutorial sets for getting the hang of Python and Lua my autistic ADHD ass won't drop in a few minutes? I kinda want to learn both, Python for writing up quick dumb shit as an alternative or addition to Batch and AutoHotkey and Lua for mainly getting the hang of why ComputerCraft was such a big deal, and maybe making some cool shit with it, like automating factories and whatnot.
If I could learn Python in a couple nights as a kid with this tutorial years ago, anyone can. Each video is 5mins.
Just make sure to install Python 3 instead of Python 2, and use "print" with parentheses.
Code:
# So instead of:
print "test"

# Do:
print("test")
And use "input" instead of "raw_input" too.
 