Programming thread

Buy an ATmega328P, best microcontroller, lots of fun to play with and program in assembly.

Or consider an FPGA/CPLD, lots of flexibility there to pretty much make your own (slow) processor. Anyone here ever work with HDLs/PLDs?
 
Or consider an FPGA/CPLD, lots of flexibility there to pretty much make your own (slow) processor. Anyone here ever work with HDLs/PLDs?
i've thought about fpgas a couple times but it seems like the software to compile verilog and program them is always a really horrible 100gb proprietary retard ide
i want to just use libsomefpga from c or a c ffi to program an fpga with some compiled verilog (see: vulkan)
 
i've thought about fpgas a couple times but it seems like the software to compile verilog and program them is always a really horrible 100gb proprietary retard ide
i want to just use libsomefpga from c or a c ffi to program an fpga with some compiled verilog (see: vulkan)
They are sadly almost always 100GB, that's right. Most of that is the place and route tools, because they need timing and connectivity databases for the hundreds of thousands of nodes in each of the hundreds of parts that one vendor sells.

I recommend using VHDL since it's easier to understand what you're doing.
Verilog is like Python in that anything can be anything.
 
They are sadly almost always 100GB, that's right. Most of that is the place and route tools, because they need timing and connectivity databases for the hundreds of thousands of nodes in each of the hundreds of parts that one vendor sells.
it would be very nice if it was all split up, ala vulkan and spir-v and all that
just install the compiler and the compiler database for the single card you have
also i think such databases could be made smaller because you know these guys aren't prioritizing small code size
also you could split the compiler and its compiled output so when somebody wants to run the fpga they only need a driver that can send the finished program to the fpga and then talk to it

it would be very nice if these devices were a more mainstream, well-supported part of computing like gpus are
i'll probably fuck around in gpu land before i go to fpgas because the gpu can do a lot of shit
 
i'll probably fuck around in gpu land before i go to fpgas because the gpu can do a lot of shit
FPGAs (more generally, PLDs) are an old field. You already have a GPU, though, whereas you'd have to buy a devkit for an FPGA.
The most popular trainer for the world of FPGAs is the blue Digilent Basys3 board, which uses the Artix-7 and offers a really convenient platform for learning.

However, Xilinx, the vendor of the 7-series FPGAs and coincidentally the first popular PLD manufacturer, is one of the worst offenders for massive IDEs. On the other hand, their stuff is completely free for individuals, unlike others who have ridiculous licensing systems.
Ironically, you can turn the FPGA into a GPU on the Basys3 board because it has a VGA out.

FPGAs are extremely mainstream but you never see them, almost every car made after 2014 has a few of them.
 
their stuff is completely free for individuals
sounds like it's not quite as free as i want
Ironically, you can turn the FPGA into a GPU on the Basys3 board because it has a VGA out.
i think of gpus as very large simd processors that just so happen to have rasterization hardware and display outputs for historical reasons
unlike others who have ridiculous licensing systems
FPGAs are extremely mainstream but you never see them, almost every car made after 2014 has a few of them.
reason i hate proprietary software #17634915: i can't use an fpga like almost any other device on linux
 
SIMD is the best use for FPGAs
my fear is what happens when you write a program for the fpga and then hypothetically want other people to run your code
"yeah let me run to the store and grab this really weird coprocessor and install it into my computer and download the 100gb toolkit" - the 4-time winner of the World's Most Dedicated User Awards, and absolutely nobody else
meanwhile in gpu land we got vulkan 1.x natively supported on the shittiest android phones known to man
the fpga situation is just sad (:_(
 
Quartus isn't too bad as far as bloat goes; as long as you install only the device support for the FPGAs you have, it shouldn't take too much space. As for Xilinx, yeah, Vivado is pretty bloated even when you install only the device support for the FPGAs you need. As for languages, I prefer SystemVerilog for most cases if I'm gonna do a project with an FPGA board, but VHDL is better when it comes to fixed-point arithmetic, from my research.
 
I love learning about how the CPU works. So much of what we do in programming is kind of conceptual, with design patterns and abstractions and the like, but at the end of the day there's a really intensely engineered little chip that you ignore at your own peril. Finding out how to write code that works at a high level while also functioning appropriately when the rubber hits the road is a never-ending process.
If you haven't heard:
A couple weeks back I decided it's time I take the plunge and actually learn how to program.
I've written simple scripts in the past, and anything else I've needed I've been able to get chatgpt to piece something together for me, but I've gotten to the point where I don't want to have to use chatgpt.
I don't want to be a vibe coder, I want to actually know how to write professional code that will actually do the things I want it to do.

I had a couple classes in college for web dev, but other than that my experience is primarily networking and cloud stuff.

I decided to start off with learning Python, and I'm going through the Codecademy course for it now, but my concern is: once I finish the course, what do I do from there?
I'm 3/4 of the way through the course now, and the main thing I've realized is that once I finish I will still be far from what I'd consider competent at Python.
There are so many coding principles and practices I feel like I'll still need to learn: use of libraries, and other things that I'm not really sure how to describe as anything other than computer science knowledge.

I guess I'm asking: how do I continue my learning in a way that will actually help me learn? I feel like there are currently a lot of unknown unknowns for me, and I'd like some advice on what to do about that.
I'm gradually working through Automate the Boring Stuff with Python, not because I really need the material, but to determine whether I can recommend it to n00bs. I restarted with the third edition, which seems to have made some improvements, but the author still makes the peculiar choice of using camelCase instead of snake_case where the vast majority of Python programmers would use the latter (though Python typically does use PascalCase for classes). Just ignore it when the author does that, along with the poopy faggot flag on his website, and it's actually quite a good book so far. (I'm now in chapter 10.) The introduction does a lot to answer why you would want to be a programmer:

"'You've just done in two hours what it takes the three of us two days to do.' My college roommate was working at a retail electronics store in the early 2000s. Occasionally, the store would receive a spreadsheet of thousands of product prices from other stores. A team of three employees would print the spreadsheet onto a thick stack of paper and split it among themselves. For each product price, they would look up their store’s price and note all the products that their competitors sold for less. It usually took a couple of days.

'You know, I could write a program to do that if you have the original file for the printouts,' my roommate told them when he saw them sitting on the floor with papers scattered and stacked all around.

After a couple of hours, he had a short program that read a competitor's price from a file, found the product in the store's database, and noted whether the competitor was cheaper. He was still new to programming, so he spent most of his time looking up documentation in a programming book. The actual program took only a few seconds to run. My roommate and his co-workers took an extra-long lunch that day.

This is the power of computer programming. A computer is like a Swiss Army knife with tools for countless tasks. Many people spend hours clicking and typing to perform repetitive tasks, unaware that the machine they’re using could do their job in seconds if they gave it the right instructions."

^ That actually goes to why I don't like using the open source GIS software tool QGIS, not because it's a horrible piece of garbage, far from it, but because using a zillion toolbars and panels and dialogue boxes like a normal person has become very tedious to me. It's like this:
[attached: Calvin and Hobbes strip of Spaceman Spiff at a computer]
Instead I opted to start to learn how to carry out geospatial tasks in Python and while a) for certain tasks a GUI like QGIS or ArcGIS is indispensable and b) QGIS (and ArcGIS) themselves support scripting in Python, I found just diving in with Python libraries like GeoPandas was far more efficient for my purposes.
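For a taste of what that looks like, here's a minimal sketch (the file names and the EPSG code are made up for illustration):
Code:
import geopandas as gpd

# Load a vector layer (any OGR-readable source works)
parcels = gpd.read_file("parcels.shp")

# Reproject to a projected CRS so buffer distances are in meters
parcels = parcels.to_crs(epsg=32633)

# Buffer each feature by 100 m and write the result out
parcels["geometry"] = parcels.geometry.buffer(100)
parcels.to_file("parcels_buffered.gpkg", driver="GPKG")
Doing the same thing through menus and dialogue boxes every time is exactly the tedium I mean.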
python courses are notorious for not teaching you any higher level concept than just how to use python
You get out what you put in. I'm willing to bet there is far more information on learning data structures and algorithms out there in dedicated street shitting language Java and even Python than what is available for Scheme or other Lisps. On that note, AI: A Modern Approach switched over from Common Lisp to Python, and though there has been some grousing about that decision I'd be surprised if anyone made a compelling case that the text has thereby been crippled.
i think stuff like scheme is a bit less distracting because it has way less features
That's also how it got so balkanized. Why should a beginner have to find out how whatever Scheme they're using implements dictionaries if it does at all?
i would always like a newbie to do things themself and not copy paste because copying and pasting is not conducive to learning
Yes, I remember being told to write stuff down, in a way that seemed pointless, in middle school history class of all places, and the teacher mentioned there is actual pedagogy behind what seemed like busywork at the time. I later read something to that effect, which would explain why I never feel like I learn anything after copy/pasting code.
 
the author still makes the peculiar choice of using camelCase instead of snake_case where the vast majority of Python programmers would use the latter
the only good case is kebab-case but it's good to use the local convention in languages with braindead infix syntaxes that don't let you use actually good identifiers because somebody thinks the stylistic mistake a-b should be treated as a subtraction and not an identifier (s-expression languages sidestep the issue by not having any infix operators at all)
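quick python illustration of the cases in question (the names are made up):
Code:
total_price = 4   # snake_case: the usual python convention (PEP 8)
totalPrice = 4    # camelCase: what the book uses; legal, just unidiomatic
TotalPrice = 4    # PascalCase: conventionally reserved for class names
# total-price    <- kebab-case can't work here: python reads it as (total - price)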
I'm willing to bet there is far more information on learning data structures and algorithms out there in dedicated street shitting language Java and even Python than what is available for Scheme or other Lisps
yeah far more information but does it really matter which language it is in
a certain very famous algorithms textbook is based on the author's fantasy assembler
AI: A Modern Approach switched over from Common Lisp to Python, and though there has been some grousing about that decision I'd be surprised if anyone made a compelling case that the text has thereby been crippled
i've never heard of this, might be a sicp javascript edition deal where people will read the edition from before somebody had the bright idea to get rid of all those pesky parentheses
of course it is a textbook about symbolic ai and stuff iirc and python probably has the abstraction techniques to support that kind of stuff
i think sicp js gets shit on because js is really quite schemelike and afaik they somehow managed to fuck it up
never trust a js nigger to ever get anything right though
That's also how it got so balkanized. Why should a beginner have to find out how whatever Scheme they're using implements dictionaries if it does at all?
you would think this but every scheme in serious use supports srfi-69 unless it's r6rs which has its own hash tables (and even then certain r6rs implementations have srfi-69 too)
also association lists are omnipresent, even though they have linear computational complexity, which makes them a bad idea for huge dictionaries (they still work though)
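rough python sketch of the complexity difference, since an alist is just a list of pairs scanned front to back (the assoc function below mimics scheme's assoc):
Code:
# association-list style: O(n) per lookup, fine for small tables
pairs = [("a", 1), ("b", 2), ("c", 3)]

def assoc(key, alist):
    for pair in alist:
        if pair[0] == key:
            return pair
    return None  # scheme's assoc returns #f here

# hash-table style (what srfi-69 gives you): O(1) expected per lookup
table = dict(pairs)

print(assoc("b", pairs))  # ('b', 2), found by scanning
print(table["b"])         # 2, straight from the hash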

side note: if you keep disrespecting the good name of scheme i will have no choice but to bully you until you either leave the thread or learn the proper hierarchy of programming languages
 
the only good case is kebab-case
R is especially cursed and allows periods as separators
a certain very famous algorithms textbook is based on the author's fantasy assembler
I had to think about this one for a minute
i've never heard of this, might be a sicp javascript edition deal where people will read the edition from before somebody had the bright idea to get rid of all those pesky parentheses
The switch came with the 3rd edition back in 2010 when there wasn't the same type of faggotry there is today and I believe the main reason was that Python more closely resembled the pseudocode in the book
you would think this but every scheme in serious use supports srfi-69 unless it's r6rs which has its own hash tables (and even then certain r6rs implementations have srfi-69 too)
SRFI-125 does more:
Python can do virtually everything here easily AFAICT, with the exception of the set operations, which could be implemented with only a little more difficulty since dict key views support the set operators directly. I definitely don't have an easy answer for hash-table-pop!, though dict.popitem() removes and returns the most recently inserted pair, and I have to wonder how greatly desired popping a truly "arbitrary" key-value pair really is.
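A sketch of those set operations, plus the nearest thing to hash-table-pop! (the dicts are made up for illustration):
Code:
d1 = {"a": 1, "b": 2, "c": 3}
d2 = {"b": 20, "c": 30, "d": 40}

# Key views already behave like sets, no conversion needed
print(d1.keys() & d2.keys())  # intersection: {'b', 'c'}
print(d1.keys() - d2.keys())  # difference: {'a'}
print(d1.keys() | d2.keys())  # union: {'a', 'b', 'c', 'd'}

# Nearest analogue of hash-table-pop!: removes and returns the
# most recently inserted pair, no key argument required
print(d1.popitem())           # ('c', 3)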
 
I'm gradually working through Automate the Boring Stuff with Python, not because I really need the material, but to determine whether I can recommend it to n00bs
I have an edition of that, but it’s been a while since I cracked it open. I’ve had decent luck with the publisher and other books though, and I don’t remember it striking me as bad
 
Quartus isn't too bad as far as bloat goes; as long as you install only the device support for the FPGAs you have, it shouldn't take too much space. As for Xilinx, yeah, Vivado is pretty bloated even when you install only the device support for the FPGAs you need. As for languages, I prefer SystemVerilog for most cases if I'm gonna do a project with an FPGA board, but VHDL is better when it comes to fixed-point arithmetic, from my research.
The trouble is that anything Intel touches is a nightmare to work with when it comes to licensing, which isn't to say they aren't a good option.

However Intel is being forced to divest Altera so some good might come out of it. Though this unfortunately resulted in them getting Indian management as opposed to Jewish via Intel.

There really aren't any truly good PLD vendors. They are not very mainstream in the hobby market, but they easily could be, and that would spur on OSS tools.
I suspect, despite their bloat, AMD(Chinky slavemasters of Xilinx as of 2023) would make things easy for open source hobbyists to use their hardware, just as they have in the world of PC hardware.
 
I have an edition of that, but it’s been a while since I cracked it open. I’ve had decent luck with the publisher and other books though, and I don’t remember it striking me as bad
Oh yeah ... literally all of these are "read online for free" and not merely shadow library-free (though that works for me too):
There was something about "linear" complexity mentioned earlier. I don't think any of the books on that list go into much depth about such topics, but this one does:
However Intel is being forced to divest Altera so some good might come out of it. Though this unfortunately resulted in them getting Indian management as opposed to Jewish via Intel.
 
Do jeets just apply the same lifestyle of filth and chaos to their coding that they do to their physical homes?
there is no question about it, have you seen the recent improvements to microsoft windows
R is especially cursed and allows periods as separators
a valid scheme identifier: (maybe not according to the standard due to ^ being illegal or something but it works in at least 1 implementation, do not try this at home)
Code:
> (define |!@$%^&* sneed chuckman™| 4)
> |!@$%^&* sneed chuckman™|
4
it also allows periods of course
the pipe characters are there to allow the symbol to include spaces
this is the advanced hash table library which isn't needed all of the time
probably a good idea to put it into any implementation though
I definitely don't have an easy answer for hash-table-pop!, though dict.popitem() removes and returns the most recently inserted pair, and I have to wonder how greatly desired popping a truly "arbitrary" key-value pair really is.
given this is scheme they probably added it for a very good reason
scheme is not very shy about leaving things open to implementer discretion
The trouble is that anything Intel touches is a nightmare to work with when it comes to licensing, which isn't to say they aren't a good option.
idk their graphics drivers are not horrible
they probably still clutch onto anything they can hold on to and that includes fpgas so we will never have an unretarded fpga stack
 
This was my first thought: think of something and make it. But my main concern is, for the things I don't know, do I need to dig into the documentation to truly learn them? Or am I going to be lazy and just have chatgpt walk me through it? And if I do that, am I really learning? Is it even gunna tell me the right way to do it? This is all probably just cope, but still.
Python has a lot of ways to do things so ChatGPT may say one thing that might make sense for a certain use case but not another. It may also say something that doesn't even work. It can be helpful for quick documentation reference but it can also make stuff up so it's always best to refer to the documentation directly.

Do "best practice" but that's always a guideline. Use docstrings and type hinting. Keep in mind how Python manages data. For example, list comprehension is faster than traditional looping but can get ugly fast. The various math libraries people like to use have nuances you just have to learn about. Sometimes a numpy array is better, sometimes it isn't and a slice of an array is not a copy (of references) like the slice of a list.
Whatever "best practice" you decide on for a project, apply it to the entire project. Consistency matters most.

You'll see people do all sorts of stuff because people from various backgrounds use Python in different ways. Data science people rely heavily on libraries to be efficient because what they write often isn't.
The introduction does a lot to answer why you would want to be a programmer:
Reminiscent of an example Brian Kernighan used at the beginning of one of his lectures on language design. He shows a C program compared to an equivalent AWK program. It's also nice to hear one of the authors of The C Programming Language admit that even he struggles with some of the basic C functions sometimes.

Timestamped

 