Programming thread

The way I see it, the Turing "tape" setup feels more like a description of how the mathematical/natural concepts can be applied in practice (like with machines and registers and shit). Church's lambda calculus feels closer to a direct model of whatever natural universe properties make all this shit a thing. I'm a lil sleepy rn, so please forgive my lack of eloquence here lol.
turing machines form a very simple abstraction that is relatively close to our computers, and lambda calculus is a very simple abstraction that is the native instruction set of god's computers
 
Isn't that just the BSD license with (or perhaps without) extra steps?
To quote wikipedia:

This happens to present the same dilemma about forcing users to use specific licenses when redistributing.

So to answer your question, it's effectively without extra steps. There are no steps for users except to enjoy the stuff freely given to them. Or they can hate it; it's their choice. I'm not their mama. I'm not nearly fat enough.
Just wanted to quickly follow this up with some new info I came across: There are some permissive license variants like 0-clause BSD and also notably the MIT No Attribution License that are considered to be effectively public domain.

In my experience, the main distinction between permissive and public domain licenses basically everywhere (Creative Commons versions vs CC0, for example) is attribution to the original author. In fact, of the common CC license types, CC0 is the only Creative Commons license variant that doesn't include attribution—though I'm sure you could create some weird hybrid most sites won't support properly like with music.
Along with this, there are often copyleft elements that retain an arguably heavy level of control over the copyright of the work (and thus its users): controlling how it may be used and distributed, and exercising a degree of control over users to the questionable net benefit of all. This, in essence, is the original distinction I made between the Unlicense and BSD in the quotes.

Edit: BSD-3-Clause-No-Nuclear-Warranty is a fun one—no use in nuclear facilities. But if you can somehow use sockchat or firebird to make nukes, all the power to you.

Edit 2: The Anyone But Stefan Esser (ABSE) License is another fun one with an interesting and lengthy history. See here, here, and here (archive)—iirc it first appeared a while before Todesco's use here.

1763667613112.png
Forgive the weird overflowing text here. The site's CSS is done in a wacky way and I needed to fit all content on screen without a scrollbar (a scrollbar that remains regardless of window size :mad:).

The iOS jailbreaking community and the wider computer security scene's issues with Stefan Esser (i0n1c) are more than a decade old and could make for decent thread content. I don't remember enough to do it justice in a summary rn, but it's textbook Internet drama.
 
cross-quoting this into the programming thread so as to not shit up the open source software community thread:
This is a bit of an amateur/outsider question, but when they teach Software Engineering or Computer Science or whatever in university, do they sit the students down and plop a book of Grammar or Rhetoric in front of them? Is Aristotle or any other lesson in Formal Logic a requisite part of the degree?
i don't really know, but i'm leaning on "no". but we can all be sure that the bootcamp jeets certainly do not learn anything about any sort of philosophy, be it unix philosophy or aristotle philosophy
In my line of work (nothing software) knowing a bit of OR, AND, NOT, NAND etc. is part and parcel of the job and I've always just assumed that the various software languages are all necessarily different implementations of formal logic because of this. Is that even acknowledged in school though? I never hear references to philosophy within programming besides socio-politics.
yes and no. while programming languages are all different implementations of the same formal logic at a high level of abstraction, you will find that in the low-level details they can be pretty different. programming is about as much of an art as it is a science, because if it wasn't we would all still be using assembly or fortran for everything

more on the "unix philosophy": when "unix philosophy" is brought up, it refers to a style of software design that unix and its variants have been notorious for: a million little programs that do exactly 1 thing in the most retardedly simple manner possible, with which complex things can be readily constructed by gluing them together, using the program that's really good at gluing programs together and not much else
 
more on the "unix philosophy": when "unix philosophy" is brought up, it refers to a style of software design that unix and its variants have been notorious for: a million little programs that do exactly 1 thing in the most retardedly simple manner possible, with which complex things can be readily constructed by gluing them together, using the program that's really good at gluing programs together and not much else
Yeah... I can feel the whole corpus of programming theory edging into my life nowadays. At the moment I just sort of see it as a mix of Logic, Technique, and Textiles (use Wool++ for Cold Weather applications. If too rough, implement a layer of Polyester Script where it interacts with the Skin User's Interface).

The whole 'here's how to decide which programming language & grammar to automate a sequence and execute a program' thing is becoming important as I get older and plan more complicated dinner parties.
 
plop a book of Grammar or Rhetoric in front of them?
No.
Is Aristotle
No.
any other lesson in Formal Logic a requisite part of the degree?
Yes. We had a course that covered propositional logic and first-order logic. First-order logic was also touched upon in further courses like Logic-oriented programming and Introduction to knowledge-based systems.
 
cross-quoting this into the programming thread so as to not shit up the open source software community thread:

i don't really know, but i'm leaning on "no". but we can all be sure that the bootcamp jeets certainly do not learn anything about any sort of philosophy, be it unix philosophy or aristotle philosophy

yes and no. while programming languages are all different implementations of the same formal logic at a high level of abstraction, you will find that in the low-level details they can be pretty different. programming is about as much of an art as it is a science, because if it wasn't we would all still be using assembly or fortran for everything

more on the "unix philosophy": when "unix philosophy" is brought up, it refers to a style of software design that unix and its variants have been notorious for: a million little programs that do exactly 1 thing in the most retardedly simple manner possible, with which complex things can be readily constructed by gluing them together, using the program that's really good at gluing programs together and not much else
im lucky my Basics of Computer Science lecturer was so based
the first thing he did was ask us "what is 5?" and in general talked a lot about philosophy
pretty much everything he would talk about was something i had to think about back when i was starting out
 
what is 5?
in my personal favorite system of thought i think it would be λfx.f (f (f (f (f (x)))))
in general talked a lot about philosophy
i don't much care for philosophy but i don't like niggers who go around never thinking about why they're doing some incredibly niggerlicious shit
 
in my personal favorite system of thought i think it would be λfx.f (f (f (f (f (x)))))

i don't much care for philosophy but i don't like niggers who go around never thinking about why they're doing some incredibly niggerlicious shit
yeah but he had to explain to the niggercattle (my fellow students) to disassociate the squiggle that is 5 from the concept it represents
that an application of a function is really just the same thing as a value conceptually at the end of the day
he brought up socrates saying that he thought that there exists something like THE five or THE human, while aristotle disagreed with it
another time he asked us "what does it mean to understand?", if you were to ask a regular person "do you understand gravity?", the person would be like "duh? i throw thing up the thing comes down", but if you were to ask a theoretical physicist "do you understand gravity?" the guy would be like "hell nah"

although i disagree with him on many philosophical issues i treasure that he tried to explain the programmers way of thought to the cattle even if they might've not understood it at all.
in general he explained everything extraordinarily well, and i do wish other lecturers were like him
it's a real shame this week was his last lecture with us
 
We had a course that covered propositional logic and first-order logic.
he tried to explain the programmers way of thought
I can understand that if you're A Programmer going to school to learn Programming that you're being taught how to use tools - not how to design them. It's a Trade and all skilled trades basically function like that. If a tradesman is designing, it's usually something of a lower order than his own tools. (Unless he's a Master.)

But higher up the food chain is The Engineer - whose job is to understand the fundamentals well enough to design a new thing from first principles. In this case a new software language or a refinement of an existing language. I would hope/expect that guy to basically be taught the old Latin Trivium because he needs to know the rigors of a good and concise argument for his software logic. He needs to pick a philosophy around Objects, Categories, Ground etcetera to build his software around, the way an Orator uses them to build an argument. Ideally all the way from Binary Code, through Logic Gates, up to Logic Arguments and so forth, to build programs.

What I'm hearing is: 'ain't nobody calling themselves a Software Engineer has learnt that shit for fifty years'?
 
I'm pretty sure we didn't have classical philosophy. But we did have computing history, logic and mathematical proofs, and also a dedicated system design course.
I am currently taking a philosophy course and during the first semester I haven't learned anything new yet.
I think a lot of the core concepts are being taught, but not really in the context of greek philosophers but directly in the context of computing.
 
IMO it should be a major part of computer science courses. Leave dedicated code-monkeying to software engineering ones instead.
I ended up deciding against doing CS for this reason. Though I think you can specialize more in that kind of thing in postgrad, and I guess it's normal for bachelor's degrees to be very broad.
 
I can understand that if you're A Programmer going to school to learn Programming that you're being taught how to use tools - not how to design them. It's a Trade and all skilled trades basically function like that. If a tradesman is designing, it's usually something of a lower order than his own tools. (Unless he's a Master.)

But higher up the food chain is The Engineer - whose job is to understand the fundamentals well enough to design a new thing from first principles. In this case a new software language or a refinement of an existing language. I would hope/expect that guy to basically be taught the old Latin Trivium because he needs to know the rigors of a good and concise argument for his software logic. He needs to pick a philosophy around Objects, Categories, Ground etcetera to build his software around, the way an Orator uses them to build an argument. Ideally all the way from Binary Code, through Logic Gates, up to Logic Arguments and so forth, to build programs.

What I'm hearing is: 'ain't nobody calling themselves a Software Engineer has learnt that shit for fifty years'?
How does knowing philosophy help you implement a programming language?
 
I am currently taking a philosophy course and during the first semester I haven't learned anything new yet.
Out of curiosity, which schools of thought/dudes?
I think a lot of the core concepts are being taught, but not really in the context of greek philosophers but directly in the context of computing.
Yeah it's fine if they deracinate it a bit. Though personally I'd always prefer to know the memeology of a way of thinking so I can trace it back to its roots and find out if the guy was a fruit-eater or not.
 
Out of curiosity, which schools of thought/dudes?
It is just the first semester, so we are mostly just discussing the basics - the different conceptions of philosophy and the basics of ontology, epistemology and ethics. For each we get a few viewpoints from notable philosophers, but the different schools of thought are more marginal for now. It's really more about laying the foundation, albeit from multiple points of view.
 
yeah but he had to explain to the niggercattle (my fellow students) to disassociate the squiggle that is 5 from the concept it represents
although i disagree with him on many philosophical issues i treasure that he tried to explain the programmers way of thought to the cattle even if they might've not understood it at all.
oops i forgot that there are a bunch of humanlike creatures (i wouldn't call them humans exactly) that are completely incapable of separating names from the concepts they represent
another time he asked us "what does it mean to understand?", if you were to ask a regular person "do you understand gravity?", the person would be like "duh? i throw thing up the thing comes down", but if you were to ask a theoretical physicist "do you understand gravity?" the guy would be like "hell nah"
understand can mean many things and the only way to know is through context
ask a theoretical physicist if he understands gravity in a very non-theoretical-physics context and you might get a "yeah thing goes up thing comes down, except if you're trying to fuck with a bunch of shit people have been studying since the mid 1900s"
that an application of a function is really just the same thing as a value conceptually at the end of the day
only from an abstract modeling context, of course. although "sneed" and (bruteforce-sha256 #x0899b7238080d2faf7cbcd1f6f5beac9c04a8d193e160d1b48b786f625930cc3) are conceptually identical, one of them makes the box get way warmer

Though personally I'd always prefer to know the memeology of way of thinking so I can trace back to its roots and find out if the guy was a fruit-eater or not.
of course if something is right it's right, doesn't matter how retarded the person who said it was
 
View attachment 8105928
Reattached because of how wrong it is. The bottom case doesn't fail: there's no segmentation fault if the pointer is cast to a char*. You just get the first {32,64,...} bits interpreted as an "int". You have to, e.g., inject an int as a char* to get a segfault. But then even void* doesn't resolve it. It breaks my brain to think how someone could be this wrong. There's more here too that I'm not mentioning. Not gonna type it all out. I'm just astounded at human stupidity. Flummoxed, even.
Very often char* points to a string literal such as "dGhlIGwhatever".

That string literal gets placed in the .rodata section of the process image which is mapped read-only. Try mutating it and you *do* get a segfault.
 