Programming thread

Don't mistake those who spend 24/7 playing with libs or masturbating on message boards about monads for those who work and actually have deadlines to meet.
I think it's cool if a language caters to both. I want industry folk to be able to get stuff done without running into some massive roadblock that your abstractions put in the way. And I want library designers working outside of deadlines to be able to explore new ideas in a language, so that their work, once battle-tested, can be used by those same industry folk. Every company I've worked for has been the beneficiary of libraries that try to push the envelope on what a language can do.

As for academics, Scala was created by one. He had previously created Java generics with the help of Phil Wadler, one of the creators of Haskell. He worked to introduce higher-kinded types into Scala, and Haskell was used as a reference for this. He also worked to introduce typeclasses. The first line of the abstract of that paper is:

"Type classes were originally developed in Haskell as a disciplined alternative to ad-hoc polymorphism."

Guess what? Once you've got higher-kinded types and typeclasses, you've got the primary abstraction mechanisms of Haskell. That's why Haskell people show up in HN comment threads going "where are your higher-kinds?" It's because once they have them, they know they can use your language to write Haskell.
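
To make that concrete, here's a minimal sketch (Scala 2 syntax, names mine) of what those two features buy you: Functor is a typeclass, F[_] is a higher-kinded type parameter, and the implicit instance is roughly the encoding that paper describes.

Code:
object FunctorSketch {
  // a typeclass over a higher-kinded type parameter F[_]
  trait Functor[F[_]] {
    def map[A, B](fa: F[A])(f: A => B): F[B]
  }

  // an instance for List, passed around as an implicit:
  // roughly the encoding the typeclass paper describes
  implicit val listFunctor: Functor[List] = new Functor[List] {
    def map[A, B](fa: List[A])(f: A => B): List[B] = fa.map(f)
  }

  // generic over any container with a Functor instance; this is
  // Haskell's fmap, reconstructed out of those two features
  def double[F[_]](xs: F[Int])(implicit F: Functor[F]): F[Int] =
    F.map(xs)(_ * 2)

  def main(args: Array[String]): Unit =
    println(double(List(1, 2, 3))) // List(2, 4, 6)
}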

Scala doesn't steer Haskell programmers away. It explicitly tried to appeal to them, and they showed up.

And if you want to complain about masturbatory over-abstraction, let's just remember that Scala's standard library is absolutely full of it. Here's what the type signature for the map function on lists used to be:

Code:
map[B, That](f: A => B)(implicit bf: CanBuildFrom[List[A], B, That]): That
Whatever crimes Haskell programmers have committed, they have never done anything as masturbatory as this to their beloved map function, especially not for the spurious idea that one might want to generically obtain a more efficient data structure as a result of a map operation. The signature was considered so rightly appalling that it had to be systematically covered up in API documentation, lest newbies be terrified by what should be the simplest function in functional programming.
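
For what it's worth, the 2.13 collections redesign eventually conceded the point and scrapped CanBuildFrom; the same function on List now has the signature everyone expected all along:

Code:
// List.map as of Scala 2.13
final def map[B](f: A => B): List[B]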

And this isn't even close to the worst of it.

Scala's problem isn't that Haskell people infected it. They were invited, and a precedent had already been set for a lot of type abstraction. Scala's problem is that it tries to shove a whole bunch of very powerful but non-orthogonal concepts into one language, and the end result is that there are usually far too many ways to solve a problem, most of them bad, and any abstraction inevitably leaks or is horribly non-ergonomic.

Now F# has a similar story to Scala, in that the guy behind it was also an academic with a background in typed functional programming and was responsible for generics on the CLR, but F# is infinitely more opinionated than Scala: it has simple abstractions and working type inference, gives priority to functional programming, and has mostly kept the Haskell people away. For another example, Kotlin does a decent job of marrying FP and OOP by not trying to be anywhere near as advanced as traditional typed FP languages. Its abstractions are simple, and they mostly keep the Haskell people away. If you want a simple, amicable relationship between typed FP and Simula-style OOP, I recommend Kotlin if you're on the JVM, and F# on the CLR.
 
As for academics, Scala was created by one. He had previously created Java generics with the help of Phil Wadler, one of the creators of Haskell. He worked to introduce higher-kinded types into Scala, and Haskell was used as a reference for this. He also worked to introduce typeclasses. The first line of the abstract of that paper is:
You seem to be overshooting what I said about the tension between the idealism of academics and the pragmatic realities of working in tech. I have no problem with people remaining 'pure' to whatever ideology they follow within the parameters of their own work and their own libraries, so long as they provide an easy-to-use API which is readable and rational. And I love Haskell. I love getting my brain absolutely fucked by a language written by an alien.

The problem is when that ideology breaks down and every problem starts looking like a nail. I used Scala as an example because it's flexible enough, and has enough libs, that you can go full scalaz, Akka, etc. if you want, but you also have simple and pragmatic ways of solving a problem. I've been on both sides of the spectrum (pun) and oscillate between the two extremes, but I'm mostly speaking about the individual writing a simple CRUD app who obsessively tries to curry every function. I've seen complete and total module rewrites because a function needed an optional parameter.
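
To put a face on it, here's a hypothetical sketch (Scala, names made up) of the kind of thing I mean:

Code:
object CrudSketch {
  trait Database; trait Logger; case class User(name: String)

  // what the CRUD app actually needs:
  def saveUser(db: Database, log: Logger, user: User): Unit = ()

  // what the currying obsessive ships, so everything can be
  // "partially applied" whether anyone asked for it or not:
  def saveUserCurried(db: Database)(log: Logger)(user: User): Unit = ()
}

Both do the same job; only one of them triggers a module rewrite the day saveUser grows an optional parameter.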

If you look at the most successful applications in history, a majority of them aren't pretty and contain hacks upon hacks. Those that obsess over such things bring the company no money and no business value beyond placating their algorithmic fascination, and they sink projects.

I'm not advocating for shit code, obviously, but you need to know when to hold 'em and when to fold 'em.

A great example of this was during the "Is TDD Dead?" debate in 2014, when DHH was going over the difficulties of unit testing certain parts of Rails, and Fowler presented this creative but batshit-insane architecture of indirection, command patterns, chains of responsibility, and composition.

I believe it was in part 3, but it's been a while since I listened to it (I recommend all parts if you've never seen it).
 
Been working on a programming language with a friend of mine. We're not relying on libc for our stdlib*, so I've written a relatively naive memcpy here. In the best case it's a little bit slower than musl's memcpy, but depending on dest's and src's alignment it can take twice as long to copy. Right now there are some pain points in the language here, like how we only have while loops, and the casting stuff in that hygienic macro is uncomfortable.

Code:
## into [dest] copy [len] bytes from [src]
memcpy_bytes :: (dest: *void, len: u64, src: *void)
{
    if: (dest == null) || (src == null) { abort(); }
    if: len == 0 { return; }

    strides := len / #sizeof_t(u4096);
    len     %=       #sizeof_t(u4096);

    i: u64 = 0;
    while: i != strides
    {
        chunkcpy{u4096}(dest, src);
        ++i;
    }

    if_chunkcpy{u2048}(dest, len, src);
    if_chunkcpy{u1024}(dest, len, src);
    if_chunkcpy{ u512}(dest, len, src);
    if_chunkcpy{ u256}(dest, len, src);
    if_chunkcpy{ u128}(dest, len, src);
    if_chunkcpy{  u64}(dest, len, src);
    if_chunkcpy{  u32}(dest, len, src);
    if_chunkcpy{  u16}(dest, len, src);
    if_chunkcpy{   u8}(dest, len, src);

    chunkcpy :: #macro: (dest: ---, src: ---) T
    {
        ## these will always be name refs, so this is fine
        #cast(*T, dest) @ 0 = #cast(*T, src) @ 0;

        ## how i want to do this
        ##cast(*T, dest) @+= 1;
        ##cast(*T,  src) @+= 1;

        ## how i have to do this right now. it's cancer
        dest_t := #cast(**T, &dest);
         src_t := #cast(**T, & src);

        dest_t @ 0 @+= 1;
         src_t @ 0 @+= 1;
    };

    if_chunkcpy :: #macro: (dest: ---, len: u64, src: ---) T
    {
        ## these will always be name refs, so this is fine
        if: len >= #sizeof_t(T)
        {
            chunkcpy{T}(dest, src);
            len -= #sizeof_t(T);
        }
    };
};

One of the notable things about this language is that no operators are defined by the language itself, and you can define whatever operators you like. We're pushing as much stuff into userland as we can so the language stays flexible. Long term, this means that things like if statements, loop statements, switch statements, etc. will also be defined in userland. The idea here is that you can define your own dialect of the language, or use a library that implements a dialect that you like. Additionally, the scoping rules mean that you can define, or redefine, the language without trampling on code that lives in a different scope.

An example of defining my own operator can be seen in this code snippet from a SHA-3 implementation I wrote:

Code:
slice_shuffle :: #macro: (state: u64, last: u64, shift: u64) -> u64
{
    ## intentionally uses state twice to avoid passing state as a ptr
    out := state;
    state = last <<|<< shift;
    return out;
};

I've defined rotate left as `<<|<<`, and instead of using the usual rotate idiom I used LLVM's funnel shift left intrinsic:

Code:
`<<|<<` :: #macro: (a: +T, i: T) T -> T { return #llvm_fshl(a, a, i); };
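
For reference, the usual idiom it replaces looks like this (Scala here purely for illustration; on the JVM it happens to be safe because shift counts are masked mod 64, whereas the equivalent C is undefined behavior at a shift of zero, which is part of why the intrinsic is nicer):

Code:
// the classic rotate-left idiom
def rotl(a: Long, i: Int): Long = (a << i) | (a >>> (64 - i))

// the JVM also exposes rotation directly, and HotSpot typically
// compiles it down to a single rotate instruction
def rotl2(a: Long, i: Int): Long = java.lang.Long.rotateLeft(a, i)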

Right now we're still in the prototype stage and the compiler is written in C++. It's a buggy piece of shit that barely implements a lot of the features we want, but it's enough that we're starting to write non-trivial programs with it. We're hoping that sooner rather than later we can start working on the self-hosting compiler: the bootstrap compiler has a bunch of naive design decisions that prevent us from doing everything we want without rewriting large sections of it, and if we're going to rewrite it, we'd rather put that effort into getting more experience with our language and testing our ideas.

*We're iffy on the idea of having a stdlib in the first place, but that's a long conversation. The summary is that we're leaning towards offering "common libraries" that are useful, but otherwise we won't define any libraries in the standard because that doesn't seem like a good use of our time.
 
That's kind of a vague question. Are you asking whether the implementation matches the description of the algorithm in the readme? If the code is efficient? If the style is good?

You're right, I'm sorry. I'm wondering about the first two things, and also about extensibility toward other uses. I'm about to relocate and am trying to make a mini-vacation of it, so I guess the question I'm trying to answer is whether it would be worth the time wrapping my brain around this. I have extensive experience with VBA (it suuuuucks), some proficiency in R, and have looked in C(++)'s general direction, if that helps.
 
I'd like to see what WFC's behavior is when applied to one-dimensional data. My interest here is applying it to input text to see if it can loosely infer grammar rules, which could be useful for fuzzing programming languages. http://www.vegardno.net/2018/06/compiler-fuzzing.html talks about a similar concept, where you define a looser version of your language's grammar in order to explore more of your compiler's execution space than what's formally defined by your grammar. WFC could be a robust analysis tool for language development.
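
In the degenerate case (order-1 constraints, no backtracking) 1-D WFC collapses into a Markov-style adjacency model: learn which tokens may follow which from the sample, then collapse positions left to right. A toy sketch of that degenerate case (Scala, everything hypothetical):

Code:
import scala.util.Random

object Wfc1D {
  // learn which character may legally follow which from a sample:
  // the order-1 case of WFC's overlapping model on 1-D data
  def learn(sample: String): Map[Char, Vector[Char]] =
    sample.sliding(2).filter(_.length == 2).toVector
      .map(p => (p(0), p(1)))
      .groupMap(_._1)(_._2)
      .map { case (k, vs) => k -> vs.distinct }

  // collapse positions left to right, picking any allowed successor;
  // real WFC picks the lowest-entropy cell and backtracks on
  // contradiction, this sketch just stops when it gets stuck
  def generate(rules: Map[Char, Vector[Char]], start: Char,
               len: Int, rng: Random): String = {
    val out = new StringBuilder().append(start)
    var cur = start
    while (out.length < len && rules.contains(cur)) {
      val choices = rules(cur)
      cur = choices(rng.nextInt(choices.length))
      out.append(cur)
    }
    out.toString
  }

  def main(args: Array[String]): Unit = {
    val rules = learn("if (x == 1) { f(x); } else { g(x); }")
    println(generate(rules, 'i', 40, new Random(42)))
  }
}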
 
Not only that, but those that beat off to algorithmic elegance have never had to develop a real-world, multidevice, multiplatform solution.

The code in those applications can be horrifying, because the real world is messy. Irreducible complexity is a fact in the digital world, just as in nature.

My favorite example of this is the original Netscape code. It was full of dragons, almost incomprehensible, and had hack after hack piled on to fix other hacks. But guess what... what junior developers don't understand is that those 'hacks' are business value. Each one fixes a bug or some strange set of circumstances that caused instability.

Then some jerk-off tech lead comes onto the project and convinces management of an overhaul or major rewrite, only to sink their product, because the competition wasn't rewriting code that already worked (warts and all).


These days you are competing against some cheap Indian contractor who just bangs out shit code to meet deadlines.

If you wanna think about niceties like design, reuse, architecture, etc. you won't be in a job for long, because all the Project Manager cares about is hitting that deadline.

Sure, your Engineering boss will bang on about the benefits of reuse etc., but he's not really the one with any power.
 
wow I didn't expect to see Bob Harper here lol

did I post these yet? I probably did but if I ever enter academia I will have a page like this

View attachment 1964508

View attachment 1964510


View attachment 1964506
This ↓ so much.
You have discovered an open secret of the industry: real programmers don't like haircuts.
If you don't look like a washed up John Romero impersonator (Carmack is sometimes fine too) you're in the wrong industry.
changingTheFaceOfCoding.jpg

In which case I hope you contract a contagious terminal disease.
 
This ↓ so much.

If you don't look like a washed up John Romero impersonator (Carmack is sometimes fine too) you're in the wrong industry.
View attachment 2015353
In which case I hope you contract a contagious terminal disease.
Sorry no. I was thinking of Stallman and Ivan Godard.
Ivan Godard said:
Did you know you could overload the comma operator in C++?
 
If you've ever used Boost libs with C++, the first time you see the 'operator()()' and 'operator,' overloads there's a double take. But they make some statements a bit more elegant.

Otherwise, I can't imagine any other situation where you'd want to overload a comma.

If anyone here is familiar with Objective-C, there are even weirder things you can do, since every method call is a k,v message to objc_msgSend
 
If anyone here is familiar with Objective-C, there are even weirder things you can do, since every method call is a k,v message to objc_msgSend
I presume much of that weirdness is inherited from Smalltalk. No, I should put it another way: Obj-C is probably less weird than the Smalltalk it is significantly inspired by.
 
Can't believe I missed an opportunity to have an internet fight about Go while I was away. My gripes with it are also tied to my complaints about C.
I don't know how many of you are familiar with the claim that C is not a low-level language. Yes, it is closer to the metal; yes, you can inline assembly; but in the end, if it weren't for lots of fancy compiler magic, C today would suck. We're in a state where compilers and processors have co-evolved to provide the facade of a serial machine while a modern processor is very far from it. So this all pretty much sucks and is why we have to live with stuff like Heartbleed.
Now Go is a language designed by intelligent people for people less intelligent than themselves, which I guess is fine if you have a bunch of code monkeys that need to churn a lot of code out fast. It is so dead simple that there's a very low chance of "ugly hacks" being required to create business value. It's verbose and straightforward, has a clever compiler, and is an all-around soul-draining experience to work with. It maintains the illusion of being low-level and systems-programming-like while being very unfit for either, both due to the limitations it shares with C, and because GC is a non-starter for systems programming. It is suitable for churning out decently performing backend and CLI code, right up until the idiot-proof language has to deal with a better idiot.
The main weaknesses I've seen:
  • Escape to heap. Someone will fuck it up, at which point your performance will tank hard.
  • Empty interfaces end up being passed around, at which point you're back to void*.
  • goroutines are badly designed. They reintroduce goto to your code and eliminate structure.
  • goroutines are not free, no matter what anyone tells you. They are allocated with a minimal stack, which grows at runtime, which forces copying the stack, which fucks performance. Since you have no manual control over the initial stack size, this forces you back to using thread pools to ensure your virtual thread maintains a big enough stack.
The type system is also at a "gentleman's agreement" level.
At that point, I'd rather just use Java, tbh. The JIT can give you better performance because it does runtime analysis. Maybe that's what you get when your language is designed by folks who are researchers first and engineers second. Which brings us neatly to the issues with Scala and F#, and other academically inclined languages. Rich Hickey noted that, while he was a member of a language design forum, he overheard some designers mention "I don't think I've ever talked to a database".
If these are the people designing our languages, no wonder we're fucked. Which is a point in favor of OCaml, because Jane Street is investing in it instead of just crusty academics.
Static types are nice for large projects, but they are like bowling with the guard rails up. They are a way to handle complexity, but not the only way. Data specs and immutability solve a lot of the same problems while staying in dynamic land. Yes, objects are faster than passing maps around. Structs are even faster. But that introduces nillability or Optionality, which both suck. I've never once had a Maybe<Apple> or a nil Apple.
---
Going even further back, to the question of where to start: either Smalltalk or Scheme. They are excellent pedagogical tools. Learn how to think before you learn how to wrangle with the insanity-inducing machine of searching for a missing semicolon.
 
At that point, I'd rather just use Java, tbh. The JIT can give you better performance because it does runtime analysis. Maybe that's what you get when your language is designed by folks who are researchers first and engineers second.
Java's ridiculous type system is the unmistakable footprint of smelly research people. In fairness, they didn't have as many high-profile mistakes to learn from in 1994 as we do today - most notably Java itself.
 
Java's ridiculous type system is the unmistakable footprint of smelly research people. In fairness, they didn't have as many high-profile mistakes to learn from in 1994 as we do today - most notably Java itself.
I absolutely hate Java due to its verbosity and the way it locks you into doing things in one very specific way.

I have to 'box' integers to pass them by reference? Are you fucking kidding me? I have NO control over the stack, even for tiny structures? How the fuck are you supposed to optimize, then? == doesn't work on strings? Are you for real!?

Some of the complaints sound petty, but it's a lot of little concentration-breaking things like that that add up to a truly miserable experience.

And re: lack of stack control. The standard defense is "Java doesn't need it because it smartly optimizes your code for you!" I have never seen a single Java program in my life that was not a bafflingly slow resource hog. The smartest people in the world can't seem to make it run fast outside of benchmarks. What hope do common business coders have?
 