Programming thread

perhaps try separating the logging logic out from the program logic when possible
e.g. you have some outer code that runs other code and is in charge of logging statistics and any thrown exceptions, while the inner code does not care about logging at all, except for maybe returning some extra statistics that could be logged
After reading this and thinking through the problem, I've realized that, at least for non-trace-level logging, I should invert the dependency, and expose the logger as a wrapper... wait a second, is this a monadic pattern!?!?

Code:
// plain function; knows nothing about logging
function foo(params) { /* ... */ }

// Logger wraps a function, logging each call and its result
const Logger = (fn) => (...args) => {
  console.log(`${fn.name} called with`, args);
  const result = fn(...args);
  console.log(`${fn.name} result:`, result);
  return result;
};

const foo_with_logs = Logger(foo);
foo_with_logs(params);
// results in something like
// foo called with (params)
// foo result: (results)
 
wait a second, is this a monadic pattern!?!?
while Logger(foo) looks a lot like the unit operation, i don't think foo_with_logs(params) is like the bind operation
to be a proper monad it has to act like a monad, and this seems to just be a somewhat monadish higher order function of some sort
i don't know category theory though so i might be completely wrong

i would maybe make it so foo uses a special context (maybe by having foo returned from a lexical closure that was passed a logger), and that context has a function that logs whatever is passed into it
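something like this minimal sketch, in Haskell (mkFoo and the doubling body are made up):
Code:
-- foo is built by a closure that captured a logger
mkFoo :: (String -> IO ()) -> (Int -> IO Int)
mkFoo logIt = \x -> do
  logIt ("foo called with " ++ show x)
  let result = x * 2               -- stand-in for the real work
  logIt ("foo result: " ++ show result)
  pure result

main :: IO ()
main = do
  let foo = mkFoo putStrLn         -- the "context" is just the captured logger
  _ <- foo 21
  pure ()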
 
i would maybe make it so foo uses a special context (maybe by having foo returned from a lexical closure that was passed a logger), and that context has a function that logs whatever is passed into it
Yeah, that's closer. We want something that follows this signature: bind :: m a -> (a -> m b) -> m b
For that we need to define m a. Since we want it to be a logger, it should be something like logger a = logs -> (a, logs)
So our result = foo_with_logs(params) becomes params -> logs -> (result, logs)
But if we don't pass around logs and results and instead rely on side effects, it doesn't make any sense.
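A minimal sketch of that logger a = logs -> (a, logs) shape in Haskell (names are made up; strictly speaking this is the State monad specialized to a log, whereas the classic Writer monad returns (a, logs) without taking logs in):
Code:
type Logs = [String]
type Logger a = Logs -> (a, Logs)   -- every step threads the log through

unit :: a -> Logger a
unit a = \logs -> (a, logs)

bind :: Logger a -> (a -> Logger b) -> Logger b
bind m f = \logs -> let (a, logs') = m logs in f a logs'

tell :: String -> Logger ()         -- append one log line
tell msg = \logs -> ((), logs ++ [msg])

fooWithLogs :: Int -> Logger Int    -- hypothetical foo with logging threaded in
fooWithLogs x =
  tell ("foo called with " ++ show x) `bind` \_ ->
  let r = x * 2 in
  tell ("foo result: " ++ show r) `bind` \_ ->
  unit r

main :: IO ()
main = print (fooWithLogs 21 [])    -- (42,["foo called with 21","foo result: 42"])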
 
Yeah, that's closer. We want something that follows this signature: bind :: m a -> (a -> m b) -> m b
For that we need to define m a. Since we want it to be a logger, it should be something like logger a = logs -> (a, logs)
So our result = foo_with_logs(params) becomes params -> logs -> (result, logs)
But if we don't pass around logs and results and instead rely on side effects, it doesn't make any sense.
One pattern I quite like is using F# computation expressions to build up a series of operations to perform, which I can then pass conditionally into a function that either just executes them, or into one that executes each step and logs it in sequence. It's somewhat comparable to the interpreter pattern in OO languages. It also lets you easily swap out the "interpreter" to do different things, like only log the start and end of the operation flow, or trace each individual function call, or log how long each step took, etc.
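Not a computation expression, but a rough Haskell sketch of the same swappable-interpreter idea (Step, runPlain and runTraced are made-up names):
Code:
import Control.Monad (foldM)

-- a pipeline is a list of named steps over some state s
type Step s = (String, s -> IO s)

-- interpreter 1: just execute the steps
runPlain :: s -> [Step s] -> IO s
runPlain = foldM (\s (_, f) -> f s)

-- interpreter 2: log each step before running it
runTraced :: s -> [Step s] -> IO s
runTraced = foldM (\s (name, f) -> putStrLn ("step: " ++ name) >> f s)

main :: IO ()
main = do
  let steps = [("double", pure . (* 2)), ("inc", pure . (+ 1))] :: [Step Int]
  runPlain 20 steps >>= print   -- 41
  runTraced 20 steps >>= print  -- logs "step: double", "step: inc", then 41
A timing interpreter would just be a third runner that wraps each call in clock reads.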
 
monads are neat but nobody really needs to use them unless your language is strict about purity and forbids you from doing random side effects whenever you feel like it
just use a higher order function in a way that looks ok and you'll be golden
 
monads are neat but nobody really needs to use them unless your language is strict about purity and forbids you from doing random side effects whenever you feel like it
just use a higher order function in a way that looks ok and you'll be golden
Monads are very cool for error handling though. Just having a result monad defined as result a = a | error
and bind defined as
Code:
data Result a = Error | Ok a   -- the "result a = a | error" type, made concrete

bind :: Result a -> (a -> Result b) -> Result b
bind Error _   = Error   -- short-circuit on the first error
bind (Ok a) fn = fn a    -- otherwise feed the value to the next function
lets you chain an arbitrary number of functions that can potentially fail, and I think it's beautiful.
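For example, chaining a few fallible steps with it (safeDiv is a made-up example, reusing the Result and bind above):
Code:
safeDiv :: Int -> Int -> Result Int
safeDiv _ 0 = Error
safeDiv a b = Ok (a `div` b)

chained :: Result Int
chained =
  safeDiv 100 5 `bind` \x ->   -- Ok 20
  safeDiv x 4   `bind` \y ->   -- Ok 5
  safeDiv y 0                  -- Error, and the chain stops here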
 
years of making new languages still no reason to use anything other than c
lisp is both older and better than the portable assembler for unix (C is mostly useful for writing new garbage collectors, so it's a nice language to have, but in an ideal world we would have infinite-memory computers that don't need those)
lets you chain an arbitrary number of functions that can potentially fail, and I think it's beautiful.
yeah it's neat but many languages include exceptions and they end up doing a lot of the same shit that error monads do
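e.g. the same fail-fast behavior with exceptions doing the plumbing (Haskell's div already throws on zero, so this is a stand-in for any exception-based chain):
Code:
import Control.Exception (ArithException, try, evaluate)

main :: IO ()
main = do
  -- each division feeds the next; a throw skips the rest of the chain
  r <- try (evaluate (((100 `div` 5) `div` 4) `div` 0))
         :: IO (Either ArithException Int)
  print r   -- Left divide by zero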
 
Monads are very cool for error handling though. Just having a result monad defined as result a = a | error
and bind defined as
Code:
data Result a = Error | Ok a   -- the "result a = a | error" type, made concrete

bind :: Result a -> (a -> Result b) -> Result b
bind Error _   = Error   -- short-circuit on the first error
bind (Ok a) fn = fn a    -- otherwise feed the value to the next function
lets you chain an arbitrary number of functions that can potentially fail, and I think it's beautiful.
That one has the drawback of unwrapping and re-wrapping the result at each step though, which usually doesn't really matter, but can be a problem in a hot loop.
 
That one has the drawback of unwrapping and re-wrapping the result at each step though, which usually doesn't really matter, but can be a problem in a hot loop.
the standard cope here is that The Compiler Will Definitely Figure This One Out™
 
the standard cope here is that The Compiler Will Definitely Figure This One Out™
Which is where the interpreter pattern comes in, to short-circuit the evaluation.
Code:
// run each function in turn, returning the first Error immediately
let rec evaluate currentValue remainingFunctions =
    match currentValue with
    | Ok a ->
        match remainingFunctions with
        | h::t -> evaluate (h a) t
        | [] -> currentValue
    | Error _ -> currentValue
This example takes a currentValue of type Result and a list of remaining functions to evaluate. On the first error it just returns that error; otherwise it hands the evaluation off to the next function.
 
years of making new languages still no reason to use anything other than c
If by "using" C you also count embedding compiled bytecode from another language in a C byte array and passing that to the interpreter (written in C), yeah sure.
- Python is fast enough
Really depends on what is meant by "enough". The soydevs have to raise their standards.
 
The two most common pieces of programming advice:
- Python is fast enough
- Compiler is smarter than you

I hope they play along nicely :ratface:
"the compiler is smarter than you" does not paint the full picture because it's way better than you at some things (extremely specific micro-optimizations) and completely useless at other things (not making your algorithm run in O(n!))
you have to help the compiler and it will help you
Really depends on what is meant by "enough". The soydevs have to raise their standards.
python actually is fast enough if your program is i/o bound and you don't do something extremely dumb, which happens to include an incredibly large number of programs
besides, if you have to loop through a million items it's often a sign that you're not using a structure that lets you only loop through the logarithm of a million items (sketch below)
universal law of programming: a bad dev can easily write incredibly slow and broken code in any language
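e.g. a minimal sketch of that logarithm point, comparing a plain association list with Data.Map (the million squares are made up):
Code:
import qualified Data.Map.Strict as Map

-- linear scan over an association list: O(n) per lookup
findLinear :: Eq k => k -> [(k, v)] -> Maybe v
findLinear = lookup

-- balanced-tree map: O(log n) per lookup
findLog :: Ord k => k -> Map.Map k v -> Maybe v
findLog = Map.lookup

main :: IO ()
main = do
  let pairs = [(i, i * i) | i <- [1 .. 1000000 :: Int]]
      m     = Map.fromList pairs
  print (findLinear 999999 pairs)  -- walks ~a million cells
  print (findLog 999999 m)         -- ~20 comparisons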
 
The compiler being smarter than the programmer only applies to novices.
No one understands the context, exact intentions and requirements of your code more than you do.

The compiler is a convenience item; even a standard 1000 line program could take an additional 20 hours of piecing together the assembly code for every individual part.

Even if you are only moderately experienced with assembly for a given instruction set, you will have the ability to outperform the compiler given enough time. The compiler is not smarter, it is faster.
A small increase in size and reduction in speed is usually worth the immense amount of time saved.
 
The compiler being smarter than the programmer only applies to novices.
No one understands the context, exact intentions and requirements of your code more than you do.

The compiler is a convenience item; even a standard 1000 line program could take an additional 20 hours of piecing together the assembly code for every individual part.

Even if you are only moderately experienced with assembly for a given instruction set, you will have the ability to outperform the compiler given enough time. The compiler is not smarter, it is faster.
A small increase in size and reduction in speed is usually worth the immense amount of time saved.
the compiler is a tool that can automatically apply a number of advanced optimizations that assembly programmers stay away from because they make the code really hard to understand. no human is capable of consistently and efficiently making all the logical jumps that smart compilers make when they are optimizing code, which is why they are frequently called "smarter than you"
you could probably get closer to the stuff compilers like doing all the time with sufficiently advanced macro assemblers, but at some point those are just compilers by another name
you might be able to make a small function 15% faster than the compiler can with a few hours of hard work, but i don't think you can realistically write an entire application in assembly and consistently beat the compiler throughout
 
The compiler being smarter than the programmer only applies to novices.
No one understands the context, exact intentions and requirements of your code more than you do.

The compiler is a convenience item; even a standard 1000 line program could take an additional 20 hours of piecing together the assembly code for every individual part.

Even if you are only moderately experienced with assembly for a given instruction set, you will have the ability to outperform the compiler given enough time. The compiler is not smarter, it is faster.
A small increase in size and reduction in speed is usually worth the immense amount of time saved.
Yes, humans are smarter than compilers, BUT not at local optimizations, most of the time. Instead, macro optimizations are what humans are good at, like not using an O(n^2) algo when you can do an O(n * log(n)) one. (Well, that still has other caveats, but I digress.)

And quite frankly, most of the time micro-optimizations don't matter at all, as you are likely waiting on IO or network calls (at least in my field of work).
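A quick sketch of that O(n^2) vs O(n * log(n)) point, using duplicate detection as a made-up example:
Code:
import Data.List (nub, sort)

-- O(n^2): nub compares each element against everything kept so far
hasDupSlow :: Eq a => [a] -> Bool
hasDupSlow xs = length (nub xs) /= length xs

-- O(n * log(n)): sort first, then duplicates sit next to each other
hasDupFast :: Ord a => [a] -> Bool
hasDupFast xs = or (zipWith (==) sorted (drop 1 sorted))
  where sorted = sort xs

main :: IO ()
main = print (hasDupSlow [3,1,2,3], hasDupFast [3,1,2,3])  -- (True,True)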
 
Yes, humans are smarter than compilers, BUT not at local optimizations, most of the time. Instead, macro optimizations are what humans are good at, like not using an O(n^2) algo when you can do an O(n * log(n)) one. (Well, that still has other caveats, but I digress.)
and such optimizations often matter a bit more in the grand scheme of things than how exactly you juggle your registers
And quite frankly, most of the time micro-optimizations don't matter at all, as you are likely waiting on IO or network calls (at least in my field of work).
it's like that in an overwhelming number of programs; even many video games spend a lot of their wall-clock time telling the gpu what to do, because most of the number crunching is happening over there
also, as long as you don't do your 200-thousand-item bubblesort on the ui thread, nobody will really notice
 