Qt Future Releases Planned To Be For "Paying Customers Only" For A Year - Qt is the framework which KDE relies on. This will kill all open source contributions to Qt.

The thing about Electron is it's, again, a necessary thing. It enables developers to make a single application for both web and all major desktop OSes. It's what Java always wanted to be and failed at so spectacularly. Software development is prohibitively complicated and time-consuming; even after 70 years of learning that the hard way, people still don't seem to appreciate how many talented man-hours software takes to make, and it's only getting more complex as software is expected to simply do more things. No matter how big a company is, it has a limited number of employees, limited time and limited money. Companies, especially Microsoft, have to strategize and make appropriate abstractions to both help themselves make software and help their customers make software for their platforms.
It's fair to say it's an unfortunate state of affairs that everything needs to be made on top of HTML, CSS and JS. It's all just kind of a fungus that sporadically grew and nobody really planned any of this out. But there are great strides by pretty much every massive IT corporation to optimize the entire stack, and even attempts to replace parts of it, like with WebAssembly. There are actually quite a few projects to make standalone desktop runtimes for WASM.
Plus, this doesn't really affect the average user. I don't know what your situation is, but the average well-behaved Electron app uses, like, 300MB RAM, maybe less. Most people have 8 or 16GB RAM, it's plenty of space for whatever various applications they use daily. I mean I've got a ton of shit open and I'm doing fine, I hit the page file every now and then but it's nothing too dramatic.
Well, whatever. I don't think there's any point arguing because we obviously have completely opposing viewpoints.
If you think 300MB (best-case scenario) for something Windows 95 could easily do is acceptable then I couldn't disagree more.
I will, however, heavily dispute 16GB of RAM being a "general user" configuration. Most users I've seen only have 8.
I also disagree with the idea that WebAssembly is going to make things better. It's just the foundations for a horrifying tower of jank sitting on top of an already existing one. Why compile to bytecode on top of a massive overhead when you could just compile to the base OS instead? As much as I dislike Java, at least its virtual machine is far less removed from the base hardware.
I understand the need to capture the web market but I don't see why everything needs to be constrained by it.
 
Well, whatever. I don't think there's any point arguing because we obviously have completely opposing viewpoints.
If you think 300MB (best-case scenario) for something Windows 95 could easily do is acceptable then I couldn't disagree more.
I will, however, heavily dispute 16GB of RAM being a "general user" configuration. Most users I've seen only have 8.
I also disagree with the idea that WebAssembly is going to make things better. It's just the foundations for a horrifying tower of jank sitting on top of an already existing one. Why compile to bytecode on top of a massive overhead when you could just compile to the base OS instead? As much as I dislike Java, at least its virtual machine is far less removed from the base hardware.
I understand the need to capture the web market but I don't see why everything needs to be constrained by it.

Hey I'm not saying any of this is a good thing, just that it works. All market and industry requirements are being fulfilled, and at the end of the day, that's all that really matters.
Software development isn't supposed to be fun or fancy or an artform; it's absolutely abysmal and soul-crushing. It's an industry built on decades-old technology, with several dozen engineering teams desperately trying to create or revise standards in order to patch it together and optimize it through a complex system of duct tape and chewing gum. And they're not even trying to make any of it good, just trying to keep the lights on so the industry doesn't collapse in on itself. The sooner you appreciate the cynicism of IT the easier it gets. Quite frankly I'm just happy any of this shit is even working.
 
I think we can all agree that Electron is the easiest path to having a single codebase with the absolute minimum amount of platform specific code to deliver an absolutely minimally usable and minimally performant minimum viable product for webshit and desktop.

I'm sure the Qt web stuff is probably a better option if you want to make something that also actually functions well on desktop, but I haven't talked to anyone who's used it (as opposed to Qt on desktop or mobile).

The mailing list thread overview is here (along with discussions of an Instagram account for KDE).
Some posters are claiming that Trolltech is now lying about the threats they may have made earlier.
 
Why compile to bytecode on top of a massive overhead when you could just compile to the base OS instead?
Because it's easier to ship and distribute bytecode.

Native binaries are fatter, you need a separate binary for each platform, and on Linux they generally require another package to be maintained. The binary also has to anticipate all the possible CPU extensions it might encounter that it wants to take advantage of (such as SIMD). With bytecode, you have one program to distribute that is compiled for the specific hardware by a JIT, and it stays future-proof against hardware extensions as the JIT is upgraded. You also get the supposed pay-off of things like HotSpot, where bytecode is compiled to native code according to how you run the program on your particular hardware.
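The "one program, JIT-compiled for the local machine" model is easy to see with WebAssembly: the exact same byte sequence runs on any CPU, and the host runtime compiles it natively at load time. A minimal sketch in Node.js (the bytes below are a hand-assembled module exporting an `add` function; the names are mine):

```javascript
// Hand-assembled WebAssembly module exporting add(a, b) -> a + b.
// These exact bytes run unchanged on x86, ARM, etc.; the host runtime
// validates them and JIT-compiles them for whatever CPU it finds.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

const mod = new WebAssembly.Module(bytes);      // validate + compile for this CPU
const instance = new WebAssembly.Instance(mod); // instantiate (sandboxed by default)
console.log(instance.exports.add(2, 3));        // 5
```

Note the sandboxing for free: the module can only touch what you explicitly pass in through its imports, which is the VM-isolation point made below.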

Finally, a VM can be far better sandboxed than a Windows or Linux native application, which gives the user way more peace of mind, and means that they're more likely to run your application.

There's nothing wrong with bytecode. There's plenty wrong with Electron and Javascript, overhead being a definite issue for me.
 
Regarding bytecode, although it deviates a bit from the subject, the technologies regarding it today are pretty amazing:
- HotSpot JVM: the JIT compiler is GOOD
- GraalVM: polyglot native compiler for all JVM languages, plus Python, Ruby, and JS (I think)
Even Emacs is working on a native compiler for its bytecode representation.
 
Regarding bytecode, although it deviates a bit from the subject, the technologies regarding it today are pretty amazing:
- HotSpot JVM: the JIT compiler is GOOD
- GraalVM: polyglot native compiler for all JVM languages, plus Python, Ruby, and JS (I think)
Even Emacs is working on a native compiler for its bytecode representation.
Don't forget the strides .NET has made either. It's now cross platform and open source.
 
Don't forget the strides .NET has made either. It's now cross platform and open source.
.NET is a joke for cross-platform development.
Almost all the libraries Microsoft pushes that you need to do anything will work only on Windows.

It's much easier to get a native Win32 application running on Linux through Wine than a .NET application through Mono.
Hell, I can't even run a .NET application that uses the latest framework on Windows XP.
 
I'm pretty sure that like 95% of the stuff that doesn't work on other platforms is either UI stuff, which they're working on and which is sort of hard to do cross-platform, or stuff that doesn't have an equivalent on *nix, like the registry.
 
Is a powerful abstraction that bad? Rendering UI elements with a markup language isn't much different from a game engine using a scripting language.

ACTUALLY most game engines have dropped scripting languages because it's a shit idea in practice. If you want lots of nightmarish bugs then by all means shove fucking Lua into your engine.
 
ACTUALLY most game engines have dropped scripting languages because it's a shit idea in practice. If you want lots of nightmarish bugs then by all means shove fucking Lua into your engine.

Examples? I don't know of any modern commercial engines that don't have a scripting language. Again, you need graphics designers to be able to write shitty little scripts to make NPCs sparkle. It's a business requirement that modern consumer hardware is more than happy to absorb the cost of.
 
Examples? I don't know of any modern commercial engines that don't have a scripting language. Again, you need graphics designers to be able to write shitty little scripts to make NPCs sparkle. It's a business requirement that modern consumer hardware is more than happy to absorb the cost of.


I know Unreal Blueprints may be called a scripting language, but I wouldn't really consider it to be the same as the living hell that was off-the-shelf scripting "languages". I think Unity has followed a similar model of node graphs.

It used to be common to tack on something like Lua using an interpreter and yippie! all your game logic can exist as script files. Now you can iterate really fast and don't need to recompile for every little change, and the higher-paid programmers won't be doing as much bitch work (which really means you can just hire fewer of them). The problems were that the interpreter is going to have some bugs and be slow, the latter being a real problem on the trash 360/PS3 CPUs. And those nice .lua files that make things so easy are going to be an absolute nightmare, because some level designer with no programming experience is going to fuck up often. It could be anything from simple syntax or spelling mistakes to more mindblowing stuff like "I thought I could type gravity=off and it'd work".

What Unreal is doing is so on-rails that it is nearly impossible for some fresh graduate to fuck up, and if they do fuck up you'll actually see error messages in the console - the latter usually didn't happen with Lua just hooking into the engine. And it's not using some snowflake markup but C++ too.

You might consider all of that to be just semantics, but imho it's very different from the early 2000s trend of "just use lua bro!".
 
Ah. Yeah actually I do consider Blueprints a scripting language. I think there's a fallacy people make where they equate "scripting language" with "a weakly-typed interpreted language that's some horrifying amalgamation of the worst parts of Lisp and Algol." I basically just mean any sort of runtime extension system, whether it's JIT-compiled or even some Scratch-like system.
But yeah, Lua is shit. I don't think I outright hate it, but god I could rant about it for entire paragraphs.
 
Ah. Yeah actually I do consider Blueprints a scripting language. I think there's a fallacy people make where they equate "scripting language" with "a weakly-typed interpreted language that's some horrifying amalgamation of the worst parts of Lisp and Algol." I basically just mean any sort of runtime extension system, whether it's JIT-compiled or even some Scratch-like system.
But yeah, Lua is shit. I don't think I outright hate it, but god I could rant about it for entire paragraphs.
I'd be up to hear some ranting for entire paragraphs. The only Lua I've touched is for "Tales of Maj'Eyal", and I liked it, at least more than Python and Javascript.
 
I'd be up to hear some ranting for entire paragraphs. The only Lua I've touched is for "Tales of Maj'Eyal", and I liked it, at least more than Python and Javascript.

I get flak for it but I actually like JS. It's quirky, but you can learn the quirks pretty quick and then it's actually quite usable. I usually prefer embedding JS over Lua for scripting.

With those languages, and Lua specifically, the weak typing is really annoying. I know it's kind of a given to hate weak typing since it results in so many bugs and less readable code, but it is a big issue for me. (Another reason I like JS is that TypeScript exists to solve that one too. Not perfect, but it's nice.)
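For what it's worth, the class of bug being complained about is easy to demonstrate in JS itself, and it's exactly what TypeScript flags at compile time:

```javascript
// Classic coercion traps: the same operator silently changes meaning
// depending on operand types, so a value that arrives as a string
// (say, from a text field) corrupts arithmetic without any error.
const fromInput = "1";        // pretend this came from user input
console.log(fromInput + 1);   // "11"  -- + with a string means concatenation
console.log(fromInput - 1);   // 0     -- - has no string meaning, so it coerces
console.log(1 == "1");        // true  -- loose equality coerces
console.log(1 === "1");       // false -- strict equality does not
```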

Then there's the "object-oriented" nature of it. This one gets me really autistic for some reason. They're trying to be all cute about their OOP system, when in reality they just half-implemented some half-assed classical OOP system. They tout it as being "flexible", but literally all you can do with it is classic OOP stuff like in Java, except it's way harder and more annoying because the language does nothing to help you, all for the sake of being "flexible". It's like how people say Boogie is too much of a fence-sitter. Worst OOP system I have ever used.
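For readers who haven't touched Lua: it ships no class construct, just tables plus a metatable hook (`__index`) that forwards failed lookups to another table, so every codebase hand-rolls its own class system on top. The underlying mechanism is essentially the prototype delegation JS exposes directly; a rough sketch of the equivalent (names are mine):

```javascript
// Prototype delegation, the same mechanism Lua's __index metafield provides:
// a failed property lookup on the object falls through to its prototype.
const Animal = {
  speak() { return `${this.name} says ${this.sound}`; },
};

// "Instances" are plain objects that delegate to Animal, much like a Lua
// table whose metatable has __index = Animal.
const dog = Object.create(Animal);
dog.name = "Rex";
dog.sound = "woof";

console.log(dog.speak());                           // "Rex says woof"
console.log(Object.getPrototypeOf(dog) === Animal); // true
```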

And then there's the syntax, which I generally don't like, but that one's mostly preference. It's just kind of verbose and feels less like it's trying to help me write code and more like it's trying to tutor a baby on how to write a script. Which I guess is the point. But I still don't have fun using it.

I don't hate it, it just makes me wish I was using something better. I can't stop seeing the glaring flaws and missed opportunities when I use it. It's such a misguided and poorly thought-out language. People give JS shit for being that, but JS thought of more in the, what, two weeks it took to design and implement than the Lua guys did in whatever timespan they had. And they didn't even have a deadline. It just really feels like someone's first attempt at making an experimental language instead of an actual effort to make something easy to use.
 
With those languages, and Lua specifically, the weak typing is really annoying. I know it's kind of a given to hate weak typing since it results in so many bugs and less readable code, but it is a big issue for me.
But Lua is strongly typed. Do you mean dynamic typing?
 
But Lua is strongly typed. Do you mean dynamic typing?

Yeah.

That reminds me of something I left out: how you have to prefix everything with "local". It really should have gone the other way and had a "global" keyword, with local being implicit. That one is just straight-up stupid.
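JS made the same default-global mistake, then bolted on the opposite behavior: in strict mode, assigning to an undeclared name is an error instead of a silent global. A quick demonstration (the function name is mine):

```javascript
// Sloppy-mode JS shares Lua's foot-gun: assigning to an undeclared name
// silently creates a global. Strict mode flips the default, which is the
// behavior wanted here: locals implicit, globals an explicit opt-in.
function assignUndeclared() {
  "use strict";
  oopsTypo = 1; // ReferenceError in strict mode; a silent global otherwise
}

let threw = false;
try {
  assignUndeclared();
} catch (e) {
  threw = e instanceof ReferenceError;
}
console.log(threw); // true
```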
 
Then there's the "object-oriented" nature of it. This one gets me really autistic for some reason. They're trying to be all cute about their OOP system, when in reality they just half-implemented some half-assed classical OOP system. They tout it as being "flexible", but literally all you can do with it is classic OOP stuff like in Java, except it's way harder and more annoying because the language does nothing to help you, all for the sake of being "flexible". It's like how people say Boogie is too much of a fence-sitter. Worst OOP system I have ever used.
Classic OOP to me means Smalltalk, of which Java is a parody. I'd be curious how close Lua cleaves to the original OOP tradition, which was very Lispy, very dynamic, and crazy flexible. The core of Smalltalk is also based on a simple calculus which almost makes you think of lambda calculus. Even conditionals are built out of the message dispatching system.
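The "even conditionals are messages" point is concrete enough to sketch: Smalltalk's True and False are ordinary objects, and `ifTrue:ifFalse:` is a message each one answers differently, with the branches passed as blocks. A rough JS imitation (the names `SmalltalkTrue`, `ifTrueElse`, etc. are mine, not Smalltalk's):

```javascript
// Smalltalk-style booleans: each boolean object answers the ifTrueElse
// "message" by running the appropriate closure, so branch selection is
// pure method dispatch rather than a built-in `if` at the call site.
const SmalltalkTrue = {
  ifTrueElse(thenBlock, elseBlock) { return thenBlock(); },
};
const SmalltalkFalse = {
  ifTrueElse(thenBlock, elseBlock) { return elseBlock(); },
};

// A comparison "message" that answers one of the boolean objects.
const lessThan = (a, b) => (a < b ? SmalltalkTrue : SmalltalkFalse);

// Dispatch picks the branch; no if statement in sight at the call site.
const result = lessThan(2, 3).ifTrueElse(
  () => "smaller",
  () => "not smaller",
);
console.log(result); // "smaller"
```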

I never much liked Python or Javascript, but on the face of it, Lua looked to be getting closest to that simple core calculus. I've heard similar things for Ruby.

On the dynamic craziness, where bugs are inevitably rampant and only going to be found at runtime, it means your debugger better be awesome. The debuggers of Smalltalk and Common Lisp still make everything I've used today look like toys.
 
Classic OOP to me means Smalltalk, of which Java is a parody. I'd be curious how close Lua cleaves to the original OOP tradition, which was very Lispy, very dynamic, and crazy flexible. The core of Smalltalk is also based on a simple calculus which almost makes you think of lambda calculus. Even conditionals are built out of the message dispatching system.

I never much liked Python or Javascript, but on the face of it, Lua looked to be getting closest to that simple core calculus. I've heard similar things for Ruby.

On the dynamic craziness, where bugs are inevitably rampant and only going to be found at runtime, it means your debugger better be awesome. The debuggers of Smalltalk and Common Lisp still make everything I've used today look like toys.

To me, classic OOP is Erlang, since that was the actual definition of OOP in the beginning. But now OOP means Java. Language changes and evolves; I just adapt.
 