Programming thread

The problem is that there are a metric Brazilian different ways to do this. I do this with SSH. I turn my TV on, and a little ARM computer comes on with it. I can SSH into that box and play a movie, play music, whatever.

But depending on what you want, you may want something like VNC. Securing that becomes its own concern. You could bind it to a VPN and then the VPN handles the security. Or whatever. There's still a lot about what you intend to do that is not well defined yet.

I found TeamViewer is free for personal use and it's encrypted, so I'm gonna try and use that.

Turns out Windows 10 charges out the ass for the built-in Remote Desktop.

No matter, I'm gonna switch to Linux on my desktop too. But I just don't feel like doing that at the moment 'cause I have a lot of shit that I'll need to back up.

But I'll also look into how to work with SSH. I feel like the biggest drawback of being self-taught is that my knowledge isn't as well rounded as it ought to be.
 
Can't you just fix that by ensuring the name strings are stored as literal values?

Like, it's really that simple, isn't it?
Very often... way too often... it's simply slapped into a string that is then used directly to interact with a database. From there, it's entirely dependent on how the database is set up to determine how it interprets a "Null" string. There are standard ways to manage this, but a frightening number of places still don't do the basics, which is how you still have SQL injection attacks today despite well-documented libraries in every language to avoid this problem.

To answer your question: yes, in probably every common database management system nowadays you can enforce NOT NULL, among other options. It's largely incompetence no matter how you paint it.
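
To make that concrete, here's a minimal sketch of the difference (assuming a node-postgres-style query(text, values) API; the table and column names are made up):

JavaScript:
// BAD: user input is pasted straight into the SQL text. How "Null" (or
// "'; DROP TABLE users; --") gets interpreted is now up to whatever
// touches the string next.
const bad = await pool.query(`SELECT * FROM users WHERE last_name = '${lastName}'`);

// BETTER: a parameterized query. The driver sends the value separately
// from the SQL, so "Null" arrives as the four-character string "Null"
// and can never be read as SQL.
const good = await pool.query('SELECT * FROM users WHERE last_name = $1', [lastName]);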
 
Serialization. These systems send data across the network. The solution actually has a bit more code stink to it since you need to write a corner case for exactly "Null" to make sure nothing goes wrong.
Yes, but it's something even a barely competent programmer should think about.
With data, you can have
  • a normal value
  • an empty value that's legal
  • the absence of a value (different from the empty value)
  • the lack of a value (incomplete data, like a bystander who gives his first name but presumably has a surname)
  • "value not provided" ("keep it as it is" in partial updates)
 
What kind of extreme pajeetery is this? Why are "Null" the string and NULL the empty value the same thing?
They're rarely ever the same thing outright, but way too many languages have "helpful" operators and functions that will happily convert between data types in case the user is too stupid to know the difference between the number 123 and a string that says "123". Because saving some complete beginner all of five seconds, at the cost of introducing hard-to-find hell bugs for everyone else, is totally reasonable and not at all retarded.
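
A few of those "helpful" conversions, verifiable in any browser console:

JavaScript:
123 == "123"       // true  - loose equality coerces the string to a number
123 === "123"      // false - strict equality doesn't convert
"" + null          // "null" - concatenation silently stringifies null
"null" == null     // false - but the string "null" is still not the value null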
 
I'm not sure how you would do this in other languages, but like, in JavaScript you can use template literals to store just about anything as a string.

I can go and check, but if you did something like

JavaScript:
let lastName = `Null`

Would that value equal nothing, or would it literally be the string "Null"?

Surely most languages have something similar?

Surely, I'm talking out my ass too.

Enlighten me.
 
I'm not sure how you would do this in other languages, but like, in JavaScript you can use template literals to store just about anything as a string.

I can go and check, but if you did something like

JavaScript:
let lastName = `Null`

Would that value equal nothing, or would it literally be the string "Null"?

Surely most languages have something similar?

Surely, I'm talking out my ass too.

Enlighten me.
JavaScript just converts null to "null", which is not equivalent to null as in the data type.
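
Which is easy to check in the console:

JavaScript:
let lastName = `Null`; // a template literal; identical to the string "Null"
lastName === "Null";   // true
lastName === null;     // false - a string is never the null value
String(null);          // "null" - converting null yields a string, not null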
 
I'm not sure how you would do this in other languages, but like, in JavaScript you can use template literals to store just about anything as a string.

I can go and check, but if you did something like

JavaScript:
let lastName = `Null`
I don't think the backticks change anything. `null` === 'null' evaluates to true in the console and in the Node REPL. What the others are talking about, broadly speaking, is called weak typing.
JavaScript is a retarded language and the fact it powers the web is horrifying.
Besides the aforementioned, let's not forget that JavaScript handles so much e-commerce, and its only native number type is floating point.
 
JSON is a plain text format. JSON.stringify encodes an object in the JSON format, which is by definition a string. You're basically complaining that (123).toString() returns "123" instead of 123.
There's a tradeoff between representing data accurately and interoperability with different languages and systems and security as well. Pickled Python objects have (AFAIK) total fidelity to the original objects when deserialized but other languages won't have exactly the same concepts and they allow arbitrary code execution as well. Having said that JavaScript is still really niggerlicious.
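
And JSON itself does preserve the distinction; the string and the null come back as different things:

JavaScript:
JSON.stringify({ lastName: null });   // '{"lastName":null}'   - JSON null, unquoted
JSON.stringify({ lastName: "null" }); // '{"lastName":"null"}' - a quoted string
JSON.parse('{"lastName":null}');      // { lastName: null }    - round-trips intact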
 
Besides the aforementioned, let's not forget that JavaScript handles so much e-commerce, and its only native number type is floating point.

The funny thing is that javascript can handle the math. But you gotta write out unfathomably niggerlicious stuff that you don't have to do with other languages.

JavaScript:
let result = parseFloat((x + y).toFixed(2)); // round to 2 decimals, then convert the string back to a number
 
The funny thing is that javascript can handle the math. But you gotta write out unfathomably niggerlicious stuff that you don't have to do with other languages.

JavaScript:
let result = parseFloat((x + y).toFixed(2)); // round to 2 decimals, then convert the string back to a number
That isn't much different from the directives used in C's printf() function, among others inspired by it. .toFixed(2) does display JS floating-point numbers with what appear to be appropriate decimal currency figures, but there's no guarantee that math like applying discounts works properly under the hood.
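
For example (real console results, not hypothetical):

JavaScript:
0.1 + 0.2               // 0.30000000000000004
(0.1 + 0.2).toFixed(2)  // "0.30" - the display is fine, the stored value isn't
(1.005).toFixed(2)      // "1.00", not "1.01" - 1.005 is stored as 1.004999...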
Posters in the thread suggest keeping prices and other currency figures in cents as an integer and then converting at the last minute. That should probably work even though JS doesn't really have a dedicated integer type. (My understanding is that floating-point math with integer values can be expected to have higher fidelity.) There are also specialized libraries for JavaScript, like currency.js, that can help.
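
A minimal sketch of the cents approach (invented prices; currency.js itself is not used here):

JavaScript:
// Keep money as integer cents; only format as dollars for display.
const priceCents = 1999;                          // $19.99
const discounted = Math.round(priceCents * 0.9);  // 10% off -> 1799, rounded to a whole cent
const display = (discounted / 100).toFixed(2);    // "17.99"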
 
My understanding is that floating-point math with integer values can be expected to have higher fidelity.
It's perfectly accurate up to 2^53. Double-precision floating-point numbers are for the most part just a 53-bit integer (52 stored bits plus an implicit leading one) times 2^e for some limited e, plus a sign, so at e = 0 you just get a 53-bit integer with weird overflow behavior. There's a common misconception that floating-point math randomly corrupts your results, but it's actually rather predictable. Not having integers is still a retarded decision, though.
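
The 2^53 boundary is easy to poke at directly:

JavaScript:
Number.MAX_SAFE_INTEGER       // 9007199254740991, i.e. 2^53 - 1
2 ** 53 === 2 ** 53 + 1       // true - past 2^53, odd integers can't be represented
Number.isSafeInteger(2 ** 53) // false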
 
JSON is a plain text format. JSON.stringify encodes an object in the JSON format, which is by definition a string. You're basically complaining that (123).toString() returns "123" instead of 123.
Wasn't that the point of the post I was replying to? That when you convert null to a string you get "null" and not an empty value? I'm also not complaining that you can't compare a string to null and somehow get true. That would be really dumb, as evidenced by all the previous posts about issues with database software where typing in null would cause problems.
 
It's perfectly accurate up to 2^53. Double-precision floating-point numbers are for the most part just a 53-bit integer (52 stored bits plus an implicit leading one) times 2^e for some limited e, plus a sign, so at e = 0 you just get a 53-bit integer with weird overflow behavior. There's a common misconception that floating-point math randomly corrupts your results, but it's actually rather predictable. Not having integers is still a retarded decision, though.
I agree it isn't random corruption. It is predictable, but it's also corruption (in Python this time):
Code:
In [15]: 0.1 + 0.2
Out[15]: 0.30000000000000004
 
I agree it isn't random corruption. It is predictable, but it's also corruption (in Python this time):
Code:
In [15]: 0.1 + 0.2
Out[15]: 0.30000000000000004
That's a floating point calculation and not an int. Monetary values should use a monetary datatype. Floats should always be compared with an epsilon, etc.
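
For the epsilon point, a common sketch in JavaScript (the tolerance is a judgment call, not a universal constant):

JavaScript:
// Compare floats within a tolerance instead of with ===.
function nearlyEqual(a, b, eps = 1e-9) {
  return Math.abs(a - b) <= eps * Math.max(1, Math.abs(a), Math.abs(b));
}

0.1 + 0.2 === 0.3           // false
nearlyEqual(0.1 + 0.2, 0.3) // true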

I can bitch and moan about language design, but numerical operations being garbage has been a problem since C. It's perhaps the original sin of programming.

Take infix '+' for example. 2 + 2.0. Should it cast to int or float? Should it fail? If it hits a representational limit, should it saturate or overflow? Even simple addition isn't simple unless you're coding for a virtual machine with well-defined behaviours - one of the few things functional languages get right. (And one of many, many things JS gets wrong.)

Numbers being doubles isn't really an issue, though; it's just poor design that causes these issues and poor documentation that causes these myths to pop up around them.
 