Also, what are our thoughts on the qubit, and on remodeling computers around quantum rules instead of the Boolean bit of "1" and "0"?
I suspect that quantum computing will require an enormous amount of energy, so it'll probably be a long time before it's practical.
The biggest thing on my mind about quantum computing is that we're going to need new encryption algorithms. That'll suck. See, even if quantum computing isn't affordable for your average person, all of our encryption algorithms will need to be designed as if it is, because it only takes one well-funded adversary with a quantum computer to break everyone's keys. Shor's algorithm breaks RSA and elliptic-curve crypto outright, and Grover's effectively halves symmetric key lengths. So we're going to need encryption strong enough to defeat quantum attacks, but we're not going to get many of the benefits of quantum computing ourselves. I don't know, maybe it'll make web services cheaper?
Pointers (especially two and three stars out) can be rather unintuitive, and even once you're experienced with them there are cases where you need to be careful and think about what you're doing. Once you understand pointers, you begin to wonder whether anything can be done to make them more intuitive while preserving the same functionality.
Yeah, nah I don't think pointers can really be improved on and still be pointers.
Java does away with pointers, instead using references, but you sacrifice things like pointer arithmetic.
Eh, I don't know if that's a huge loss. Like, the most common use of pointer arithmetic I can think of is multidimensional arrays. And you can do that in managed languages by just using an integer offset into an ordinary flat array, just like you'd do with a pointer (sketch below).
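Something like this (it's C, but the indexing line is the whole point, and it carries over verbatim to an ordinary int[] in Java or C#; the 3x4 grid sizes are just made up for the demo):

#include <stdio.h>

#define ROWS 3
#define COLS 4

int main(void) {
    /* One flat array standing in for a 3x4 grid.  The same trick
     * works with a plain int[] in a managed language -- no pointer
     * arithmetic, just an integer offset. */
    int grid[ROWS * COLS] = {0};
    int r = 1, c = 2;

    grid[r * COLS + c] = 42;             /* row-major: row * width + col */
    printf("%d\n", grid[r * COLS + c]);  /* prints 42 */
    return 0;
}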
Or if you want to mess with individual bytes, pointer arithmetic might be useful there. But you can get the same effect with other, perhaps more verbose techniques (see below). And when I mess with individual bytes, that's a tiny fraction of my program. If the safe version is slightly more complicated but buys me type safety everywhere else, I consider it a worthy tradeoff.
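Here's roughly what I mean: the pointer-cast version next to the shift-and-mask version (the hex constants are arbitrary demo values):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t word = 0xDEADBEEF;

    /* Pointer version: reinterpret the int as bytes.  What you get
     * back depends on the machine's endianness. */
    uint8_t *p = (uint8_t *)&word;
    printf("%02X\n", p[0]);                      /* EF on little-endian */

    /* Verbose-but-safe version: shifts and masks.  No casts, no
     * endianness surprises, and it works unchanged in languages
     * that don't have pointers at all. */
    uint8_t b0 = (uint8_t)(word & 0xFF);         /* lowest byte: EF */
    uint8_t b1 = (uint8_t)((word >> 8) & 0xFF);  /* next byte:   BE */
    printf("%02X %02X\n", b0, b1);
    return 0;
}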
I think the syntax can be improved upon. The C family has a variety of ways you can define a pointer. You can do:
"int *n": which is a way of saying "define a pointer named N, and make it of int type".
Or you can do:
"int* n": which is a way of saying "define an int pointer named N".
These mean the same thing and have the same practical effect. I prefer the second style, which works in most but not all cases (function pointers still need the asterisk on the identifier, to my knowledge; see the snippet below). The reason I prefer it is that it makes clear you are defining an int pointer, not an int. However, almost all the C style guides I've seen use the first notation, and I'm sure there's some reason for it that someone can explain to me.
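For reference, here's the function pointer case I mean (twice is just a throwaway demo function):

#include <stdio.h>

int twice(int x) { return 2 * x; }

int main(void) {
    /* "int* n" style works fine for ordinary pointers... */
    int  v = 5;
    int* n = &v;

    /* ...but a function pointer forces the * onto the identifier,
     * wrapped in parens so it doesn't parse as a function that
     * returns int*. */
    int (*fp)(int) = twice;

    printf("%d %d\n", *n, fp(5));   /* 5 10 */
    return 0;
}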
This has driven me nuts forever. I have never been convinced by explanations for the former.
With variables, I want to be able to ask myself "what type is n?". And looking at the second example, the type of n is readily apparent. It's an "int*".
I think the argument for the alternative is that they're asking "what type is *n?". But personally I don't buy that argument because there are two syntaxes to access pointer data in C (* and []), and I'll freely switch between the two, depending on the situation.
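For completeness, the one technical justification I do know of is that the compiler really does bind the * to the identifier, not the type, which bites you with comma declarations:

#include <stdio.h>

int main(void) {
    /* Declare two "int pointers" on one line and only the first
     * is actually a pointer. */
    int* a, b;   /* a is int*, but b is a plain int */
    int x = 7;

    a = &x;
    /* a and b really are different types: */
    /* b = &x;   <- won't compile, b is an int, not a pointer */
    b = 3;
    printf("%d %d\n", *a, b);   /* 7 3 */
    return 0;
}

I still write "int* n" anyway and just declare one variable per line, so the gotcha never comes up.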
I've had a few jobs where I was brought in to update code that clients outsourced to India or Russia and that no one else could make sense of. In one case they had an entire CMS coded by some Russian where every single variable was $a1, $a2, $b1, and shit like that. They would have saved a ton of money by hiring a competent programmer in the first place; instead they saved a couple of bucks on initial development and then had to bring in the firm I was working for to figure out what the hell was going on.
My company is learning that lesson. We hired Indian programmers to do our mobile app and they're braindead and incompetent. At best they can string together some UI elements, an HTTP library, and a JSON parser; anything more complex is beyond them. For example, we need to cache data aggressively because we're dealing with shitty third-world internet connections, and explaining the caching setup I'm providing is extremely difficult. The language barrier doesn't help.
I think they tried to give me documentation once in MS Word.
Like, my manager (went to school for economics or some shit) is getting frustrated to the point where he's learning to program so he and I can replace the Indians when our contract is up.
So, I was thinking about this...
Is programming really a "deep thoughts" kind of thread?
Like, the reason I'm asking is that programming is a very practical topic. If you get too theoretical, you just start rambling about navel-gazing programming philosophy that no one can relate to. But if you get too practical, then you're basically writing a vocational course. (And that's boring in its own way.)
So, I was thinking, would anyone be interested in maybe writing bots for this:
http://theaigames.com/competitions/warlight-ai-challenge/rules
Nothing formal, necessarily. I don't know if they're still running the contest, either. But when I found it, I was intrigued by the idea. I don't know, maybe we could discuss programming techniques while having practical demonstrations of them in the form of Warlight bots.
Right now I'm just fucking around implementing the protocol for my own bot, purely for experimentation; there's a rough skeleton below.
Someone wrote a very simple engine/viewer that lets you watch two bots play.
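If anyone wants to poke at it, the whole protocol is just newline-delimited commands over stdin/stdout. Here's a rough C skeleton; I'm writing the command names from memory, so double-check them against the rules page before trusting any of this:

#include <stdio.h>
#include <string.h>

/* Minimal Warlight bot skeleton.  The engine talks to the bot over
 * stdin/stdout, one command per line.  Command names below are from
 * memory of the theaigames protocol and may not be exact. */
int main(void) {
    char line[4096];

    while (fgets(line, sizeof line, stdin)) {
        if (strncmp(line, "settings", 8) == 0) {
            /* store things like our bot name, timebank, etc. */
        } else if (strncmp(line, "setup_map", 9) == 0) {
            /* parse super regions, regions, and neighbor lists */
        } else if (strncmp(line, "update_map", 10) == 0) {
            /* refresh owner/army counts for visible regions */
        } else if (strncmp(line, "pick_starting_regions", 21) == 0) {
            printf("1 2 3 4 5 6\n");   /* dummy picks */
        } else if (strncmp(line, "go place_armies", 15) == 0) {
            printf("No moves\n");      /* placeholder: place nothing */
        } else if (strncmp(line, "go attack/transfer", 18) == 0) {
            printf("No moves\n");      /* placeholder: do nothing */
        }
        fflush(stdout);                /* the engine waits on our output */
    }
    return 0;
}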
(By the way, Warlight the game is actually pretty cool. I was never into board games as a kid, and Risk confused the hell out of me. But after playing Warlight, I realize there was a genuinely interesting element to Risk that I can appreciate now as an adult.)