I first got the idea of using the structure of cons cells to denote meaning from using the Guile JSON library. You can feed it JSON-formatted text and it turns the JSON into one big list you can work with in Lisp. Later I was watching a lecture, which now I really can't remember and don't want to accidentally misquote, that said that before SGML, Lisp was commonly used for structuring data the way we would use XML today.
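Here's a rough sketch of what I mean, assuming the guile-json library is installed (json-string->scm is its parser, and in recent versions JSON objects come back as association lists and arrays as vectors):
Code:
(use-modules (json))

;; Parse JSON text into plain Scheme data.
(define data
  (json-string->scm "{\"name\": \"guile\", \"tags\": [\"lisp\", \"scheme\"]}"))

;; JSON objects become alists, so ordinary list and assoc functions work on them.
(assoc-ref data "name")   ; => "guile"
(assoc-ref data "tags")   ; => #("lisp" "scheme")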
I'm not sure how common it was. UNIX was around long before XML, but there are (very sadly) no s-expressions to be found in the UNIX core. It's just lots of horrible ad-hoc text formats. You see the same in early and now ubiquitous internet protocols, like email, which long predate XML.
Google Erik Naggum and XML to hear an old-school lisper rant aggressively against it. Naggum was heavily involved in the SGML project, so he knew what he was talking about.
SGML was the markup language that inspired HTML and XML. Its purpose was marking up human-written documents: long bodies of text with annotations here and there for basic structure. The W3C decided to repurpose this markup language into a data format for arbitrary structured data. This is a really tortured repurposing that explains why XML is so god awful.
S-expressions, by contrast, were designed way back in 1957 as a simultaneously human- and machine-readable format for arbitrary structured data. But they were criminally ignored in the 90s in favour of XML.
Nowadays, JSON has displaced XML to a great extent, and it seems the programming world has recovered a bit from that earlier insanity. JSON isn't god awful, but I still find s-expressions superior. It's still really shameful that they've been largely forgotten. The story of s-expressions, XML and JSON is the story of why the computing world is badly amnesic and so is destined to reinvent shittier and shittier wheels.
So every 'a' in one of these functions is the car of one level deeper and every 'd' is the cdr of one level deeper. This realization really brings it full circle: Lisp expects you to use many nested conses.
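For example, here's a quick sketch (with a made-up pair, just for illustration) of how the letters expand:
Code:
;; Each 'a' is a car and each 'd' is a cdr, applied one level deeper per letter.
(define pair '((1 2) 3 4))

(cadr pair)        ; => 3
(car (cdr pair))   ; => 3, the same thing spelled out
(caar pair)        ; => 1
(car (car pair))   ; => 1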
Yep. And now consider that your own code is a cons! So even nesting code is nesting conses. But you can (and should!) use it for any data.
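To make that concrete, here's a small toy example: quote an expression and it's just nested pairs you can take apart with car and cdr.
Code:
(define expr '(+ 1 (* 2 3)))

(pair? expr)   ; => #t
(car expr)     ; => +
(caddr expr)   ; => (* 2 3)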
In Guile
(read)
will just take stdin as a file port and parse it into Lisp. It doesn't let you pass a string to it. I'm guessing that's because it's not implemented to take a string and instead uses the binary representation of a file. Last time I looked at file ports I think they needed you to specify an encoding when turning them into strings. For example:
Code:
scheme@(guile-user)> (define thing (read))
this
$5 = this
scheme@(guile-user)> (symbol? thing)
$6 = #t
scheme@(guile-user)> (string? thing)
$7 = #f
There is a read-string function, but I'd have to look up which module it's in.
That's a little sad. Best I could do was:
Code:
> (call-with-input-string "(1 2 3)" read)
$1 = (1 2 3)
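Another option that should work, if I'm remembering Guile's string ports right, is with-input-from-string, which temporarily rebinds the current input port so a plain (read) picks the string up:
Code:
;; The string becomes the current input port for the duration of the thunk.
(with-input-from-string "(1 2 3)" read)
;; => (1 2 3)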
I see what I did wrong: I was relying on string-match to be both greedy and not permissive, i.e. to take every occurrence and return #f not only if no matches were found but also if the wrong matches were found. I understand why this didn't work. With this in mind, I also found that it would be easier to work with if the data was labelled, so I rewrote the checking part with variables to make it more readable.
This still doesn't look right. "2-4 a: abababab" should be valid (it has between 2 and 4 occurrences of 'a'). But your solution says it is invalid.
I'd avoid regular expressions entirely for this. Just use string splitting and then traverse the string in the password check, or convert the string to a list of characters and use set/list functions for the check.
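A minimal sketch of what I mean, assuming lines in the "2-4 a: abababab" format from the puzzle:
Code:
(define (valid-password? line)
  ;; "2-4 a: abababab" -> ("2-4" "a:" "abababab")
  (let* ((parts  (string-split line #\space))
         (range  (string-split (car parts) #\-))
         (lo     (string->number (car range)))
         (hi     (string->number (cadr range)))
         (letter (string-ref (cadr parts) 0))   ; "a:" -> #\a
         ;; Count occurrences of the letter in the password itself.
         (count  (string-count (caddr parts) letter)))
    (and (>= count lo) (<= count hi))))

(valid-password? "2-4 a: abababab")   ; => #t
(valid-password? "1-3 b: cdefg")      ; => #f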
I liked your implementation too. I'm not good at Haskell. Though I think you read the passwords in from a file, which was a neat approach.
It's just so I can save the input page to a file from the browser, then run my code to get my answers.