alt.hn

1/8/2025 at 6:21:57 AM

AI in the 80s? How a Simple Animal Guessing Game Pioneered Machine Learning

https://medium.com/@alexey.medvecky/ai-in-the-80s-how-a-simple-animal-guessing-game-pioneered-machine-learning-before-it-was-cool-2f4a63dfe762

by rbanffy

1/12/2025 at 8:35:14 AM

In two places the article states that the original game had the ability to save the updated tree ("it had the ability to save progress and load it during the next run" and "It is an amazing example... of how, even with such a simple language, ... and the ability to save new questions").

A later part says the opposite - that the original implementation had "No ability to save progress" and that this is new in the C++ implementation.

I can't help but wonder (also because of other features of the language) whether the author ran the article through an AI to 'tidy up' before posting... because I've often found ChatGPT and the like to introduce changes in meaning like this, rather than just rewriting. This is not to dismiss either the article or the power of LLMs, just a curious observation :)

by maginx

1/12/2025 at 5:10:55 PM

True. For example, the Apple ][+ came with a demo disk full of programs for the then-new Applesoft BASIC language, and this was one of them. The questions were saved by POKEing them into arrays hard-coded into the program, allowing you to SAVE the modified program after running it.

It seemed like a neat trick at the time. There was also a crude CRUD database that worked the same way, retaining up to 50 names and phone numbers.

Actually that disk had a lot of really cool programs, now that I think about it. A biorhythm plotter, Woz's "Little Brick Out" Breakout clone, and a few other demos by luminaries like Bruce Tognazzini and Bob Bishop. And of course, ELIZA, the mother of all LLMs... only it was called FREUD for some reason.

by CamperBob2

1/12/2025 at 5:14:07 PM

I believe that the intention was to say "No ability to save progress _between sessions_" in the original program, whereas the C++ implementation saves to text files.

Another portion of the article says more explicitly:

  Limitations:
    Maximum number of nodes: 200.
    The structure is stored only in memory (no disk saving).

by no_news_is

1/12/2025 at 7:13:31 PM

I don't think so. Consider "It didn’t just attempt to guess the chosen animal but also learned from its mistakes, adding new questions and answers to its knowledge base. What’s more, it had the ability to save progress and load it during the next run.". Data persistence across trials is already implied by the first sentence, so what would the "What's more, ..." part refer to when it mentions "saving" and "loading"? Even if we grant that "saving" means updating the in-memory data structure, what would "loading" refer to? Also note the later "No ability to save progress", which directly contradicts "it had the ability to save progress". These sentences, both clearly referring to the original, are in direct contradiction with each other and use the exact same terms. Inspection of the code shows that it only updates the in-memory structure and does not save to disk.

by maginx

1/12/2025 at 12:23:48 PM

Further suggested by the imagery used being AI generated

by okamiueru

1/12/2025 at 9:46:31 PM

Yes, the whole tone of voice is typical of LLMs as well.

by maginx

1/8/2025 at 8:14:21 AM

let's not forget the 60s: https://en.wikipedia.org/wiki/Matchbox_Educable_Noughts_and_...

by 082349872349872

1/12/2025 at 4:17:20 AM

I read about this idea and actually built a variant in college for the Washington University ACM (our CS student club). We ran a demonstration at one of the engineering school events and ended up refilling the matchboxes multiple times that day - it was a hit!

If I recall, there is a way to significantly reduce the number of matchboxes needed by taking advantage of mirrored board conditions. Somewhere I have a Perl script that generated HTML for all the matchbox covers and possible next states.
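For anyone curious, the mirrored-board reduction is easy to sketch: treat two boards as the same matchbox when one is a rotation or reflection of the other, and key each matchbox by a canonical form. A rough Python sketch of the idea (mine, not tdeck's Perl script):

```python
# Reduce tic-tac-toe states by symmetry: a board is a 9-tuple read row by
# row, and its canonical form is the lexicographically smallest of its 8
# images under the symmetries of the square (dihedral group D4).

def rotate(b):
    """Rotate a 3x3 board 90 degrees clockwise."""
    return (b[6], b[3], b[0], b[7], b[4], b[1], b[8], b[5], b[2])

def mirror(b):
    """Flip a 3x3 board horizontally (reverse each row)."""
    return (b[2], b[1], b[0], b[5], b[4], b[3], b[8], b[7], b[6])

def symmetries(b):
    """All 8 symmetric images: 4 rotations, each plus its mirror."""
    out = []
    for _ in range(4):
        out.append(b)
        out.append(mirror(b))
        b = rotate(b)
    return out

def canonical(b):
    """One matchbox label per symmetry class of boards."""
    return min(symmetries(b))
```

Even on the very first move this pays off: the 9 possible openings collapse to 3 matchboxes (corner, edge, center).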

by tdeck

1/12/2025 at 2:04:26 AM

That's immensely more interesting than a program that plays 20 Questions. I'm amazed I've never heard of it!

by CamperBob2

1/12/2025 at 9:19:11 AM

A bit misleading title, as in the '80s decision trees were considered part of the discipline of AI, and you might be surprised today that machine learning was considered a different discipline. This was the result of scientific kerfuffles about funding.

In practice, many of us in the 'Nouvelle AI' movement that countered the ruling Symbolic AI paradigm (GOFAI was the common slur) had at least one foot in the ML, EA, and ALife spaces, since once you abandon symbolics, straight programming becomes awkward fast.

by PeterStuer

1/12/2025 at 8:27:12 PM

True. The program in the article is an expert system, not a decision tree learner. The confusion probably stems from the fact that decision tree learners are (still?) a very popular machine learning algorithm; and because most people who are into AI and ML today don't know anything about its past.

And what happens to those who don't remember the past? Ask George Santayana.

by YeGoblynQueenne

1/12/2025 at 9:13:33 AM

I remember the version of this called Pangolins on the ZX Spectrum. Because pangolins were an unusual case (scaly mammals).

Looking it up, apparently it was a type-in in the programming book that came with the computer.

https://spectrumcomputing.co.uk/entry/15157/ZX-Spectrum/Pang...

That's the book my (blind) dad got me to read to him which taught me to program!

by frabcus

1/12/2025 at 3:25:26 AM

This program was one of my first interactions with a PC. This was a wonderful shot of nostalgia.

by onionisafruit

1/12/2025 at 2:33:10 AM

Guilty. I worked with a version of this program.

by kazinator

1/12/2025 at 2:12:50 PM

Is this the same basic idea as those 20 Questions toys? Except, I imagine, that they pre-train the “model” and load it with the 2^20 leaf nodes?

by teeray

1/12/2025 at 10:00:48 AM

Nice memories. I had a version on the Amstrad called Animal Vegetable Mineral. It definitely kicked off a few neurons for me.

by richardw

1/12/2025 at 2:40:48 AM

Opening image: AI-generated picture of a slightly melting computer, with keys that can only type words like "kgonfutz" or "oøoøøo".

I can only assume the rest of the article is also AI-generated, with a similar attention to detail. Tab closed without reading.

by egypturnash

1/12/2025 at 4:03:47 AM

Well, dear Peggy, the program in question is, in fact, known to have existed in 1974 on Multics (https://multicians.org/terminals.html), so you're not half wrong.

The other major issue is that it isn't machine learning—it's a classic example of an expert system, even if there is a bit of a gray area around whether a binary decision tree qualifies as ML.

The worst part, of course, is that it takes less time to slap "GUESS THE ANIMAL" on a stock image of a glass terminal than it does to prompt a diffusion model to generate the same thing with passable fidelity... and it still wouldn't be an accurate depiction of what the program actually did.

by rhet0rica

1/12/2025 at 8:28:53 PM

It's not a grey area! A decision tree does what we call today "inference". A decision tree learner is the one that does the learning and that hasn't changed since the '80s.

Er. To my understanding of course.

by YeGoblynQueenne

1/12/2025 at 4:11:08 PM

I cried "LLM garbage" too. Catchy title, article full of bizarre errors and repetition.

It would make a good prank: an LLM writing about AI, badly.

But the Medium account behind it has only articles like this.

by FarmerPotato

1/12/2025 at 7:24:11 AM

It can be difficult to find hero images for articles like this, so people generate them. Why is that bad to you?

by blahyawnblah

1/12/2025 at 8:08:21 PM

Not everything that is AI is machine learning. The program in the article may use a decision tree, but it is not a learning algorithm; it is an expert system. The difference is that machine learning is inductive, while expert systems are deductive [1].

Here's another example of doing the same thing in Prolog from Markus Triska's website, that also identifies animals by asking the user. Title "Expert Systems in Prolog":

https://www.metalevel.at/prolog/expertsystems

And here's another one that identifies birds, on the Amzi Prolog website:

https://amzi.com/ExpertSystemsInProlog/02usingprolog.php

_____________

[1] Reasoning. Induction and deduction are forms of reasoning. You know, the thing all the LLM folks don't know how to define? We've been able to do that since before the '80s, and decision trees are just one way to do it. Learning a decision tree, e.g. with a decision tree learner like ID3 or C4.5, is an inductive task. Using a (learned or hand-coded) decision tree to make a decision is a deductive one.

But hey, we don't have a definition of reasoning. Oooh nooo.
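The deductive half is tiny to demonstrate. A minimal Python sketch of using a hand-coded decision tree, expert-system style (the toy tree and names are mine, not from the article):

```python
# A hand-coded decision tree used deductively: the tree is the knowledge
# base, and traversal is the inference. No learning happens here.
# Leaf = animal name; inner node = (question, yes_branch, no_branch).
tree = ("Is it a mammal?",
        ("Does it have scales?", "pangolin", "dog"),
        ("Does it live in water?", "fish", "lizard"))

def identify(node, facts):
    """Deduce an answer from known facts (a question -> bool mapping)."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        node = yes_branch if facts[question] else no_branch
    return node
```

Calling identify with {"Is it a mammal?": True, "Does it have scales?": True} walks two edges and deduces "pangolin" without ever changing the tree; building the tree in the first place is the inductive task.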

by YeGoblynQueenne

1/12/2025 at 8:18:56 PM

There is a very important reason why "decision tree" is both a deductive inference procedure and the "model" (really, a propositional logic theory) learned by decision tree learners.

The reason is that the way decision trees, well, make decisions, is the way that expert systems of the '80s made decisions. The first expert system was called DENDRAL (capitals mandatory), derived from the Greek word for "tree", "dendron" (also see: "dendrite"; little tree). DENDRAL and similar systems implement deductive inference procedures that traverse a tree (either bottom-up or top-down, with backward or forward chaining). In traditional expert systems the edges in the tree are production rules of the form IF-THEN-ELSE and the nodes are intermediary goals of what is essentially a proof of a propositional logic formula (the decision tree; it's a boolean formula).

Expert systems' Achilles heel was the so-called "knowledge acquisition bottleneck", which refers, in short, to the difficulty of constructing and maintaining a huge rule base, as needed for an expert system that can do something more interesting than play 20 questions like the animal identification system above. Because of this difficulty, expert systems never quite fulfilled their promise, and they suffered from the brittleness that is one of the first things anyone hears about expert systems today.

And what did people come up with in order to overcome the knowledge acquisition bottleneck? Why, machine learning, of course. The early days of machine learning as a field --remember, we're back in the '80s, a bit before the AI winter hits-- were exactly systems that tried to learn the IF-THEN-ELSE rules for expert systems' rule bases.

Decision tree learners, like ID3 and C4.5 by J. Ross Quinlan, and similar algorithms that were wildly popular even well after the ImageNet moment of deep neural nets, come from that era. Instead of learning only the rules, though, decision tree learners learn an entire decision tree: in short, an entire expert system, from top to bottom.
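The inductive step those learners automate can be shown in miniature. Here is a sketch of ID3's core idea, choosing the question with the highest information gain (a toy, not Quinlan's full algorithm; the dataset is invented):

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_attribute(examples, attributes):
    """ID3's core step: pick the attribute with the highest information
    gain, i.e. the lowest expected entropy after splitting on it."""
    def remainder(attr):
        total = 0.0
        for value in {ex[attr] for ex, _ in examples}:
            subset = [label for ex, label in examples if ex[attr] == value]
            total += len(subset) / len(examples) * entropy(subset)
        return total
    return min(attributes, key=remainder)

# Invented toy data: "barks" separates the classes cleanly, "scales" doesn't.
examples = [
    ({"barks": "yes", "scales": "no"}, "dog"),
    ({"barks": "yes", "scales": "no"}, "dog"),
    ({"barks": "no",  "scales": "no"}, "cat"),
    ({"barks": "no",  "scales": "yes"}, "cat"),
]
```

On this data, best_attribute(examples, ["scales", "barks"]) picks "barks" as the root split; recursing on each branch yields the whole tree, i.e. the whole expert system.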

by YeGoblynQueenne

1/12/2025 at 2:39:25 AM

I've occasionally thought about trying to organize a filesystem that way.

by tzs

1/12/2025 at 4:16:42 AM

I am surprised that the program understood English grammar.

by anoncow

1/12/2025 at 5:52:27 AM

It didn’t. I implemented this program in BASIC as a teen. The brilliance is that, when it isn’t able to guess your animal, it asks the user for the question to ask, as well as the correct answer. All “intelligence” and grammar come from the user, but for someone playing it for the first time against a well-populated database, it looks impressively smart.
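The whole loop is only a few lines. Here is a minimal Python re-creation of the mechanism described above (a sketch, not the original BASIC; the Node/play names are mine):

```python
# Minimal sketch of the classic "Animal" game loop: leaves hold animal
# names, inner nodes hold yes/no questions supplied by earlier players.
class Node:
    def __init__(self, text, yes=None, no=None):
        self.text = text          # animal name (leaf) or question (inner)
        self.yes = yes
        self.no = no

    def is_leaf(self):
        return self.yes is None


def play(root, ask_yesno, ask_text):
    """One round: walk to a leaf, guess, and learn from a wrong guess.

    ask_yesno(prompt) -> bool and ask_text(prompt) -> str stand in for
    the user; returns the (possibly new) root of the tree."""
    node, parent, took_yes = root, None, None
    while not node.is_leaf():
        took_yes = ask_yesno(node.text)
        parent, node = node, (node.yes if took_yes else node.no)
    if ask_yesno("Is it a %s?" % node.text):
        return root               # guessed right, nothing to learn
    # Wrong guess: all the "intelligence" comes from the player.
    animal = ask_text("What was your animal?")
    question = ask_text("What yes/no question distinguishes a %s from a %s?"
                        % (animal, node.text))
    grown = Node(question, yes=Node(animal), no=node)
    if parent is None:
        return grown              # the old single-leaf tree was replaced
    if took_yes:
        parent.yes = grown
    else:
        parent.no = grown
    return root
```

Feeding it "dog" and "Does it bark?" after a failed "cat" guess grows the one-leaf tree into a three-node one: exactly the grow-by-one-question behavior described above.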

by jdlshore

1/12/2025 at 9:50:04 AM

File IO was extremely hard on the early home computers, so it was very unlikely that the program had the ability to save and load the tree. And with such short code, there's no way to balance the tree either.

by Dwedit

1/12/2025 at 2:35:20 PM

The Apple //e version absolutely saved and loaded the tree. I played it just last month with my niece and nephew and it loaded the same ridiculous questions my sister and I had inputted as children 30 years ago.

Rebalancing is not possible, not due to code restrictions but due to the nature of the game. How could the program know what new labels to invent for the inner nodes of the rebalanced tree? Unlike integers, animals are not totally ordered.

by colanderman

1/12/2025 at 11:09:11 AM

I had some version of Animal on our Apple IIe in the 80s, and I'm pretty sure it had the ability to save the tree. I have vague memories of the disk drive whirring after it asked you to provide a differentiating question. Unfortunately I don't have my later Apple IIGS up and running to prove it.

But I did find this BASIC version for Atari computers, from Antic magazine in April 1986. The article specifies:

"It's vital to save your computer's knowledge base to disk or tape - this knowledge is what makes the game fun. To save, type [CONTROL] [S] and you'll be prompted for a device to save the file to, and a filename. Cassette users should simply type C: at the prompt. Disk drive owners type D: and then the filename. To LOAD a previously saved knowledge base, press [CONTROL] [L] and respond to the prompt as explained just above."

The .BAS file at the end doesn't seem to be a plain-text listing; it's probably a saved Atari file. But viewing it in a hex editor, I see strings in the binary like "Shall I SAVE this data?"

https://www.atarimagazines.com/v4n12/Animal.html

by SyneRyder

1/12/2025 at 2:10:19 PM

We had it on an Apple II+ in 1978. It definitely stored everything on the floppy. Including the typos and poorly-phrased questions ("Is it geen?").

by bregma

1/12/2025 at 4:26:20 PM

File IO wasn't portable.

Consider the book where this Animal appeared: David H. Ahl was aiming at a wide audience. The content was previously in Creative Computing magazine.

It was up to the reader to adapt the BASIC programs to work on their machine.

The robot illustrations are the unforgettable charm of that book. It's on the net.

by FarmerPotato

1/12/2025 at 11:00:26 AM

I remember having this book or similar (at least I remember the robot and the person from the cover) and recall that some things just aren’t portable across all BASIC implementations.

I think ZX Spectrum was what I saw most often, and there would be a couple of things that just didn’t work in CP/M BASIC, GW-BASIC, or whatever I was using at the time - I imagine file manipulation was avoided for that reason as well.

by tass

1/12/2025 at 6:45:03 PM

> File IO was extremely hard on the early home computers

Speaking as someone who was there at the time: no, it wasn't. Opening a file, reading/writing data to it, and closing it was a straightforward set of operations on the C64 I had. The I/O was slower, of course.

by throwawayoldguy