4/17/2026 at 11:04:21 PM
I am this very term teaching 18-year-old students 6502 assembly programming using an emulated Apple II Plus. They've had intro to Python, data structures, and OO programming courses using a modern programming environment. Now they are programming a chip from the seventies using an editor/assembler that was written in 1983 and has a line editor, not a full-screen one.
We had a total of 10 hours of class + lab where I taught them about assembly language and told them about the registers, instructions, and addressing modes of the chip, memory map and monitor routines of the Apple, and after that we went and wrote a few programs together, mostly using the low-resolution graphics mode (40x40): a drawing program, a bouncing ball, culminating in hand-rolled sprites with simple collision detection.
Their assignment is to write a simple program (I suggested a low-res game like Snake or Tetris but they can do whatever they want provided they tell me about it and I okay it), demo their program, and then explain to the class how it works.
At first they hated the line editor. But then a very interesting thing happened. They started thinking about their code before writing it. Planning. Discussing things in advance. Everything we told them they should do before coding in previous classes, but they didn't do because a powerful editor was right there so why not use it?...
And then they started to get used to the line editor. They told me they didn't need to really see the code on the screen, it was in their head.
They will of course go back to modern tools after class is finished, but I think it's good for them to have this kind of experience.
by apricot
4/18/2026 at 1:07:16 AM
I took a very similar class 9 years ago, and it was honestly one of the most helpful things I got out of my CS degree. The low level and limited tooling taught me to think before I start writing. I've had other people look askance at me, but on greenfield work I tend to start with pen and graph paper. I'm not even writing pseudocode, but diagramming a loose graph with potential functions or classes and arrows interconnecting them. Obviously this can be taken too far; full waterfall planning will be a different exercise in frustration.
I find spending a few hours planning out ahead of time before opening an editor saves me tons of time actually coding. I've never had a project even loosely resemble the paper diagram, but the exercise of thinking through the general structure ahead of time makes me way more productive when it comes time to start writing code. I've tried diagramming and scaffolding in my editor, but then I end up actually writing code instead of big picture diagramming. Writing it on paper where I know I'll have to retype everything anyway removes the distractions of what method to use or what to name a variable.
The few times I've vibe-coded something this was super helpful, since then I can give much more concrete and focused prompts.
by zrobotics
4/18/2026 at 5:03:46 PM
This is why whiteboards used to be so popular in many/most tech company offices. Doing this exact same process interactively with other people, with a note to NOT ERASE, or later taking a picture of the whiteboard with your phone.
by jimbokun
4/18/2026 at 5:20:37 PM
"used to be"?? What are engineering teams doing nowadays when discussing the architecture of their systems?
by bsaul
4/18/2026 at 5:40:18 PM
In my experience the last several years, primarily we’re all on Zoom waving our hands and making false promises to update Confluence with what we talked about. I miss offices with walls and whiteboards.
by wrs
4/18/2026 at 8:15:57 PM
Miro.com is one of the few SaaS products that our team's collaboration could not live without. Perfect for a distributed team to replace the DO NOT ERASE whiteboards of yore.
by josh_s
4/19/2026 at 10:23:20 AM
Yes, Miro is also what I'm using. It's really a digital whiteboard.
by bsaul
4/19/2026 at 4:31:08 AM
Working remotely.
by jimbokun
4/18/2026 at 9:09:50 PM
I have three whiteboards in my office, and almost all the walls of my team's space are covered with whiteboards. They are always full, and it is always a drama when some space needs to be made.
by BrandoElFollito
4/19/2026 at 1:57:01 AM
In my opinion you should immediately erase after solving the problem on the whiteboard, never taking a picture. Same with notes that you will never see again. Done in pen, on random pages.
That process is bulletproof, for me.
by hackable_sand
4/19/2026 at 2:12:08 AM
exactly the same for me, 30 years and counting…
by bdangubic
4/18/2026 at 12:08:40 PM
"Plans are useless, but planning is essential."
by chrisweekly
4/18/2026 at 6:45:05 AM
One of my favourite experiences coming up as an engineer was working with a very senior engineer right in the beginning. Whenever he had a task or problem, he would start out thinking, maybe doodling a bit on paper, go for a walk, and only then sit down at his computer and start typing. He would type in one go, only compiling at the end, and it would work. (Even typos were rare.) All this to say that it is extremely useful to have the program and the problem space in your head and to be able to reason about it beforehand. It makes it clearer what you expect and easier to catch when something unexpected happens.
by spockz
4/18/2026 at 4:39:21 PM
> He would type in one go only compiling in the end, and it would work. (Even typos were rare.)

Then with each year you'd grow more paranoid if there are no bugs or typos.
by econ
4/18/2026 at 6:04:38 PM
I see some of the value in planning, but experimentation is so cheap, there's also a lot of value in trying it, seeing what works, and learning from it. The main drawback I see from experimentation is failing to understand why something worked.
by dehrmann
4/19/2026 at 2:02:31 AM
The cheapest option in all of software development is to develop the program in your head. That includes experimentation.
by hackable_sand
4/18/2026 at 10:09:51 AM
I was going to say "why on earth are you making them use a line editor, there is probably a vscode plugin for the assembler with syntax highlighting", then I got to your point about it being in their head instead. This reminds me of what Zed Shaw said: for some reason code written without an IDE is better, and he's not sure why.

As a sort of adjacent point, I worked through a book that is used on a course often called "From Nand to Tetris". It is probably the best thing I've done, in terms of understanding how computers, assemblers and compilers work.
by pipes
4/18/2026 at 12:19:07 PM
> This reminds me of what Zed Shaw said, for some reason code written without an IDE is better and he's not sure why.

I am not sure whether the statement is correct; I am not sure whether the statement is incorrect either. But I tested many editors and IDEs over the years.
IDEs can be useful, but they also hide abstractions. I noticed this with IntelliJ IDEA in particular; before I used it I was using my old, simple editor, and Ruby as the glue for numerous actions. So when I want to compile something, I just do, say:
run FooBar.java
And this can do many things for me, including generating a binary via GraalVM and taking care of options. "run" is an alias for run.rb, which in turn handles running anything on my computer. In the IDE, I would have to add some config options, and finding them is annoying; and often I can't do things I do via the command line. So when I went to use the IDE, I felt limited and crippled in what I could do. My whole computer is actually an IDE already - not as convenient as a good GUI, of course, but I have all the options I want or need, and I can change and improve on each of them. Ruby acts as generic glue towards everything else on Linux here. It's perhaps not as sophisticated as a good IDE, but I can think in terms of what I want to do, without having to adjust to an IDE. This was also one reason I abandoned vim - I no longer wanted to have my brain adjust to vim. I am too used to adjusting the language to how I think; in Ruby this is easily possible. (In Java not so much, but one kind of has to combine Ruby with a faster language too, be it C, C++, Go, Rust ... or Java. Ruby could also be replaced, e.g. with Python, so I feel that discussion is very similar; they are in a similar niche of usage too.)
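A rough sketch of such a dispatcher (the commenter's actual glue is in Ruby; this Python version and the extension mappings in it are purely illustrative):

```python
import pathlib

# Illustrative extension -> runner mapping. A real dispatcher would be
# tailored to the tools installed on the machine (GraalVM options, etc.).
RUNNERS = {
    ".java": ["java"],
    ".py": ["python3"],
    ".rb": ["ruby"],
}

def build_command(filename: str) -> list[str]:
    """Return the command line that would run the given source file."""
    ext = pathlib.Path(filename).suffix
    if ext not in RUNNERS:
        raise ValueError(f"no runner registered for {ext!r}")
    return RUNNERS[ext] + [filename]

# build_command("FooBar.java") -> ["java", "FooBar.java"]
```

The point is less the mapping itself than that the glue lives in a language you can freely extend, rather than in an IDE's configuration dialogs.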
by shevy-java
4/18/2026 at 5:07:45 PM
Simpler tools are a forcing function for simplicity. If you don't have code search, you'll need to write code that is legible without searching. If you don't have auto-refactoring utils, you'll have to be stricter about information-hiding. And if you don't have AI, you might hesitate to commit to the first thing you think of. You might go back to the drawing board in search of a deeper, simpler abstraction and end up reducing the size of your codebase instead of increasing it.

Conveniences sometimes make things more complicated in the long run, and I worry that code agents (the ultimate convenience) will lead to a sort of ultimate carelessness that makes our jobs harder.
by bccdee
4/18/2026 at 6:28:35 PM
> Simpler tools are a forcing function for simplicity. If you don't have code search, you'll need to write code that is legible without searching.

i was working in a place that had a real tech debt laden system. it was an absolute horror show. an offshore dev, the “manager” guy and i were sitting in a zoom call and i was ranting about how over complicated and horrific the codebase was, using one component as a specific example.
the offshore dev proceeded to use the JetBrains Ctrl + B keybind (jump to usages/definitions) to try and walk through how it all worked — “it’s really simple!” he said.
after a while i got frustrated, and interrupted him to point out that he’d had to navigate across something like 4 different files, multiple different levels of class inheritance and i don’t know how many different methods on those classes just to explain one component of a system used by maybe 5 people.
i used nano for a lot of that job. it forced me to be smarter by doing things simpler.
by dijksterhuis
4/18/2026 at 5:06:29 PM
I really like this approach. A good reminder that Ruby started out as a shell scripting language, as evidenced by many of the built-in primitives useful for shell programming.
by jimbokun
4/18/2026 at 5:22:17 PM
When .NET first came out I started learning it by writing C# code in Notepad and using csc.exe to compile it. I've never really used Visual Studio because it always made me feel that I didn't understand what was happening (that said, I changed jobs and never did any really big .NET project work).
by SoftTalker
4/18/2026 at 12:26:03 AM
I took several classes along these lines in college; one writing a rudimentary OS on bare metal 68k asm, wiring up peripherals on breadboards, etc. Creating an ALU using only 74-series logic chips and the like. This was 30y ago and the 1970s chips were already antiques, but the lessons were timeless. I'm happy courses like this still exist and I wish everyone had an opportunity to take them as part of a standard computer science curriculum. For me at least, they fundamentally shaped my perspective of computing machinery in a way I never would have experienced otherwise.

Today I program 6502/7 asm for my Atari to help me unwind, and it grounds me and gives me joy, while in my day job I'm easily 10 levels of abstraction higher.
by drzaiusx11
4/18/2026 at 3:38:19 PM
Programming in an assembly language is a very zen-like experience for me.
by ggerules
4/18/2026 at 3:52:15 PM
I love having a relationship with the lowest levels of our craft. Access to an electron microscope and decapping chips to make my own reimplementations (in software) is next on my bucket list. If chip lithography wasn't so prohibitively expensive I'd also try my hand at that...
by drzaiusx11
4/19/2026 at 3:14:26 AM
If you have a couple hundred dollars, you can get a chip made through Tiny Tapeout [0].
by AlotOfReading
4/19/2026 at 1:33:52 PM
Very cool! I meant more hands-on DIY fab, like how I make my own PCBs. It seems farming out my designs to be produced (hdl->chip) at reasonable one-off costs is a plausible avenue now, which is exciting as well. I've probably been exposed to too many toxic chemicals already anyway and should readjust my bucket list plans...
by drzaiusx11
4/18/2026 at 8:20:18 AM
A bit tangential, but I believe dynamic vs static typing works the same way. I switch quite often between them, and whenever I've had a longer break from dynamic typing, coming back to it feels quite heavy. "How did I ever do this?" It feels so heavy.

But a few hours (or days) in, I forget what the problem was. A part of my brain wakes up. I start thinking about what I'm passing around, I start recognizing the types from the context and names...
It's just a different way of thinking.
I recognized the same feeling after vibe coding for too long and taking back the steering wheel. I decided I'd never let go again.
by tikotus
4/18/2026 at 11:33:33 AM
In dynamically typed programs you can, if you allow it, let the types happen to you. In a statically typed program, you are forced to think about them from the beginning. That same abstract concept is at play with vibe coding, but instead the code now happens to you.

My best LLM-written code is where I did a prototype of the overall structure of the program and fed that structure along with the spec and the goal. It is kind of the cognitive bitter lesson: the more you think, the better the outcome. Always bet on thinking.
by genxy
4/18/2026 at 1:23:41 PM
I've used a dynamically typed language extensively. I don't think they are suited for anything but small scripts.

Refactoring is a nightmare: as types don't exist, the compiler can't help you if you try to access a property that doesn't exist.
I think generally people have realised this, and there are attempts to retrofit types onto dynamically typed languages.
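A tiny Python illustration of that failure mode (the names here are invented): after renaming an attribute, a stale call site fails only when that code path actually runs, whereas annotations plus a checker like mypy would flag it before execution.

```python
class User:
    def __init__(self, name: str):
        self.full_name = name  # renamed from `name` during a refactor

def greet(user: User) -> str:
    # Stale call site: still refers to the old attribute name.
    return "Hello, " + user.name

try:
    greet(User("Ada"))
except AttributeError as exc:
    # Only discovered at runtime, and only on this code path.
    print(exc)
```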
by okeuro49
4/18/2026 at 4:37:26 PM
To "realize" that it would have to be true. The longer I've stuck with untyped Python the more I've preferred it, and the more I've seen people tie themselves in knots trying to make the annotations do impossible (or at least difficult) things. (It also takes away bandwidth for metaprogramming with the annotations.)
by zahlman
4/18/2026 at 3:16:04 PM
It bugs me that there are two kinds of languages. Parameters and variables could be typed optionally in a dynamic language; either error in the compiler or at runtime; otherwise you just haven’t made any type errors while you coded and the code is fine either way.
by hyperhello
4/18/2026 at 4:38:24 PM
This is what gradual typing (such as TypeScript, or the use of Python annotations for type-checking) accomplishes. The issue is that it is basically always bolted on after the fact. I have faith that we aren't at the end of PL history, and won't be surprised if the next generation of languages integrates gradual typing more thoughtfully.
by zahlman
4/18/2026 at 9:11:27 PM
JavaScript caught on because it was the best casual language. They've been trying to weigh it down ever since with their endless workgroup functionality and build processes. If we get a next-generation casual language, it'll have to come from some individual who wants it to happen.
by hyperhello
4/19/2026 at 12:39:48 AM
No, JavaScript caught on because at the time it was the only game in town for writing web front-ends, and then people wanted it to run on the server side so that they could share code and training between front end and back end.
by HappMacDonald
4/19/2026 at 3:57:53 AM
It's not enough to just be first. It would have been replaced by now if it wasn't fit for purpose. Otherwise we might as well not bother to critique anything.
by hyperhello
4/18/2026 at 7:26:41 PM
The problem with these two languages is that the runtime type system is completely different from (and much weaker than) the compile-time one, so the only way to be safe is to statically type the whole program.
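Python shows the same gap (this example is mine, not the commenter's): annotations are visible to a static checker but ignored by the interpreter.

```python
def double(x: int) -> int:
    # A static checker (mypy, pyright) would reject double("ab"),
    # but the annotation is not enforced when the program runs.
    return x * 2

print(double(21))    # 42
print(double("ab"))  # runs anyway and returns "abab"
```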
by BoingBoomTschak
4/18/2026 at 7:23:54 PM
A needlessly confrontational view. Some people do use dynamic typing as a way to stumble around until it works (e.g. most scientists), but some others simply don't want the noise associated with a static type system accurate enough to really say what you want, especially during prototyping/interactive use. Which is why gradual typing exists, really.

Same reason my views about GC evolved from "it's for people lacking rigour" to "that's true, but there's a second benefit: no interleaving of memory handling and business logic to hurt clarity".
by BoingBoomTschak
4/18/2026 at 12:54:36 AM
>> Ed is for those who can remember what they are working on.

https://www.gnu.org/fun/jokes/ed-msg.html
My first job out of university I was taught how to use a line editor in IBM UniData. It was interesting getting used to writing code that way.
But it was an amazing day when I discovered that the "program table" was just a directory on the server I could mount over FTP and use Notepad++.
by wffurr
4/18/2026 at 9:24:24 AM
I've been diving into aarch64 assembly recently, building a small Forth. It's honestly really refreshing.

If you're prepared to forgo some portability and pick an architecture, assembly opens up a lot of options. Things like coroutines and automatic SIMD become easier to implement. It's also got amazing zero-cost C FFI (and I'm only half joking). Linux kernel booting into a minimal STC Forth is a lot of fun.
Not to mention you can run your code on android without SDK or NDK over ADB (in the case of aarch64).
by andersmurphy
4/18/2026 at 10:03:38 AM
First time I've heard about that! thx for sharing
by hiroboto
4/18/2026 at 10:35:13 AM
Recommend this series if you want to dabble in assembly/Forth:
by andersmurphy
4/18/2026 at 2:46:34 AM
I love all of this, but at least let your students use vi, it was around back then (or close). Plus they don’t have to give it up when they go back to the real world, it’s an evergreen skill!
by cobbzilla
4/18/2026 at 5:39:22 AM
To be fair, it's mostly an evergreen skill because people don't know how to exit.
by sagacity
4/18/2026 at 12:16:57 PM
I know you're joking but if one can't figure out how to quit vi they should find employment opportunities in other fields of work.
by assimpleaspossi
4/18/2026 at 6:41:33 PM
And I bet emacs has a mode for that too.
by fuzztester
4/18/2026 at 8:35:12 AM
I don't have a problem, I can quit anytime I want!
by s1mplicissimus
4/18/2026 at 8:48:02 AM
Friends don't let friends use vi - they know that once you start, you'll never quit!
by TeMPOraL
4/18/2026 at 10:58:11 AM
if you learned it in class, I hope you learned how to exit!
by cobbzilla
4/18/2026 at 3:42:05 PM
shift :wq

Or....
Ctrl-Z
by ggerules
4/18/2026 at 3:34:16 PM
Upvotes for apricot and zrobotics for thoughtful shared experiences.

One of the continuous battles I kept losing was introducing an assembly language undergraduate course. Other higher-up colleagues and deans would say... too hard... nobody uses that anymore... and shut the course down. But I would always sneak it into other courses I taught: systems programming, computer languages, computer architecture. Still, I've always felt there was a hole in my students' understanding of computers.
I grew up in a time when assembly language was a part of the curriculum. It helped bridge the gap to higher-level languages like C/C++ etc. Also why certain language features exist. Also how many language constructs work. Also, more importantly, as pointed out by the two posters above, it gives you a way to think, one asm line at a time, about what is going on in the CPU ecosystem. That is fantastic training!
Even though I kept losing the assembly language course battles, I hope I planted enough seeds in students that they will take it up on their own at some point. Everyone should at least learn to program in one assembly language.
by ggerules
4/20/2026 at 12:43:37 AM
I remember having a fun time with a line editor the one time I tried it; it felt very intuitive. I can't remember if it was 'ed' or a Forth-based variation though.
by fouc
4/18/2026 at 9:18:33 AM
> And then they started to get used to the line editor. They told me they didn't need to really see the code on the screen, it was in their head.As someone that used to write C and Assembly programs on a sheet of paper for university exams, I chuckled a bit. I finished university in post-soviet country twenty years ago or so and this was the norm. I used to hate it so much.
by p2detar
4/18/2026 at 11:25:08 AM
I find something similar happening as I transition to spec-driven development - whilst the agents do the work I used to do, I spend a hell of a lot more time thinking about what I want the outcome to be, rather than hacking around the limitations of frameworks I know and avoiding tech I don’t. It’s freeing, actually.
by FrankRay78
4/18/2026 at 6:12:44 PM
Agree it's good. I used to program with these tools because that's all I had, and when I had better tools I moved on to them. We also used to build our own better tooling.

When I built my first guitar I had very few tools, so I used what I had since I'm cheap ;) Then I bought better tools and it made my life a lot easier. But I got some lessons from the experience. Mostly though it was a pain that's solved by better tooling.
by YZF
4/18/2026 at 11:42:12 AM
I would love it if you made such lectures into a series of YouTube videos, as I sadly didn't have the pleasure in university. We only had Java 6/7, as an indicator of how long ago that was.
by neocron
4/18/2026 at 4:25:32 PM
What I found teaching coding is that Python is actually not good as a first language. The main issue is types -- you should always think of what type your variable is (and what types the collection items are). Python makes it hard.
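For what it's worth, annotations can surface those types for students; a minimal sketch (the function and data here are invented for illustration):

```python
# Unannotated: nothing tells the reader what `records` holds.
def total(records):
    return sum(r["amount"] for r in records)

# Annotated: the shape of the data is spelled out, and a tool
# like mypy can check call sites against it.
def total_annotated(records: list[dict[str, float]]) -> float:
    return sum(r["amount"] for r in records)
```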
by deepsun
4/18/2026 at 4:41:18 PM
Indentation is mandatory, but when are spaces? Imo, if you're running static analysis it'll take care of types well enough. Sometimes I need a cast, but not often.

Contrasting that to helping a college roommate with Arduino code he said he didn't understand what it was doing: he had 0 indentation. Braces everywhere. He didn't understand what it was doing not because it was complex logic (it was only maybe 30 simple lines) but because his flow control was visually incomprehensible. It's pretty hard to do that in Python.
But that's why I believe in being a polyglot. Best of multiple worlds.
by Neywiny
4/18/2026 at 2:22:13 AM
My first real program was a UVEPROM copier. It was written in MC6800 machine code, and we had 256 bytes (not kilobytes) of RAM for everything, including the code. That was in 1983.

I am currently working in Swift, with an LLM, on a fairly good-sized app, in Xcode, for a device that probably has a minimum of 64 GB of storage and 8 GB of RAM.
I don’t really miss the good ol’ days, to be honest. I’m having a blast.
by ChrisMarshallNY
4/17/2026 at 11:29:56 PM
Is this course available online? Sounds like great fun.
by flawn
4/18/2026 at 12:31:54 AM
Still remember my assembly class with the HC11 20 yrs ago: amazed by how much we could do with so little hardware.
by philipnee
4/18/2026 at 6:04:00 AM
I had a similar experience recently coding a Wordle clone on and for a Psion 3a (an early-90s palmtop PC). The screen only shows a few lines of code, and the built-in IDE is little more than a text editor. I really enjoyed the process.
by MattBearman
4/18/2026 at 9:57:22 AM
Do you have any more notes/lectures/references that you can share? I would like to try something similar.
by mchaver
4/18/2026 at 12:18:05 AM
Whoa, I didn’t know such a thing existed. What emulator do you use?
by ssgodderidge
4/18/2026 at 3:22:53 AM
AppleWin, and the assembler is an early version of Glen Bredon's Merlin.
by apricot
4/18/2026 at 6:22:20 AM
Thank you for picking an enjoyable architecture! Scaring people away with x86 cruft right out of the gate is no good for anyone :-)
by sitzkrieg
4/18/2026 at 9:40:00 AM
Can't imagine anyone teaching x86 assembly these days; did you mean x86-64?
by MaxBarraclough
4/18/2026 at 10:54:45 AM
What do you imagine the difference to be between those two?
by tremon
4/18/2026 at 4:23:00 PM
The improved register count must make it much less claustrophobic for students. It's not just the same ISA but with wider words.Looks like I'm mistaken on terminology though. x86 includes the 16-bit, 32-bit, and 64-bit ISAs of that family, and doesn't refer specifically to the 32-bit generation.
by MaxBarraclough
4/18/2026 at 8:56:20 AM
This is a really close equivalent to keeping sketching and clay modeling in design school.
by juliendorra
4/18/2026 at 6:45:51 AM
I thought it was common practice to think things through first and only then start doing something, but it seems that these days a lot of people have taken inspiration from Zuckerberg’s motto, “move fast and break things”… I’ll never forget that.
by sixtyj
4/18/2026 at 5:59:35 PM
One place I've seen people get caught here is when they don't actually have the information they need to solve the problem - when they don't understand the problem space well enough, or they don't know the boundaries of the systems or technologies they're using well enough, or there's unanswered questions. At that point, I've seen people dig into research projects and 15-page design document discussions that would all be obviated by a day or two of just doing the thing and seeing what happens.

My understanding is that this was the actual point of "move fast and break things": gain knowledge by trying stuff to help you make better decisions, even if you make a mistake and need to roll back or fix it. The art to this is figuring out how to contain the negative consequences of whatever you're testing, but by all means, experiment early to gather information.
I've stated it to mentees as "don't be afraid to start a fire as long as you know where the fire extinguishers are" - it's OK to fail in the service of learning so long as you fail in a contained way.
by roughly
4/18/2026 at 9:37:48 AM
P.S.: I didn't mean that in a negative way; I was just surprised that we have to learn this because our kids have forgotten it, or probably we don’t teach planning in elementary schools.
by sixtyj
4/18/2026 at 6:02:42 PM
TBH I think the bigger problems with how we teach kids are twofold:

1. There's a right answer to every problem in school
2. If you got it wrong, that's bad, and you did bad.
The pattern I've seen from younger people these days is a learned helplessness, where there's no room for them to be creative in school, and any attempt to do so runs the risk of failing an assignment, getting a B, missing out on Harvard, and spending the rest of their lives poor in a ditch, or so they're told.
by roughly
4/18/2026 at 3:26:14 PM
I think it depends on your goals. There are many domains where you’re better off just trying lots of things and iterating towards a more ideal solution, vs. waiting to start until it’s been analyzed thoroughly to find the perfect solution.

For example, I suspect more startups die from over-analysis than from acting too quickly and breaking things beyond repair.
That said, I think LLMs can be a mixed bag here. I find that they can really help my analysis phase, by suggesting architectures, finding places where future abstractions will leak, reminding me of how a complex project works, etc. I’ve found it invaluable to go back and forth in a planning phase with an agent before even deciding what exactly I want to build, or how.
And on the implementation side, they make code attempts very cheap, so I can try multiple things and just throw them away if I don’t like the result.
But that said, I do find that it requires discipline, because it’s very easy to get into a groove where I don’t do any of that, and instead just toss half-baked ideas over the wall and let the agent figure out the details. And it will, and it’ll be pretty decent usually, but not as good as if I pair-program with it fully.
by senordevnyc
4/18/2026 at 11:58:11 AM
Hmm I dunno if I'm a fan of making things painful for students just because it was painful in the past. When I "learnt" assembly in uni we had to manually assemble the opcodes and type them into a 10 digit keypad. I didn't learn anything, it just put me off.I'm pretty skeptical that using a line editor will have helped them learn. It probably helped them memorise their code but is that really learning? Dubious.
by IshKebab