balls187 5 years ago

The final for Computer Architecture had us building an 8-bit CPU. It was a multi-week project, starting with designing the instruction set and leading up to building the CPU in software and implementing bubble sort in assembly.

The first and only time I had to do an all-nighter (2 actually) in college was due to that project. Two days before the final presentation, the CPU didn't work. After a few clock cycles the memory would contain garbage. I ended up rebuilding it from scratch, debugging every step of the way, only to find out the 1-bit mux (a primitive supplied with the software) was wired backwards.

0 corresponded to the B input, and 1 selected the A input.
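
The behaviour boiled down to something like this (a rough sketch in Go of the two conventions, not the simulator's actual primitive):

    // Rough sketch of the two select conventions (not the actual simulator primitive).
    package main

    import "fmt"

    // What I assumed: select = 0 picks A, select = 1 picks B.
    func muxExpected(a, b, sel bool) bool {
        if sel {
            return b
        }
        return a
    }

    // What the supplied primitive actually did: select = 0 picks B, select = 1 picks A.
    func muxSupplied(a, b, sel bool) bool {
        if sel {
            return a
        }
        return b
    }

    func main() {
        a, b := true, false
        fmt.Println(muxExpected(a, b, false), muxSupplied(a, b, false)) // true false
    }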

Once I corrected that, the CPU worked like a charm, we nailed the final preso, and I slept for 16 hours.

  • spchampion2 5 years ago

    I had a very similar experience, including a very late night trying to fix a weird integration problem along with my project team. We had spent hours trying to find the issue, but nobody was thinking effectively and we were making stupid decisions. Everyone wanted to keep going, but I insisted that we were not thinking clearly and that sleep would help us better solve the problem. I even promised I would come back first thing in the morning to start debugging again on my own. The team begrudgingly agreed and we walked away for the night.

    I slept, came back in the morning, and had the problem fixed in like 15 minutes. I learned so much about CPU design from that class, but I also learned how important sleep is to thinking clearly.

    • sixothree 5 years ago

      I woke up one day recently and "knew" what a co-worker had done wrong. He had pasted the steps of his issue with screenshots. Having slept on it, I realized he had misspelled a server name. I have no idea how I discovered or realized that while I was sleeping. But there it was.

      • 52-6F-62 5 years ago

        I've commented on a similar thing previously—

        I worked for a while with some older Vietnamese refugees who were largely self-taught electrical engineers. They used to tell me how they'd dream about the circuits in these high-watt lighting ballasts they were fixing. One tale specifically had them trying to find a fix over the course of a week, only to dream up the solution and come in the next morning to repair the gear in minutes.

        I've definitely dreamed up solutions to software bugs and architecture problems myself.

        I've never really dug deeply into why, but it's fascinating.

      • BonesJustice 5 years ago

        Once you’ve solved your first programming issue in your sleep, you’re officially a Real Programmer(TM).

        Or at least you can feel confident that you’ve picked a good career path.

      • jussij 5 years ago

        The mind is an amazing bit of kit.

        On several occasions during my IT career I have faced situations where the problem at hand has taken days/weeks to solve.

        What is most amazing is that on some occasions the solution literally just pops into my head.

        It's almost like there is some background thread of the mind working subconsciously on the problem, only to rise to a conscious level when a solution is found.

        • ishiz 5 years ago

          There certainly seems to be something to it. This isn't a new phenomenon: there is a story that Archimedes, around 250 BC, couldn't solve a problem until he stopped working on it and took a bath. While this story may not be true, it does show that these "eureka moments" have been recognized not just today but for thousands of years.

          I'm not a psychologist but from what I understand, "dual process theories" state that humans have two distinct thought processes. System 1 is fast, instinctive, and unconscious; it answers questions like "2+2=?" System 2 is conscious, deep thought; it answers questions like "if 2x+3=17, x=?". System 1 is the default thought process, but it can be suppressed by system 2. I would love to hear from anyone who knows more about how this might be connected.

          • Angostura 5 years ago

            To be fair, the problem Archimedes was stuck on was how to measure the volume of an irregular body. Immersing his irregular body in the bath kinda gave him a clue.

          • tkxxx7 5 years ago

            Many meditation practices focus on how to objectively detach from and observe system 1 so that we can maintain control over 2

      • sly010 5 years ago

        I once asked a potential airbnb host a question through the contact form. The host answered something totally unrelated like "Hi, good question. We have 3 kitchen towels, 4 spoons, 7 chopsticks, 5 plates, 5 bathroom towels etc". It was strange. I woke up in the middle of the night when I understood it in my dream.

        • b_tterc_p 5 years ago

          and what did this mean?

          • sly010 5 years ago

            He wanted me to call him.

            • b_tterc_p 5 years ago

              I don’t think I would have gotten that. Good job.

              • J-dawg 5 years ago

                Took me a moment to get this, but the part I still don't understand is why he needed to encode the phone number like that?

                Does Airbnb strip contact details out of messages to prevent hosts and visitors from getting in contact directly?

                • irrational 5 years ago

                  Oh yes. They want all contact to go through them. They charge huge fees and they want to make sure that the hosts and clients don't try to make arrangements on their own without AirBnb getting their cut.

      • onei 5 years ago

        Sometimes you can't see the wood for the trees.

        I once spent more time than I want to admit trying to fix a broken import in Python for a module that I knew existed but wouldn't import. The morning after, I realised I was trying to import "connnector" and couldn't see the extra "n".

      • HeWhoLurksLate 5 years ago

        related Studio C sketch: https://www.youtube.com/watch?v=CxV0Nh10h5s

        I've done that in my high school Visual Basic class - I woke up around 2 AM, knew exactly where I was going wrong, fixed it, and kept going until I'd finished my program. By the time my dad woke up at six, I had written over six hundred lines of code, had about twelve new features, and I was so exhausted I got sick for two days. I'm still proud of that sprint.

        • adrianhel 5 years ago

          My most absurdly productive episodes have been when I've worked through the night. I love the amount of focus I get when there is nothing to distract me!

      • larzang 5 years ago

        The absolute best coding I get done every day is during my morning shower. Letting your brain indirectly work on something is often the best approach to our kind of work.

    • agrippanux 5 years ago

      You might want to read "Why We Sleep" by Matt Walker. It's quite fascinating what sleep actually does and how it affects memories and thinking. There is a large section devoted to debunking myths about pulling all-nighters; the super broad stroke is that the short term memory area of the brain needs sleep to offload data to the long term memory area, and a lack of sleep causes the short term memories to just bounce off. Once sleep is achieved, the brain can resume normal functioning, which feels like "a-ha" moments but really is just your short term memory working with a fresh cache again.

    • hbhakhra 5 years ago

      This reminds me of a time I pulled a near all-nighter for a final. I came into the final on too little sleep and even though I knew the material, I couldn't think during the test. I took a 20 minute nap during the two hour final, woke up, and aced it.

  • cobookman 5 years ago

    Had to do the same assignment. I highly recommend doing it. Was the most painful thing I ever did in college, but it taught me so much about how processors work, and how to code in C.

    • _han 5 years ago

      Is there a public resource somewhere that describes the assignment/gives some guidelines? I would like to give it a try.

      • cweagans 5 years ago

        https://www.nand2tetris.org/ goes through the same process. Highly recommend!

        • BanazirGalbasi 5 years ago

          The game MHRD (MicroHard) offers the same content, but in an interactive environment. I haven't gone much into it yet so I don't know how the end content compares to Nand2Tetris, but a lot of people say they're different methods of teaching almost identical content.

          • mtreis86 5 years ago

            It is very similar in terms of the logic puzzles you need to solve, but the format and tests are different.

      • jlangemeier 5 years ago

        Introduction to Logic Circuits and Logic Design with VHDL: (ISBN: 3319341944)

        I'm sure each person here is talking about a slightly different assignment, course, and book; but for me (from an EE side of the coin) the person that wrote this book taught our 2 semester course series that starts with basic logic, moves into the circuitry, and then into modeling the entire computer architecture in VHDL on a reasonably inexpensive FPGA (the Altera dev boards are about $100-$200).

        The book follows that 2 semester series to a tee, and uses the guy's same in-course narrative style (minus smacking the whiteboard @ 8 in the morning and us getting yelled at by the French teacher next door). The writer has won multiple teaching awards from IEEE, and is a real gift to the EE community - as you can tell I have 0 complaints.

        • fwip 5 years ago

          And if you want to learn the next level down, how computer chips work, read Introduction to VLSI Systems, by Conway and Mead.

          • jlangemeier 5 years ago

            That was the next semester - and further down the rabbit hole after that, if you can find a university with a clean room, is actual microchip design. Even getting a PNP transistor built from silicon is a damned miracle - out of a class of 15, only 5 of us got one fully working to spec.

        • balls187 5 years ago

          What's kind of cool (to me anyway) is that FPGA-based game machines are now making a comeback.

          Being able to define, in software, the exact hardware specifications of older consoles is pretty cool, whereas previously you'd use general purpose RISC or CISC CPUs and have everything emulated as a software application.

  • userbinator 5 years ago

    Situations like this show why it's very important to have a complete understanding of the system when debugging --- the more complete your knowledge, the less likely parts of it will be forgotten in panic/frustration, leading you to assume things about the system such as the correctness of certain parts.

    I've been a TA for a course that did something very similar, and whenever a student came to me with a noticeably vague understanding (usually expressed as "it doesn't work, I've spent days on this and I don't know why"), I would observe him/her debugging it for a few minutes and showing it to me, and almost always I'd spot the problem right away; but instead of pointing it out, I would ask the student to print several copies of the circuit onto paper, then tell him/her to annotate all the signals with their expected values for the few cycles leading up to, including, and after the problematic one.

    At this point a lot of them would look at me like I was insane, and reply with some variant of "I can't do it" or "we were never taught how to do that" and want to reach for the computer, whereupon I would stop them and show how. Once they figured it out, they would usually reach a point and say "I think this was my problem" --- wanting to go back to the computer again, and again I'd intervene to tell him/her to finish the whole annotation first (because they'd often have more bugs to be discovered.) Once finished, however, I'd let them use the computer again and compare, and then they would always have no problem finding and fixing the original bug, and perhaps several more after that.

    I believe this is closely related to another phenomenon I've observed, which I call "debugger tunnel-vision", where a human using a tool and trying to debug a system essentially starts to blindly trust parts of it as being correct, because his/her own understanding of the functioning is itself unclear. My insistence on not using the computer and going back to pencil and paper (and brain) is, albeit probably quite "old-school" to some here, I believe an extremely important technique in being able to understand and debug effectively. It's worked not only for low-level hardware courses, but more high-level ones too --- where I tell students struggling with their code to "mentally execute" each step and compare the resultant expected values with those obtained. One of my favourite sayings is: How can you expect to be able to tell a computer what to do, if you yourself don't know how to do it?

    • rarecoil 5 years ago

      > At this point a lot of them would look at me like I was insane, and reply with some variant of "I can't do it" or "we were never taught how to do that" and want to reach for the computer, whereupon I would stop them and show how.

      I wish I would have had such a TA in my university. Instead, I got compressed deadlines and a "you figure it out or you fail" mentality in the majority of my classes. There were lectures and a textbook - if you couldn't sort it out and make it click on your own, alongside all of your other courses, you were SOL.

  • mountainofdeath 5 years ago

    I had to do something similar although the task was to implement a subset of MIPS instructions and simulated pipelining and out-of-order execution.

    • InitialLastName 5 years ago

      One day I'll die at the hands of a branch delay slot in an OOO system.

  • cbanek 5 years ago

    Same thing for one of our computer engineering courses at UIUC. Great course, and while it took a long time to do it, it was a great experience (other than the fact that we had to do it in Mentor Graphics, which would randomly crash and eat your project). We had assembly programs we had to test against, and it was always really interesting debugging your register stalls. I think mine was a 5-stage processor, MIPS style, so we had to worry about all the pipelining. There was also a standard parts library so everyone was trying to compete against each other to get the best instructions per cycle.

  • ccozan 5 years ago

    Had almost the same assignment. We were teams of two, and we had to demonstrate that it worked by implementing something very simple in assembly, using instructions we had to design ourselves. I was in charge of the pipelining and clocking and my colleague took the ALU and the I/O. Testing was a job for both of us, AFAIK. I remember chasing a propagation bug; I had to add some delays on some lines to get proper sync.

    Really, really fun for a semester and a well deserved grade at the end of it.

  • urda 5 years ago

    I know that feel. I was tasked with wiring up the even / odd bytes for my team's Microcomputer. After countless hours of wire trimming and breadboarding, I was so afraid to boot up the board. Popped the brick in the wall, got a solid power light, and was presented with my terminal.

    Huge sigh of relief, I might have just passed out then and there if I didn't have to continue.

  • djhworld 5 years ago

    Hello!

    Really enjoyed your story.

    While I wasn't up against any deadlines I did have a few of those so-simple-it's-painful fixes when I was doing this project haha

  • cjensen 5 years ago

    Had to do that. Implemented a 68000.

    It was a stupid project. I learned nothing from the project that I had not already learned along the way with simpler toy instructional examples. And yeah, the double all-nighter and having one team member give up 12 hours before the due date was one of the worst experiences of my life.

    • dpe82 5 years ago

      If you'd already "learned it all" why did it require double all-nighters to complete the project while being so difficult a team member even gave up? Did they only give you 3 days to do it?

      • cjensen 5 years ago

        The 68K is a complex and full-featured CPU, so there is a lot of tedious detail to be done that has nothing to do with the overall concept of "design a CPU". For example, there are so many instructions that it took a couple of days just to write the microcode.

        Let's use an analogy. Suppose you have a class where they teach using Minix. Tasks like "rewrite the time slice algorithm" or "implement part of a filesystem" can be valuable to understanding how an OS works. But if the final project is "reimplement the Linux kernel from scratch" well then you are going to spend a lot of time on the detail work of what you already understand without actually learning much new. It's a poor use of time.

        Or in compiler classes: messing with a module in LLVM is a valid learning experience. Implementing a tiny language is super valuable. But "write a C compiler conforming to the C Standard" gets you into a lot of gory details about C, takes an enormous amount of time, and teaches you very little about the theory and practice of compilers and languages.

      • aoeusnth1 5 years ago

        I can think of a bunch of reasons:

        - Things can be challenging without being educational. (OP explicitly says this)

        - ADHD

        - High course load / other assignments they might care more about.

        Your comment presumes knowledge of their situation that you don’t have. Don’t do that.

  • godelmachine 5 years ago

    Was it in a Bachelors course or a Masters course?

    • balls187 5 years ago

      This was an undergrad course.

      Computer Architecture 1

      It was required for Computer Engineering majors (my major), but I believe an elective for Electrical Engineering and Computer Science majors.

xigency 5 years ago

> So I’m trying to get a better understanding of this stuff because I don’t know what L1/L2 caches are, I don’t know what pipelining means, I’m not entirely sure I understand the Meltdown and Spectre vulnerability papers.

I wouldn't discourage anyone from learning about hardware from implementing gates or even looking at simple 8-bit CPUs, but if you are interested in learning how modern caches and pipelines work, there is a free Udacity course that goes into excellent detail [0]. You can also find the lecture videos on YouTube.

This is originally an online course at Georgia Tech and the professor does an excellent job teaching these concepts.

[0] https://www.udacity.com/course/high-performance-computer-arc...

  • dr_zoidberg 5 years ago

    There's a very good article [0] that makes the rounds here every now and then. It gives a very good explanation of many of those concepts, while remaining understandable for those who haven't taken CPU design courses.

    [0] http://www.lighterra.com/papers/modernmicroprocessors/

    Edit: It'd be great to see a 2019 refresh (the last update is from 2016) talking about Ryzen and the newer Intel designs.

    • djhworld 5 years ago

      Bookmarked, thanks for the link, I think it will help me with the next step

  • taormina 5 years ago

    I took this class at Georgia Tech as an undergrad and I also highly recommend it for the material. We had a chain of CS classes leading up to it (2110 -> 2200 -> 4290) on the undergrad side, but when I took it, this course was actually cross-listed as both an undergrad and graduate CS and ECE class (so CS 4290/6290 and the same sort of thing for the ECE major were all in the same literal lecture hall). Of course, most CS students were only taking this if they were in the System & Architectures thread (or maybe Devices?), but most CS students had to at least take up to CS 2200 (Systems & Networks, which basically took the concepts up to "Build me a working 1 CPU computer", whereas this class took the next steps from "Okay, so how do we do better than 1 instruction per cycle?"). Multi-core processors are really cool!

    I frequently take for granted that most people don't have this foundational knowledge. The basics come up more than you'd think, especially when provisioning the right hardware for the task.

    • filoleg 5 years ago

      Unfortunately, that’s not the case anymore with 2200. There is only one thread combination that allows CS students to skip 2110, but quite a few combinations that allow skipping 2200, not even mentioning 4290. Anecdotal, but out of people who graduated with me that I know, barely two thirds of them took 2200 :(

      P.S. For clarification, since those classes are in a series (2110->2200->4290), skipping an earlier one means you are skipping all the following ones too.

      • nostrebored 5 years ago

        That's horrible. 2200 was a hazing and a half but Leahy was such an effective instructor and I don't regret learning the material at all. Building our own pipeline processors was such a fun, if taxing, experience.

        • filoleg 5 years ago

          Took it with Leahy myself, I feel like the material was absolutely essential to me, even though I work in web dev. Understanding the basics of how it all works under the hood, down to the ticks of the processor and how it all gets flushed through the pipeline was an eye opening experience.

          P.S. Bill retired a year or two ago :(

  • djhworld 5 years ago

    A colleague linked me to a similar course - I'll definitely be checking this out though - thanks

  • jambutters 5 years ago

    what are the prerequisites? C?

    • harias 5 years ago

      From the website:

      You must be familiar with Assembly code, the C or C++ programming language, Unix or Linux, and the basics of pipelining.

blago 5 years ago

I took an architecture class back in college and the first time we met, the professor gave us a couple of programs described in plain English. One of them was basically sorting, and the other one something else.

Each of us spent the rest of the semester picking an instruction set, designing a system, writing an emulator, and writing the code that would perform the tasks described on the first day.

I went a little overboard and created a C backend for the emulator along with an in-browser JS client that was pretty much a full-blown machine language IDE and debugger.

Needless to say, this feature creep didn't end well. I barely made it work well enough to get an OK grade, but I learned that overconfidence can be more dangerous than the lack of it.

Here's what I was able to salvage from the front end on short notice: http://blago.dachev.com/~blago/CS-535/stage_2/src/web/ It's using ext.js which was pretty cool at the time.

  • lioeters 5 years ago

    > I went a little overboard and created a C backend for the emulator along with an in-browser JS client that was pretty much a full-blown machine language IDE and debugger.

    Haha, this made me smile. I can totally relate to that ambitious line of thinking: given a task to solve, you imagined the logical next steps, to build an environment for solving that general class of tasks.

    Great lesson about feature creep, but I also think that kind of vision and ambitious problem-solving can be valuable in the long term; if it works, the whole community/ecosystem can benefit.

  • djhworld 5 years ago

    > I went a little overboard and created a C backend for the emulator along with an in-browser JS client that was pretty much a full-blown machine language IDE and debugger.

    My original goal for the project was to type a letter on the keyboard and get something rendering on the screen, but I definitely felt the urge to keep implementing new things. Luckily the blog post took a lot of that urge away and focussed it on writing about the project!

  • Hendrikto 5 years ago

    > I barely made it work well enough to get an ok grade but I learned that overconfidence can be more dangerous than the lack of.

    That is a very important lesson.

    • w0utert 5 years ago

      Yeah, reminds me of the time I had to write a BSP tree builder as a course assignment, and ended up spending 2 months writing an OpenGL renderer + an importer for the Apple QuickDraw 3D API instead (hint: don't try to do that, it's almost a full object-oriented programming language in itself if you want to support even a moderately complex model).

      All just because I found some nice architecture models that were only available in QD3D format, and I was convinced it would only take me a few hours to read them... I only got the brilliant idea of just converting them to something simpler like .3DS after I already finished the assignment >_<

flanbiscuit 5 years ago

Very cool!

I had the same thoughts as you and bought "From Nand to Tetris"[1] a while ago but I did not get as far as you. You've inspired me to pick it back up and finish the book.

Curious if you decided to use "But How Do It Know?" over "Nand to Tetris" for any specific reason, or if you were just not aware of the latter.

1. https://www.nand2tetris.org/

  • robbbbbbbbbbbb 5 years ago

    +1 for Nand to Tetris.

    As someone who came to computing through high level software and a little later than many (I wasn't dismantling appliances at age 5 like you hear in a lot of people's origin stories) this was a really empowering ground-up introduction to hardware architecture.

    One of the few Coursera courses [1] I actually finished and found rewarding, challenging and fun throughout.

    1. https://www.coursera.org/learn/build-a-computer

  • djhworld 5 years ago

    Hi author here!

    I actually didn't set out on building anything, it all happened organically! I honestly cannot remember why I chose the book, I think I just saw the blurb and figured it was good enough to read.

    It was only when I was a couple of chapters in that I figured I could probably whip something up in code :)

    • Hnrobert42 5 years ago

      Hello author. I just wanted to heartily thank you for trying this and writing a post about it. I love seeing people pursuing technology for the sheer joy of it.

      Staring at the blank whiteness of an unwritten post, one wonders whether anyone else will notice or care what words will come tumbling out. But through your post, your work is multiplied. It is inspiring.

      So, again, thank you.

      • djhworld 5 years ago

        No worries, thanks for the kind words.

        I got some great feedback from friends and colleagues before publishing it so thanks to them too :)

  • alpaca128 5 years ago

    Just a few days ago I found NandGame, which is a browser game where you build a computer starting with just the Nand gate and using only components you already built for the next one. Pretty cool idea.

js2 5 years ago

> However, after making my way through But How Do It Know? by J. Clark Scott, a book which describes the bits of a simple 8-bit computer from the NAND gates, through to the registers, RAM, bits of the CPU, ALU and I/O, I got a hankering to implement it in code.

Speaking of code, another excellent book along these lines is Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

http://www.charlespetzold.com/code/

  • djhworld 5 years ago

    I actually read this book in 2012/13, it inspired me to build my gameboy emulator! It's definitely an excellent read.

    I can't remember if the book discussed the gate-level stuff; maybe that washed over me a bit at the time, I'm not sure.

  • kemiller2002 5 years ago

    Anytime someone is starting out their development career and didn't go to school for CS, I recommend this book. It's a fast read and an excellent primer.

  • drtse4 5 years ago

    An amazing book and one of my favorites; discovered it just a few years ago, sadly. I would have loved something like that when I was younger.

  • heinrichhartman 5 years ago

    Yes, that's notably the only publication by Microsoft Press I can recommend ;)

eatonphil 5 years ago

Going through a similar phase, though this is way further along. :) I had a great course in college on computer architecture that culminated in a processor in Logisim that could run fairly complex programs. (I recommend the Harris and Harris textbook for a surprisingly light/easy introduction [0].) But that was a while ago and I've never done anything for x86/amd64.

I started working on an emulator [1] a few weeks ago but it interprets Intel x86 assembly rather than ELF files. I found this a great way to get started since parsing text is easier and the instructions you need to get a basic C program (compiled to assembly) running take an hour or two: call, push, pop, add. You can shim _start in without having to implement syscalls.

Conditional jumping and syscalls took another weekend or two and now it can run some basic fibonacci C programs. I also had to build a graphical debugger for it to see what was going on... I will probably move to reading ELF files soon.
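
To give a flavour of the approach, the core of such an interpreter can be tiny - something like this sketch (simplified, made-up operand syntax and only a handful of instructions; not the actual project's code):

    // Toy illustration of the "interpret assembly as text" idea: a register map,
    // a stack, and a dispatch loop. Operand order here is a simplified, hypothetical
    // "op src, dst"; a real interpreter has to handle real Intel/AT&T syntax,
    // labels, call/ret, and so on.
    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    type machine struct {
        regs  map[string]int64
        stack []int64
    }

    func (m *machine) run(program []string) {
        for _, line := range program {
            fields := strings.Fields(strings.ReplaceAll(line, ",", " "))
            if len(fields) == 0 {
                continue
            }
            switch fields[0] {
            case "mov": // mov imm, reg
                imm, _ := strconv.ParseInt(fields[1], 10, 64)
                m.regs[fields[2]] = imm
            case "add": // add srcReg, dstReg
                m.regs[fields[2]] += m.regs[fields[1]]
            case "push":
                m.stack = append(m.stack, m.regs[fields[1]])
            case "pop":
                m.regs[fields[1]] = m.stack[len(m.stack)-1]
                m.stack = m.stack[:len(m.stack)-1]
            }
        }
    }

    func main() {
        m := &machine{regs: map[string]int64{}}
        m.run([]string{
            "mov 2, rax",
            "mov 3, rbx",
            "push rax",     // stack: [2]
            "add rbx, rax", // rax = 5
            "pop rcx",      // rcx = 2
        })
        fmt.Println(m.regs["rax"], m.regs["rcx"]) // 5 2
    }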

I'll be writing up the process in a series on x86/amd64 emulator basics.

[0] https://www.goodreads.com/book/show/2558730.Digital_Design_a...

[1] https://github.com/eatonphil/x86e

gilbetron 5 years ago

(Way) back in college, we had been learning high-level programming concepts (UI, OS, Compilers, Algorithms, etc), underlying math behind computing, and physics up through circuits. My favorite course was Computer Architecture which was set up to pull everything together, mostly through a series of simulated computer projects, until, at the end, you realized you knew how a computer worked from math, through physics, and up to what the person interacted with. One of my favorite educational experiences ever. The Professor (Yale Patt) was a really interesting guy, great storyteller, and still actively worked with Intel. I also thought he was really friendly, turns out he just liked talking to my girlfriend at the time and I was a tag-along ;)

  • sanderjd 5 years ago

    This was my favorite part of college too. I think it was near the end of a really good networking class I took maybe my junior year when I distinctly remember chatting on AIM and thinking about how I could visualize approximately how my key presses were going from the hardware through the OS into memory into instructions on the CPU executing the application back out onto my screen and across the network to a server and on to my friend's machine for the reverse trip. It was a powerful moment. Of course as I've gone further through my career I've realized how far off my approximation was because everything is even more complex in the details, but I still think it was a fundamentally valid and valuable moment.

    This is one reason I'm ambivalent about the skepticism around computer science / engineering programs as a prerequisite for a career in software development. It bums me out to think of people toiling at this work without experiencing that kind of bottom-up knowledge of computing. But I think this is largely a projection of my own personality on others; that would be a bummer for me, but I think many people don't care about any of that and just want to do valuable work for good pay.

  • elamje 5 years ago

    Glad to hear from another survivor of Dr. Patt's class. I took him for intro to computing first semester of college with no programming experience and I got wrecked by his class, but most of the content has stuck with me throughout my career!

ibeckermayer 5 years ago

I'm working on a similar project, implementing the architecture described in this book (https://www.nand2tetris.org).

Ultimately I hope to implement it on an FPGA with attached keyboard and screen and actually use it to compute stuff: https://github.com/ibeckermayer/Nand2TetrisFPGA

  • mikevin 5 years ago

    Nice project. I've finished most of the course (up until the compiler because that's where it became too familiar) and I'm now redoing it on an actual fpga. It's a great learning experience and I can fully recommend keeping at it, it's one of the most rewarding projects I've done. Actually bought a Pynq Z2 last week to have some peripherals to connect and to drive it easily. Might put it up on github too.

    • ibeckermayer 5 years ago

      Nice. I looked around online and it seems like others have attempted something like this but I didn't find any project that looked complete. I have a Basys 3 board which has VGA and USB I/O, so I hope to get from Nand to actually playing Tetris on my homespun platform.

  • djhworld 5 years ago

    Will look forward to following its progress!

hawkjo 5 years ago

Quick shout out to NAND Game, a version of this laid out as a computer game. Delightful, free, web-based.

http://nandgame.com/

  • hathawsh 5 years ago

    That is awesome. I plan to challenge my kids to finish that game this summer.

thrower123 5 years ago

I still think the single best course I took in college was my computer architecture course. We built an entire 8-bit CPU up from individual gates over the course of the semester in Logisim[1], to the point where we could compile a limited version of C down to the machine code for our simulated processor and load it into the simulated main memory and run it.

I'm incredibly glad there was that hands-on component to the course, rather than just the theoretical textbook and lectures learning; it was hard as hell, and I actually ended up dropping it the first time and taking it again later, but at the end I actually felt like I kinda knew what was going on. Pointers were never mysterious again, at least.

[1] http://www.cburch.com/logisim/

nstart 5 years ago

As a layman on this topic, I'm curious about the difficulty around this project:

It seems like a lot of people have gone through similar experiences of building an 8 bit CPU simulator/emulator. Curious why it's not 64 bits. I can guess the answer is "it's hard". I'm just wondering where the difficulty lies. Is it in the standards? The mechanical feasibility of connections? Actual limitations of building a 64 bit emulator on a machine powered by a 64 bit processor?

Apologies if the question seems obvious. I come into this knowing next to nothing and I could probably find the answer with some research of my own. Just wanted to ask the community for thoughts here.

  • djhworld 5 years ago

    For this project? It wouldn't have been difficult to upgrade the machine to 64-bit; it's really just a matter of increasing the bus width and register size.

    But it would slow it down massively, due to me writing it in a general purpose programming language and passing around booleans everywhere. There's a lot of for loops that copy things in and out of buses, and I would have to increase the size of the ALU to accommodate the bigger numbers.
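
    Roughly, the shape of the cost is something like this (a toy sketch, not the actual code): every transfer is a per-wire boolean copy, so widening the bus widens every one of these loops.

        // Toy sketch (not the project's code): buses and registers modelled as
        // arrays of booleans mean every transfer is a per-wire loop, and going
        // from 8 to 64 bits multiplies the work in each of these loops by 8.
        package main

        import "fmt"

        const busWidth = 8 // would become 64

        type Bus struct{ wires [busWidth]bool }

        type Register struct{ bits [busWidth]bool }

        // Set copies the bus onto the register one wire at a time.
        func (r *Register) Set(b *Bus) {
            for i := 0; i < busWidth; i++ {
                r.bits[i] = b.wires[i]
            }
        }

        func main() {
            b := &Bus{}
            b.wires[0] = true // drive bit 0
            r := &Register{}
            r.Set(b)
            fmt.Println(r.bits)
        }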

  • dpe82 5 years ago

    Designing a simple 64 bit CPU is not really more work than a simple 8 bit processor. 64 bit CPUs tend to have more features than just wider registers though.

pjc50 5 years ago

> https://github.com/djhworld/simple-computer/blob/master/cpu/...

The author themselves points out that a "big pile of gates" is not the best approach here, but I'm kind of surprised that at no point did they attempt to build another representation of the circuit and either execute it in a table-driven way, or write a short script that spits out Go for compilation.
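
For what it's worth, the table-driven idea might look roughly like this (a sketch with made-up names, under the assumption that the netlist is stored in topological order):

    // Sketch of the table-driven idea: the circuit is data (a flat netlist in
    // topological order) and one small loop evaluates it, instead of a method
    // call per gate. All names here are made up.
    package main

    import "fmt"

    type Op int

    const (
        NAND Op = iota
        NOT
    )

    // Gate reads signals[A] (and signals[B] for two-input ops) and writes signals[Out].
    type Gate struct {
        Op        Op
        A, B, Out int
    }

    // Evaluate assumes the netlist is already topologically sorted.
    func Evaluate(netlist []Gate, signals []bool) {
        for _, g := range netlist {
            switch g.Op {
            case NAND:
                signals[g.Out] = !(signals[g.A] && signals[g.B])
            case NOT:
                signals[g.Out] = !signals[g.A]
            }
        }
    }

    func main() {
        // Signals 0 and 1 are inputs, 2 is the output: an AND built as NAND + NOT.
        netlist := []Gate{
            {Op: NAND, A: 0, B: 1, Out: 3},
            {Op: NOT, A: 3, Out: 2},
        }
        signals := make([]bool, 4)
        signals[0], signals[1] = true, true
        Evaluate(netlist, signals)
        fmt.Println(signals[2]) // true
    }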

  • gnode 5 years ago

    > but I'm kind of surprised that at no point did they attempt to build another

    Perhaps because the existing implementation had already served its purpose - an educational exercise to understand CPU design, and not an instruction-level emulator or transcompiler.

  • djhworld 5 years ago

    Thanks for the feedback, these are some good ideas if I were to do this again.

    I'm sorry you didn't think I took the best approach.

mar77i 5 years ago

I got this link yesterday or so. I breezed through it until the memory latch: http://nandgame.com/

  • rdc12 5 years ago

      Did you manage to make the latch in the end? I would imagine feedback would be a tricky subject to try to learn without help.

  • djmips 5 years ago

    Try this! It's fun!

anderspitman 5 years ago

I went back to school for CS after 6 years of professional programming. Just for being guided through logic gates, assembly, and implementing a simple CPU in an FPGA I consider it well worth the time and money.

The point when I realized I finally understood (on a surface level) everything from the electrical signals to my JavaScript reminds me of when I finished the last video of Khan Academy's series on Euler's formula and "understood" it, if only for a moment.

baybal2 5 years ago

Especially for people in the line of web development.

Take a look to see how simple the working of a CPU is, and by how many MAGNITUDES the code size grows when any modern "programming paradigm" is involved.

Programs that are brought down to the absolute minimum of arithmetic and logical operations in assembler can often run a thousand times faster than when written in higher level languages.

I remember I was shown how a classical computer science problem called "Sisyphus dilemma" can be done in a single logic instruction, instead of a kilobyte long program in Java that makes the smallest solution possible when no binary operations are allowed.

  • Aromasin 5 years ago

    And that's when you start to fall into the deep rabbit hole that is FPGAs and VHDL/Verilog, all in the pursuit of that need for speed.

  • adrianN 5 years ago

    I don't think you lose three orders of magnitude of performance with a high level language. What a modern C compiler produces with a little help from the programmer should be reasonably close to what you can achieve by hand. Even slow languages like Python are not a thousand times slower than C. Maybe a hundred times.

    • lm28469 5 years ago

      > What a modern C compiler produces with a little help from the programmer should be reasonably close to what you can achieve by hand.

      Unless you're an expert in ASM / have an unlimited amount of time / work on an extreme edge case I'd say the compiler will beat a human 99.9/100

      • dspillett 5 years ago

        I'm not sure I'd go as far as "beat a human 99.9/100", certainly not beat by a significant amount, but definitely "at least equal a human 99.9/100". Of course that 0.1% where the human wins could be large wins, but then you have to consider if they are in positions that actually matter, i.e. tight loops.

  • mhh__ 5 years ago

    1. CPUs are not simple by any means, as evidenced by Spectre and Meltdown

    2. Sure, assembly is faster than JavaScript, but good luck getting any abstraction or type safety in ASM. And also your code is impossible to debug and takes 15 times longer to work on

  • oblio 5 years ago

    > Take a look to see how simple is working of a CPU, and by how many MAGNITUDES does the code size grow when any modern "programming paradigm" is involved.

    Yeah, but you can't scale that to teams of programmers working on complex business logic.

  • pjc50 5 years ago

    > classical computer science problem called "Sisyphus dilemma" can be done in a single logic instruction

    This sounds interesting and I'd like to read about it, but Google isn't helping.

    • baybal2 5 years ago

      https://en.wikipedia.org/wiki/Josephus_problem

      Mistook the name.

      Yep, just rotate the binary representation of the total number of members left by 1 bit

          /**
           * @param n the number of people standing in the circle (e.g. 41)
           * @return the safe position of the person who will survive the execution
           *
           * ~Integer.highestOneBit(n*2)
           *   Multiply n by 2, take the highest set bit and complement it.
           * (n<<1) | 1
           *   Left shift n and set the last bit.
           * ~Integer.highestOneBit(n*2) & ((n<<1) | 1)
           *   Bitwise AND keeps only the bits present in both operands.
           */
          public int getSafePosition(int n) {
              return ~Integer.highestOneBit(n * 2) & ((n << 1) | 1);
          }
    • mhh__ 5 years ago

      I've never heard of it either, but it's worth saying that if it can be done in one instruction, it's entirely possible that GCC or clang will generate it.

lqet 5 years ago

Nice work. Somewhat related to the Megaprocessor by James Newman [0]

[0] http://www.megaprocessor.com/progress.html

  • djhworld 5 years ago

    I watched the Computerphile episode about this on Youtube when I was in the midst of the project, apparently it's at a Computer History museum in Cambridge now, I really should find the time to go!

ixtli 5 years ago

I took a computer architecture course in undergrad and they had us make an ARM emulator similar to this in C++. It was extremely illuminating.

  • djhworld 5 years ago

    Cool.

    I always find that "learn by doing" works best in these sorts of things :)

peter303 5 years ago

That's how Gates and Allen wrote their BASIC interpreter for the Altair microcomputer and started Microsoft. They did not have access to a working 8080 microcomputer, so they simulated one on a Harvard mainframe from a machine language specification. Bill took the interpreter by punch tape to MITS in Albuquerque. And legend has it their BASIC worked only after an hour of fiddling.

  • djhworld 5 years ago

    That's such a cool story haha.

    Not sure if my thing is remotely close to that, but nice to know I'm standing on the shoulders of giants :)

blattimwind 5 years ago

> I don't know how CPUs work

Guess introductory courses in universities aren't useless after all.

madrox 5 years ago

Reminds me of the early days of unmodded Minecraft when someone made a CPU in game with lots and lots of redstone. It really makes otherwise arcane concepts accessible. Super cool!

afpx 5 years ago

This is very cool. It would be so fun to run qemu on it.

https://www.qemu.org/.

  • djhworld 5 years ago

    I think the machine would need to be much _much_ faster, with the stack pointer register + a few extra instructions to even attempt a project like that hahahahaha

arcticbull 5 years ago

I did this too, when I was in college. I implemented a gameboy CPU emulator. Then I wanted to know how FPGAs work so I implemented a Chip-8 CPU on an FPGA. Once I clean it up, I'll throw it up on my GitHub and maybe write up a quick medium post. This is exactly how I learn: I don't know how something works, so I make one.

  • djhworld 5 years ago

    > This is exactly how I learn: I don't know how something works, so I make one.

    Uh huh, the FPGA Chip-8 thing sounds awesome! Definitely post about it!

  • bogomipz 5 years ago

    Please do, this would be an interesting read. Cheers.

floor_ 5 years ago

Making a chip-8 emulator is a reasonable project for those who are interested.

  • merlincorey 5 years ago

    I'm in the final stretches of mine right now, in fact.

    The article describes an 8 bit CPU with 17 instructions.

    CHIP-8 is an 8 bit CPU with 31 instructions, but crucially, it has a large amount of documentation[0] and a nice corpus of ROMs available[1], making it a bit easier as you don't necessarily have to write your own programs for a CPU you don't fully understand yet.

    [0] See http://devernay.free.fr/hacks/chip8/C8TECH10.HTM among many other resources
    [1] For example, this archive of the now defunct chip8 archive site: https://github.com/dmatlack/chip8/tree/master/roms
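
    The core loop is also pleasantly small - roughly this shape (a sketch with most opcode families elided; see [0] for the full instruction set):

        // Rough sketch of a CHIP-8 fetch/decode step (most opcode families elided).
        package main

        import "fmt"

        type Chip8 struct {
            mem [4096]byte
            v   [16]byte // V0-VF registers
            i   uint16   // index register
            pc  uint16   // program counter; programs are loaded at 0x200
        }

        func (c *Chip8) Step() {
            // Every opcode is two bytes, big-endian.
            op := uint16(c.mem[c.pc])<<8 | uint16(c.mem[c.pc+1])
            c.pc += 2

            switch op & 0xF000 {
            case 0x1000: // 1NNN: jump to address NNN
                c.pc = op & 0x0FFF
            case 0x6000: // 6XNN: set VX = NN
                c.v[(op>>8)&0xF] = byte(op)
            case 0x7000: // 7XNN: add NN to VX (no carry flag)
                c.v[(op>>8)&0xF] += byte(op)
            case 0xA000: // ANNN: set I = NNN
                c.i = op & 0x0FFF
            default:
                // ...the other opcode families (draw, keypad, timers, ...) go here
            }
        }

        func main() {
            c := &Chip8{pc: 0x200}
            copy(c.mem[0x200:], []byte{0x60, 0x2A}) // 602A: V0 = 0x2A
            c.Step()
            fmt.Println(c.v[0]) // 42
        }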

    • djhworld 5 years ago

      Just looking at the docs, those 31 instructions would definitely help a lot, along with the stack pointer register.

      I'd definitely recommend doing something like that (not at the logic gate level though - too slow); an emulator is a fun project to tackle. I wrote my Gameboy emulator first, which is a bit more complicated than chip-8, and at the time the documentation was a lot more varied.

nlowell 5 years ago

Implementing a CPU simulation is a classic college exercise, as you can see in these comments. Super valuable based on how memorable it was for everyone! It really helps de-mystify computers.

elamje 5 years ago

Any Yale Patt alums here?

  • kickopotomus 5 years ago

    EE360N Labs all over again. Tangentially, does anyone know what Patt has had to say about Meltdown/Spectre? Speculative execution was/is his bread-and-butter if I remember correctly.

  • GrumpyYoungMan 5 years ago

    Giant beard? Fond of telling people that one of his grad students was the chief architect of the Pentium III? Yeah, I remember his comp. arch. classes fondly.

guhcampos 5 years ago

It's an impressive, yet symptomatic feat.

With more and more people getting into coding and languages of higher and higher levels, less of us are turning to the lower levels of computer science. Lots of developers did not study CS at all and the ones who did mostly neglected the computer architecture classes - myself included.

Sometimes I stop and realize that most of my low-level computing knowledge comes down to mere luck, because I ended up working in the EDA industry for a few years. If I had not, I'd be a computer scientist with close to no understanding of how a computer actually works. We should all know our HDLs.

  • cheschire 5 years ago

    Don't let yourself get too cynical! Development is becoming more accessible, and this is skewing the ratio.

    I bet there are far more low level developers now than 30 years ago, however.

    • baybal2 5 years ago

      > I bet there are far more low level developers now than 30 years ago, however.

      30 years ago every developer at least knew how bits, bytes and logic operations work.

      Now, even minimally proficient C/C++ devs are genuinely hard to find. I may well say that there are fewer of them in total now.

      The C development community shrank a lot over the years.

      Things have gotten so bad that now some people suggest running whole web servers on MICROCONTROLLERS just to blink some LEDs!

      • Klathmon 5 years ago

        > Things went so bad that now some people suggest running whole web servers on MICROCONTROLLERS to just blink some LEDs!

        Why is that such a bad thing? Even microcontrollers are magnitudes more powerful than they were years ago, so why stick with other "simpler" methods when you can go with something more secure, easier to write and understand, and more "standard"?

        I'm one of those people running webservers on microcontrollers (you'll hate this, but I program my esp8266 controllers in JavaScript!), and it seems silly to lament that. They are cheap (about $3 a piece to my door), power efficient (battery life is measured in months for the battery powered devices), and all of my code is about a dozen lines of simple code that allows me to integrate with the rest of my system.

        • keepmesmall 5 years ago

          Why save cycles today if you can borrow tomorrow?

      • rsynnott 5 years ago

        Visual Basic (where many people didn't really know what variables were) existed 28 years ago. And it wasn't the first thing like that, by any means. Even 50 years ago, well, things like COBOL and MUMPS were fairly divorced from the details.

      • contingencies 5 years ago

        Well, I have a whole startup whose flagship hardware is based on web servers on microcontrollers, and our build costs and iteration times are trivial compared to our competitors.

        Microcontrollers: cheap, reliable hardware. Web: cheap, reliable interface. Web programmers: cheap, numerous.

        What's not to love?

        You said yourself: minimally proficient C/C++ devs are genuinely hard to find. HTTP GET with the program. ;)

        • baybal2 5 years ago

          Oh no... Don't tell me you are also on this train...

          Seriously, you want your noodle vending machines to run a web server on an MCUs? I can't wait to see how eval escape will look on a vending machine.

          Well, at least now you know whom to call when they break =D

          • contingencies 5 years ago

            Heh, there's no way we'd expose that kind of interface, it's just internal plumbing. BTW we prefer the term "service location" to highlight the fact that custom personalized meals are cooked direct from fresh ingredients within the machines, not simply 'dispensed' or 'vended', as "vending machines" as a concept has too many negative associations / too much cultural baggage.

      • kevin_thibedeau 5 years ago

        > C development community shrank a lot over the years.

        They're all in China and Korea churning out our cheap consumer hardware.

        Most IoT devices with embedded web servers are running on relatively modest processors. The main barrier to running an embedded networking stack is RAM and that is abundant enough in cheap micros to make it a non-issue.

  • porlw 5 years ago

    What I find fascinating is that Lisp, commonly perceived as too high level to be practical, is actually at its core (in its original implementations!) an incredibly low level language.

    • otakucode 5 years ago

      Really? One of the only things about Lisp and other functional languages that strive to be pure that trips me up is that I could not guess how to write a compiler for one, and especially with ample lazy evaluation I'm really not sure what the execution of a program would end up being in terms of the stream of instructions going through the CPU. When it comes to imperative coding, I at least have a fairly good idea, but with LISP I haven't the slightest idea. Does it create stack frames and do calls when recursing or does it simply 'emulate' it through maintaining its own stack? I could tell you what it's doing in terms of lambda calculus, or what it would be doing on the CPU if I had to write a Lisp 'emulator'... but what the actual Lisp runtime is doing, I've no idea!

      • lispm 5 years ago

        Lisp isn't purely functional.

        There are a bunch of books which explain how to write Lisp compilers. There are a bunch of different strategies.

        > Does it create stack frames and do calls when recursing

        That's what a typical Lisp might do. It might also change the stack frame and just jump to a function...

        It's also relatively easy to check out, since Common Lisp has a built-in disassembler. One can take an implementation, compile a function and see the generated code. Just call the function DISASSEMBLE with a function object...

      • btilly 5 years ago

        If you're interested, Structure and Interpretation of Computer Programs winds up writing a Scheme interpreter in Scheme.

        This is part of a proud and surprisingly long-standing tradition. The first Lisp interpreter was originally written in Lisp by one person, and then hand translated into assembly by another. This came as a surprise to the rest of the lab who had intended to work on an actual implementation in a year or two. You know, some time after they came up with a real syntax for it.

  • masterisks 5 years ago

    >my low level compute knowledge

    But does it really make you better than other, very unlucky programmers, who don't have the "low level computer knowledge"? Does it make you the REAL programmer?

    Because it doesn't. That's THE reason we have higher-level programming languages (and also lower-level ones) in the first place. Is the computer just a black box to you? That's completely fine.

    >computer scientist with close to no understanding of how a computer actually works

    Which is how it's supposed to be.

    Don't be such a gatekeeper.

  • baybal2 5 years ago

    > Lots of developers did not study CS at all

    I do feel the same. A lot of "computer science" degrees are in fact just basic programming ones, and that often extends to masters and, in rare cases, PhD-level degrees.

    It is a common criticism from industry that "computer science does not teach how to code," and I see too many universities seemingly taking it to heart these days.

    It is ironic how computer science was once a road to unemployability while coding jobs were rife, but now companies don't want to hire people without degrees to do menial jobs like webdev.

    • Klathmon 5 years ago

      Can we try to stop this gatekeeping and putting down some jobs as if they are "menial".

      A lot of the webdev work I've done has been significantly more challenging and technically difficult than most of the C work I've done. That's not to say C is "easier", it's not, but it is different.

      I've seen shitty devs in all areas, and I've worked menial jobs both in webdev and at the embedded C level (I had to write C code to display a custom font on a shitty LED display, that was easily the most boring job I've ever done).

      • vectorEQ 5 years ago

        I don't do web stuff because it's horribly frustrating and difficult. C is nice, I love it as it matches the PC platform and you can debug it normally etc. A lot of people like to hate on 'frontend developers' or 'full stack javascript developers' or w/e but you are right. Whatever floats your boat will be 'better' and 'more logical' than whatever sinks your ship. For me that sinking is caused by scripting and CSS. For others that might be C.

DonHopkins 5 years ago

I love how you can actually see the different parts of Guy Steele's Lisp Microprocessor, which he designed in Lynn Conway's legendary 1978 VLSI System Design Course at MIT.

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/MIT78.html

The Great Quux's Lisp Microprocessor is the big one on the left of the second image, and you can see his name "(C) 1978 GUY L STEELE JR" if you zoom in:

"Guy Steele: LISP microprocessor (LISP expression evaluator and associated memory manager; operates directly on LISP expressions stored in memory)."

http://ai.eecs.umich.edu/people/conway/VLSI/InstGuide/MIT78c...

And here is a map of the different parts of the Lisp microprocessor:

https://imgur.com/zwaJMQC

Here is the chalk board where they kept track of all the students' projects and where they would go on the chip:

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Status%20E...

And the layout of the chip with everyone's project (with the big Lisp microprocessor standing out at the lower left):

"The final sanity check before maskmaking: A wall-sized overall check plot made at Xerox PARC from Arpanet-transmitted design files, showing the student design projects merged into multiproject chip set."

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Checkplot%...

This is a photo of one of the wafers they made, with lots of chips:

"One of the wafers just off the HP fab line containing the MIT'78 VLSI design projects: Wafers were then diced into chips, and the chips packaged and wire bonded to specific projects, which were then tested back at M.I.T."

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Wafer%20s....

And here the classic paper about it, "Design of a LISP-based microprocessor" by Guy Lewis Steele, Jr. and Gerald Jay Sussman:

https://dl.acm.org/citation.cfm?id=359031

vectorEQ 5 years ago

Cool project and nice writing style on the blog. Awesome!

ZedIsNotDead 5 years ago

Modern problems require modern solutions.

ameyv 5 years ago

This is a really good resource. Thank you!

gvand 5 years ago

Nice project!

Circuits 5 years ago

Modeling it in code is cool and all but you skipped out on all the fun and enjoyment (pain and suffering) of troubleshooting your own shitty wiring job by not doing this with discrete hardware on a proto board.

  • kabdib 5 years ago

    I started out doing hardware, as you say. Lots of bad soldering (initially). Expensive smoke and shitty chips from Radio Shack that appeared to have had their magic smoke removed prior to being shipped to stores in packaging that definitely was not anti-static. Capacitors that blew up into confetti, diodes that became LEDs for a single, glorious instant. Poking around in the back of old teevee sets and somehow avoiding being electrocuted or thrown against the wall by 30KV waiting patiently in a circuit where the bleed resistor had cracked. Realizing that you're going to need another paper route to afford the buffer chips and board work. Learning that manufacturer data sheets are sometimes full of lies. Redesigning the support circuitry for the CPU to remove just one more chip. Failing a bunch of high school courses because you were writing a BASIC interpreter (in anticipation of working hardware, someday) instead of doing homework.

    Oh, I built a working computer, too. That was fun. But when at the end of that project you've got a processor and some RAM and a display sitting there in front of you, well, you realize that you don't really know what to do with the rig. Now what?

    Then I realized that I could do far more damage in software (at scale) than with hardware (just one-off workbench class disasters). And so . . .

    • djhworld 5 years ago

      > Then I realized that I could do far more damage in software (at scale) than with hardware

      Haha, this was pretty much it for me too. I'd imagine going down the hardware route is a lot of fun with a lot of lessons learnt, but an expensive one.

      I might play around with some hardware stuff next though.

  • djhworld 5 years ago

    Oh yes, I've seen other people do similar things with 8-bit CPUs on a bunch of breadboards which is awesome, maybe I'll attempt that another time!

    I did suffer a different form of pain trying to implement all the gates together in a general-purpose programming language - that was probably a bad idea in hindsight!

    • Circuits 5 years ago

      I have approached it in two different ways. One was with discrete hardware on breadboards. The other was with VHDL and an FPGA. Out of the two I would say the breadboard method was the most satisfying because at the end you can show it off as wizardry to the two people or maybe three people in your life who love you enough to be bothered by it.

  • jimmaswell 5 years ago

    I remember in my college class that went over logic gates and all that, we had to make some circuits in a simulator program. I think we eventually made cycle-based adders in it. I'm pretty sure you could make a full CPU like that too.

tamaharbor 5 years ago

“See all that stuff in there, Homer? That’s why your robot never worked.” - Marge Simpson