I just don’t see a way to justifiably call the Amiga a 16 bit machine. Although the A1000 had some 16 bit hardware paths, a maxed out A3000 definitely wasn’t 16 bit, and they were nearly completely compatible with each other minus newer features.
The Amiga was a full-on 32-bit machine. It’s weird to hear it called anything else.
Many games crashed on the 32-bit-clean A3000, A1200, A600 and A4000 because programmers stashed their own data in the unused upper byte of addresses. (Acorn machines had a similar problem moving from the 26-bit ARM2/ARM3 to later 32-bit ARMs; even RISC OS itself comes in '26-bit' and '32-bit clean' varieties, because the early ARMs kept the status flags in the top bits of the program counter, so software assumed those bits were free to use.)
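As a rough illustration (a toy Python sketch, not real 68000 code, with made-up names and values): packing flags into a pointer's top byte is invisible on a CPU that only drives 24 address lines, but becomes part of the address on a 32-bit-clean machine.

    ADDR_24BIT_MASK = 0x00FFFFFF  # 68000/68EC020: only the low 24 address lines leave the chip

    def tagged_pointer(addr, tag):
        # Stash an 8-bit tag in the "unused" top byte of the pointer.
        return (tag << 24) | (addr & ADDR_24BIT_MASK)

    def as_seen_by_24bit_bus(ptr):
        # A 68000 drops the top byte: the tag never reaches the bus.
        return ptr & ADDR_24BIT_MASK

    def as_seen_by_32bit_bus(ptr):
        # A 68020/030/040 drives all 32 lines: the tag becomes part of the
        # address, so the access lands somewhere wildly wrong (or faults).
        return ptr & 0xFFFFFFFF

    p = tagged_pointer(0x42000, tag=0x7F)
    print(hex(as_seen_by_24bit_bus(p)))  # 0x42000    -> fine on an A500
    print(hex(as_seen_by_32bit_bus(p)))  # 0x7f042000 -> trouble on an A1200/A3000/A4000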
The competition before the Amiga's launch solidly called itself "8-bit". The next generation called itself "16-bit" to hype itself. Later machines touted their "32-bit"ness, and then came the Nintendo 64 and PSX on MIPS processors...
All the hedges you made ("don't look here, look there") can be reversed to emphasize the 16-bitness!
Does this say something about you? Did you come to the Amiga later in its life, e.g. 1991-1993, when 68020s/030s/040s were an option? Or were you there in 1985 when it debuted?
The 68k’s ISA is 32-bit through and through, however the underlying implementation looks. It already was when I bought my A1000, marketed as a 32-bit system, in 1985.
I'm sure there must have been some, but most of Commodore's early Amiga ads didn't mention the number of bits at all, and from looking through old magazines it doesn't seem most vendors did either.
I remember the Amiga always being compared to other "16-bit" machines, like the Apple IIgs, Atari ST, and early Macs.
I also remember the 68000 being referred to as 16/32-bit. Still, from a programmer's perspective the 68000 looked like a 32-bit machine, much as Intel's 386SX looked the same as the 386DX to the programmer despite its narrower external bus.
Commodore and Atari marketed their 68K machines as 16/32-bit, which is I guess technically the most correct. And other 68000-based machines, like the Sega Mega Drive/Genesis, were marketed as 16-bit - it even says it right on top of the unit!
And then the other part of it is the marketing angle: everyone knew full 32-bit inside and out chips were just on the horizon. Downplaying the 68k’s 32-bitness would give them a selling point for the 68020.
That was because the A1200 was the first Amiga to have a 68020 as the native CPU on the motherboard. The 68020 had 32-bit data registers and 32-bit address registers. Earlier Amigas were designed around the 68000, which was instruction-set compatible with the later 680x0 CPUs (backward-compatible supersets). The 68000 also had 32-bit data registers, but only 16 data lines were connected externally, requiring two bus cycles to read or write 32 bits, and its 32-bit address registers didn't have their upper 8 bits connected to external pins, limiting directly addressable RAM to 16MB (24 bits). These compromises let the CPU fit in a 64-pin DIP package, while the standard 68020 came in a 114-pin PGA package and was fully 32-bit both internally and externally.
However, it's confusing because the A1200 had a lower-cost version of the 68020, the 68EC020, which also didn't have the top 8 address lines connected and came in a smaller 100-pin QFP package. So technically it had the same addressable-RAM limit as the 68000 (although it had other instruction set and clock speed improvements).
Prior to the A1200 (1992) there was an earlier Amiga model, the A2500 (1989), which came with a full 68020 CPU, but it was really a 68000-based A2000 with Commodore's A2620 add-on accelerator card pre-installed, so it had both CPUs (although the 68000 sat unused once the accelerator was installed).
As it followed up on our ZX Spectrum and Commodore 64 8-bit home computers.
The 68000's address registers didn't have their upper 8 bits connected to external pins, limiting the directly addressable RAM to 16MB (24-bits). These external width compromises allowed the 68000 to fit in a 64 pin DIP package while the standard 68020, which did connect all 32 data and address lines, came in a 114 pin PGA package. Large packages with more pins were a significant cost while double-pumping data accesses and a 16MB limit on addressable RAM weren't significant issues for most 1980s desktop computers - especially since the 68000's elegantly orthogonal instruction set was so performant in other ways.
Thus, many of us more technically literate fans broadly thought of the 68000 as having 32-bits internally but 16-bit data / 24-bit address width externally. However, that was incorrect because the arithmetic logic unit (ALU) and two arithmetic units were also 16-bit only, generally requiring at least twice as many cycles even for purely internal 32-bit math operations, whereas the 68020 and later CPUs didn't. That's why the 68000 is probably best described as "a hybrid 16/32 bit internal architecture with 16-bit external data width and 24-bit addressing."
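A toy illustration of that "twice as many cycles" point (a sketch in Python, nothing to do with the actual 68000 microcode): a 16-bit ALU has to do a 32-bit add as two halves, low word first, then high word plus the carry.

    def add32_with_16bit_alu(a, b):
        # Low halves first; the carry out feeds the high-half add.
        lo = (a & 0xFFFF) + (b & 0xFFFF)
        carry = lo >> 16
        hi = ((a >> 16) + (b >> 16) + carry) & 0xFFFF
        return (hi << 16) | (lo & 0xFFFF)

    assert add32_with_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000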
It gets even more confusing because some later Amiga models like the A1200 (1992) didn't have a standard 68020 but instead a lower-cost version, the 68EC020, which also didn't have the top 8 address lines connected and came in a smaller 100-pin QFP package. So technically it had the same addressable-RAM limit as the 68000, although it had full 32-bit internal and external data widths, a 32-bit ALU, a 256-byte cache and many other instruction-set and clock-speed improvements common to later 680x0 CPUs. The way a lot of us thought of the 68000's 16/32 architecture as being limited just in its memory addressing is really a more appropriate description of the difference between a full 68020 and a 68EC020. The 68000's ALU being 16-bit is the inarguable smoking gun that makes it incorrect to think "it's really a 32-bit CPU internally", as I used to.
However, that should take nothing away from just how incredible the 68000 was. My first computer had the 68000's little brother, the 6809, which was generally the fastest 8-bit CPU clock-for-clock due to being an 8/16-bit design in much the same way the 68000 was 16/32-bit. While the 6809 was incredibly fast, when I got a 68000-based A1000 in 1985 and programmed it in assembly language, it blew my mind how fast it was. Then in 1988, when I added an A2620 accelerator card to my A2000, its full 68020 with 32-bit internals and direct 32-bit reads/writes to 4MB of RAM was like going from a Ferrari to a Lear Jet. Despite how the 68000 was confusingly marketed and inaccurately described by some media, it was truly a leap forward, but the reality is the 68020 was the first true 32-bit CPU in the line.
I also used that LoRA and some video models to try to make a little movie with the same style[2]
Here's a guide on how to generate LoRAs too if you're interested[3]
Finally, there's a DeluxePaint clone someone released that is pretty cool to play around with[4]
[1]: https://civitai.com/models/875790/amiga-deluxepaint-or-fluxd
[2]: https://www.youtube.com/watch?v=_18NBAbJSqQ&feature=youtu.be
[3]: https://reticulated.net/dailyai/creating-a-flux-dev-lora-ful...
Access Restricted for Australian Visitors: As of March 16, 2026, Civitai is no longer accessible to users in Australia.
This is due to Australia's Age-Restricted Material Codes, registered under the Online Safety Act and enforced by the eSafety Commissioner. These codes require platforms that host user-generated content — including AI-generated imagery — to implement age verification measures such as facial age estimation, digital identity wallets, or photo identification checks before allowing access to age-restricted material. Simple self-declaration of age is no longer considered sufficient. Non-compliance carries civil penalties of up to AUD $49.5 million per breach.
Amiga Graphics Archive - https://news.ycombinator.com/item?id=38431514 - Nov 2023 (20 comments)
Amiga Graphics Archive - https://news.ycombinator.com/item?id=17783531 - Aug 2018 (27 comments)
The Amiga Boing Ball Explained - https://news.ycombinator.com/item?id=12330689 - Aug 2016 (56 comments)
The Amiga Graphics Archive - https://news.ycombinator.com/item?id=10972849 - Jan 2016 (24 comments)
Jim Sachs was one of the early masters. The Wikipedia article about him does not do him justice: https://en.wikipedia.org/wiki/James_D._Sachs
One amazing thing was that even after the Amiga became available, he continued simultaneously making great art on the C-64.
You can see and experience old things, but it's impossible to recreate the context in which they were originally experienced. You can't erase your experience of 40 years of technical progress which makes this sort of thing feel merely quaint in comparison.
It's straightforward to convert HAM to PNG etc.
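For anyone curious what that conversion involves, here's a minimal sketch of the per-scanline HAM6 decode (function names are mine; it assumes the bitplanes have already been de-interleaved into one 6-bit value per pixel, and that palette holds 16 (r, g, b) tuples with 4-bit components):

    def decode_ham6_line(pixels, palette, border_rgb=(0, 0, 0)):
        out = []
        r, g, b = border_rgb          # HAM "holds" the previous pixel's color;
                                      # the first pixel of a line holds from the border
        for p in pixels:
            ctrl, data = p >> 4, p & 0x0F
            if ctrl == 0b00:          # set: load a base color from the 16-entry palette
                r, g, b = palette[data]
            elif ctrl == 0b01:        # hold red and green, modify blue
                b = data
            elif ctrl == 0b10:        # hold green and blue, modify red
                r = data
            else:                     # 0b11: hold red and blue, modify green
                g = data
            out.append((r * 17, g * 17, b * 17))  # expand 4-bit components to 8-bit
        return out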
It would have sometimes been used together with interlaced mode to double the number of lines and that did flicker.
Interlacing might have flickered too, depending on your monitor. (Most monitors Commodore made would flicker in interlace mode, but I believe there were some higher end ones that did not).
Two big reasons. First, it's about running memory chips in parallel to increase bandwidth. Image data was hard to get to the screen fast enough with hardware in that era.
Second it allowed for simple backwards compatibility. Programs were used to writing directly to video memory, and in an EGA card the start of the video memory was valid CGA data. The rest of the colour data was in a separate bit plane.
One case where bitplanes could be faster was high-res bitmapped text. As long as your text was monochrome (all in one bitplane), you could write an 8 pixel wide character with one byte. This was a big deal when it came to scrolling a screen full of bitmapped text.
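A rough sketch of that difference (Python, with made-up buffer layouts, just to show the write counts): in a single bitplane an 8x8 glyph is eight one-byte writes, while a chunky byte-per-pixel buffer needs sixty-four.

    BYTES_PER_ROW = 640 // 8   # one hires bitplane row: 80 bytes for 640 pixels

    def draw_glyph_bitplane(plane, glyph, char_col, char_row):
        """plane: bytearray of one bitplane; glyph: 8 bytes, one per scanline."""
        for y in range(8):
            # one byte write covers 8 horizontal pixels
            plane[(char_row * 8 + y) * BYTES_PER_ROW + char_col] = glyph[y]

    def draw_glyph_chunky(buf, glyph, char_col, char_row, fg, bg, width=640):
        """buf: bytearray with one byte per pixel -- 8x as many writes."""
        for y in range(8):
            bits = glyph[y]
            for x in range(8):
                buf[(char_row * 8 + y) * width + char_col * 8 + x] = \
                    fg if bits & (0x80 >> x) else bg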
Fun fact! The Amiga Workbench is 4 colour hires by default, because hires is impressively businessy... but 8 or 16 colour hires would lock out the CPU most of the time, as the chipset would have to dip into the 68000's even cycle RAM accesses and stall it. 4 colour hires lets the CPU (on a chipmem-only system) run at full speed!
http://amigadev.elowar.com/read/ADCD_2.1/Hardware_Manual_gui...
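Back-of-envelope numbers for the claim above (a rough Python sketch of the Hardware Reference Manual's DMA model: roughly one chip-RAM slot per color clock, with the 68000 only ever using every other slot anyway):

    def bitplane_fetches_per_line(pixels_wide, bitplanes):
        return (pixels_wide // 16) * bitplanes     # one 16-pixel word fetch per plane

    display_slots = 640 // 4        # hires: 4 pixels per color clock -> ~160 slots of display
    cpu_share = display_slots // 2  # the 68000's nominal half of the bus

    for planes, colors in ((2, 4), (3, 8), (4, 16)):
        fetches = bitplane_fetches_per_line(640, planes)
        stolen_from_cpu = max(0, fetches - cpu_share)
        print(f"hires {colors}-colour ({planes} planes): {fetches} fetches, "
              f"~{cpu_share - stolen_from_cpu} of the CPU's {cpu_share} slots left")
    # 4 colours: 80 fetches fit in the chipset's half -> CPU runs at full speed
    # 16 colours: 160 fetches need every slot -> CPU stalls on any chip-RAM access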
I think a key aspect of the magic is that the technical constraints forced art to be representational instead of photo-realistic. There just weren't enough pixels or colors, so artists had to make intentional choices about where to focus their limited pixels and palette to imply the detail they couldn't fully draw, and that made their images evocative in ways photo-realism usually isn't. Earlier digital graphics, with 4 to 16 colors and resolutions around 160 x 120, were generally 'moving icons', as seen in arcade games like Pac-Man, Donkey Kong and Galaga and in most late-70s and early-80s home computers (Apple II, Atari 400/800, C64, etc). Of course, this wasn't just due to pixel and palette limitations but also to the 8-bit CPUs at sub-4 MHz clock speeds and limited memory (usually 8K to 32K game size).
It wasn't until around the mid-80s, when arcade and personal-computer hardware with 16-bit CPUs at 8 MHz+ and 256K of memory arrived, that we hit the magic middle ground we now see as unique to that era of computer and arcade graphics. By the mid-90s it was already starting to vanish, as palettes grew beyond 256 colors and resolutions exceeded 15 kHz analog video (roughly 240 lines high). A great example of the peak visuals possible from the painstaking care and artistic virtuosity of this era is the incredible hand-drawn sprites of "Street Fighter II": https://fabiensanglard.net/sf2_sheets/index.html
The other reason I think so many of us see the art style of this era as uniquely special is that it ended suddenly, with a huge leap to deep color palettes, higher resolutions and 3D-rendered graphics. This happened due to the unique nature of 15 kHz analog video and the desire to avoid interlace flicker, which caused resolutions for most consumer-priced computers and game consoles to max out in the mid-80s at less than 240 vertical lines. Since artists generally want to work in roughly square pixels, this limited horizontal resolution to around 320. So, for nearly a decade, the benefits of using the televisions consumers already had capped the visual output of home computers and game consoles at 240 lines. It even froze the evolution of most arcade machines, thanks to the cost savings of using CRTs made for TVs. Even one of the last 2D arcade hardware platforms, Capcom's 1996 CPS III, was limited to 384 x 224 resolution.

After this unprecedented 'hold' of nearly ten years on the march of pixel progress, the next increment most consumers saw was a huge and seemingly sudden leap - a doubling of vertical and horizontal resolutions and a jump from 4- and 8-bit palettes (16 to 256 colors) straight to 16-bit color (65,536 colors). And this happened at almost the same moment the rush to 3D-rendered graphics killed any interest in hand-drawn pixels. In just a few years, virtually all the computer and game pixels consumers saw changed dramatically in both scope and style, creating a clear divide between hand-drawn 2D pixel art at analog resolutions and everything that came after.
Fun memory: I was with my best friend at another friend's place and his father called him to do some chore. He had to quickly mow the small lawn or something like that. So we decided to prank him: I don't remember all the details but basically we launched Deluxe Paint and simulated an Amiga "guru meditation" using a font that wasn't even correct (I think because we were in 320x256 while the real guru meditation was using a mode with smaller pixels). Then in broken english we wrote something like this:
"Hardware failure. If you reboot or turn off your computer it is going to broke forever"
We then set up color cycling between red and black on one of the colors and put the drawing software in "full screen".
When our friend came back, we played dumb and said we had no idea what had happened, but that apparently we really shouldn't turn the computer off. We managed to hold it for something like ten minutes while he thought his computer was done for good, but we were dying inside.
All three of us remember that prank to this day.
https://en.wikipedia.org/wiki/Guru_Meditation
P.S.: as a side note, with the help of Claude Code CLI / Sonnet 4.6 I managed to recompile a 30+ year old game I wrote for DOS in the early 90s (for which I still have the source files and assets, but not the tooling). I was using a converter (which I wrote back then) to convert files between the .LBM format and a "tweaked" (320x200 / 4 planes) DOS mode I was using for the game (which allowed double-buffering without tearing). I don't remember the details, but I take it that if we had .LBM picture files, the artist and I were using Deluxe Paint on the Amiga.
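For anyone who hasn't poked at .LBM files: they're IFF containers, and the first job of a converter like that is just walking the chunks to find BMHD (dimensions/planes), CMAP (palette) and BODY (row-interleaved bitplane data). Here's a minimal Python sketch following the published IFF ILBM layout; the file name is hypothetical, and the actual plane de-interleaving / ByteRun1 decompression isn't shown:

    import struct

    def read_ilbm_chunks(path):
        with open(path, "rb") as f:
            form, size, kind = struct.unpack(">4sI4s", f.read(12))
            assert form == b"FORM" and kind == b"ILBM"
            chunks = {}
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                cid, clen = struct.unpack(">4sI", header)
                chunks[cid] = f.read(clen)
                if clen & 1:          # chunks are padded to even length
                    f.read(1)
        return chunks

    chunks = read_ilbm_chunks("picture.lbm")   # hypothetical file name
    w, h, _x, _y, planes, _mask, comp = struct.unpack(">HHhhBBB", chunks[b"BMHD"][:11])
    cmap = chunks[b"CMAP"]
    palette = [tuple(cmap[i:i + 3]) for i in range(0, len(cmap), 3)]
    print(w, h, planes, "planes, compression", comp, "(1 = ByteRun1)")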
In terms of colors, the most popular VGA modes (320x200 or 320x240, 256 colors from an 18-bit palette) are superior to the most popular Amiga graphics modes (320x200 or 320x256, 32 colors from a 12-bit palette).
But somehow Amiga graphics is still often nicer.
Now for the shameless plug... My game's protagonist is an Amiga fan and the Amiga has a little cameo in it: https://store.steampowered.com/app/3040110/Outsider/
I remember seeing a PC one of the rich kids brought to boarding school in 1990 and realising it was just crisper than my A500. The PCs in the school lab were all green- or orange-screen machines with single-colour CGA, so this was quite a surprise. Still took me some time to accept reality :)
I tried playing my old games and software on modern TVs and monitors, but something was "missing"; it didn't feel right.
Sure enough, the halo and color bleed were leveraged by the great designers of the day. The sprites and font characters _require_ the "glow" to be experienced as they were designed. It goes beyond simple nostalgia.
I finally broke down and bought a gorgeous Zenith Space Command TV and hacked it for various modern input sources (composite, S-Video, VGA and even HDMI). It just brings back the joy that was missing.
Technology advanced much more rapidly in those days. Similar to how hard drive capacity seemed to double every six months for a while, or how there's a new bleeding edge AI model every three months today.
Also, VGA had 256 colors. The Amiga had 4,096 simultaneously.
Of course, in 1987 a Macintosh II with a fully expanded "Toby" framebuffer could not only do 256 colours, it could do it in 640x480 mode, whereas a PS/2's VGA could only do 16 colours at that resolution. And an Amiga could only do flickervision at that res.
Of course, with technology improving all the time, not having an updated chipset circa 1987 that could at least do progressive-scan 640x480(ish) is one of those things that really killed the Amiga's chances as a serious computer. They only got that circa 1990, by which point "Super VGA" was already becoming a thing in the PC world (and Microsoft had just about got round to making a version of Windows that didn't suck). I'm not sure if the mythical Ranger chipset had a progressive mode, but it does show how Commodore's inability to keep the custom chips updated in a timely manner slowly sank the system...
If cost is no issue, the PS/2 also had the 8514/A card that could do 256 colours at 1024x768. And there was also the PGC from 1984 that could do 256 colours at 640x480.
I guess you weren't there.
That's the highly special hold-and-modify mode (https://en.wikipedia.org/wiki/Hold-And-Modify). I tried pretty hard to word my comment fairly, remembering the sometimes legendary tenacity of Amiga fans. (Which nowadays includes yours truly.)
Not sure when VGA would have been considered mainstream... 1989 maybe? Mac LC was 1990, so probably before that.
VGA and Mac color were better for most practical things. Square pixels and far fewer resolution/flicker/color tradeoffs.
It was announced on July 23rd, 1985 at Lincoln Center, with Andy Warhol painting Blondie (Debbie Harry) live on stage (a demo which was re-created at the Computer History Museum this past summer as part of the Amiga 40th celebration). But you're right, it wasn't commonly available until late fall. I managed to get mine at the end of November.