Ask HN: Why did consumer 3D printing take so long to be invented?

74 points by superconduct123 a day ago

"Today I saw an old paper printer, it has all the pieces for 3D printing except the plastic. Computer sends it data, it moves in 1D to print the image on paper using ink. A 3D printer is just 3 of these moving pieces from a paper printer with a melting plastic and thin pipe and software to connect it all.

All the pieces existed to make a working 3D printer existed even in 1970! and relatively cheaply. So why has it taken so long for [at home] 3D printing to actually become a thing?

Is it because of the internet somehow? Did just no one care in the 1970-2010s? Like there aren't even prototypes from 1970 from garage hobbyists for 3D printing.

What was wrong?!"

- omgsoftcats

cityofdelusion a day ago

The pieces did NOT exist in the 1970s. Fast microcontrollers, stepper motors, precision miniaturized manufacturing, reliable and cheap miniaturized DC electronics, and far, far more were non-existent at any kind of affordable price point. Look at kitchen appliances or metal/wood shop machinery from this era: still heavily analog, mostly made from sheet steel, mostly non-computerized. The 80s would bring better microprocessors, but even the simple Nintendo was an inflation-adjusted $450. For comparison, the first RepRaps used a full-power PC as their host machine, their materials cost roughly $1000 in today's USD, and they needed parts from a commercial Stratasys machine.

Some of the greatest and most under appreciated technological achievements in the last 40 years have been in materials science and miniaturization.

  • kragen a day ago

    Microcontrollers and stepper motors were already controlling 2-D printers in the 01970s and early 01980s at higher speeds and similar powers to currently popular 3-D printers. The first RepRaps did not "use a full-power PC as their host machine"; they were controlled by AVRs, just like Arduino (which didn't exist yet). Generating motor control waveforms on a full-power PC is a nightmare nowadays, and was already a nightmare in 02005. The LinuxCNC people would do it by running Linux under a real-time hypervisor. The first CNC machining was done in the 01950s with IBM 704s, comparable in power to an Intel 8008. The 6502 used by the Commodore 64 to drive its floppy drive would have worked fine, though it is much slower than the AVR and doesn't have built-in PWM generation hardware.

    I agree that the pieces did not exist in the 01970s, but the missing pieces weren't the computation.
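
    To put numbers on it: keeping a stepper axis fed is well within an 8-bit micro's budget. A rough sketch in C (the steps/mm and speed figures below are typical assumptions, not measurements from any particular machine):

        #include <stdio.h>

        int main(void) {
            /* Assumptions: 20-tooth GT2 belt (40 mm/rev) on a 200
               full-step motor gives 5 full steps/mm; 50 mm/s feed. */
            double steps_per_mm = 5.0;
            double feed_mm_s = 50.0;
            double step_hz = steps_per_mm * feed_mm_s; /* 250 steps/s */
            /* A 1 MHz 6502 runs very roughly 300k instructions/s. */
            printf("step rate: %.0f steps/s\n", step_hz);
            printf("instructions available per step: %.0f\n",
                   300000.0 / step_hz);
            return 0;
        }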

    • actionfromafar a day ago

      Now I'm very tempted to build a 3D printer with an 8-bit home computer and vintage parts.

      • dcminter a day ago

        That sounds like a delightful project! This contemporary book from Usborne demonstrates the point that basic stepper controls from an 8-bit computer were well within hobbyist reach:

        https://drive.google.com/file/d/0Bxv0SsvibDMTZ2tQMmpyOWtsRFk...

        I do think that the limitations of memory (and disk) will require some ingenuity in 3D printing more than the most trivial procedurally defined objects!

        • Suppafly 15 hours ago

          That's a great book: it looks so simple, and then the next thing you know it's teaching you about resistors and soldering and programming, all in like 50 pages of mostly graphics.

      • TuringTourist a day ago

        Make your own 8-bit computer on breadboard a la Ben Eater for bonus points

      • Suppafly 15 hours ago

        My son is into vintage computers, and I'm pretty sure I could make a 3D printer from the old C64s, disk drives, and printers in my basement. The hot end would be the only issue, and I'm pretty sure you could rig one up from a glue gun or soldering iron.

        • stevekemp 13 hours ago

          Floating point maths would be hard, and fitting the model in 64k would be a challenge - but paging could resolve that, or streaming/paging from disk if you assume something like CP/M.

          Logically it's a simple project, though the devil is always in the details!
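
          (Though you could probably dodge floats entirely with 16.16 fixed point; a sketch of the core operation in C, illustrative only - on a Z80 you'd build the wide multiply out of 8/16-bit ops:)

              #include <stdint.h>
              #include <stdio.h>

              typedef int32_t fix16;              /* 16.16 fixed point */
              #define TO_FIX(x) ((fix16)((x) * 65536.0))
              #define TO_DBL(x) ((x) / 65536.0)

              /* Multiply two 16.16 values: widen, multiply, shift. */
              static fix16 fmul(fix16 a, fix16 b) {
                  return (fix16)(((int64_t)a * b) >> 16);
              }

              int main(void) {
                  fix16 x = TO_FIX(3.25), y = TO_FIX(0.4);
                  printf("%f\n", TO_DBL(fmul(x, y)));  /* ~1.3 */
                  return 0;
              }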

          • kragen 11 hours ago

            Floating-point was ubiquitous in the BASIC interpreters on early home computers. It was just slow. CP/M didn't really do paging, although you could read and write 128-byte records to disk. But CP/M disks were typically only 90 kibibytes, or maybe up to a mebibyte for 8-inch floppies. And a lot of hobbyist home computers omitted disk drives because they were too expensive.

            • stevekemp 10 hours ago

              Sure floating point is possible, but if you're thinking of something like a 6502 or Z80 you'd have to implement it yourself - no maths co-processor, or dedicated instructions for it.

              In terms of paging I was thinking of paging in later layers of data from disk, perhaps from a large file, perhaps from a series of files. But CP/M certainly did have support for paging RAM, in 3.x, if your hardware supported it.

              (My own emulator, and the hardware I have running genuine CP/M are all running 2.x so no paged RAM there. Shame!)

              • kragen 10 hours ago

                Yeah, people did do overlays a lot on CP/M.

                If you were using a 6502 or Z80 in the late 70s you wouldn't have to write the floating-point routines; you could just call the ones in the BASIC interpreter you probably had in ROM. They'd still be slow.

                As for paging RAM, do you mean bank switching? The 8085 and Z80 didn't have MMUs, so you couldn't do "paging" in the sense people normally understand it today.

                • stevekemp 8 hours ago

                  Bank switching was the term I should have used, thank-you!

      • kragen 11 hours ago

        I'm very interested to hear about the results if you do. If I'm wrong, you'll be able to show me how!

      • aforwardslash a day ago

        Just buy an old Ender 3, they came with stock 8-bit CPUs :)

      • wilg a day ago

        I'd watch a YouTube video of this

    • tdeck a day ago

      > The first RepRaps did not "use a full-power PC as their host machine"; they were controlled by AVRs, just like Arduino

      It's possible that they were referring to the RepRap Host Software, which was RepRap's original slicer.

      https://reprap.org/wiki/DriverSoftware

      • kragen a day ago

        Oh, I'm sure you're right! Thank you. I didn't realize Skeinforge wasn't the first.

    • inglor_cz a day ago

      An offtopic question: how do you manage to write all the years consistently with a 0 at the start (I know why, but how), when everyone around you uses a different standard?

      If I decided, IDK, to write "cnow" instead of "know", or "J" instead of "I", I wouldn't be able to do so consistently. Not in a world that massively uses the other word.

      Or it would take me twice as much time to double-check everything I typed.

      • Suppafly 15 hours ago

        >An offtopic question: how do you manage to write all the years consistently with 0 at the start (I know why, but how), when everyone around you uses a different standard.

        Never underestimate what lengths someone will go through to appear to be unique.

      • smeej a day ago

        Wait, I can guess at why (somebody's probably still going to be alive in 8k more years), but is that really the reason? In case somebody's gonna care enough to read this comment on HN that long from now? And not realize that of course we didn't bother adding leading zeros? Why not 0001970s in that case?

        (I'm sincerely hoping I'm missing something here and am going to look really silly asking this question.)

        • kragen a day ago

          Go wild, man. I fully support your right to talk about the 0001970s.

          • kojeovo a day ago

            I'd put my money on it flipping back to 0 after 9999

            • Izkata 11 hours ago

              From past evidence, I think it's more likely to start a new calendar at 0 somewhere around 4000-5000.

      • kqr a day ago

        You get into the habit fairly quickly. I type organise rather than organize despite everyone around me not doing that, but it is legitimately difficult for me to do it the popular way.

      • layer8 a day ago

        More importantly, why stop at five digits? That seems to be taking quite a limited outlook, and I can already see the Y100K bugs it will cause.

      • kragen a day ago

        I usually double-check everything I write anyway. I just have to be careful not to "correct" literal quotes that include dates or accidentally talk about the Intel 08008 and the 06502.

        • esperent a day ago

          Why just one leading zero?

  • dcminter a day ago

    It depends how you define "affordable". Daedalus (the late David E.H. Jones), writing in New Scientist in 1974, sketched an idea for a laser-based system and almost immediately received a notice of complaint from an existing patent holder (plus ça change ...), although unlike patent trolls this patent holder had actual objects made by the process.

    The system was based around a minicomputer (or at least a successor patent of 1978 so described it), so we're talking tens of thousands of dollars for the compute involved in that scheme. But that first 1971 patent must have expired in the 90s, by which time inexpensive compute was trivially available to match early-70s minicomputer capabilities.

    Excerpts from the exceptionally excellent book "The Inventions of Daedalus - A Compendium of Plausible Schemes" which is sadly long out of print:

    https://paperstack.com/img/photos/page%2090.jpg

    https://paperstack.com/img/photos/page%2091.jpg

    • nonameiguess 15 hours ago

      This history is fascinating as hell. I looked up the name of the person in that excerpt who claimed the original patent. He's apparently in the Guinness Book of World Records: https://www.guinnessworldrecords.com/world-records/463935-fi...

      I don't know why they have a record for earliest filing of a 3d-printing patent, but this guy (Wyn Kelly Swainson) was an American graduate student of English Literature who filed the patent in Denmark after teaching himself some basic lithography and chemistry from a library after wondering why no technique existed to make copies of sculptures. He ended up doing research for DARPA and founding an engineering company.

      Also, for any random reader who isn't familiar with the cultural history of the west, Daedalus was the mythical designer of the labyrinth of Crete made to contain the minotaur, and also the father of Icarus, who crafted the artificial wings he and his son used to escape Crete after King Minos tried to keep them trapped there after he helped Ariadne help Theseus to kill the minotaur. He was also somewhat responsible for creating the minotaur in the first place, as he built the fake cow costume King Minos' wife used to mate with a bull and birth the minotaur. I guess it became a popular pen name in the early to mid 20th century, because it was also the name (Stephen Dedalus) of James Joyce's self-insert character in A Portrait of the Artist as a Young Man and Ulysses.

  • Suppafly 15 hours ago

    Sure, but they did by the 80s and 90s. I had a CAD/CAM class in the 90s, and the pen plotter we used (not sure which model, but the HP 7090 from the 80s is similar) had all the technology necessary to build a 3D printer. Hell, you could build a 3D printer from essentially first principles using old broken tech and stuff from a hardware store. The first time I saw a 3D printer years ago, my first thought was to wonder why they took so long to happen, because the technology was really basic.

    • kragen 11 hours ago

      You will probably be interested in reading my summary in https://news.ycombinator.com/item?id=42080682 of the things the RepRap project struggled with in the first years of the project; it explains why it took so long.

      • Suppafly 8 hours ago

        >They wasted a lot of time trying to make it work without even a heated bed, to keep costs down.

        I'm not trying to downplay the RepRap team, whose work did lead to some of the innovations in home 3D printing, but I suspect a lot of the project was like this: losing time by ignoring things that already existed and trying to reinvent them using cheap household items, going down unnecessary rabbit holes. Thank god people serious about actually making a shippable product got involved at some point, or it'd all still be theoretical.

        • kragen 7 hours ago

          That's easy to say with the benefit of hindsight! But before going down the rabbit holes, we didn't know which ones were unnecessary. If you didn't go down any of the rabbit holes, you'd end up with a US$60k machine built around US$5k ballscrews and ABS and whatnot.

          The point at which people serious about making a shippable product got involved was already after it was no longer theoretical.

  • euroderf 11 hours ago

    > Look at kitchen appliances or metal/wood shop machinery from this era, still heavily analog, mostly made from sheet steel, mostly non-computerized.

    Yup I miss that too. Maybe all the appliances came only in avocado green or harvest gold, but they held together and did not spy.

  • hagbard_c a day ago

    Given that it is possible to make a 3D printer out of old dot matrix printer parts, and given that dot matrix printers were introduced in the '70s (Centronics [1] claims its Model 101, launched in 1970, was the first impact dot matrix printer), it would have been possible to create 3D printers in the '70s. But while the hardware and controlling firmware could have been built back then, they would have been quite useless without the software needed to create 3D models to print on the device.

    [1] https://en.wikipedia.org/wiki/Centronics

    • kragen a day ago

      However, the Centronics interface was designed to allow them to build the printer without including a microcontroller in it. (Microprocessors of any kind were several years off.) It wasn't until the mid to late 01970s that it became common to include microcontrollers in printers.

      As for the question of software for 3D models, see my overview of the history of that field in https://news.ycombinator.com/item?id=42080437.

      • simne 14 hours ago

        Just for info, the Centronics interface is very slow for a 3D printer. It could work (as it did for 2D plotters), but printing figures would be slow, and 3D makes things even worse.

        • kragen 11 hours ago

          3-D printers are much lower bandwidth than conventional 2-D printers, and the conventional Centronics parallel port could handle up to "75,000 characters per second", according to https://en.wikipedia.org/wiki/Parallel_port#Centronics. The total amount of data in a print is larger, but typically the 3-D printer is only processing 1–10 commands per second, compared to about 100 for a dot-matrix printer.

          But the point I was trying to make was that dot-matrix printers predated the availability of microcontrollers, or even microprocessors, and you needed cheap microprocessors to make hobbyist 3-D printers.
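
          To make the bandwidth margin concrete (the ~30-byte average G-code line is an assumption, not a measured figure):

              #include <stdio.h>

              int main(void) {
                  double port_bytes_s = 75000.0; /* Centronics, per above */
                  double cmds_s = 10.0;          /* generous for a 3-D printer */
                  double bytes_per_cmd = 30.0;   /* assumed G-code line length */
                  printf("needed %.0f B/s of %.0f B/s available (%.2f%%)\n",
                         cmds_s * bytes_per_cmd, port_bytes_s,
                         100.0 * cmds_s * bytes_per_cmd / port_bytes_s);
                  return 0;
              }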

          • simne 9 hours ago

            3D FDM printers need much higher bandwidth, because their controller needs to generate PWM for motor control, among other things.

            Yes, one could make a 3D printer without PWM, but it would be extremely slow, or even unable to do some things.

            With modern microcontrollers this problem is solved by using the interface only for high-level commands (all low-level control is performed by the microcontroller), but early machines used the computer as the controller and had to deal with this.

            I have some experience with modern FDM and SLA, and I've seen many cases where FDM was severely limited by the microcontroller's PWM range.

            • kragen 7 hours ago

              Right, you need a microcontroller in the printer to avoid having to send PWM signals over your parallel port. But they don't have to be modern microcontrollers. An 8051 or Z80 would be fine, maybe with some external chips for PWM generation.
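
              (And one kind of PWM - heater control - is slow anyway: a few hertz is plenty for a heater cartridge's thermal time constant, so even bit-banged PWM from a timer tick would do. An illustrative sketch, not any real firmware:)

                  #include <stdint.h>
                  #include <stdio.h>

                  static uint8_t duty = 128;   /* 0..255, here 50% */
                  static uint8_t phase;

                  /* Stub standing in for a port write on a real part. */
                  static void set_heater(int on) { printf("%d", on); }

                  /* Called from a periodic timer interrupt; at a 1 kHz
                     tick the PWM period is ~4 Hz. */
                  static void timer_tick(void) {
                      phase++;                 /* wraps every 256 ticks */
                      set_heater(phase < duty);
                  }

                  int main(void) {             /* simulate one period */
                      for (int i = 0; i < 256; i++) timer_tick();
                      printf("\n");
                      return 0;
                  }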

              • simne 6 hours ago

                > An 8051 or Z80 would be fine

                Maybe. The problem is the definition of G-code, which needs much more than 8 bits (as I remember, somewhere between 12 and 18 bits if you consider just the integer numbers, and diagonal lines and arcs, for example, are calculated with floats). Yes, I know a digital 8-bit CPU could calculate 32-bit or even 64-bit floats, but it is slow and needs additional RAM. All these calculations are much easier to do on a 32-bit computer.

                > some external chips for PWM

                Additional chips are bad for the economics by definition, because the PCB grows and additional pins mean additional costs. That is why the first designs were dumb, without any controller at all and driven by the computer: to avoid additional chips.

                In modern designs microcontrollers are used for convenience, so the printer can now run without a working computer (for example, from a file on flash).

                • kragen 6 hours ago

                  I agree that 32-bit chips (and decent instruction sets) make everything much easier.

                  You wouldn't necessarily have to interpret G-code on the 8-bit microcontroller itself, although it's about the same difficulty as interpreting BASIC on it. Keep in mind that keeping the motors of a 3-D printer busy only requires a few speed changes per second, maybe 10 at most. By contrast, the 8051 in an Epson MX-80 printed about 80 characters per second and had to actuate the 9 hammers in the print head with a new set of voltages 9 times per character, for a total of about 700 deadlines per second.

                  When Don Lancaster was trying to figure out how to build 3-D printers and other "flutterwumpers" in the 01990s, his idea was to use a bigger computer to rasterize curves into a set of one-byte step commands that would be interpreted by an 8-bit microcontroller, for example in https://www.tinaja.com/glib/muse140.pdf, as I mentioned in https://news.ycombinator.com/item?id=42080682. His first explanation of this concept may have been his July 01993 "Hardware Hacker" column in Electronics Now (previously in Radio Electronics and Modern Electronics) https://www.tinaja.com/glib/hack66.pdf where he's exploring the possibilities opened up by Parallax's BASIC Stamp:

                  > One potential use for the BASIC Stamp is shown in figure four. I’ve been doing a lot of work with the stupendously great PostScript general purpose computer language. In fact, this is the only language I use for all of my electronic design, pc layouts, stock market analysis, schematics, Book-on-demand publishing, and just about everything else.

                  > All the camera ready figures you have seen here in Hardware Hacker for years have been done by using nothing but my word processor and PostScript. Device independently.

                  > The only little problem has been that PostScript I/O tends to end up a tad on the skimpy side. Usually you only have three choices: Dirtying up otherwise clean sheets of paper or plastic; writing files to the hard disk; or returning your data back to a host for recording or other reuse.

                  > The BASIC Stamp can instantly let you extend the genuine Adobe Level II PostScript to any personal project or machine of your choosing!

                  > Assume you’ve got a homebrew machine that has an x-axis and y-axis stepper, an up/down mechanism, and a "both steppers home" sensor. This can be a vinyl signcutter, engraving, or embroidery setup. Or an automated printed circuit drill, a wooden sign router, or a Santa Claus machine.

                  ["Santa Claus machine" was Theodore Taylor's 01978 term for a 3-D printer; it's what Lancaster consistently called them in his columns over the years.]

                  > You could use two of your BASIC Stamp lines for RS423 serial comm with your PostScript printer. Use two lines for both x-axis stepper phases. And two lines for those y-axis stepper phases. One line for pen or drill or whatever up/down. And a final line that zeros only when both steppers are in their home position.

                  > The hidden beauty here is that all of those fancier PostScript fonts and the level 2 tools immediately become available for use on your own custom homebrew rig. At unbelievably low cost. With zero royalties!

                  Normally when people do diagonal lines and arcs on computers without FPUs, they don't use floats; they often don't even use integer multiplies per step, but rather strength-reduced algorithms like Bresenham's algorithm and the DDA algorithm Bowyer talks a bit about in https://3dprintingindustry.com/news/interview-dr-adrian-bowy....
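
                  For instance, here's the classic integer-only line walk (a sketch in C; on an 8-bit machine you'd emit a step pulse where this prints one):

                      #include <stdio.h>
                      #include <stdlib.h>

                      /* Bresenham-style line from (0,0) to (dx,dy):
                         additions and comparisons only - no floats,
                         no multiplies per step. */
                      static void line_steps(int dx, int dy) {
                          int sx = dx >= 0 ? 1 : -1, sy = dy >= 0 ? 1 : -1;
                          int ax = abs(dx), ay = abs(dy);
                          int x = 0, y = 0, err = ax - ay;
                          while (x != dx || y != dy) {
                              int e2 = 2 * err;
                              if (e2 > -ay) {
                                  err -= ay; x += sx;
                                  printf("STEP X%+d\n", sx);
                              }
                              if (e2 < ax) {
                                  err += ax; y += sy;
                                  printf("STEP Y%+d\n", sy);
                              }
                          }
                      }

                      int main(void) { line_steps(8, 5); return 0; }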

                  I agree with you that additional chips are expensive, but in the overall scheme of things, a few 555s to drive your motors aren't going to make the difference between feasibility and infeasibility.

                  So I think it's clear that we could have done hobby 3-D printers 40 years ago. What similar opportunities are we missing today?

                  • simne 4 hours ago

                    > What similar opportunities are we missing today?

                    Something you have not done yourself, or something you have not paid someone else to do.

                    Life is complicated; it is very typical for an opportunity to appear and for nobody to use it, because using an opportunity needs 3 things:

                    1. The will. 2. Enough qualification (or enough IQ and time to learn). 3. Free (!!!) resources or cheap borrowed funds.

                    If even one thing from this list is missing, it becomes extremely hard to use the opportunity. For example, I live in Ukraine, and even without the war, regulations in the country are so prohibitive that many services do not enter the country, or enter with severe limitations; PayPal, for example, entered with only private accounts, and business accounts are not available in Ukraine.

                    Yes, you understand right: an idea costs nothing, but the ability to implement ideas is worth billions if it finds a fertile environment.

                    And yes, I have spent a lot of time finding ideas, and I could share high-level information with you on how to find (or generate) ideas. Just let me know if this theme interests you.

                  • simne 4 hours ago

                    > a few 555s to drive your motors aren't going to make the difference between feasibility and infeasibility

                    It depends on the current market of the niche in question. In some cases the market price is so low that every additional hole or trace on the PCB could kill the economics; in others, you could include something like an SGI or a Cray as the controller (that is, something severely overpriced) and it would still be profitable.

    • dragonwriter 21 hours ago

      > Given that it is possible to make a 3D printer out of old dot matrix printer parts

      Using no other parts, at least none others that would not also have been available in the 1970s?

      • tdeck 20 hours ago

        Possibly? This guy made a 3D printer out of wood and unipolar stepper motors salvaged from old typewriters:

        https://m.youtube.com/watch?v=yz_DKXIuL8U

        However, getting the right thermoplastic filament would be a major challenge, and slicing would require a high-end mainframe with graphical terminals. Everything would be prohibitively slow. Also, regular 2D printers were much more expensive back then.

sottol a day ago

I think it's probably correct that few people *really* cared until the Reprap project showed a real DIY 3D printer was possible.

Early machines were industrial machines with huge price tags, proper linear motion systems, complicated extrusion systems, and so on. There is a bit of a mental leap from seeing a $100k+ machine to dreaming of designing something that can be built for $200-500.

The problems were: no "reference designs", no tried-and-true go-to mechanical parts (like cheap Chinese linear motion rails), no ready-made extruders (they were DIYed!) or heated beds (early ones were just PCBs), and so on. IMO it just took someone to get this rolling, and that may well have taken 30 years.

I think RepRap was first publicly shown around 2005. From then on it was taken up by more and more makers and refined. It culminated in the early-2010s hype around MakerBots and their contemporaries, but they still cost >$1500 and were far from set-and-forget appliances: maybe 50% reliable, and slow. We had one at work and I was fascinated, but it printed at 5-20 mm/s, so parts would take forever and often fail due to bed adhesion, clogs, ...

The last 10-15 years have seen the popularization of 3D printers through the Prusa i3 and its clones (Ender and other cartesians <$300) and steady refinement of reliability through better materials. The last ~5 years or so have significantly bumped up speeds through better linear motion components, motion systems, and input shaping, plus firmware and ecosystems like Klipper.

Bambu IMO got in at just the right time and refined everything that had accumulated up to that point into a solid appliance. Their genius was more in the industrial design, the reliability, and affordable manufacturing than anything else.

  • TaylorAlexander a day ago

    A few notes: the patents were a huge impediment. They did not expire till 2008; before then, printers cost $25k. They were expensive, and Stratasys had no incentive to make them $300 like they are today, so volumes were limited. From their release in 1995 until 2008, Stratasys sold something like 13,000 printers, according to their company history web page. For a long time now, Prusa has shipped more printers than that in a single month. The RepRap Darwin wasn't built until late 2007; then MakerBot and others followed. I bought an Ultimaker in 2011, and that's when it feels to me like home printing started to become viable.

    Once anyone could build their own design, a community of hackers and engineers formed that continuously improved the designs with diverse ideas and experiments. That community is what made 3D printing what it is today. And it was illegal for them to do all of that (in many countries) until the patent expiration in 2008. That’s a big reason why it took so long. I think it’s interesting to consider whether this would have happened sooner if they had never been patented, though perhaps the expired patent created a legal safe haven where no one could take away the basic principles by patenting them. Anyway, patents play a big role in this story!

    Edit: Some cool history here: https://3dprintingindustry.com/news/interview-dr-adrian-bowy...

    • kragen 11 hours ago

      Note that Bowyer doesn't mention the patents at all in this interview, so I question your assertion that the patents were a huge impediment to him. Possibly they dissuaded other people from starting a RepRap-like project years earlier, but I think the vast majority of such people didn't even imagine that an affordable 3-D printer was possible.

  • grogenaut a day ago

    There's also an economy of scale going on that snowballed a ton. In 2014 I got my first printer off Craigslist. The BOM was like $1200. Steppers were like $80 each, a RAMPS board without stepper drivers was $80-120, the drivers were $20-40 each, and the heated bed was a huge PCB, so also not cheap. Cheap PCB houses were nascent at the time. Arduinos were $50+. Power supplies $40-150.

    Now you can get the steppers from Amazon for $8, and a control board with stepper drivers and a built-in 32-bit MCU for $20. At scale, if you're building a lot of them, those parts are going to be even cheaper, maybe even by another order of magnitude. For a while it was difficult to even comprehend how much cheaper stuff was getting and what that lets you do. And then you see a resin printer for $140 and realize it's a cast-off screen, one stepper, and some extruded parts.

  • jdietrich a day ago

    The widespread commercial availability of polylactic acid was also a significant factor. It's one of the few plastics that can reliably print on an open-framed printer. The cheap i3 and Ender clones just wouldn't have taken off if we were stuck with ABS.

    • Doxin 13 hours ago

      I think PETG would've also worked. Keep in mind neither of those plastics were rare at the time. The filament form factor was essentially nonexistent though, which makes the whole 3d printing business a lot trickier.

      • kragen 10 hours ago

        It depends on what you mean by "rare". Both PETG and PLA were commercially available when RepRap started, but they weren't very widely used. PLA's biggest use at the time was reabsorbable medical implants, for example. I don't know what people used PETG for in 02005. I'm not sure either of them existed in the 01970s.

    • numpad0 10 hours ago

      IIUC, airsoft "biodegradable" bullets were always PLA. They were bought in kilos and expended in hours.

  • johnny_canuck a day ago

    I'm curious how much better the printers are these days as I'd love to get another one.

    I had a Wanhao Duplicator i3 in 2015 and found it required a lot of tinkering and calibration every time I wanted to use it. I ended up selling it as it was so time consuming to get everything correctly set up that it just killed any interest I had in it.

    • nameless912 a day ago

      Nowadays spending 500-1000 USD on a machine gets you something that can print almost anything (within size limitations) out of the box with no calibration. The BambuLab printers are nothing short of extraordinary for quite reasonable prices (the A1 mini is only like 300 bucks, but it's small). And their software stack is good enough that 3D printing is roughly as easy as 2D printing (with the same caveats that occasionally your machine will jam and you'll want to chuck it out a window).

    • ben1040 a day ago

      I had a Makerbot 2X in 2014 and it required constant janitoring every time I wanted to print.

      I built a Prusa MK4 this spring; it calibrated itself and printed a great looking piece right from the get-go. The difference is night and day.

    • criddell a day ago

      Check out Bambu Lab printers if you want something more modern. They have several models available for less than $1000 and give good results with little messing around.

      I say this as someone who doesn’t want 3d-printing as a hobby. It’s just a tool I want to occasionally use in order to get something else done and the less time I have to spend tramming and calibrating, the better.

    • ok_dad a day ago

      I bought the cheapest printer available, the Kobra Go I believe, and with very little tweaking I had it running well enough that I don't care to tweak more. I owned a delta printer ten years ago on which I spent $1200 plus another $600 in custom parts, and it ran about the same as my current one (though it was bigger); my current one cost $150 plus shipping. I don't suggest the cheapest one if you have the money, but I only use mine once a month or less, so it was the perfect price vs. functionality. I did build the old one from a kit and modified nearly every mechanism, though, so I'm relatively confident with these machines. I do suggest the more expensive, better-quality printers if you don't want to tweak stuff much.

    • pfych a day ago

      I had an old Ender 3 kit from 3-4 years ago that was nothing but a hassle. My partner bought a BambuLab A1 & it was insane to see how it "just worked" out of the box. Highly recommend

    • dv35z a day ago

      Check to see if there is a community maker-space in your area, including libraries & universities. One of the benefits of using those machines is that they are well-maintained and frequently used & tuned. Also, you can meet other 3D printing experts who can help you with any project. It's a good vibe, and a great way to get back into making without a large investment...

    • artificialLimbs a day ago

      I got an Ender 3 v3 CoreXY about a month ago. I just about pulled it out of the box and started hitting print. It's almost that easy. It's been printing almost continuously for about a month with very few problems.

  • PaulHoule a day ago

    I remember reading articles in hobby electronics magazines circa 1990 where people were talking about 3-d printing as a nascent technology.

LarsAlereon a day ago

It was invented but protected by patents, so it wasn't until those patents expired that companies and hobbyists started to experiment. Early 3D printers hadn't figured out things like layer adhesion yet so parts tended to be too weak to be very useful. It wasn't clear that this was an area for improvement rather than a fundamental downside of the technology.

  • jerf a day ago

    It's also easy to underestimate how much computation is involved and how little there was in the 70s. The 3D printer slicers generally require several seconds on modern hardware to calculate their final path. The resulting files are in the several megabytes. My Bambu 3D printer is all but festooned with sensors and is constantly adjusting things, but even the simple ones have more to their firmware than meets the eye. Even assuming that they'd use some simpler algorithms and data structures, you're looking at vast computation times, and most likely, vastly simpler objects being printed.

    Even something that seems as simple as "sous vide" cooking, which is basically a heater hooked to a thermocouple, took a lot of little innovations to make practical to hand to the masses.

    And then there's the general improvement in motor speed, precision, and cost, along with any number of advancements here and there and everywhere to make it practical.

    Could someone thrust back to the 1970s and given a fairly substantial budget make some kind of 3D printer? Probably. But it would be slow, extremely expensive, and limited to printing various sizes of plastic bricks and spheres and other very algorithmically simple objects, in rather low quality, without many, many years of further development. I can think of many ways of bodging on various improvements, but they'd all have their own compromises, and none are garden paths to what we now think of as modern 3D printing. (For example, someone could bodge together a machine that offsets a rod on the printer head and, in the offset space, has an object to be "copied", by basically banging into the object with a very crude sensor, so there's no generation of geometry at all. But this would be clumsy, inaccurate, and full of very complex and disheartening limitations.) You're not going to be printing Dwayne "The Rock" Johnson's face embedded into a toilet [1] or anything at 1MB of geometry. It would be commercially useless and inaccessible to hobbyists.

    [1]: https://www.thingiverse.com/thing:6436138/files

    • etrautmann a day ago

      Also 3D modeling software and hardware to run it.

    • silvestrov a day ago

      Air fryers are even simpler than "sous vide cooking". There's no reason they couldn't have been invented in the '50s-'60s, when kitchens had good electrical supply.

      • creaturemachine a day ago

        Air fryers are just convection ovens packaged differently.

        • jdietrich a day ago

          Sort of, but not really. Air fryers have a much faster rate of airflow than any convection oven, which results in significantly faster and more even cooking. The air fryer really is a meaningful technological development.

          • Ekaros 15 hours ago

            Smaller volume, slightly bigger/faster motor. Actually, looking at the fan, there is nothing special about it.

            Temperature control is probably the hardest part. But in general the air fryer could have been made quite a lot earlier. Maybe the material for the basket is another aspect.

          • knowitnone a day ago

            if you consider adding a fan as "meaningful technological development"

            • ozim a day ago

              Well, it is. We had electrical ovens without a fan that were great. We had electrical ovens with a fan that were great. The air fryer has a much more advanced fan, with much more power, than earlier ovens did.

              For me this is how meaningful technological progress happens. It is not like someone wakes up one day and suddenly has a rocket that reaches space and is caught back on landing by mechanical arms.

            • ProllyInfamous 17 hours ago

              I just purchased my first air fryer in summer 2023, and it has changed my cooking life, letting me rarely ever "eat out" without needing much meal prep.

              "Just adding a fan" is actually an extremely meaningful tech development.

      • jerf a day ago

        Oh, there's plenty of technologies that were developed and/or popularized decades after the basic tech stack was available to do it, even under the constraints of "commercially viable". That's an interesting study of its own.

        3D printing is not one of them.

    • kragen a day ago

      It's plausible that our current 3-D printing design workflow has grown up in the computationally intensive way that it did because all that computation was available; it's easier to just use a least-common-denominator but bulky file format like STL than to worry about the incompatibilities that would result from using more expressive files.

      People have been doing CAD/CAM since the 01950s. Boeing started using CNC in 01958 on IBM 704s, and MIT's Servomechanisms Lab (working with the Aircraft Industries Association: https://web.archive.org/web/20090226211027/http://ied.unipr....) sent out CNC ashtrays to newspaper reporters in 01959: https://en.wikipedia.org/wiki/History_of_numerical_control#C.... Pierre Bézier started writing UNISURF in 01968 at Renault, who was using it to design car bodies by 01975. The Utah Teapot was created in 01975, and it consists of nine Bézier patches; you could print the whole dataset on a business card: https://web.archive.org/web/20141120132346/http://www.sjbake...

      The IBM 704 was a vacuum-tube machine that could carry out 12000 floating-point additions per second and had a failure about once every 8 hours https://en.wikipedia.org/wiki/IBM_704. The Intel 8008 (not 8088, not 8080, 8008) that came out in 01972 could carry out over 79000 8-bit integer additions per second, which is about the same speed. But much faster computers were already available, such as the PDP-8, in wide use for real-time control, and they very rapidly became much cheaper. Any computation MIT's Servomechanisms Lab could do in the 50s was doable by hobbyists by the 80s.

      The reason 3-D printers mostly use stepper motors is that they don't require closed-loop feedback control. 2-D printers from the 01970s used stepper motors for the same reason. They were accessible to hobbyists; in the 80s I had a Heathkit printer someone had built from a kit in the 70s.

      If you wanted to print Frank Sinatra's face on a toilet, I think you'd probably want at least a 64×64 heightfield to get a recognizable Sinatra; 256×256 would be better than the line-printer pictures we were doing. 8 bits per heightfield point would be 65 kilobytes, which would fit on the floppy disks we were using at the time. This would have been totally feasible, though digitizing Frank Sinatra would have been a nontrivial project, quite aside from printing him.
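
      (For scale, the whole pipeline from heightfield to tool motion fits in a page of code. A sketch in C, with a synthetic dome standing in for the digitized face; the grid pitch, feed rate, and modern G-code flavor are illustrative assumptions:)

          #include <math.h>
          #include <stdio.h>

          #define N 64                 /* 64x64 heightfield */

          int main(void) {
              double pitch = 1.5;      /* mm between samples */
              printf("G21\nG90\nG1 F300\n");  /* mm, absolute, feed */
              for (int j = 0; j < N; j++)
                  for (int i = 0; i < N; i++) {
                      /* serpentine scan so the head never rewinds */
                      int ii = (j % 2) ? N - 1 - i : i;
                      double dx = ii - N / 2.0, dy = j - N / 2.0;
                      double r2 = 1.0 - (dx*dx + dy*dy) / (N*N/4.0);
                      double z = r2 > 0 ? 10.0 * sqrt(r2) : 0.0;
                      printf("G1 X%.2f Y%.2f Z%.2f\n",
                             ii * pitch, j * pitch, z);
                  }
              return 0;
          }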

      So I don't think computation was the limiting factor.

      Your "basically banging into the object with a very crude sensor, so there's no generation of geometry at all" is called a "pantograph" and it has been a common way to copy three-dimensional objects and engrave letters with a milling machine for 180 years: https://en.wikipedia.org/wiki/Pantograph#Sculpture_and_minti...

      • fragmede a day ago

        Computation for the printer was possible, but home computing was definitely not there yet to let someone model something up in Fusion 360 like we think of today. That was the realm of serious workstations like the ones SGI made.

        I don't think that's a show stopper though. If we had had 3D printers in the 1970's and 1980's, sold at Sears, between the computers and the tools, you'd bring home a paper catalog from the 3d model company, thumb through it for models you wanted to build, send off a cheque (and a SASE) and they'd mail you back a cartridge that you plug into your printer so you can print that model. And then get that catalog in the mail forever after.

        As a reference for how much home computing power was available back then: the original Apple I came out in 1976 with a "character generator" and was not capable of advanced graphics. The cost of the RAM it would have taken to draw an entire screen from memory would have been astronomical, so characters were stored in RAM and the card was in charge of outputting the characters and rudimentary graphics.

        • kragen 10 hours ago

          I agree that you couldn't have done Fusion 360 on an Apple ][ or a Compaq Deskpro 386. But see https://news.ycombinator.com/item?id=42087099 for my notes on what kind of 3-D graphics you could have done on hobbyist computers of that time period and how we actually did do 2-D CAD on them.

  • fxtentacle a day ago

    That's also my understanding. The 3D printing boom started when the patents expired.

legitster a day ago

What needs would home 3D printing have filled in the 70s?

A complete set of woodworking or metalworking tools was a lot cheaper than a home computer. And there were entire magazines dedicated to proliferating free or easily obtained schematics/designs. Labor was also cheaper, and people had more time for hobbies.

I would also dispute the idea that it would have been relatively cheap. We are used to the ubiquity of cheap DC motors and precision parts being a click away. But if you were to rummage through a vintage Radio Shack to cobble together a home printer, I think you would struggle to construct anything precise enough from consumer-available parts.

> a melting plastic

Don't sleep on the chemistry of filament. It has to be extremely precise and consistent. We benefit from massive economies of scale today, but this was small-batch stuff 20-30 years ago. And if we are talking about the 1970s, the plastics were really primitive by today's standards.

Legend2440 a day ago

3D printers were first invented in the 80s. A combination of several factors explains why they took off in the 2010s:

1. Cheap stepper motors and electronics from China

2. Expiration of Stratasys patents in 2009

3. Widespread availability of CAD software and desktop computers powerful enough to run it

4. Reprap project made it easy for companies (and individuals!) to develop their own printers

  • Onavo a day ago

    Number 1 is huge, it's also the primary driver of the shift from "model planes" to quadcopter drones with enormous capabilities. The crucial parts were brushless motors and ESCs. The Chinese scale brought the pricing down from ~3-4 figures to under 3 figures which was a watershed moment for hobbyist and commercial use cases.

    • dghlsakjg a day ago

      The other big one for drones was sensors and the processing to run them. Trivially small cheap gyro position sensors, accelerometers, GPS, pressure sensors, etc... needed to exist since quadcopters are more or less incapable of flying without a computer in the loop.

syntaxing a day ago

A mixture of patent protection, China's (lack of) precision manufacturing at scale at the time, and software immaturity. When I started my 3D printing journey around 15 years ago, LM8UU bearings were upwards of $15 PER PIECE, and you need at least 8 for a Mendel or i3-style printer from the RepRap boom. The bearings alone would have cost you $100 or so. Then factor in chrome rods, stepper motors, and stepper motor drivers (Pololu had just released their A4988 boards), and you're looking at almost $1K just for the parts. The software, like slicers and even G-code interpreters, wasn't written yet. Marlin wasn't even a big thing until about 10 years ago. We take for granted how much work the community has put into 3D printers to get us where we are today.

  • regularfry a day ago

    I don't think you can overstate the importance of the development of efficient slicing algorithms. It might just be something that would have happened anyway with the other parts of the story in place, but there was a point in time when slicing anything complicated was very painful.

    • tjoff a day ago

      You don't start with anything complicated though. You start with cubes, make fixtures etc.

      And with waiting 5 hours to print it isn't unreasonable to wait an hour or two for the slicer either.

kens a day ago

You could buy a 3-D printer from Stratasys in 1990, but it cost $178,000! The printing unit was $130,000 and the Silicon Graphics workstation with slicing software to run it was $48,000 more. So the technology was there, but two orders of magnitude too expensive for the consumer market. Patents account for part of the cost, but the computer to control it was also way too expensive, as well as the custom (vs commoditized) mechanical components.

Link: https://books.google.com/books?id=0bqdMvDMv74C&pg=PA32&dq=st...

Ccecil 2 hours ago

1. Patents

2. Cheap CAD

3. Arduino (IMHO... before this it wasn't easy to interface with a PC over USB for anyone who wasn't an EE)

4. Surge of DIY online (forums, mailing lists, Instructables, Pinterest)

5. RepRap leading the way with recycled parts as well as cheap imported motors and controllers

6. Lack of interest: machinists could usually do the same thing, but better, using subtractive methods. Additive was seen as "useless" for real-world items even after RepRap.

There are probably a couple of other things I am missing... and what order you place my list in probably doesn't matter. It took everything happening at once.

Animats a day ago

Have you ever seen 1970s CNC gear? I came across a 1970s CNC lathe in a surplus store once. The thing was the size of a small car. The electronics box was the size of two racks. Inside were cards that handled one bit of one register. Input was from a paper tape. No CPU; all special purpose electronics.

Directly controlling industrial machines from a microprocessor was very rare before the 1980s.

cyberax a day ago

> All the pieces needed to make a working 3D printer existed even in 1970! and relatively cheaply. So why has it taken so long for 3D home printing to actually become a thing?

Easy. The printing process itself is not that hard.

It's the model _design_ that is tricky. We needed home computers to become powerful enough to run 3D CAD software, and enough people to get proficient with it.

RepRap started in 2005. Realistically, we could have had it maybe a few years earlier. But not _much_ earlier.

the__alchemist a day ago

Some observations that I didn't see in the other comments:

- Home 3D printing is often more of a hobby than a traditional prototyping or engineering discipline. People view it as a skill to have, and a fun use of free time. Note how the cheapest and most finicky ones are popular; they can be made to work well through careful operation, troubleshooting, procedures, customization etc. They are not set-and-forget, and I think the userbase likes that.

- Home 3D printer parts (the motors, frames, electronics etc) are almost exclusively sourced from China. We live in an AliBaba world; that wasn't always the case.

  • whatusername a day ago

    I'm not sure how true that first statement is anymore. Most of the recommendations I see now are "just get a Bambu Lab one". We are much closer to 3D printing as a utility, as opposed to 3D printing as a hobby, than we were ~3 years ago.

    • the__alchemist a day ago

      Great news! Given that time frame, my info is indeed out of date.

  • superconduct123 a day ago

    That's a really good point

    Those AliExpress clone kits were really popular in the community in the beginning

nkrisc a day ago

When I started my Industrial Design degree in 2007 the workshop there already had several large, commercial 3D printers for student use.

Mind you, they were nothing like the tabletop consumer ones we have today. They were about the same size as a large American refrigerator.

Since it was not anything special or amazing for us to have several of them, I have to imagine that industrial 3D printing capabilities were well established by that point.

Edit: as I recall, they were mostly used to make parts which could be given a nice surface finish and from which silicone molds could then be made.

snakeyjake a day ago

3DBenchy.stl, the little boat that everyone prints as a benchmark during 3d printer reviews, is 10MB.

A 10MB hard drive cost $3,000-4,000 in 1980.

That's $12k-15k today.

Just opening the .stl file and having it render (USABLY) on screen in high-resolution was probably not economical until the late 1990s-early 2000s.

I am used to computing tasks being human-perception instant. It takes tens of seconds to run repairs on 3D models today, which means it would have taken tens of hours to do the same thing in the 90s, if there was even enough RAM.
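
For reference, binary STL is an 80-byte header, a 4-byte triangle count, and 50 bytes per triangle, so file size maps straight to triangle count:

    #include <stdio.h>

    int main(void) {
        /* Binary STL triangle record: 12 floats (normal + 3 vertices)
           plus a 2-byte attribute field = 50 bytes. */
        long bytes = 10L * 1000 * 1000;      /* ~10 MB file */
        long tris = (bytes - 84) / 50;
        printf("~%ld triangles\n", tris);    /* ~200,000 */
        return 0;
    }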

jkingsman a day ago

I don't know that I'd argue that "consumer" 3D printing, as in a printer at home that just churns out a part when you want it, is even really here yet, certainly not in the way that a dishwasher or lawnmower is. You need to do your own slicing, thinking through supports and brims and layer height, and the printers themselves need no small amount of troubleshooting that is much more than "turn it off and on again". So I'd argue it's still more a hobbyist realm than a consumer one.

  • Legend2440 a day ago

    The big limiting factor IMO is CAD skills - otherwise you're just printing parts off thingiverse.

    • OJFord a day ago

      Don't underestimate the number of people that do that though.

      • jimnotgym a day ago

        Until they have a little boat in each colour... and put the printer in the back of the garage to collect dust.

        CAD skills are essential, and it turns out they're not as hard to learn as you might have thought!

        • OJFord a day ago

          I do, I've never printed one of those boats, but not everyone's interested.

          If you think of it first of all as a functional/decorative categorisation (obviously some people will overlap, but broadly speaking I think people are interested in one or the other), then within the 'decorative' camp you can go a hell of a lot further without CAD skills, and I think it's more obviously reasonable not to care about designing your own models. You never wanted to design your own toys, but there's appeal in printing things not available on Amazon, unofficial merch for a film you like, or whatever.

          Not to say there isn't functional stuff (which I exclusively print) on these sites, but often it won't be quite what I want, so yeah I end up in Fusion. (And typically starting from scratch eventually, because for some reason people don't share source, and working with imported STLs is hellish.)

          • dghlsakjg a day ago

            Resin printers are HUGELY popular with people who paint miniatures and do tabletop stuff. Most of them never design characters and just buy the CAD models to print.

        • mrguyorama a day ago

          There are several "CAD lite" systems available if you don't actually need dimensional accuracy though. There's a model boom in DnD circles around sharing 3D models, slicing them up, gluing them together, and making your own designs by basically digital kitbashing.

          My friend is filling up hard drives with 3D models DMs share.

  • artificialLimbs a day ago

    Bro, maybe 3-5 years ago. I pulled my Ender 3 V3 out of the box and hit print, and it's been running almost constantly for a month. I don't even know what 'bed leveling' is, because it self-levels, whatever that means.

    • ThrowawayTestr a day ago

      Just FYI, bed leveling is fine-tuning the distance between the nozzle and the bed and making sure the bed is planar to the X and Y axes.
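
      The "self leveling" in newer machines is mostly software compensation: the printer probes a grid of points and then interpolates a Z correction between them on every move. A sketch of that interpolation in C (illustrative, not any particular firmware; the grid values are made up):

          #include <stdio.h>

          #define GRID 100.0           /* mm between probe points */

          /* Probed bed-height errors (mm) on a 3x3 grid. */
          static const double mesh[3][3] = {
              {0.00, 0.05, 0.10},
              {0.02, 0.04, 0.08},
              {0.00, 0.03, 0.06},
          };

          /* Bilinear interpolation of the Z correction at (x, y). */
          static double z_offset(double x, double y) {
              int i = (int)(x / GRID), j = (int)(y / GRID);
              if (i > 1) i = 1;
              if (j > 1) j = 1;
              double fx = x / GRID - i, fy = y / GRID - j;
              double lo = mesh[j][i] * (1 - fx) + mesh[j][i + 1] * fx;
              double hi = mesh[j + 1][i] * (1 - fx)
                        + mesh[j + 1][i + 1] * fx;
              return lo * (1 - fy) + hi * fy;
          }

          int main(void) {
              printf("Z correction at (150, 50): %.4f mm\n",
                     z_offset(150.0, 50.0));
              return 0;
          }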

paulorlando a day ago

Patents do explain some of this delay....

Fused Deposition Modeling or FDM (1989, expired in 2009), Liquid-Based Stereolithography or SLA (1986, expired in 2006), Selective Laser Sintering or SLS (1992, expired in 2012), metal processes like Selective Laser Melting (SLM) and Direct Metal Laser Sintering (DMLS) (1996, expired 2016).

vel0city a day ago

One thing to think about is that you'd need quite a beefy machine to even properly render the G-code in the 70s and 80s. In that era of printing, computers often didn't even have fonts; fonts were ROM cartridges your printer would take. And even into the 90s there were accelerator cards for printing documents and talking to the printer in ways other than the parallel port.

So really, for an average hobbyist, the idea of a 3D printer controllable from a home PC wouldn't have been possible until like the mid-90s. If you want to dig into why it wasn't a thing at some point in history, I'd start the digging there, not in the 1970s.

  • kragen a day ago

    If by "render the gcode" you mean "interpret the G-code" then, no, G-code dates from 01963 and is easier to interpret than the BASIC that home computers used in the 01970s. If by "render the gcode" you mean "produce the G-code" then it depends on the geometry you're producing G-code for, and the algorithms you're using to do it, but potentially being able to do it overnight gives you a lot of flexibility there. People were doing computer numerical control with the APT language already in 01959.

    • vel0city a day ago

      I mean "render the gcode" as in going from a 3D model to the gcode you want your tool to follow. Following the gcode would be generally pretty simple; I agree.

      However, I doubt most home hobbyists had computers in their houses in 00000000001959 capable of manipulating 3D graphics in a meaningful way that would really be approachable to a general hobbyist. Wikipedia suggests 1960s CNCs were often controlled by a PDP-8 or similar minicomputer. When the PDP-8 first came out in 00000000001965, it cost $000,0018,500, almost $000,180,000 today. I dunno about what kind of home you grew up in, but a machine like that wasn't exactly a home PC available to spend all night rendering the output for a CNC machine, which itself probably also cost several thousand dollars. And that's just the computer driving the CNC, not even thinking about the machines and knowledge it took to actually code the designs and curves and 3D patterns.

      I'm well aware of computerized CNC machines from at least the 80s. I had family who owned a machine shop. They were not really hobbyist accessible things.

      • kragen 11 hours ago

        We were talking about "the 70s and 80s", not 01959.

        My point about APT is that a "beefy machine" in 01959 was the embedded controller in your keyboard by 01983; it wasn't a beefy machine any more. The PDP-8 was indeed pretty common for CNC control in the late 01960s, and it could run about 333000 12-bit instructions per second, which is about half as fast as the original IBM PC. So, yeah, a machine like that was exactly a home PC by the mid-80s. For real-time control of the 3-D printer, you can get by with less.

        There were three big problems for manipulating 3-D models for home hobbyists in the 70s and 80s.

        One was computational speed: for games, you need to render the 3-D graphics fast enough to provide an illusion of immersion, and with a few hundred thousand instructions per second (and no multiplier) you were limited to a handful of polygons. Like, typically about 20. See the Apple ][ FS1 flight simulator https://youtu.be/lC4YLMLar5I?&t=93, the 01983 Star Wars arcade game (at 2:02), Battlezone (2:14), and Ian Bell and David Braben's Elite from 01984, which was successfully ported from the 2MHz 6502 in the BBC Micro (2:53, but also most of the rest of the hour of the video) to the Z80-based ZX Spectrum (3.5MHz but noticeably slower than the 6502; see https://www.youtube.com/watch?v=Ov4OAteeGWs).

        For producing G-code, though, you don't need to be able to handle all the 3-D geometry in 50 milliseconds. You just need to be able to handle it overnight. That's a million times longer, so you can do a million times as much computation on the same hardware. You can't handle a million times as many polygons, because you don't have enough storage space, but you can represent geometry in more expressive ways, like Bézier patches, nine of which made up the Utah Teapot I mentioned in https://news.ycombinator.com/item?id=42080437, or parametric or implicit equations, solids of revolution, CSG, etc.
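
        (E.g., evaluating a cubic Bézier is just repeated linear interpolation - de Casteljau - which an overnight batch job could afford to do millions of times. A sketch in C; a Bézier patch applies the same idea in two parameters:)

            #include <stdio.h>

            static double lerp(double a, double b, double t) {
                return a + (b - a) * t;
            }

            /* De Casteljau: evaluate a cubic Bézier defined by 4
               control values by repeated lerping. */
            static double bezier3(const double p[4], double t) {
                double a = lerp(p[0], p[1], t);
                double b = lerp(p[1], p[2], t);
                double c = lerp(p[2], p[3], t);
                return lerp(lerp(a, b, t), lerp(b, c, t), t);
            }

            int main(void) {
                double ctrl[4] = {0.0, 1.0, 1.0, 0.0};
                for (int i = 0; i <= 4; i++)
                    printf("t=%.2f -> %.3f\n", i / 4.0,
                           bezier3(ctrl, i / 4.0));
                return 0;
            }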

        You do need some kind of user interface for seeing what you're designing that doesn't require waiting overnight to see the results, which I think is what you mean by "a meaningful way that would really be approachable to a general hobbyist". But it's possible I have a different concept of general hobbyists than you do; as I remember it, home computer hobbyists in the 70s were constantly writing things like

            2120FORQ=1TO4:IFLEFT$(R$(Q),1)=O$THENRC=Q:ST=ST+2*Q:DX=DX-2*Q:NEXTQ
        
        and writing programs in assembly language. And machining metal in a machine shop has been a popular hobby in the US for at least a century, using user interfaces that are more demanding than that. So I think hobbyists would have been willing to tolerate a lot of demands on their mental visualization abilities and relatively poor user interfaces if that was the price of 3-D printing.

        But the user-interface issue gets us to the second problem hobbyists had with 3D in the 70s and 80s: display hardware. The limited displays of the time were hopelessly inadequate for displaying realistic 3-D. The Star Wars arcade cabinet mentioned above used a vector CRT in order to be able to do high-resolution wireframe, because framebuffers were far too small. With color palettes of 2–16 colors, even Lambertian flat shading was nearly out of reach until the late 80s.

        Again, though, this is much more of a problem for games than for 3-D printing.

        My first experience with CAD was on an IBM PC-XT with a 4.7MHz 8088, roughly five times faster than the BBC Micro Elite is running on above (https://netlib.org/performance/html/dhrystone.data.col0.html). I was using AutoCAD, a version whose 3D functionality was inadequate for any real use, and the machines had two video cards—a text-only MDA (basically just a character generator and some RAM) and a 320×200 CGA, on which I could see the actual drawing. Redrawing the whole CGA was slow enough that you couldn't do it after every drawing or erasing operation, so erasing a line or arc was instant, but would leave holes through the other lines it had crossed. Until you issued the REDRAW command, which took typically 1–5 seconds for the fairly simple mechanical drawings I was doing. (Remember that 2-D side-scrolling games were impossible on things like the CGA because you just didn't have enough bandwidth to the video card to redraw the whole screen in a single frame.) Zooming in and out also suffered from a similar delay, but was very necessary because of the low-resolution screen.

        Once I finished the drawing, I would plot it out on a CalComp pen plotter, which was basically a 3-D printer without the Z-axis. This had enormously higher resolution than the CGA because it didn't have to contain RAM to hold all the lines it had drawn; the paper would remember them. Some hobbyists did have HP-GL pen plotters during this period of time, but it wasn't very common. I had one at home in 01990. My dad used it to make multicolored Christmas cards.

        This sort of approach allows you to provide a real-time interactive user interface for modifying geometry even on a computer that is far too slow to rerender all the geometry every screen frame. It would have worked just as well for 3-D as it did for 2-D, which is to say, it would have been clumsy but adequate.

        But the third problem was, as you said, the knowledge. Because the 3-D printers didn't exist, hobbyists didn't know the algorithms, they didn't know the vector algebra that underlay them, they didn't have the software, they didn't have BBSes full of 3-D models to download, etc. That would have developed, but not overnight.

        • vel0city 8 hours ago

          > it's possible I have a different concept of general hobbyists than you do

          Yes, we're talking about very different levels of hobbyists here. I'm talking about the people who are frequently doing 3D printing today. They're often not thinking about the actual algebra and Bézier curves and whatnot. They're playing around in CAD software extruding surfaces and connecting vertices, if they're even going that deep. They're downloading pre-made models off Thingiverse and printing them with their printer, tinkering with the physical aspects of their printers. Let's try printing this model in this material, let's mess around with this fill, what if we print it at this angle, etc. They're not digging deep into the code of how the CAD application actually works to draw the curves. They absolutely wouldn't be hand-typing out the functions to draw The Rock's face onto a toilet, letting it render all night long, going back to tweak their math, letting it render all night long, rinse and repeat.

          > on an IBM PC-XT with a 4.7MHz 8088

          > the machines had two video cards

          A pretty high-end machine for many home users (~$5k new in 000001983, almost $16k today). I realize there were even higher end machines out there though.

          > a version whose 3D functionality was inadequate for any real use

          Exactly my point. And when would that machine have come out? The PC-XT released in 000001983. So even in 000001983 decently high-end home computers with $001,000+ (of 1982 money, >$3k today) software suites weren't very suitable for even basic CAD work without many pitfalls and limitations.

          So general home users who barely passed high school algebra and don't have degrees in mathematics wouldn't really have had the compute capacity to design and modify what they're trying to make without getting really deep into it, at least until about the 90s. The 3D printing scene as it exists today pretty much couldn't exist until the 90s. Sure, very smart people with piles of cash could have done it in 000001980, potentially even the mid 000001970s, but not like a go to the computer store, buy all you need for <$000,500 (about $000,060 in 000001970s money), plug into your cheap $000,200 (about $000,020 in 000001970s money) computer, dial into the BBS, grab some files for The Rock's face on a toilet, and start printing in an afternoon.

    • Novosell 14 hours ago

      Why are you giving your numbers leading 0s?

      • vel0city 12 hours ago

        It's to showcase their superior thinking over four digit year neanderthals. A very iamverysmart thing, as if people don't know after the year 9999 there will be a 10000.

        Which is why I'll always prepend even more digits when replying, because I'm thinking even further ahead than the short-sighted five-digit people, about even bigger numbers all the time.

        https://longnow.org/

        • kragen 10 hours ago

          HN comments are not the right place for personal attacks, and especially not personal attacks for being too nerdy.

        • Izkata 10 hours ago

          A single 0 just throws me off because it makes me think of octal.

simne 14 hours ago

Working 3D printers had existed for a long time, but only SLS machines, and those were very expensive.

Only in the 2000s did the things needed for a cheap 3D printer appear:

1. Cheap power semiconductors.

2. Cheap 32-bit microcontrollers.

3. Computers powerful enough to run a slicer, and free slicer software.

4. For photo-sensitive resin, a large second-hand market of powerful semiconductor lasers and light modulators (from DLP projectors).

To be more exact, the first powerful transistors appeared in the 1970s but only became cheap in the 1990s; microcontrollers followed a similar path, but cheap 32-bit parts didn't exist before the 2000s; microcomputers became powerful enough for 3D printing in the late 1990s, which is when home CNC motion appeared; lasers are still improving, on something like a Moore's-law curve.

So to conclude: yes, the sum of technologies needed for a cheap 3D printer appeared around Y2K, but fans then spent a few years building practical equipment and making the first shipments to customers (Prusa made his design around 2009).

dekhn a day ago

I don't think anybody was really doing FDM/FFF until the 90s, and it didn't really take off until the 2000s. It was quite expensive, required a high level of expertise in both mech E and CS, and existing subtractive methods (like CNC mills and lathes) were very effective.

jeffreyrogers a day ago

There weren't good CAD tools to send the designs to the machine until recently. Early CAD tools mostly produced drawings and it was a separate step to manufacture from there.

The way inkjet and laser printers work is also quite different from the way a 3d printer works. The similarity is mostly in the gantry, so there was nontrivial innovation required here.

To some extent 3d printing is probably also a reaction to decreased access to domestic manufacturing. It doesn't make a lot of sense to produce most parts in plastic if you can get a cast or milled part quickly and cheaply.

evoke4908 a day ago

Part of it is technological advancement. It wasn't until the last 15 years or so that embedded processors became cheap and powerful enough to run a 3D printer at consumer prices.

There's also the problem of 3D modeling and slicing. Again, up until quite recently, 3D CAD was out of reach for most consumers, whether due to hardware capabilities or the cost of the software. Slicing is its own entire branch of 3D processing, and it took time to develop all the techniques we use today that make it fast and reliable. Slicing software could only exist after the printers were common.

As well, I expect the availability and materials science of the plastics we use needed some further development.

As I recall, 3D printers rose to prominence at about the same time and speed as we started getting genuinely powerful personal computers. You really needed a fast CPU, and printing became more accessible as the early i5/i7 generations became cheaply available.

While you absolutely could build an FDM printer with 80s technology, I don't think it could ever be practical or affordable. Even if someone had invented all the computational techniques for slicing, the compute available back then was not even close: slicing a model would have taken an actual supercomputer to do in reasonable time, or many, many hours on any consumer computer. This would hold true until the early 2000s. At a random guess, I'd say the tipping point would have been around the Pentium 4.
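
To give a sense of what that computation actually is, here's a minimal sketch (modern Python, my own illustration, assuming a triangle mesh): the heart of slicing is intersecting every triangle with a horizontal plane at each layer height, and the expensive part is doing that for hundreds of thousands of triangles, then stitching the segments into closed loops and generating infill:

    def slice_triangle(tri, h):
        """tri: three (x, y, z) vertices; returns the 2-D crossing segment or None."""
        below = [v for v in tri if v[2] < h]
        above = [v for v in tri if v[2] >= h]
        if not below or not above:
            return None                          # the plane misses this triangle
        pts = []
        for a in below:
            for b in above:
                t = (h - a[2]) / (b[2] - a[2])   # where this edge crosses z = h
                pts.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
        return pts[0], pts[1]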

So, same as most technologies we take for granted these days. Enabled almost exclusively by the speed and capacity of computer available to consumers.

scyzoryk_xyz 12 hours ago

My father was an industrial design professor, and I remember him showing me a 3D printed element in the early 00's. He explained at the time that it was the future, but just way, way too expensive and unstable for actual use cases. Chicken-or-egg sort of thing - I agree that it took the internet for this to happen.

eigenvalue a day ago

Just having a computer that could handle working with relatively complex 3D models, not even visually but just via the terminal, would have been a tall order in the 80s. Even getting the data from the computer to the printer would be tough: floppies wouldn't store enough data for even modestly complex 3D model files, and a parallel port would be painfully slow. But even if you get around those issues, the algorithms for transforming the 3D model into 2D layers that can be printed with the proper support structures and minimal material are extremely complicated and involve a lot of intermediate data during the calculations. Most machines then wouldn't have had even close to the amount of RAM needed to do that in any reasonable amount of time.
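
A back-of-envelope number on the RAM point (my own arithmetic, assuming the binary STL format slicers later standardized on: an 80-byte header, a 4-byte triangle count, and 50 bytes per triangle):

    triangles = 100_000              # a modest model by today's standards
    stl_bytes = 84 + 50 * triangles  # ~5 MB just to hold the mesh
    print(stl_bytes // 1024, "KiB")  # vs. the 64-640 KiB typical of 80s machines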

Findecanor a day ago

In my experience, it really took off when PLA filament became commonplace. ABS and UV-cured epoxy emitted noxious fumes which required separate ventilation systems and weren't really suitable to be used at home.

Early home 3D-printers also required more of the user. It took a lot of tweaking to make them produce decent prints.

mmmlinux a day ago

1. Patents. 2. CAD software was HUGELY expensive for a long time. 3. Parts were expensive; the lack of economies of scale on things like linear bearings made prices crazy. I recently took apart my first-gen Printrbot (it cost me ~$600 and was one of the cheapest ones at the time) and it was all brand-name VXB bearings, because that's basically all that existed.

kragen a day ago

It took that long because nobody was working on it, because it wasn't obvious that a low-cost 3-D printer was feasible.

The 3-D printers you're seeing today are basically the series of RepRap designs, named after famous scientists who studied self-reproduction: Darwin, Mendel, and Huxley. The RepRap project, which started in 02005, is the reason this happened. For the first several years, it was about half a dozen people: Rhys Jones, Patrick Haufe, Ed Sells, Pejman Iravani, Vik Olliver, Chris Palmer (aka NopHead) and Adrian Bowyer. The last three of these did most of the early work. Once they got it basically working, after many years of work, a lot of other people got involved.

There were a series of developments that had to happen together to get a working low-cost printer. They had to use PLA, because the plastics conventionally used (mostly ABS) had such a high thermal coefficient of expansion that they needed a heated build chamber. They had to design their own electronics, because Arduino didn't exist. They had to figure out how to build a hotend that wouldn't break itself after a few hours. They had to write a slicer. They had to write a G-code interpreter. They weren't industrial engineers, so they didn't know about Kapton. They wasted a lot of time trying to make it work without even a heated bed, to keep costs down. They improvised leadscrews out of threaded rod and garden hose. They made rotational couplings from aquarium tubing. Lots and lots of inventions were needed to get the cost down from US$60k to US$0.3k, and lots and lots of time was wasted on figuring out how to get the resulting janky machines to be reliable enough to be usable at all.

Starting in the mid-90s, Don Lancaster was excited about 3-D printers, which he called "Santa Claus machines" https://www.tinaja.com/santa01.shtml, when he could see that they were possible. He wrote lots of technical articles about building what he called "flutterwumpers": "low cost machines that spit or chomp". https://www.tinaja.com/flut01.shtml. For example, in https://www.tinaja.com/glib/muse140.pdf in 01999, he describes Gordon Robineau's low-cost PCB drill driven by MS-DOS software over a serial port, with a schematic. (The fishing-line cable drive sounds imprecise, since this was years before Spectra braided fishing line.) But nobody listened. I don't know if he ever built so much as a sign cutting machine himself.

Journalists like to talk about the patents, maybe because they're legible to nontechnical people in a way that difficulties with your retraction settings aren't, but when I was obsessively reading the RepRap blogs in the period 02005–02010, I can't recall that they ever mentioned the patents. They were just constantly hacking on their software, fixing their machines, having them break again after a few more hours of printing, and trying new stuff all the time. I don't think the patents even existed in their countries, and they were researchers anyway, and generally patents don't prevent research. Maybe there's a vast dark-matter bulk of for-profit hackers who would have gotten involved and started up profitable consumer 3-D printing companies before 02005 if it hadn't been for the patents, but who never got interested because of the patents.

But what I saw was that businesspeople started commercializing RepRaps once the open-source RepRap hackers got them to work somewhat reliably. Before that, they mostly weren't thinking about it. After that, most of them spent a lot of years shipping very slightly tweaked RepRap designs. Josef Prusa got involved in the RepRap project and redesigned Ed Sells's Mendel, and everybody copied him, and he famously started selling it himself, very successfully. https://reprap.org/wiki/The_incomplete_RepRap_Prusa_Mendel_b... And more recently Bambu Lab has apparently gotten the machines to be much easier to use.

  • simne 4 hours ago

    > Journalists like to talk about the patents, maybe because they're legible to nontechnical people

    Journalists usually repeat what the experts said, but unfortunately don't always include enough context to understand it.

    The patents context is the global West, where patents are extremely powerful, so nobody could violate a patent and build a business on it. And as far as I know, in many cases people were scared even to talk about something patented, let alone to build it.

    In the global East (for example, the ex-USSR), patents are mostly harmless, but all the other economic infrastructure needed to make big things (I mean free markets, open borders, powerful VC funds, powerful industry) doesn't exist either.

  • superconduct123 a day ago

    It's interesting to me that for a lot of products the R&D comes from some big company.

    But for at-home 3D printers it seemed like the hobbyists did most of the R&D, and the companies came in later.

    • kragen a day ago

      Generally, as in this case, a lot of the R&D comes from academia, and from the open source community. For-profit companies aren't very good at stuff that's far out, so they generally free-ride on open-source efforts like Linux, Python, the Web, and RepRap.

  • Palomides a day ago

    not talking about patents at the time was intentional, nobody was ignorant of the situation

    • kragen a day ago

      I think you should elaborate. Were you at the University of Bath at the time, or at Catalyst IT Ltd. in Auckland?

declan_roberts a day ago

The answer to this is the same reason why flying cars aren't ubiquitous. Invention and discovery are often the easy part.

iamgopal a day ago

Short answer is, there is no need. Personal computers were sold to make nice printed paper; here there is no comparably specific application.

jeffbee a day ago

It was the polymers that needed to be invented. In the 1990s we used to order quick-turn prototype 3D prints for parts to check fit before committing to tooling for hard materials, but we only had 3 days to work with it before it fell apart into a puddle of goo.

  • foxglacier a day ago

    I don't think that's a reason. PLA and ABS were invented before the 1990s. What was this self-destructing material, and why?

    • kragen 10 hours ago

      A surprising number of popular plastics are chemically unstable to one or another extent.

lawlessone a day ago

Is it actually better than homes built by other methods?

The most expensive part of a home is generally the land it's built on.

  • jillesvangurp a day ago

    It's actually that and dealing with permitting. The actual problem of creating some kind of durable, comfortable shelter isn't all that hard from a technical point of view. Getting permission to build a house, though, is. Society kind of forces you into debt for a mortgage just to have a place that you can live in, and paying high rent is not a great alternative, as you permanently lose the money.

    There are all sorts of restrictions and rules that create this artificial scarcity. Even something as simple as buying a plot of land and parking a trailer on it is not legal in most places except in designated trailer parks. You can get a trailer for next to nothing. And lots of people live in them. But try finding a place where it is legal to put one down and live in one. If it were legal, lots of people would do that. Land plots are scarce and once you have one, you can't just do what you want with it in most places.

    I'm just using trailers as an example here. Think prefab buildings and raw material cost. This isn't rocket science. We've been building shelters since the stone age.

    There are of course good arguments to be made for this in big cities, because of a lack of space. But it's equally frowned upon in areas where there's plenty of space.

  • cynicalsecurity a day ago

    He means just regular 3D printers used at home. But yeah, the title is confusing.

    • IIAOPSW a day ago

      Nonetheless the misinterpreted question is also interesting. The technical hurdles people are citing for why small-scale 3D printers didn't happen earlier are completely different from those of the large-scale "home printers". Why didn't we have automatic concrete-wall-making machines much earlier?

      • regularfry a day ago

        The bits of home printing that a machine can do aren't the hard bits, except for very specific buildings where the wall shape is odd, and there's limited demand for those. You end up with a choice between a complicated, failure-prone robot in a hostile environment and a human or three who can do the job quicker once you factor in machine setup.

    • superconduct123 a day ago

      Ya I've updated the title to consumer to be clearer

h2odragon a day ago

Micro-stepping controller chips.

Before that, the precision available without gearing and feedback wasn't sufficient. There were systems, but they were an order of magnitude more complicated and several orders of magnitude more expensive.

  • Ccecil 2 hours ago

    Microstepping is mostly just there to reduce noise and vibration.

    The motors tend to fall to the nearest full step when loaded hard. Most people I have discussed this with believe there is little resolution gain (if any) past 4x microstepping, but it certainly is a lot quieter to use x256.
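
    For the curious, what a microstepping driver does between full steps is roughly this (a sketch, not any real driver's firmware): drive the two coils with quadrature-sine currents so the rotor settles at intermediate electrical angles:

        import math

        def coil_currents(microstep, usteps_per_full_step=16, i_max=1.0):
            # the electrical angle advances 90 degrees per full step
            theta = (math.pi / 2) * microstep / usteps_per_full_step
            return i_max * math.cos(theta), i_max * math.sin(theta)

    Under heavy load the rotor lags toward the nearest full-step detent anyway, which is the resolution caveat above.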

  • regularfry a day ago

    I don't think it needed microstepping. The first RepRap board bit-banged H-bridges with a PIC, and even later boards only used A3982 drivers. Microstepping helps, but it came later.

    You can look at early calibration settings descriptions and they're still talking about e.g. "The number of X stepper-motor steps needed to move 1 mm for the PIC."
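
    For comparison, the bit-banged full-step equivalent is about this simple (sketched in Python rather than PIC assembly; the actual pin writes are left hypothetical):

        FULL_STEP = [(+1, +1), (-1, +1), (-1, -1), (+1, -1)]  # (coil A, coil B) polarity

        def step_sequence(n_steps, direction=+1, start=0):
            """Yield the H-bridge polarities for each successive full step."""
            i = start
            for _ in range(n_steps):
                i = (i + direction) % 4
                yield FULL_STEP[i]
                # a real driver would write these polarities to the H-bridge
                # pins here, then delay; the delay sets the motor speed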

  • foxglacier a day ago

    How do you reconcile that claim with the existence of cheap consumer paper printers and hard drives?

    • observationist a day ago

      Microcontrollers and software for consumer PCs and the like could have been produced, probably, but there are a lot of areas of deep specialization. You'd have had to bring together all sorts of different disciplines and technologies the old, hard way - universities and libraries and researching manually. The internet allowed all of those things to coalesce easily, and novice level people gained access to high quality information and research at the click of a mouse.

      The patents, compute, research access, and dozens of other relatively small barriers created a thicket of challenges and no obvious way to reconcile them, even if you had all the right ideas and inspiration. I think the internet would have been needed in order for all those ideas to come together in the right way.

    • h2odragon a day ago

      paper printers only needed such accuracy in one axis of motion, and had gearing to provide it.

      hard drives use voice coils, a completely different technology. The circuitry that does that evolved and certainly influenced the creation of microstepper controllers: the neat trick they do is treat the stepper motor as a voice coil in between full steps.

      • flimsypremise a day ago

        I have several 2-axis microscope stages from the 80s/90s that are driven by brushed motors with position feedback, and they are all capable of higher accuracy than any stepper motor I have. The capability was there, it was just pricey.

        Hell, CNC machines existed back then too.

      • kragen a day ago

        At the time, hard drives used stepper motors, but didn't use microstepping. Paper printers like the MX-80 used stepper motors too, it's true, but didn't use microstepping either. Gearing makes your step size smaller but adds backlash, so it can be the enemy of precision; position feedback like current inkjet printers use is much more precise.

    • eesmith a day ago

      "cheap consumer paper printers and hard drives" was not a 1970s thing.

      I mean, towards the end of the decade there was something like the ImageWriter, which let you do bitmapped graphics as a row of 9 dots at a time. At https://www.folklore.org/Thunderscan.html?sort=date you can read about the difficulties of turning it into a scanner. (Like, 'We could almost double the speed if we scanned in both directions, but it was hard to get the adjacent scan lines that were scanned in opposite directions to line up properly.')

      The LaserWriter wasn't until 1985 or so. My first hard drive, 30 MB, was a present from my parents around 1987.

      By 1996, laser-based 3D printing based on cutting out layers of paper was a thing, available for general use in one of the computing labs in the university building where I worked.

      The result smelled like burnt wood.

      When I visited a few years later they had switched to some other technology, and one which could be colored, but I forgot what.

      • slantyyz a day ago

        The Thunderscan, for the time, was pretty awesome though. I remember borrowing one from a classmate to make some scans. Given how we keep a document scanner in our pocket these days, the whole notion of sticking a scanner into a printer seems so antiquated and kinda crazy.

K0balt a day ago

The magic ingredients were inexpensive microcontrollers, massively paralleled power MOSFETs (themselves a product of large-scale integration, since a modern power MOSFET is actually millions of tiny MOSFET cells working together), and the expiration of several key patents, which made it possible to have a commercial explosion around the RepRap scene.

The patents expiring was a big deal, since the main patent was on the fused deposition process itself.

The other factor was that normal desktop computers had become powerful enough to run sophisticated 3D modeling programs and to compute machine motion from 3D design files.

cynicalsecurity a day ago

Are many people going to buy a home 3D printer? It's not a profitable business.

  • JohnFen a day ago

    You don't have to have a large market in order to have a profitable business. That said, the market for 3D printers at home is larger than many assume, especially as printers get closer to being simple one-button-push kinds of things.

    They will probably never be the sort of thing that exists in every home, but they could very well be the sort of thing that exists in every home workshop.

  • Legend2440 a day ago

    Huh? There are millions of people with a 3D printer in their home. They're so cheap now that people buy them for their grandkids.

al2o3cr a day ago

Equivalent statement: "A flying car is just a car with wings, what's so difficult about that???"

refulgentis a day ago

Computers in 1970 were so slow that we resorted to hacks like defining lightness, a scientific measure of color, as the average of the two highest color channels.
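
As code, the hack described would be something like this (my reconstruction of the sentence above, not a period source; note that the usual HSL lightness is (max + min) / 2, so this may be a loose recollection):

    def cheap_lightness(r, g, b):
        two_highest = sorted((r, g, b))[1:]   # drop the smallest channel
        return sum(two_highest) / 2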

spacecadet a day ago

I knew someone who recycled paper printers into a home-brew 3D printer and then a 2D ShopBot-style CNC.