
Exploring 3D design and fabrication in the computer age.

Humans design and build things! It’s part of our DNA, and it’s what sets us apart from every other species on this planet. We are Engineers!

Graphics and Computers

Graphic design is nothing new. Since ancient times people have been drawing images on cave walls, pottery, trees and anything else they thought would suffice. Techniques and methods change over time, but using imagery to represent and record has been part of every human culture that ever existed.

It’s no surprise that as soon as computer technology advanced enough, people started to create computer graphics too. The first electronic computer was completed in 1946 (it took two years to build). The first CAD/CAM software was finished in 1968 (by Pierre Bézier, an engineer at Renault). The original PCs (1973/74) only had text graphics, but text-art pictures appeared immediately; email signatures used this technique for a long time, even after HTML emails could include pictures. In 1982, AutoCAD brought true CAD capabilities to PCs.

I first saw a computer in grade school, when students started to learn how to code, type and create pictures. The first computer I used was an Acorn BBC Micro (released in 1981). In 1984, Apple produced the Macintosh. My art teacher bought one and kept it in his office at school. His students were allowed to use it, and its paint program was the first computer graphics program I ever used. It let you toggle pixels between black and white, and you could draw either freehand or by stretching lines with the mouse. Brilliant for its time, but a little clunky compared to modern computers.

In May 1992, id Software released Wolfenstein 3D. I was at university then, and this game opened up a world of opportunities. Before it, 3D computing had been all about offline models and static images. Even CAD/CAM systems would change forever.

At the time, everyone at university marvelled at this achievement. It was among the first games to provide real-time imagery and movement within a virtual 3D world on a personal computer. Wolfenstein 3D showed that it was possible to construct and animate 3D scenes on personal computers, and that made 3D technology accessible to the world.

Virtual 3D Modelling

Drawing from Scratch

Photogrammetry

3D Scanners

3D Libraries

3D Manufacturing

CNC Machines

The industrial revolution (beginning around 1760) gave rise to mass production and to manufacturing techniques that evolved into looms, lathes, mills and other machining equipment.

Starting in the 1820s and 1830s, the cams used to control music boxes and cuckoo clocks were adapted to provide some automation in manufacturing, but the cams themselves had to be machined, and each one could only produce a specific part.

In the 1940s, the first NC (Numerical Control) machines were developed to bring some level of programmability to machining. Originally, the programs were fed in on punched tape.

The first true CNC machine was developed by MIT for the US Air Force and was demonstrated in 1952. It used servo motors with feedback control and was used to fabricate one-off designs. Although the quality was as good as or better than human work, the programming was too labour-intensive to automate the production industry.

CNC development continued over the next decade, and by the 1960s much of the computerised control concept had been standardised (PRONTO in 1958, APT in 1959). APT, produced at MIT, led to the first version of G-code (officially RS-274), which has become the default standard for CNC programming.
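To give a feel for what RS-274 looks like in practice, here is a minimal sketch in Python that emits a handful of standard G-code commands to trace the outline of a small square. The coordinates, feed rate and spindle speed are made-up illustrative values, not a real part program.

    # Minimal sketch: emit standard RS-274 (G-code) commands that
    # trace the outline of a small square on a CNC mill.
    # All numbers are illustrative, not from a real job.

    def square_outline(side_mm=20.0, depth_mm=-1.0, feed=300):
        """Return G-code that cuts the outline of a square."""
        lines = [
            "G21",                      # units are millimetres
            "G90",                      # absolute positioning
            "M3 S8000",                 # spindle on at 8000 RPM
            "G0 X0 Y0 Z5",              # rapid move to start, tool raised
            f"G1 Z{depth_mm} F{feed}",  # plunge to cutting depth
            f"G1 X{side_mm}",           # cut the four sides
            f"G1 Y{side_mm}",
            "G1 X0",
            "G1 Y0",
            "G0 Z5",                    # retract the tool
            "M5",                       # spindle off
            "M2",                       # end of program
        ]
        return "\n".join(lines)

    if __name__ == "__main__":
        print(square_outline())

Running the script prints the twelve-line program, which could then be streamed to a controller; real jobs differ only in scale, not in kind.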

With computer technology improving and costs dropping, the minicomputers of the 1960s started being used for CNC systems, which sped up programming and decreased the cost of complete systems. By the 1970s, microprocessors had replaced the minicomputers, and these are the controllers still in use today (albeit in higher-quality versions).

Until the early 1970s, most CNC machines were produced in the USA, where the initial research had been done. Slow production rates, however, led other countries such as Germany and Japan to take up CNC machine manufacturing, and by the end of the 1970s both produced more CNC machines than the USA.

In 1989, the EMC (Enhanced Machine Controller) project brought CNC machines out of factories and into private workshops. This development, and the LinuxCNC software that grew from it, meant that anyone with a computer could set up a CNC machine and manufacture whatever parts they wanted.

CNC machines can be built to carve most materials: metals including steel, timber, plastics and styrenes, and even stone.

3D Printing

The first 3D printer was built in Japan in 1981 by Dr. Hideo Kodama, and it laid the foundations for stereolithography (SLA) as the first additive method for manufacturing custom parts. His design used light to cure a photopolymer into a solid mass. It didn’t take off, and the inventor never completed his patent.

French developers picked up the concept and filed a patent for a stereolithography (SLA) process in 1984, but they couldn’t find a commercial application, so the project was abandoned.

In 1986, Charles (Chuck) Hull (USA) picked up the SLA concept and built what is considered the first modern 3D printer. Chuck founded 3D Systems, and the first 3D printing companies were born. One of them, DTM Inc., went on to produce the first SLS (Selective Laser Sintering) machine, which uses lasers to fuse powder into solids instead of curing liquids.

3D printers now existed, but they were out of the price range of most people. Some specialist medical parts were produced, but for most uses the cost was prohibitive.

In 2005, Dr. Adrian Bowyer started the open-source initiative that became the RepRap (Replicating Rapid-Prototyper) project. Its goal was to design a 3D printer that could print its own parts and hence self-replicate. That ultimate objective was never fully realised, but the project did produce several low-cost 3D printer designs, and because they were open source, anyone with enough technical understanding could use the results to build their own 3D printer.

Now 3D printers that print objects in plastic (ABS and PLA) can be bought from stationery shops (Officeworks). Large versions can print houses with specialist concretes. Food (like chocolate) can be 3D printed. Metal is trickier and depends on which metal you want: titanium can be printed with specialist 3D printers, or you can have a 3D printer produce a mould and then have metal objects (even gold) cast from the mould.

Interestingly, the G-code used for CNC machines is the same G-code used for 3D printers; printers just add a few tool customisations to control the extruders and heated plates.
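As a rough illustration of that overlap, this sketch (again in Python, with made-up temperatures and coordinates, not a real print file) prints the start of a hypothetical print job: the G0/G1 motion commands mean exactly what they mean on a mill, while the M104/M140/M109 temperature commands and the E (extruder) axis are the 3D-printing additions used by RepRap/Marlin-style firmware.

    # Sketch of the 3D-printing dialect of G-code. Motion commands
    # (G0/G1, G21, G90) are shared with CNC mills; the temperature
    # commands and the E axis are printer-specific additions.
    # All values are illustrative, not from a real print file.

    preamble = [
        "M140 S60",    # set heated bed to 60 C (printer-specific)
        "M104 S200",   # set extruder to 200 C (printer-specific)
        "M109 S200",   # wait until the extruder reaches temperature
        "G21",         # millimetre units -- same as on a mill
        "G90",         # absolute positioning -- same as on a mill
        "G28",         # home all axes
    ]

    first_moves = [
        "G0 X10 Y10 Z0.2",        # rapid move, identical meaning on a mill
        "G1 X60 Y10 E3.0 F1500",  # linear move, but E feeds the extruder
    ]

    print("\n".join(preamble + first_moves))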

Arc Welders

Welding has been around since the Bronze Age in the form of forge welding, but it remained a specialist profession until electric (arc) welding became available.

The electric arc was first identified in 1800, and independently again in 1802, but it wasn’t until 1881 that a practical arc-welding design appeared. It took a while for arc welding to find its place in manufacturing, gaining practical use during WWI and progressing steadily through to WWII. In practice, arc welding has been well understood since 1961, and although the equipment has improved since then, no major advances in the underlying concepts have occurred.

Robotic welding was first used in 1962 by GM, but it wasn’t until the 1980s that robotic welding became popular, and even then it was mostly in the automotive industry.

With advances in automated mills, anyone who can program a CNC machine can also control a robotic welder.