
How Computers Work, Part I
August 2001• Vol.5 Issue 3
Page(s) 126-131 in print issue

Producing The PC
How Manufacturers Turn A Pile Of Components Into Your Next Computer

The first computer, which was completed in 1946, was big enough to fill an entire room; it was 100 feet long and 10 feet high, and it weighed 30 tons.
Manufacturing a PC involves more than taking a few parts and fitting them inside a case. Although there certainly are businesses that order components and assemble them within a generic case, the industry’s top manufacturers run production lines that demand much more, including keeping abreast of the latest technology.

We spoke with numerous manufacturers to provide some behind-the-scenes information on the process of building a PC, maintaining supplies, and controlling inventory.

Before we start, we should note that our article begins after all the PC’s components have been manufactured and tested. While some PC manufacturers also make their own motherboards, RAM, and other components, most have these parts assembled at plants separate from where the actual PC is put together. For more information on how microprocessors are made, see "Building The Perfect Processor" in this issue.

 Size & Capacity. Manufacturing facilities range anywhere from 75,000 to 200,000 square feet, with between 40 and 60 different configurations being built simultaneously. Industry accounts vary on how many systems might be built each day. At some manufacturing companies, one person builds the entire system; working alone after the motherboard is installed, that person might average 60 units per day for a typical system and 40 per day for a more complex system.

In a group capacity, where each person installs one component, starting with the motherboard, about 10 systems are built per hour, or about 70 systems per day. These numbers vary depending on the number of systems forecasted for the day and their specific configurations. Many of today’s PCs are custom configured for each customer. Manufacturers claim that custom configuration speeds turnover and reduces cycle times, providing customers a product finished to their specifications when they want it.
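The throughput figures above imply a simple rate calculation. The sketch below reproduces it, assuming a seven-hour production window (the article quotes about 10 systems per hour and about 70 per day for a team line, which is consistent with that assumption; the shift length itself is not stated).

```python
# Hypothetical throughput sketch based on the figures quoted above.
# The seven-hour default shift length is an assumption, not a quoted fact.

def systems_per_day(rate_per_hour: float, hours_per_shift: float = 7.0) -> int:
    """Estimate daily output for one assembly line at a steady hourly rate."""
    return int(rate_per_hour * hours_per_shift)

team_output = systems_per_day(10)  # team line: roughly 10 systems per hour
print(team_output)                 # 70, matching the article's daily figure
```

Planners would scale the same arithmetic across 40 to 60 simultaneous configurations when forecasting a day’s output.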

Manufacturers who use a team rather than a single person to build the system believe quality improves because after each person installs his or her component, the next person can inspect the previous work before adding another component. This builds extra inspection into every step, and some manufacturers, such as IBM, use the method for exactly that reason. Everyone is responsible for the final product. Once the system is assembled, the team puts a stamp on it, indicating that they built this system and they stand behind it, much like the Inspected By sticker you find on a new pair of jeans.

 Orders & Suppliers. Behind the manufacturing scene, one person typically handles the scheduling and planning for various configurations, and that person also makes certain the parts are in stock or ordered. This is a vital part of maintaining inventory control. Systems are built as demand requires to avoid a backlog of systems that might not sell. As consumers and businesses order PCs and inventory is assigned to various superstores, the numerous system configurations are tracked daily to determine which systems need to be built.

An important aspect of building top-notch systems is deciding which parts to use from which vendors for every system configuration. Top manufacturers don’t choose components from just any vendor. Choices are based on qualifications, service, and component price, and different vendors are used depending on availability, quality, and the priority of the configuration. Forecasts are reviewed biweekly or monthly to adjust supply, balancing the requirements of buyers and the sales group against vendor availability.

Components come from all over the world, but larger manufacturers usually consider only the top three to five vendors of each component. Manufacturers also tend to choose a parts supplier based on its proximity to the manufacturing site, because managing inventory depends on getting parts delivered on time. Manufacturers expect suppliers to send the right quantity and model mix to feed the manufacturing engine.

The larger PC manufacturers also tend to shy away from single-source suppliers. Manufacturers like to offer various configurations to the customer, such as a choice of four to five hard drive sizes from approved vendors. For the case (shell) of the system, some companies opt to use no more than one or two vendors.

 Assembly. Once the daily orders are in and the parts have come in from suppliers, components are picked individually according to the configuration, and the parts are put into a tote box. The number of parts varies according to how many systems the person or team is expected to produce. The parts are taken to the assembly area, and construction begins. Typically, there are eight to 10 major components: case, motherboard, processor, memory, hard drive, video card, modem or network card, CD-ROM drive, diskette drive, plus options such as a Zip drive, CD-RW drive, or DVD-ROM drive.
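The picking step described above amounts to checking a build configuration against a bill of materials. The sketch below is illustrative only; the component names come from the article’s list, but the function and data structure are invented for the example.

```python
# Illustrative bill-of-materials check for the "pick to tote box" step.
# The set of required components mirrors the article's list of major parts.

REQUIRED = {"case", "motherboard", "processor", "memory", "hard drive",
            "video card", "modem or network card", "CD-ROM drive",
            "diskette drive"}

def missing_parts(tote: set) -> set:
    """Return required components not yet picked into the tote box."""
    return REQUIRED - tote

tote_box = {"case", "motherboard", "processor", "memory"}
print(sorted(missing_parts(tote_box)))  # the five components still to pick
```

Optional parts such as a Zip, CD-RW, or DVD-ROM drive would be added per configuration rather than drawn from the required set.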

Each component undergoes an extensive qualification process, known as quality control, before it is used in a system. These companies have an infrastructure to provide a quality testing process, with years of research devoted strictly to testing systems. This extensive pretesting reduces the likelihood of shipping faulty systems.

Some manufacturers ship the shell and motherboard preinstalled from one location to another for final assembly. Other companies, such as IBM, piece everything together in the plant, including the motherboard.

The first stage of construction consists of inspecting the case for defects or scratches and applying any labels to the case. Next, workers install the motherboard, make the settings for the processor, and install the processor if it didn’t ship on the motherboard. Workers then insert memory and install any internal speakers. If an additional sound card is used, it is added at this time.

Manufacturers attach the hard drive to the proper bracket, making sure to attach it tightly within the case chassis, and the CD-ROM drive follows in the same manner. If the system has a DVD-ROM drive or an internal Zip drive, these components are also installed. The cables to these parts, as well as the cables to the audio card, are then connected so they can communicate with the motherboard.

The next step in the process is installing and connecting the power supply and its connectors. Then workers install any additional adapter cards, including a video card, modem, network card, or other device. Before being inserted into its slot, each card is examined for any manufacturing discrepancies or design flaws.

After all of the parts are in place, the PC is given a second and even a third look to determine that the components are in the proper place, the cables are connected, and everything is secure. These inspectors pay attention to any cables that might be strung across the processor heat sink or resting against the memory slots. The machine is also examined for cleanliness and any visual defects. If the PC passes these tests, the top cover is attached, and it is sent off for further testing.

 Testing. Testing procedures differ from company to company, but one thing seems to be consistent throughout: Once a system is built, it enters a testing arena, where it must pass a rigorous burn-in period that can last from four to 48 hours. This burn-in period is similar to varying the speed of a new car during its first 500 miles on the road. Just before this burn-in, workers configure the system’s CMOS (complementary metal-oxide semiconductor) setup, the small battery-backed memory that stores the BIOS settings for the processor, memory, and drives.

All of the system’s internal components are rigorously tested; this can take between one and two hours depending on the machine configuration. Servers take more time because of their complexity, while consumer PCs are fairly simple. After the burn-in period and additional diagnostic tests, it’s common for companies to recheck all of the components to verify that everything functions properly. If there are speakers, testers make certain the PC plays sound, and the same goes for the CD-ROM drive or any other drive. Manual tests are performed on the keyboard and mouse, and even after the entire system is approved and sent on to the distribution center, it may be randomly inspected for problems before shipping. At this point, the computers are sent out to consumers and businesses or to various computer stores.
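The recheck described above is essentially a sequence of pass/fail component tests. The sketch below shows one way such a sequence might be organized; the test names and functions are invented for illustration, since real factory diagnostics are proprietary.

```python
# Hypothetical sketch of a post-burn-in recheck: run each component test
# in order and report the first failure. All names here are invented.

def run_diagnostics(tests):
    """Run (name, test_fn) pairs; return the first failing name, or None."""
    for name, test_fn in tests:
        if not test_fn():
            return name
    return None

checks = [
    ("speakers", lambda: True),      # stand-ins for real hardware probes
    ("CD-ROM drive", lambda: True),
    ("keyboard", lambda: True),
    ("mouse", lambda: True),
]
print(run_diagnostics(checks))       # None means every check passed
```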

 The Arena Changes. Since building some of the first PCs in the 1980s, manufacturers have seen a lot of changes in the PC arena. First of all, technology has advanced dramatically, with improvements and new technologies being introduced at an amazing rate.

According to Wes Montgomery, a business operations manager at IBM, PCs, much like televisions, are a commodity market, which means the manufacturer with the best parts wins. At IBM, Montgomery stresses the importance of maintaining inventory control by managing the parts.

The average computer manufacturing plant can be as large as 200,000 square feet, or more than four football fields.
Ten years ago, companies with solid manufacturing and a great PC design had the competitive edge, but now it comes down to inventory control to maintain low overhead, Montgomery says. Most of the companies we spoke with say they stand behind a strict quality control regimen. They conduct incoming quality inspections of parts, keep suppliers nearby for replenishment, and pull parts from them as needed to maintain inventory control. Much of the burden is placed on the suppliers to manage the inventory, and most companies have an internal organization to ensure proper turnover.

Once the manufacturer decides from whom to buy parts and how to organize inventory control, the next step is to employ personnel who can handle the daunting tasks at hand.

According to Nancy Henry-Serra, manager of business operations, Personal Systems Group, Manufacturing, Distribution, and Fulfillment at IBM, the key driver in manufacturing is head count. Automated assembly simply can’t keep up with the rapid changes in technology, so PCs are built almost entirely by hand.

From IBM’s perspective, a qualified candidate can be trained to start assembling systems in three to five days. During the times when manufacturing is aggressive, such as at the end of each quarter and especially the end of the year, some companies employ a temporary workforce to meet demands.

 History. A century of ideas and research led to the PC manufacturing process as it exists today. We dove into the history files to trace the origins of the computer and the first manufacturing lines.

Like any invention, the computer began as a few creative ideas. In 1822, Charles Babbage, a mathematician and university lecturer, earned the “Father of Computing” title by designing the Difference Engine. Babbage imagined that the Difference Engine would be so large it would fill an entire room and be able to calculate and then print mathematical tables.

The English government, with the Duke of Wellington at the helm, decided the Difference Engine was a worthwhile idea, and in 1829, Joseph Clement, an engineer, was hired to build it. However, Babbage and Clement had a falling out, and the engine was never completed. But Babbage’s idea didn’t die there. The concept was finally realized in the form of the Tabulating Machine in 1853 by Georg and Edvard Scheutz, a father-and-son team who had studied Babbage’s ideas.

In 1834, Babbage conceived the Analytical Engine, which could store programs, add numbers in about three seconds, and solve a multiplication or division problem in two to four minutes. The design included punch cards for input, which early computers eventually used. Although Babbage built a partial prototype in the year of his death, some experts declared the Analytical Engine impossible to build, and so it was forgotten for quite some time.

Nearly 100 years passed from when the first computer was conceptualized to when it was finally completed in 1946 by J. Presper Eckert and John W. Mauchly, with funding from the U.S. Army. The system, known as the ENIAC (Electronic Numerical Integrator and Computer), took up an entire room and conducted 300 multiplications per second and 5,000 additions per second. It used 1,000 square feet of space and was made up of 17,468 vacuum tubes. It was used for testing hydrogen bomb theories, predicting weather, and calculating artillery trajectories.

The next computer, the ERA 1101, arrived in 1950 and was used by the U.S. Navy. Intended for commercial production, it stored information as magnetic pulses on a magnetic drum that held 1 million bits.

In 1952, IBM designed the 701, its first electronic computer, which used vacuum tube technology. It was introduced to the workplace as an innovative method of figuring payroll and billing. IBM continued to hone computer technology and constructed the first mass-produced computer, the 650, a magnetic drum calculator.

With the 1954 invention of the TRADIC (Transistorized Airborne Digital Computer) by AT&T Bell Laboratories, transistors replaced vacuum tubes. By using transistors, the overall size and the power needed to run the computer were reduced, and the resulting system was 20 times faster than computers made with vacuum tubes.

In 1956, technology took another turn with the creation of the RAMAC (Random Access Method of Accounting and Control). The RAMAC’s name sounds a lot like the random-access memory we see in current systems, but it was actually the first disk storage system, developed by IBM and consisting of 50 spinning platters that together stored 5 million characters of data.

Computer families started appearing in 1964 with the IBM System/360. IBM offered 19 different configurations, something that had never been done before. Consumers could suddenly upgrade hardware, and software was interchangeable between any of the 19 configurations. It proved successful for IBM. With orders for 1,000 or more System/360s a month, the manufacturing line was perfected. IBM had mass-produced systems in the past, but nothing compared to the popularity of this line.

In 1964, Douglas Engelbart developed the first computer mouse, though the mouse wasn’t widely used until the 1980s with the Apple computer. During the 1970s, Intel introduced the first commercially successful DRAM chip. IBM designed the 8-inch diskette in 1971, and Shugart Associates followed with the 5.25-inch diskette in 1976. Intel released its first microprocessor, the 4004, in 1971. Designed by a team led by Marcian E. Hoff, the 4004 ran at what was then considered an amazing speed of 108KHz.

With the invention of the microprocessor, new systems arrived yearly, including the Apple I computer in 1976 and the Apple II computer in 1977. The Apple II had plenty to offer, with a motherboard, a microprocessor, 16KB of memory, a 5.25-inch diskette drive, and a game. This early computer could be connected to a TV set, and it was bundled in a desktop case.

Other components vital to the PC manufacturing process appeared in the 1980s. Seagate Technology created the first hard drive for personal computers in 1980; it stored 5MB of information via a metallic-coated platter. Optical storage, which records data by using a laser to burn marks onto the disc, was also introduced in 1980. Although optical discs didn’t become popular overnight, they could hold 60 times the capacity of a 5.25-inch diskette.

IBM introduced its first PC in 1981. The PC used a 4.77MHz processor and 16KB of memory that could be expanded to 256KB. The IBM PC could be equipped with a color monitor, ran the MS-DOS operating system (which IBM sold as PC-DOS), and contained one or two diskette drives.

Suddenly, computers started arriving in businesses, schools, and even homes. From this point on, other manufacturers that assembled similar products running the same software platforms were considered “IBM clones.” During this time, IBM essentially invented the PC manufacturing business to keep up with the many orders.

Sony invented the 3.5-inch diskette drive and diskette in 1981. These drives are still used in the majority of today’s systems; each diskette holds 1.44MB of data. In 1985, Sony and Philips introduced CD-ROMs, which hold about 650MB of data. It wasn’t until the 1990s that the majority of storage components used in current systems, including DVD-ROM and Zip drives, arrived on the scene.

As these new technologies appeared, they had to be implemented into the manufacturing line. Older manufacturing lines consisted of a long serial flow line, due to the large sizes and the number of machines. It took a lot of time to introduce a new part, and managing the inventory was a challenge. Now, new components can be introduced in just a few days.

 From Parts To PC. With new technology around every corner, current manufacturing lines are fine-tuned machines ready to implement any component. When you’re purchasing your next PC, think about its components and the organization and skill that go into each system. Whether you’re checking for Inspected By labels or looking at the case construction, there’s bound to be some history behind every PC.

by Buffy Cranford-Petelle

View the graphics that accompany this article.
Inside Your Computer
How Companies Manufacture PCs


© Copyright by Sandhills Publishing Company 2001. All rights reserved.