ERIC Number: ED305434
Record Type: RIE
Publication Date: 1988-Dec
Pages: 16
Abstractor: N/A
Reference Count: N/A
A History of Computer Numerical Control.
Haggen, Gilbert L.
Computer numerical control (CNC) has evolved from the first significant counting device--the abacus. Babbage, with his analytical engine, had perhaps the greatest impact on the development of the modern computer. Hollerith's punched-card tabulating machine was used to tabulate the 1890 U.S. Census. Before computers could become a reality, the binary number system, Boolean algebra, and electromechanical circuitry--foundations of every digital computer--had to be discovered, invented, or developed. During World War II, IBM built a computer that used simple electromechanical relays as on-off switching devices and punched tape to supply the information needed to manipulate data. The 1950s saw the start of two related trends--the building of magnetic memory and the development of the transistor. Through the 1960s, the integrated circuit, microcomputers, and memory chips were developed. Further advances in chips, affordable computers, and software occurred in the 1970s. In 1976, CNC made its debut at the Chicago Trade Show. As computer chips became smaller and more powerful, CNC machines could be applied to almost any process involving X, Y, Z coordinates. Examples of contemporary CNC machines are the BostoMatic Model 405, Cincinnati Milacron's Acramatic 760 G, and GE Fanuc Automation's CNC. Future developments will include more artificial intelligence and the clone-type brain. (YLB)
Publication Type: Speeches/Meeting Papers; Historical Materials
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Note: Paper presented at the American Vocational Association Convention (St. Louis, MO, December 1988).