
Electrical engineering definition and history

September 27, 2012

Posted by AriantsBaMs in Ilmu Pengetahuan (Science), IT.

Electrical engineering is a field of engineering that generally deals with the study and application of electricity, electronics and electromagnetism. The field first became an identifiable occupation in the late nineteenth century after commercialization of the electric telegraph and electrical power supply. It now covers a range of subtopics including power, electronics, control systems, signal processing and telecommunications.

Electrical engineering may include electronic engineering. Where a distinction is made, usually outside of the United States, electrical engineering is considered to deal with the problems associated with systems such as power transmission and electrical machines, whereas electronic engineering deals with the study of electronic systems including computers and integrated circuits.[1] Alternatively, electrical engineers are usually concerned with using electricity to transmit energy, while electronic engineers are concerned with using electricity to process information. The sub-disciplines can overlap, for example in the growth of power electronics and in the study of the behavior of large electrical grids.


History

Main article: History of electrical engineering

The discoveries of Michael Faraday formed the foundation of electric motor technology.

Electricity has been a subject of scientific interest since at least the early 17th century. The first electrical engineer was probably William Gilbert, who designed the versorium: a device that detected the presence of statically charged objects. He was also the first to draw a clear distinction between magnetism and static electricity and is credited with establishing the term electricity.[2] In 1775 Alessandro Volta's scientific experiments yielded the electrophorus, a device that produced a static electric charge, and by 1800 Volta had developed the voltaic pile, a forerunner of the electric battery.[3]

However, it was not until the 19th century that research into the subject started to intensify. Notable developments in this century include the work of Georg Ohm, who in 1827 quantified the relationship between the electric current and the potential difference in a conductor; Michael Faraday, the discoverer of electromagnetic induction in 1831; and James Clerk Maxwell, who in 1873 published a unified theory of electricity and magnetism in A Treatise on Electricity and Magnetism.[4]
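
As a brief aside (not part of the original article), Ohm's 1827 result and Maxwell's unified theory can be sketched in modern notation; the vector form of Maxwell's equations given below is the later Heaviside formulation rather than the notation of the 1873 treatise:

\[ V = I R \]

\[ \nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} \]

Here V is the potential difference across a conductor, I the current through it and R its resistance; E and B are the electric and magnetic fields, ρ the charge density, J the current density, and ε0 and μ0 the permittivity and permeability of free space.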

From the 1830s, efforts were made to apply electricity to practical use in telegraphy. By the end of the 19th century the world had been forever changed by the rapid communication made possible by engineering development of land-line, underwater and, eventually, wireless telegraphy.

Practical applications and advances in such fields created an increasing need for standardized units of measure; this led to the international standardization of the units ohm, volt, ampere, coulomb, and watt. This was achieved at an international conference in Chicago in 1893.[5] The publication of these standards formed the basis of future advances in standardization in various industries, and in many countries the definitions were immediately recognized in relevant legislation.[6]

Thomas Edison built the world’s first large-scale electrical supply network.

During these years, the study of electricity was largely considered to be a subfield of physics. It was not until the late 19th century that universities started to offer degrees in electrical engineering. The Darmstadt University of Technology founded the first chair and the first faculty of electrical engineering worldwide in 1882. In the same year, under Professor Charles Cross, the Massachusetts Institute of Technology began offering the first option of Electrical Engineering within a physics department.[7] In 1883 Darmstadt University of Technology and Cornell University introduced the world's first courses of study in electrical engineering, and in 1885 University College London founded the first chair of electrical engineering in the United Kingdom.[8] The University of Missouri subsequently established the first department of electrical engineering in the United States in 1886.[9]

Nikola Tesla made long-distance electrical transmission networks possible.

During this period, work concerning electrical engineering increased dramatically. In 1882, Edison switched on the world's first large-scale electrical supply network, which provided 110 volts of direct current to fifty-nine customers in lower Manhattan. In 1884 Sir Charles Parsons invented the steam turbine, which today generates about 80 percent of the electric power in the world using a variety of heat sources. In 1887, Nikola Tesla filed a number of patents related to a competing form of power distribution known as alternating current. In the following years a bitter rivalry between Tesla and Edison, known as the "War of Currents", took place over the preferred method of distribution. AC eventually replaced DC for generation and power distribution, enormously extending the range and improving the safety and efficiency of power distribution.
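
As a rough illustration (not in the original text) of why AC won out: for a fixed delivered power P over a line of resistance R, the resistive loss in the line is

\[ P_{\text{loss}} = I^2 R = \left(\frac{P}{V}\right)^2 R, \]

so raising the transmission voltage V tenfold, which AC transformers made practical long before comparable DC conversion existed, cuts line losses by a factor of roughly one hundred. This is what "extending the range" of distribution amounts to in practice.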

The efforts of the two did much to further electrical engineering—Tesla’s work on induction motors and polyphase systems influenced the field for years to come, while Edison’s work on telegraphy and his development of the stock ticker proved lucrative for his company, which ultimately became General Electric. However, by the end of the 19th century, other key figures in the progress of electrical engineering were beginning to emerge.[10]

Modern developments

During the development of radio, many scientists and inventors contributed to radio technology and electronics. In his classic UHF experiments of 1888, Heinrich Hertz transmitted (via a spark-gap transmitter) and detected radio waves using electrical equipment. In 1895, Nikola Tesla was able to detect signals from the transmissions of his New York lab at West Point (a distance of 80.4 km / 49.95 miles).[11] In 1897, Karl Ferdinand Braun introduced the cathode ray tube as part of an oscilloscope, a crucial enabling technology for electronic television.[12] John Fleming invented the first radio tube, the diode, in 1904. Two years later, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode.[13] In 1895, Guglielmo Marconi furthered the art of Hertzian wireless methods. Early on, he sent wireless signals over a distance of one and a half miles. In December 1901, he sent wireless waves that were not affected by the curvature of the Earth. Marconi later transmitted wireless signals across the Atlantic between Poldhu, Cornwall, and St. John's, Newfoundland, a distance of 2,100 miles (3,400 km).[14]

In 1920 Albert Hull developed the magnetron, which would eventually lead to the development of the microwave oven in 1946 by Percy Spencer.[15][16] In 1934 the British military began to make strides toward radar (which also uses the magnetron) under the direction of Dr Wimperis, culminating in the operation of the first radar station at Bawdsey in August 1936.[17]

In 1941 Konrad Zuse presented the Z3, the world’s first fully functional and programmable computer using electromechanical parts. In 1943 Tommy Flowers designed and built the Colossus, the world’s first fully functional, electronic, digital and programmable computer.[18] In 1946 the ENIAC (Electronic Numerical Integrator and Computer) of John Presper Eckert and John Mauchly followed, beginning the computing era. The arithmetic performance of these machines allowed engineers to develop completely new technologies and achieve new objectives, including the Apollo program which culminated in landing people on the Moon.[19]

The invention of the transistor in 1947 by William B. Shockley, John Bardeen and Walter Brattain opened the door for more compact devices and led to the development of the integrated circuit in 1958 by Jack Kilby and independently in 1959 by Robert Noyce.[20] Starting in 1968, Ted Hoff and a team at Intel invented the first commercial microprocessor, which presaged the personal computer. The Intel 4004 was a 4-bit processor released in 1971, but it was the Intel 8080, an 8-bit processor released in 1974, that made the first personal computer, the Altair 8800, possible.[21]
