Stochastic Process


devices did not become standard in meteorology for two centuries.[3] The concept has remained virtually unchanged as evidenced by pneumatic chart recorders, where a pressurized bellows displaces a pen. Integrating sensors, displays, recorders, and controls was uncommon until the industrial revolution, limited by both need and practicality.

Early industrial[edit]

The evolution of analogue control loop signalling from the pneumatic era to the electronic era

Early systems used direct process connections to local control panels for control and
indication, which from the early 1930s saw the introduction of pneumatic transmitters
and automatic 3-term (PID) controllers.

The ranges of pneumatic transmitters were defined by the need to control valves and actuators in the field. Typically, a signal ranged from 3 to 15 psi (20 to 100 kPa or 0.2 to 1.0 kg/cm²) as a standard, with 6 to 30 psi occasionally being used for larger valves. Transistor electronics enabled wiring to replace pipes, initially with a range of 20 to 100 mA at up to 90 V for loop-powered devices, reducing to 4 to 20 mA at 12 to 24 V in more modern systems. A transmitter is a device that produces an output signal, often in the form of a 4–20 mA electrical current signal, although many other options using voltage, frequency, pressure, or Ethernet are possible. The transistor was commercialized by the mid-1950s.[4]
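As a rough illustration of how a live-zero 4–20 mA loop signal maps to a measured quantity, the sketch below converts a loop current into an engineering value by linear scaling. The range values, fault margin, and function name are illustrative assumptions, not part of any standard or library.

    def current_to_value(current_ma, low=4.0, high=20.0, range_min=0.0, range_max=100.0):
        """Linearly scale a 4-20 mA loop current to an engineering value.

        A reading well below the 4 mA "live zero" usually indicates a broken
        loop, so it is reported as a fault rather than extrapolated.
        """
        if current_ma < low - 0.5:          # allow a small margin below 4 mA
            raise ValueError("loop current below live zero: possible open circuit")
        fraction = (current_ma - low) / (high - low)
        return range_min + fraction * (range_max - range_min)

    # Example: a pressure transmitter spanned 0-10 bar reading 12 mA -> 5.0 bar
    print(current_to_value(12.0, range_min=0.0, range_max=10.0))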

Instruments attached to a control system provided signals used to operate solenoids, valves, regulators, circuit breakers, relays and other devices. Such devices could control a desired output variable, and provide either remote monitoring or automated control capabilities.

Each instrument company introduced its own standard instrumentation signal, causing confusion until the 4–20 mA range was adopted as the standard electronic instrument signal for transmitters and valves. This signal was eventually standardized as ANSI/ISA S50, "Compatibility of Analog Signals for Electronic Industrial Process Instruments", in the 1970s. The transformation of instrumentation from mechanical pneumatic transmitters, controllers, and valves to electronic instruments reduced maintenance costs, as electronic instruments were more dependable than mechanical instruments. It also increased efficiency and production because of their greater accuracy. Pneumatics retained some advantages, being favored in corrosive and explosive atmospheres.[5]

Automatic process control[edit]

Example of a single industrial control loop, showing continuously modulated control of process
flow

In the early years of process control, process indicators and control elements such as valves were monitored by an operator who walked around the unit adjusting the valves to obtain the desired temperatures, pressures, and flows. As technology evolved, pneumatic controllers were invented and mounted in the field to monitor the process and control the valves. This reduced the amount of time process operators needed to monitor the process. In later years the controllers were moved to a central room: signals were sent into the control room to monitor the process, and output signals were sent to the final control element, such as a valve, to adjust the process as needed. These controllers and indicators were mounted on a wall called a control board. The operators stood in front of this board, walking back and forth while monitoring the process indicators. This further reduced the number of operators needed and the time spent walking around the units. The most common pneumatic signal level used during these years was 3–15 psig.[6]

Large integrated computer-based systems[edit]


Pneumatic "three-term" PID controller, widely used before electronics became reliable, cheaper, and safe to use in hazardous areas (Siemens Telepneu example)

A pre-DCS/SCADA era central control room. Whilst the controls are centralised in one place, they
are still discrete and not integrated into one system.

A DCS control room where plant information and controls are displayed on computer graphics
screens. The operators are seated and can view and control any part of the process from their
screens, whilst retaining a plant overview.

Process control of large industrial plants has evolved through many stages. Initially,
control would be from panels local to the process plant. However, this required a large
manpower resource to attend to these dispersed panels, and there was no overall view
of the process. The next logical development was the transmission of all plant
measurements to a permanently staffed central control room. Effectively this was the
centralization of all the localized panels, with the advantages of lower manning levels
and easy overview of the process. Often the controllers were behind the control room
panels, and all automatic and manual control outputs were transmitted back to plant.

However, whilst providing a central control focus, this arrangement was inflexible as
each control loop had its own controller hardware, and continual operator movement
within the control room was required to view different parts of the process. With the coming of electronic processors and graphic displays, it became possible to replace these
discrete controllers with computer-based algorithms, hosted on a network of
input/output racks with their own control processors. These could be distributed around
plant, and communicate with the graphic display in the control room or rooms. The
distributed control concept was born.

The introduction of DCSs and SCADA allowed easy interconnection and re-
configuration of plant controls such as cascaded loops and interlocks, and easy
interfacing with other production computer systems. It enabled sophisticated alarm
handling, introduced automatic event logging, removed the need for physical records
such as chart recorders, allowed the control racks to be networked and thereby located
locally to plant to reduce cabling runs, and provided high level overviews of plant status
and production levels.

Application[edit]
In some cases, the sensor is a very minor element of the mechanism. Digital cameras
and wristwatches might technically meet the loose definition of instrumentation because
they record and/or display sensed information. Under most circumstances neither would
be called instrumentation, but when used to measure the elapsed time of a race and to
document the winner at the finish line, both would be called instrumentation.

Household[edit]

A very simple example of an instrumentation system is a mechanical thermostat, used to control a household furnace and thus to control room temperature. A typical unit senses temperature with a bi-metallic strip. It displays temperature by a needle on the free end of the strip. It activates the furnace by a mercury switch. As the switch is rotated by the strip, the mercury makes physical (and thus electrical) contact between electrodes.
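The on/off behaviour described above can be sketched in a few lines of code. The deadband value and function name below are illustrative assumptions used only to show the idea of switching with hysteresis, not a description of any particular thermostat.

    def thermostat_step(room_temp, setpoint, furnace_on, deadband=0.5):
        """Return the new furnace state for one control step.

        The deadband (hysteresis) keeps the switch from chattering on and
        off when the temperature hovers around the setpoint.
        """
        if room_temp < setpoint - deadband:
            return True          # too cold: close the switch, fire the furnace
        if room_temp > setpoint + deadband:
            return False         # warm enough: open the switch
        return furnace_on        # inside the deadband: keep the previous state

    # Example: 19.2 C in a room set to 20 C turns the furnace on
    print(thermostat_step(19.2, 20.0, furnace_on=False))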

Another example of an instrumentation system is a home security system. Such a system consists of sensors (motion detection, switches to detect door openings), simple algorithms to detect intrusion, local control (arm/disarm) and remote monitoring of the system so that the police can be summoned. Communication is an inherent part of the design.
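A minimal sketch of the "simple algorithm to detect intrusion" might look like the following; the sensor names and the alarm action are hypothetical placeholders, not taken from any actual product.

    def check_intrusion(armed, door_open, motion_detected):
        """Raise an alarm only when the system is armed and a sensor trips."""
        return armed and (door_open or motion_detected)

    # Example: armed system, motion sensor trips -> alarm condition
    if check_intrusion(armed=True, door_open=False, motion_detected=True):
        print("ALARM: notify monitoring centre")   # stand-in for the remote alert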

Kitchen appliances use sensors for control.


● A refrigerator maintains a constant temperature by actuating the cooling
system when the temperature becomes too high.
● An automatic ice machine makes ice until a limit switch is thrown.
● Pop-up bread toasters allow the time to be set.
● Non-electronic gas ovens will regulate the temperature with a thermostat
controlling the flow of gas to the gas burner. These may feature a sensor bulb
sited within the main chamber of the oven. In addition, there may be a safety
cut-off flame supervision device: after ignition, the burner's control knob must
be held for a short time in order for a sensor to become hot, and permit the
flow of gas to the burner. If the safety sensor becomes cold, this may indicate
the flame on the burner has become extinguished, and to prevent a
continuous leak of gas the flow is stopped.
● Electric ovens use a temperature sensor and will turn on heating elements
when the temperature is too low. More advanced ovens will actuate fans in
response to temperature sensors, to distribute heat or to cool.
● A common toilet refills the water tank until a float closes the valve. The float is
acting as a water level sensor.
Automotive[edit]

Modern automobiles have complex instrumentation. In addition to displays of engine rotational speed and vehicle linear speed, there are also displays of battery voltage and
current, fluid levels, fluid temperatures, distance traveled, and feedback of various
controls (turn signals, parking brake, headlights, transmission position). Cautions may
be displayed for special problems (fuel low, check engine, tire pressure low, door ajar,
seat belt unfastened). Problems are recorded so they can be reported to diagnostic
equipment. Navigation systems can provide voice commands to reach a destination.
Automotive instrumentation must be cheap and reliable over long periods in harsh
environments. There may be independent airbag systems that contain sensors, logic
and actuators. Anti-skid braking systems use sensors to control the brakes, while cruise
control affects throttle position. A wide variety of services can be provided via
communication links on the OnStar system. Autonomous cars (with exotic
instrumentation) have been shown.

Aircraft[edit]
Early aircraft had a few sensors.[7] "Steam gauges" converted air pressures into needle
deflections that could be interpreted as altitude and airspeed. A magnetic compass
provided a sense of direction. The displays to the pilot were as critical as the
measurements.

A modern aircraft has a far more sophisticated suite of sensors and displays, which are
embedded into avionics systems. The aircraft may contain inertial navigation systems,
global positioning systems, weather radar, autopilots, and aircraft stabilization systems.
Redundant sensors are used for reliability. A subset of the information may be
transferred to a crash recorder to aid mishap investigations. Modern pilot displays now
include computer displays including head-up displays.

Air traffic control radar is a distributed instrumentation system. The ground part sends
an electromagnetic pulse and receives an echo (at least). Aircraft carry transponders
that transmit codes on reception of the pulse. The system displays an aircraft map
location, an identifier and optionally altitude. The map location is based on sensed
antenna direction and sensed time delay. The other information is embedded in the
transponder transmission.
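To make the geometry concrete, the sketch below estimates an aircraft's map position from the sensed antenna bearing and the pulse round-trip delay, as the paragraph describes. It is a simplified flat-earth illustration, and the variable names and example numbers are assumptions.

    import math

    C = 299_792_458.0  # speed of light, m/s

    def radar_position(bearing_deg, round_trip_s):
        """Estimate target position (east, north) in metres relative to the radar.

        Range is half the round-trip distance of the pulse; bearing is measured
        clockwise from north, as sensed from the antenna direction.
        """
        slant_range = C * round_trip_s / 2.0
        bearing = math.radians(bearing_deg)
        east = slant_range * math.sin(bearing)
        north = slant_range * math.cos(bearing)
        return east, north

    # Example: echo received 400 microseconds after transmission, antenna at 045 degrees
    print(radar_position(45.0, 400e-6))   # roughly 42 km east and 42 km north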

Laboratory instrumentation[edit]

Among the possible uses of the term is a collection of laboratory test equipment controlled by a computer through an IEEE-488 bus (also known as GPIB, for General Purpose Interface Bus, or HP-IB, for Hewlett-Packard Interface Bus). Laboratory equipment is available to measure many electrical and chemical quantities. Such a collection of equipment might be used to automate the testing of drinking water for pollutants.
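As an illustration of computer-controlled test equipment on a GPIB bus, the sketch below uses the PyVISA library to identify an instrument and take one reading. The bus address and the assumption that the instrument understands SCPI commands such as *IDN? and MEAS:VOLT:DC? are illustrative, not taken from the text.

    # Requires the pyvisa package and a VISA backend (e.g. pyvisa-py or a vendor library).
    import pyvisa

    rm = pyvisa.ResourceManager()
    # Hypothetical instrument at GPIB address 12 on interface board 0.
    dmm = rm.open_resource("GPIB0::12::INSTR")

    print(dmm.query("*IDN?"))                     # ask the instrument to identify itself
    voltage = float(dmm.query("MEAS:VOLT:DC?"))   # SCPI query for a DC voltage reading
    print(f"Measured {voltage:.4f} V")

    dmm.close()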

Instrumentation engineering[edit]

The instrumentation part of a piping and instrumentation diagram will be developed by an instrumentation engineer.

Instrumentation engineering is the engineering specialization focused on the principles and operation of measuring instruments that are used in the design and configuration of automated systems in areas such as the electrical and pneumatic domains, and on the control of the quantities being measured. Instrumentation engineers typically work for industries

Telecommunications engineering[edit]

Satellite dishes are a crucial component in the analysis of satellite information.

Telecommunications engineering focuses on the transmission of information across a communication channel such as a coax cable, optical fiber or free space.[61] Transmissions across free space require information to be encoded in a carrier signal to shift the information to a carrier frequency suitable for transmission; this is known as modulation. Popular analog modulation techniques include amplitude modulation and frequency modulation.[62] The choice of modulation affects the cost and performance of a system and these two factors must be balanced carefully by the engineer.
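As a small worked example of the modulation idea, the snippet below builds an amplitude-modulated waveform from a low-frequency message and a higher-frequency carrier using NumPy. The chosen frequencies and modulation index are arbitrary illustration values.

    import numpy as np

    fs = 48_000                      # sample rate, Hz
    t = np.arange(0, 0.01, 1 / fs)   # 10 ms of samples

    f_message = 1_000                # 1 kHz message tone (assumed)
    f_carrier = 10_000               # 10 kHz carrier (assumed)
    m = 0.5                          # modulation index

    message = np.cos(2 * np.pi * f_message * t)
    carrier = np.cos(2 * np.pi * f_carrier * t)

    # Standard AM: the message varies the envelope of the carrier.
    am_signal = (1 + m * message) * carrier

    print(am_signal[:5])             # first few samples of the modulated waveform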

Once the transmission characteristics of a system are determined, telecommunication engineers design the transmitters and receivers needed for such systems. These two are sometimes combined to form a two-way communication device known as a transceiver. A key consideration in the design of transmitters is their power consumption as this is closely related to their signal strength.[63][64] Typically, if the power of the transmitted signal is insufficient once the signal arrives at the receiver's antenna(s), the information contained in the signal will be corrupted by noise, specifically static.
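The received-power concern in that paragraph can be illustrated with a basic free-space link-budget check: received power falls with distance, and if it drops toward the receiver's noise floor the signal is effectively lost. The formula is the standard Friis free-space path loss; the transmit power, frequency, and noise floor below are made-up example values.

    import math

    def free_space_path_loss_db(distance_m, freq_hz):
        """Friis free-space path loss in dB for an isotropic link."""
        c = 299_792_458.0
        return (20 * math.log10(distance_m)
                + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / c))

    tx_power_dbm = 30.0          # 1 W transmitter (assumed)
    noise_floor_dbm = -100.0     # receiver noise floor (assumed)
    freq_hz = 900e6              # 900 MHz carrier (assumed)

    for distance in (1_000, 10_000, 100_000):   # metres
        rx_power = tx_power_dbm - free_space_path_loss_db(distance, freq_hz)
        margin = rx_power - noise_floor_dbm
        print(f"{distance/1000:>6.0f} km: received {rx_power:6.1f} dBm, margin {margin:5.1f} dB")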

Control engineering[edit]

Main articles: Control engineering and Control theory


Control systems play a critical role in spaceflight.

Control engineering focuses on the modeling of a diverse range of dynamic systems and the design of controllers that will cause these systems to behave in the desired manner.[65] To implement such controllers, electronics control engineers may use electronic circuits, digital signal processors, microcontrollers, and programmable logic controllers (PLCs). Control engineering has a wide range of applications from the flight and propulsion systems of commercial airliners to the cruise control present in many modern automobiles.[66] It also plays an important role in industrial automation.

Control engineers often use feedback when designing control systems. For example, in an automobile with cruise control the vehicle's speed is continuously monitored and fed back to the system, which adjusts the motor's power output accordingly.[67] Where there is regular feedback, control theory can be used to determine how the system responds to such feedback.
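To show the feedback idea concretely, the sketch below simulates a crude cruise-control loop in which a proportional-integral controller adjusts engine force to hold a speed setpoint. The vehicle model, gains, and time step are simplified assumptions made for illustration only.

    def simulate_cruise_control(setpoint=25.0, steps=200, dt=0.1,
                                kp=800.0, ki=40.0, mass=1200.0, drag=0.4):
        """Simulate a PI speed controller acting on a simple point-mass vehicle."""
        speed = 20.0          # initial speed, m/s
        integral = 0.0
        for _ in range(steps):
            error = setpoint - speed              # feedback: measured speed vs setpoint
            integral += error * dt
            force = kp * error + ki * integral    # PI control law -> engine force, N
            accel = (force - drag * speed**2) / mass
            speed += accel * dt
        return speed

    # After 20 s of simulated driving the speed should settle near the 25 m/s setpoint.
    print(f"final speed: {simulate_cruise_control():.2f} m/s")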

Control engineers also work in robotics to design autonomous systems using control algorithms which interpret sensory feedback to control actuators that move robots such as autonomous vehicles, autonomous drones and others used in a variety of industries.[68]

Electronics[edit]

Main article: Electronic engineering


Electronic components

Electronic engineering involves the design and testing of electronic circuits that use the properties of components such as resistors, capacitors, inductors, diodes, and transistors to achieve a particular functionality.[60] The tuned circuit, which allows the user of a radio to filter out all but a single station, is just one example of such a circuit. Another example is a pneumatic signal conditioner.
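For the tuned-circuit example, the resonant frequency of an ideal LC circuit follows f = 1/(2*pi*sqrt(LC)); the snippet below evaluates it for a component pair typical of the AM broadcast band. The specific inductance and capacitance values are illustrative assumptions.

    import math

    def resonant_frequency(inductance_h, capacitance_f):
        """Resonant frequency of an ideal LC tuned circuit, in hertz."""
        return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

    # Example: 200 uH coil with a 330 pF tuning capacitor
    f0 = resonant_frequency(200e-6, 330e-12)
    print(f"resonant frequency: {f0/1e3:.0f} kHz")   # roughly 620 kHz, in the AM band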

Prior to the Second World War, the subject was commonly known as radio engineering and basically was restricted to aspects of communications and radar, commercial radio, and early television.[60] Later, in post-war years, as consumer devices began to be developed, the field grew to include modern television, audio systems, computers, and microprocessors. In the mid-to-late 1950s, the term radio engineering gradually gave way to the name electronic engineering.

Before the invention of the integrated circuit in 1959,[69] electronic circuits were constructed from discrete components that could be manipulated by humans. These discrete circuits consumed much space and power and were limited in speed, although they are still common in some applications. By contrast, integrated circuits packed a large number—often millions—of tiny electrical components, mainly transistors,[70] into a small chip around the size of a coin. This allowed for the powerful computers and other electronic devices we see today.

Microelectronics and nanoelectronics[edit]

Main articles: Microelectronics, Nanoelectronics, and Chip design


Microprocessor

Microelectronics engineering deals with the design and microfabrication of very small electronic circuit components for use in an integrated circuit or sometimes for use on their own as a general electronic component.[71] The most common microelectronic components are semiconductor transistors, although all main electronic components (resistors, capacitors etc.) can be created at a microscopic level.

Nanoelectronics is the further scaling of devices down to nanometer levels. Modern devices are already in the nanometer regime, with below 100 nm processing having been standard since around 2002.[72]

Microelectronic components are created by chemically fabricating wafers of semiconductors such as silicon (at higher frequencies, compound semiconductors like gallium arsenide and indium phosphide) to obtain the desired transport of electronic charge and control of current. The field of microelectronics involves a significant amount of chemistry and material science and requires the electronic engineer working in the field to have a very good working knowledge of the effects of quantum mechanics.[73]

Signal processing[edit]

Main article: Signal processing


A Bayer filter on a CCD requires signal processing to get a red, green, and blue value at each
pixel.

Signal processing deals with the analysis and manipulation of signals.[74] Signals can be either analog, in which case the signal varies continuously according to the information, or digital, in which case the signal varies according to a series of discrete values representing the information. For analog signals, signal processing may involve the amplification and filtering of audio signals for audio equipment or the modulation and demodulation of signals for telecommunications. For digital signals, signal processing may involve the compression, error detection and error correction of digitally sampled signals.[75]
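As a small example of digital signal processing as described above, the snippet below applies a simple moving-average FIR filter to smooth a noisy sampled signal using NumPy. The window length and the synthetic test signal are arbitrary choices for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    fs = 1_000                              # sample rate, Hz
    t = np.arange(0, 1, 1 / fs)
    clean = np.sin(2 * np.pi * 5 * t)       # 5 Hz tone
    noisy = clean + 0.3 * rng.standard_normal(t.size)

    # Moving-average FIR filter: each output sample is the mean of the last N inputs.
    N = 21
    kernel = np.ones(N) / N
    smoothed = np.convolve(noisy, kernel, mode="same")

    print(f"noise power before: {np.var(noisy - clean):.3f}")
    print(f"noise power after:  {np.var(smoothed - clean):.3f}")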

Signal processing is a very mathematically oriented and intensive area forming the core
of digital signal processing and it is rapidly expanding with new applications in every
field of electrical engineering such as communications, control, radar, audio
engineering, broadcast engineering, power electronics, and biomedical engineering as
many already existing analog systems are replaced with their digital counterparts.
Analog signal processing is still important in the design of many control systems.

DSP processor ICs are found in many types of modern electronic devices, such as digital television sets,[76] radios, hi-fi audio equipment, mobile phones, multimedia players, camcorders and digital cameras, automobile control systems, noise cancelling headphones, digital spectrum analyzers, missile guidance systems, radar systems, and telematics systems. In such products, DSP may be responsible for noise reduction, speech recognition or synthesis, encoding or decoding digital media, wirelessly transmitting or receiving data, triangulating positions using GPS, and other kinds of image processing, video processing, audio processing, and speech processing.[77]

Instrumentation[edit]

Main article: Instrumentation engineering


Flight instruments provide pilots with the tools to control aircraft analytically.

Instrumentation engineering deals with the design of devices to measure physical quantities such as pressure, flow, and temperature.[78] The design of such instruments requires a good understanding of physics that often extends beyond electromagnetic theory. For example, flight instruments measure variables such as wind speed and altitude to enable pilots to control aircraft analytically. Similarly, thermocouples use the Peltier-Seebeck effect to measure the temperature difference between two points.[79]

Often instrumentation is not used by itself, but instead as the sensors of larger electrical systems. For example, a thermocouple might be used to help ensure a furnace's temperature remains constant.[80] For this reason, instrumentation engineering is often viewed as the counterpart of control.
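As a rough illustration of the thermocouple principle mentioned above, the snippet below converts a measured thermocouple voltage into a temperature difference using a linearised Seebeck coefficient. Real thermocouple reference tables are nonlinear; the 41 µV/°C figure is the approximate sensitivity of a type K junction, used here only as an assumption.

    def thermocouple_delta_t(voltage_v, seebeck_v_per_c=41e-6):
        """Approximate temperature difference across a thermocouple junction pair.

        Uses the linear Seebeck relation V = S * dT, which is only a first-order
        approximation to the published thermocouple reference tables.
        """
        return voltage_v / seebeck_v_per_c

    # Example: 2.05 mV measured across a type-K style junction -> about 50 C difference
    print(f"{thermocouple_delta_t(2.05e-3):.1f} C")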

Computers[edit]

Main article: Computer engineering

Supercomputers are used in fields as diverse as computational biology and geographic information systems.

Computer engineering deals with the design of computers and computer systems. This may involve the design of new hardware. Computer engineers may also work on a system's software. However, the design of complex software systems is often the domain of software engineering, which is usually considered a separate discipline.[81] Desktop computers represent a tiny fraction of the devices a computer engineer might work on, as computer-like architectures are now found in a range of embedded devices including video game consoles and DVD players. Computer engineers are involved in many hardware and software aspects of computing.[82] Robots are one of the applications of computer engineering.

Photonics and optics[edit]


Main articles: Photonics and Optics

Photonics and optics deals with the generation, transmission, amplification, modulation,
detection, and analysis of electromagnetic radiation. The application of optics deals with
design of optical instruments such as lenses, microscopes, telescopes, and other
equipment that uses the properties of electromagnetic radiation. Other prominent
applications of optics include electro-optical sensors and measurement systems, lasers,
fiber-optic communication systems, and optical disc systems (e.g. CD and DVD).
Photonics builds heavily on optical technology, supplemented with modern
developments such as optoelectronics (mostly involving semiconductors), laser
systems, optical amplifiers and novel materials (e.g. metamaterials).

Electrical engineering

Not to be confused with Electronic engineering.

A long row of disconnectors


Occupation

Names: Electrical engineer

Activity sectors: Electronics, electrical circuits, electromagnetics, power engineering, electrical machines, telecommunication, control systems, signal processing, optics, photonics, and electrical substations

Description

Competencies: Technical knowledge, management skills, advanced mathematics, systems design, physics, abstract thinking, analytical thinking, philosophy of logic (see also Glossary of electrical and electronics engineering)

Fields of employment: Technology, science, exploration, military, industry and society

Electrical engineering is an engineering discipline concerned with the study, design, and application of equipment, devices, and systems which use electricity, electronics,
and electromagnetism. It emerged as an identifiable occupation in the latter half of the
19th century after the commercialization of the electric telegraph, the telephone, and
electrical power generation, distribution, and use.

Electrical engineering is divided into a wide range of different fields, including computer
engineering, systems engineering, power engineering, telecommunications, radio-
frequency engineering, signal processing, instrumentation, photovoltaic cells,
electronics, and optics and photonics. Many of these disciplines overlap with other
engineering branches, spanning a huge number of specializations including hardware
engineering, power electronics, electromagnetics and waves, microwave engineering,
nanotechnology, electrochemistry, renewable energies, mechatronics/control, and electrical materials science.[a]

Electrical engineers typically hold a degree in electrical engineering, electronic engineering, or electrical and electronic engineering. Practicing engineers may have professional
certification and be members of a professional body or an international standards
organization. These include the International Electrotechnical Commission (IEC), the
Institute of Electrical and Electronics Engineers (IEEE) and the Institution of Engineering
and Technology (IET, formerly the IEE).

Electrical engineers work in a very wide range of industries and the skills required are
likewise variable. These range from circuit theory to the management skills of a project
manager. The tools and equipment that an individual engineer may need are similarly
variable, ranging from a simple voltmeter to sophisticated design and manufacturing
software.

History[edit]
Main article: History of electrical engineering

Electricity has been a subject of scientific interest since at least the early 17th century.
William Gilbert was a prominent early electrical scientist, and was the first to draw a
clear distinction between magnetism and static electricity. He is credited with establishing the term "electricity".[1] He also designed the versorium: a device that detects the presence of statically charged objects. In 1762 Swedish professor Johan Wilcke invented a device later named electrophorus that produced a static electric charge. By 1800 Alessandro Volta had developed the voltaic pile, a forerunner of the electric battery.[2]

19th century[edit]
The discoveries of Michael Faraday formed the foundation of electric motor technology.

In the 19th century, research into the subject started to intensify. Notable developments
in this century include the work of Hans Christian Ørsted, who discovered in 1820 that
an electric current produces a magnetic field that will deflect a compass needle; of
William Sturgeon, who in 1825 invented the electromagnet; of Joseph Henry and
Edward Davy, who invented the electrical relay in 1835; of Georg Ohm, who in 1827
quantified the relationship between the electric current and potential difference in a
conductor; of Michael Faraday, the discoverer of electromagnetic induction in 1831; and
of James Clerk Maxwell, who in 1873 published a unified theory of electricity and magnetism in his treatise Electricity and Magnetism.[3]

In 1782, Georges-Louis Le Sage developed and presented in Berlin probably the world's first form of electric telegraphy, using 24 different wires, one for each letter of the
alphabet. This telegraph connected two rooms. It was an electrostatic telegraph that
moved gold leaf through electrical conduction.

In 1795, Francisco Salva Campillo proposed an electrostatic telegraph system. Between 1803 and 1804, he worked on electrical telegraphy, and in 1804, he presented his report at the Royal Academy of Natural Sciences and Arts of Barcelona. Salva's electrolyte telegraph system was very innovative though it was greatly influenced by and based upon two discoveries made in Europe in 1800—Alessandro Volta's electric battery for generating an electric current and William Nicholson and Anthony Carlisle's electrolysis of water.[4] Electrical telegraphy may be considered the first example of electrical engineering.[5] Electrical engineering became a profession in the later 19th century. Practitioners had created a global electric telegraph network, and the first professional electrical engineering institutions were founded in the UK and the US to support the new discipline. Francis Ronalds created an electric telegraph system in 1816 and documented his vision of how the world could be transformed by electricity.[6][7] Over 50 years later, he joined the new Society of Telegraph Engineers (soon to be renamed the Institution of Electrical Engineers) where he was regarded by other members as the first of their cohort.[8] By the end of the 19th century, the world had been forever changed by the rapid communication made possible by the engineering development of land-lines, submarine cables, and, from about 1890, wireless telegraphy.

Practical applications and advances in such fields created an increasing need for
standardized units of measure. They led to the international standardization of the units
volt, ampere, coulomb, ohm, farad, and henry. This was achieved at an international conference in Chicago in 1893.[9] The publication of these standards formed the basis of future advances in standardization in various industries, and in many countries, the definitions were immediately recognized in relevant legislation.[10]

During these years, the study of electricity was largely considered to be a subfield of
physics since early electrical technology was considered electromechanical in nature.
The Technische Universität Darmstadt founded the world's first department of electrical
engineering in 1882 and introduced the first-degree course in electrical engineering in
1883.[11] The first electrical engineering degree program in the United States was started at Massachusetts Institute of Technology (MIT) in the physics department under Professor Charles Cross,[12] though it was Cornell University that produced the world's first electrical engineering graduates in 1885.[13] The first course in electrical engineering was taught in 1883 in Cornell's Sibley College of Mechanical Engineering and Mechanic Arts.[14]

In about 1885, Cornell President Andrew Dickson White established the first Department of Electrical Engineering in the United States.[15] In the same year, University College London founded the first chair of electrical engineering in Great Britain.[16] Professor Mendell P. Weinbach at University of Missouri established the electrical engineering department in 1886.[17] Afterwards, universities and institutes of
technology gradually started to offer electrical engineering programs to their students all
over the world.

During these decades the use of electrical engineering increased dramatically. In 1882,
Thomas Edison switched on the world's first large-scale electric power network that
provided 110 volts—direct current (DC)—to 59 customers on Manhattan Island in New
York City. In 1884, Sir Charles Parsons invented the steam turbine allowing for more
efficient electric power generation. Alternating current, with its ability to transmit power
more efficiently over long distances via the use of transformers, developed rapidly in the
1880s and 1890s with transformer designs by Károly Zipernowsky, Ottó Bláthy and
Miksa Déri (later called ZBD transformers), Lucien Gaulard, John Dixon Gibbs and
William Stanley, Jr. Practical AC motor designs including induction motors were
independently invented by Galileo Ferraris and Nikola Tesla and further developed into
a practical three-phase form by Mikhail Dolivo-Dobrovolsky and Charles Eugene
Lancelot Brown.[18] Charles Steinmetz and Oliver Heaviside contributed to the theoretical basis of alternating current engineering.[19][20] The spread in the use of AC set off in the United States what has been called the war of the currents between a George Westinghouse backed AC system and a Thomas Edison backed DC power system, with AC being adopted as the overall standard.[21]

Early 20th century[edit]

Guglielmo Marconi, known for his pioneering work on long-distance radio transmission

During the development of radio, many scientists and inventors contributed to radio
technology and electronics. The mathematical work of James Clerk Maxwell during the
1850s had shown the relationship of different forms of electromagnetic radiation
including the possibility of invisible airborne waves (later called "radio waves"). In his
classic physics experiments of 1888, Heinrich Hertz proved Maxwell's theory by
transmitting radio waves with a spark-gap transmitter, and detected them by using
simple electrical devices. Other physicists experimented with these new waves and in
the process developed devices for transmitting and detecting them. In 1895, Guglielmo
Marconi began work on a way to adapt the known methods of transmitting and detecting
these "Hertzian waves" into a purpose built commercial wireless telegraphic system.
Early on, he sent wireless signals over a distance of one and a half miles. In December
1901, he sent wireless waves that were not affected by the curvature of the Earth.
Marconi later transmitted the wireless signals across the Atlantic between Poldhu, Cornwall, and St. John's, Newfoundland, a distance of 2,100 miles (3,400 km).[22]

Millimetre wave communication was first investigated by Jagadish Chandra Bose during 1894–1896, when he reached an extremely high frequency of up to 60 GHz in his experiments.[23] He also introduced the use of semiconductor junctions to detect radio waves,[24] when he patented the radio crystal detector in 1901.[25][26]
In 1897, Karl Ferdinand Braun introduced the cathode-ray tube as part of an oscilloscope, a crucial enabling technology for electronic television.[27] John Fleming invented the first radio tube, the diode, in 1904. Two years later, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode.[28]

In 1920, Albert Hull developed the magnetron which would eventually lead to the
development of the microwave oven in 1946 by Percy Spencer.[29][30] In 1934, the
British military began to make strides toward radar (which also uses the magnetron)
under the direction of Dr Wimperis, culminating in the operation of the first radar station
at Bawdsey in August 1936.[31]

In 1941, Konrad Zuse presented the Z3, the world's first fully functional and
programmable computer using electromechanical parts. In 1943, Tommy Flowers
designed and built the Colossus, the world's first fully functional, electronic, digital and
programmable computer.[32][33] In 1946, the ENIAC (Electronic Numerical Integrator and
Computer) of John Presper Eckert and John Mauchly followed, beginning the computing
era. The arithmetic performance of these machines allowed engineers to develop
completely new technologies and achieve new objectives.[34]

In 1948, Claude Shannon published "A Mathematical Theory of Communication", which
mathematically describes the passage of information with uncertainty (electrical noise).
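
One well-known result that grew out of this framework, the Shannon–Hartley theorem, gives
the maximum rate C (in bits per second) at which information can pass through a band-limited
channel of bandwidth B in the presence of additive white Gaussian noise with signal-to-noise
power ratio S/N:

    C = B \log_2\left(1 + \frac{S}{N}\right)

As a worked illustration with assumed round numbers (not values from the text), a 3 kHz
telephone channel with a 30 dB signal-to-noise ratio (S/N = 1000) has a capacity of roughly
3000 × log2(1001) ≈ 30 kbit/s.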

Solid-state electronics

See also: History of electronic engineering, History of the transistor, Invention of the
integrated circuit, MOSFET, and Solid-state electronics

A replica of the first working transistor, a point-contact transistor


Metal–oxide–semiconductor field-effect transistor (MOSFET), the basic building block of modern
electronics

The first working transistor was a point-contact transistor invented by John Bardeen and
Walter Houser Brattain while working under William Shockley at the Bell Telephone
Laboratories (BTL) in 1947.[35] They then invented the bipolar junction transistor in
1948.[36] While early junction transistors were relatively bulky devices that were difficult
to manufacture on a mass-production basis,[37] they opened the door for more compact
devices.[38]

The first integrated circuits were the hybrid integrated circuit invented by Jack Kilby at
Texas Instruments in 1958 and the monolithic integrated circuit chip invented by Robert
Noyce at Fairchild Semiconductor in 1959.[39]

The MOSFET (metal–oxide–semiconductor field-effect transistor, or MOS transistor)
was invented by Mohamed Atalla and Dawon Kahng at BTL in 1959.[40][41][42] It was the
first truly compact transistor that could be miniaturised and mass-produced for a wide
range of uses.[37] It revolutionized the electronics industry,[43][44] becoming the most
widely used electronic device in the world.[41][45][46]

The MOSFET made it possible to build high-density integrated circuit chips.[41] The
earliest experimental MOS IC chip to be fabricated was built by Fred Heiman and
Steven Hofstein at RCA Laboratories in 1962.[47] MOS technology enabled Moore's law,
the doubling of transistors on an IC chip every two years, predicted by Gordon Moore in
1965.[48] Silicon-gate MOS technology was developed by Federico Faggin at Fairchild in
1968.[49] Since then, the MOSFET has been the basic building block of modern
electronics.[42][50][51] The mass-production of silicon MOSFETs and MOS integrated
circuit chips, along with continuous MOSFET scaling miniaturization at an exponential
pace (as predicted by Moore's law), has since led to revolutionary changes in
technology, economy, culture and thinking.[52]
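
As a back-of-the-envelope sketch of that doubling rate (illustrative only: the 2,300-transistor
starting point is the approximate count of the 1971 Intel 4004, and the strict two-year period
is the idealized Moore's-law assumption):

    # Idealized Moore's-law projection: transistor count doubles every two years.
    def projected_transistors(start_count, start_year, target_year, doubling_years=2.0):
        return start_count * 2 ** ((target_year - start_year) / doubling_years)

    # Starting from the ~2,300 transistors of the 1971 Intel 4004, the idealized
    # rate reaches the order of a billion transistors by the late 2000s.
    print(f"{projected_transistors(2300, 1971, 2008):.2e}")   # ~8.5e+08
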
The Apollo program, which culminated in landing astronauts on the Moon with Apollo 11
in 1969, was enabled by NASA's adoption of advances in semiconductor electronic
technology, including MOSFETs in the Interplanetary Monitoring Platform (IMP)[53][54]
and silicon integrated circuit chips in the Apollo Guidance Computer (AGC).[55]

The development of MOS integrated circuit technology in the 1960s led to the invention
of the microprocessor in the early 1970s.[56][57] The first single-chip microprocessor was
the Intel 4004, released in 1971.[56] The Intel 4004 was designed and realized by
Federico Faggin at Intel with his silicon-gate MOS technology,[56] along with Intel's
Marcian Hoff and Stanley Mazor and Busicom's Masatoshi Shima.[58] The
microprocessor led to the development of microcomputers and personal computers, and
the microcomputer revolution.

Subfields
One of the properties of electricity is that it is very useful for energy transmission as well
as for information transmission. These were also the first areas in which electrical
engineering was developed. Today, electrical engineering has many subdisciplines, the
most common of which are listed below. Although there are electrical engineers who
focus exclusively on one of these subdisciplines, many deal with a combination of them.
Sometimes, certain fields, such as electronic engineering and computer engineering,
are considered disciplines in their own right.

Power and energy

Main articles: Power engineering and Energy engineering

The top of a power pole

Power and energy engineering deals with the generation, transmission, and distribution
of electricity as well as the design of a range of related devices.[59] These include
transformers, electric generators, electric motors, high voltage engineering, and power
electronics. In many regions of the world, governments maintain an electrical network
called a power grid that connects a variety of generators together with users of their
energy. Users purchase electrical energy from the grid, avoiding the costly exercise of
having to generate their own. Power engineers may work on the design and
maintenance of the power grid as well as the power systems that connect to it.[60] Such
systems are called on-grid power systems and may supply the grid with additional
power, draw power from the grid, or do both. Power engineers may also work on
systems that do not connect to the grid, called off-grid power systems, which in some
cases are preferable to on-grid systems.

Telecommunications

Main article: Telecommunications engineering

Satellite dishes are a crucial component in the analysis of satellite information.

Telecommunications engineering focuses on the transmission of information across a
communication channel such as a coax cable, optical fiber or free space.[61]
Transmissions across free space require information to be encoded in a carrier signal to
shift the information to a carrier frequency suitable for transmission; this is known as
modulation. Popular analog modulation techniques include amplitude modulation and
frequency modulation.[62] The choice of modulation affects the cost and performance of
a system and these two factors must be balanced carefully by the engineer.
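
For instance, here is a minimal sketch of amplitude modulation in Python; the sample rate,
carrier frequency, message tone, and modulation index are illustrative assumptions rather
than values taken from the text:

    import numpy as np

    fs = 1_000_000                           # sample rate, Hz
    t = np.arange(0, 0.005, 1 / fs)          # 5 ms of samples
    fc, fm, m = 100_000, 1_000, 0.5          # carrier freq, message freq, modulation index

    message = np.cos(2 * np.pi * fm * t)     # baseband information signal
    carrier = np.cos(2 * np.pi * fc * t)     # higher-frequency carrier
    am_signal = (1 + m * message) * carrier  # information shifted to the carrier frequency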

Once the transmission characteristics of a system are determined, telecommunication
engineers design the transmitters and receivers needed for such systems. These two
are sometimes combined to form a two-way communication device known as a
transceiver. A key consideration in the design of transmitters is their power consumption
as this is closely related to their signal strength.[63][64] Typically, if the power of the
transmitted signal is insufficient once the signal arrives at the receiver's antenna(s), the
information contained in the signal will be corrupted by noise, specifically static.

Control engineering

Main articles: Control engineering and Control theory

Control systems play a critical role in spaceflight.

Control engineering focuses on the modeling of a diverse range of dynamic systems
and the design of controllers that will cause these systems to behave in the desired
manner.[65] To implement such controllers, electronics control engineers may use
electronic circuits, digital signal processors, microcontrollers, and programmable logic
controllers (PLCs). Control engineering has a wide range of applications from the flight
and propulsion systems of commercial airliners to the cruise control present in many
modern automobiles.[66] It also plays an important role in industrial automation.

Control engineers often use feedback when designing control systems. For example, in
an automobile with cruise control the vehicle's speed is continuously monitored and fed
back to the system, which adjusts the motor's power output accordingly.[67] Where there
is regular feedback, control theory can be used to determine how the system responds
to such feedback.
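
A minimal sketch of such a feedback loop, assuming a crude linear vehicle model and a
purely proportional controller (the gain, drag coefficient, and time step are illustrative, not
taken from the text):

    # Proportional feedback: the measured speed is compared with the desired speed
    # and the difference (error) drives the throttle command.
    def simulate_cruise_control(setpoint_mps=27.0, kp=0.8, steps=200, dt=0.1):
        speed = 20.0                      # initial vehicle speed, m/s
        drag = 0.05                       # crude linear drag coefficient
        for _ in range(steps):
            error = setpoint_mps - speed  # feedback: measured speed vs. setpoint
            throttle = kp * error         # controller adjusts the motor's power output
            accel = throttle - drag * speed
            speed += accel * dt
        return speed

    print(round(simulate_cruise_control(), 2))  # settles near (not exactly at) the setpoint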

Control engineers also work in robotics to design autonomous systems using control
algorithms which interpret sensory feedback to control actuators that move robots such
as autonomous vehicles, autonomous drones and others used in a variety of industries.[68]

Electronics
Main article: Electronic engineering

Electronic components

Electronic engineering involves the design and testing of electronic circuits that use the
properties of components such as resistors, capacitors, inductors, diodes, and
transistors to achieve a particular functionality.[60] The tuned circuit, which allows the
user of a radio to filter out all but a single station, is just one example of such a circuit.
Signal conditioners, which adapt a transducer's output for later processing stages, are
another example.
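
Returning to the tuned circuit, here is a minimal sketch of the resonance that makes station
selection possible; the inductance and capacitance values below are illustrative assumptions:

    import math

    # Resonant frequency of an ideal LC tuned circuit: f0 = 1 / (2 * pi * sqrt(L * C)).
    def resonant_frequency_hz(inductance_h, capacitance_f):
        return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

    # A 200 µH coil with a ~125 pF tuning capacitor resonates near 1 MHz,
    # inside the AM broadcast band a radio tuner selects from.
    print(f"{resonant_frequency_hz(200e-6, 125e-12):.3e} Hz")   # ~1.0e+06 Hz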

Prior to the Second World War, the subject was commonly known as radio engineering
and was largely restricted to aspects of communications and radar, commercial radio,
and early television.[60] Later, in the post-war years, as consumer devices began to be
developed, the field grew to include modern television, audio systems, computers, and
microprocessors. In the mid-to-late 1950s, the term radio engineering gradually gave
way to the name electronic engineering.

Before the invention of the integrated circuit in 1959,[69] electronic circuits were
constructed from discrete components that could be manipulated by humans. These
discrete circuits consumed much space and power and were limited in speed, although
they are still common in some applications. By contrast, integrated circuits packed a
large number (often millions) of tiny electrical components, mainly transistors,[70] into a
small chip around the size of a coin. This allowed for the powerful computers and other
electronic devices we see today.

Microelectronics and nanoelectronics

Main articles: Microelectronics, Nanoelectronics, and Chip design


Microprocessor

Microelectronics engineering deals with the design and microfabrication of very small
electronic circuit components for use in an integrated circuit or sometimes for use on
their own as a general electronic component.[71] The most common microelectronic
components are semiconductor transistors, although all main electronic components
(resistors, capacitors etc.) can be created at a microscopic level.

Nanoelectronics is the further scaling of devices down to nanometer levels. Modern
devices are already in the nanometer regime, with below 100 nm processing having
been standard since around 2002.[72]

Microelectronic components are created by chemically fabricating wafers of
semiconductors such as silicon (at higher frequencies, compound semiconductors like
gallium arsenide and indium phosphide) to obtain the desired transport of electronic
charge and control of current. The field of microelectronics involves a significant amount
of chemistry and material science and requires the electronic engineer working in the
field to have a very good working knowledge of the effects of quantum mechanics.[73]

Signal processing

Main article: Signal processing


A Bayer filter on a CCD requires signal processing to get a red, green, and blue value at each
pixel.

Signal processing deals with the analysis and manipulation of signals.[74] Signals can be
either analog, in which case the signal varies continuously according to the information,
or digital, in which case the signal varies according to a series of discrete values
representing the information. For analog signals, signal processing may involve the
amplification and filtering of audio signals for audio equipment or the modulation and
demodulation of signals for telecommunications. For digital signals, signal processing
may involve the compression, error detection and error correction of digitally sampled
signals.[75]
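
As a small illustration of digital signal processing, the sketch below smooths a noisy sampled
tone with a five-tap moving-average FIR filter; the sample rate, tone frequency, and noise
level are assumed for the example:

    import numpy as np

    fs = 8_000                                   # samples per second
    t = np.arange(0, 0.01, 1 / fs)               # 10 ms of samples
    noisy = np.sin(2 * np.pi * 440 * t) + 0.2 * np.random.randn(t.size)

    taps = np.ones(5) / 5                        # equal-weight FIR coefficients
    smoothed = np.convolve(noisy, taps, mode="same")  # FIR filtering = discrete convolution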

Signal processing is a mathematically intensive area that forms the core of digital signal
processing. It is rapidly expanding, with new applications in every field of electrical
engineering, including communications, control, radar, audio engineering, broadcast
engineering, power electronics, and biomedical engineering, as many existing analog
systems are replaced with digital counterparts. Analog signal processing is still important
in the design of many control systems.

DSP processor ICs are found in many types of modern electronic devices, such as
digital television sets,[76] radios, hi-fi audio equipment, mobile phones, multimedia
players, camcorders and digital cameras, automobile control systems, noise cancelling
headphones, digital spectrum analyzers, missile guidance systems, radar systems, and
telematics systems. In such products, DSP may be responsible for noise reduction,
speech recognition or synthesis, encoding or decoding digital media, wirelessly
transmitting or receiving data, triangulating positions using GPS, and other kinds of
image processing, video processing, audio processing, and speech processing.[77]

Instrumentation

Main article: Instrumentation engineering


Flight instruments provide pilots with the tools to control aircraft analytically.

Instrumentation engineering deals with the design of devices to measure physical
quantities such as pressure, flow, and temperature.[78] The design of such instruments
requires a good understanding of physics that often extends beyond electromagnetic
theory. For example, flight instruments measure variables such as wind speed and
altitude to enable pilots to control aircraft analytically. Similarly, thermocouples use
the Peltier-Seebeck effect to measure the temperature difference between two points.[79]
Often instrumentation is not used by itself, but instead as the sensors of larger electrical
systems. For example, a thermocouple might be used to help ensure a furnace's
temperature remains constant.[80] For this reason, instrumentation engineering is often
viewed as the counterpart of control.
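
A minimal sketch of turning a thermocouple reading into a temperature difference, assuming
a simple linear Seebeck sensitivity of about 41 µV/°C (roughly a type-K device); both the
sensitivity and the example voltage are illustrative assumptions:

    SEEBECK_UV_PER_C = 41.0   # assumed linear sensitivity, µV per °C (approx. type K)

    def thermocouple_delta_t(measured_microvolts):
        # Convert a thermocouple EMF into the temperature difference between
        # the measuring junction and the reference junction.
        return measured_microvolts / SEEBECK_UV_PER_C

    print(round(thermocouple_delta_t(8200.0), 1))   # 8,200 µV -> about 200.0 °C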

Computers

Main article: Computer engineering

Supercomputers are used in fields as diverse as computational biology and geographic
information systems.

Computer engineering deals with the design of computers and computer systems. This
may involve the design of new hardware. Computer engineers may also work on a
system's software. However, the design of complex software systems is often the
domain of software engineering, which is usually considered a separate discipline.[81]
Desktop computers represent a tiny fraction of the devices a computer engineer might
work on, as computer-like architectures are now found in a range of embedded devices
including video game consoles and DVD players. Computer engineers are involved in
many hardware and software aspects of computing.[82] Robots are one of the
applications of computer engineering.

Photonics and optics


Main articles: Photonics and Optics

Photonics and optics deals with the generation, transmission, amplification, modulation,
detection, and analysis of electromagnetic radiation. The application of optics deals with
the design of optical instruments such as lenses, microscopes, telescopes, and other
equipment that uses the properties of electromagnetic radiation. Other prominent
applications of optics include electro-optical sensors and measurement systems, lasers,
fiber-optic communication systems, and optical disc systems (e.g. CD and DVD).
Photonics builds heavily on optical technology, supplemented with modern
developments such as optoelectronics (mostly involving semiconductors), laser
systems, optical amplifiers and novel materials (e.g. metamaterials).
