SCADA/EMS History







Table of Contents

Supervisory Control Systems: Pilot Wire; Stepping Switches; Relay/Tones; Solid State; Vendors
Telemetry: Current Balance/Transducers; Pulse Rate/Variable Frequency; Selective; Continuous Scanning; Automatic Data Logger; Vendors
SCADA: Computer Master; Remote Terminal Units; User Interface; Communication Channels; Vendors
Automatic Generation Control (AGC): Recorders; Servos; Mag Amps/Operational Amplifiers; Computer Controlled Analog; Vendors
System Operation Computers: SCADA; Forecast and Scheduling Functions; Configurations; Vendors
Energy Management Systems: Network Analysis Functions; Configurations; Consultants; Vendors; Projects

Supervisory Control Systems

Pilot Wire Systems (1940's and earlier)

Supervisory control in electric utility systems evolved from the need to operate equipment located in remote substations. In the past it was necessary to have personnel stationed at the remote site, or to send a line crew out to operate equipment. The first approach was to use a pair of wires or a multi-pair cable between the sites. Each pair of wires operated a unique piece of equipment. This was expensive, but justified if the equipment needed to be operated often or in order to restore service rapidly.

Stepping Switch Systems

During the 30's the telephone company developed magnetic stepping switches for switching telephone circuits. The electric utilities used this technology as an early form of supervisory control. It soon became obvious that if they could multiplex one pair of wires and control several devices, the process would become much more efficient. However, in doing this it would be necessary to have very good security, because the effect of operating the wrong piece of equipment could be severe. To assure security they developed a select/check/operate scheme. This scheme had the master station send out a selection message, which when received by the remote would cause it to send out a corresponding check message for the selected device. If this was received by the master station, it would indicate this to the dispatcher, who would then initiate an operate message which, when received by the remote, would cause the device to operate. The remote would then send a confirmation message to the master station to complete the action. This select-before-operate scheme has been used in supervisory control systems for many years, and forms of it are still in use today.
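To make the select/check/operate sequence concrete, here is a minimal sketch of the exchange in Python. The message names, point numbers, and device labels are hypothetical illustrations; the systems of this era implemented the same logic in relays and stepping switches, not software.

```python
# Minimal sketch of the select/check/operate exchange described above.
# Message formats and point numbering are invented, not any vendor's format.

class Remote:
    """Remote station: answers SELECT with CHECK, OPERATE with CONFIRM."""
    def __init__(self, points):
        self.points = points          # point id -> device name
        self.selected = None          # point armed by a SELECT message

    def receive(self, msg_type, point):
        if msg_type == "SELECT" and point in self.points:
            self.selected = point     # arm the point and echo it back
            return ("CHECK", point)
        if msg_type == "OPERATE" and point == self.selected:
            self.selected = None      # operate only the armed point
            print(f"remote: operating {self.points[point]}")
            return ("CONFIRM", point)
        return ("REJECT", point)      # anything else is refused

def master_control(remote, point):
    """Master station side: select, verify the check-back, then operate."""
    reply, echoed = remote.receive("SELECT", point)
    if reply != "CHECK" or echoed != point:
        print("master: check-back mismatch, aborting")
        return False
    # The dispatcher sees the check-back and initiates the operate step.
    reply, echoed = remote.receive("OPERATE", point)
    return reply == "CONFIRM" and echoed == point

remote = Remote({7: "breaker 52-1", 9: "breaker 52-2"})
print(master_control(remote, 7))      # True: full select/check/operate cycle
```

The security comes from the echo: the master will not send the operate message unless the check-back names exactly the point it selected.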
Relay/Tone Systems

Relay systems used telephone-type relays to create pulses that were sent over a communication channel to the remote, and a timing chain of relays to count them. They typically used a coding scheme based either on the number of pulses in a message, or on a fixed message size with combinations of short or long pulses.

Westinghouse, in conjunction with North Electric Company, developed Visicode supervisory control based on the pulse count approach. Visicode used two time delay relays to create pulses. It used the select/check/operate scheme for security. There were two sizes: the "junior" used one select/check sequence, but the larger used a select/check sequence for a group and a second for the point. I am most familiar with this equipment, since I spent over five years doing installation and troubleshooting on it. I worked on it so often that I could often troubleshoot it just by listening to the relay sequence; in fact, more than once I was able to solve a problem over the telephone. It was very reliable and had very few false operations, although it could happen. One I investigated was caused because the equipment was installed in a diesel generating station with the master station cabinet beside the generator. There was a lot of low frequency vibration, and the battery connection was loose. As the battery connection would make and break with the vibration, the master was sending out a constant stream of pulses 24 hours a day. The remote was going crazy, and it finally hit the right combination for a false operation.

Thousands of these units went into service during the years from 1950 until about 1965. Most of the installations were with electric utilities; some were pipelines and gas companies, and we even did an installation in the control tower of O'Hare airport in Chicago to control the landing lights.

General Electric also had a supervisory control system that was a competitor of Visicode. It operated similarly, but used a fixed length message type and relied on two different pulse lengths for the coding, generated by a mechanical variable length pulse from a rotating serrated disk. It still used the select/check/operate security technique. Control Corporation supplied a supervisory system that used combinations of tones for the coding. The tones were created with a "Vibrasender" and received with a "Vibrasponder". It used a fixed word length format with a checksum character with each message.

In 1957 I was installing a Visicode system in Watertown, South Dakota with the Bureau of Reclamation, and a Control Corp field engineer was there installing one of their systems also. We spent a lot of time together at lunch and in the evenings. It was fun listening to him troubleshoot, because he put a speaker on the line and listened to the tones. He turned my name over to their personnel department, who contacted me, and I went to Minneapolis to interview, but since I had only been at Westinghouse for two years I decided to stay. Ironically, 13 years later I went with CDC, who had bought Control Corp.

Solid State (60's and 70's)

Westinghouse developed their solid state supervisory system around 1960. It was called REDAC. GE came out with a similar system about that time that I believe was called GETAC. Both used the select-before-operate scheme. Control Corporation's solid state system was called Supertrol. These systems were basically solid state versions of their previous systems.

Vendors

The predominant vendors during this evolution of supervisory control were Westinghouse, General Electric, and Control Corporation. There were European vendors such as Siemens, but they were not a significant factor in the US market.

Telemetry

Current Balance/Transducers

Telemetry in electric utilities began like supervisory control, because it became necessary to obtain readings of power values like volts, amps, watts, etc. from remote locations. As power systems became larger and more complex and the need for central operation developed, telemetry was required. This first required a transducer to convert the power system value to a DC voltage or current proportional to the power system quantity. The first transducers for voltage or current were relatively simple, since the AC value from metering transformers could be rectified to create a DC value. Watt and Var readings were more complex however, and the first successful transducers commonly used for these were Thermal Converters.
These units used instrument transformer inputs from voltage and current readings to heat a small thermal element, which was then detected by thermocouples to create a proportional voltage. Later on, a different type of transducer was invented which used the Hall Effect to create the proportional voltage.

Unfortunately, these transducers could not be applied directly to a communication channel consisting of a pair of wires. The first approach was to utilize a pair of wires and pass a current proportional to the reading. The variations in resistance of the long communication channel with temperature prevented a consistent, accurate reading. Therefore a telemetry transmitter was required that would convert the transducer voltage to a constant current source that would be nearly independent of the variations in resistance. This was done with a current balance type of telemetry transmitter.

Pulse Rate/Variable Frequency Telemetry Systems

In order to transmit telemetry across other types of communication channels, such as power line carrier, telemetry schemes were devised using pulse rate and variable frequency methods. The pulse rate equipment generated a pulse rate that, between the minimum rate and the maximum rate, was proportional to the telemetered value. The variable frequency systems worked similarly, as the telemetered value was proportional to the position within the range between the minimum and maximum frequency. While at Westinghouse in the late 50's I worked on the development of their system, which was called Teledac. The solid state system generated a signal with a frequency that varied from 15 to 35 cycles. It was first implemented using vacuum tubes and later on transistors. With no mechanical parts involved, the gear would not wear out. All of these systems had the advantage of being able to send the signal over a variety of communication media. They were typically used over pilot wire, power line carrier, and microwave.

There was another pulse based system used during these years that was supplied by the Bristol Company. The value in this scheme was proportional to the pulse width. This equipment was used more in gas and pipeline systems because of its slow time response.
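The proportionality in these schemes is simple to state. Below is a small sketch of the mapping for a variable frequency channel, using the 15 to 35 cycle span mentioned above; the 0-100 MW engineering range is an assumed example.

```python
# Sketch of the proportional scaling used by pulse rate / variable
# frequency telemetry. The 15-35 cycle span matches the solid state
# system described above; the 0-100 MW range is an assumed example.

F_MIN, F_MAX = 15.0, 35.0        # carrier frequency span, cycles per second
V_MIN, V_MAX = 0.0, 100.0        # assumed engineering range, MW

def to_frequency(mw):
    """Transmitter side: map a reading onto the frequency span."""
    fraction = (mw - V_MIN) / (V_MAX - V_MIN)
    return F_MIN + fraction * (F_MAX - F_MIN)

def to_reading(freq):
    """Receiver side: recover the reading from the measured frequency."""
    fraction = (freq - F_MIN) / (F_MAX - F_MIN)
    return V_MIN + fraction * (V_MAX - V_MIN)

print(to_frequency(50.0))   # 25.0 cycles at mid-scale
print(to_reading(25.0))     # 50.0 MW back again
```

A pulse rate channel works the same way, with minimum and maximum pulse rates in place of the frequency limits.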
Selective Telemetry Systems

Since it is usually only necessary to read these values periodically, and to reduce the number of communication channels needed, it became necessary to multiplex several readings over a single communication channel. This need was consistent with the requirements of supervisory control, and the dispatcher could take his hourly readings by selecting the readings on a supervisory/telemetry system. So the usage of selective telemetry over supervisory control became common.

Continuous Scanning Systems

Since the equipment was by now implemented with solid state technology, it was possible to continuously scan the values and to improve accuracy. Various vendors developed continuous scanning systems with a solid state master system, in some cases on dedicated communication channels.

Automatic Data Loggers

Automatic data loggers relieved the dispatcher from the tedious task of recording all the readings each hour. Early versions came before solid state and continuous scan telemetry, although they were unusual until the solid state or computer master evolved. I installed an Automatic Data Logger in 1958 at the dispatch office of Omaha Public Power District that was a good example of these early attempts. This system used stepping switches at each end to select the points. Once a point was selected, it would take at least three pulses before the telemetry would settle down. The telemetry scheme used was Bristol pulse duration. The telemetry receiver was a Bristol Metamaster, which was a big mechanical beast with a shaft position output, and a Gianini shaft position analog-to-digital converter was coupled to it. The digital output of the converter was typed out on an IBM long carriage typewriter. It was a mess to get working, but it did the job. However, because of all the complexity and electro-magnetic components, it was difficult to maintain and was replaced after a few years. Solid state equipment and computer based master stations replaced this type of equipment.

Vendors

The vendors of early telemetry systems included Westinghouse, General Electric, and Control Data, who were extending their supervisory control business. A few other vendors were telemetry suppliers like Bristol, and as the equipment turned to solid state continuous scanning, several new vendors such as Pacific Telephone and Moore Associates appeared.

SCADA Systems (1965 and later)

The term SCADA, for Supervisory Control and Data Acquisition Systems, came into use after the use of a computer based master station became common.

Computer Master

The complexity of the logic required to make a hardwired master that would provide all the necessary functions of a SCADA system was so great that the advantage of using a computer became apparent. By the middle 60's there were computers that were capable of real time functions. Westinghouse and GE both built processors that could be used by this time; the Westinghouse computers were called PRODAC and the GE processors GETAC. Other computer companies that came out with computers applicable to master station SCADA included Digital Equipment Company and Scientific Data Systems. The functions now included scanning data, displaying the data on digital displays and CRT displays, monitoring the data or status and alarming for changes, and periodic data logging.

Remote Terminal Units

The remote station of the SCADA systems took the nomenclature Remote Terminal Units (RTU's). Most SCADA systems worked on a continuous scan basis, with the master sending requests for data and the remotes only responding. A few systems, mostly in Europe, had the remote issue data continuously without a master station request; this was possible if the communication channel was full duplex (could send and receive simultaneously). Large SCADA systems would have several hundred RTU's.
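A computer based master of this era amounted to a scan loop: poll each RTU, compare against the last values, and alarm on changes. The sketch below is a schematic illustration only; the RTU addresses, point names, and the poll() stand-in are invented.

```python
# Schematic continuous-scan master: the master polls, the remotes only answer.
import time

def poll(rtu_address):
    """Stand-in for one request/response exchange with an RTU."""
    return {"volts": 115.2, "amps": 410.0, "breaker_52_1": "closed"}

previous = {}                            # last known values, for change detection

def scan_cycle(rtu_addresses):
    for addr in rtu_addresses:
        data = poll(addr)                # master requests; remote responds
        for point, value in data.items():
            old = previous.get((addr, point))
            if old is not None and old != value:
                print(f"ALARM: RTU {addr} {point}: {old} -> {value}")
            previous[(addr, point)] = value

for _ in range(3):                       # in practice this loop never ends
    scan_cycle([1, 2, 3])
    time.sleep(2)                        # update rates of a few seconds were typical
```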
The basic structure of an RTU consisted of the communication interface, central logic controller, and input/output system with analog inputs, digital inputs, control digital outputs, and sometimes analog control outputs. RTU's used solid state components mounted on printed circuit cards and typically housed in card racks installed in equipment cabinets. They were typically supplied in 90-inch high steel cabinets, with room for terminal blocks for field wiring to substation equipment, suitable for mounting in remote power substations. They needed to operate even if the power was out at the station, so they were connected to the substation battery, which was usually 129 volt DC. RTU's were supplied primarily by the SCADA vendor; however, later there developed a market for RTU's from vendors other than complete SCADA vendors.

Since most RTU's operated on a continuous scan basis, and since it is important to have fast response to control operations in event of a system disturbance, the communication protocol had to be both efficient and very secure. Security was a primary factor, so sophisticated checksum security characters were transmitted with each message, and the select-before-operate scheme was used on control operations. The most common security check code used was BCH, which was a communication check code developed in the 60's. Because of the need for both very high security and efficiency, common protocols used in other industries, such as ASCII, were not used. During the 60's and 70's most RTU communication protocols were unique to the RTU vendor, i.e. proprietary. In order to allow different types of RTU's on a SCADA system, there was an effort to standardize protocols undertaken by the IEEE. The development of the microprocessor based communication interface at the master station solved some of the compatibility problems, however, as it now became possible for the master station to communicate with RTU's of different protocols, since the interface could be programmed to handle each of them.
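The security characters worked by treating the message bits as a polynomial and appending the remainder of a division by a generator polynomial, which a shift register computes cheaply in hardware. The sketch below shows the mechanism with an arbitrary 5-bit generator; it illustrates how a cyclic check code of this family works, not the specific BCH polynomial any vendor used.

```python
# Illustrative cyclic block check of the kind appended to each message.
# Real SCADA protocols used BCH codes; the 5-bit generator polynomial
# here is an arbitrary example, not any specific vendor's code.

GENERATOR = 0b110101          # x^5 + x^4 + x^2 + 1, chosen for illustration
CHECK_BITS = 5

def block_check(data_bits):
    """Modulo-2 long division of the message by the generator, as an LFSR would do."""
    remainder = 0
    for bit in data_bits:
        remainder = (remainder << 1) | bit
        if remainder >> CHECK_BITS:            # high bit set: subtract generator
            remainder ^= GENERATOR
    for _ in range(CHECK_BITS):                # flush the check bits out
        remainder <<= 1
        if remainder >> CHECK_BITS:
            remainder ^= GENERATOR
    return remainder

message = [1, 0, 1, 1, 0, 0, 1, 0]             # example data bits
check = block_check(message)
print(f"transmit message + check {check:05b}")

# The receiver runs the same division over the message and check bits;
# a nonzero remainder marks the frame as corrupted and it is discarded.
```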
User Interface (Man/Machine Interface)

Early SCADA systems used a wired panel with pushbuttons for selecting points and doing control operations. Data was displayed using digital displays, and alarms were listed on printers or teletypewriters. A major evolution came during the late sixties when the Cathode Ray Tube (CRT) was applied to the man/machine interface (MMI). The original versions were black and white character mode CRT's. They offered the ability to display data and status in tabular format, and to list alarms as well as acknowledge and clear them. A keyboard was used instead of the pushbutton panel for interacting with the system. At first there was resistance from the dispatchers to the new methods; they felt they were too complicated and too "computer like", although it did not take them very long to appreciate the added flexibility and capability.

Then the Limited Graphic color CRT's became available in the early seventies. These CRT's had a graphic character set that would allow presenting one line diagrams of electric circuits with dynamic representation of circuit breakers, switches, etc. Power system values such as volts, amps, megawatts, etc. could be presented on the one line drawings and periodically updated. Update rates were typically in the area of five seconds. Control operations were accomplished by selecting the device on the diagram, confirming the selection visually, and initiating the control operation. The select-before-operate scheme still prevailed. Selection was done by moving the CRT cursor to the device on the diagram. In the 80's Full Graphic CRT's became available. Now one-line diagrams became much more sophisticated, as you could pan across a large diagram or zoom in to a substation to present more detailed data.

There were several devices used to move the cursor that evolved over the years. Early systems used a keypad with directional arrows; then a trackball was used, where the dispatcher rolled the ball to position the cursor. Another approach was a light pen that was used to touch the screen and bring the cursor to that position. Finally the computer "mouse" that is so common today evolved as the most useful, although lots of dispatchers resisted it for the same reasons they resisted CRT's.

With the alarm lists being presented on CRT's, alarm printing became just a record keeping function, and so the printers, usually line printers, were now in the computer room instead of the dispatch office.

Mapboards have been used in dispatch offices for many years. They are typically mounted on the wall in front of the dispatchers. They serve various purposes, but one of the most important is to keep the dispatchers constantly familiar with the power system. In the early days they were static boards showing lines, stations, and salient equipment. Sometimes they were manually updated with tags or flags marking equipment out of service or on maintenance. As computer based SCADA systems proliferated, it became possible to make these boards dynamic. They could show the status of devices such as circuit breakers, and indicate line flows and voltage. By the middle seventies these boards became very elaborate and expensive. However, the capability of the Full Graphic CRT's by the late seventies and early eighties reduced the need for an elaborate dynamic board; utilities instead went with less dynamics on the board, making more use of the CRT, and the elaborate boards became less common.

Communication Channels

In the early days the most common communication media was a pair of dedicated wires. These were sometimes laid by the utility, and sometimes provided by the telephone company. For a while in the late sixties power line carrier was used, although the limited number of frequencies limited the usage. In the later seventies and eighties the most common channel became utility owned microwave equipment. It was expensive, but had the advantage of providing a large number of channels. A modem was provided by the SCADA vendor that used audio range frequencies and was frequency modulated to connect to the channel.

Vendors

The vendors for SCADA systems evolved from the vendors of supervisory control. These included Westinghouse, General Electric, and Control Corporation. Early Westinghouse supervisory control was a combination of Westinghouse personnel and people from North Electric Co. In the sixties a few people from both operations split off and joined a company in Melbourne, Florida called Radiation to develop SCADA systems. This company evolved into Harris Systems. Moore Associates, which began making digital telemetry equipment, evolved into a SCADA vendor during the seventies. A company in San Jose, California later became Landis & Gyr. There were hundreds of systems shipped in the 60's using this technology.
Automatic Generation Control (AGC)

Recorders

AGC evolved from the use of telemetry and recorders to record and display data such as tie line flows and generation values. Dispatch offices in the fifties, and earlier, typically had recorders that showed all major generation MW values and tie line flows. They often used retransmitting slidewires on the recorders to add the tie line flows together and display Net Interchange on a separate recorder. Also, total generation was often summed up using a similar method from the various generation recorders. System frequency, measured by a frequency transducer, was also typically displayed on a recorder.

Servos

By using a slidewire on the Net Interchange recorder and comparing it to a Net Interchange setter consisting of a digital dial driven potentiometer, a value representing Net Interchange Error was developed and displayed. This error signal was then applied to a mechanical device that would generate pulses of either a raise or lower direction. These pulses were then transmitted to the various generating plants and applied to a governor motor control panel that would drive the governor motor and hence raise or lower generation. The control pulses would move the generating units to bring the Net Interchange Error to zero. This was the beginning of AGC.

Mag Amps/Operational Amplifiers

By the late fifties the technology used a combination of recorder retransmitting slidewires and Magnetic Amplifiers. It was at this time I moved out to Pittsburgh and joined the Westinghouse group that was developing analog AGC. These systems were usually similar in control configuration. The system's Area Control Error (ACE) value was developed by comparing net interchange to desired interchange and adding in a frequency bias term. Frequency error was developed from the frequency recorder and a desired frequency setter, and applied through a frequency bias setter calibrated in MW/0.1 cycle. Later a Time Error Bias was added in. This allowed the control system to control interchange and contribute to intra-system control of frequency. The ACE value was then filtered and processed, and added to total generation to develop a total requirement value that was allocated to the individual generating units by the Economic Dispatch circuitry. This developed a Unit Generation requirement that, when compared to actual generation, developed a unit error signal, which was usually used to create control pulses that were transmitted to the plant and applied to the governor motors to adjust generation in response to the error. They were a Type 1 control system.

During this period the concept of Incremental Cost and Economic Dispatch was refined. Economic Dispatch is based on the fact that a unit, once on the line, can be loaded with respect to the other units on line according to its incremental cost. (Its commitment to be put on line, however, is a function of production cost, start-up cost, and other factors.) The incremental cost curves were implemented with resistors and diodes, such that when an analog voltage was applied, the output was a nonlinear curve representing the incremental cost curve for the unit or plant. There is a large justification for the complexity and expense of using an automatic Economic Dispatch system with transmission loss compensation: if these systems can result in a savings of a few percent in a large utility's production cost, it would amount to several million dollars a year.

These systems were large, expensive, and complicated, using hundreds of Mag Amps, servo mechanisms, recorders, etc. There were several large AGC/EDC systems built by the late fifties, primarily by Westinghouse and GE; a few even included large "B" matrix systems to compensate Economic Dispatch for transmission losses. By the late fifties there were other vendors that got involved in AGC as well. By the late 50's/early 60's solid state operational amplifiers became available. These made developing the analog computing methods needed by these systems much easier.
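In today's terms, the pipeline just described — net interchange error plus frequency bias forming ACE, and an allocation to units by equal incremental cost — can be sketched in a few lines. The unit cost curves, bias setting, and sign conventions below are illustrative assumptions, not any particular utility's settings.

```python
# Sketch of the control pipeline described above: form the Area Control
# Error, derive a total generation requirement, then allocate it to the
# units by equal incremental cost. All numbers are invented examples.

def area_control_error(ni_actual, ni_sched, freq, freq_sched, bias):
    """ACE in MW; bias is the frequency bias setter in MW per 0.1 cycle."""
    freq_error_tenths = (freq - freq_sched) / 0.1
    return (ni_actual - ni_sched) + bias * freq_error_tenths

# Incremental cost curves: IC(P) = a + 2*b*P for each unit.
UNITS = [                       # (a, b, Pmin, Pmax) - hypothetical units
    (8.0, 0.010, 50.0, 300.0),
    (9.0, 0.008, 50.0, 400.0),
    (7.5, 0.015, 30.0, 200.0),
]

def dispatch(total_mw):
    """Find lambda so units loaded to equal incremental cost meet total_mw."""
    def output_at(lam):
        # Each unit runs where its incremental cost equals lambda, within limits.
        return [min(max((lam - a) / (2 * b), lo), hi) for a, b, lo, hi in UNITS]
    low, high = 0.0, 100.0
    for _ in range(60):              # bisection on the system lambda
        lam = (low + high) / 2
        if sum(output_at(lam)) < total_mw:
            low = lam
        else:
            high = lam
    return output_at((low + high) / 2)

ace = area_control_error(ni_actual=205.0, ni_sched=200.0,
                         freq=59.98, freq_sched=60.0, bias=-25.0)
current_total = 600.0
requirement = current_total - ace    # with this convention, negative ACE raises generation
print(f"ACE = {ace:.1f} MW, new requirement = {requirement:.1f} MW")
print([round(p, 1) for p in dispatch(requirement)])
```

The analog systems did exactly this with slidewires, mag amps, and diode function generators; the unit error signals (requirement minus actual) became the raise/lower pulses sent to the plants.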
Computer Controlled Analog

By the middle 60's there were enough advances in real time oriented digital computers that the Economic Dispatch function could be done on the computer. Thus was developed the Computer Controlled Analog type of system, where Economic Dispatch with transmission loss compensation was done in the computer and the result output as a digitally controlled set-point to the analog LFC system. Economic Dispatch was typically done at approximately three minute periods, while the other portion of AGC (typically called Load Frequency Control) must execute as often as every few seconds. There were dozens of systems like this shipped in the late 60's.

About this time the digital computers with real time operating systems evolved to the point where the entire AGC algorithm could be implemented on the computer, with the algorithm becoming more capable, for example reallocating generation limited by maneuvering limits. By this time I had been in the Systems Control group at Westinghouse for ten years, working on mostly analog systems. Computer based AGC now became common, and I was fortunate to work on the earliest of the all digital systems; one of the first was at PG&E. Also at about this time I left Westinghouse to go with Control Data Corporation, who had been a SCADA vendor but wanted to develop AGC. Within the year we had an AGC system at CDC and our first order for a System Operation Computer.

Vendors

AGC evolved from recorders and retransmitting slidewires in the 40's and 50's. The primary vendor at the start was Leeds and Northrup, although by the late fifties both General Electric and Westinghouse had developed analog systems. These three were the primary vendors through the 60's. After the 60's Control Data became an active vendor, and by the early 70's Harris Controls, IBM, TRW, and others became involved.

System Operation Computers

By the late sixties the term System Operation Computers (SOC) became common. These systems combined the SCADA with the forecast and scheduling functions and the AGC functions.

SCADA

The SCADA function was still the most important and basic function. It became more sophisticated, with RTU's capable of more advanced functions such as local analog data monitoring, local Sequence of Events Reporting (down to a few milliseconds), and local display and logging functions. There were even a few cases of substation computers taking over the RTU function.

Forecast and Scheduling Functions

The major addition was the inclusion of the forecast and scheduling functions. The three major functions involved are system Load Forecast, Unit Scheduling or Unit Commitment, and Interchange Negotiation. These functions are so important to the evolution of System Operation Computers that I will go into some detail about each.

Load Forecast

In order for a utility to schedule the generation it needs to meet the load each day, it needs an accurate forecast of the system load. Since correct scheduling of system resources can result in the savings of hundreds of thousands of dollars each day to a large utility (more on this later), an accurate load forecast is essential.

During the 40's and 50's the load forecasting function was generally done by a very experienced scheduling person in the system operation group, more or less manually, based on experience. They used patterns of loads that had actually occurred for different types of days recently. They then considered the detailed weather forecasts for the scheduling period, usually derived from the National Weather Service, as well as any events scheduled for the forecast period, such as ball games, fairs, etc. They then put together an hourly forecast for the scheduling period, usually two or three days. Typically the forecast was put together the night before and adjusted early in the morning. As the load grew during the day, it typically became necessary to adjust the forecast; however, the adjustments hopefully were minor, since much of the Unit Scheduling had already occurred by then. An experienced, accurate forecaster was highly regarded.

By the late 50's there were computer programs developed to help with this problem. This is a very complex function, and the early programs often had so much complexity that they would take longer than the forecast period to run (i.e. on the computer of the time, running a 24 hour forecast could take 30 hours). Simplifications were made, and the programs' results were then compared over time with the forecasts made by the experienced operator who had been forecasting. In the 60's the computer became fast enough to take over the forecasting function. Load Forecast programs were adaptive in that they used load data from the past several years to create a forecast based on the day of the year and the load patterns; this forecast was then adjusted according to the weather forecast and various scheduler entered data. The programs typically ran a forecast for a forecasting period of up to a week, although only the first several days were accurate enough for use. The computer generated forecasts were as good or better for the second or third day, but rarely beat the scheduler for the next day. However, they were consistently reasonable, and they didn't depend on your forecaster working past his retirement date. The load forecast programs of the 70's and 80's became ever more sophisticated and accurate.
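A minimal sketch of the adaptive "similar day" idea: average the hourly shape of recent days of the same type, then scale to the peak the scheduler expects after weather adjustment. The history and numbers below are invented placeholders, not a reconstruction of any production program.

```python
# Toy "similar day" load forecast, in the spirit described above.

def forecast_day(history, day_type, expected_peak_mw):
    """history: list of (day_type, [24 hourly MW]) tuples from past days."""
    similar = [loads for kind, loads in history if kind == day_type]
    # Average the shape of similar days, hour by hour.
    shape = [sum(day[h] for day in similar) / len(similar) for h in range(24)]
    # Normalize to the historical peak, then scale to the expected peak.
    peak = max(shape)
    return [round(expected_peak_mw * mw / peak, 1) for mw in shape]

history = [
    ("summer_weekday", [510, 480, 470, 475, 500, 560, 640, 720, 790, 840,
                        880, 910, 930, 950, 960, 955, 940, 910, 870, 820,
                        760, 690, 610, 550]),
    ("summer_weekday", [520, 495, 485, 490, 515, 575, 655, 735, 800, 855,
                        895, 925, 945, 965, 975, 970, 950, 920, 880, 830,
                        770, 700, 620, 560]),
]
# A hot day is expected: the scheduler enters a higher peak.
print(forecast_day(history, "summer_weekday", expected_peak_mw=1050.0))
```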
Unit Scheduling (Unit Commitment)

It is necessary for a utility to schedule enough generation to be on line at any time to meet the load plus the spinning reserve requirements. For a large utility this is a complex task. Each unit has a wide variety of constraints, such as start-up crew scheduling time, minimum on line time, time to minimum load, ramp loading time once on line, etc. Costs are incurred for each unit that is committed, such as crew start-up costs, unit start-up costs, fuel production cost, maintenance costs, etc. The units need to be scheduled hour by hour over the scheduling period. For any hourly load level there may be 20 combinations of units that will meet the load, and each must be dispatched and costed. Since many of the constraints extend over several hours, the schedule for one hour affects the next few hours. With twenty valid combinations to meet the load each hour, there are thousands of unique paths through a scheduling period of several days. However, there is only one optimal path meeting all the constraints — in other words, the one with the lowest over-all cost. The scheduling task is to find the combination of units for each hour of the scheduling period that meets all the constraints and is at the minimum over-all cost. It is very important to find the right schedule, because the difference between the best schedule and others may be tens of thousands of dollars a day, or millions in a year's time for a large utility.

Before the use of computers this task was done by a generation scheduling group. The early Unit Commitment programs attempted to find the optimum path by investigating all combinations. The problem was that the running time of the program was longer than the scheduling period (the running time for a 24 hour schedule might be 40 hours or more). Obviously the problem had to be simplified. Engineers developed the technique of Dynamic Programming, which studied the combinations for the current hour and a few in the future and past to pick an optimum path. Studies showed the path was not quite as good as checking all paths, but it was close, and much better than could be done manually. There were many variants of this approach developed in the 70's and 80's.
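The sketch below shows the hour-by-hour bookkeeping in its simplest form: keep the cheapest known way to reach each on/off combination at each hour, paying start-up costs on transitions. It is a toy (three units, merit-order dispatch, no minimum-run or ramp constraints, and a full rather than windowed search), not a reconstruction of any production program.

```python
# Toy dynamic-programming unit commitment, in the spirit described above.
from itertools import combinations

UNITS = {                      # name: (Pmax MW, $/MWh, startup $) - invented data
    "A": (400, 18.0, 2000.0),
    "B": (300, 22.0, 1200.0),
    "C": (200, 30.0, 500.0),
}
LOAD = [450, 420, 500, 650, 800, 780, 700, 550]   # hourly loads to cover

def hourly_cost(state, load):
    """Dispatch committed units in merit order; None if they can't carry the load."""
    if sum(UNITS[u][0] for u in state) < load:
        return None
    cost, remaining = 0.0, load
    for u in sorted(state, key=lambda u: UNITS[u][1]):   # cheapest energy first
        mw = min(UNITS[u][0], remaining)
        cost += mw * UNITS[u][1]
        remaining -= mw
    return cost

STATES = [frozenset(c) for r in range(1, 4) for c in combinations(UNITS, r)]

# Hour 0: assume the committed units carried over, so no startup charge.
best = {s: (hourly_cost(s, LOAD[0]), [s]) for s in STATES
        if hourly_cost(s, LOAD[0]) is not None}
for load in LOAD[1:]:
    nxt = {}
    for s in STATES:
        run = hourly_cost(s, load)
        if run is None:
            continue
        for prev, (cost, path) in best.items():
            start = sum(UNITS[u][2] for u in s - prev)   # pay to start new units
            total = cost + start + run
            if s not in nxt or total < nxt[s][0]:
                nxt[s] = (total, path + [s])
    best = nxt

cost, path = min(best.values(), key=lambda t: t[0])
print(f"total cost ${cost:,.0f}")
print([sorted(s) for s in path])
```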
Another complication that happens with some utilities is the need to include Hydro generating units. Hydro units have a completely new list of constraints, such as dam outflow, river level, pond level scheduling, etc. They require including a function to optimize hydro with respect to thermal usage. A few utilities even have some large units that can be used as pumped storage units; this adds a whole new level of complexity, as now not only do you have to figure out a way to commit a variety of units, but also when to pump and when to generate with the pumped storage units. During the 70's and later, these programs became more and more sophisticated.

The scheduling programs were relatively easy to justify, because you could take actual scheduling done manually and compare it to corresponding runs of the programs, and compare the production costs. Every time one of these studies was done, the results showed the programs would save millions of dollars over a year's time. They were a major justification for a large computer system.

Interchange Negotiation

During the 40's and 50's most utilities were connected with their neighbors and entered into interchange agreements to buy and sell power. There were two types of contracts: short term (usually next hour), called Economy A, and longer term (several hours up to six months), Economy B. Economy A and/or Economy B contracts were often available at any time, and the decision on whether to enter into a contract to buy or sell was complex.

The Economy A contracts were easier to evaluate, as they primarily only affected the load level of the units on line. Therefore they could be evaluated by using a study version of the Economic Dispatch function to determine the system production cost with and without the contract. Since these opportunities occurred often, a separate program, called Economy A Interchange Negotiations, was developed. This program would be run often during the day to evaluate contract opportunities, or to propose contracts to the neighboring utilities.

However, evaluating or proposing Economy B interchange contracts was a much more complex task. These contracts directly affected the Unit Commitment schedule, and therefore had to be evaluated including unit commitment. An interchange contract has different constraints than a generating unit, so it cannot just be considered as another potential unit to commit. To properly evaluate each contract you have to run unit commitment with and without the interchange. If there are many potential contracts, this can result in a lot of runs of the function, and each run required a lot of computing capability. To alleviate the possible long run times, several versions of the Economy B Interchange Negotiation program utilized a simplified version of unit commitment. These functions were major cost justifications for a larger System Operation Computer.
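The with/without evaluation reduces to a cost comparison. In the sketch below, schedule_cost() is a stand-in for a full unit commitment and dispatch run (a crude quadratic keeps the example runnable), and the contract terms are invented.

```python
# Sketch of the with/without evaluation described above: a purchase
# contract is worth taking when the re-run schedule cost, plus the
# contract energy payment, beats the base schedule cost.

def schedule_cost(hourly_load):
    """Placeholder for a unit commitment + dispatch over the period."""
    # Crude stand-in: quadratic production cost keeps the example runnable.
    return sum(10.0 * mw + 0.005 * mw * mw for mw in hourly_load)

def evaluate_purchase(load, contract_mw, hours, price_per_mwh):
    base = schedule_cost(load)
    reduced = [mw - contract_mw if h in hours else mw
               for h, mw in enumerate(load)]
    with_contract = (schedule_cost(reduced)
                     + contract_mw * len(hours) * price_per_mwh)
    return base - with_contract        # positive means the contract saves money

load = [600, 650, 720, 800, 780, 700]
saving = evaluate_purchase(load, contract_mw=100, hours={2, 3, 4},
                           price_per_mwh=14.0)
print(f"saving from contract: ${saving:,.0f}")
```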
Configurations

The most common control system configuration for these early SOC systems consisted of a dual computer configuration with dual ported process I/O and dual ported communication interfaces. Because of the requirement for very high availability for the critical power system control functions, complete redundancy was required. Achieving these reliability requirements meant all process I/O, man/machine equipment, and communication interfaces had to be dual ported and connected to each computer. One of the two process control computers acted as the primary processor in the configuration. The primary system periodically passed all time critical data to the secondary, in a process called "check-pointing." This was usually done over a high speed data link between the processors, and data was check-pointed as soon as it was received by the primary system. It was required to complete a fail-over to the secondary system for a primary system failure in a matter of seconds. The requirement was to achieve a "bump-less" transfer from the dispatcher's view: periodic functions such as AGC and SCADA would not be significantly interrupted, and execution of long running programs had to continue, although it would not be necessary to complete man/machine functions, such as display call-ups, that were in process. One of the most important and complex functions of these systems was Configuration Control, which monitored the system, initiated fail-over, and allowed operator interaction with these functions.

The computers used were process control capable processors with real time operating systems. CDC in the 70's used their 1700 series, GE used their GETAC 400 series computers, Westinghouse their PRODAC 500 series, IBM their 1800 series, and L&N used the Scientific Data Systems SDS 900 series. Harris and TRW used various machines.
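A schematic of the check-pointing arrangement, with the high speed data link modeled as a direct method call; the heartbeat timeout and data items are invented for illustration.

```python
# Minimal sketch of check-pointing and fail-over as described above.
import time

class Processor:
    def __init__(self, name):
        self.name = name
        self.realtime_db = {}                 # telemetered values, setpoints...
        self.last_heartbeat = time.monotonic()

    def checkpoint_from(self, primary):
        """Standby side: absorb the primary's latest time-critical data."""
        self.realtime_db.update(primary.realtime_db)
        self.last_heartbeat = time.monotonic()

primary, standby = Processor("A"), Processor("B")

# Primary receives a scan and immediately check-points it to the standby.
primary.realtime_db["tie_line_mw"] = 212.5
standby.checkpoint_from(primary)

def failover_if_dead(standby, timeout_s=2.0):
    """Configuration control: promote the standby if heartbeats stop."""
    if time.monotonic() - standby.last_heartbeat > timeout_s:
        print(f"fail-over: {standby.name} is now primary")
        return standby
    return None

print(failover_if_dead(standby))              # None: the pair is healthy
```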
Vendors

In the late 60's the major vendors for SOC systems were General Electric, Westinghouse, CDC, IBM, and Leeds and Northrup. Harris Controls and TRW became factors by the middle 70's.

By the late 60's the user interface had evolved from push-buttons to CRT's. At first the CRT's were character format black and white units, used in conjunction with lighted push-button panels. By the early 70's, colored CRT's with a limited graphic character set came into use. These could display dynamic one line electric system diagrams, thus making them useful to dispatchers. For cursor control, the trackball was used in conjunction with the keyboard cluster. Dispatchers were reluctant to give up lighted push-button panels for keyboards, but quickly adapted to them for the greatly increased functionality.

Energy Management Systems

Network Analysis Functions

The addition of Network Analysis functions in the middle 70's brought on the nomenclature Energy Management Systems (EMS). Dispatchers always had a need for a fast load flow calculation that they could use to investigate requests to allow clearances of equipment: they needed to know what the effect on the system would be if certain equipment was taken out of service. In the 40's and 50's the only help for them was engineering studies done by the system planning group using analog network analyzers, or later on, early power flow programs. These studies could not be done rapidly and were therefore not of great value for daily operations use.

A load flow calculation for a large network (300 nodes, 700 branches, for example) is a difficult calculation, as the network is complex, involving both real and imaginary values. A direct solution of the network equations could take several days on the computers available in the sixties. In the middle sixties there was a lot of work to develop a faster load flow program for the computer, and an iterative approach was developed that greatly reduced the computing time. The first commonly used method was the "Gauss-Seidel" method, which involved solving the equations for each node or "bus" and then iterating through the network. This typically required several hundred iterations for a solution, but was still much faster than a direct solution. By the late sixties it was possible to get load flow programs that would solve in 10 to 15 minutes. This resulted in much work to develop a higher speed load flow. Later a method known as "Newton-Raphson" was developed, which built a change matrix relating the change in bus injections to line flows. Iterating with this method resulted in solutions with four to six iterations. Since the power system network model matrix is sparse (i.e. there is not nearly a line between every bus), sparse matrix solution techniques were developed to speed up the solution.

I was involved in an interesting project in the late sixties attempting to get a fast power flow for dispatchers' use; it was a hybrid load flow. We developed an analog representation of the network using operational amplifiers. We then used the computer to output bus injections to the analog model and read in the line flows with analog inputs. The analog calculation solved the I = YV portion of the calculation, and we iterated through the computer for the bus injections. It took about five or six iterations, and we were able to get solutions in about 10 seconds for a 300 bus system. However, it was difficult to maintain the analog and digital hardware, and the system was not very reliable. Also, the digital computers were rapidly getting faster by the late sixties, making an all digital approach the best choice. I was presenting a paper on the hybrid load flow at a conference in Minneapolis when CDC contacted me, and I ended up going with them.

The Dispatcher's Load Flow was a main part of the Network Analysis functions, but there was another function that became a cornerstone of these systems. It involved the concept of on-line security analysis. The idea was to build a model of the power system, then update the model with real time data so that it closely resembled the actual system in real time. The model then could be periodically modified with possible contingencies that could happen to the system, and each of these would be checked, using load flow techniques, to report any failures. Possible failures then could be handled as further contingencies. It was desired to operate the system such that no first contingency would cause further trouble. If all contingencies that were feasible or likely and significant were checked, the dispatcher would be aware of the current risks to the system, and could then make changes to avoid these risks before a predicted major outage could occur. However, for a large system there may be hundreds of valid contingencies to check, and it was necessary to check them as often as every fifteen minutes. A very high-speed load flow was needed.

It was difficult to use a conventional load flow as the model, however, as the telemetered data did not ordinarily include all the bus injection data that is necessary to update a load flow; instead the telemetered data consisted primarily of line flows. To help resolve this problem, a group of engineers at MIT led by Fred Schweppe developed the State Estimator. This program took the telemetered data base and matched it to a load flow calculation, using a least squares method, to create a model of the system that could use all of the telemetered data. It even used redundant values, and indicated which measurements seemed to be out of line or erroneous. Thus not only did the function recreate the needed model, it also served to verify the telemetered data base. Now that a model was created that was consistent with a load flow solution, it could be checked with contingencies using a load flow.
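The least squares matching can be illustrated with a linearized (DC approximation) example: line flow measurements z relate to bus angles x through a matrix H, and the state comes from the weighted normal equations. The 3-bus network, reactances, and readings below are invented; a real estimator iterates on the full AC equations.

```python
# Linearized sketch of the least-squares matching described above:
# given redundant line-flow measurements z = H x + error, solve the
# normal equations for the state x (DC-approximation bus angles).
import numpy as np

# DC model: flow on line (i,j) = (angle_i - angle_j) / reactance.
# Bus 0 is the reference (angle 0); state x = angles at buses 1 and 2.
H = np.array([
    [-10.0,   0.0],    # flow 0->1, reactance 0.1
    [  0.0,  -5.0],    # flow 0->2, reactance 0.2
    [ 10.0, -10.0],    # flow 1->2, reactance 0.1
    [-10.0,   0.0],    # second, redundant meter on line 0-1
])
z = np.array([0.52, 0.31, -0.10, 0.49])       # telemetered flows (per unit)
W = np.diag([1.0, 1.0, 1.0, 1.0])             # equal measurement weights

# Weighted least squares: x = (H^T W H)^-1 H^T W z
x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x
print("estimated bus angles:", x)
print("measurement residuals:", residuals)    # a large residual flags a bad meter
```

The redundant meter on line 0-1 is the point: the estimator still produces a consistent model, and the residuals show which of the two disagreeing readings is suspect.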
Another technique, developed by Brian Stott, was the Decoupled Power Flow, where the real and reactive portions of the solution were separated during the iterations. The result was simplified load flows that solved in seconds for large networks and allowed the function of Security Analysis to be accomplished. Later additions to this grouping of functions included the Optimal Power Flow, which allowed production cost to be minimized considering line losses, security constraints, etc.

Configurations

EMS configurations were made up of various numbers of pairs of computers. Redundancy is required in order to achieve a high availability; a typical specification requires 99.8% or better availability for critical functions. One processor was generally the on-line processor and the other the standby. The standby processor is kept up to date by a "checkpoint" process that runs at frequent intervals. Some vendors' configurations were made up of as few as two main computers, while others used as many as six dual processors of different sizes. One technique, used by CDC and others, was to use smaller process control oriented computers for the "front end" functions such as SCADA, and larger computers, more capable of engineering functions, for functions such as network analysis. Often the user interface functions were also distributed to two or more separate computers. It was not uncommon to see configurations with up to a dozen computers. The configuration control software required to manage these large distributed processor configurations was very sophisticated. In the later years — the late 90's and later — very powerful server oriented processors became available, and the trend towards distributed processing increased. Some configurations now used thirty or more processors.
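The arithmetic behind the redundancy requirement is worth a quick check; the 97% single-processor figure below is an assumed example.

```python
# Back-of-envelope check on the 99.8% availability requirement and why
# a redundant pair was the standard answer.

HOURS_PER_YEAR = 8760

required = 0.998
print(f"99.8% allows {(1 - required) * HOURS_PER_YEAR:.1f} hours down per year")

single = 0.97                     # assumed availability of one processor
# With an independent standby, the function is lost only when both are
# down at once (ignoring switch-over time and common-mode failures).
pair = 1 - (1 - single) ** 2
print(f"one processor: {single:.2%}, redundant pair: {pair:.4%}")
```

With these assumed numbers, a single machine misses the requirement by a wide margin, while the pair comfortably exceeds it — which is exactly why the dual configuration, dual ported I/O and all, became the standard pattern.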
Consultants

As these systems became larger and more complex, it became more and more difficult for a utility to specify and purchase a system. Starting in the late sixties to early seventies, there developed a group of consulting companies who were capable of helping the utility specify, purchase, and in some cases implement these systems. The consultants developed expertise that was difficult for the utility to acquire, since the utility went through the process of acquiring a new system only every 10 years or so, while the consultants worked on several systems a year.

Most of the consulting companies in this business evolved from personnel who worked for vendors. One early consulting firm was MACRO, which was formed mostly from people that left Leeds and Northrup. ECC was formed from people that left Westinghouse, and EMCA from people that left CDC. Other smaller firms, such as Stagg, Mark Enns, and others, were from people that used to work at GE. Systems Control Incorporated was associated with CDC, then became a consultant, and later a vendor. A split-off from SCI became ESCA, which again developed from a consultant into a vendor. In the late 80's and 90's there was much consolidation in these consulting firms, and many were bought out by European companies. ECC and MACRO were purchased by KEMA, a Dutch company, and ESCA by a French company.

Vendors

Systems in the class to be called EMS developed in the seventies. General Electric and Leeds and Northrup evolved into the market from their involvement with System Operation Computers, as did Control Data Corporation. CDC became a driver in the market because their Cyber computers offered the large scale computing required by the network analysis functions. Harris Controls got involved in the middle seventies. TRW was a major vendor throughout the period, and IBM was a factor several times during the heaviest EMS activity. Other vendors who were active for a while were Stagg Systems and Systems Control Inc. Later in the period a split-off from SCI, ESCA, became a major factor in the market.

During the energy crisis of the late seventies, when electric utility load decreased, and with the subsequent reduction in government defense contracts, several defense contractors tried to gain a foothold. Boeing and North American Rockwell took several large contracts each during this period, but left quickly when they found out utilities were not as apt to forgive cost over-runs as the government was.

The EMS market developed rapidly in the seventies and seemed to peak during the late eighties. At its peak, in the eighties, it was not uncommon for there to be a dozen bidders on an EMS specification, and there were close to a dozen vendors active by the late eighties. By the early nineties there was consolidation in the market, as it could not sustain that many vendors, and foreign vendors began to buy out some of the US vendors. ESCA was bought out by the French, and CDC by the Germans. By the mid nineties there were fewer vendors: Siemens (CDC), ESCA, and other foreign vendors. Westinghouse and GE left, as the traditional US utility apparatus vendors either got out of the business or combined with foreign vendors. However, as time went on and the market consolidated, there were some smaller companies, split-offs of other companies, that became active. Two of these that evolved from CDC are OATI and OSI.

Projects

These projects were large and expensive; some large EMS systems cost over 40 million dollars, and they would go on for five or more years. A typical project would take up to a year for the specification to be released, and the spec could have come from as many as eight different consultants. The negotiation period with the vendors often lasted a year before the contract was awarded. Then the project detailed definition took another year or so, and implementation and testing up to three years. After the system was shipped, the installation and testing often went on for several years. A project might involve four or five consultants, 10 to 15 utility personnel, and up to fifty vendor people.

At different stages of a project, consultants and utility personnel lived at the vendor's site, as that was where the detailed work was initially done, and later vendor people stayed at the utility location. Therefore the total involvement between the utility, consultant, and vendor on a project could last as long as ten years, and there developed a close relationship between the various project staffs that lasted for long periods of time. As time went on during a project, all of the people involved usually took ownership of it, and they all worked together for a satisfactory conclusion. Similar government projects usually resulted in large delays and cost over-runs; however, because of the mutual involvement, this was usually avoided on these projects.