Electronic data processing
Electronic data processing can refer to the use of automated methods to process commercial data. Typically, this uses relatively simple, repetitive activities to process large volumes of similar information. For example: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions to an airline's reservation system, and billing for utility services. The modifier "electronic" or "automatic" was used with "data processing", especially c. 1960, to distinguish human clerical data processing from that done by computer.
History
Herman Hollerith, then at the U.S. Census Bureau, devised a tabulating system that included cards, a punch for making holes in them representing data, a tabulator and a sorter. The system was tested in computing mortality statistics for the city of Baltimore. In the first commercial application of electronic data processing, Hollerith machines were used to compile the data accumulated in the 1890 U.S. Census of population. Hollerith's Tabulating Machine Company merged with two other firms to form the Computing-Tabulating-Recording Company, later renamed IBM. The punched-card and tabulating machine business remained the core of electronic data processing until the advent of electronic computing in the 1950s.
(Image: the introduction of electronic data processing at a factory in Wolfsburg, 1973.)
The first commercial business computer was developed in the United Kingdom in 1951, by the J. Lyons and Co. catering organization. This was known as the 'Lyons Electronic Office' – or LEO for short. It was developed further and used widely during the 1960s and early 1970s.
By the end of the 1950s, punched card manufacturers (Hollerith, Powers-Samas, IBM and others) were also marketing an array of computers.
Early commercial systems were installed exclusively by large organizations. These could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent organizational and cultural changes.
At first, individual organizations developed their own software, including data management utilities, themselves. Different products might also have their own 'one-off' bespoke software. This fragmented approach led to duplicated effort, and producing management information required manual effort.
High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted: a common economy was to drop the century from stored dates, which eventually led to the 'millennium bug'.
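As an illustrative sketch (the record layout and pivot rule below are invented for this example, not drawn from any particular system), dropping the century from a stored date forces later software to guess it:

```python
# Illustrative sketch: a compact DDMMYY date record, as early systems
# commonly used to save storage, must have its century reconstructed.
from datetime import date

def parse_ddmmyy(s: str) -> date:
    """Expand a six-character DDMMYY record into a full date.

    The century must be guessed; here a fixed pivot of 70 is used,
    a common and ultimately fragile convention.
    """
    day, month, yy = int(s[0:2]), int(s[2:4]), int(s[4:6])
    century = 1900 if yy >= 70 else 2000  # the 'millennium bug' lurks here
    return date(century + yy, month, day)

print(parse_ddmmyy("311299"))  # 1999-12-31
print(parse_ddmmyy("010100"))  # 2000-01-01, but only if the pivot guess is right
```

Any fixed pivot eventually misclassifies some dates, which is exactly the ambiguity that surfaced as the year 2000 approached.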
Data input required intermediate processing via punched paper tape or punched card before separate input to the computer: a repetitive, labor-intensive task, removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.
Data storage was strictly serial, on paper tape and later on magnetic tape: the use of data storage within readily accessible memory was not cost-effective until hard disk drives were invented and began shipping in 1957. Significant developments took place in 1959, with IBM announcing the 1401 computer, and in 1962, with ICT making delivery of the ICT 1301. Like all machines of this period, the processor together with the peripherals (magnetic tape drives, disk drives, drums, printers, and card and paper tape input and output) required considerable space in specially constructed air-conditioned accommodation. Often parts of the punched card installation, in particular the sorters, were retained to present the card input to the computer in pre-sorted form, reducing the processing time involved in sorting large amounts of data.
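Strictly serial media shaped the classic batch pattern: transactions were sorted into the same key order as the master file, then applied in a single sequential pass. The following is a minimal sketch of that pattern in modern Python (the field names and record layout are invented for illustration):

```python
# Illustrative sketch of a tape-era sequential master-file update:
# both files are sorted by account key, so one linear pass suffices.
def update_master(master, transactions):
    """Merge sorted (key, balance) master records with sorted
    (key, amount) transaction records; yield the new master file."""
    tx = iter(transactions)
    pending = next(tx, None)
    for key, balance in master:
        while pending is not None and pending[0] == key:
            balance += pending[1]   # apply each matching transaction in turn
            pending = next(tx, None)
        yield key, balance          # write the updated record
    # Transactions with no matching master record are silently dropped in
    # this sketch; real systems rejected them for correction and resubmission.

old_master = [("A001", 100), ("A002", 250), ("A003", 75)]
day_txns   = [("A001", -20), ("A001", 5), ("A003", 30)]
print(list(update_master(old_master, day_txns)))
# [('A001', 85), ('A002', 250), ('A003', 105)]
```

Retaining the card sorters, as described above, meant the transaction file arrived already in key order, so the expensive sort step was avoided.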
Data processing facilities became available to smaller organizations in the form of the computer services bureau. These offered processing of specific applications, e.g. payroll, and were often a prelude to the purchase of customers' own computers. Organizations used these facilities for testing programs while awaiting the arrival of their own machines.
These initial machines were delivered to customers with limited software. The design staff was divided into two groups. Systems analysts produced a systems specification and programmers translated the specification into machine language.
Literature on computers and EDP was sparse, available mainly through articles appearing in accountancy publications and material supplied by the equipment manufacturers. The first issue of The Computer Journal, published by The British Computer Society, appeared in mid-1958. The UK accountancy body now named The Association of Chartered Certified Accountants formed an Electronic Data Processing Committee in July 1958 with the purpose of informing its members of the opportunities created by the computer. The Committee produced its first booklet in 1959, An Introduction to Electronic Computers. Also in 1958, The Institute of Chartered Accountants in England and Wales produced a paper, Accounting by Electronic Methods. The notes indicated what computers appeared capable of and the possible implications of using them.
Progressive organizations attempted to go beyond the straight transfer of systems from punched card equipment and unit accounting machines to the computer, toward producing accounts to the trial balance stage and integrated management information systems. New procedures redesigned the way paper flowed, changed organizational structures, called for a rethink of the way information was presented to management, and challenged the internal control principles adopted by the designers of accounting systems. But the full realization of these benefits had to await the arrival of the next generation of computers.
Today
As with other industrial processes, commercial IT has in most cases moved from a custom-order, craft-based industry, where the product was tailored to fit the customer, to multi-use components taken off the shelf to find the best fit in any situation. Mass production has greatly reduced costs, and IT is available to the smallest organization. LEO was hardware tailored for a single client; today, Intel Pentium and compatible chips are standard and become parts of other components which are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. Microsoft and IBM have at various times been influential enough to impose order on IT, and the resultant standardizations allowed specialist software to flourish.
Software is available off the shelf: apart from products such as Microsoft Office and IBM Lotus, there are also specialist packages for payroll and personnel management, account maintenance and customer management, to name a few. These are highly specialized and intricate components of larger environments, but they rely upon common conventions and interfaces.
Data storage has also been standardized. Relational databases are developed by different suppliers to common formats and conventions. Common file formats can be shared by large mainframes and desktop personal computers, allowing online, real-time input and validation.
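As a small illustration of those common conventions (an example added here, using Python's built-in sqlite3 module purely as a convenient stand-in for any relational engine), the same standard SQL runs essentially unchanged across suppliers:

```python
# A minimal sketch: the SQL below is standard enough to run unchanged on
# most relational engines; SQLite is used only because it ships with
# Python and needs no server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, name TEXT, balance NUMERIC)")
conn.execute("INSERT INTO account (id, name, balance) VALUES (1, 'Smith', 100.0)")
conn.execute("UPDATE account SET balance = balance - 20 WHERE id = 1")
for row in conn.execute("SELECT name, balance FROM account"):
    print(row)  # ('Smith', 80.0)
conn.close()
```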
In parallel, software development has fragmented. There are still specialist technicians, but these increasingly use standardized methodologies where outcomes are predictable and accessible. At the other end of the scale, any office manager can dabble in spreadsheets or databases and obtain acceptable results. Specialized software is software written for a specific task rather than for a broad application area; these programs provide facilities specifically for the purpose for which they were designed.