Dan Barnes follows the historical path of the technology that has allowed banking to progress in the past 80 years and looks at what lies ahead in the next 20 years.

The last century was the golden age of automation. By January 1926, when the first edition of The Banker was launched, mechanisation had changed the face of the world, contributing to ‘efficiency’ in warfare, transport and industry. Henry Ford had been operating his car production plant in the US for more than a decade. Within the next four years, two-thirds of American homes would have electricity and almost half would have telephones. Technologies were varied, using hydraulics, pneumatics and electricity (the last two even in the office environment).

Pioneering spirit

The financial services sector has often been rivalled only by the military as an investor in, and early adopter of, technology, reflecting the importance that people place upon protecting their money – second only to protecting lives. Early adoption is always a risky business and has created headaches as well as headway: the ATM and the internet, for example, were both taken up in the 20th century in an effort to reduce costs, yet their initial impact was to increase the cost of doing business. However, financial services’ enthusiasm to test new ideas has set it well ahead of other industries in its use of IT, to the point that some financial institutions now sell systems in competition with pure IT companies.

Initially, technologies were demonstrated to show the public that specific basic tasks could be conducted more efficiently or in greater volume than previously thought possible. Generating interest in a technological solution required much the same skills then as it does today, notes Bob Tramontano, vice-president of self-service engineering and product management at technology provider NCR.

“In 1884, when John Patterson created the global market for the cash register, he was trying to make sure that [banks] could account internally for all the transactions so that you could reduce theft,” Mr Tramontano explains. However, Patterson realised that the consumer market had to accept a company’s use of a system if it were to be widely adopted. “He was approaching it from a consumer perspective as well, with the idea that an individual would want proof of a transaction. He spent a lot of time trying to create demand for the receipt. If the consumer doesn’t understand it, then people won’t adapt to it,” says Mr Tramontano.

Aiming for consistency

By the 1920s, banks in Europe were using standardised procedures to ensure consistency of process and to provide an accurate picture of exposure across the organisation, which was vital as consolidation and broadening customer bases increased the scale of operations, demographically and geographically. The development of procedures and investment in technologies focused mainly on bookkeeping and accounting (for example, the Burroughs Adding Machine, the millionth of which was produced in 1926) and the processing of paperwork. Cheques were a growing concern then and one that has still not been fully addressed, as the US ‘Check 21’ regulation indicates (this allows the truncation and electronic processing of cheques to facilitate clearing).

By 1926, the National Cash Register Company (NCR) had been producing cash registers with electric motors for 20 years, and two years earlier the Computing-Tabulating-Recording Company had become globally known as International Business Machines (IBM), a name it had previously applied only to its Canadian operations. The Midland Bank (now HSBC) in the UK had trialled ledger posting machines, which were later adopted to record account balances with debits and credits, creating a historical record for tracking and verifying account activity.

Post-war progress

Then came the Great Depression, followed by the Second World War – the latter producing its own share of technological advances, though not in banking. After the war ended, technology in the finance industry continued to advance apace. Innovators looked to their existing systems and attempted to reach ahead, using technology as a significant differentiator. Unsurprisingly, the first technological developments in the banking industry were aimed at reducing the cost of repetitive, labour-intensive activities such as clearing.

During the 1950s, the Stanford Research Institute began developing the original mainframe for use in financial services – the Electronic Recording Machine, Accounting (ERMA) – in response to Bank of America’s (BoA) request for an automated solution to cheque processing. Building ERMA included the creation of magnetic ink character recognition (MICR). The initial version was developed in 1955 and later replaced by ERMA Mark II, which used transistors and magnetic core memory in place of vacuum tubes and magnetic memory drums. General Electric (GE) was asked to produce the final models for BoA, installing the 32 requested machines in 1959. NCR then went on to sell ERMA as the NCR 204, which was retained as BoA’s main computing system until 1970.

A mainframe allowed banks to automate back-office processes and speed up transaction times well beyond the capacity of manual workers. One bank that became an early adopter of the technology was Barclays. Tony Gandy, a consultant to the industry and former technology editor at The Banker, recalls that Barclays purchased at least one mainframe, an EMIDEC 1100, in 1961. Mainframes varied greatly and, Dr Gandy explains, the latest and most advanced technologies were not always used, depending on access to components. “[It] was a fascinating design. During the mid and late-1950s, transistors were still rare and expensive in Europe, so EMI took a different route. They used magnetic cores, both to provide the 1k memory and for the logic circuits. They were cheap and reliable, [but ultimately] overtaken by transistor technology.”

Primitive mainframes

The machine was no one-trick pony, however, says Dr Gandy, and other factors (such as printer and punch-card technology) were prime differentiators, if a little primitive. “The EMIDEC 1100 was one of the first machines, like ERMA, developed for the commercial market, and the speed of the central processor was only one issue. To help provide output devices for the 1100, EMI turned to a business machines company, Powers-Samas – later to become part of ICT and then ICL. Its printers worked well, but the EMI engineers were always concerned by the use of bicycle chains in the printer mechanism.”

The mainframe market leader was IBM – its main US competitors (Burroughs, Control Data Corporation, GE, Honeywell, NCR, RCA and UNIVAC) were often referred to as the ‘seven dwarfs’ – precisely because its peripherals were reliable and fast. “If you had looked inside an IBM machine it may have had the same chain but it would have looked flash and it would have worked faster,” notes Dr Gandy.

Barclays was also behind the world’s first ATM (or, more precisely, cash dispenser), which the bank unveiled in north London in 1967. Mr Tramontano notes that with this development, the industry had opened a new door to customer-facing technology: “When Barclays introduced the first ATM globally, they were very much a thought leader and they were trying to provide convenience.”

The early ATMs, which dispensed fixed amounts of cash in exchange for tokens and cards, had to achieve the same public perception of usefulness. As they did so, banks began to appreciate that the data gathered from these devices held further commercial potential. Banks were already aggregating data, but this was only a precursor to the databases and warehouses of the future. “We began to look at how we could create internal users for the information, giving us two separate user bases: the external customer and the internal bank employee,” says Mr Tramontano.

As IT’s potential began to be realised, it was applied in different ways in different organisations, and often in different areas of the same bank. Some institutions, Société Générale for example, began to create offshore facilities, seeking the same cost efficiencies that are pursued today, to prepare for the financial impact of the predicted growth in IT projects. The vector for this growth was the minicomputer. Digital Equipment Corporation (referred to as DEC or Digital) led the field with its VAX minicomputer, becoming the second-largest computer company in the world during the 1980s.

Move to decentralise

“Minicomputers effectively allowed banks to purchase ‘bank-in-a-box’ technology. It allowed banks and departments to act in a decentralised way,” says Dr Gandy. Moving away from the mainframe as the only source of data processing reduced the cost of doing business, meaning banks could start up without investing in expensive mainframe hardware. It also gave individual departments the ability to purchase their own technology, reducing the power of the centralised IT department.

The packaged software market opened up to provide companies with standardised processes, fulfilling the same role as mechanised systems had earlier in the century. In trading rooms, the provision of digital data feeds by market data providers added to departments’ motivation to purchase their own systems. At the same time, the falling price of hardware components was making the microcomputer powerful enough to become a legitimate business tool.

The introduction of the personal computer (PC) built by IBM created further potential, as individuals were able to use IT tools to create their own spreadsheets and documents. IBM eventually became the market leader in microcomputers thanks more to its success in producing the peripherals around the computer than to the processing power of the PC itself. In the late 1980s and early 1990s, as microcomputers increased in power, minicomputers such as IBM’s AS/400 were used less and local area networks (LANs) of PCs gained favour.

The next major change was the creation of client-server architecture, separating the client, normally a graphical user interface (GUI) running on a PC, from the data used in an application, which is held on the server. The term ‘server’ originally applied to the application itself but in common parlance has come to mean the minicomputer running the application. Data could now be transferred between systems, greatly increasing the potential for its use, although translation was required.
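
A minimal sketch of that division of responsibilities, using modern tools and hypothetical account data purely for illustration: one process owns and serves the data, while the client holds nothing locally and simply presents what it receives.

    # Illustrative client-server split (hypothetical data, not any bank's actual system)
    import json
    import socket

    ACCOUNTS = {"12345678": {"holder": "A N Other", "balance": "1,250.00 GBP"}}

    def serve_once(port: int = 9000) -> None:
        """Server side: owns the data and answers a single query with a JSON reply."""
        with socket.create_server(("", port)) as srv:
            conn, _ = srv.accept()
            with conn:
                account_no = conn.recv(64).decode().strip()
                conn.sendall(json.dumps(ACCOUNTS.get(account_no, {})).encode())

    def show_account(account_no: str, port: int = 9000) -> None:
        """Client side: no local data, only presentation of the server's reply (the GUI's job)."""
        with socket.create_connection(("localhost", port)) as conn:
            conn.sendall(account_no.encode())
            record = json.loads(conn.recv(1024).decode())
            print(f"Account {account_no}: {record.get('holder')}, {record.get('balance')}")

It is this split that allowed many lightweight clients to share one authoritative copy of the data held on the server.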

The use of credit and debit cards and ATMs had become increasingly widespread, while the microcomputer had become familiar technology as consumers used it in their homes. In 1991, the Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research, or CERN) began publicising the World Wide Web, a method for transmitting data across a network, now commonly referred to as the internet (although strictly speaking, that term refers to the network itself). Telephone banking services had already been launched successfully and, as the internet and home PCs gained popularity in developed nations, the concept of internet-only banking was born.

Up to standard

Large-scale applications designed for specific industries were being bought to ensure regularity of procedure across banks that had expanded globally, but challenges remained because of the lack of common standards between systems. The development of XML as a standard has begun to address this problem by adding metadata tags to data so that systems can understand what they are being asked to read. The latest iteration in the architectural field is the service-oriented architecture concept, in which applications are built from components that can be reused – for example, a customer relationship management tool may include a component for extracting data from a data warehouse that can be reused by a risk management tool.
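
To illustrate the idea of metadata tags, a hypothetical fragment (the element names below are invented for illustration and not drawn from any particular banking standard) wraps each value in a tag describing what it is, so any receiving system knows how to interpret the record:

    <payment>
      <amount currency="GBP">250.00</amount>
      <debtorAccount>12345678</debtorAccount>
      <creditorAccount>87654321</creditorAccount>
      <bookingDate>2006-11-01</bookingDate>
    </payment>

Because the description travels with the data, a risk system and a customer relationship tool can both read the same record without a bespoke translation layer being built for every pair of systems.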

Some areas of IT are now running into the limits of the technology itself. Howard Edelstein, former president and CEO of BT Financial Services, explains that network latency is no longer an IT issue but one of physics. “I think the speed of light is going to be a limiting factor so we’re going to have to come up with some creative answers,” he says.

“At the moment we locate people’s trading algorithms right next to the liquidity centre that they want electronically, so they don’t have to spend 20 milliseconds getting there. Who would have thought of that three years ago? The future’s going to be that the buy-side and the sell-side have to be situated within a shorter distance to the venue than the speed of light would otherwise tolerate. And when you’re starting to deal with physics, as opposed to engineering, then you had better have some smart people focused on your business demands.

“The speed of light is a physical constraint that you have to work around. If CEOs aren’t tuned into [complex regulations like] MiFID yet, they certainly aren’t looking at the speed of light.”
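
The arithmetic behind the constraint is straightforward. As a rough Python sketch (the distances are illustrative assumptions, not figures quoted by Mr Edelstein; the speed used is the commonly cited figure for light in optical fibre, roughly two-thirds of its speed in a vacuum):

    # Illustrative only: minimum one-way propagation delay over optical fibre
    SPEED_IN_FIBRE_KM_PER_S = 200_000  # approx. two-thirds of the vacuum speed of light

    def one_way_delay_ms(distance_km: float) -> float:
        """Lower bound on one-way delay in milliseconds, ignoring switching and processing."""
        return distance_km / SPEED_IN_FIBRE_KM_PER_S * 1000

    print(one_way_delay_ms(4000))  # a 4,000km route: 20.0ms before any processing at all
    print(one_way_delay_ms(10))    # co-located near the venue: 0.05ms

Whatever the exact distances, the point stands: once the engineering inefficiencies have been removed, distance itself sets the floor on latency, and the only remaining lever is to move closer to the venue.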

The next phase

HSBC has recently opened a showcase within the bank demonstrating near-future technologies: window projections displaying information that window shoppers can access via touchscreen; security software that detects any inert objects left in a branch while tracking customer movement within the bank; voice-recognition document-filling tools; and a thin-client terminal no larger than a plug socket. Amanda Kenney, who is in charge of e-channels development at HSBC, explains: “The technologies on display aren’t in use yet but some are really not that far away.”

Bridget Van Kralingen, global managing partner, financial services sector at IBM, says that systems will not be taking over the banks yet, but they will be getting smarter in the near future. “We have a product called the semantic engine that takes demographic information, historical information on a particular client and the practices of the best buyers, sellers and brokers in that bank, weaves them together and creates a dynamic script,” she says.

“You could take a new person, have them interact with it, and craft and tailor suggestions for products that work with a client and it tells you why. So you have a very transparent advisory conversation with a client, basically being said by this engine. It’s a very classical tactical thing and it allows your people to do their work better.”
