January 2 2008

New year’s solutions

Alan Duerden explores five important New Year’s resolutions that CIOs should be making for 2008 to ensure they deliver value for money.

The start of the New Year heralds new challenges for the banking industry. With the increasing importance of technology in the financial markets, no one sits more exposed than the chief information officer (CIO), who in the coming years will be expected to perform a complex juggling act to drive continued change across the business. In this article, The Banker identifies five major themes that CIOs will have on their radar for 2008.

Budget tightening

Undoubtedly, 2007 threw banks a few curveballs and, with tougher market conditions ahead for 2008, the pressure will be on CIOs to demonstrate that they have delivered more ‘bang for their buck’ and that the business has got value out of the investments they have been making. New regulation across the financial markets is also consuming a large share of the IT budget, with demands from the business to replace data warehouses and to initiate programmes that allow a single customer view.

CIOs will be asked to achieve more with less money and, specifically, will have to demonstrate cost transparency as the business asks the question: “Are we getting value out of our IT investment?”

One supporter of this view is Frédéric Ponzo, managing director at Net2s, an enterprise connectivity consultancy. He believes that there will be big shifts in banks’ IT budgets over the next couple of years and that the environment will be as tough as it was in 2001 and 2002.

“Budgets are going down,” says Mr Ponzo. “At best, budgets are flat for the banks that haven’t been too exposed to the credit crunch, but the next couple of years are going to be tough in terms of IT spend. All CIOs, every single one of them, are expecting to have their budget for 2008 decreased; whether it starts on April 1 or December 1, budgets are going down.”

For example, says Mr Ponzo, Net2s has clients that are pre-paying for 2008 services that have not yet been delivered, because the cost can be taken from the 2007 budget, “putting the money in the fridge” as he calls it. One of the firm’s clients recently signed a $1m cheque for services that Net2s had not yet provided, but the client had spare money in its 2007 budget and knew that 2008 would be a leaner time, he says.

Jean Cadroy, director of information services at Société Générale, agrees that CIOs will be watching their purse strings and believes that, depending on what the global market trends throw up, banks may have to reduce their ambitions later on in the year. “Regarding the subprime crisis, market experts think that the first two quarters of 2008 will be critical. If things worsen, we will have to reduce our ambitions and this is why we must keep flexibility,” he says.

Banks that use contractors as part of their IT operations may find that this is one of the first areas to be hit by a budget reduction, and one where they feel they can make savings. “We currently have a ratio of 55% internal and 45% contractors,” says Mr Cadroy. “This is one of the flexibility levers we could trigger. However, this is only a hypothesis and not our current working assumption.”

IT real estate management

While traditionally many banks have been notoriously bad at using the speculate-to-accumulate model when spending money on the IT backbone that runs through their business, one area that CIOs will increasingly be looking at is how they can optimise the usage of their computer ‘real estate’. There are two key technologies that they will be keeping their eyes on: grid computing and virtualisation.

Grid computing allows an organisation to distribute calculations, programmes or processing power across a large number of machines. Multiple independent computing clusters act like a grid because they are composed of nodes that are not located within a single administrative domain or at a single geographical location. “We are progressively using grid processing in our infrastructure,” says Sahba Saint-Claire, CIO of wholesale banking at Standard Chartered Bank. “In most cases, this gives us the capacity to cope with our projected growth.”
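As a rough illustration of the idea, the sketch below farms a batch of toy revaluations out to workers and gathers the results. A local process pool stands in for the grid nodes, and the pricing function and figures are invented purely for the purpose of the example.

```python
# Illustrative sketch only: a real grid schedules work across many machines,
# but the same "split the batch, farm it out, gather the results" pattern can
# be shown with Python's standard process pool standing in for grid nodes.
from concurrent.futures import ProcessPoolExecutor

def revalue(trade):
    """Stand-in for a pricing calculation that would run on a grid node."""
    notional, rate = trade
    return notional * (1 + rate) ** 5   # toy five-year compounding

# Hypothetical trades: (notional, rate)
trades = [(1_000_000, 0.04), (2_500_000, 0.035), (750_000, 0.05)]

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=3) as grid:
        values = list(grid.map(revalue, trades))   # one task per "node"
    print(values)
```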

Virtualisation is essentially software that runs on one or more machines to create virtual machines. It allows the user to run multiple instances of an operating system, server or storage device as a single resource, or to make a single operating system, server or storage device behave as multiple resources. One of the benefits of virtualisation is that it allows a bank to reconfigure its server structure without touching the hardware.

If a bank has a certain number of machines and blade servers, and one application requires more servers than another, it can use its virtualisation software to reallocate that computing power to where it will be used best. This gives the bank a massive amount of flexibility, because traditionally the same outcome would have been achieved either by buying more machines or by wiping the existing machines’ hard drives and reinstalling the operating system.
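The sketch below uses an entirely hypothetical inventory of blades and applications to illustrate the point: the reallocation is a change to a software record of who gets what capacity, not a trip to the machine room. In practice this would be done through a hypervisor's management tools.

```python
# Conceptual sketch only: capacity moves in software, not by re-racking kit.
# Blade names, applications and vCPU counts are invented for illustration.
blades = {"blade-01": 16, "blade-02": 16}          # vCPUs available per blade
allocation = {                                      # vCPUs assigned per application
    "risk-batch":   {"blade-01": 12, "blade-02": 4},
    "pricing-grid": {"blade-01": 4,  "blade-02": 12},
}

def reallocate(from_app, to_app, blade, vcpus):
    """Shift vCPUs between applications sharing the same physical blade."""
    if allocation[from_app][blade] < vcpus:
        raise ValueError("not enough spare capacity to move")
    allocation[from_app][blade] -= vcpus
    allocation[to_app][blade] += vcpus
    # The physical blade is never over-committed.
    assert sum(app[blade] for app in allocation.values()) <= blades[blade]

# The end-of-day risk run needs more power; pricing is quiet overnight.
reallocate("pricing-grid", "risk-batch", "blade-02", 8)
print(allocation)
```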

“One of the things about virtualisation is that it allows you to put an abstraction layer between the computing power and the application logic, and in planning for capacity utilisation, it is hard to know which processes are going to take off and consume a lot of capacity versus others,” says Gordon Burnes, vice-president of marketing at OpenPages, a risk management solutions provider. “What virtualisation allows people to do is to be responsive to the needs of the business as processes change more dynamically.”

Although virtualisation is not a new technology, it is a major solution that has started to become mainstream in the financial sector. A handful of banks are already using it and there are some big virtualisation projects kicking off this year because it allows banks the flexibility to reconfigure the allocation of their data centre in hours or days, rather than spending weeks with the engineers racking in new boxes of kit. Coupled with this, most banks run at 5% to 10% virtualisation on average, and just by moving the percentage of virtualisation within an institution from 10% to 20%, twice as much work can be done for the same amount of money.

Mr Cadroy of Société Générale sees the issue of IT ‘real estate’ management as something that CIOs will be focused on in the coming years. “In our 2008-2010 plan, we have five big initiatives across our organisation to optimise our efficiency,” he says. “Two of them directly relate to IT rationalisation and optimisation of the IT infrastructure. For instance, we currently have more than 10 data production centres for Société Générale Securities Services alone, so there is room for optimisation.”

While organisations are driven by maximising shareholder value and trying to increase efficiency, it happens that using the data centre more efficiently is also related to the fashionable ‘green’ issues of reducing power consumption and the space that data centres take up. So, just as people are figuring out how to use less power, virtualisation has emerged as one way of doing so. “We do not see companies making decisions about deployment platforms with green issues as the primary driver; it is more of a coincidental merging of issues,” says OpenPages’ Mr Burnes.

Data management

Data centres are not the only concern: data management is also going to be a big theme for 2008, under the tide of regulation that is breaking over banks and financial institutions.

New regulations have come from multiple sources, each with its own objectives and motives, whether regulators or industry bodies, and a lot of investment is being ploughed into aggregating and calculating information that did not previously exist in order to make the required disclosures and conform to these new rules.

“This year is certain to be an annus horribilis for any CIO who is not 100% sure of one of their company’s greatest assets: information,” says Laurence Trigwell, associate vice-president, financial services, at Cognos, a business intelligence solutions provider. “Without easy access to one version of the truth and up-to-date and accurate data, it will be impossible to review, analyse and plan for the upcoming year. Meanwhile, having visibility of and understanding the company’s exposure to risk remains an unachievable dream,” he says.

Where an organisation wants to move from a large number of different data marts [data silos] to a single data warehouse, there are data management and governance challenges associated with the process, because it must ensure that every customer record retains the specific attributes required by each data mart.

Initially, these data marts sprang up because a centralised data structure was not sufficient for individual business needs. The advantages of centralised data structures are now being recognised, however, and the migration to a data warehouse-type model can deliver operational and management savings while also enhancing the customer management experience.

The challenge is to fulfil requirements from all the different parts of the business so that a centralised data model works for each of the business structures. Technology can help with a lot of those issues, and there is an opportunity for organisations to exploit that data internally to make better decisions and align processes more consistently than was previously possible.
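A minimal sketch of that consolidation challenge, with invented field names and records, shows why governance matters: the merged warehouse record must retain every attribute that each mart's users depend on, and anything lost in the merge should be caught rather than silently dropped.

```python
# Hypothetical illustration of the consolidation step described above:
# customer records from two departmental data marts are merged into one
# warehouse record, and the attributes each mart requires must survive.
retail_mart = {
    "C-1001": {"name": "A N Other", "branch": "Leeds", "segment": "retail"},
}
cards_mart = {
    "C-1001": {"name": "A. N. Other", "card_limit": 5000, "risk_band": "B"},
}

required = {"name", "branch", "segment", "card_limit", "risk_band"}

def consolidate(customer_id):
    merged = {}
    for mart in (retail_mart, cards_mart):
        merged.update(mart.get(customer_id, {}))
    missing = required - merged.keys()
    if missing:
        raise ValueError(f"record {customer_id} lost attributes: {missing}")
    return merged

warehouse = {cid: consolidate(cid) for cid in retail_mart}
print(warehouse["C-1001"])
```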

However, Mr Burnes believes the issue is not just one of technology innovation. “The first step is to establish a data governance council and get people to buy into a new process of managing data. It’s not a technology issue; it’s a process and governance issue,” he says. “You need to think about the different policies that you are going to implement as part of that data management strategy and once you have come up with a policy, then the policies can start to drive action.”

The CIO can take a real lead in this process by showing the business how the end vision of centralised data is going to save the organisation money in the long run. Once this has been done, a data governance council can be established, a set of policies on which the whole business agrees can be initiated and the migration of data to a common platform can begin.
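As a hypothetical illustration of how an agreed policy can start to drive action, the policy in the sketch below is simply data signed off by the governance council, and a basic check against it produces a remediation work list. All field names, records and thresholds are invented for the example.

```python
# Policy as data: the governance council agrees the rules, a routine check
# turns breaches into a work list. Everything here is invented for the sketch.
policy = {
    "required_fields": ["customer_id", "name", "data_owner"],
    "max_age_days": 365,                     # records must be reviewed yearly
}

records = [
    {"customer_id": "C-1001", "name": "A N Other", "data_owner": "Retail Ops", "age_days": 90},
    {"customer_id": "C-2002", "name": "J Bloggs", "age_days": 500},   # no owner, stale
]

def breaches(record):
    problems = [f for f in policy["required_fields"] if f not in record]
    if record.get("age_days", 0) > policy["max_age_days"]:
        problems.append("review overdue")
    return problems

work_list = {r["customer_id"]: breaches(r) for r in records if breaches(r)}
print(work_list)   # -> {'C-2002': ['data_owner', 'review overdue']}
```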

One area of data management where it is thought that the CIO can be more proactive is risk management. With an increasing focus on risk management – the knock-on effect of regulatory change – individual parts of the business are starting to create new data marts to manage individual types of risk, as they did for customer data. As already explained, some banks are in a quandary about their customer data because it is spread out across the enterprise; now the same thing is happening with risk management data, which is housed by the individuals responsible for managing each particular risk.

What the CIO risks now is allowing a proliferation of those databases and, consequently, a repeat of the customer data problem in the risk management space. Risk management is a discrete function that should be managed by the business as a whole, but the current way of operating is giving rise to data silos that should be integrated. (See risk management round table, page 52.)

Talent shortage

Financial services companies are finding it the most difficult to recruit for their IT vacancies, compared with the retail and public sectors, according to the results of a 2007 skills survey commissioned by silicon.com and published last August. The survey also reports that the financial services industry is suffering specific shortages of IT staff with programming language skills and of workers with database skills.

“We want to continue recruiting talented and knowledgeable people. In the securities services industry, this exercise is particularly difficult. It is a kind of squaring-the-circle exercise,” says Mr Cadroy. “You have to find people who know new technologies and they are rather young; then you need people who know the securities industry well and they are rather senior. So, you need people who not only have the technical skills but also have the management skills, and in an industry that is becoming more and more global and international, they need to be able to speak good English.”

Recruitment has been a problem for a couple of years. There is a talent shortage in the IT space in the financial sector and the problem that banks face is that the quality of a company is proportional to the quality of the staff. Mr Ponzo believes that the best way a CIO can safeguard against the talent shortage problem is by diversifying their sources of recruitment, and so their risk. He also believes that if a bank can utilise small targeted outsourcing partners that have the critical mass in a specific area of the business, they can capitalise on that experience if IT talent dries up in one area of the market.

“For example, we have 120 guys working on exchange connectivity, which is 10 to 15 times more than any bank,” says Mr Ponzo. “We have that critical mass that means, beyond those individuals, we have intellectual property that belongs to the company and allows us to maintain a high standard of service.”

Continued investment in the development and nurturing of existing talent in the business is another area that the CIO can explore so they are not exposed if IT talent in specific areas of the market dries up.

“There are shortages in small pockets but it is not a major issue for us,” says Mr Saint-Claire. “Continuous training of talent is a priority and we have internal academies around certain areas of our technology environments. The education system needs to be modified and we should not be teaching computer science as a pure theoretical science. It is time for specialisation of the discipline in our universities for certain industries. This would be extremely beneficial to the students and the institutions.”

Infiniband

Infiniband has been in existence for a couple of years but is still a technology in its infancy outside the trading space. In essence, Infiniband is a high-speed interconnection for transferring data between systems. Feeding data in the traditional way, through an operating system and network card, was found to produce a latency bottleneck. What Infiniband enables is known as remote direct memory access, whereby one machine writes to another machine’s memory directly via an Infiniband card installed in each of the machines. The source machine knows the remote address on the destination machine and writes to it directly, cutting out much of the network stack, dramatically reducing latency and increasing throughput.

The technology is used anywhere that speed is of critical importance, for example in programme trading and algorithmic trading, which are likely to leverage Infiniband as a high-speed interconnection to get the data in, do some analysis and get the data out as quickly as possible.

Massoud Maqbool, strategic relationships manager at Tervela, a networking and middleware technology provider, says that with a middleware appliance, there is data going in, and a system (usually a feed handler) that reads the data and sends it to a middleware layer that is responsible for distributing the data. In the case of algorithmic trading, data is taken from the feed handler machine, goes to another machine that has the middleware software on it, goes from there to the server where the algorithmic trading engine is located and then goes to another machine for order routing.

“If you look at those different components where milliseconds or microseconds can make the difference with respect to price, and you say that price slippage is directly correlated to time, then the longer the time it takes for you to execute an order, the higher the risk of a price slipping,” says Mr Maqbool. “Infiniband is something that allows you to cut down the time it takes to execute on a particular price and is the interconnect transport mechanism between the different hardware machines and servers.”

With Infiniband dramatically reducing latency, the knock-on effect is that, theoretically, more calculations can be run on the same piece of kit. If an algorithm is computationally intensive, it might only be possible to process a certain number of data points because of the time it takes to get all the data into the system and manipulate it. However, Infiniband reduces the overhead of moving data, so an algorithm may be able to take in a larger number of data points, allowing more intelligent and complex analysis because latency is no longer the limiting factor.
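A back-of-envelope sketch, using invented figures rather than vendor benchmarks, shows how cutting the per-message transfer overhead raises the number of data points an algorithm can consider within a fixed decision window.

```python
# Illustrative arithmetic only: all timings are made up for the sketch.
# If every data point crossing the interconnect carries a fixed transfer
# overhead, fewer microseconds per point means more points per decision.
def points_within_budget(budget_us, per_point_transfer_us, per_point_compute_us):
    """How many data points fit in the time budget for one trading decision."""
    return int(budget_us // (per_point_transfer_us + per_point_compute_us))

budget = 1_000          # one millisecond to decide, in microseconds
compute = 2             # microseconds of calculation per data point

print(points_within_budget(budget, per_point_transfer_us=8, per_point_compute_us=compute))  # 100 via the conventional stack
print(points_within_budget(budget, per_point_transfer_us=1, per_point_compute_us=compute))  # 333 with a low-latency interconnect
```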

Currently a tool that has been well suited to the front office, Infiniband can be adapted to work in other areas of the bank. It can be used for real-time risk monitoring, position keeping, pricing and order matching – anywhere the organisation has to take in large amounts of data and distribute them.

Many middleware vendors are now porting their software to Infiniband and offering Infiniband-compliant versions of their products, allowing their clients to take advantage of high-speed interconnects where it matters.
