Dimension Data Unveils Next Generation Technology Lifecycle Assessment Service

Published October 11th, 2009 - 09:28 GMT

Dimension Data, the $4.5 billion specialist IT solutions and services company, has announced the release of its next-generation technology lifecycle assessment service. The move comes on the back of strong demand from global organisations and is driven by requirements gathered in executing over 300 assessments across five continents.

“Dimension Data is pleased to introduce the Technology Lifecycle Management Assessment (TLM Assessment) to our clients in the Middle East,” says Nader Atout, Sales Director – Gulf Region at Dimension Data. “This service enables organisations to expand their network asset planning activities beyond a single manufacturer. It also provides the foundation for expanding asset planning beyond network assets to include server platforms and storage devices, among others. The TLM Assessment can also identify serial information and provide clients with maintenance coverage information, ensuring that they are neither over- nor under-covered.”

Rich Schofield, Global Business Development Manager, Network Integration at Dimension Data, says interest in the new TLM Assessment has been extremely positive. “The TLM Assessment has evolved thanks to what we learned from the hundreds of assessments we executed over the last 18 months.
“The data we gathered was built back into the service. We added new features, increased automation, and reduced report delivery time. These improvements, combined with more readable and useful assessment reporting, make it more cost effective and much easier for clients to remove risk, waste, and uncertainty from their IT infrastructure,” Schofield explains. He adds that new statistics and trends from the more than 300 assessments executed to date will be released early next year in the 2010 edition of the Dimension Data Network Barometer Report.

“In the first 150 assessments executed in 2008, 73% of networking devices were found to be running with known security vulnerabilities. This exposes a business to both external and internal security attacks and breaches, and could seriously jeopardise an organisation’s ability to meet regulatory compliance.

“The ramifications for IT infrastructures are huge. More importantly, however, this statistic highlights the fact that organisations could be protecting their networks better but either don’t know they need to, or don’t have the processes in place to do so.”

Dimension Data also found an average of 30 configuration issues per network device. The financial services sector, with an average of 36, had the highest number of configuration errors per device.

“The concern here is compliance,” Schofield says. “That, along with the fact that the most frequently misconfigured category is authentication, holds all sorts of implications for organisations. This amounts to the equivalent of leaving your front door unlocked even though the door has a perfectly functional deadbolt,” he explains.

Almost half of all network devices were found to have entered the obsolescence cycle, putting them at risk of extended downtime and unplanned, forced expenditure to regain business continuity.

“The great irony is that every one of these problems is avoidable through appropriate lifecycle management, which allows companies to maximise the useful life of their network assets with a rational approach that minimises risk. Organisations that don’t get the help that’s available now could find they’re being held hostage by their networks in the near future.”

The price tag for downtime is significant, and it is growing as businesses enable more operational processes with IT. Schofield explains: “Some industry analysts calculate that system downtime can cost as much as $42,000 per hour in a large corporate - with a typical business experiencing around 87 hours of downtime a year - and up to 3.6% of an organisation’s annual revenue.”
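Taken at face value, the analysts’ figures quoted above imply a substantial annual bill. A minimal back-of-the-envelope sketch of that arithmetic (the $500 million revenue figure is purely illustrative and not from the article):

```python
# Back-of-the-envelope annual downtime cost, using the figures
# quoted in the article: $42,000 per hour, ~87 hours per year.
HOURLY_COST = 42_000      # USD per hour of downtime (large corporate)
HOURS_PER_YEAR = 87       # typical annual downtime, per the article

annual_cost = HOURLY_COST * HOURS_PER_YEAR
print(f"Estimated annual downtime cost: ${annual_cost:,}")  # $3,654,000

# The article also cites losses of up to 3.6% of annual revenue;
# for a hypothetical business with $500m in revenue that would be:
revenue = 500_000_000
revenue_loss = int(revenue * 0.036)
print(f"3.6% of revenue: ${revenue_loss:,}")  # $18,000,000
```

On these assumptions, the hourly-rate estimate alone comes to roughly $3.65 million a year, which is why the article frames lifecycle management as a risk and cost issue rather than a purely technical one.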

According to Schofield there’s not been much help available until now, because of the rapid and haphazard way in which IT has evolved – with the Internet in particular triggering a scramble for e-enablement that left no time to develop technology lifecycle management best practice.

“However, both the IT industry and its customers are maturing. They’re also realising that their organisations, and therefore their networks, are going to come under even more pressure as Web 2.0, software as a service (SaaS), video, voice, and mobility applications become dominant.

“There’s also the drive to save costs through initiatives such as virtualisation and standards-based IT service management, pervasive connectivity, convergence, and standardisation on IP - all of which have impacts on the network. In other words, they’re realising that the way to sensibly manage all the options is to execute regular assessments via an automated service that is constantly evolving and maturing along with the technologies themselves.”

Schofield believes that technology lifecycle management assessment methodologies should be improved continuously. That’s because the networks they assess are constantly changing.

“It’s vital to keep ploughing experience of network management back into a TLM Assessment service – just as it is vital for clients to continue adjusting their understanding of their network assets.” 

© 2000 - 2019 Al Bawaba (www.albawaba.com)
