Yearly Archives: 2014

IBM £1.3m G-Cloud deal shows SME mindset challenge in government

Posted on January 14, 2014 at 8:55 am

In the year and a bit since the G-Cloud was launched, the government has made much of its promise to open up the exciting world of government IT procurement to small and medium-sized firms.

This is a lofty and important ideal, as for too long the big firms have had a stranglehold on public sector buying decisions, possibly because everyone is so scared of a costly mistake that they stick with the tried and trusted suppliers.

Nothing wrong with that per se, but when you’re trying to overhaul government IT for the modern age while driving down costs, the ability to turn to small, nimble and cheap companies is important, and helps to energise digital services within government.

However, while the G-Cloud was meant to address this issue and give IT buyers the chance to purchase services with confidence, it appears there is a long way to go to change the mindset of the public sector.

Data from the Cabinet Office unearthed by V3 shows that of the £4m spent on the G-Cloud in April, one third went to the most iconic of giant IT companies, IBM, which secured deals totalling £1.3m.

The Home Office was the one signing off the deals, and it must pain Cabinet Office minister Francis Maude – who admits the service is ‘underused’ – that his colleagues down the road are ignoring the pleas to embrace SMEs and are instead sticking with the old monopoly suppliers like Big Blue.

It’s not IBM’s fault, and it may well be that its services were best suited to the government’s needs, but it doesn’t paint a very welcoming picture for small firms trying to boost their balance sheets when they see US giants hoovering up major contracts yet again.


Posted in Cloud Hosting

Dell leaving Amazon and Microsoft to fight for the public cloud

Posted on January 12, 2014 at 5:30 pm

SAN JOSE: Dell claims the public cloud is a “race to the bottom” and told V3 that it pulled the plug on its public cloud offering because it did not want to take on the “800lb gorillas” currently in the market.

Dell’s decision to exit the public cloud market came as a shock, as rival vendors are pushing hard to get a piece of a market currently dominated by Amazon Web Services. However, Sam Greenblatt, chief architect of Dell’s Enterprise Solutions Group, told V3 that the firm was not willing to commit the resources needed to fight the current industry incumbents.

Greenblatt said: “There are several 800lb gorillas in the public cloud market and we are not one of them. Building that market, you really have to understand in detail how you are going to be the low-cost provider and at the same time deliver the best value. Some people call it a race to the bottom in the public cloud.”

According to Greenblatt, Dell’s internal research found more than 90 percent of its customers were interested in private clouds. “Of the 90 percent, 75 percent is actually doing virtualisation, not cloud. People say cloud, but they are usually managed hosting providers, they are not really doing the cloud market. So we decided we were not going to invest heavily in public and we were going to let the 800lb gorillas slug it out in that market, but we are not giving up the private cloud business, or OpenStack.”

Greenblatt was keen to state that the firm was fully behind the OpenStack project, but he did have some sobering words about the hype surrounding the cloud. “It’s a transport mechanism and a computing architecture, it’s not something magical that you sprinkle fairy dust on and all your problems go away,” he said.

Greenblatt also said Dell’s decision to get out of the public cloud market had meant the firm let a number of contractors go, though he didn’t have exact figures. Instead, Greenblatt noted that Dell had a lot of contractors, which suggests the move was at least partly a cost-cutting measure.

Given Dell’s internal research, perhaps it is not surprising that the firm didn’t want to take on the likes of Amazon, Google and Microsoft in the public cloud market. Although Greenblatt didn’t refer to these companies by name, and said margins were not the primary reason behind the decision to abandon the public cloud, it is becoming clear that new vendors will find it increasingly hard to mount a challenge to the public cloud incumbents.

Posted in Cloud Hosting

Windows Server 2012 R2: key features

Posted on January 10, 2014 at 5:20 pm

Windows Server 2012 R2 is a comprehensive refresh of Microsoft’s server platform, with advances across the board in storage, networking and Hyper-V, according to the firm.

However, some features stand out as “game changing”, according to Jeff Woolsey, principal programme manager for Windows Server Virtualisation. These include storage tiering in software, and a multi-tenant gateway to support software defined networking (SDN) in cloud deployments.

Storage tiering is an update to the Storage Spaces feature of Windows Server 2012. It creates a pool of storage from a bunch of disks directly attached to the server, with thin provisioning and resiliency provided by the file system.

In the upcoming R2 release, customers can tier that storage using a combination of SSDs and spinning disks, delivering a dramatic boost in I/O performance.

“We’re taking mainstream SSDs, applying them to hard disks, and giving you phenomenal performance,” Woolsey told V3.

In a demo at the TechEd conference, Woolsey showed how a server with just spinning disks achieved 7,400 input/output operations per second (IOPS). The same task with four SSDs added for tiering delivered 124,000 IOPS – roughly a 16-fold performance improvement.

“Now you can set up a scale-out file server with JBOD storage and JBOD SSD, and deliver the same performance, resilience and fault-tolerance as a SAN at a fraction of the cost,” Woolsey said.
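For admins who want to experiment, tiering is driven through the Storage Spaces PowerShell cmdlets. Below is a minimal sketch only, assuming a server with poolable SSDs and hard disks attached; the pool, tier names and sizes are made up for illustration:

    # Pool every disk that is eligible for Storage Spaces (all names here are hypothetical)
    $disks = Get-PhysicalDisk -CanPool $true
    New-StoragePool -FriendlyName "TieredPool" `
        -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks

    # Define one tier per media type: fast SSDs and high-capacity spinning disks
    $ssd = New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "SSDTier" -MediaType SSD
    $hdd = New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "HDDTier" -MediaType HDD

    # Create a mirrored virtual disk spanning both tiers; frequently accessed
    # data is automatically migrated to the SSD tier
    New-VirtualDisk -StoragePoolFriendlyName "TieredPool" -FriendlyName "TieredSpace" `
        -StorageTiers $ssd, $hdd -StorageTierSizes 100GB, 900GB -ResiliencySettingName Mirror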

R2 also supports deduplication for active virtual machines, which will enable customers to slash the costs of storage to support virtual desktop infrastructure (VDI) deployments.

“This has been one of the blockers to VDI – when customers actually see the cost of the storage to implement it, it just doesn’t make business sense,” said Woolsey.

While the dedupe is processed in software, it does not significantly affect performance, he said, because “servers are never compute bound, as most of the time they are waiting around for I/O and storage”.
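Switching the feature on is a per-volume operation; a minimal sketch, assuming the Data Deduplication role service is installed and using a hypothetical drive letter:

    # Enable dedupe on the volume holding the VDI virtual disks, using the
    # new Hyper-V usage profile that R2 adds for running VMs
    Enable-DedupVolume -Volume "E:" -UsageType HyperV

    # Later, check the space savings achieved by the optimisation jobs
    Get-DedupStatus -Volume "E:"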

Meanwhile, the multi-tenant gateway extends the network virtualisation features introduced in Windows Server 2012 to allow service providers to better support multiple customers in their cloud infrastructure.

“Customers want to be able to bring their network to that cloud, and to do that you need a gateway. Today, there are some hardware gateways, but you have to buy the right one, and so we just provide that in software under R2,” Woolsey said.

System Center provides the control plane used to create and manage network virtualisation, while the data plane lives in Windows Server, he explained. The R2 release also extends Microsoft’s PowerShell automation framework, turning it into a “fundamental building block for operating the cloud,” according to Woolsey.
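The policy those two planes share boils down to per-tenant address mappings, which can also be inspected and set directly from PowerShell on each host. A rough sketch, with entirely made-up addresses, MAC and virtual subnet ID:

    # Map a tenant VM's customer address (CA) onto the physical provider
    # address (PA) of the host for virtual subnet 6000; every value here is hypothetical
    New-NetVirtualizationLookupRecord -CustomerAddress "10.0.1.5" `
        -ProviderAddress "192.168.4.22" -VirtualSubnetID 6000 `
        -MACAddress "00155D010105" -Rule "TranslationMethodEncap"

    # List the mappings this host currently enforces
    Get-NetVirtualizationLookupRecord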

“If you are an IT pro, you have to have PowerShell on your resume today. You have to,” he said.
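One concrete expression of that building block is Desired State Configuration, the declarative configuration feature arriving with PowerShell 4.0 in R2. A minimal sketch, with a made-up node name, declaring that a server should run IIS and pushing that state out:

    # Declare the state a node should be in; the DSC engine makes it so
    Configuration WebTier {
        Node "Server01" {
            WindowsFeature IIS {
                Ensure = "Present"
                Name   = "Web-Server"
            }
        }
    }

    # Compile the configuration to a MOF file and apply it to the node
    WebTier -OutputPath C:\DSC
    Start-DscConfiguration -Path C:\DSC -Wait -Verbose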

Windows Server 2012 R2 will be available as a preview release later this month and is set to ship commercially later this year.


Posted in Cloud Hosting

Salesforce agrees $2.5bn ExactTarget buy to boost marketing offerings

Posted on January 8, 2014 at 1:31 pm

Salesforce has agreed to acquire cloud marketing firm ExactTarget for $2.5bn, a 53 percent premium on the firm’s pre-deal trading price.

Salesforce intends to use the acquisition to bolster its digital marketing capabilities, as the cloud pioneer recognises the burgeoning IT spending clout that is being wielded outside the traditional IT department.

Marc Benioff, chief executive at Salesforce.com, said: “The [chief marketing officer] is expected to spend more on technology than the CIO by 2017. The addition of ExactTarget makes Salesforce the starting place for every company and puts Salesforce.com in the pole position to capture this opportunity.”

Salesforce has already invested heavily in its marketing offerings, paying $276m for social media monitoring firm Radian6 in 2011 and $689m for Buddy Media in 2012. But given those previous deals and the premium it was willing to pay for ExactTarget, Salesforce clearly felt it had further work to do in strengthening its arsenal.

ExactTarget’s 6,000-plus client list, which includes the likes of Coca-Cola, Nike and Gap, may have added to its allure, wrote Angela Eager, an analyst with TechMarketView, on a company blog.

“Salesforce.com is putting a lot of faith as well as a lot of money into ExactTarget. It will substantially expand Salesforce’s Marketing Cloud, which has good coverage of social channels thanks to previous Buddy Media and Radian6 acquisitions,” she said. “Salesforce.com has momentum but as this deal indicates it is having to invest more extensively to maintain or increase the pace.”

The deal, which is expected to close by 31 July, comes hot on the heels of IBM splashing out $2bn on cloud firm SoftLayer Technologies.

Posted in Cloud Hosting

IBM boosts cloud computing arsenal with SoftLayer buy

Posted on January 6, 2014 at 7:38 am

IBM has announced the purchase of public cloud computing infrastructure firm SoftLayer in a sizeable boost to its cloud offerings to enterprises.

Terms of the deal were not disclosed, but figures around the $2bn mark were floated by numerous sources online. IBM said the deal was designed to boost its mix of cloud computing offerings so it could meet the needs of all firms, especially those looking for public cloud infrastructure.

Ric Telford, IBM vice president for Cloud Services, told V3 that the deal would help make the firm a ‘one-stop shop’ for cloud services as the demand for public cloud services from enterprises increases.

“One cloud does not fit all and there is no one approach to the cloud, it’s dependent on workloads, or the applications you want to deploy as to whether you want a private, public or hosted environment,” he said.

“That’s what intrigued us about SoftLayer to round out our portfolio of cloud offerings. They have all three models but one management layer, so we can broaden our existing portfolio and meet the demands of customers.”

Telford said that in recent years IBM has seen a growing number of enterprise customers show a willingness to operate in public cloud environments.

“In the early years most deployments were private cloud, but now we’re seeing many firms are more comfortable dealing with public cloud offerings around software as a service and platforms,” he added.

The SoftLayer offerings will be incorporated into a new business unit within IBM’s Global Technology Services business, and will be offered alongside the firm’s existing SmartCloud portfolio so that IBM can meet any customer’s cloud needs.

“Ultimately this gives us the breadth of flexibility. We know firms like other vendors’ offerings, but they don’t have the breadth of options that they can get with IBM,” added Telford.

“So you may want to run some applications in the private cloud, some in the public and have the ability to move them back and forth as you need and so now you can do this with our portfolio.”

Dallas-based SoftLayer has around 21,000 customers and owns 13 data centres in locations across the US, Asia and Europe, which will allow IBM to meet the needs of those working within the restrictions of data privacy laws.

“SoftLayer has a strong track record with born-on-the-cloud companies, and our move today with IBM will rapidly expand that footprint globally as well as allow us to go deep into the large enterprise market,” said Lance Crosby, chief executive of SoftLayer.

The use of public cloud computing services is growing rapidly, with providers such as Amazon Web Services hosting several notable companies within their infrastructure, such as digital streaming firm Netflix. However, the perils of the public cloud with regard to outages have been shown on several occasions.

Posted in Cloud Hosting

G-Cloud head departs as government IT procurement service hits £22m spend

Posted on January 4, 2014 at 3:51 pm

The outgoing head of the government’s G-Cloud service, Denise McDonagh, has confirmed her departure from the programme, which is moving under the auspices of the Government Digital Service (GDS).

McDonagh, who took charge of the G-Cloud programme last April, said the service would “forever change the way we commission and use IT in the public sector”.

“I can now hand over G-Cloud to GDS, safe in the knowledge that we have started such a groundswell of support and momentum for change that G-Cloud is here to stay,” she wrote on a G-Cloud blog.

“This has been the most enjoyable roller-coaster ride ever.”

G-Cloud has become a critical part of the government’s IT strategy, and is touted as the best way of procuring IT services cheaply, without getting tied into multi-year, multi-million pound contracts. It has also been heralded as the best way to open up public sector contracts to small and medium IT suppliers.

The service was the brainchild of Chris Chant, who spent more than three decades wrangling with the labyrinthine nature of Whitehall IT. Chant took to Twitter on Tuesday to sing the praises of the G-Cloud team.

Nonetheless, despite the upbeat tone of McDonagh’s announcement, there is plenty of work to be done before G-Cloud comes anywhere close to attaining its goal of changing public sector IT procurement. According to McDonagh, as of April 2013 a paltry £22m had been spent via G-Cloud, a drop in the ocean of government IT spending.

Earlier this week, Cabinet Office minister Francis Maude labelled the service “underused”.

McDonagh will revert to her role as IT director at the Home Office, a position she held in addition to overseeing G-Cloud. 

Posted in Cloud Hosting

G-Cloud spend hits £22m as IBM wins £1.3m Home Office deal

Posted on January 2, 2014 at 7:45 pm

Spend on the government’s under-pressure G-Cloud service has now hit £22m after several sizeable deals were signed off in April, including a spend of £1.3m with IBM by the Home Office.

The figure marks an increase of around £4m on the £18m that had been spent through the service by March, and comes amid pressure on the platform to demonstrate value for money, as it has failed to really ignite interest in the public sector since its launch a year ago.

This was driven by deals such as those won by IBM, as well as other notable agreements, such as a £68,000 deal signed off by the Cabinet Office with Steria and a £205,000 deal signed off by the Ministry of Justice with systems integrator i2N.

V3 contacted the Home Office for details of the services it purchased from IBM but had received no reply at the time of publication. IBM had also not responded.

While the increase shows the public sector is still interested in the platform, the figure is a tiny amount compared with overall government IT spending and won’t do much to relieve the pressure on the service.

On Monday Cabinet Office minister Francis Maude admitted the service was ‘underused’, and the head of the service, Denise McDonagh, announced she was stepping down from managing the G-Cloud as it moves to the Government Digital Service.

Posted in Cloud Hosting
