Monthly Archives: January 2014
Posted on January 10, 2014 at 5:20 pm
Windows Server 2012 R2 is a comprehensive refresh of Microsoft’s server platform, with advances across the board in storage, networking and Hyper-V, according to the firm.
However, some features stand out as “game changing”, according to Jeff Woolsey, principal programme manager for Windows Server Virtualisation. These include storage tiering in software, and a multi-tenant gateway to support software defined networking (SDN) in cloud deployments.
Storage tiering is an update to the Storage Spaces feature of Windows Server 2012. It creates a pool of storage from a bunch of disks directly attached to the server, with thin provisioning and resiliency provided by the file system.
In the upcoming R2 release, customers will be able to tier that storage using a combination of SSDs and spinning disks, delivering a dramatic boost in I/O performance.
“We’re taking mainstream SSDs, applying them to hard disks, and giving you phenomenal performance,” Woolsey told V3.
In a demo at the TechEd conference, Woolsey showed how a server with just spinning disks achieved 7,400 input/output operations per second (IOPS). The same task with four SSDs added for tiering delivered 124,000 IOPS, a more than 16-fold performance improvement.
“Now you can set up a scale-out file server with JBOD storage and JBOD SSD, and deliver the same performance, resilience and fault-tolerance as a SAN at a fraction of the cost,” Woolsey said.
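Tiering is configured through the standard Storage cmdlets. The following is a minimal sketch assuming a JBOD enclosure presenting both SSDs and HDDs; the pool, tier and disk names, the sizes and the mirror resiliency setting are illustrative rather than taken from the demo:

```powershell
# Pool all physically attached disks that are eligible for pooling
# (assumes a JBOD enclosure with both SSDs and HDDs).
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "TieredPool" `
    -StorageSubSystemFriendlyName "*Storage Spaces*" `
    -PhysicalDisks $disks

# Define one tier per media type; Storage Spaces promotes hot data to SSD.
$ssdTier = New-StorageTier -StoragePoolFriendlyName "TieredPool" `
    -FriendlyName "SSDTier" -MediaType SSD
$hddTier = New-StorageTier -StoragePoolFriendlyName "TieredPool" `
    -FriendlyName "HDDTier" -MediaType HDD

# Carve out a mirrored virtual disk spanning both tiers.
New-VirtualDisk -StoragePoolFriendlyName "TieredPool" `
    -FriendlyName "TieredDisk" `
    -StorageTiers $ssdTier, $hddTier `
    -StorageTierSizes 100GB, 900GB `
    -ResiliencySettingName Mirror
```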
R2 also supports deduplication for active virtual machines, which will enable customers to slash the cost of the storage needed to support virtual desktop infrastructure (VDI) deployments.
“This has been one of the blockers to VDI – when customers actually see the cost of the storage to implement it, it just doesn’t make business sense,” said Woolsey.
While the dedupe is processed in software, it does not significantly affect performance, he said, as “servers are never compute bound, as most of the time they are waiting around for I/O and storage.”
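Enabling the feature is a short exercise in PowerShell. A minimal sketch follows; the volume letter is purely illustrative, and the HyperV usage type that makes deduplication suitable for running VMs is new in R2:

```powershell
# Install the deduplication feature (part of File and Storage Services).
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable dedup on the volume holding the VDI virtual disks;
# the HyperV usage type is tuned for live VM workloads.
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Check space savings once the optimisation jobs have run.
Get-DedupStatus -Volume "E:" | Select-Object Volume, SavedSpace, SavingsRate
```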
Meanwhile, the multi-tenant gateway extends the network virtualisation features introduced in Windows Server 2012 to allow service providers to better support multiple customers in their cloud infrastructure.
“Customers want to be able to bring their network to that cloud, and to do that you need a gateway. Today, there are some hardware gateways, but you have to buy the right one, and so we just provide that in software under R2,” Woolsey said.
System Center is the control plane used to create and manage network virtualisation, while the data plane lives in Windows Server, he explained. The R2 release also extends Microsoft’s PowerShell automation framework, turning it into a “fundamental building block for operating the cloud,” according to Woolsey.
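Those data-plane pieces can also be driven directly from PowerShell in Windows Server, without System Center. A hedged sketch of the network virtualisation cmdlets involved; every address, subnet ID and the routing domain GUID below is a made-up placeholder:

```powershell
# Map a tenant VM's customer address (CA) to the provider address (PA)
# of the physical host carrying it; NVGRE encapsulation does the rest.
New-NetVirtualizationLookupRecord -CustomerAddress "10.0.0.5" `
    -ProviderAddress "192.168.1.10" -VirtualSubnetID 5001 `
    -MACAddress "00155D010203" -Rule "TranslationMethodEncap"

# Publish a route for the tenant's virtual subnet within its routing domain.
New-NetVirtualizationCustomerRoute `
    -RoutingDomainID "{11111111-2222-3333-4444-000000000000}" `
    -VirtualSubnetID 5001 -DestinationPrefix "10.0.0.0/24" `
    -NextHop "0.0.0.0" -Metric 255
```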
“If you are an IT pro, you have to have PowerShell on your resume today. You have to,” he said.
Windows Server 2012 R2 will be available as a preview release later this month and is set to ship commercially later this year.
Posted in Cloud Hosting
Posted on January 8, 2014 at 1:31 pm
Salesforce has agreed to acquire cloud marketing firm ExactTarget for $2.5bn, a 53 percent premium on the firm’s pre-deal trading price.
Salesforce intends to use the acquisition to bolster its digital marketing capabilities, as the cloud pioneer recognises the burgeoning IT spending clout that is being wielded outside the traditional IT department.
Marc Benioff, chief executive at Salesforce.com, said: “The [chief marketing officer] is expected to spend more on technology than the CIO by 2017. The addition of ExactTarget makes Salesforce the starting place for every company and puts Salesforce.com in the pole position to capture this opportunity.”
Salesforce has already invested heavily in its marketing offerings, paying $276m for social media monitoring firm Radian6 in 2011 and $689m for Buddy Media in 2012. But given those previous deals and the premium it was willing to pay for ExactTarget, Salesforce clearly felt it had further work to do in strengthening its arsenal.
ExactTarget’s 6,000-plus client list, which includes the likes of Coca-Cola, Nike and Gap, may have added to its allure, wrote Angela Eager, an analyst with TechMarketView, in a company blog post.
“Salesforce.com is putting a lot of faith as well as a lot of money into ExactTarget. It will substantially expand Salesforce’s Marketing Cloud, which has good coverage of social channels thanks to previous Buddy Media and Radian6 acquisitions,” she said. “Salesforce.com has momentum but as this deal indicates it is having to invest more extensively to maintain or increase the pace.”
The deal, which is expected to close by 31 July, comes hot on the heels of IBM splashing out $2bn on cloud firm SoftLayer Technologies.
Posted in Cloud Hosting
Posted on January 6, 2014 at 7:38 am
IBM has announced the purchase of public cloud computing infrastructure firm SoftLayer in a sizeable boost to its cloud offerings to enterprises.
Terms of the deal were not disclosed, but figures around the $2bn mark were floated by numerous sources online. IBM said the deal was designed to help it boost its mix of cloud computing offerings so it could meet the needs of all firms, especially those looking for public cloud infrastructure.
Ric Telford, IBM vice president for Cloud Services, told V3 that the deal would help make the firm a ‘one-stop shop’ for cloud services as the demand for public cloud services from enterprises increases.
“One cloud does not fit all and there is no one approach to the cloud, it’s dependent on workloads, or the applications you want to deploy as to whether you want a private, public or hosted environment,” he said.
“That’s what intrigued us about SoftLayer to round out our portfolio of cloud offerings. They have all three models but one management layer, so we can broaden our existing portfolio and meet the demands of customers.”
Telford said that in recent years IBM has seen a growing number of enterprise customers show a willingness to operate in public cloud environments.
“In the early years most deployments were private cloud, but now we’re seeing many firms are more comfortable dealing with public cloud offerings around software as a service and platforms,” he added.
The SoftLayer offerings will be incorporated into a new business unit within IBM’s Global Technology Services business, and will be offered alongside the firm’s existing SmartCloud portfolio so it can meet any customer’s cloud needs.
“Ultimately this gives us the breadth of flexibility. We know firms like other vendors’ offerings, but they don’t have the breadth of options that they can get with IBM,” added Telford.
“So you may want to run some applications in the private cloud, some in the public and have the ability to move them back and forth as you need and so now you can do this with our portfolio.”
Dallas-based SoftLayer has around 21,000 customers and owns 13 data centres in locations across the US, Asia and Europe, which will allow IBM to meet the needs of those working within the restrictions of data privacy laws.
“SoftLayer has a strong track record with born-on-the-cloud companies, and our move today with IBM will rapidly expand that footprint globally as well as allow us to go deep into the large enterprise market,” said Lance Crosby, chief executive of SoftLayer.
The use of public cloud computing services is growing rapidly, with providers such as Amazon Web Services hosting several notable companies within their infrastructure, such as digital streaming firm Netflix. However, the perils of public cloud with regard to outages have been shown on several occasions.
Posted in Cloud Hosting
Posted on January 4, 2014 at 3:51 pm
The outgoing head of the government’s G-Cloud service, Denise McDonagh, has confirmed her departure from the programme, which is moving under the auspices of the Government Digital Service (GDS).
McDonagh, who took charge of the G-Cloud programme last April, said the service would “forever change the way we commission and use IT in the public sector”.
“I can now hand over G-Cloud to GDS, safe in the knowledge that we have started such a groundswell of support and momentum for change that G-Cloud is here to stay,” she wrote on a G-Cloud blog.
“This has been the most enjoyable roller-coaster ride ever.”
G-Cloud has become a critical part of the government’s IT strategy, and is touted as the best way of procuring IT services cheaply, without getting tied into multi-year, multi-million pound contracts. It has also been heralded as the best way to open up public sector contracts to small and medium IT suppliers.
The service was the brainchild of Chris Chant, who spent more than three decades wrangling with the labyrinthine nature of Whitehall IT. Chant took to Twitter on Tuesday to sing the praises of the G-Cloud team.
Nonetheless, despite the upbeat tone to McDonagh’s announcement, there is plenty of work to be done before G-Cloud comes anywhere close to attaining its goal of changing public sector IT procurement. According to McDonagh, as of April 2013, a paltry £22m had been spent via G-Cloud, a drop in the ocean of government IT spend.
Earlier this week, Cabinet Office minister Francis Maude labelled the service “underused”.
McDonagh will revert to her role as IT director at the Home Office, a position she held in addition to overseeing G-Cloud.
Posted in Cloud Hosting
Posted on January 2, 2014 at 7:45 pm
Spend on the government’s under-pressure G-Cloud service has now hit £22m after several sizeable deals were signed off in April, including a spend of £1.3m with IBM by the Home Office.
The figure marks an increase of around £4m from the £18m that had been spent on the service by March and comes amid pressure on the platform to demonstrate more value for money, having failed to really ignite interest in the public sector since its launch a year ago.
This was driven by deals such as those won by IBM, as well as notable others, such as a deal worth £68,000 signed off by the Cabinet Office with Steria and a deal worth £205,000 signed off by the Ministry of Justice with systems integrator i2N.
V3 contacted the Home Office for details of the services it purchased from IBM but had received no reply at the time of publication. IBM had also not responded.
While the increase proves the public sector is still interested in the platform, the figure is tiny compared with overall government IT spend and won’t do much to relieve the pressure on the service.
On Monday Cabinet Office minister Francis Maude admitted the service was “underused”, and the head of the service, Denise McDonagh, announced she was stepping down from managing G-Cloud as it moves to the Government Digital Service.
Posted in Cloud Hosting