Monthly Archives: June 2013
Posted on June 10, 2013 at 11:13 am
A survey from the Linux Foundation has found that 80 percent of enterprises plan to increase their Linux server deployments over the next five years.
According to the report, cloud adoption in business circles is a major reason for the planned increases. The Linux Foundation says that increased deployment of Linux is starting to put a dent in Microsoft server sales.
Those surveyed reported that they were more likely to deploy Linux servers than Microsoft servers over the next five years.
The Linux Foundation’s report found that only 20 percent of those surveyed plan to increase the number of their Windows servers over the next five years.
That figure is in comparison to the 80 percent of respondents who plan on increasing their Linux server deployments over the same time frame.
A key reason for the lack of Windows deployments may be the recent release of Windows 8. According to the study, 39 percent of survey respondents say they are moving away from Microsoft because of the latest version of Windows.
Cloud computing was reported to be another key reason behind Linux adoption. The report found that 76 percent of cloud-enabled companies surveyed were using Linux servers to run their cloud.
“Cloud computing is a natural fit for Linux, as it depends on openness, and is an area where Windows is struggling,” the group wrote in its report.
The Linux Foundation found that the three main causes for the increase in Linux use were technical superiority, lower total cost of ownership, and security.
Reasons for enterprises’ hesitation to join the Linux community included a lack of features, driver availability, interoperability concerns, and the dearth of qualified workers trained in Linux.
The Linux Foundation reports that survey participants’ inability to find Linux-trained staff has increased by 11 percent year-over-year.
The study results come from survey responses from 1,279 businesses. About 355 of the respondents make over $500m in annual revenue or employ over 500 people.
Posted in Cloud Hosting
Posted on June 8, 2013 at 7:12 am
A judge has dismissed a patent claim against Rackspace involving the patenting of mathematical algorithms.
Rackspace and its partner Red Hat got the case thrown out on the grounds that mathematical algorithms could not be patented. The case reportedly marks a major victory for the open-source community.
“This is a major victory for open source software,” said Michael Cunningham, Red Hat’s executive vice president and general counsel, in a statement.
“We are gratified to have beaten another patent for open source and for our customer. We also believe that the thoughtful dismissal by Chief Judge [Leonard] Davis [will] encourage earlier decisions by other courts on invalid software patents, reducing vexatious litigation by non-producing entities and their corrosive effect on innovation.”
The case centred on a specific algorithm found in Red Hat’s flavor of Linux. Red Hat assisted Rackspace’s legal defence because Rackspace uses its version of Linux in some of its products.
US firm Uniloc accused Rackspace of illegally using its patent, #5,892,697, in its products. Uniloc was hoping to receive damages from Rackspace for the alleged infringement.
However, Chief Judge Davis ruled that Uniloc’s patent claim was invalid. The judge based his decision on a Supreme Court ruling which finds that mathematical algorithms are not patentable.
The court’s ruling comes in the eastern district of Texas. That region has been well-known for often ruling in favor of firms like Uniloc which hold onto patents but don’t openly make anything with them.
According to Rackspace and Red Hat, the case marks the first time that the eastern district of Texas has granted an early dismissal to a patent suit on the grounds that the material in question was un-patentable.
The idea that the case could be a turning point in the battle against patent trolls is also shared by Mark Webbink, executive director of the Center for Patent Innovations at New York Law School. Webbink wrote in a blog post at Groklaw that the importance of the case could not be emphasized enough.
“The importance of this case cannot be underscored,” Webbink wrote.
“It demonstrates that a court that has been favored by patent plaintiffs for years recognizes that there are some really bad patents out there, and the court is not going to hesitate to throw them out at the first opportunity.”
The court’s ruling could bode well for future cases involving patent trolls. Patent-related cases involving non-producing entities have continued to hurt open-source innovation in a variety of fields.
Earlier this year, the Electronic Frontier Foundation (EFF) took the fight to patent trolls in the 3D printing industry. The advocacy group called on the public to help uncover patents that failed to meet proper legal standards.
Posted in Cloud Hosting
Posted on June 6, 2013 at 10:54 am
Researchers have uncovered thousands of misconfigured Amazon S3 storage buckets, making it possible to obtain access to potentially highly sensitive company data.
In the tests, penetration testing firm Rapid7 was able to access personal photos from a social media service, a car dealership’s sales records and account information, firms’ employee records, as well as unprotected database backups containing site data and encrypted passwords.
According to Will Vandevanter, senior security consultant at Rapid7, the firm was able to access the information after identifying more than a thousand publicly accessible Amazon S3 storage buckets.
Firms typically use Amazon’s S3 system to store static content such as server backups, company documents, web logs, and publicly visible content such as website images. The files in S3 are then organised into so-called buckets.
“Although a file might be listed in a bucket it does not necessarily mean that it can be downloaded. Buckets and objects have their own access control lists,” Vandevanter wrote on a company blog.
Firms that had stored their data insecurely in S3 could be set for a rude awakening, he warned.
“Much of the data could be used to stage a network attack, compromise users accounts, or to sell on the black market.”
Typically, Amazon makes S3 buckets private by default, so the public ones are likely the result of user misconfiguration, said Vandevanter.
Nonetheless, Amazon Web Services makes it easier for would-be hackers by using a URL structure that is easy to guess, making it child’s play to access public buckets.
“Checking if a bucket is public or private is easy. All buckets have a predictable and publicly accessible URL,” he added.
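The check Vandevanter describes can be sketched as follows. This is a minimal illustration, not Rapid7's actual tooling: the bucket name is a hypothetical placeholder, and the URL pattern and status-code interpretation reflect S3's commonly documented behaviour (200 for a listable public bucket, 403 for an existing but private one, 404 for no bucket at all).

```python
import urllib.request
import urllib.error

def bucket_url(name):
    """S3 bucket names map to a predictable, publicly reachable URL."""
    return f"https://{name}.s3.amazonaws.com/"

def classify(status):
    """Interpret the HTTP status returned when requesting a bucket listing."""
    if status == 200:
        return "public"       # listing is readable by anyone
    if status == 403:
        return "private"      # bucket exists but access is denied
    if status == 404:
        return "nonexistent"  # no bucket by that name
    return "unknown"

def probe(name):
    """Request the bucket URL and classify the response."""
    try:
        with urllib.request.urlopen(bucket_url(name), timeout=5) as resp:
            return classify(resp.status)
    except urllib.error.HTTPError as err:
        return classify(err.code)

# probe("example-bucket")  # hypothetical bucket name
```

Because the URL is derived purely from the bucket name, an attacker can enumerate candidate names (company names, product names, common words) and probe each one, which is what makes guessing "child's play".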
Having identified hundreds of public buckets, Vandevanter and his colleagues took a random sample to check which buckets had accessible content.
They discovered more than five million accessible text documents, many of which were marked ‘private’ or ‘confidential’.
AWS told V3 there were many legitimate reasons users might choose to leave buckets open, but in cases where customers were unsure, it did its utmost to work with them to secure their data.
“AWS Support staff regularly reach out to customers who may have potential configuration issues with AWS, to assist those customers with achieving better efficiency, reduced costs, or in some cases, to remedy their security configuration and posture, for S3 and other services,” a company spokesman told V3.
Posted in Cloud Hosting
Posted on June 4, 2013 at 12:53 pm
Amazon has unveiled a security platform aimed at improving protections for its AWS cloud computing platform.
The company said that the CloudHSM platform would allow users to purchase use of dedicated security modules which can encrypt AWS instances to prevent unauthorised access.
Under the CloudHSM plan, customers can rent hardware modules that generate and store encryption keys. The keys are then used to encrypt and decrypt the data stored in AWS instances, and are available only to the customer, providing additional protection.
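The key property described above is that the key material never leaves the hardware module: callers can only ask the module to encrypt or decrypt. The toy class below illustrates that interface shape only. It is not CloudHSM's API, and its hash-based stream cipher is a stand-in for real HSM-grade cryptography, used here solely so the example runs with the standard library.

```python
import secrets
import hashlib

class ToyHSM:
    """Illustrative stand-in for a hardware security module: the key is
    generated inside the module and never exposed; callers only get
    encrypt() and decrypt() operations."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # stays inside the module

    def _keystream(self, nonce, length):
        # Derive a keystream from the internal key and a per-message nonce.
        out = b""
        counter = 0
        while len(out) < length:
            block = self._key + nonce + counter.to_bytes(8, "big")
            out += hashlib.sha256(block).digest()
            counter += 1
        return out[:length]

    def encrypt(self, plaintext):
        nonce = secrets.token_bytes(16)
        ks = self._keystream(nonce, len(plaintext))
        return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

    def decrypt(self, blob):
        nonce, ciphertext = blob[:16], blob[16:]
        ks = self._keystream(nonce, len(ciphertext))
        return bytes(a ^ b for a, b in zip(ciphertext, ks))

hsm = ToyHSM()
blob = hsm.encrypt(b"customer record")
assert hsm.decrypt(blob) == b"customer record"
```

The design point is the boundary: application code holds a handle to the module, not the key, so a compromise of the application does not directly yield the key material.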
Amazon said that while it works to secure every AWS instance, some customers have sought additional protections. To provide more security, and to comply with certain regulations regarding data storage and security, the company said it needed to offer the heightened protection of a service such as CloudHSM.
“Until now, organizations’ only options were to maintain data in on-premises datacentres or deploy local HSMs to protect encrypted data in the cloud,” Amazon said in announcing its new service.
“Unfortunately, those options either prevented customers from migrating their most sensitive data to the cloud or significantly slowed application performance.”
Security worries have long plagued the adoption of cloud computing services. While vendors themselves have sought to dismiss such concerns, many organisations have listed fears of data breach and unauthorised access among their chief concerns for cloud migration.
Posted in Cloud Hosting
Posted on June 2, 2013 at 6:45 pm
Oracle continues to struggle to sell servers. Earlier this month the firm reported that third-quarter revenue for its servers was down 23 percent year-over-year. Overall, the company trailed its competitors with only a four percent market share.
The Larry Ellison-led company has failed to find the secret sauce necessary to interest firms in its line of hardware. Oracle’s inability to compete with rivals has hurt the company’s bottom line over the past few years.
While the company recently launched a new line of Sparc T5 and M5 servers, it remains to be seen whether that will be enough. Even if the servers turn out to be excellent, Oracle’s real problem is its strategy of designing entire systems around Oracle-only gear.
The firm runs on the idea that by doing everything itself it can create the best systems, building out platforms that require all-Oracle software and hardware.
Unfortunately for Oracle, the hardware industry no longer works without some form of cooperation. Take HP, for example: its Pathfinder program sees the firm working with other tech firms to create ARM servers for its Project Moonshot program.
The program sees HP co-developing servers with hardware and software vendors. HP’s approach is different from Oracle’s in that it aims to build servers that are not so closed and proprietary.
Oracle still clings to an old way of thinking. The firm believes that a business can get away with offering a proprietary system, but in today’s infrastructure market that is no longer true. Users want choice and the ability to avoid being tied to a single option.
HP will need to focus on open platforms if it wants to turn things around, and under Meg Whitman it looks like it is heading that way. Project Moonshot is a good example of the new paradigm HP is creating. Over time that paradigm shift could mean big things for the firm’s ability to take some of Oracle’s business.
Things at both HP and Oracle are quite bumpy at the moment, but one firm is making the smart move (at least when it comes to servers). HP sees a future more in line with what smaller firms like Salesforce are doing. Oracle, however, is struggling to adapt.
The idea that Oracle doesn’t “get it” isn’t new. Salesforce chief executive and Larry Ellison’s mortal enemy Marc Benioff said something similar back in 2011. But Ellison and Oracle still don’t get it.
HP is adapting, even IBM is adapting, but Oracle just doesn’t get it.
Posted in Cloud Hosting