Yearly Archives: 2014

How Often Should You Backup Your Website?

Posted on November 28, 2014 at 12:17 pm

As any webmaster should know, backing up your website’s data is the only sure way to protect against a catastrophe. Knowing this still leaves you with plenty of questions, such as: “How often should I back up my website?” and “How do I back up my website?”

There is no single answer to this question, because the demands of each website are not equal. If a website is only updated sporadically, backing it up after every update could be the most efficient solution. If, however, your website holds a massive database with hundreds or thousands of users and products, with new information added on a near-constant basis, running backups almost continuously could be the best solution, helping you to avoid liability and loss of revenue.

Depending on the hosting package you’ve acquired, this may already be happening; if it isn’t, arranging automated backups through your hosting company is a wise choice.
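If your host doesn’t provide automated backups, a simple scheduled script can do the job. Here is a minimal sketch in Python, assuming hypothetical `site/` and `backups/` directories (adjust the paths and retention count to your own setup):

```python
import tarfile
import time
from pathlib import Path

# Hypothetical paths -- adjust to your own server layout.
SITE_DIR = Path("site")        # the files to back up
BACKUP_DIR = Path("backups")   # where archives are kept
KEEP = 7                       # number of archives to retain

def backup_site() -> Path:
    """Create a timestamped .tar.gz of SITE_DIR and prune old archives."""
    BACKUP_DIR.mkdir(exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = BACKUP_DIR / f"site-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SITE_DIR, arcname=SITE_DIR.name)
    # Keep only the most recent KEEP archives.
    for path in sorted(BACKUP_DIR.glob("site-*.tar.gz"))[:-KEEP]:
        path.unlink()
    return archive
```

Run from a scheduler (cron, for example) at whatever frequency matches how often your site changes, per the advice above.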

Posted in Web Hosting Solutions

How Much Bandwidth do you need?

Posted on August 28, 2014 at 6:43 pm

When you’re buying hosting for the first time you’ll see that there are plenty of different packages with different bandwidth allowances, but how much do you need? It all depends on the type of website you have, as well as the number of visitors.

Once you’re able to estimate the average page size of your website, multiply it by the average number of monthly visitors. That gives you a fair idea of the bandwidth you need, though the more large files you have available for people to download, the more bandwidth you’ll require.
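The estimate above can be worked through with some illustrative figures (all of the numbers below are assumptions for the sake of example, not measurements):

```python
# Rough monthly bandwidth estimate -- illustrative figures only.
avg_page_size_mb = 2.5    # average size of one page view, in MB
monthly_visitors = 10_000
pages_per_visit = 4
download_mb = 5_000       # extra traffic from large file downloads

base_mb = avg_page_size_mb * monthly_visitors * pages_per_visit
total_gb = (base_mb + download_mb) / 1024
print(f"Estimated usage: {total_gb:.1f} GB/month")

# Leave headroom for growth, e.g. 50% extra:
recommended_gb = total_gb * 1.5
print(f"Recommended allowance: {recommended_gb:.1f} GB/month")
```

With these figures the site uses roughly 100 GB a month, so a plan in the 150 GB range would leave the kind of growing room discussed below.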

In simple terms, it’s always better to have some room to grow, because you never know when your traffic may start to increase significantly. If a friend or peer has a site similar to the one you want, see how much bandwidth they’re using, because when you’re choosing your first hosting contract you have to get it right. We recommend you choose a bandwidth allowance for a couple of months and see how it goes; you can then make an informed decision when committing to a longer contract.

Posted in Web Hosting

Improve online visibility in Northampton

Posted on August 20, 2014 at 7:54 am

Why do top web development companies always focus so closely on improving the usability of a website? Usability can make or break the reputation of a site. Your website should encourage visitors to come back, and if people can’t use a site properly they will quickly go elsewhere. A top quality web development company in Northampton concentrates on building a site that is easily accessible and genuinely useful for visitors. Here are the most significant factors that affect a website’s usability:

Content

When visitors open a highly usable website, they can easily find around 90% of the content they are looking for, whereas an average site reveals only 50% to 60% of it. Substandard sites are typically cluttered with unwanted ads, irrelevant graphics and complicated navigation. Content plays the most prominent role in web usability: without quality content, your web pages will look meaningless. A reliable web development company in Northampton will give content adequate importance during the design process.

Page layout

If you want to display your content in an attractive fashion, your page layout should be really good. The layout determines how content is displayed on the page, and a quality layout keeps visitors on the site. If visitors cannot find content easily and quickly, they may leave, which is exactly why keeping your pages simple and clean matters. Top developers will show you how to position your content properly, often using well-structured grid layouts. Place primary content in the most important column, and use smaller columns for elements such as advertising, navigation and non-essential graphics.

Colours

Colours have a strong influence on web usability, and browsers have long-standing conventions for link colours: blue for unvisited links, violet for visited links and red for active links. Using too many colours can confuse visitors, so every element should be given an appropriate colour. A competent web development company in Northampton selects colours with proper care and planning, and will understand how to build a good colour palette.

HTML

If you do not know how to write valid HTML, you will struggle to keep your site highly usable for visitors. Top developers have deep knowledge of the different aspects of HTML and can show you how to apply the latest version of the language to improve usability.

Conclusion

Better application of CSS properties will certainly improve the usability of your website, and load time also plays a key role in keeping a site user-friendly. The bottom line is that if you want to improve usability properly, you should hire a reliable and reputable web development company in Northampton.


Posted in Development

VMware adds storage and network virtualisation in push towards the software defined data centre

Posted on June 17, 2014 at 8:39 pm

VMware is advancing its vision to build the software defined data centre with several key enhancements to its cloud and virtualisation stack, including an update of vSphere, the release of its NSX network virtualisation technology and a beta of VMware Virtual SAN, a platform for virtualising storage in the datacentre.

Announced at VMware’s VMworld event in San Francisco, the updates are intended to help customers move a step closer towards the goal of an entirely automated and virtualised data centre capable of delivering services on demand and adapting dynamically to changing requirements.

While VMware has built its reputation on its vSphere platform for virtualising server compute resources, the firm is now seeking to embrace networking and storage. This is necessary in order to bring the entire data centre under automated control, according to VMware’s senior product marketing manager in EMEA, Rory Choudhuri.

“The whole of your IT needs to become virtualised. In order to achieve the response times required for business-critical applications and services, it has to be completely automated. You have to take the human out of the loop,” he said.

To this end, the firm is releasing VMware NSX, which delivers the entire Layer 2 to Layer 7 networking and security model in software. The platform unites VMware’s vCloud Network and Security (vCNS) with the network virtualisation technology it gained via the acquisition of Nicira last year.

NSX, which was first detailed by VMware earlier this year, treats the physical network as a pool of transport capacity that can be carved up by creating virtual networks as required, in a similar way that vSphere pools and allocates server resources.

However, VMware said it has learned lessons since the introduction of its hypervisor, and has been working closely with the networking industry to ensure that vendors are aware of what it is doing and are on board. Dell is preparing to ship a new line of switches compatible with VMware NSX, for example.

Meanwhile, VMware Virtual SAN (VSAN) is being made available as a public beta later this quarter. It is designed to take existing storage resources in the datacentre, such as SAN and NAS arrays and direct attached storage (DAS), pool this together, then present it back to the system.

VSAN implements a policy-driven control plane that automates storage consumption and management, and is being touted as especially useful for customers implementing virtual desktops (VDI) or Hadoop deployments.

Choudhuri said that VSAN has been operating successfully in private beta deployments for about six months and is “basically ready to go”, but that VMware is taking the cautious approach of pushing it out for public beta tests before a full commercial release.

VMware also announced vSphere 5.5, which adds performance and scalability enhancements, plus support for operating Hadoop deployments. The Hadoop support comes in vSphere Big Data Extensions, while vSphere 5.5 also enables configurations with twice the previous limits on physical CPU and memory.

Also extended in vSphere 5.5 is VMware’s high availability (HA) support. With vSphere App HA, this can detect failure of apps as well as virtual machines, and can recover from this automatically.

VMware also announced that its vCloud Automation Center and vCenter Operations Management Suite capabilities have now been added to all editions of its vCloud Suite 5.5. Previously, only the Enterprise and Advanced editions had Operations Management Suite, while vCloud Automation Center was only in the Enterprise edition.

Posted in Cloud Hosting

Gigamon launches network traffic visibility as a service tool

Posted on June 15, 2014 at 8:15 am

Traffic management firm Gigamon has unveiled an update for its GigaVue service that offers IT teams the ability to gain insights on the different traffic demands from different areas of the business.

The GigaVue 3.1 update will include a Visibility as a Service (VaaS) add-on within the Flow Mapping process to enable administrators to supply data on the traffic within departments.

This could be used by teams such as marketing to analyse visitor traffic, or security teams looking at event monitoring after an incident.

Gigamon chief strategy officer Shehzad Merchant said that providing this kind of system will help enterprises benefit from cloud-style multi-tenancy internally and gain greater insights into their data.

“The notion of multi-tenancy has made its way from the public cloud space into enterprise IT infrastructure as well,” he said.

“This solution enables network administrators and services teams to virtualise the Visibility Fabric and offer Visibility as a Service to the different IT departments.”

The firm said that this capability will enable organisations to alter management policies on a per-team and per-department basis as needs require, while maintaining the compliance and privacy controls they have in place across the enterprise.

These tenants, including various IT operations teams, will have the power to dynamically change monitoring and traffic visibility policies on a per-organisation or per-tenant basis without impacting other departments’ monitoring policies, while maintaining compliance and privacy.

The GigaVue 3.1 update also includes a host of other additions, such as role-based access control capabilities and improved workflow displays for monitoring policy configurations. The update launches on 30 September at no additional cost for existing customers of the GigaVue tool.

Posted in Cloud Hosting

Capgemini launches big data Elastic Analytics tool on Amazon Web Services

Posted on June 13, 2014 at 7:16 pm

Technology and outsourcing services firm Capgemini has unveiled a new service to help firms better manage ever-growing amounts of data, built on the Amazon Web Services (AWS) cloud computing platform.

Capgemini claims the Elastic Analytics service will offer customers an end-to-end big data analytics solution and will support most leading business intelligence (BI) software packages.

The Elastic Analytics service works by combining large source sets of structured and unstructured data, using existing extract, transform, load (ETL) technologies and AWS’s Hadoop-based solution, Amazon Elastic MapReduce (EMR). Once collected, the data is extracted and merged into analytics engines that businesses can use to study it.
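As a generic illustration of the extract-transform-load pattern such services build on (this is not Capgemini’s actual pipeline, just a toy sketch with made-up data), the three stages look like this:

```python
import csv
import io
import json

# Toy ETL: extract rows from a CSV source, transform them into
# derived records, and load the result into a JSON "analytics store".
RAW = """product,units,price
widget,3,2.50
gadget,5,4.00
"""

def extract(source: str) -> list[dict]:
    # Extract: parse raw structured input into rows.
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: derive the fields the analytics layer needs.
    return [
        {"product": r["product"], "revenue": int(r["units"]) * float(r["price"])}
        for r in rows
    ]

def load(records: list[dict]) -> str:
    # Load: serialise into the target store's format.
    return json.dumps(records)

store = load(transform(extract(RAW)))
```

In a real deployment the extract and load stages would talk to object storage and an EMR cluster rather than in-memory strings, but the staged shape is the same.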

Capgemini senior vice president for business information management (BIM) Scott Schlesinger said that AWS’ adaptable nature makes Elastic Analytics one of the most flexible and cost effective big data solutions available.

“Organisations are continuously looking for optimized solutions that deliver shorter ‘time-to-value’ advanced analytics. AWS is a highly adaptable and extensible platform that rapidly offers organizations the ability to launch and sustain their advanced analytics initiatives,” he said.

Big data management is a growing problem facing businesses around the world, with many holding vast reserves of unstructured and often unprotected data.

Numerous companies have listed solving the big data problem as a key opportunity, with firms like SAP offering similar analytics tools through its HANA platform. The German software firm launched its HANA analytics database platform as an enterprise cloud service in May.

HP has also taken interest in the market, with chief executive Meg Whitman claiming traditional IT solutions are no longer sufficiently powerful for enterprise-level businesses, warning during a speech at HP Discover earlier this year that companies will need to move their systems into the cloud if they hope to compete in the new landscape.

Posted in Cloud Hosting

Box doubles free storage offering to 10GB, targets SMBs with £3.50 price plan

Posted on June 11, 2014 at 6:58 pm

Cloud storage firm Box has announced a pricing strategy overhaul in order to entice previously elusive SMB customers to its services.

In addition to doubling the amount of storage space available to its free customers, the company has also created a new Starter price plan specifically designed for small businesses. Costing £3.50 per user per month, Starter gives teams of up to 10 members 100GB of space with a maximum individual file size of 2GB.

Box chief operating officer Dan Levin told V3 that this new pricing strategy was a response to the realisation that the firm’s £11 per user per month Business plan was not suitable for smaller enterprises. He said that often smaller businesses instead found themselves using the free version of Box to run their businesses, which meant they could not administer additional users or properly manage their data.

“That was not the right strategy to get those SMBs,” Levin explained. “These technologies, which are used by some of the largest companies in the world, are now accessible to SMBs in a way they haven’t previously been.”

The doubling of the storage space provided to non-paying users was an important addition, according to Levin. “In order to continue to add value to our personal users, we’re doubling the amount of storage from five to 10GB. We’re making sure that businesses that use the personal product have the right service; it’s a very important part of our business model,” he said.

As well as the new pricing plans, Box has also enhanced its Business option, adding integration for one enterprise application such as Salesforce or Active Directory. Previously this integration was only available on the more expensive Enterprise price plan, which now costs £25 per user per month and still provides unlimited application integration.

Posted in Cloud Hosting

Rackspace tempts VMware users with stepping stone to the cloud

Posted on June 9, 2014 at 6:21 pm

Cloud and hosting firm Rackspace has announced a new service aimed at getting VMware customers into using hosted infrastructure as a stepping stone to future cloud adoption.

The Dedicated VMware vCenter Server is part of Rackspace’s Managed Virtualisation service. Unlike a public cloud, it will provide managed single-tenant vCenter Servers running inside Rackspace data centres, designed to give enterprise customers the confidence to migrate existing VMware workloads outside of their own premises.

According to Rackspace, the Dedicated VMware vCenter Server environment will look and feel like an extension of the customer’s own data centre. It will enable customers to fully manage servers via the VMware vCenter APIs or equivalent third-party management tools, while providing visibility into costs and usage, with Rackspace providing support for the physical infrastructure.

Under the current Managed Virtualisation service, Rackspace exposes some features of vCenter through its MyRackspace customer portal, but not the full set of management capabilities.

Rackspace chief technology officer John Engates said: “This new service has been designed to enable customers to migrate workloads out of their data centre and into a Rackspace data centre. This allows Rackspace to do what we do best, which is providing a fully managed hybrid cloud hosting service backed by Fanatical Support with maximum uptime.”

The firm has been offering VMware-based hosted private cloud infrastructure since at least the start of 2011, but has since begun to shift its emphasis towards services based on the OpenStack cloud computing framework that it co-founded.

Rackspace has also been strongly touting its vision of hybrid cloud computing, where private on-premise cloud infrastructure is supplemented by public cloud resources as required.

Today’s announcement can thus be seen as Rackspace attempting to drum up more hybrid cloud business by offering VMware users a stepping stone towards it. By offering dedicated servers linked to a customer’s on-premise infrastructure, the firm seems to be banking on users seeing the attraction of having someone else take care of managing the infrastructure.

At launch, licensing for the Dedicated VMware vCenter Server will be a flat monthly fee per hypervisor, regardless of the number of VMs managed, according to Rackspace. However, the exact price has yet to be disclosed.

Posted in Cloud Hosting

HP combines Autonomy WorkSite with HP Flow CM to offer file sharing with governance

Posted on June 7, 2014 at 8:26 pm

HP has unveiled a cloud service that enables organisations to securely share and collaborate on files with colleagues, customers and business partners, while maintaining visibility and governance over content.

Called Autonomy LinkSite, it effectively integrates HP’s on-premise Autonomy WorkSite document and email management system with HP Flow CM, a cloud-based file-sharing and collaboration service. The result, according to HP, is an enterprise-grade document and email management system with the ease of use and simplicity of a consumer solution.

HP said the tool is due for early release to testers from mid-September, and is scheduled for full commercial release on or near 15 October.

HP LinkSite is the latest in a series of product launches aimed at tackling the problem of sharing content easily, while allowing an organisation oversight and control over the information contained within.

The problem, according to HP, is that workers expect to be able to share content freely as they do with consumer-grade services such as Dropbox, and are liable to resort to these if their organisation does not provide a satisfactory alternative solution.

Autonomy LinkSite addresses this by extending the traditional workspace from Autonomy WorkSite into the cloud, enabling users to share a single file or an entire project folder with others both inside and outside the firewall.

However, content uploaded to the cloud this way inherits all security properties set in Autonomy WorkSite, according to HP. All actions taken on content in the cloud are also reported via the Autonomy WorkSite audit trail, extending enterprise security and governance to the cloud.

Files shared via Autonomy LinkSite are synchronised across all employee devices, and can be accessed through any web browser, HP said.

Neil Araujo, general manager of Enterprise Content Management at HP Autonomy, said that organisations no longer have to turn a blind eye to workers using consumer file sharing services.

“Businesses now have a very attractive alternative that satisfies the needs of the users as well as the IT and compliance teams,” he said.

Pricing for HP LinkSite is expected to start at $19.95 (around £13) per user per month, depending on the length of contract, HP said. For larger enterprise organisations with 1,000 users or more, licensing is likely to be as low as $9.95 (around £6.50) per user per month, also dependent on length of contract and features selected.

Posted in Cloud Hosting

Google offers to encrypt all customer data stored in its Cloud Storage platform

Posted on June 5, 2014 at 1:48 pm

Google is beginning to encrypt all data uploaded to its Cloud Storage platform, in a bid to bolster its security credentials.

Google’s Cloud Storage is a data storage product for businesses, intended for static content such as web pages and other permanent files. Previously, users would have to create their own encryption keys and manage them personally, but with this update Google will do the legwork for its customers, handling the encryption keys and the encryption process.

Dave Barth, Cloud Storage product manager, detailed this change further on the Cloud Platform blog. He said: “If you require encryption for your data, this functionality frees you from the hassle and risk of managing your own encryption and decryption keys.

“We manage the cryptographic keys on your behalf using the same hardened key management systems that Google uses for our own encrypted data, including strict key access controls and auditing. Each Cloud Storage object’s data and metadata is encrypted under the 128-bit Advanced Encryption Standard (AES-128), and each encryption key is itself encrypted with a regularly rotated set of master keys.”
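The scheme Barth describes, in which each object gets its own key and that key is itself encrypted under a rotatable master key, is commonly known as envelope encryption. A toy Python sketch of the key hierarchy follows; a XOR keystream stands in for AES-128 purely to keep the example self-contained, so this illustrates the pattern only and is emphatically not Google’s implementation or a real cipher:

```python
import hashlib
import os

def toy_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-derived keystream.
    # Stands in for AES purely to show the key hierarchy; XOR makes
    # encryption and decryption the same operation. Never use this
    # in place of a real cipher.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The master key lives in the key management system and can be
# rotated independently of the stored objects.
master_key = os.urandom(16)

def store_object(data: bytes) -> tuple[bytes, bytes]:
    data_key = os.urandom(16)                    # fresh per-object key
    ciphertext = toy_cipher(data_key, data)
    wrapped_key = toy_cipher(master_key, data_key)
    return wrapped_key, ciphertext               # only the wrapped key is stored

def read_object(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    data_key = toy_cipher(master_key, wrapped_key)  # unwrap, then decrypt
    return toy_cipher(data_key, ciphertext)
```

The point of the hierarchy is that rotating the master key only requires re-wrapping the small per-object keys, not re-encrypting every stored object.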

Barth added that if users wish to provide their own encryption, they are still free to do so. Currently, only new data written to the Cloud Platform will be encrypted by Google, which includes existing files that are overwritten. Barth said that older files left untouched will gradually undergo the encryption process in the coming months, while he also maintained that the service itself would not change in any visible way, either in terms of performance or functionality.

Encryption features were already available in other Google products including Persistent Disks and Scratch Disks, which are part of the Google Compute Engine cloud service. The encryption does not yet extend to Google’s consumer-facing cloud product, Google Drive.

Last week Google suffered a partial outage in various heavily used services including Search, Gmail, Drive and Talk. As a result, a 40 percent decline in overall web traffic was reported, highlighting the power that the search giant has over the world’s internet consumption.

Posted in Cloud Hosting
