
ElasticHosts Reduces Cost of Hosting by 50% with the Launch of Next-Generation Cloud Servers

Auto-scaling "Elastic Containers" expand and contract as demand requires, enabling typical cloud users to cut their bill in half by paying only for the capacity they actually use

London, 9th April 2014: Cloud server provider ElasticHosts today announced the launch of its new Elastic Containers: the first cloud servers to be billed purely on consumption rather than capacity, delivering substantial cost savings. ElasticHosts' breakthrough auto-scaling container technology elastically expands and contracts to meet customer demand, entirely eliminating the need for manual provisioning. Elastic Containers are available to all Linux users from ElasticHosts' global datacentres, backed by Solid State Drive (SSD) storage for added performance. Elastic Containers require no additional software or server configuration; customers simply sign up for the service and capacity is continuously available.

This revolutionary technology promises to disrupt the cloud market by offering:

  • The industry's first and only usage-based billing system: Traditional capacity-based IaaS billing is focused on provisioning capacity in blocks; yet concerns over performance and availability force companies to pay for a buffer of excess space that they are not actually using. By billing for actual usage, rather than available capacity, customers can run their servers with space continuously available for immediate scaling at no additional cost; they only pay for what they actually use, right down to the MB.

  • Auto-scaling for continuous high performance availability: The cost of a website or application failure can run into the millions of pounds; therefore servers need to be running with sufficient capacity at all times. By using Elastic Containers, companies can now handle all their usage peaks and troughs effortlessly, automatically scaling each container up to 64GB RAM. This provides peace of mind that capacity will be instantly available when needed, ensuring continuity of service without downtime.

  • Self-managing infrastructure: Currently, aside from deploying complex software to automate the process, companies are forced to provision and adjust capacity manually, which can be costly, time consuming and inaccurate. ElasticHosts' auto-scaling, elastic infrastructure expands and contracts automatically, completely removing the need for manual management or provisioning; companies can simply turn on the service and forget about it.
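
The auto-scaling behaviour described above can be sketched in a few lines. The function below is a hypothetical illustration, not ElasticHosts' actual mechanism: allocation tracks demand with a small headroom buffer and is capped at the 64GB maximum the press release cites. The headroom policy and names are assumptions for illustration.

```python
# Minimal sketch of elastic auto-scaling: allocated memory follows demand,
# capped at a maximum (64 GB, per the press release). The headroom policy
# is a hypothetical illustration, not ElasticHosts' actual mechanism.

MAX_GB = 64

def scale(demand_gb, headroom=1.25, max_gb=MAX_GB):
    """Allocate demand plus a small headroom buffer, never exceeding the cap."""
    return min(max_gb, demand_gb * headroom)

# Demand rises through a traffic spike and falls back; allocation tracks it
# automatically, with no manual provisioning step in between.
for demand in [2, 8, 40, 60, 12, 2]:
    print(f"demand {demand:>2} GB -> allocated {scale(demand):.1f} GB")
```

Because the container itself absorbs peaks and troughs, there is no block-based step of adding or removing whole servers.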

Richard Davies, CEO and Co-Founder of ElasticHosts, comments: "We've analysed hundreds of servers from some of our largest customers and noticed two major differences: firstly, a server running a typical workload will see a 50% cost saving versus other major IaaS clouds, since typically less than 50% of total capacity is used through a full weekly cycle. Secondly, a server which frequently runs below its peak capacity, either due to idle periods or because it only occasionally needs to handle a large load, can save 75% or more. To help customers take full advantage of these savings, we are billing in 15-minute intervals based on usage, as opposed to the common industry practice of hourly billing based on available capacity."
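
The arithmetic behind these figures can be illustrated with a simple comparison of the two billing models. The sketch below uses hypothetical rates and a hypothetical usage profile, not ElasticHosts' actual pricing: a server provisioned at 8 GB that runs at peak for a quarter of the week and idles at 2 GB otherwise.

```python
# Hypothetical comparison of capacity-based vs usage-based billing.
# Rates and the usage profile are illustrative, not real ElasticHosts pricing.

INTERVALS_PER_WEEK = 7 * 24 * 4           # 15-minute intervals in a week
RATE_PER_GB_INTERVAL = 0.001              # assumed price per GB per interval

def capacity_bill(provisioned_gb, intervals=INTERVALS_PER_WEEK):
    """Traditional model: pay for the full provisioned capacity every interval."""
    return provisioned_gb * RATE_PER_GB_INTERVAL * intervals

def usage_bill(usage_gb_per_interval):
    """Usage model: pay only for memory actually used in each interval."""
    return sum(gb * RATE_PER_GB_INTERVAL for gb in usage_gb_per_interval)

# Peak at 8 GB for a quarter of the week, idle at 2 GB for the rest.
peak = [8.0] * (INTERVALS_PER_WEEK // 4)
idle = [2.0] * (INTERVALS_PER_WEEK - len(peak))
usage = peak + idle

cap = capacity_bill(8)
use = usage_bill(usage)
print(f"capacity-based: {cap:.3f}, usage-based: {use:.3f}, saving: {1 - use/cap:.0%}")
```

With this profile the usage-based bill comes to well under half the capacity-based bill, consistent with the savings the quote describes for workloads that spend much of the week below peak.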

Elastic Containers represent the next generation of cloud server technology. Recent advances in the Linux kernel have enabled ElasticHosts to bring its new elastic auto-scaling, next-generation Elastic Containers to the masses. Aside from disrupting the cloud market, Elastic Containers will also impact the load balancer and disaster recovery markets:

  • Eliminate load balancers: To scale, but avoid paying for unused capacity, companies currently deploy load balanced clusters and add and remove cloud servers from these according to demand. With ElasticHosts' next-generation Elastic Containers this crude and imprecise block-based approach can be avoided, as all peaks and troughs in demand are handled within the container. By billing on usage, rather than capacity, immediate and automatic scaling up and down is always possible at no extra cost and with no additional software or hardware, such as load balancers.

  • Reduce disaster recovery costs: Current disaster recovery solutions are very expensive, with companies replicating 50-100% of their servers as 'hot spares' and constantly provisioning them at full capacity, effectively paying for capacity twice. Elastic Containers can strip out 80% or more of these costs: a fully configured hot spare can run continuously, ready for action if needed, but at minimal cost, since actual usage on the idle spare is very low.
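
The 80% figure follows directly from the usage-based billing model: an idle hot spare consumes only a small fraction of the capacity the primary needs. A minimal sketch with hypothetical numbers (the idle-utilisation fraction is an assumption, not a published figure):

```python
# Hypothetical hot-spare cost comparison; all figures are illustrative only.

primary_cost = 100.0                     # assumed weekly cost of the primary server

# Traditional DR: the hot spare is provisioned at full capacity,
# so it costs as much as the primary even while completely idle.
traditional_spare_cost = primary_cost

# Usage-billed DR: the spare is fully configured and running, but its
# actual usage (replication traffic, heartbeats) is a small fraction.
idle_usage_fraction = 0.2                # assumed utilisation of the idle spare
usage_billed_spare_cost = primary_cost * idle_usage_fraction

saving = 1 - usage_billed_spare_cost / traditional_spare_cost
print(f"DR spare saving: {saving:.0%}")
```

At 20% idle utilisation the spare costs a fifth of a fully provisioned duplicate, i.e. an 80% saving, and lower idle usage pushes the saving higher still.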

Davies concludes: "We have been building towards this elastic vision for six years and now the technology has caught up to enable it. We were one of the first European cloud infrastructure companies in 2008, first to use the Linux KVM hypervisor, first to offer free choice of server sizing and first to offer SSD at all instance sizes. Now our breakthrough next-generation Elastic Containers are the first and only cloud servers that can provide truly elastic, intelligent auto-scaling. Companies no longer have to sacrifice performance to reduce costs; they can have their cake and eat it too; the future of cloud is here!"

About ElasticHosts
ElasticHosts is a global Cloud Server provider that offers easy-to-use Cloud Servers with instant, flexible computing capacity. As well as Elastic Containers, ElasticHosts also offers traditional Virtual Machines, Managed Cloud Servers and Reseller Programs. The company has thousands of customers in over 60 countries worldwide, and has 9 data centres located in the UK, Europe, the US, Canada, Asia and Australia. Its headquarters are in London, UK. ElasticHosts is committed to developing simple, flexible and cost-effective cloud services for businesses worldwide.

About Linux Containers
Recent advances in the mainline Linux kernel now support containers, an operating system-level virtualisation method. Containers offer higher flexibility and lower performance overhead than traditional hypervisor-based virtualisation.

For further information, please contact:
Spark Communications
[email protected]
+44 (0)20 7436 0420

ElasticHosts
[email protected]
www.elastichosts.com
+44 (0)20 7183 8250
+1 415 358 5210

Source: RealWire


More Stories By RealWire News Distribution

RealWire is a global news release distribution service specialising in online media. The RealWire approach focuses on delivering relevant content to the receivers of our clients' news releases, as influence can only be achieved through delivering relevance.
