Carpool

What if your server could work like a carpool?

Carpooling is great! When you’re not using a car, you don’t pay for the service. Yet it still gives you the comfort of having access to a car without the hassles of ownership, such as maintenance and parking.

If you do own a car that you don’t use 24/7, you still have a fixed cost for it. It’s the same with servers and hosting. Generally speaking, you pay for the hardware even though you only use a few percent of the resources. In the server world this is referred to as excess capacity.

But what if your hosting could work like a carpool? This is now possible, thanks to serverless computing. Serverless computing is a cloud-computing model in which the cloud provider, such as Google or Amazon, manages the servers and dynamically and automatically allocates hardware resources. Since the pricing is based on the actual amount of resources consumed by an application, it's often much more cost effective than the traditional approach of pre-purchasing units of capacity (hardware such as RAM, CPU or bandwidth).

Only pay when a request is being served

Simply put, you only pay while your web page is loading. Amazon claims that this approach can reduce costs by up to 95% compared to traditional hosting, as the “excess” cost is practically eliminated.
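
To make the idea concrete, here is a minimal sketch of what such a pay-per-request function can look like. It assumes an AWS Lambda-style Python runtime behind an HTTP trigger; the handler name and event shape are illustrative, not taken from any specific project:

```python
import json

# A minimal serverless function (AWS Lambda style, Python runtime).
# The code only runs, and is only billed, while a request is being served;
# there is no always-on server to pay for in between requests.
def handler(event, context):
    # "event" carries the incoming request (assumed here to arrive via an
    # HTTP trigger such as API Gateway); "context" exposes runtime metadata.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The cloud provider invokes the handler once per request and spins up as many parallel instances as the traffic requires, which is what makes the pay-per-use pricing possible.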

What about performance and scaling?

Building scalable and performant applications is hard. There is a lot of complexity involved in making your system respond quickly regardless of load and the number of users talking to it. With serverless computing, much of this burden disappears, since scaling and infrastructure are managed by the cloud provider. It’s therefore much easier to create applications that perform well under almost any condition and traffic.

What's the catch?

There are of course downsides to all of this. There is no such thing as a free lunch. Here are a few things to consider before going all-in on Serverless computing:

  • Other services, such as databases, integrations and data sources, affect performance too. If they perform poorly or don’t scale, you will see very little benefit from going serverless.
  • Startup speed is very important. You can’t just “apply serverless architecture” to an existing system if it is a complex, highly interconnected piece of software (often referred to as a monolithic application). You can, however, apply it to more independent parts of your system, such as integrations or image processing (see the sketch after this list).
  • Simultaneous requests may run on separate instances, so you cannot rely on shared in-memory state between them.
  • It may not be ideal or cost effective for long-running tasks, since you pay for every second your code is running.
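
To make the startup-speed point concrete, here is a minimal sketch of a common pattern: keep expensive initialization outside the handler so that only the first “cold” invocation pays for it, while warm invocations reuse it. It assumes an AWS Lambda Python runtime and a hypothetical DynamoDB table named "orders":

```python
import json
import boto3  # AWS SDK for Python; assumed available in the runtime

# Module-level code runs once per cold start and is then reused by warm
# invocations on the same instance, so keep expensive setup here.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # hypothetical table name

def handler(event, context):
    # Only the per-request work runs on every invocation.
    order_id = event["pathParameters"]["id"]
    item = table.get_item(Key={"id": order_id}).get("Item")
    return {
        "statusCode": 200 if item else 404,
        "body": json.dumps(item, default=str),
    }
```

The faster your function starts and the less setup each request repeats, the smaller the cold-start penalty becomes.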

Typical use cases

You should definitely consider going all-in on serverless if one or several of the following points apply:

  • You have sporadic traffic or very high load
  • You already have a distributed architecture in place
  • You do data processing with a high request rate
  • You want a cost effective distributed and scalable hosting solution
  • You need to process IoT input at scale
  • You do a lot of multimedia processing

We are happy to talk more about this topic. Contact us by filling out the form and we'll get back to you as soon as possible! 

Author

Richard Davison

Solution Architect Director
Star Republic, SQLI Group