Stackery Smooths AWS Serverless Integration for DevOps

Stackery today announced a plug-in capability that will allow developers to build and debug code destined to run on the Lambda serverless computing service provided by Amazon Web Services (AWS) from within an integrated development environment (IDE) running on a local machine.

Company CTO Chase Douglas said the latest Stackery tool can be dropped into any IDE, regardless of the programming language being employed, either via a Stackery VS Code Serverless Tools Plug-In that automates installation or via a manual installation process. The goal is to make it easier for DevOps teams to construct workflows that include AWS serverless computing frameworks, which are now expanding rapidly, he said. In total, the Stackery tools can invoke any one of 86 AWS CloudFormation-compatible resources.

Stackery provides a set of visual tools that make it easier to write functions for the AWS Lambda service without having to know how to create the YAML files used to manage infrastructure as code. This latest offering extends that capability to local environments, allowing DevOps teams to work offline on their laptops and make changes without pushing every change to the AWS cloud over an internet connection. Customers do not need to set up a Stackery account to use this capability.
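To make that local workflow concrete, the sketch below shows the kind of Lambda handler a developer might write and exercise on a laptop before deploying. The event shape and the local invocation are illustrative assumptions only, not part of Stackery's own tooling.

```python
# Illustrative only: a minimal AWS Lambda handler of the kind a developer
# might write and invoke locally before deploying. The event shape and the
# sample invocation below are hypothetical, not part of Stackery's tooling.
import json


def handler(event, context):
    # Return an API Gateway-style response echoing part of the payload.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }


if __name__ == "__main__":
    # Simulate a local invocation with a sample event and no context object.
    print(handler({"name": "local test"}, None))
```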

Right now, Stackery is squarely focused on providing those tools for AWS environments, but Douglas said eventually those tools will be extended to other serverless computing environments.

The goal is to provide DevOps teams with tools that make all serverless computing frameworks more accessible to both developers and IT operations teams, Douglas said.

Serverless computing frameworks are gaining in popularity because they make it easier for developers to invoke additional IT infrastructure resources to kick off, for example, an analytics module whenever a certain event occurs within their application. That capability reduces the amount of code developers need to include in their core applications. Serverless computing frameworks are usually employed to run stateless processes for a very short amount of time, while containers and virtual machines continue to be relied on for longer-running applications. For DevOps, however, that creates another layer of abstraction to include within the context of a larger continuous integration/continuous deployment workflow.
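As a rough illustration of that event-driven pattern, the following sketch shows a Lambda handler reacting to an S3 object-created notification and handing each new object to an analytics step. The run_analytics helper is a hypothetical placeholder for whatever downstream processing an application actually performs.

```python
# Illustrative sketch of the event-driven pattern described above: a Lambda
# handler triggered by an S3 object-created notification kicks off an
# analytics step for each new object. run_analytics is a hypothetical stand-in.
def run_analytics(bucket: str, key: str) -> None:
    # Placeholder for whatever downstream processing the application performs.
    print(f"Analyzing s3://{bucket}/{key}")


def handler(event, context):
    # S3 notifications deliver one or more records per invocation.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        run_analytics(bucket, key)
    return {"processed": len(records)}
```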

Most organizations are already struggling to master the workflows they employ to manage the building and deployment of applications using containers and virtual machines, so any new abstraction layer of computing resources needs to fit into those workflows easily if the providers of that service hope to achieve mainstream adoption among DevOps practitioners. Long term, of course, the expectation is that all these processes will be completely automated, which means developers will only ever have to worry about writing business logic.

In the meantime, it’s likely to be a while before DevOps shops are invoking multiple serverless computing frameworks running on different clouds, much less automating them end to end. That day, however, may not be as far off as people today might assume.

Disclaimer: This article was originally published on www.devops.com.

About BDCC

BDCC Global is a leading DevOps research company. We believe in sharing knowledge and increasing awareness, and to contribute to this cause, we try to include all the latest changes, news, and fresh content from the DevOps world into our blogs.