September 20, 2019

When the cost and effort of running your own IT infrastructure are diverting attention and resources away from your core business, it’s definitely time to turn to the public cloud.

All you need is access to the internet, and then every employee in your business can start to benefit from modern, efficient services … without being tied to a specific machine or a specific office. Today’s public cloud providers take responsibility for the uptime of your applications and the ongoing maintenance of the underlying infrastructure. That means your IT professionals can focus on developing the tools and solutions that will give your business the edge in the marketplace.

Most of us know about the basic principles and advantages of using the public cloud. In this blog post, Marcin Dąbrowski, Chief Innovation Officer at ITMAGINATION, and his team share 30 useful tips based on their experience in helping companies from a variety of geographies and sectors move their IT operations to the cloud. The list starts with some general tips, followed by sections dedicated to Microsoft Azure, Amazon Web Services (AWS) and Terraform. Did you know that…


General tips

  1. IBM Multicloud Manager enables businesses to manage entire cloud environments – including multiple types of cloud, from multiple vendors – in an easy and efficient manner. With just this one tool, you’re able to manage several Kubernetes clusters, across multiple locations, and regardless of whether they’re public or private.
  2. Thanks to Jaeger infrastructure, it’s possible to trace the entire flow of communication between microservices. The tool is open source and is a Cloud Native Computing Foundation incubator project.
  3. Excel 365, the cloud-based counterpart of Microsoft’s classic spreadsheet tool, is capable of handling over one million records at a time. Power Pivot is an incredibly powerful tool that enables you to build tabular data models and to perform business analytics directly within the program. Regardless of how they’ve felt about Excel in the past, most analysts have come to love this feature.
  4. In Excel 365, a simple webpage URL can be provided as a source of data. Provided that the webpage contains data structured in tabular form, it can be easily imported into an Excel spreadsheet and refreshed, on demand, without having to go through the mundane process of ‘copy, paste’ again and again.
  5. In Power BI, Microsoft’s suite of interactive data visualization tools, the header of a report can be defined dynamically. By working with Data Analysis Expressions (DAX), the headers – and even their appearance (e.g. font, font color, background) – can change based on the data displayed in the visualization.
  6. Many services facilitate the creation and rapid integration of machine learning on the cloud. Code notebooks based on Jupyter Notebooks, such as AWS EMR notebooks and Azure Notebooks, enable developers and engineers to experiment with specific models in a live environment. When the model is ready, it can be quickly tested and deployed on your chosen cloud platform.
  7. With the use of Kubernetes clusters, it’s possible to deploy your application on many different cloud platforms and on your ‘on-premise’ infrastructure simultaneously. At present, Kubernetes is supported by AWS, Azure and Google Cloud Platform, among others.
  8. With AWS Lambda, you pay only for the actual compute time required to support your application, and not for each hour that a server is assigned to the application. You also don’t need to worry about scaling your cluster of virtual machines or the instances of your application – all the administration and scaling is done for you. Microsoft’s Azure Functions offers similar features. These solutions are known as event-driven serverless computing platforms, and they allow us to focus all our efforts on building great applications, without worrying about much of the administration associated with provisioning, sizing, deploying, etc.
  9. Amazon DynamoDB, Azure Cosmos DB and other NoSQL database services are worth considering if you’re working with systems built using Event Sourcing. With these services, processing a single event results in the updating of a single aggregate in the database, and it all happens in milliseconds.
  10. gRPC and Google’s Protocol Buffers are becoming increasingly popular as alternatives to REST and JSON for communication between microservices. gRPC is available for all of the most commonly used programming languages. What’s more, it has been shown to be more than twice as fast as REST while using significantly fewer resources. gRPC is also a Cloud Native Computing Foundation incubator project.
  11. Apache Kafka, a distributed streaming platform, allows you to set the acknowledgment (ACK) parameter on the producer. This means you can change the way you send messages from ‘fire and forget’ to a mode in which you receive confirmation that the message was persisted (and won’t be lost as a result of changes in the cluster). In this way, you can be confident that your messages are reaching their intended destination.
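
The Event Sourcing pattern from tip 9 can be sketched in a few lines of plain Python. This is only an illustration: an in-memory dict stands in for a NoSQL store such as DynamoDB or Cosmos DB, and all names (event fields, aggregate shape) are hypothetical, not a real SDK.

```python
# Event Sourcing sketch: processing a single event updates a single aggregate.
# An in-memory dict stands in for a NoSQL store such as DynamoDB or Cosmos DB;
# all names here are illustrative, not a real cloud SDK.

store = {}  # aggregate_id -> current aggregate state

def apply_event(event):
    """Apply one event to its aggregate and persist the new state."""
    agg = store.get(event["aggregate_id"], {"version": 0, "balance": 0})
    if event["type"] == "Deposited":
        agg["balance"] += event["amount"]
    elif event["type"] == "Withdrawn":
        agg["balance"] -= event["amount"]
    agg["version"] += 1  # optimistic-concurrency version counter
    store[event["aggregate_id"]] = agg
    return agg

apply_event({"aggregate_id": "acct-1", "type": "Deposited", "amount": 100})
result = apply_event({"aggregate_id": "acct-1", "type": "Withdrawn", "amount": 30})
```

In a real system, the `store[...] = agg` line would be a single-item write to the database, conditioned on the expected `version` to detect concurrent updates.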

Microsoft Azure

  1. By using Azure API Management, you can monetize access to your application programming interface (API). Together with the Stripe platform, it’s possible to monitor the actual use of your APIs by clients, set limits, create pricing plans and charge accordingly.
  2. Once you’ve taught your instance of Azure Cognitive Services to recognize specific patterns, it’s possible to access and make use of the results offline. For example, you can run a model exported to TensorFlow on your Android smartphone without connecting to the internet.
  3. Virtual machines in the cloud can be useful for much more than creating testing or production environments. For example, NV-series Azure virtual machines with NVIDIA cards can serve as remote rendering stations or platforms for video game streaming.
  4. In Azure Functions, it’s possible to write functions using PowerShell. Although PowerShell is not a standard toolkit for building websites, administrator or DevOps teams can use it to create their own APIs and invoke them with, for example, an HTTP trigger.
  5. The second generation of Azure Data Lake Storage is becoming increasingly popular. However, it’s important to remember that it’s not easy to release a lock on a locked file. For now, the only options are to wait for a ‘timeout’ – that is, for the lock to expire – or to break the ‘lease’ on the file using the REST API.
  6. It’s possible to create, share and manage package feeds in a variety of formats (such as NuGet, npm, Maven, Gradle) in a quick-and-easy way using Azure Artifacts. In this way, you have complete control of the packages added to your project and of the dependencies between them.
  7. In Azure, MySQL databases in the Platform-as-a-Service (PaaS) model are automatically backed up every 5 minutes, and backed-up copies are available for up to 35 days after the backup takes place. Where high availability is required, replicas are stored in the same region (to enable rapid access) and, in the case of failure, the replicas are activated automatically.
  8. Azure Event Grid enables you to define which listeners and receivers do and do not require a constant connection to the Service Bus. This means that savings can be made by switching off listeners that are not being used.
  9. Keys and other confidential data saved in Azure Key Vault can be used in PowerShell scripts to create environments automatically. This is done by writing an appropriate Runbook in Azure Automation.
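
As a sketch of tip 5, breaking a lease via the Storage REST API boils down to a PUT request carrying an `x-ms-lease-action: break` header. The snippet below only *constructs* such a request with Python’s standard library – it is not sent. The account, container and file names are hypothetical, and a real call would also need an `Authorization` header.

```python
import urllib.request

# Construct (but do not send) the REST call that breaks a lease on a file.
# Account/container/path are hypothetical; real calls also need authentication.
url = ("https://myaccount.blob.core.windows.net/"
       "mycontainer/myfile.txt?comp=lease")
req = urllib.request.Request(url, method="PUT")
req.add_header("x-ms-lease-action", "break")  # release the lock early
req.add_header("x-ms-version", "2018-11-09")  # REST API version header
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) against a properly authenticated URL would end the lease without waiting for its timeout.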

Amazon Web Services (AWS)

  1. Amazon EC2 Instance Connect provides access to your Amazon Elastic Compute Cloud (EC2) instances – Amazon’s scalable computing service – over the Secure Shell (SSH) protocol. Access rights can be controlled through AWS Identity and Access Management (IAM). The idea is simple – at the moment of connection, EC2 Instance Connect pushes a one-time temporary public key to the instance metadata.
  2. Logs can be transferred automatically from Amazon CloudWatch to the Amazon Elasticsearch Service, which dramatically improves their searchability and the ability to find errors.
  3. By making use of AWS Spot Instances, it’s possible to save up to 90% of your virtual machine costs. A good idea is to make use of this feature on development environments or even as part of your production cluster.
  4. With an event source mapping, you can have messages sent to the Amazon Simple Queue Service (SQS) activate functions that have been coded and deployed as AWS Lambda.
  5. When using the PaaS model within AWS, in addition to the many SQL and NoSQL database options available, it’s worth checking out Amazon Neptune, a graph database that supports the open W3C SPARQL query language as well as queries with the popular Apache TinkerPop Gremlin. Of course, with PaaS, you don’t need to worry about machines, availability or backups.
  6. As a way of protecting data saved in AWS, you can switch on encryption at rest within the Relational Database Service (RDS) using keys saved in AWS Key Management Service (KMS). Most of the services available on the AWS platform (such as S3, EBS and DynamoDB) offer this possibility. In this way, any leaked backups or other leaked files do not threaten the security of the data.
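
The SQS-to-Lambda flow in tip 4 can be sketched as a plain Python handler. The event shape below matches what Lambda passes for SQS triggers (a `Records` list with the message payload in `body`); the handler logic and the `order_id` field are illustrative, and the snippet runs locally without any AWS account.

```python
import json

def lambda_handler(event, context):
    """Process a batch of Amazon SQS messages delivered to AWS Lambda.

    For SQS triggers, Lambda passes the messages in event["Records"],
    each with its payload in the "body" field.
    """
    processed = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        processed.append(payload["order_id"])  # illustrative message field
    return {"batchSize": len(processed), "orderIds": processed}

# Local smoke test with a hand-built event (no AWS account required).
sample_event = {"Records": [{"body": json.dumps({"order_id": "A-1"})},
                            {"body": json.dumps({"order_id": "A-2"})}]}
result = lambda_handler(sample_event, None)
```

In production, the event source mapping polls the queue for you and deletes the messages automatically once the handler returns successfully.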

Terraform

  1. Terraform enables infrastructure to be managed and also facilitates the storing of its definition in code form, in HCL. This Infrastructure-as-Code (IaC) model makes it easy to create new environments on the cloud platform of your choice (e.g. AWS, Azure, Google Cloud, IBM Cloud).
  2. With Terraform, you can do much more than simply automate the provisioning of infrastructure within the cloud. You can also create Kubernetes clusters and define the Deployments and Services running on them – and, what’s more, you can target clusters deployed ‘on-premise’ with the same approach.
  3. With multi-stage YAML pipelines in Azure DevOps, you can define CI and CD processes using YAML files saved together with the source code of the application in a code repository. Using this, together with Dockerfiles and Terraform, it’s possible to create a complete definition of the product within the code itself.
  4. The HCL language used in Terraform does not provide IF statements. But it’s easy to define whether a given resource should be created using the count parameter. You can calculate this value from other variables and logical operators so that it returns 0 or 1.
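
Tip 4 can be sketched as a small HCL fragment (the variable and resource names are illustrative, and the conditional syntax assumes Terraform 0.12 or later). Deriving count from a boolean variable creates the resource either zero times or once:

```hcl
variable "create_bucket" {
  type    = bool
  default = false
}

# The bucket exists only when var.create_bucket is true:
# count = 1 creates it, count = 0 skips it entirely.
resource "aws_s3_bucket" "example" {
  count  = var.create_bucket ? 1 : 0
  bucket = "my-example-bucket"
}
```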

How can ITMAGINATION empower you with the public cloud?

ITMAGINATION is helping companies of all shapes and sizes move their IT to the cloud and develop the optimal mix of cloud services. And we practice what we preach – the vast majority of ITMAGINATION’s IT estate runs from public cloud platforms. Furthermore, ITMAGINATION is an official partner of leading technology companies such as Amazon, Google, IBM, Microsoft and Oracle. We know cloud and we know how to empower your business to maximize the benefits. Let’s talk.


Learn it. Know it. Done.