Industry Use Cases and Case Studies
In 2013, Docker began to gain popularity by allowing developers to quickly create, run, and scale their applications in containers. In just two years, Docker turned a niche technology into a fundamental tool within everyone’s reach thanks to its ease of use. But as the number of applications in a system grows, they become complicated to manage. Google was probably the first company to realize it needed a better way to deploy and manage its software components in order to scale globally, and for years it developed Borg (later called…
Ansible is a simple way to automate apps and infrastructure. It covers application deployment, configuration management, and continuous delivery. It runs on many Unix-like systems and can configure both Unix-like systems and Microsoft Windows.
1537 companies reportedly use Ansible in their tech stacks, including LaunchDarkly, Tokopedia, and ViaVarejo.
Looking at Ansible customers by industry, we find that Computer Software (36%) and Information Technology and Services (9%) are the largest segments.
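Ansible describes automation as YAML playbooks run against an inventory of hosts. A minimal sketch of configuration management (the host group `webservers`, the file names, and the `httpd` package are assumptions for illustration) could look like this:

```shell
# Write a minimal (hypothetical) playbook that installs and starts Apache
cat > webserver.yml <<'EOF'
- hosts: webservers
  become: yes
  tasks:
    - name: Install Apache
      package:
        name: httpd
        state: present
    - name: Start and enable the service
      service:
        name: httpd
        state: started
        enabled: yes
EOF

# To apply it (requires Ansible and an inventory file listing the hosts):
#   ansible-playbook -i inventory webserver.yml
```

Because the playbook only declares the desired state, running it repeatedly is safe: Ansible changes a host only when it is not already in that state.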
Description: AWS 💻
🔅 Create a key pair
🔅 Create a security group
🔅 Launch an instance using the above created key pair and security group.
🔅 Create an EBS volume of 1 GB.
🔅 The final step is to attach the above created EBS volume to the instance you created in the previous step.
1. Check the AWS CLI version
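The steps above can be sketched with the AWS CLI. This is a sketch, not a finished script: the AMI, volume, and instance IDs, the security-group rule, and the availability zone are placeholders you would replace with your own values.

```shell
# 0. Check the AWS CLI version
aws --version

# 1. Create a key pair and save the private key locally
aws ec2 create-key-pair --key-name mykey \
    --query 'KeyMaterial' --output text > mykey.pem

# 2. Create a security group (here allowing SSH as an example rule)
aws ec2 create-security-group --group-name mysg \
    --description "Allow SSH"
aws ec2 authorize-security-group-ingress --group-name mysg \
    --protocol tcp --port 22 --cidr 0.0.0.0/0

# 3. Launch an instance using the key pair and security group above
aws ec2 run-instances --image-id ami-xxxxxxxx \
    --instance-type t2.micro --key-name mykey \
    --security-groups mysg --count 1

# 4. Create a 1 GB EBS volume (must be in the instance's availability zone)
aws ec2 create-volume --size 1 --availability-zone ap-south-1a

# 5. Attach the volume to the instance (IDs come from the previous outputs)
aws ec2 attach-volume --volume-id vol-xxxxxxxx \
    --instance-id i-xxxxxxxx --device /dev/sdf
```

These commands require configured AWS credentials (`aws configure`), so treat them as a provisioning recipe rather than something to paste blindly.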
According to Amazon, the number of active AWS users exceeds 1,000,000. Whether it’s technology giants, television networks, banks, food manufacturers, or governments, many different organizations are using AWS to develop, deploy, and host applications.
Here are the names of some companies that use AWS:
Adobe, Autodesk, Canon, Coursera, Disney, Docker, Reddit, Philips, Slack, Sony, Ubisoft, HTC, Nokia, SAP, Johnson & Johnson, ESPN.
According to Intricately, the top ten AWS users based on monthly EC2 spend are:
In this article, we discuss how Hadoop stores and retrieves data from Big Data so quickly, and we look at some big companies that use it, like Facebook …
Hadoop is a framework that allows us to store Big Data in a distributed environment so that we can process it in parallel.
Components of Hadoop: -
HDFS: It allows us to store data of various formats across a cluster.
YARN: Used for resource management in Hadoop; it allows parallel processing over the data.
It is an extraordinary computational system where we can interconnect computers using…
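The HDFS side of this is visible from its command line. A short sketch, assuming a running Hadoop cluster and a local file `bigfile.csv`:

```shell
# Report how much storage the DataNodes in the cluster contribute
hdfs dfsadmin -report

# Storing a file: HDFS splits it into blocks (128 MB by default)
# and distributes replicated copies across the DataNodes
hdfs dfs -mkdir -p /user/data
hdfs dfs -put bigfile.csv /user/data/

# Retrieval reads those blocks back from multiple DataNodes in parallel,
# which is why Hadoop handles huge files so quickly
hdfs dfs -get /user/data/bigfile.csv ./copy.csv
```

The parallelism is the point: because each block lives on a different machine, both storage and processing scale with the number of nodes instead of the speed of one disk.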
1. Create a Security group that allows the port 80.
2. Launch EC2 instance.
3. For this EC2 instance, use the existing or provided key pair and the security group we created in step 1.
4. Create a volume using the EFS service, attach it to your VPC, then mount that volume at /var/www/html.
5. A developer has uploaded the code to a GitHub repo; the repo also contains some images.
6. Copy the GitHub repo code into /var/www/html.
7. Create an S3 bucket, copy/deploy the images from the GitHub repo into the S3 bucket, and change the permission…
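Once the instance is up, steps 4–7 run on the instance itself. A sketch assuming Amazon Linux; the EFS ID, repo URL, and bucket name are placeholders:

```shell
# Install the web server, git, and the EFS mount helper
sudo yum install -y httpd git amazon-efs-utils
sudo systemctl enable --now httpd

# Mount the EFS volume at the Apache document root (fs-xxxxxxxx is a placeholder)
sudo mount -t efs fs-xxxxxxxx:/ /var/www/html

# Copy the developer's code from the GitHub repo (placeholder URL)
sudo git clone https://github.com/user/webapp.git /var/www/html

# Create an S3 bucket and deploy the images with public-read permission
aws s3 mb s3://my-webapp-images-bucket
aws s3 cp /var/www/html/images/ s3://my-webapp-images-bucket/ \
    --recursive --acl public-read
```

Using EFS instead of the instance's own disk means the site's files survive instance replacement and can be shared by several web servers at once.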
1. Write an Infrastructure as code using Terraform, which automatically creates a VPC.
2. In that VPC, create two subnets:
- public subnet [ Accessible for the Public World! ]
- private subnet [ Restricted from the Public World! ]
3. Create a public-facing internet gateway to connect our VPC/network to the internet, and attach this gateway to our VPC.
4. Create a routing table for the internet gateway so that instances can connect to the outside world; update and associate…
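A minimal Terraform sketch of this VPC layout might look as follows; the region, CIDR blocks, and resource names are illustrative assumptions:

```shell
# Write a minimal (hypothetical) Terraform config covering steps 1-4
cat > main.tf <<'EOF'
provider "aws" {
  region = "ap-south-1"
}

resource "aws_vpc" "myvpc" {
  cidr_block = "192.168.0.0/16"
}

# Public subnet: instances here get public IPs
resource "aws_subnet" "public" {
  vpc_id                  = aws_vpc.myvpc.id
  cidr_block              = "192.168.1.0/24"
  map_public_ip_on_launch = true
}

# Private subnet: no public IPs, no route to the internet gateway
resource "aws_subnet" "private" {
  vpc_id     = aws_vpc.myvpc.id
  cidr_block = "192.168.2.0/24"
}

resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.myvpc.id
}

# Route table sending non-local traffic out through the gateway
resource "aws_route_table" "public_rt" {
  vpc_id = aws_vpc.myvpc.id
  route {
    cidr_block = "0.0.0.0/0"
    gateway_id = aws_internet_gateway.igw.id
  }
}

# Associate the route table with the public subnet only
resource "aws_route_table_association" "a" {
  subnet_id      = aws_subnet.public.id
  route_table_id = aws_route_table.public_rt.id
}
EOF

# To apply (requires Terraform and AWS credentials):
#   terraform init && terraform apply
```

Because only the public subnet is associated with the gateway's route table, the private subnet stays unreachable from the outside world, which is exactly the split the steps above describe.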
In this story, I am going to show you how to set up a Kubernetes cluster on top of the AWS cloud, deploy a web app, connect it to a SQL server, and store data in centralized EFS storage. We also discuss the Fargate cluster in this article.
TOPICS WE DISCUSS:
i. Kubernetes Cluster using AWS EKS.
ii. Integrate EKS with EC2, EBS, LB, and EFS.
iii. Deploying WordPress & MySQL on top of it.
iv. Fargate Cluster (Serverless Architecture).
You should have some basic knowledge of EC2 instances, EFS storage, Kubernetes, and YAML syntax. You should have an AWS IAM account…
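The core of the workflow can be sketched with `eksctl` and `kubectl`; the cluster name, region, node sizes, and manifest file names below are assumptions, and the commands need AWS credentials:

```shell
# i. Create the Kubernetes cluster on AWS EKS (eksctl provisions the
#    EC2 worker nodes, load balancers, and EBS-backed storage for us)
eksctl create cluster --name mycluster --region ap-south-1 \
    --nodegroup-name ng1 --node-type t3.small --nodes 2

# Point kubectl at the new cluster
aws eks update-kubeconfig --name mycluster --region ap-south-1

# iii. Deploy MySQL and WordPress on top of it
#     (the .yml manifests are assumed to exist in the current directory)
kubectl apply -f mysql-deployment.yml
kubectl apply -f wordpress-deployment.yml

# iv. A Fargate profile runs pods in a namespace serverlessly,
#     with no EC2 worker nodes to manage
eksctl create fargateprofile --cluster mycluster \
    --name fp1 --namespace default
```

With a Fargate profile in place, pods scheduled into the matching namespace get capacity provisioned on demand, which is what "serverless architecture" means here.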
Here we create a custom Docker image using a Dockerfile and integrate all the tools so that we can fully automate the process.
1. Create a container image that has Jenkins installed, using a Dockerfile.
2. When we launch this image, it should automatically start the Jenkins service in the container.
3. Create a job chain of Job1, Job2, Job3, and Job4 using the Build Pipeline plugin in Jenkins.
4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub.
5. Job2: By looking at the code or program files, Jenkins should automatically start the image with the respective language interpreter installed…
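Steps 1 and 2 can be sketched with a Dockerfile. This is one possible recipe, not the only one: the base image, package names, and repo URLs are assumptions you should verify against the current Jenkins install docs.

```shell
# Write a (hypothetical) Dockerfile that bakes Jenkins into the image
# and starts the Jenkins service automatically on container launch
cat > Dockerfile <<'EOF'
FROM centos:7

# Jenkins needs Java; git is needed for the pipeline jobs
RUN yum install -y java-11-openjdk wget git && \
    wget -O /etc/yum.repos.d/jenkins.repo \
        https://pkg.jenkins.io/redhat-stable/jenkins.repo && \
    rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key && \
    yum install -y jenkins

EXPOSE 8080

# Step 2: launch Jenkins automatically when the container starts
CMD ["java", "-jar", "/usr/lib/jenkins/jenkins.war"]
EOF

# To build and run (requires Docker):
#   docker build -t myjenkins:v1 .
#   docker run -dit -p 8080:8080 myjenkins:v1
```

Using `CMD` to launch the Jenkins WAR directly means no init system is needed inside the container; the Jenkins process is PID 1 and the container lives as long as it does.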