DevOps Intern Assignment - Nov2021
Thank you for your interest in the DevOps Engineer Internship at DataGrokr.
DataGrokr provides solutions to our global clients in the fields of cloud enablement, IT
automation (DevOps), data management, and big data analytics. We work on many
different technologies, and we must adapt quickly to our clients' demands.
Learning agility and the ability to pick up new technologies on the fly are must-have
requirements for success at DataGrokr. It is a great place if you love challenges.
As such, our selection process is geared to identify candidates who will enjoy this type of work
and thrive in this environment. As a part of our selection process, we ask interested
candidates to complete a technical assignment. Given the nature of the DevOps internship,
this technical assignment requires some familiarity with the AWS platform.
The details for the technical assignment are provided in the attached document. The
document contains detailed instructions on what you need to do as well as the deliverables
expected.
Your submission is due by end of day on November 14th. Based on the submissions,
we will shortlist candidates and schedule Google Meet/Skype interviews in the
following week.
Good luck and we hope you enjoy the assignment and learn something new in the process.
Thanks,
DataGrokr Team.
Assignment Instructions
The assignment will test your knowledge of AWS services. If you have no familiarity
with AWS, you may find this assignment difficult.
You will need an AWS account to solve the assignment, since you create and manage your
AWS resources inside an account. Please create an AWS Free Tier account if you do not have
one already. Use Free Tier resources and services so that you do not incur any cost while
solving the assignment.
For example, use a t2.micro EC2 instance, which is Free Tier eligible. You will be able to solve
all the assignment questions using a Free Tier account and resources.
Part-1:
Our client wants to process, validate, and store information from a JSON file in a DynamoDB
table in their AWS account. The client cannot store the raw JSON file directly in the AWS
account due to a compliance requirement, so they want to push the contents of the file from
a local machine to an SQS queue and process it from there.
You are assigned to this project, and your manager has asked you to create a script that
pushes the attached file's contents to an SQS queue, plus a Lambda function that processes,
validates, and stores the information, meeting the following requirements:
1. Develop a Python program that reads the attached JSON file (user.json) and connects to
your AWS account to push the contents of the file to an SQS queue. A separate message
must be sent to the queue for each user in the JSON file. Create the SQS queue manually
in advance in your AWS account so that the Python program can send messages to it
(see the sender sketch after this list).
2. The arrival of a message in the SQS queue from #1 should trigger a Lambda function. The
Lambda function should parse the message and store the following details about the user
in an existing DynamoDB table. Create the DynamoDB table manually in advance with the
four columns given below; UserId should be the primary key of the table (see the handler
sketch after this list).
Hint: Use the AWS SDK for Python (boto3) to store the information in the DynamoDB table.
i. UserName
ii. UserId
iii. RegistrationCode
iv. Status
Before storing the details of a user in the DynamoDB table, the Lambda function should
validate the user's RegistrationCode as per the following validation rules:
g) Finally, delete the message from the queue once it has been processed successfully.
3. Non-functional requirements:
I. The Lambda function code should be properly formatted and readable,
with proper naming conventions.
II. While you can use any language to develop the Lambda code, Python is
strongly preferred.
4. Useful resources:
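As a starting point for requirement #1, here is a minimal sketch of the sender script. It
assumes boto3 is installed, AWS credentials are configured locally, the queue is named
user-queue, and user.json holds a top-level list of user objects; none of these names or
the file layout are fixed by this document, so adjust them to your setup.

import json
import boto3

QUEUE_NAME = "user-queue"  # assumed name; use the queue you created

def main():
    sqs = boto3.resource("sqs")
    queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)

    # Assumes user.json holds a list of user objects; adjust the parsing
    # if the actual file nests the users under a key instead.
    with open("user.json") as f:
        users = json.load(f)

    # One SQS message per user, as requirement #1 asks.
    for user in users:
        response = queue.send_message(MessageBody=json.dumps(user))
        print(f"Sent message {response['MessageId']}")

if __name__ == "__main__":
    main()

And here is one possible shape of the Lambda handler for requirement #2. The validation
rules a) through f) are not reproduced in this document, so is_valid_registration_code below
is a placeholder you must implement; the table name and the Status values are likewise
assumptions. Note that when a Lambda is invoked through an SQS event source mapping,
returning without error causes the service to delete the processed messages, which satisfies
rule g) without an explicit delete_message call.

import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("users")  # assumed table name

def is_valid_registration_code(code):
    # Placeholder: implement validation rules a)-f) from the assignment here.
    return bool(code)

def lambda_handler(event, context):
    # An SQS-triggered Lambda receives a batch of messages under "Records".
    for record in event["Records"]:
        user = json.loads(record["body"])
        # "VALID"/"INVALID" are assumed values for the Status column.
        if is_valid_registration_code(user.get("RegistrationCode")):
            status = "VALID"
        else:
            status = "INVALID"
        table.put_item(
            Item={
                "UserId": user["UserId"],  # primary key
                "UserName": user["UserName"],
                "RegistrationCode": user.get("RegistrationCode"),
                "Status": status,
            }
        )
    # Returning without error lets the SQS trigger delete the messages (rule g).
    return {"statusCode": 200}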
Deliverables:
1. Create an IAM user (datagrokr) with read-only permission and send us your AWS account
alias, username, and password so that we can log into your account (see the sketch after
this list).
2. A document containing the names of the AWS resources involved in your solution:
I. SQS Queue
II. IAM role for Lambda execution
III. Lambda function
IV. DynamoDB table
3. Python program code (.py file) to send messages to the SQS queue.
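The deliverable-1 user can be created in the AWS console, or sketched with boto3 as below.
The password here is a placeholder you should replace; the policy ARN is the AWS-managed
ReadOnlyAccess policy.

import boto3

iam = boto3.client("iam")

# Create the read-only reviewer user asked for in deliverable 1.
iam.create_user(UserName="datagrokr")

# Give the user a console password (placeholder; choose your own).
iam.create_login_profile(
    UserName="datagrokr",
    Password="ReplaceWithAStrongPassword1!",
    PasswordResetRequired=False,
)

# Attach the AWS-managed ReadOnlyAccess policy.
iam.attach_user_policy(
    UserName="datagrokr",
    PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",
)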
Part-2:
Our client wants to access and read files in an S3 bucket in our AWS account and store the
files in their own AWS account.
You are assigned the specific task of reading a JSON file from an S3 bucket in DataGrokr's
AWS account, appending data to the file, and storing the modified file in an S3 bucket in
your account, meeting the following requirements:
2. The JSON file currently has one key, "message". Your Lambda function should read
the JSON file and add one more key, "user", whose value should be your name. The
Lambda function should store the modified file in an S3 bucket in your account,
turning the first snippet below into the second (see the sketch after this list).
{
"message": "Hello from the DevOps internship hiring team."
}
{
"message": "Hello from the DevOps internship hiring team.",
"user": "Bob"
}
3. Non-functional requirements:
a. The Lambda function code should be properly formatted and readable with
proper naming conventions.
b. While you can use any language, Python is strongly preferred.
4. Useful resources:
a. Assume role from another AWS account
b. Amazon S3 sample Python Code
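A minimal sketch of such a Lambda, assuming it reads the source file via sts:AssumeRole
into a role in DataGrokr's account. The role ARN, bucket names, and object key below are
placeholders, since the real values are not given in this document, and your Lambda's
execution role must be allowed to assume the source role and to write to your own bucket.

import json
import boto3

# Placeholder values; the real role ARN, bucket, and key come from DataGrokr.
SOURCE_ROLE_ARN = "arn:aws:iam::111111111111:role/datagrokr-readonly"
SOURCE_BUCKET = "datagrokr-source-bucket"
SOURCE_KEY = "message.json"
DEST_BUCKET = "my-destination-bucket"  # a bucket in your own account

def lambda_handler(event, context):
    # Assume the role in DataGrokr's account to read the source file.
    creds = boto3.client("sts").assume_role(
        RoleArn=SOURCE_ROLE_ARN,
        RoleSessionName="devops-assignment",
    )["Credentials"]
    source_s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    # Read and parse the original JSON file.
    obj = source_s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    body = json.loads(obj["Body"].read())

    # Add the "user" key required by the assignment.
    body["user"] = "Bob"  # replace with your own name

    # Write the modified file to your own bucket using the Lambda's
    # default credentials.
    boto3.client("s3").put_object(
        Bucket=DEST_BUCKET,
        Key=SOURCE_KEY,
        Body=json.dumps(body, indent=2).encode(),
        ContentType="application/json",
    )
    return {"statusCode": 200}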
Logistics of submission:
1. Please send all your submissions to cloud@datagrokr.com on or before November 14th.
2. The subject of the email should be “DevOps: <FirstName LastName>”
3. The email should have three attachments:
a. Your latest resume
b. One comprehensive document
c. Python code (.py file) to send messages to the SQS queue.
4. The comprehensive document should have the following sections:
a. IAM Credentials for us to login into your account
b. Part-1 resource names
c. Part-2 resource names
If you have any questions about the assignment, please send an email to cloud@datagrokr.com
with a clear description of the issue/question. Please use the subject line
"DevOps: Need help" to get a timely response. And please remember to Google the
issue/question before asking us!
Good luck and we hope you learn something new in this process!