DevOps Intern Assignment - Nov2021

The document describes a technical assignment for an internship as a DevOps Engineer at DataGrokr. It consists of two parts testing skills with AWS services. Part 1 involves creating a Python script to read a JSON file and send messages to an SQS queue to trigger a Lambda function that will validate data and store in DynamoDB. Part 2 involves a Lambda function to assume a cross-account role, read a file from one S3 bucket, modify it, and upload to another S3 bucket. Applicants must complete the code challenges and submit required AWS resources and Python files by the deadline.


Hi,

Thank you for your interest in the DevOps Engineer Internship at DataGrokr.

DataGrokr provides solutions to our global clients in the fields of Cloud enablement, IT
Automation (DevOps), data management, and Big Data Analytics. We work with many
different technologies, and we must adapt quickly to our clients' demands.

Learning agility and the ability to pick up new technologies on the fly are must-have
requirements for succeeding at DataGrokr. It is a great place to be if you love challenges.

As such, our selection process is geared to identify candidates who will enjoy this type of work
and thrive in this environment. As a part of our selection process, we ask interested
candidates to complete a technical assignment. Given the nature of the DevOps internship,
this technical assignment requires some familiarity with the AWS platform.

The details for the technical assignment are provided in the attached document. The
document contains detailed instructions on what you need to do as well as the deliverables
expected.

Your submissions are due to us by end of day November 14th. Based on your submissions,
we will shortlist some candidates and schedule Google Meet/Skype interviews in the
following week.

Good luck and we hope you enjoy the assignment and learn something new in the process.

Thanks,
DataGrokr Team.
Assignment Instructions

The assignment will test your knowledge of AWS services. If you do not have any familiarity
with AWS, you may find this assignment difficult.

You will need an AWS account to solve the assignment, since you will create and manage
your AWS resources in it. Please create an AWS Free Tier account if you do not already have
one. Use Free Tier resources and services so that you do not incur any cost while solving
the assignment.

For example, use a t2.micro EC2 instance, which is free of charge. You will be able to solve
all of the assignment questions using a Free Tier account and resources.

AWS Free Tier account


How do I create and activate a new AWS account?

The technical assessment contains two parts.

Part-1:

Our client wants to process, validate, and store information from a JSON file in a DynamoDB
table in an AWS account. The client can't store the raw JSON file directly in the AWS account
due to a compliance requirement, so they want to push the contents of the file from a local
machine to an SQS queue and process it from there.

You are assigned to this project, and your manager has asked you to create a script to push
the attached file's contents to an SQS queue, plus a Lambda function to process, validate, and
store the information, meeting the following requirements:

1. Develop a Python program that reads the attached JSON file (user.json) and connects to
   your AWS account to push the contents of the file to an SQS queue. A separate message
   must be sent to the queue for each user in the JSON file. Create the SQS queue manually
   in advance in your AWS account so that the Python program can send messages to it.
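A minimal sketch of the producer script might look like the following. The queue name user-registration-queue and the assumption that user.json contains a top-level list of user objects are placeholders; adjust them to the queue you create and the actual file you received.

```python
import json


def build_messages(users):
    """Serialize each user dict into its own message body, since the
    assignment requires a separate SQS message per user."""
    return [json.dumps(user) for user in users]


def send_users_to_sqs(file_path="user.json",
                      queue_name="user-registration-queue"):
    """Read user.json and push one message per user to the SQS queue."""
    import boto3  # imported here so build_messages stays usable offline

    queue = boto3.resource("sqs").get_queue_by_name(QueueName=queue_name)

    with open(file_path) as f:
        users = json.load(f)

    for body in build_messages(users):
        queue.send_message(MessageBody=body)


if __name__ == "__main__":
    send_users_to_sqs()
```

Running the script with default AWS credentials configured (for example via aws configure) is enough; no extra setup is needed beyond creating the queue first.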

2. The arrival of a message in the SQS queue from #1 should trigger a Lambda function. The
   Lambda function should parse the message and store the following details about the user
   in an existing DynamoDB table. Create the DynamoDB table manually in advance with the
   4 columns given below; UserId should be the primary key of the table.
   Hint: Use the AWS SDK for Python (boto3) to store the information in the DynamoDB table.

   i. UserName
   ii. UserId
   iii. RegistrationCode
   iv. Status

   Before storing the details of the user in the DynamoDB table, the Lambda function should
   validate the RegistrationCode of each user against the following rules:

   a) It must end with "@imagine".
   b) It must start with 6 numbers.
   c) It cannot contain any special characters besides "@" and "#", and it must include
      both "@" and "#".
   d) Its length must always be greater than 16.
   e) If it passes all the validation checks, the user's details should be added to the
      DynamoDB table with "Status" marked as "VALID".
   f) If it fails any of the validation checks, the user's details should still be added to
      the DynamoDB table, with "Status" marked as "INVALID".
   g) Finally, delete the message from the queue once it has been processed successfully.
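Putting rules a) through g) together, the consumer Lambda might be sketched as follows. The table name "users" is an assumption (use whatever you named your table); the event shape is the standard payload delivered by an SQS event source mapping.

```python
import json
import re


def is_valid_registration_code(code):
    """Check a RegistrationCode against validation rules a) through d)."""
    return (
        code.endswith("@imagine")                              # a) ends with "@imagine"
        and re.match(r"\d{6}", code) is not None               # b) starts with 6 numbers
        and re.fullmatch(r"[A-Za-z0-9@#]+", code) is not None  # c) only "@"/"#" specials
        and "@" in code and "#" in code                        # c) must include both
        and len(code) > 16                                     # d) longer than 16
    )


def lambda_handler(event, context):
    import boto3  # imported here so the validator stays usable offline

    table = boto3.resource("dynamodb").Table("users")  # assumed table name

    for record in event["Records"]:
        user = json.loads(record["body"])
        status = ("VALID"
                  if is_valid_registration_code(user["RegistrationCode"])
                  else "INVALID")
        table.put_item(Item={
            "UserId": user["UserId"],
            "UserName": user["UserName"],
            "RegistrationCode": user["RegistrationCode"],
            "Status": status,
        })
    # With an SQS event source mapping, returning without raising an
    # exception lets Lambda delete the processed messages from the
    # queue automatically (rule g).
```

Note that when SQS triggers Lambda through an event source mapping, successful completion of the handler is what removes the messages from the queue, so no explicit delete_message call is needed in this setup.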

3. Non-functional requirements:
I. The Lambda function code should be properly formatted and readable
with proper naming conventions.
II. While you can use any language to develop the lambda code, Python is
strongly preferred.

4. Useful resources:
   I. What is AWS Lambda?
   II. Building Lambda functions with Python
   III. CRUD Operations in DynamoDB with Python
   IV. SQS Send Message
   V. Get started with SQS and Python
   VI. DynamoDB Boto3 library reference

Deliverables:
1. Create an IAM user (datagrokr) with ReadOnly permission and send us your AWS account
alias, username and password so we can log into your account.
2. A document containing the names of the AWS resources that are involved in your solution:
I. SQS Queue
II. IAM role for Lambda execution
III. Lambda function
IV. DynamoDB table
3. Python program code (.py file) to send messages to SQS queue.
Part-2:

Our client wants to access and read files in an S3 bucket in our AWS account and store those
files in their own AWS account.

You are assigned the specific task of reading a JSON file from an S3 bucket in DataGrokr's
AWS account, appending data to the file, and storing the modified file in an S3 bucket in your
account, meeting the following requirements:

1. Create a Lambda function that will assume the IAM role
   arn:aws:iam::001082169132:role/S3GetObjectAccess to read the devops-assignment.json
   file from DataGrokr's S3 bucket datagrokr-devops-technical-assignment-june-2021 in the
   us-east-1 region.
   Hint: Use the AWS SDK for Python (boto3) to assume the cross-account role.
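The cross-account access can be sketched like this. The session name is an arbitrary label of your choosing; the credential-mapping helper is just a convenience introduced here, not part of boto3.

```python
def credentials_to_client_kwargs(credentials):
    """Map an STS Credentials dict onto boto3 client keyword arguments."""
    return {
        "aws_access_key_id": credentials["AccessKeyId"],
        "aws_secret_access_key": credentials["SecretAccessKey"],
        "aws_session_token": credentials["SessionToken"],
    }


def get_cross_account_s3_client(role_arn):
    """Assume the given role via STS and build an S3 client that uses
    the role's temporary credentials."""
    import boto3  # imported here so the helper above stays usable offline

    assumed = boto3.client("sts").assume_role(
        RoleArn=role_arn,
        RoleSessionName="devops-assignment-session",  # arbitrary label
    )
    return boto3.client("s3",
                        region_name="us-east-1",
                        **credentials_to_client_kwargs(assumed["Credentials"]))
```

The returned client reads objects with the permissions of the assumed role rather than those of your own account.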

2. The JSON file currently has one key - "message". Your Lambda function should read the
   JSON file and add one more key - "user". The value of the "user" key should be your
   name. The Lambda function should store the modified file in an S3 bucket in your
   account.

File in DataGrokr’s S3 bucket:

{
"message": "Hello from the DevOps internship hiring team."
}

The file in your account's S3 bucket should look like this:

{
"message": "Hello from the DevOps internship hiring team.",
"user": "Bob"
}

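The read-modify-write step might be sketched as follows. The role ARN, source bucket, and key are the ones given above; the destination bucket name and the "Bob" value are placeholders for your own bucket and your own name.

```python
import json


def add_user_key(raw_json, user_name):
    """Add the "user" key to the JSON document, keeping "message" intact."""
    doc = json.loads(raw_json)
    doc["user"] = user_name
    return json.dumps(doc)


def lambda_handler(event, context):
    import boto3  # imported here so add_user_key stays usable offline

    # Assume DataGrokr's role to read the source file cross-account.
    creds = boto3.client("sts").assume_role(
        RoleArn="arn:aws:iam::001082169132:role/S3GetObjectAccess",
        RoleSessionName="devops-assignment-session",
    )["Credentials"]
    source_s3 = boto3.client(
        "s3",
        region_name="us-east-1",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

    obj = source_s3.get_object(
        Bucket="datagrokr-devops-technical-assignment-june-2021",
        Key="devops-assignment.json",
    )
    modified = add_user_key(obj["Body"].read().decode("utf-8"), "Bob")

    # Your own account's default credentials write to your own bucket.
    boto3.client("s3").put_object(
        Bucket="my-devops-assignment-bucket",  # assumed destination bucket
        Key="devops-assignment.json",
        Body=modified.encode("utf-8"),
    )
```

The Lambda execution role in your account needs s3:PutObject on the destination bucket and sts:AssumeRole on the DataGrokr role for this to work.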
3. Non-functional requirements:
a. The Lambda function code should be properly formatted and readable with
proper naming conventions.
b. While you can use any language, Python is strongly preferred.
4. Useful resources:
   a. Assume role from another AWS account
   b. Amazon S3 sample Python Code
   c. S3 Boto3 library reference
Deliverables:
1. A document containing the names of the AWS resources that are involved in your solution:
a. IAM role for Lambda execution
b. Lambda function
c. S3 bucket

Logistics of submission:
1. Please send all your submissions to cloud@datagrokr.com on or before November 14th.
2. The subject of the email should be “DevOps: <FirstName LastName>”
3. The email should have three attachments:
a. Your latest resume
b. One comprehensive document
c. Python code (.py file) to send messages to SQS queue.
4. The comprehensive document should have the following sections:
a. IAM Credentials for us to login into your account
b. Part-1 resource names
c. Part-2 resource names

If you have any questions about the assignment, please send an email to cloud@datagrokr.com
with a clear description of the issue/question. Please use the subject line
"DevOps: Need help" to get a timely response. Please remember to Google the
issue/question before asking us!

Caution on copying assignments:

If we find any two assignments to be similar, we will disqualify both applications. Please
do not share your assignment with your friends or store it in public repos like GitHub
and risk getting disqualified.

Good luck and we hope you learn something new in this process!
