Backend Developer - 1238 - JD


Backend Developer (Python/PySpark)

In All Media

In All Media is a trailblazing Nearshore Managed Service Provider, laser-focused on Team Augmentation for software development. We craft bespoke, highly specialized teams that merge effortlessly with our clients' processes and culture, delivering unparalleled results.

The Challenge

As a Backend Developer specializing in Python/PySpark, you will be instrumental in enhancing and optimizing the Python/PySpark codebase within our data stack. You will collaborate closely with Data Scientists, Marketing, and Product teams to translate personalized customer messaging requirements into technical solutions. Troubleshooting and optimizing Python/PySpark jobs, managing the codebase via Git, and ensuring smooth production code releases are central aspects of this role.

Key Responsibilities

● Develop new capabilities and enhancements to the existing Python/PySpark codebase within our data stack.
● Collaborate with cross-functional teams to translate personalized customer
messaging requirements into technical solutions.
● Troubleshoot and optimize Python/PySpark jobs to enhance the performance and
reliability of our data stack.
● Participate actively in the full software development life cycle from concept through
delivery.
● Engage in code and design reviews with the Data Science team regarding
new/enhanced stack capabilities.
● Develop automated reports on the stack’s customer message deliveries.
● Troubleshoot the stack’s AWS/Airflow scheduling and runtime code issues as they
occur.
● Manage the codebase efficiently using Git.
● Handle production code releases for the data stack.

Minimum Qualifications

● Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
● Strong programming skills in Python/PySpark.
● Knowledge and hands-on experience in implementing programming design
patterns.
● Good understanding of SQL.
● Experience with Jupyter notebooks or similar Python notebook technologies
(preferred).
● Ability to work independently.
● Experience using Git as a code repository (preferred).
● Experience working with cloud infrastructure and AWS services like EMR, Redshift,
etc.
● Familiarity with common workflow orchestration frameworks, such as Airflow.
● Familiarity with big data storage technologies such as Hadoop Distributed File
System (HDFS) and Amazon S3.

Benefits

● 100% remote work.
● Payments made from the US.
● International teams.
● Exciting projects; all our clients are top-notch US-based companies.
● Hourly rates in US dollars.
● Full-time and long-term projects.
● Contract as a vendor.
