Sr. AWS Data Engineer #27438

If you enjoy helping clients achieve business success through technology, you will love being a member of the Datum team. We hire only those with a commitment to the highest level of technical and professional excellence, coupled with leadership skills and a great attitude.



Location: Atlanta, Georgia
ID Number: #27438

Sr. Developer

Long-term contract, Atlanta, GA


Top 3 skills needed

- Skill 1: Strong hands-on experience with Scala, Hadoop platform tools, real-time streaming using Kafka, Node.js, microservice architecture, and REST APIs
- Skill 2: Experience with the AWS tech stack and tools: S3, EC2, EMR, MSK, FaaS, CloudWatch, API Gateway, service discovery tools, and monitoring, alerting, and dashboards
- Skill 3: Agile practices and DevOps CI/CD

DATA and API are committed to becoming the best IT organization in the airline industry. IT is on a journey to introduce state-of-the-art technologies to support our innovative business needs.

You will be embedded within a large-scale program, rotating through a variety of Test Automation, Data Engineering, and Development roles in order to gain a deep understanding of the program and emerge with subject matter expertise.

Our on-the-job curriculum includes paired programming, hands-on activities, and virtual training.

Training topics focus on program-specific technologies and processes. You will also be assigned a mentor to provide support throughout the program.

We have active executive management sponsorship and involvement throughout the program.

During your rotations, you will have the opportunity to participate in and/or complete features that address real business and customer needs. Lastly, you will improve your technical, business, and professional skills while working in a collaborative, agile organization.


- Lead, create, and implement solid and supportable modular designs for data streaming, transformation, and API product development in support of critical applications

- Hands-on design, creation, and support of data-centric products that encompass multiple specializations, platforms, and technologies

- Collaborate with solution architect and business area to analyze technical information and produce quality software

- Drive efficiencies and provide guidance in development coding practices to increase team velocity

- Successfully form creative solutions to overcome obstacles

- Lead the team in automated testing and CI/CD processes

- Partner with consumers of the platform to understand evolving needs

- Be a technical expert on the products we build

- Document solutions in written and diagram form, and communicate across teams to ensure seamless integration

- Review developer code to ensure it meets design goals and business needs

- Identify technical issues, articulate impact and need for prioritization


- Communicate proactively with both the team and leadership

- Work collaboratively with vendors and integrate them as part of the unified team

- Following Agile practices, develop quality, scalable, tested, and reliable data services using industry best practices

- Accustomed to making decisions with little supervision or direction with a “can-do, make-it-happen” attitude.

- Participate in rotational on-call support as needed

- Learn our business domain and technology infrastructure quickly and share your knowledge freely and enthusiastically with others on the team

WHAT ARE WE LOOKING FOR? / WHAT EXPERIENCE DO YOU NEED?

Must Haves:

- Bachelor’s or Master’s degree in Information Systems or Computer Science, with 8 or more years of software development experience

- Strong data engineering background

- Minimum of 2 years designing and building data pipelines to support real-time streaming and eventing

- Minimum of 2 years building microservices and API architecture

- Experience with AI/ML tools and technologies

- Experience with Hadoop, HBase, Hive and other Big Data technologies/tools

- Experience working with relational and non-relational data

- Experience working with Spark/Scala, Java, and Python

- Experience with ingestion/integration tools like Apache NiFi

- Must have the ability to listen to and collaborate with colleagues; convey ideas effectively; and prepare clear, written documentation

- Large project experience with high transaction volumes is required

- Driven to solve difficult challenges

- Exposure to TDD and automated testing frameworks

- Experience with any of the following message formats: Parquet, Avro, Protocol Buffers

- Experience with Cloud tools and technologies, including AWS

- Create solutions on AWS using services such as Kinesis, Lambda and API Gateway.

- Candidate must be a self-learner with the ability to pick up new technologies and provide tangible results

- Must be solutions-oriented, using rigorous logic and methods to solve difficult problems with effective solutions, probing all sources for answers

- Create self-healing processes to address anomalies in the environment and minimize human intervention

- Automate, analyze, and create ad-hoc reports utilizing various data sources

- Maintain current knowledge of relevant technologies

- Participate in continuous improvement efforts, enhancing and optimizing performance and providing increased functionality

- 24/7 on-call flexibility and availability for production support of mission-critical applications

- Work alongside multiple cross-functional teams, security, and architects

- Knowledge of build tools like Maven and Gradle, and the artifact lifecycle from snapshots to releases and patch fixes

- You’re a self-starter who is adaptable and able to meet aggressive deadlines


- A minimum of 2 years working in a fast-paced agile environment with VersionOne or similar

- Ability to communicate with both business and technical teams on how product works and integrations with other products

- Previous experience supporting large, critical applications in a production environment, with strong debugging skills, is required

- Proficient in at least one cloud automation framework, such as Terraform or CloudFormation

- Extensive experience architecting, designing, and programming applications in an AWS cloud environment

- Experience designing and building applications using AWS services such as EC2, MSK, etc.

- Experience architecting highly available systems that utilize load balancing and horizontal scalability

- Exposure to Node.js

WHAT ELSE?


Contact: Dan Yanzon