Data Engineer – AWS

The Data Engineer is responsible for developing data pipelines and data engineering components to support strategic initiatives and ongoing business processes for customers.
This role works with leads, analysts, and data producers/consumers to understand requirements, develop technical solutions, and ensure the reliability and performance of the data engineering solutions.
Focus on scalability, performance, service robustness, and cost trade-offs
Continuous drive to explore, improve, enhance, automate, and optimize systems and tools to best meet evolving business and market needs
Attention to detail, coupled with ability to think abstractly
Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
Create prototypes and proof-of-concepts for iterative development
Keen to learn new technologies and apply the knowledge in production systems
Take complete ownership of projects and their development cycle
Experience with big data tools like Hadoop, Spark, Kafka, Flink, Hive, Sqoop, etc.
Experience with relational SQL databases like MySQL and Postgres, and NoSQL databases like MongoDB and Cassandra
Experience with data pipeline orchestration tools such as Airflow
Experience with AWS cloud services like EC2, S3, EMR, RDS, and Redshift
Experience with stream-processing systems like Storm, Spark Streaming, and Flink
Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
Strong communication and interpersonal skills
7+ years of experience
Bachelor's / Master's in Engineering, Computer Science, or MCA
Extensive hands-on experience implementing data migration and data processing using Azure services: Networking, Windows/Linux virtual machines, Containers, Storage, ELB, Auto Scaling, Azure Functions, Serverless Architecture, ARM Templates, Azure SQL DB/DW, Data Factory, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
Cloud migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.
Minimum of 3 years of RDBMS experience

Job Type: Remote
