The Enterprise Data Management team is integral to Nissan's data strategy for developing new digital offerings. We build next-generation data platforms in the cloud to enable X360, a single source of truth, and advanced data analytics solutions.
We are looking for a hands-on Lead Data Engineer to build next-generation data warehouses, data lakes, and business intelligence solutions in AWS as part of Nissan's X360 initiatives. The candidate will be required to understand and gather data warehouse / data lake requirements, and to design and implement data pipelines in the cloud (AWS preferred).
A minimum of 7-10 years of experience in the data engineering space, including delivery of several complex, high-volume data projects, is mandatory.
The primary responsibilities include:
- Collaborate with product/business owners throughout the product/project lifecycle and develop data engineering solutions aligned with Nissan's data strategy.
- Lead a team of data engineers and guide them through the development lifecycle.
- Work with solution and data architects to resolve technical and architectural challenges during implementation.
- Design and develop the various components of a data pipeline, including data ingestion, data processing, and analysis of business data.
- Create high-level and low-level designs for each module.
- Lead scoping efforts and provide inputs for preparing effort and time estimates for projects.
- Develop medium-to-complex modules following coding standards and guidelines. Build reusable components and frameworks.
- Perform code reviews and maintain code review standards.
- Follow agile and DevOps best practices while implementing solutions.
Skills and Qualifications:
The ideal candidate will have worked on end-to-end data warehousing and data lake solutions on cloud platforms (AWS). The candidate should have the following skill sets:
- Strong data engineering (ETL) experience in the cloud, preferably AWS. AWS certification (Developer/DevOps/Solutions Architect) preferred.
- Excellent understanding of distributed computing paradigms.
- Should have strong experience in data warehouse and data lake implementation.
- Should have strong experience with relational databases, ETL design patterns, and ETL development.
- Should have strong experience with CI/CD frameworks and container-based deployments.
- Should have excellent programming and SQL skills.
- Should have good exposure to NoSQL and big data technologies.
- Should have strong implementation experience across all of the technology areas below (breadth) and deep technical expertise in some of them:
- Data integration/engineering – ETL tools such as Talend ETL, AWS Glue, etc. Experience with Talend Cloud ETL is a plus.
- Data warehouse – Snowflake and/or AWS Redshift. Experience with the Snowflake cloud data warehouse would be an advantage.
- Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies.
- Programming - Java/Python/Scala and SQL.
- Data visualization – tools such as Tableau and Amazon QuickSight.
- Master data management (MDM) – concepts and experience with tools such as Informatica MDM and Talend MDM.
- Demonstrated strong analytical and problem-solving capability.
- Good understanding of the data ecosystem, including both current and future data trends.
- Should be the go-to person for the above technologies.