
Data Architect


Job Description

Nissan is a pioneer in Innovation and Technology. With a focus on Mobility, Operational Excellence, Value to our Customers and Electrification of vehicles, you can expect to be part of a very exciting journey here at Nissan.
Nissan is pursuing a massive Digital Transformation backed by leading technologies across the organization globally. We are committed to building a diverse, entrepreneurial organization, and our current team is strong evidence of that. Our people are what drive the business forward. At Nissan Digital, you will be part of a dynamic team with ample opportunities to grow and make a difference.


The Role:

We are looking for passionate and seasoned data architects to build next-generation data warehouses / data lakes and business intelligence solutions for our business stakeholders. The candidate will be required to understand and gather data warehouse / data lake requirements, then architect, design / model and implement timely solutions for the business needs. The candidate should be able to own, and be responsible for, the complete architecture of the solution.

The data architect will own, optimize and maintain conceptual, logical and physical data models. Exposure to the Sales, Manufacturing, Quality, Engineering, Supply Chain, Finance, After-sales / Credit and Digital Workplace domains would be preferred.



The Data Architect needs to architect, design / model and implement data lake / data warehouse and business intelligence solutions using various technologies. The primary responsibilities include:

  • Own and drive the architecture for a project / product
  • Choose the right architecture that would solve the business problems
  • Lead a team of technical people, mentoring and guiding them through solution delivery
  • Communicate and translate the technical solution to the team and other stakeholders
  • Drive re-use initiatives and promote re-use culture within the team
  • Design and implement the various components of the data pipeline, covering integration, storage, processing and analysis of business data
  • Analyze business requirements and derive the conceptual model by identifying entities and the relationships between them; identify attributes and create logical and physical models
  • Model business processes, process flows and data flows
  • Create flowcharts, process flow diagrams and data flow diagrams
  • Collaborate with application development and support teams on various IT projects
  • Develop complex modules and proof of concepts
  • Lead and drive performance optimization efforts
  • Define standards and guidelines for the project and ensure the team follows them
  • Assist in setting strategic direction for database, infrastructure and technology through necessary research and development activities
  • Make automation a key driver: development & testing should be automated through frameworks / tools
  • Monitor performance and advise on any necessary infrastructure changes through capacity planning & sizing exercises
  • Work with various vendors like cloud providers and consultants to deliver the project
  • Define non-functional requirements and ensure that the solution adheres to them
  • Apply agile & DevOps best practices while implementing solutions

Skills and Qualifications:

The ideal candidate should have worked on end-to-end data warehousing / data lake and / or business intelligence solutions in the past, and should have the following skill sets:

  • Excellent understanding of the distributed computing paradigm
  • Should have excellent knowledge of data warehouse / data lake technology and business intelligence concepts
  • Should have good knowledge of relational, NoSQL and big data databases, and should be able to write queries
  • Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of the below technologies:
    • Data integration – ETL tools like Talend and cloud Ingestion mechanisms.
    • Data modelling – Dimensional & transactional modelling using RDBMS, NoSQL and big data technologies. Experience in Snowflake modelling would be an advantage
    • Data visualization – Tools like Tableau
    • Big data – Hadoop ecosystem, distributions like Cloudera / Hortonworks, Pig and Hive (good to have)
    • Data processing frameworks – Spark & Spark Streaming (good to have)
  • Hands-on experience with multiple databases like PostgreSQL, Snowflake, Oracle, MS SQL Server and NoSQL stores (HBase / Cassandra, MongoDB) is required
  • Mandatory knowledge of various data modelling techniques, with hands-on experience in data modelling tools like ERwin, TOAD, PowerDesigner, etc.
  • Experience in the cloud data ecosystem, specifically AWS. Experience in AWS data components like EC2, S3, Redshift & RDS, and managed services like Aurora & Athena, is desirable
  • Demonstrated strong analytical and problem-solving capability
  • Good understanding of the data eco-system, both current and future data trends
  • 10+ years of experience in the data engineering space, including at least a few complex & high-volume data projects as an architect, is mandatory

Preferred skills