The Data Engineering Architect will be responsible for designing and implementing scalable, secure, and efficient data architectures that support analytics, AI, and business intelligence across the organization. This position combines strong technical knowledge of modern data systems with strategic thinking to ensure that the data infrastructure supports enterprise objectives and long-term growth.
Key Responsibilities
1. Data Architecture and Strategy
- Design and maintain the organization’s end-to-end data architecture, covering ingestion, storage, processing, and access layers.
- Establish data engineering standards, frameworks, and best practices.
- Develop conceptual, logical, and physical data models for analytics, reporting, and machine learning needs.
- Assess and recommend modern data platforms such as BigQuery (preferred), AWS Redshift, and Snowflake, and tools such as Google Dataproc, Cloud Composer, dbt, Airflow, AWS EMR, MWAA, and Kafka.
2. Data Engineering and Development
- Lead the creation and deployment of ETL or ELT pipelines for both structured and unstructured data.
- Improve data flow and data collection processes for cross-functional teams.
- Maintain data quality, lineage, and governance with automation and metadata management solutions.
- Work with DevOps teams to integrate CI/CD practices into data pipeline development and deployment.
3. Leadership and Collaboration
- Guide and mentor data engineers and analysts while encouraging technical excellence and continuous learning.
- Work closely with data scientists, analysts, and business teams to ensure technical solutions meet analytical and business needs.
- Promote the use of cost-effective, high-performance cloud-native, event-driven, and real-time data processing architectures.
4. Data Governance and Security (Optional)
- Establish policies and frameworks for data security, compliance, and privacy, including PII handling and regulations such as GDPR and HIPAA.
- Implement master data management and data catalog systems.
- Coordinate with InfoSec teams to maintain data integrity and compliance.
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- At least nine years of experience in data engineering or data architecture roles.
- Strong hands-on experience with cloud data environments, preferably Google Cloud or AWS.
- Proficiency in SQL, Python, and distributed data processing frameworks such as Spark.
- Strong knowledge of data modeling, metadata management, and data governance methods.
- Solid understanding of data warehousing, data lakes, and lakehouse structures.
Preferred Skills
- Experience with containerization and orchestration tools such as Docker and Kubernetes, and workflow orchestrators such as Airflow.
- Knowledge of machine learning pipelines and AI/ML data workflows.
- Strong communication and leadership abilities with the capacity to connect technical and business areas.
- Certifications in Google Cloud data platforms such as Professional Cloud Architect, Professional Data Engineer, or Professional Database Engineer, or certifications in AWS like AWS Certified Solutions Architect Professional.
CTC: 35–55 LPA (depending on experience)
Job Category: Engineering
Job Type: Full Time
Job Location: Bangalore