Introduction
Are you passionate about building robust and scalable data pipelines that power cutting-edge Artificial Intelligence and Machine Learning solutions? Fractal, a leading global AI and analytics company, offers exciting opportunities for Data Engineers across its vibrant centers in India, including Mumbai, Bengaluru, Gurugram, and Chennai.
Fractal partners with some of the world’s most admired Fortune 500 companies, leveraging AI, engineering, and design to help them make smarter decisions. As a Data Engineer at Fractal, you’ll be the architect of the data ecosystems that fuel these transformative solutions. You’ll work with diverse datasets, apply big data technologies, and build on cloud platforms to create efficient, reliable data foundations for analytics, machine learning models, and business intelligence. This role is ideal for engineers who thrive on challenging data problems, enjoy working with cutting-edge technologies, and are eager to make a tangible impact on client success.
Roles and Responsibilities
A Data Engineer at Fractal is a crucial part of the analytics and AI delivery team, responsible for the end-to-end data lifecycle. Key responsibilities typically include:
- Data Pipeline Development: Designing, building, and maintaining robust, scalable, and efficient ETL/ELT pipelines to ingest, transform, and load data from various sources (databases, APIs, streaming data, cloud storage) into data lakes and data warehouses. This often involves using Python, PySpark, and SQL.
- Data Warehousing & Modeling: Developing and optimizing data models (relational and dimensional) for performance and analytics readiness. This requires a solid grasp of concepts such as OLTP, OLAP, facts, and dimensions.
- Big Data Technologies: Implementing solutions using big data frameworks such as Apache Spark, Hadoop, Hive, and relevant cloud-native big data services.
- Cloud Data Platforms: Working extensively with major cloud platforms (primarily Microsoft Azure, AWS, or GCP) and their data-related services (e.g., Azure Data Factory, Azure Data Lake Storage, Azure Databricks, AWS S3, Redshift, Glue, Athena, Kinesis, Google Cloud Storage, BigQuery, Dataflow).
- Performance Optimization: Identifying and resolving performance bottlenecks in data pipelines and queries, ensuring efficient data processing and delivery.
- Data Quality & Governance: Implementing data quality checks, validation processes, and ensuring data integrity and consistency. Contributing to data governance best practices.
- Collaboration: Working closely with Data Scientists, ML Engineers, Business Analysts, and client teams to understand data requirements, provide data access, and support analytics and machine learning initiatives.
- Automation & Monitoring: Automating data processes, implementing CI/CD pipelines for data solutions, and setting up monitoring and alerting for data pipelines.
- Troubleshooting: Diagnosing and resolving data-related issues, ensuring data availability and reliability.
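As a rough illustration of the extract-transform-load flow described in the pipeline responsibilities above, here is a minimal sketch in plain Python. The order data, column names, and business rule (an 18% tax on orders of at least 100) are entirely hypothetical, and real pipelines at this scale would use PySpark or an orchestration tool rather than in-memory CSV handling:

```python
# Minimal ETL sketch (illustrative only): extract rows from a CSV source,
# apply a simple transformation, and load the result into a target.
# All file contents, column names, and rules here are hypothetical.
import csv
import io

# "Extract": a small in-memory CSV standing in for a source system.
source = io.StringIO("order_id,amount\n1,100\n2,250\n3,75\n")
rows = list(csv.DictReader(source))

# "Transform": add a derived column and filter out small orders.
transformed = [
    {**r, "amount_with_tax": round(float(r["amount"]) * 1.18, 2)}
    for r in rows
    if float(r["amount"]) >= 100
]

# "Load": write the result to an in-memory target
# (in practice, a data lake or warehouse table).
target = io.StringIO()
writer = csv.DictWriter(
    target, fieldnames=["order_id", "amount", "amount_with_tax"]
)
writer.writeheader()
writer.writerows(transformed)
print(len(transformed))  # number of rows loaded
```

The same extract/transform/load shape carries over directly to PySpark, where the list comprehension becomes DataFrame operations and the CSV buffers become cloud storage paths.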
Salary and Benefits
Fractal offers highly competitive compensation and a comprehensive benefits package for its Data Engineers in India, reflecting its position as a leading AI and analytics firm.
- Average Annual Total Compensation (CTC) in India (as of July 2025):
- For a Data Engineer (2-5 years of experience), the average total annual compensation is around ₹20.5 lakhs, typically ranging from ₹17.3 lakhs to ₹30.9 lakhs per annum. This includes base salary and performance-based bonuses.
- For Senior Data Engineers (5+ years of experience), salaries can be significantly higher, reaching ₹30 lakhs to ₹40+ lakhs per annum, depending on the depth of expertise in specific cloud platforms or big data technologies.
- Note: These figures are indicative and can vary based on the specific role level (e.g., L3, L4), individual skills, performance during interviews, and the specific location within India (e.g., Mumbai, Bengaluru might be at the higher end).
- Key Benefits and Perks:
- Competitive Pay & Performance Bonuses: Rewarding individual and company success.
- Health & Wellness: Comprehensive medical insurance for employees and their dependents, wellness programs, and mental health benefits.
- Flexible Work Arrangements: Fractal supports hybrid work models and offers flexibility where applicable to promote work-life balance.
- Generous Paid Time Off: Includes paid holidays, vacation, and sick leave.
- Learning & Development: Significant investment in employee growth through continuous learning programs, paid industry certifications (e.g., AWS, Azure, Snowflake), job training, and access to industry conferences.
- Mentorship Program: Opportunities to learn from experienced professionals.
- Global Exposure: Chance to work on diverse client projects across various industries globally.
- Innovation & Culture: An environment that fosters creativity, continuous improvement, and a strong sense of purpose in powering human decisions with AI.
- Company Equity: Eligible employees may receive company equity for long-term growth.
Eligibility Criteria
Fractal seeks Data Engineers with a strong technical foundation, problem-solving abilities, and a passion for data.
- Educational Qualification:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related quantitative field.
- Experience:
- Typically 2-10+ years of experience in data engineering, ETL development, data warehousing, or big data engineering roles.
- Some apprentice/fresher roles (for BCA, MCA, or MSc in CS/IT graduates with 0-1 year of experience) may exist, with a focus on training.
- Key Technical Skills:
- Programming Languages: Strong proficiency in Python (including PySpark) and/or Scala.
- Database & SQL: Expertise in SQL (including complex queries, window functions) and strong understanding of relational databases. Experience with NoSQL databases is a plus.
- Big Data Frameworks: Hands-on experience with Apache Spark is often mandatory; familiarity with Hadoop and Hive is expected.
- Cloud Platforms: Mandatory expertise in at least one major cloud platform: Microsoft Azure (Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure SQL, HDInsight, Synapse Analytics) or AWS (S3, Redshift, Glue, Athena, Kinesis, EMR, DynamoDB) or GCP (BigQuery, Dataflow, Cloud Storage).
- Data Warehousing: Strong knowledge and hands-on experience in data warehousing concepts (OLTP, OLAP, dimensions, facts) and data modeling (relational and dimensional modeling). Experience with cloud-based data warehouses such as Snowflake, Redshift, or Azure Synapse is a strong plus.
- ETL/ELT Tools: Experience with ETL/ELT tools such as Informatica, Talend, Matillion, or cloud-native data pipeline orchestration tools (e.g., Azure Data Factory, AWS Glue, Apache Airflow).
- Data Governance & Quality: Understanding of data quality frameworks, data lineage, and metadata management.
- CI/CD & DevOps: Experience setting up CI/CD pipelines for data solutions and familiarity with DevOps practices.
- GenAI/MLOps (Plus): Understanding of how to enable analytics using cloud technology and MLOps, along with exposure to Generative AI concepts, is an added advantage.
- Key Soft Skills:
- Problem-Solving: Excellent analytical and problem-solving abilities to tackle complex data challenges.
- Communication: Strong verbal and written communication skills to articulate complex technical concepts to both technical and non-technical stakeholders.
- Collaboration: Proven ability to work effectively in cross-functional, agile teams and with clients.
- Ownership & Proactiveness: Self-driven with a strong sense of ownership and a “can-do” attitude.
- Learning Agility: Eagerness to continuously learn and adapt to new technologies and client requirements.
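The SQL expertise listed above calls out window functions specifically. As a self-contained way to practice them, the sketch below runs a ranking query against SQLite (which supports window functions from version 3.25) using a hypothetical `emp` table; the names, departments, and salaries are made up for illustration:

```python
# Window-function practice sketch: rank each employee's salary within
# their department. Table and data are hypothetical; SQLite is used only
# so the example is runnable without a database server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INT)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?, ?)",
    [("asha", "eng", 90), ("bala", "eng", 120), ("chen", "ops", 80)],
)

query = """
SELECT name, dept, salary,
       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank
FROM emp
ORDER BY dept, dept_rank
"""
results = list(conn.execute(query))
for row in results:
    print(row)
```

The `PARTITION BY dept` clause restarts the ranking for each department, which is exactly the behavior that distinguishes window functions from plain `GROUP BY` aggregation.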
Application Process
Fractal’s interview process for Data Engineers is designed to assess technical proficiency, problem-solving skills, and cultural fit within their client-facing, AI-driven environment.
- Online Application: Apply through the official Fractal careers portal (fractal.ai/careers). Ensure your resume clearly articulates your data engineering experience, technical skills, and specific projects.
- Online Coding Test: This is a common initial step to assess foundational coding and SQL skills. Expect questions on Python and SQL, covering data structures, algorithms, and logical aptitude.
- Technical Interview Rounds (2-3 rounds): These will be deep-dive technical discussions.
- SQL & Python/PySpark Coding: Expect live coding challenges focusing on complex SQL queries (joins, window functions, aggregations) and Python/PySpark for data manipulation, transformations, and solving data-related problems.
- Big Data Concepts: In-depth questions on Spark architecture, optimization techniques, Hadoop ecosystem, and distributed computing concepts.
- Cloud Data Services: Detailed discussions on your experience with Azure/AWS/GCP data services, including specific components like Azure Data Factory, Databricks, AWS Glue, S3, Redshift, etc. Expect scenario-based questions.
- Data Warehousing & Modeling: Questions on relational and dimensional modeling, ETL/ELT concepts, data quality, and schema design.
- Project Discussions: You’ll be asked to explain your past data pipeline projects in detail, focusing on design choices, challenges, and the impact of your work.
- Manager Round: This round assesses your behavioral acumen, team fit, problem-solving mindset, and ability to handle client-facing situations. Questions might cover your approach to support processes, collaboration, and resolving work-related challenges.
- Senior Leadership/HR Round: A final discussion to assess overall fit, career aspirations, and cultural alignment.
- Offer & Background Check: Successful candidates receive an offer, contingent on a successful background verification.
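Since the coding rounds above emphasize Python for data manipulation, a small warm-up of the grouping-and-aggregation style of task can be useful preparation. The customer and order data below are purely hypothetical:

```python
# Interview-style warm-up sketch: group order records by customer and
# compute per-customer totals in plain Python. Data is hypothetical.
from collections import defaultdict

orders = [
    ("alice", 100), ("bob", 50), ("alice", 25), ("bob", 75), ("cara", 10),
]

totals = defaultdict(int)
for customer, amount in orders:
    totals[customer] += amount

print(sorted(totals.items()))
```

In an interview, a natural follow-up is to express the same aggregation as a SQL `GROUP BY` or a PySpark `groupBy().agg()` so be prepared to translate between the three.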
Conclusion
A Data Engineer role at Fractal in India offers a dynamic and intellectually stimulating career for data professionals passionate about building robust data foundations for AI. You’ll work on diverse and challenging projects for Fortune 500 clients, leveraging cutting-edge cloud and big data technologies. If you are a technically strong, problem-solving individual with a zeal for data engineering and a desire to contribute to the future of AI, Fractal provides an exceptional platform for growth and impact.