Data Engineer
Ecospace Campus 3A, 4th Floor, Outer Ring Road, Bellandur, Bengaluru - 560103
Building No 12D, Floor 5, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India
Role: Data Engineer
Location: Hyderabad, India
Full/Part-time: Full-time
Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.
About the Role:
The Data Engineer will design, build, and operate scalable, reliable data pipelines and data products on a GCP‑based Lakehouse architecture. This role focuses on enabling analytics, AI/ML, and data products by leveraging Google Cloud Platform services while adhering to open data standards to support long‑term portability and interoperability.
You will work closely with platform engineers, architects, data product owners, and governance teams to deliver trusted, well‑governed datasets using modern batch and streaming patterns on GCP.
Key Responsibilities:
1. Data Pipeline Development (Batch & Streaming)
- Design and implement batch and streaming data pipelines on GCP using services such as Cloud Storage, BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Composer.
- Build ingestion and transformation pipelines that support raw, curated, and consumption‑ready datasets aligned to Lakehouse patterns.
- Optimize pipelines for performance, reliability, and cost efficiency on GCP.
2. Lakehouse & Open Data Standards
- Implement data storage and processing using Lakehouse principles, including separation of storage and compute.
- Work with open table formats (e.g., Apache Iceberg or equivalent) and open file formats (e.g., Parquet) to enable interoperability across engines.
- Support schema evolution, time‑travel, and transactional consistency where required by analytics and AI use cases.
3. Data Modeling & Data Products
- Design analytical data models optimized for BI, reporting, and advanced analytics on GCP.
- Partner with data product owners to deliver reusable, well‑documented data products.
- Ensure datasets are discoverable, understandable, and trusted by downstream consumers.
4. Data Quality, Governance & Lineage
- Implement automated data quality checks and validation rules as part of data pipelines.
- Capture and publish metadata and lineage in alignment with enterprise standards and platform capabilities.
- Follow defined security and access‑control patterns for sensitive and regulated data.
5. DataOps & Operational Excellence
- Contribute to CI/CD pipelines for data workloads, including automated testing and deployment.
- Monitor and troubleshoot data pipelines to ensure operational stability.
- Participate in production support and incident resolution for data pipelines.
- Apply FinOps‑aware practices to manage and optimize GCP data processing costs.
6. Collaboration & Continuous Improvement
- Collaborate with platform engineers and architects to align pipeline implementations with platform standards.
- Contribute to shared frameworks, templates, and best practices for data engineering on GCP.
- Mentor junior engineers and support knowledge sharing within the data engineering community.
Required Qualifications
- 5+ years of experience in data engineering or related roles.
- Strong hands‑on experience with Google Cloud Platform, including services such as:
  - Cloud Storage
  - BigQuery
  - Dataflow / Dataproc
  - Pub/Sub
  - Cloud Composer
- Proficiency in Python and SQL for data processing and transformation.
- Experience building batch and streaming data pipelines in production environments.
- Solid understanding of Lakehouse architecture concepts and modern analytics data modeling.
Preferred Qualifications
- Experience with open table formats (e.g., Apache Iceberg) and multi‑engine query patterns.
- Familiarity with Dataplex, data catalogs, or metadata management tools on GCP.
- Exposure to AI/ML pipelines or integration with Vertex AI.
- Experience with infrastructure‑as‑code (e.g., Terraform) and DevOps/DataOps practices.
- Prior experience in cloud‑agnostic or multi‑cloud data platforms.
Benefits
We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary.
- Enjoy your best years with our retirement savings plan
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme
Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way.
Join us and make a difference.
Apply Now!
Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, age, or any other federally protected class.