Dell Technologies is hiring for Consultant, Data Engineer Jobs 2026 | Dell Jobs 2026
Position Description:
Dell Technologies is driven by data. Our business is built on the understanding that data creates value and drives progress. So, we take the stewardship of our data very seriously. That’s the vital role played by Business Intelligence teams within Business Support. Internal clients across Dell rely on us to analyze, test, validate and maintain the integrity of data.
Join us as a Consultant, Data Engineer on our Business Intelligence team in Bangalore to do the best work of your career and make a profound social impact.
🎓Qualification: Bachelor’s degree or higher
📌Application closing date: 30-April-26
👉Salary: Expected Rs. 5.5 to 10 LPA
Position Summary:
As a Consultant – Data Engineering, you will architect, build, optimize, and operationalize modern data solutions on the Google Cloud Platform, with deep expertise in BigQuery, SQL, Apache Airflow, Data Modelling, and ETL/ELT frameworks. You will lead complex data engineering initiatives while ensuring scalable, high‑performance, and cost‑optimized data systems.
Disclaimer: Welcome to rajeshjobportal.com! We gather job information from various sources, including job websites and official company portals, to bring you job opportunities suited to your interests. Please verify job details individually before taking any action. It is important to note that rajeshjobportal.com does not publish any of its own job information, nor are we involved in the hiring process; we are simply here to support you in your job search journey. Thank you, and keep supporting us!
Work Responsibilities:
- Lead end‑to‑end architecture, design, and development of large‑scale data pipelines using BigQuery, Airflow, and GCP services.
- Define data engineering best practices, coding standards, and governance guidelines.
- Build highly optimized ETL/ELT pipelines for diverse structured and unstructured data sources using BigQuery, Airflow, and GCP-native tools.
- Develop efficient data models (star, snowflake, semantic layers) for analytics and machine learning use cases.
- Optimize data architecture for reliability, security, and cost efficiency across environments.
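To illustrate the dimensional modelling mentioned above, here is a minimal star-schema sketch: one fact table joined to a dimension table. All table names, column names, and data are hypothetical examples, and SQLite stands in for BigQuery purely for illustration.

```python
import sqlite3

# Minimal star schema: a fact table (measures) joined to a dimension table
# (descriptive attributes). Names and data are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
    INSERT INTO dim_product VALUES
        (1, 'Laptop', 'Hardware'),
        (2, 'Mouse',  'Accessories');
    INSERT INTO fact_sales VALUES
        (10, 1, 2, 2400.0),
        (11, 2, 5,  125.0),
        (12, 1, 1, 1200.0);
""")

# A typical analytical query: aggregate fact-table measures
# by a dimension attribute.
rows = cur.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY total_revenue DESC
""").fetchall()
print(rows)  # [('Hardware', 3600.0), ('Accessories', 125.0)]
```

A snowflake schema would further normalize the dimension (e.g. splitting `category` into its own table); the star form shown here trades some redundancy for simpler, faster analytical joins.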
Technical Requirements:
- SQL (analytical SQL, query optimization, window functions)
- Apache Airflow (DAG scheduling and a solid understanding of DAG design)
- Data modelling (dimensional modelling, normalization/denormalization strategies)
- ETL/ELT development using modern cloud tooling
- Cloud services (BigQuery, GCS, Dataflow, Cloud Composer) and Python or Java for data engineering (object-oriented and functional paradigms)
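As a taste of the analytical SQL and window functions listed above, here is a small sketch using SQLite (whose window-function syntax mirrors BigQuery's for this case). The `orders` table and its data are hypothetical.

```python
import sqlite3

# Window-function sketch: rank each customer's orders by amount and
# compute a per-customer running total, without collapsing rows the
# way GROUP BY would. Data is illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 1, 50.0), ('alice', 2, 80.0),
        ('bob',   3, 30.0), ('bob',   4, 20.0);
""")

rows = cur.execute("""
    SELECT customer, order_id, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS amount_rank,
           SUM(amount) OVER (PARTITION BY customer) AS customer_total
    FROM orders
    ORDER BY customer, amount_rank
""").fetchall()

for row in rows:
    print(row)
# ('alice', 2, 80.0, 1, 130.0)
# ('alice', 1, 50.0, 2, 130.0)
# ('bob', 3, 30.0, 1, 50.0)
# ('bob', 4, 20.0, 2, 50.0)
```

The `PARTITION BY` clause scopes each window to a single customer, so the rank and total are computed per customer while every order row is preserved.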
How to Apply:
- 1. Review the Job Description/Information.
- 2. Click the Apply Link: Scroll down and click the "Apply Link" button to be redirected to the company's official website.
- 3. Fill out the Application:
- 4. Double-Check your Information:
- 5. Submit Your Application:
- Resume Screening: The recruitment team will evaluate applications based on skills, academics, and previous projects or work experience (if any).