Mercedes-Benz
- Design and build conceptual and logical data models and flowcharts that conform to existing standards and conventions.
- Collect and collate data from disparate sources such as RDBMS systems, logs, and data files of various formats. Strong SQL and data modelling understanding is necessary.
- Sound knowledge of building scalable data pipelines informed by an understanding of the application landscape. Analyse structural requirements and build data pipelines and models for analytics applications.
- Cleanse, treat, and process data for correctness, uniformity, and fitness for further analytical processing. Should work with large and diverse volumes of enterprise datasets.
- Strong knowledge of and working experience with ETL concepts. Proficiency in data warehouse design, tuning, and ETL/ELT process development.
- Extensive experience building data pipelines that ingest data from multiple sources (CSV, Excel, JSON, etc.).
- Experience in migrating data from legacy applications.
- Bachelor’s degree in Information Technology or Engineering.
- Strong communication skills to create data stories and explain statistical outcomes. Should be able to work with notebooks and EDA components to explain analytical outcomes convincingly, supported by data and visualizations.
- Knowledge of programming languages: Python and SQL.
- Knowledge of Big Data analytics (PySpark).
- Expertise in working with modern data lakes and data warehouses.
- Strong knowledge of ETL processes and techniques.
- Good to have: knowledge of data modelling, process modelling, master data management, metadata management, and enterprise data management.
- Proficiency in database design principles and relational DBMS, including third-normal-form relational modelling and dimensional modelling.
- Good to have: experience with AI/ML use cases using appropriate Python/R packages. Ability to dive deep into the business domain and domain data, extract features, and do thorough feature engineering so that interactive and iterative ML models can be built.
- Familiarity with cloud solutions: Azure (Databricks, Data Factory, ADLS).
- Deployment knowledge: CI/CD toolchain.
- Strong knowledge of SQL and NoSQL databases.
- Exposure to data science modelling concepts.
- Evaluation and recommendation of new data management and storage technologies/standards
- Should work independently with global teams and deliver the expected analytical and predictive outcomes. Should also look for opportunities to mine additional use cases as domain and data understanding grows.
- Knowledge of data mining and segmentation techniques
Vacancy Type: Full Time
Job Location: Bengaluru, Karnataka, India
Application Deadline: N/A