MarkTech Consultant

Sr Data Engineer

1,000,000 - 1,400,000 P.A.
2-5 Years | Full Time
Pune, Maharashtra, IN

Vacancy: 3

Job Description

  • Candidates should have a minimum of 4 years of work experience, with at least 2 years in a Data Engineer role.
  • Experience migrating workloads from on-prem Hadoop to GCP.
  • Strong proficiency in GCP technology, with an emphasis on the BigQuery and Dataproc platforms.
  • Mastery in at least one of the following domain areas:
      • Big Data: managing Hadoop clusters (all included services), troubleshooting cluster operation issues, migrating Hadoop workloads, architecting solutions on Hadoop, experience with NoSQL data stores like Cassandra and HBase, building batch/streaming ETL pipelines with frameworks such as Spark, Spark Streaming, and Apache Beam, and working with messaging systems like Pub/Sub, Kafka, and RabbitMQ.
      • Data warehouse modernization: building complete data warehouse solutions on BigQuery, including technical architectures, star/snowflake schema designs, query optimization, ETL/ELT pipelines, and reporting/analytic tools.
  • Must have hands-on experience working with batch or streaming data processing software (such as Beam, Airflow, Hadoop, Spark, Hive); a minimal pipeline sketch follows this list.
  • Strong SQL skills in at least 2 platforms: Hive, BigQuery, or Dataproc.
  • Experience writing software in one or more programming languages.
  • Experience working with Google Cloud data products (CloudSQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.).
  • Experience with systems monitoring/alerting, capacity planning, and performance tuning.
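To give a concrete flavor of the batch/streaming pipeline work listed above, here is a minimal Apache Beam sketch that reads JSON events from Pub/Sub and appends them to BigQuery. It is an illustrative assumption, not this role's actual stack: the project, topic, table, and event_id names are hypothetical placeholders.

```python
# Minimal streaming ETL sketch: Pub/Sub -> Apache Beam -> BigQuery.
# All project/topic/table/field names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # Read raw messages from a (hypothetical) Pub/Sub topic.
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Decode each message from JSON bytes into a dict.
        | "ParseJson" >> beam.Map(lambda m: json.loads(m.decode("utf-8")))
        # Drop malformed records that lack an event_id key.
        | "FilterValid" >> beam.Filter(lambda row: "event_id" in row)
        # Append surviving rows to a (hypothetical) existing BigQuery table.
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```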
Role

  • The candidate needs to be a self-starter: eager to learn new technologies, extract insights and value from data, and leverage technology to deliver those insights.
  • The candidate should focus on customer experience and design.
  • Work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructures on Google Cloud Platform (GCP).
  • Perform data migrations, data archival and disaster recovery, and build big data analytics solutions requiring a combination of batch or streaming data pipelines, data lakes, and data warehouses; an orchestration sketch follows this list.
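As a sketch of the orchestration side of such migration and pipeline work, the following hypothetical Cloud Composer (Airflow) DAG loads daily Cloud Storage exports into BigQuery. The DAG id, bucket, object paths, and table names are illustrative assumptions only.

```python
# Hypothetical Cloud Composer (Airflow) DAG: load daily GCS exports
# into BigQuery. All names are placeholders, not the employer's systems.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_hadoop_export_to_bq",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Append the day's exported Avro files to a warehouse table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_exported_files",
        bucket="my-migration-bucket",
        source_objects=["exports/{{ ds }}/*.avro"],
        source_format="AVRO",
        destination_project_dataset_table="my-project.warehouse.events",
        write_disposition="WRITE_APPEND",
    )
```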
Responsibilities

  • Design, develop, and deploy modern data warehouse solutions on the Google BigQuery platform (see the sketch after this list).
  • Investigate, recommend, and implement data ingestion and ETL performance improvements.
  • Work closely with architects and business process owners to translate business requirements into technical solutions.
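To illustrate the BigQuery warehouse work described above, here is a minimal sketch using the google-cloud-bigquery Python client: it creates a date-partitioned, clustered fact table and runs a simple star-schema join. All dataset, table, and column names are hypothetical assumptions.

```python
# Minimal BigQuery sketch: a partitioned/clustered fact table plus a
# star-schema join. Dataset/table/column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Partitioning by date and clustering on the join key are standard
# BigQuery levers for the query optimization mentioned above.
client.query(
    """
    CREATE TABLE IF NOT EXISTS warehouse.fact_sales (
        sale_date DATE,
        customer_id INT64,
        amount NUMERIC
    )
    PARTITION BY sale_date
    CLUSTER BY customer_id
    """
).result()

# Star-schema join: fact table against a (hypothetical) customer dimension.
rows = client.query(
    """
    SELECT d.region, SUM(f.amount) AS total
    FROM warehouse.fact_sales AS f
    JOIN warehouse.dim_customer AS d USING (customer_id)
    WHERE f.sale_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY d.region
    """
).result()

for row in rows:
    print(row.region, row.total)
```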

Skills Required: Hadoop, Big Data, NoSQL, Hive, BigQuery, Dataproc

