Sr Data Engineer
MarkTech Consultant
10,00,000 - 14,00,000 P.A.
2-5 Years
Full Time
Pune, Maharashtra, IN
Vacancy: 3
Posted: 3 years ago
Applicants: 0
Job Description
Candidates should have a minimum of 4 years of work experience, with at least 2 years in a Data Engineer role.
• Experience migrating workloads from on-prem Hadoop to GCP
• Strong proficiency in GCP technologies, with an emphasis on the BigQuery and Dataproc platforms
• Mastery of at least one of the following domain areas:
  – Big Data: managing Hadoop clusters (all included services), troubleshooting cluster operation issues, migrating Hadoop workloads, and architecting solutions on Hadoop; experience with NoSQL data stores such as Cassandra and HBase; building batch/streaming ETL pipelines with frameworks such as Spark, Spark Streaming, and Apache Beam; working with messaging systems such as Pub/Sub, Kafka, and RabbitMQ
  – Data warehouse modernization: building complete data warehouse solutions on BigQuery, including technical architectures, star/snowflake schema designs, query optimization, ETL/ELT pipelines, and reporting/analytics tools
• Hands-on experience with batch or streaming data processing software (such as Beam, Airflow, Hadoop, Spark, Hive)
• Strong SQL skills on at least two of the following platforms: Hive, BigQuery, or Dataproc
• Experience writing software in one or more programming languages
• Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)
• Experience with systems monitoring/alerting, capacity planning, and performance tuning
Role:
• The candidate needs to be a self-starter, eager to learn new technologies, extract insights and value from data, and leverage technology to deliver those insights.
• The candidate should focus on customer experience and design.
• Work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructures on Google Cloud Platform (GCP).
• Perform data migrations, data archival and disaster recovery, and build big data analytics solutions requiring a combination of batch or streaming data pipelines, data lakes, and data warehouses.
Responsibilities:
• Design, develop, and deploy modern data warehouse solutions on the Google BigQuery platform.
• Investigate, recommend, and implement data ingestion and ETL performance improvements.
• Work closely with architects and business process owners to translate business requirements into technical solutions.
Skills Required:
Hadoop, Big Data, NoSQL, Hive, BigQuery, Dataproc
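As a rough, framework-free illustration of the batch "extract → transform → load" pipeline work this posting calls for (a minimal sketch in standard-library Python; the record layout, field names, and aggregation below are invented for illustration and are not from the posting):

```python
from collections import defaultdict

# Toy batch ETL step: the raw CSV rows and their fields are
# illustrative assumptions only, not data from the job posting.
raw_rows = [
    "2024-01-01,pune,120",
    "2024-01-01,mumbai,80",
    "2024-01-02,pune,95",
]

def transform(row):
    """Parse one CSV line into a (city, count) pair; drop bad rows."""
    try:
        _date, city, count = row.split(",")
        return city, int(count)
    except ValueError:
        return None

def load(pairs):
    """Aggregate counts per city, as a warehouse fact table might."""
    totals = defaultdict(int)
    for city, count in pairs:
        totals[city] += count
    return dict(totals)

pairs = [p for p in map(transform, raw_rows) if p is not None]
print(load(pairs))  # {'pune': 215, 'mumbai': 80}
```

In a production setting the same extract/transform/load shape would be expressed in one of the frameworks named above (e.g. Beam or Spark), which adds distribution, retries, and windowing on top of this logic.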
JOBS BY CATEGORY
Location
Bangalore
Chandigarh
Chennai
Coimbatore
Delhi
Kolkata
Lucknow
Mumbai
Pune
Hyderabad
IT Jobs
Android Developer
Biotechnology
Digital Marketing
Graphic Designer
Networking
Web Developer
Non IT Jobs
BPO Jobs
Call Center Jobs
Content Writing Jobs
Electrical Engineering Jobs
Event Management Jobs
HR Jobs
Hotel Management Jobs
Roles
Accountant Jobs
Air Hostess Jobs
Business Analyst Jobs
Computer Operator Jobs
Data Analyst Jobs
Data Entry Operator Jobs
Networking Jobs
Other Jobs
10th Pass Jobs
12th Pass Jobs
Contract Jobs
Freelance Jobs
Fresher Jobs
Jobs for Women
Part Time Jobs
Walk-ins Jobs
Work from Home Jobs