Job Description: Big Data Developer
In this role you will:
- Design and develop data extraction, transformation, and load (ETL) programs for a large data warehouse to achieve near real-time speeds
- Automate data collection, pre-processing, governance, and analysis.
- Work closely with platform and UI teams to implement the data processing logic using Scala, Spark batch & Spark Streaming.
- Communicate findings clearly and succinctly to technical and non-technical audiences.
Other Functions and Responsibilities:
- Project Lifecycle Management
- Project Documentation
- Technical innovation to embrace the latest trends in BI
Education and Experience Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent.
Minimum Qualifications:
- Experience as a solution leader for data lake and data warehouse initiatives.
- Exposure to the Cloudera Hadoop ecosystem.
- Proficient in distributed computing frameworks such as Spark and Kafka.
- Strong programming skills in Scala.
- Expertise in writing and understanding complex HQL/SQL queries.
- Good hands-on experience with Oozie scheduling and error-handling methodology.
- Experience with database architecture and design, including scalability, performance, and high availability.
- Ability to work independently in a fast-paced, iterative development environment.
- Strong communication skills; a team player.
Required Competencies (Mandatory):
- Proficiency, with a minimum of 3-5 years of hands-on experience in end-to-end big data projects
- Good verbal and written communication skills to interact with relevant stakeholders
- Flexibility in adapting to new technologies.
- Flexibility to work across time zones and diverse cultures.