Big Data – Principal Architect

Company: Xebia IT Architects India Pvt Ltd

Experience: 8 – 13 yrs

Location: Dubai, UAE

Job Description

Responsibilities:
You will own Data Platform technical customer engagements, including architectural design sessions, specific implementation projects, and/or MVPs. The ideal candidate has experience in customer-facing roles and a track record of leading deep technical architecture discussions with senior customer executives, enterprise architects, IT management, and developers to drive Data Platform solutions to production.

Key responsibilities include:

Understand customers' overall data estate, IT and business priorities, and success measures to design implementation architectures and solutions
Apply technical knowledge to architect and administer solutions that meet business and IT needs
Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, reliability, and appropriate reusability upon deployment
Collaborate with other Cloud Solution Architects and Administrators in developing complex end-to-end enterprise solutions on the Microsoft Azure platform and Cloudera
Maintain technical skills and knowledge, keeping up to date with market trends and competitive insights; collaborate and share with the technical community while educating customers on the Azure platform and Cloudera
Be an Azure and Cloudera Data Platform evangelist with customers, partners and external communities
Technical experience:

  • Platform administration of Hadoop/Elastic clusters, Cloudera Manager, and Cloudera Director in the Microsoft Azure cloud and on premises
  • Knowledge of Kerberization and Sentry implementation
  • Understand source system data and be able to gather and transfer it into HDFS as raw and decomposed layers (including writing Oozie workflows and coordinators and using Sqoop and Flume)
  • Exposure to handling huge quantities of data by taking advantage of both batch and speed (streaming) methods
  • Strong knowledge of and experience with data wrangling, ideally other advanced mathematics as well, and a good working knowledge of SQL
  • Experience processing large amounts of structured/unstructured data
  • Capability to design and document solutions independently, conduct code reviews, and ensure standard quality assurance activities are completed on time
  • Hands-on with Spark and Kafka using Java/Scala/Python (see the sketch after this list)
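As a minimal sketch of the hands-on Spark and Kafka work described above, the Scala snippet below shows a Spark Structured Streaming job that reads events from a Kafka topic and lands them in an HDFS raw layer; the broker address, topic name, and HDFS paths are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger

object KafkaToHdfsRaw {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-hdfs-raw-layer")
      .getOrCreate()

    // Subscribe to a (hypothetical) Kafka topic of customer events.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")   // placeholder broker list
      .option("subscribe", "customer-events")               // placeholder topic
      .option("startingOffsets", "latest")
      .load()
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

    // Land the raw events in HDFS as Parquet; the checkpoint lets the query recover after restarts.
    val query = events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/raw/customer_events")                // placeholder raw-layer path
      .option("checkpointLocation", "hdfs:///checkpoints/customer_events")
      .trigger(Trigger.ProcessingTime("1 minute"))
      .start()

    query.awaitTermination()
  }
}
```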

Mandatory Skills

  • Minimum of 7 years of working experience with Azure, Cloudera, the Apache Hadoop framework, HDFS, MapReduce, Hive, Flume, Sqoop, Oozie, Spark, Scala, Spark Streaming, Kafka, and HBase
  • Minimum of 5 years of experience in Big Data and batch/real-time ingestion and analytical solutions leveraging transformational technologies
  • Good understanding of Lambda architecture
  • Implements security and recovery tools and techniques as required
  • Ensures all automated processes preserve data by managing the alignment of data availability and integration processes
  • Exposure to Microsoft Azure administration with Cloudera, Cloudera Director, Kerberization, and Sentry
  • Exposure to Cloudera Manager, Cloudera Navigator, Hue, and shell scripting
  • Exposure to Microsoft Azure cloud migration from on premises to the cloud
  • Kafka and Flume administration
  • Exposure to automated deployment, monitoring, and DevOps adoption
  • Exposure to ADLS, network segmentation, Azure and on-premises network components, Azure resource/cost optimization, and performance optimization
  • Elasticsearch and Elastic Cloud experience would be an added advantage
  • Automated workflow migration would be an added advantage
  • Integrating Hadoop with Oracle and NoSQL databases (a sketch follows this list)
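As a minimal sketch of the Hadoop-to-Oracle integration item above, the Scala snippet below uses Spark's JDBC data source (one alternative to a Sqoop import) to pull an Oracle table and write it into a partitioned decomposed layer registered in Hive; the connection URL, credentials, table, partition column, and paths are hypothetical placeholders.

```scala
import org.apache.spark.sql.SparkSession

object OracleToDecomposedLayer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("oracle-to-hdfs-decomposed-layer")
      .enableHiveSupport()
      .getOrCreate()

    // Read a (hypothetical) Oracle table over JDBC; the Oracle JDBC driver
    // must be available on the Spark classpath (e.g. via --jars).
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1")  // placeholder connection URL
      .option("dbtable", "SALES.ORDERS")                               // placeholder schema.table
      .option("user", sys.env.getOrElse("ORACLE_USER", "etl_user"))
      .option("password", sys.env.getOrElse("ORACLE_PASSWORD", ""))
      .option("fetchsize", "10000")
      .load()

    // Write to the decomposed layer as partitioned Parquet and register it as an external Hive table.
    orders.write
      .mode("overwrite")
      .format("parquet")
      .partitionBy("ORDER_DATE")                                       // placeholder partition column
      .option("path", "hdfs:///data/decomposed/sales/orders")          // placeholder decomposed-layer path
      .saveAsTable("decomposed.orders")                                // placeholder Hive database.table
  }
}
```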
Critical Competencies
  • Technology doer / full stack
  • Drives efficiency and leadership in cutting-edge technologies
  • Technical capability development
  • Transparency and collaboration
  • Self-driven and self-learning
  • Problem solving and decision making
  • End-to-end solution delivery
  • Appetite to develop on multiple platforms
Preferred Qualifications
  • Cloud certifications in one or more of the following: AWS, Azure, GCP
  • Middleware certifications: IBM WebSphere, IBM MQ, Tomcat, Apache web server
  • DevOps tools certifications: Jenkins, Chef, Puppet
  • Five or more years of experience with WebSphere, Resin, or other Java application server platforms, as well as MQ and DataPower (or another XML gateway)
  • Experience using Chef to develop, code, test, and debug new cookbooks and recipes or enhancements to existing software
  • Certified Scrum Developer (CSD)
  • Hands-on experience with Big Data applications and working experience with five or more of: Hadoop, MapReduce, HDFS, YARN, Cloudera, Ambari, Hortonworks, Accumulo, Spark, Spark Streaming, Kafka, Cassandra, Elasticsearch
  • Experience with Hadoop architecture, administration, and database management
  • Experience maintaining Hadoop clusters
  • Experience sizing and scaling clusters, adding and removing nodes, provisioning resources for jobs, and job maintenance and scheduling
  • Strong understanding of JVM debugging, management, and tuning
  • Operational experience with Hadoop technologies including Ambari, Ranger, Atlas, Knox, NiFi, Kafka, Storm, Hive, Pig, MapReduce, HDFS, HBase, Accumulo, and Spark
  • Experience with relational databases (preferably Microsoft SQL Server) and data warehouse concepts
  • Experience with MDM, InfoSphere/DataStage/Informatica in a Hadoop environment
  • Familiarity with Java/Scala programming
Soft Skills
  • Relationship building: Proven track record of building deep technical relationships with senior IT executives in large or highly strategic accounts, and experience managing various stakeholder relationships to reach consensus on solutions/projects (required)
  • Problem solving: Ability to solve customer problems through cloud technologies (required)
  • Collaboration and communication: Acknowledged for driving decisions collaboratively, resolving conflicts, and ensuring follow-through, with exceptional verbal and written communication skills; ability to orchestrate, lead, and influence virtual teams, ensuring successful implementation of customer projects
  • Presentation skills with a high degree of comfort with both large and small audiences (senior executives, IT management, database administrators, and data scientists) (required)
  • Enterprise-scale technical experience with cloud and hybrid infrastructures, architecture designs, database migrations, and technology management (required)
  • Expertise in data estate workloads such as HDInsight, Hadoop, Cloudera, Spark, Python, Kafka, and Java (required)
  • Technical aptitude and the ability to learn new technologies and understand relevant cloud trends (required)
  • Competitive landscape: Knowledge of cloud development platforms (preferred)
  • Partners: Understanding of partner ecosystems and the ability to leverage partner solutions to solve customer needs (preferred)
Education
Bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field preferred
For Recruiters

Key Experience to Evaluate:

o 5+ years of success in consultative/complex technical sales and deployment projects, architecture, design, implementation, and/or support of highly distributed applications (required)
o 3+ years of experience working with large, complex enterprise accounts, architecting and administering cloud solutions for data estate workloads
o 3+ years of Azure Cloud experience
o 5+ years of experience in data engineering: ingesting, transforming, and wrangling data; hands-on with Spark and Kafka
o 3 to 5 years of Hadoop administration experience, preferably using Cloudera
o 3+ years of experience on Linux, preferably RedHat/SUSE
o 2+ years of experience creating MapReduce or Spark jobs and ETL jobs in Hadoop, preferably using Cloudera
o 3+ years of experience on Azure IaaS for big data solutions

Keyskills:

Hadoop, Hive, Pig, Oozie, Flume, Sqoop, HDFS, MapReduce, Cloudera, YARN, Cloudera Director, Cloudera Manager, Sentry, Kerberization, Microsoft Azure Administration

Contact Details:

Recruiter Name: SARMISTA
Contact Company: Xebia IT Architects India Pvt Ltd
Reference Id: BigData Architect
