Big Data Engineer

Applications have closed.

Job Title: Big Data Engineer

Location: Charlotte, NC (uptown) or Chandler, AZ (3 days onsite, 2 days remote)

Duration: 12+ Months Contract

++Must Haves:++

  • Strong experience with big data platforms such as MapR, Hortonworks, and Cloudera Data Platform
  • Hands-on expertise with data virtualization tools such as Dremio, JupyterHub, and AtScale
  • Proficiency in deploying and managing tools in cloud and containerized environments (CDP, OCP)
  • Solid understanding of platform engineering, automation scripting, and DevOps practices
  • Proven ability to troubleshoot complex issues and perform root cause analysis
  • Experience in leading technical efforts
  • Enterprise-level experience

++Desired Skills:++

  • Certifications in Cloudera, OpenShift, or related technologies
  • Experience with enterprise-level data lake architectures and governance

++Day to Day:++

  • Administer and support tools on the Data private cloud, including CDP, HWX, and MapR
  • Install, configure, and maintain data analytical and virtualization tools such as Dremio, JupyterHub, and AtScale across multiple clusters
  • Develop proof-of-concept solutions leveraging CDP and OCP technologies
  • Deploy tools and troubleshoot issues, perform root cause analysis, and remediate vulnerabilities
  • Act as a technical subject matter expert supporting programming staff during development, testing, and implementation phases
  • Develop automation scripts for configuration and maintenance of data virtualization tools
  • Lead complex platform design, coding, and testing efforts
  • Drive advanced modeling, simulation, and analysis initiatives
  • Maintain comprehensive documentation of Hadoop cluster configurations, processes, and procedures
  • Generate reports on cluster usage, performance metrics, and capacity utilization
  • Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide necessary support
  • Collaborate with IT infrastructure teams to integrate Dremio and Hadoop clusters with existing systems and services