Staff Engineer – Cloudera-Hadoop – Big Data – Federal


Job Description

About the Role

Title: Staff Engineer – Cloudera-Hadoop – Big Data – Federal – 2nd Shift

Location: Lawson Lane, Santa Clara, California, United States

Employees can work remotely

Full-time

Employee Type: Regular

Region: AMS – North America and Canada

Work Persona: Flexible or Remote


Company Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today — ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.

Job Description

Please Note: This position will include supporting our US Federal Government Cloud Infrastructure.

This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents (green card holders) will be considered.

 

As a Staff DevOps Engineer – Hadoop Admin on our Big Data Federal Team, you will help deliver 24×7 support for our Government Cloud infrastructure.

 

The Federal Big Data Team has 3 shifts that provide 24×7 production support for our Big Data Government cloud infrastructure.

 

Below are some highlights.

4-day work week (Wednesday through Saturday OR Sunday through Wednesday)

No on-call rotation

Shift Bonuses for 2nd and 3rd shifts

This is a 2nd-shift position with work hours from 3 pm to 2 am Pacific Time

The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of customer instances powered by the ServiceNow Platform, deployed across the ServiceNow cloud and the Azure cloud. Our mission is to:

Deliver state-of-the-art monitoring, analytics, and actionable business insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and Machine Learning methodologies that improve efficiencies across a variety of functions in the company — Cloud Operations, Customer Support, product usage analytics, and product upsell opportunities — enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:

Collecting, storing, and providing real-time access to large amounts of data

Providing real-time analytic tools and reporting capabilities for various functions, including:

Monitoring, alerting, and troubleshooting

Machine Learning, anomaly detection, and prediction of P1 incidents

Capacity planning

Data analytics and deriving Actionable Business Insights

What you get to do in this role

Deploy, monitor, maintain, and support Big Data infrastructure and applications in ServiceNow Cloud and Azure environments.

Architect and drive end-to-end Big Data deployment automation, from vision through delivery, covering the foundational Big Data modules (Cloudera CDP), prerequisite components, and applications, leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes across all ServiceNow environments.

Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.

Tune and troubleshoot the various Hadoop components and other data analytics tools in the environment: HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus.

Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineers (SRE), Customer Support (CS), Developers, QA, and Systems Engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.

Enforce data governance policies in Commercial and Regulated Big Data environments.

Qualifications

To be successful in this role you have:

6+ years of experience with Cloudera-Hadoop systems administration

Deep understanding of the Hadoop/Big Data ecosystem

Good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming, and experience working with systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus

Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments

Experience securing Hadoop stack with Sentry, Ranger, LDAP, Kerberos KDC

Strong Linux Systems Administration skills

Ability to write automation scripts in Bash and Python

In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting

Ability to learn quickly in a fast-paced, dynamic team environment

GCS-23

For positions in the Bay Area, we offer a base pay of $158,500 – $277,500, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs (subject to eligibility requirements). Compensation is based on the geographic location in which the role is located, and is subject to change based on work location.

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work.