Google Cloud / Big Data Engineer - 100% Remote!

Posted 2025-03-14
Remote, USA | Full-time | Immediate Start

POSITION: Google Cloud / Big Data Engineer - 100% Remote!
PAY RATE: $65.00 - $70.00 per hour W2 (~$135k - $145k per year)
LOCATION: 100% "Forever" Remote (client is based in Phoenix, AZ)
DURATION: 12 months + extension (opportunity to convert to permanent)
MISC: W2 applicants only, please (we cannot sponsor work visas)

SUMMARY:
A Fortune 500 client in the Financial Services/FinTech sector is looking to hire a Big Data Engineer with strong GCP skills to join their team! This team develops customer-centric applications using emerging technology stacks that support a critical event-driven microservices platform, which is moving exclusively to Google Cloud (GCP).

DESIRED SKILLS & EXPERIENCE:
- 4-5+ years of Big Data ecosystem and software engineering experience.
- 3-5+ years' experience with Google Cloud Platform (GCP) and its services.
- 3+ years' experience with GCP BigQuery, GCP Compute Engine, and Google Cloud Dataproc.
- 2-3+ years of hands-on experience with Hadoop, MapReduce, Hive, and Spark (core, SQL, and PySpark).
- Hands-on experience writing complex SQL (e.g., in Hive or with PySpark DataFrames, optimizing joins while processing huge amounts of data).
- Ability to design and develop optimized data pipelines for batch and real-time data processing.

NICE-TO-HAVE (PLUS):
- Experience with Kafka streams or queues.
- Experience with GitHub and leveraging CI/CD pipelines.
- Experience with NoSQL databases, e.g., HBase, Couchbase, MongoDB.
- Experience with data visualization tools such as Tableau, Sisense, or Looker.
- UNIX shell scripting skills.
