Java Big Data Developer
W2 Contract / C2C
Job Description: Big Data Engineer with GCP experience.
- Design, build, and operationalize large-scale enterprise data solutions.
- Hands-on experience analyzing and re-platforming on-premises data warehouses onto GCP using native and third-party services.
- Experience using Cloud Spanner for relational data, Bigtable for storing vast volumes of key-value pairs, and BigQuery for interactive data analysis.
- Experience using Google Cloud Dataflow in conjunction with Pub/Sub or Kafka to process and analyze real-time streaming data.
- Expertise in designing and building data pipelines, from ingestion to consumption, within a hybrid big data architecture.
- Experience integrating data from multiple data sources.
- Proven ability to work effectively in a fast-paced, interdisciplinary, and deadline-driven environment.
- Strong problem-solving and troubleshooting skills.
- Big data experience with Apache Spark.
- Should have a solid understanding of how to build an end-to-end data pipeline in the cloud. NOTE: Candidates who have not yet had the chance to build a pipeline in the cloud, but understand how to do so, are acceptable.
- Google Cloud experience. NOTE: Candidates who are strong in AWS and willing to quickly learn how GCP works before the interview are acceptable.
- Previous experience working with Informatica is an added advantage.
- Please prioritize candidates who have moved from ETL into the big data space.
Job Type: Full-time
Pay: Up to $65.00 per hour
Benefits:
- Dental insurance
- Health insurance
- Paid time off
- Vision insurance
Schedule:
- 8 hour shift
- Day shift
- Monday to Friday
Work Remotely: Temporarily, due to COVID-19