Position: Big Data Engineer
· 10-15 years of professional experience.
· Very strong Java skills, including experience building frameworks/modules.
· 5+ years of architecture experience.
· GCP data services experience (as listed below) preferred; otherwise, strong Hadoop and Java Spark experience, or experience with similar data services in AWS.
· GCP data experience: BigQuery, Looker, Dataflow, Data Fusion, Cloud Bigtable, Cloud SQL, Cloud Dataproc.
· Hands-on programming experience in Apache Spark using Java/Scala, including Spark Streaming.
· Exposure to the Google Cloud DLP API, Cloud Composer, and Apache NiFi.
· Knowledge of RDBMS and strong SQL skills, required for source data analysis/extraction/transformation.
· Google Cloud certifications preferred.
· At least 1 project experience on Google Cloud using data-related services.
· Working knowledge of JIRA, Confluence, and other Agile work-management tools.
· Ability to work with remote teams.
· Knowledge of building CI/CD pipelines for ETL executions.
· Knowledge of implementing Behave testing practices for Java ETL frameworks.
· Knowledge of ETL test automation.
· High-level skill-to-experience mapping:
o Java: 4, Big Data Architect: 3, Spark: 4, ETL: 4, Apache Beam/Dataflow: 3, GCP/AWS/Azure: 3