Synechron, a recognized leader and expert in building business value for global financial services, is looking for a Big Data Platform Engineer in Toronto, ON.
Please find the job description below; if you are interested, please reply with an updated copy of your resume.
Platform Engineer – essentially a data engineer with experience managing Hadoop clusters
- Good understanding of Linux
- Experience with automation tools such as Jenkins or Ansible
- Strong knowledge of Java or Scala
- Experience with running workloads on clusters
- Experience managing Hadoop or Spark clusters
- Experience with Kubernetes and/or other cloud environments – nice to have
- Experience with Python/shell scripting is a plus
- Experience working with Big Data tools and building high-performance, high-throughput, distributed data pipelines and big data platforms with Hadoop, Spark, Kafka, Hive, and Presto
- Experience building tools to diagnose and fix complex distributed systems handling petabytes of data, and driving opportunities to automate infrastructure, deployments, and observability of data services
- Experience testing, monitoring, administering, optimizing, and operating multiple Hadoop/Spark clusters across cloud providers (GCP) and on-premise data centers, primarily in Python, Java, and Scala
Synechron is a recognized leader and expert in building business value for global financial services and Fortune 500 companies. With offices in the USA, Canada, the UK, the Netherlands, the UAE, India, Singapore, Hong Kong, and Japan, we provide strategy, architecture, BPM, design solutions, and professional services for the implementation of enterprise-level data warehouses, data delivery, and transactional systems. Our staff of 8,000+ employees includes industry-recognized and published experts in Enterprise Architecture, Information Management, Data Warehousing, Integration Architecture, Web Services, and Business Intelligence.