We are looking for an ETL Developer to work with large volumes of data drawn from scattered sources across an organization's technology infrastructure.
Languages: Scala, Spark
You bring to Applaudo the following competencies:
- 2+ years of experience with Scala and Spark
- 3+ years of experience with data delivery, ETL (extract, transform, load), and data warehouse design, analysis, and programming
- An excellent grasp of relational and dimensional data modeling
- Strong mathematical, statistical, and analytical skills
- 1+ year of Agile experience
- English proficiency, as you will be working directly with US-based clients
You will be accountable for the following responsibilities:
- Extracting data from different data sources and transferring it into a data warehouse environment.
- Designing, implementing, and maintaining transactional and analytical data storage structures.
- Designing, building, and maintaining data pipelines that consume from multiple sources and serve multiple tenants.
- Experience in one or more of the following DBMS: PostgreSQL, MySQL, Google Cloud Datastore, Azure Cosmos DB
- Experience in one or more of the following programming languages: Scala, PL/SQL, Java
- Experience with the following cloud environment: GCP
- Producing informative, expressive, and meaningful reports that support business decision-making.
- Translating reporting results into sound, consistent technical data designs.