Data Engineer III

Posted on Indeed on May 24, 2022
Job Profile Summary
Applies data extraction, transformation and loading techniques in order to connect large data sets from a variety of sources. Creates data collection frameworks for structured and unstructured data. Develops and maintains infrastructure systems (e.g., data warehouses, data lakes) including data access APIs. Prepares and manipulates data using Hadoop or equivalent MapReduce platform. Responsible for adhering to company security policies and procedures and any other relevant policies and standards as directed.
Career Level Summary
Requires in-depth conceptual and practical knowledge in own job discipline and basic knowledge of related job disciplines.
Solves complex problems.
Works independently, receives minimal guidance.
May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.
Acts as a resource for colleagues with less experience.
Level at which career may stabilize for many years or until retirement.
Critical Competencies
Serves as an expert in the efficient and effective gathering and organization of data.
Experienced in cloud migration and in building cloud data pipelines with Python, pushing data to BigQuery.
Utilizes a strategic approach to plan for the use of complex data to create synergies across teams.
Selects appropriate analytical models and tools to analyze big data sets, develop business recommendations and support decision-making.
Develops and shares expert knowledge of the customer's organization structure (e.g., geographies, business units), operations, and business processes, and how they support the customer's strategic objectives, in order to identify customer needs.
Identifies the customer's key decision makers and leverages an understanding of their unique perspectives and priorities when building relationships.
Maintains thorough knowledge of the customer's industry, including key market and economic factors impacting business performance, as well as the competitive landscape, in order to assist in creating effective customer solutions.
Demonstrates an understanding of complex offerings and tailors solutions to meet the unique needs of the customer.
Demonstrates advanced understanding of products, technologies, offerings, etc. and coaches other colleagues in building their knowledge.
Leverages knowledge of competitors’ products and services as well as their strengths and weaknesses to compare choices.
Key Responsibilities
Other incidental tasks related to the job, as necessary.
Build pipelines that read data from sources such as PostgreSQL and Oracle into Google Cloud using Airflow and BigQuery
Build complex ETL code
Build complex queries across databases such as MongoDB, Oracle, SQL Server, MariaDB, and MySQL
Work on data and analytics tools in the cloud
Develop code in Python, Scala, and R
Work with technologies such as Spark, Hadoop, Kafka, etc.
Build complex Data Engineering workflows
Work with team to solve a variety of problems using machine learning techniques, and implement cloud-based solutions for customers
Create complex data solutions and build data pipelines
Establish credibility and build impactful relationships with our customers to enable them to be cloud advocates
Capture and share industry best practices amongst the community
Attend industry events and present valuable information
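The pipeline responsibilities above follow the classic extract-transform-load pattern. As a minimal illustrative sketch (not this role's actual codebase), the example below uses Python's built-in sqlite3 module as a stand-in for a PostgreSQL/Oracle source and a plain list as a stand-in for the BigQuery load step; the table, column names, and business rule are all hypothetical.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from the source database (sqlite3 here,
    standing in for a PostgreSQL/Oracle source)."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def transform(rows):
    """Apply a simple, hypothetical business rule:
    drop non-positive amounts and add 8% tax."""
    return [{"id": r[0], "amount": round(r[1] * 1.08, 2)}
            for r in rows if r[1] > 0]

def load(records, sink):
    """Append transformed records to the sink (a list here,
    standing in for a BigQuery insert call)."""
    sink.extend(records)
    return len(records)

# Demo with an in-memory source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 100.0), (2, -5.0), (3, 20.0)])
sink = []
loaded = load(transform(extract(conn)), sink)
print(loaded)  # 2
```

In a production version of this pattern, extract would typically run through an Airflow connection to the source database, load would call the BigQuery client library, and Airflow would wire the three stages into a scheduled DAG with one task per stage.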
Knowledge
Basic conceptual knowledge of cloud data platform services and solutions.
Knowledge of building ETL/ELT data pipelines in the cloud: participates in code reviews, plays an active role in building cloud data solutions across the Hadoop ecosystem, RDBMS, and DW/DM, and learns from deep architectural discussions.
Skills
Devises new methods and procedures for collecting data; performs complex data analyses and presents findings on the underlying principles, reasons or facts.
Utilizes knowledge and experience to perform data and database management responsibilities on multiple systems and to assist with policy and procedure development.
Demonstrates proper techniques for preparing and filing complex regulatory correspondence in an organized and concise format, sending documents to regulatory agencies as requested.
Education
Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field required
Certifications
Cloud certifications such as GCP Professional Data Engineer or Microsoft Data / AI certifications.
Experience
5–7 years of experience in the field of the role required.
Physical Demands
General office environment: no special physical demands required. May require long periods of sitting and viewing a computer monitor. Schedule flexibility to include working weekends and/or evenings and holidays as required by the business for 24/7 operations. Must be able to lift 50 lbs overhead.
Travel
Occasional domestic/international travel, less than 50%
Disclaimer
The above information has been designed to indicate the general nature and level of work performed by employees in this classification. It is not designed to contain or to be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of the employee assigned to this job.
The following information is required by the Colorado Equal Pay Transparency Act and applies only to individuals working in the state of Colorado. The anticipated starting pay range of Colorado applicants for this role is $81,600 – $105,800. Actual compensation is influenced by a wide array of factors including but not limited to skill set, level of experience, licenses and certifications, and specific work location. Information on benefits offered is here.

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
