Posted on Indeed on Sep 12, 2021

Data Operations Engineer

Company Summary

Founded in 2020, BlockScience Labs is a platform that connects the tools essential to scientific best practices, enabling teams to collaboratively design, analyze, and operate complex multi-stakeholder ecosystems: sound mechanism design, complexity management, and the integration of data science workflows into cyber-physical systems.

Job Summary

BlockScience is looking for a Data Operations Engineer who is excited to join an early-stage startup. The Data Operations Engineer is a hybrid role blending the responsibilities of a DevOps Engineer, a Data Engineer, and a Data Analyst. This role will own release management across multiple environments. The Data Operations Engineer will also be responsible for the design, development, and maintenance of our analytical database environment and for building reports using Elasticsearch and Kibana.

What we are looking for:

  • Develops and maintains continuous integration / continuous delivery (CI/CD) pipelines across several environments.
  • Defines and develops the development, test, release, update, and support processes for DevOps operations.
  • Participates in the change control process to assess and communicate the impact of proposed changes to the data architecture.
  • Designs, builds, and maintains the analytics database and reports that enable self-service capabilities for business users.
  • Ensures the quality, completeness, security, privacy, and integrity of data throughout the data lifecycle, from the transactional database through downstream analytical data repositories and reports.
  • Defines and implements data quality controls, workflows, and problem escalation and correction procedures.
  • Performs tests and data validation within software systems and for new or changing source data and identifies opportunities for automation.
  • Delivers data quality by continuously identifying areas for process and systems improvement and providing solution recommendations.
  • Provides training to support groups and oversees / authorizes end-user access to create or update data, playing a key role in adoption of standards.
  • Leverages industry best practices in data transformation and processing techniques, tools, and coding languages, data models, query optimizations and analytics.
  • Performs capacity planning checks and archives data when necessary.
  • Manages master data.
  • Manages relationships with internal stakeholders and presents to senior management when required.
  • Documents data operations and monitoring solutions in clear, concise guides.
  • Develops data mappings, integrations and consolidation approaches, methodology, and tools.
  • Monitors and manages the data platforms and processes to ensure that they run reliably across data collection, integration, and processing.
  • Audits dashboards against production data to ensure data quality and reporting accuracy.
  • Monitors source data feeds, reviews exception reports, and creates and maintains reporting for KPIs.
  • Provides occasional ad hoc reports for internal and external parties.

What you need:

  • Experience building CI/CD pipelines.
  • Strong understanding of software development best practices.
  • Minimum of 1 year of experience in data technology/management on a big data platform.
  • Experience with Amazon Web Services (AWS), Elasticsearch, Kibana, and SQL database products, or equivalent products from other cloud services providers.
  • Ability to present complex information in an understandable and compelling manner.
  • Demonstrated experience monitoring and optimizing data architectures.
  • In-depth understanding of data management (e.g., permissions, security, and monitoring).
  • Experience writing technical documentation and how-tos for operational tasks.
  • Experience developing and managing ETL scripts in a complex, high-volume data environment.
  • Experience using IT project management software (e.g., Jira).
  • Solid knowledge of scripting languages (SQL and Python), Git and Git workflows.
  • Experience planning and implementing QA and testing for a data warehouse.
  • Advanced data analysis skills using MS Excel and other leading BI tools to import, analyze and report on data.
  • Experience working in an Agile/Scrum development process.
  • Ability to work remotely and meet deadlines.
  • Team player with great time-management, interpersonal and communication skills.

Bonus

  • Experience working with Kubernetes.
  • Previous experience in a startup environment.

BlockScience Labs provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics.

Job Type: Full-time

Pay: $70,000.00 - $90,000.00 per year

Benefits:

  • Flexible schedule
  • Paid time off

Schedule:

  • 8 hour shift
  • Monday to Friday

Education:

  • High school or equivalent (Preferred)

Experience:

  • AWS: 1 year (Preferred)
  • Python: 1 year (Preferred)
  • CI/CD: 1 year (Preferred)

Work Location:

  • Fully Remote

