At Flowhub, we're about more than technology — we're on a mission to make legal cannabis accessible to everyone. Founded in 2015, Flowhub pioneered the first Metrc API integration to help dispensaries stay compliant. Today, over 1,000 dispensaries trust Flowhub's point of sale, inventory management, business intelligence, and mobile solutions to process $3B+ in cannabis sales annually. We exist to make safe cannabis products accessible to every adult on planet Earth.


Flowhub creates user-friendly business management and compliance products that increase revenue in the highly regulated cannabis industry. Our Engineering department is highly creative, incredibly resourceful, and obsesses over the user experience.


We are seeking a Data Developer specializing in big data and data integration, based in Denver. The successful candidate will work with the enterprise data teams across the organization and will be accountable for the design and development of data pipelines, processes, and integrations. The ideal candidate is a strategic thinker and team player who is always willing to learn, has a strong analytics background, and is comfortable working at a fast pace. They bring considerable experience with enterprise data implementations, data integration, data management, and advanced analytics software, along with excellent knowledge of technical concepts, metadata management, data privacy and compliance requirements (HIPAA, CCPA, GDPR), and service-oriented architecture. In addition, the candidate is comfortable prototyping and iterating quickly in a notebook-based environment to understand data structures, troubleshoot data integrity issues, and provide quick insights to stakeholders.

Top Skills

  • At least 4 years of experience working with relational datasets as a data engineer.
  • Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
  • Experience with the Python data stack is required, including but not limited to NumPy, SciPy, Pandas, scikit-learn, Jupyter, Plotly, Flask, and SQLAlchemy.
  • Experience with data pipelines and workflow management tools is required, with Airflow being a big plus.
  • Experience with a cloud technology stack is required, with GCP being a big plus.
  • Hands-on experience with BigQuery is highly preferred but not required.


Additional Skills & Qualifications:

  • Experience implementing big data solutions using GCP services.
  • Ability to recommend appropriate data processing and storage solutions on the GCP stack.
  • Ability to develop and maintain a REST or GraphQL data API using Flask.
  • Experience with stream-processing systems such as Dataflow.
  • Experience with big data processing tools such as Spark or Kafka.
