Data Engineer

Hoodie Analytics - Remote within North America

Hoodie Analytics supports cannabis brands, consumers, and retailers with a comprehensive
suite of analytics, apps, and data-enabled services that provide marketplace intelligence on
sales, distribution, price, and promotion.

We are hiring a full-time engineer dedicated to supporting our efforts in source-system
integration and data enrichment. The ideal candidate will collaborate with the team to identify
new and novel engineering approaches that improve the quality and speed of work within our
data warehouse, leveraging big data solutions, integrations, and self-service tooling.

About the Company

At Hoodie Analytics, data is the core of our business: we are the leader in Cannabis Product Assortment & Pricing Intelligence.

To achieve our goal, we leverage the latest technologies available to build and deliver the best data in the industry. We have already met our goal of having the largest database of cannabis activity, but it must also be the best in terms of quality and freshness.

With terabytes of data collected, a team of 30+ employees, and very rapid growth, the data is there; we need someone to help us deliver it even faster.


What You'll Do

  • Data modeling within a data warehouse
  • Data cleansing using SQL, Python, and Bash scripts
  • Propose and implement data cleansing/enrichment strategies
  • Implement and monitor a data quality framework to ensure our customers receive the best possible data
  • Build new (ELT, ETL) data pipelines (Python, Bash, SQL, Prefect)
  • Maintain high-quality code using GitHub and CI/CD with GitHub Actions
  • Work within AWS and Snowflake to maintain and improve the data.


Qualifications

  • Experience with ETL
    • Comfortable with the trade-offs between ELT and ETL
  • Experience working with data-centric/big data/data-warehousing projects
  • Proficient with Git and GitHub (including Actions), basic Linux, and Bash scripting
  • Expert in Python for data engineering, including dbt
  • Expert in SQL. Experience in >1 dialect is a plus, ideally Snowflake and Postgres.
  • Familiar with data observability, data testing, CI/CD, DataOps, and DevOps
  • Experience using Prefect for orchestration (Airflow experience can be useful)
  • Well-versed in AWS: Lambda, SQS, EC2, S3, ECR, Batch, Glue, IAM
  • Proficient in using APIs, both REST and GraphQL
  • Experience using Docker as part of data pipeline architecture

If you don’t meet all of these qualifications but you are still sure you’d be a great addition to our team, contact us anyway and let us know why!

Nice to haves:

  • Familiarity with JavaScript and Node.js
  • Basic knowledge and/or experience with ML
  • Experience and passion for generative AI
  • Experience with any of the following technologies:
    • Tableau Server
    • CloudFlare
    • Postgres
    • Algolia

What's In It for You

  • Competitive base salary, healthcare, PTO, stock options
  • Work with other high-caliber engineers
  • Working on really hard, unsolved problems in Big Data and Machine Learning and using cutting-edge technologies to do so
  • Flexible working hours – as long as high-quality work gets done
  • Easy-going, infrequent office visits to downtown Chicago (near the trains!) if you are in the area; otherwise, it’s purely remote
  • Guiding the direction and architecture of the product from an early stage
  • Building the business as a partner, not an employee


We start with people who are genuinely nice to be around, then make sure they are super smart and motivated. Lastly, we have a no-jerk policy with zero tolerance: no one likes working at a company with a difficult co-worker or boss.

When you focus on the right people, suddenly it’s possible to attack that big idea, crush extremely hard problems, and excite your customer base.

Our philosophy, in a nutshell, is “People, Product, Profit”.  Great companies are created in that order.

If you’re interested in working with us please fill out the form on this page.
