Want to play a major role in developing a new SaaS offering? We’re searching for a Data Engineer to create the next generation of Intelligent Web Scrapers that seek out very specific financial data from thousands of data sources, then use Machine Learning and NLP techniques to standardize the data.

This is a key net-new hire for the organization. We need someone to take on these challenges:

  1. Develop, test and launch automated web scrapers and bots to capture and extract specific financial data from a variety of online publicly available sources
  2. Design the application to search the Internet for the thousands of new and continually emerging sources of specific financial data not yet included in our data catalogues
  3. Using AI and ML techniques, create a comprehensive data catalogue of this financial data

Why we’re a cool company

  • Full benefits package
  • Modern team collaboration tools
  • Casual, yet professional environment
  • For local candidates, largely work-from-home during the pandemic. Partial WFH afterwards.
  • Receptive to candidates across Canada
  • Flex time

Some of the Experiences & Background we’d like to see
All the things you’re good at, because you’ve done most of these before

  • 5+ years of experience in software development roles with open-source technologies in a SaaS environment
  • Backend: Python / Django – minimum 5 years
  • Frontend: Bootstrap, Sass, CSS, HTML, VueJS
  • Experience building RESTful APIs
  • Experience building web scrapers and knowledge of web-scraping techniques
  • Experience with Apache Beam pipelines, Apache Airflow, and Google Dataflow
  • ETL experience and productionizing ETL pipelines; experience with a managed ETL service such as AWS Glue is a definite advantage
  • 3+ years of experience with Artificial Intelligence and Machine Learning
  • Knowledge of productionizing ML models
  • Experience with NLP
  • SQL/PostgreSQL and data model design (logical & physical data models)
  • Experience manipulating very large data sets (5 TB and 5T records)
  • Experience with cloud platforms (AWS, DigitalOcean, or GCP) required; experience with serverless and virtual machines
  • Effective communication skills
  • Bachelor’s degree in Computer Science/Engineering or equivalent

About us
We are a 20-year-old, growing Canadian FinTech company providing consulting services to businesses obtaining government funding.

Coupled with our services, we have developed a unique supporting suite of applications providing us with an unrivaled competitive advantage. Our application provides our clients with visibility into the very complex and hidden $140B data feed of open funding opportunities.

This is a Permanent opportunity.

N E X T   S T E P S
If this sounds like you and you’re looking to build your career, we want to hear from you. Please email your resume to Rick @, Subject: Data Engineer


Keywords: anomaly detection, natural language processing, deep learning classification, Convolutional Neural Network, CNN, ConvNet, Puppeteer, Scrapy, Selenium, Beautiful Soup, Amazon SageMaker, Text Extraction, Text Classification, Apache Beam pipelines, Apache Airflow, Google Dataflow, ETL Pipelines, PyCharm, PhpStorm, CI/CD pipelines, Deploybot, Ansible, Reinforcement Learning, RL, data engineer, Data scientist, ML developer, Python developer, job, jobs, Halton, Oakville, Milton, Burlington, Hamilton, Mississauga