Lawson T.
Dorridge, United Kingdom
100% Job Success
Top Rated Plus

Experienced Full Stack Data Engineer

I am an experienced full stack data and backend engineer. My background and skills include:

- Expert in Python, SQL and Node.js
- Certified AWS Cloud Practitioner, currently studying for the Certified Developer, Solutions Architect, SysOps Administrator and Data Analytics exams
- Two MScs, in Mathematics and in Data Science
- Databases (Postgres, PostGIS, Aurora, DynamoDB, MySQL, MongoDB, Redis, Snowflake, Redshift, Neo4j) and ORMs
- Cloud infrastructure (AWS, GCP, Terraform)
- APIs (FastAPI, Flask, Express, GraphQL)
- Orchestration (Airflow, Kubernetes)
- Pub/sub and queuing (Kafka, RabbitMQ)
- Version control (Git)
- CI/CD (Docker, GitHub Actions, Jenkins)
- GIS (PostGIS, Shapely, GDAL, H3)
- Web crawling (Scrapy, custom crawlers in Python/Node/Go)
- PySpark / AWS EMR / AWS Batch

I have worked in data and tech for 4 years, including full-time roles as a Data Scientist, Data Engineer, Backend Engineer and Head of Data. In a previous life I worked in investment banking in Sales and Trading for 4 years.

Previous projects include:

- Designed and built a booking engine for a client to let them expand into the reservation and beauty treatment business (GCP, Python, Postgres, Redis, FastAPI).
- Helped a bottling company expand its tech capabilities by moving it from spreadsheets into the cloud and developing APIs to automate its business with partners (Python, Postgres, AWS, FastAPI).
- Refactored an employer's crawling system to make it more efficient. Previously each data entity was crawled at a fixed frequency, but the vast majority of entities rarely changed, leading to unnecessary crawling and resource use. The refactor used each entity's observed change history to predict the next optimal time to crawl it, resulting in a 15% reduction in cloud costs (AWS, Postgres, Node.js, Python, RabbitMQ). See the sketch after this list.
- Developed a streaming change data capture pipeline handling 50 million unique payloads per day, cutting total cloud costs by 15% while making live data available to customers (AWS, Python, Node.js, RabbitMQ, Postgres, Snowflake, Airflow).
- Created a data architecture that aggregates geospatial time series for any geographic polygon across 20bn+ global data points in sub-second time (AWS, ClickHouse, Python, FastAPI, Postgres, Uber H3, Redis).
- Created an autonomous on-demand Excel and PDF reporting system that lets a sales team generate their own reports from data stores with no input required from developers (AWS, Python).
- Developed multiple machine learning models running in production, including an age/gender classifier for faces in photos (Python, Keras, GCP) and an entity resolution system combining tabular, text and image embeddings to deduplicate 30mm+ listings across multiple provider platforms (AWS, Python, PyTorch, RabbitMQ, Neo4j, Postgres).
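To give a flavour of the crawl-scheduling idea mentioned above, here is a minimal Python sketch. It is illustrative only, not the production system: the `CrawlTarget` class, the averaging rule and the interval bounds are assumptions made for the example, while the actual refactor was built on AWS, Postgres, Node.js, Python and RabbitMQ.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List


@dataclass
class CrawlTarget:
    """Tracks the observed change history for a single crawled entity."""
    url: str
    change_timestamps: List[datetime] = field(default_factory=list)

    def record_crawl(self, crawled_at: datetime, content_changed: bool) -> None:
        # Only changes matter for scheduling; unchanged crawls add no signal here.
        if content_changed:
            self.change_timestamps.append(crawled_at)

    def next_crawl_time(
        self,
        now: datetime,
        min_interval: timedelta = timedelta(hours=1),
        max_interval: timedelta = timedelta(days=30),
    ) -> datetime:
        """Schedule the next crawl from the average gap between observed changes,
        clamped so rarely-changing entities are not polled constantly and
        frequently-changing ones are not ignored."""
        if len(self.change_timestamps) < 2:
            # Not enough history yet: fall back to the slowest cadence.
            return now + max_interval
        gaps = [
            later - earlier
            for earlier, later in zip(self.change_timestamps, self.change_timestamps[1:])
        ]
        avg_gap = sum(gaps, timedelta()) / len(gaps)
        interval = min(max(avg_gap, min_interval), max_interval)
        return now + interval
```

In this simplified form, entities that almost never change drift towards the maximum interval, which is where the reduction in unnecessary crawling and cloud spend comes from.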


Skills

  • SQL
  • Python
  • Apache Airflow
  • Docker
  • PySpark
  • Snowflake
  • Node.js
  • PostgreSQL
  • RESTful API
  • Machine Learning
  • Flask
  • RabbitMQ
  • Terraform