Lenin M.

Gurgaon, India

Modern Data Stack Engineer

Hi there, I am Lenin Mishra. I have 5 years of experience building modern data stack pipelines. Currently, I work at an India-based startup, building analytics for various SaaS tools.

Today, the majority of companies consume data and build analytics using the modern data stack:

1. A fully managed ELT data pipeline such as Airbyte, Stitch, or Fivetran.
2. A columnar data warehouse such as Redshift, Snowflake, or BigQuery to store the data.
3. A data transformation tool such as dbt.
4. A BI tool such as Tableau, Looker, or another data visualization platform.

I also have a keen interest in data governance, test-driven development, and building analytics logic for different SaaS applications. If you need someone to help you in any of these areas, do hire me!
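To make the transformation layer (point 3) concrete, here is a minimal sketch of a dbt Python model, supported in dbt-core 1.3+ on warehouses such as Snowflake. The model name, the upstream model "stg_orders", and the column names are hypothetical, not from a real project:

# models/completed_orders.py -- minimal, hypothetical dbt Python model.
def model(dbt, session):
    # Materialize the result as a table in the warehouse.
    dbt.config(materialized="table")

    # Reference an upstream staging model ("stg_orders" is an assumption).
    orders = dbt.ref("stg_orders")

    # On Snowflake, `orders` is a Snowpark DataFrame; keep only
    # completed orders (the "status" column is illustrative).
    return orders.filter(orders["status"] == "completed")

An equivalent SQL model is just as common; the point is that dbt lets you version, test, and document these transformations inside the warehouse.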

Portfolio

Example of an Airbyte Connector Development Proposal

Objective: Build an Airbyte connector that reads data from an HTTP API and stores the flattened JSON data in any database or data warehouse.

Estimated time of delivery: 8 to 10 working days (64 to 80 hours)

Deliverables
1. A working endpoint that allows the client to consume HTTP API data with multiple sync modes.
2. A Git repository built on the Python CDK for future development and improvement.
3. A staging schema with flattened JSON data.

Sprint-based timeline estimates
Sprint 1: POC of an HTTP-based connector built with Airbyte, storing data from the API in Snowflake (24 to 30 hours)
Sprint 2: Features such as integration tests, multiple sync modes, and paginated data reading (24 to 30 hours)
Sprint 3: Deploying the connector and building the staging layer in Snowflake (16 to 20 hours)

Additional offer: Build relevant BI reports for the API on a platform of choice such as Tableau, Power BI, Looker, or a homegrown tool (40 hours)

Project progress and review
Regular progress and review meetings shall be conducted with the client, based on the client's requirements. If those discussions change the scope of the ongoing project, the client shall be informed at the time, and the timeline shall be adjusted to the new requirements.

What happens if the project is delayed from my side?
The estimates above are based on my experience building the Mailchimp connector for Airbyte; actual delivery times may vary. In case of any delay, the client shall be made aware of the delay and the reasons for it. For any such delay on my side, irrespective of extra hours spent, the client will be charged for at most 10 working days.
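To illustrate the kind of code Sprint 1 produces, below is a minimal sketch of an HTTP stream built on the Airbyte Python CDK's HttpStream base class. The example API (api.example.com), the "campaigns" endpoint, and the offset-based pagination are assumptions for illustration, not any specific client API:

# Minimal sketch of an Airbyte HTTP stream using the Python CDK.
from typing import Any, Iterable, Mapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class Campaigns(HttpStream):
    url_base = "https://api.example.com/v1/"  # hypothetical API
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "campaigns"

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # Assume the API returns `next_offset`, or null on the last page.
        offset = response.json().get("next_offset")
        return {"offset": offset} if offset is not None else None

    def request_params(
        self, next_page_token: Optional[Mapping[str, Any]] = None, **kwargs
    ) -> Mapping[str, Any]:
        # Pass the pagination token back to the API as query parameters.
        return dict(next_page_token or {})

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping[str, Any]]:
        # One record per element of the (assumed) `data` array.
        yield from response.json().get("data", [])

In a complete connector, streams like this are registered in a Source class, and incremental sync modes are added by declaring a cursor field on the stream; that is the Sprint 2 work in the plan above.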
Airbyte Connector Development
Postgresql + DBT + Django at Eleena BV

3 Total Jobs
5 Total Hours
Available more than 30 hrs/week

Testimonials

Endorsements from past clients

"We hired Lenin to help our company adopt dbt and dbt cloud. He handled setting up the project and building out the initial datasets as well as the documentation for it. He also went above and beyond and gave our team training on the technology and what he had built. Overall, he was great to work with and I highly recommend him for your dbt and Data Engineering needs."

Nick H. | Director of Engineering
Dec 2022 | Verified