Lucas isn't taking new orders for this project right now.
You will get web scraping and data mining of a website
Project details
You will get all the data you need from the website you choose. I take a responsible approach: I only scrape public information, never private data. I have been doing web scraping for 5 years, working mainly with Scrapy, Puppeteer, Playwright, and Selenium, and I can also send you the code if you want to run it on your own.
Please contact me before purchasing the project, so I can see the website before anything else.
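To give a sense of what a deliverable can look like, here is a minimal illustrative Scrapy spider. It targets quotes.toscrape.com, a public sandbox site for scraping practice, and the field names are placeholders; the real spider would be written for the website you choose.

```python
# Illustrative sketch only: selectors and field names are placeholders,
# and the target is the public practice site quotes.toscrape.com.
import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]

    def parse(self, response):
        # Yield one item per quote block on the page.
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow pagination links until the last page.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse)
```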
Data Tool: Scrapy
What's included
| Service Tiers | Starter | Standard | Advanced |
|---|---|---|---|
| Price | $50 | $100 | $200 |
| Delivery Time | 2 days | 3 days | 3 days |
| Number of Revisions | 0 | 0 | 0 |
Optional add-ons
Fast Delivery: +$10 - $40
30 reviews (rating breakdown: 28, 2, 0, 0, 0)
This project doesn't have any reviews yet. The reviews below are from Lucas's other contracts.
Christopher G.
May 27, 2024
Website Scraper Feature
Alain H.
Feb 10, 2023
Import data from a CSV to publish news posts in WordPress (API + CSV)
Alain H.
Dec 8, 2022
Integrate ERP product information into WooCommerce
I have been consistently impressed by this developer's ability to deliver complex projects on time and on budget.
Vic B.
Feb 22, 2022
Senior Python Flask PostgreSQL developer needed
John B.
Apr 20, 2021
Data Processing + Developer + AWS
Lucas has done a fantastic job for us. Really really professional, easy to work with and easy to communicate with.
We hope to hire Lucas again in the near future.
Thank you for everything Lucas, good job!
About Lucas
Data Engineer | Software Developer | NLP | LLM | RAG
100% Job Success
Goiânia, Brazil
I develop innovative solutions with SQL and leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Functions, and Airflow to design efficient data pipelines. I also have brief experience with Spark and Dataflow, which I used while studying for the Google Cloud Certified Professional Data Engineer exam.
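For illustration only, a pipeline of this kind might look like the following minimal Airflow DAG, which loads daily CSV files from a placeholder GCS bucket into a placeholder BigQuery table. This is a sketch, not actual client code; all names are hypothetical.

```python
# Illustrative sketch only: bucket, dataset, and table names are placeholders.
# Assumes Airflow 2.4+ with the Google provider package installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="load_marketing_events",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="gcs_to_bigquery",
        bucket="example-landing-bucket",           # placeholder bucket
        source_objects=["events/{{ ds }}/*.csv"],  # templated by execution date
        destination_project_dataset_table="example-project.analytics.events",
        source_format="CSV",
        autodetect=True,                           # infer schema from the files
        write_disposition="WRITE_APPEND",
    )
```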
✓ Delivered high-quality software solutions for international clients, optimizing marketing campaigns and
leveraging customer data.
✓ Integrated and synchronized data across systems, ensuring seamless accessibility and data integrity.
✓ Developed custom monitors and validation functions within BigQuery for data quality assurance.
✓ Created datamarts in BigQuery for streamlined marketing team analytics and reporting.
✓ Collaborated with cross-functional teams to understand requirements and deliver effective solutions on time.
✓ Effectively communicated and worked with teams across the globe using English.
Google Cloud Certified Professional Data Engineer
Electrical Engineering graduate and currently pursuing a Master's degree.
Tech Skills:
- Python, R
- Flask, FastAPI
- Google Cloud Platform (BigQuery, Cloud Functions, GCS, Composer, Airflow, Logging)
- AWS (Lambda, S3, EC2)
- PostgreSQL
- LLMs, LangChain, RAG
- Docker
Steps for completing your project
After purchasing the project, send requirements so Lucas can start the project.
Delivery time starts when Lucas receives requirements from you.
Lucas works on your project following the steps below.
Revisions may occur after the delivery date.
1. Create the script to scrape the website.
2. Run the script. This can take some time, depending on the website.
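For example, a delivered Scrapy spider like the sketch in the project description could be run with Scrapy's CrawlerProcess. The file names here are hypothetical, and the crawl duration depends on the size of the target website.

```python
# Hypothetical runner: executes the spider sketched above and writes the
# scraped items to quotes.json. Crawl time depends on the target website.
from scrapy.crawler import CrawlerProcess

from quotes_spider import QuotesSpider  # assumes the sketch was saved as quotes_spider.py

process = CrawlerProcess(settings={
    "FEEDS": {"quotes.json": {"format": "json"}},  # write results as JSON
    "ROBOTSTXT_OBEY": True,  # only fetch pages the site allows
})
process.crawl(QuotesSpider)
process.start()  # blocks until the crawl finishes
```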