Hire the best ETL Developers in Texas
Check out ETL Developers in Texas with the skills you need for your next job.
- $80 hourly
- 5.0/5
- (1 job)
I bring over 7 years of robust experience working within the Azure ecosystem, specializing in data engineering and business intelligence. Skilled at architecting scalable data solutions on Azure and designing insightful Power BI dashboards, I bridge the gap between raw data and actionable insights for businesses.
📊 4+ years of IT experience with hands-on technical skills to exploit business opportunities.
💼 3+ years of MS SQL Server experience, including the BI stack, Power BI, and Azure Data Factory.
🔍 Expertise in data management, including archiving, governance, and enabling data-driven decision-making.
🛠️ Developed transactional databases, data warehousing, and advanced data analytics solutions.
🔄 Experience in all phases of the SDLC: Planning, Analysis, Design, Development, Testing, Deployment, and Production Support.
📈 Skilled in business requirement gathering and business process modeling.
🔧 Proven expertise in building conceptual, logical, and physical data models.
Key Highlights:
- Developed data strategy roadmaps for global organizations, including data modernization projects.
- Led data governance initiatives to enhance data quality, accuracy, and accessibility.
- Extensive experience in ETL processes to automate and streamline workflows across multiple data sources.
- Proven expertise in SQL Server, Azure Data Factory, and Power BI, creating robust data pipelines and delivering actionable insights.
If you are looking for a skilled data engineer or BI expert to help elevate your data strategy, I am here to bring data to life with impactful visualizations, efficient solutions, and actionable insights.
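The conceptual-to-physical modeling work described above can be illustrated with a tiny sketch. All table names, columns, and figures here are hypothetical, not from any client project: a minimal star schema and the kind of aggregate a Power BI dashboard tile would sit on.

```python
import sqlite3

# Hypothetical physical model: one dimension table and one fact table,
# the shape a Power BI semantic model typically sits on.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_id   INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount     REAL NOT NULL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Widget"), (2, "Gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

# The aggregate a revenue-by-product dashboard tile would query.
rows = conn.execute("""
    SELECT p.product_name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.product_name
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Widget', 150.0), ('Gadget', 75.0)]
```

The same fact/dimension split is what makes downstream DAX measures and incremental loads straightforward.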
Keywords: SQL, Data Analytics, Data Engineering, Power BI, Azure Data Factory, ETL Pipelines, Data Lake, Cloud Data Warehousing (Snowflake, BigQuery), Data Modeling (Conceptual, Logical, Physical), Data Governance, Predictive Modeling, Data Visualization, Dashboard Development, Looker, Interactive Visualizations, Big Data Analytics, Machine Learning (ML), Data Mining, Data Wrangling, Data Querying, Exploratory Data Analysis (EDA), Data Cleaning, Data Presentation, Regression Analysis, Data Integration, Cloud Platforms (GCP, Azure), Real-Time Data Processing, Data Pipeline Orchestration (Airflow), Data Ingestion, Data Security and Optimization, Business Intelligence (BI), Visual Analytics, Data Styling, Color Theory in Visualization
Skills: ETL, SQL, Azure DevOps, Data Model, Real Time Stream Processing, Database Design, Data Integration, Data Visualization, Python, Data Analytics, Microsoft Azure, Microsoft Azure SQL Database, Data Analysis, Business Intelligence, Microsoft Power BI
- $49 hourly
- 5.0/5
- (3 jobs)
🏅 I'm Elijah Hamilton, an Engineer in Computer Science with 9+ years of programming experience 🏅. For the past 5 years, I've been doing web scraping and crawling for a living, and I've worked on over 150 web scraping, web crawling, and automation projects using Python as the main programming language. My software development method regarding prototyping is iterative and incremental 👌🙏. I can manage all stages of this kind of software's life cycle, from design and analysis to deployment and maintenance. You don't have to worry about proxy pools, CAPTCHA solving, anti-bot mitigation, server costs, server loads, failovers, etc. I maintain my own infrastructure dedicated to running crawlers so I can offer you a reliable data service ⭐⭐⭐.
My main skills are:
📌 Development of crawlers and spiders that navigate a website like a human. 🦾🦾🦾
📌 Web scraping using Selenium WebDriver, Requests, bs4, mechanize, regex, and my own functions and scripts. 🤖🤖🤖
📌 Reading data from RESTful APIs, JSON, XML, YAML, txt, CSV, or Excel. 📔
📌 Building outputs in CSV, Excel, txt, POST requests to your server, or storage in databases like MongoDB, MySQL, MSSQL, and PostgreSQL. 📔
📌 Building and deploying RESTful APIs.
📌 Creating bots that solve CAPTCHAs such as Google reCAPTCHA v1, v2, and v3, enterprise and standard edition.
📌 Analysis and design of Robotic Process Automation (RPA).
📌 Functional analysis of systems using UML and IEEE 830 SRS.
📌 Delivering .py files or .exe files to be executed on macOS, Windows, or Linux.
📌 Taking the time to analyze and design the best crawling strategy.
Thank you.
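As a rough illustration of the parse-and-structure step described above, here is a stdlib-only sketch. The HTML snippet and field names are invented; a real job would fetch pages with Requests or Selenium and typically parse with bs4.

```python
from html.parser import HTMLParser

# Hypothetical listing page; a real project would fetch this over HTTP.
HTML = """
<ul>
  <li class="item"><span class="name">Alpha</span><span class="price">$10</span></li>
  <li class="item"><span class="name">Beta</span><span class="price">$12</span></li>
</ul>
"""

class ItemParser(HTMLParser):
    """Collect {name, price} rows from <span class="name">/<span class="price">."""
    def __init__(self):
        super().__init__()
        self.field = None            # which span we are inside, if any
        self.rows, self.current = [], {}

    def handle_starttag(self, tag, attrs):
        if tag == "span":
            self.field = dict(attrs).get("class")

    def handle_data(self, data):
        if self.field in ("name", "price"):
            self.current[self.field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "span":
            self.field = None
        elif tag == "li" and self.current:
            self.rows.append(self.current)
            self.current = {}

parser = ItemParser()
parser.feed(HTML)
print(parser.rows)  # [{'name': 'Alpha', 'price': '$10'}, {'name': 'Beta', 'price': '$12'}]
```

From here the rows would go to CSV, Excel, a POST request, or a database, as listed above.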
Keywords: Python, Scrapy, Selenium, Requests, BeautifulSoup, lxml, Web Scraping, Web Crawling, Data Scraping, Data Extraction, Data Mining, Automation, ETL, Bots, Scrapers, Spiders, Cloudflare, Captcha, Playwright, Amazon S3, DigitalOcean, Bot Development, Database, Django, Browser Automation, Python Script, Data Analysis, Data Visualization, JavaScript, Flask, FastAPI, MongoDB, PostgreSQL, Antidetect Browser, PerimeterX, AKAMAI, NodeJS
Skills: ETL, Database Management, MySQL, Beautiful Soup, Selenium, Python-Requests, Full-Stack Development, Python, Bot Development, Automation, Web Crawling, Data Mining, Data Extraction, Data Scraping, Web Scraping
- $85 hourly
- 4.8/5
- (4 jobs)
Experienced business intelligence implementation consultant with expert knowledge of BI architecture and strategy. Experience working across different industries to translate business needs into data and software strategy.
Skills: ETL, Data Analysis, SQL, Microsoft Power BI, Tableau, R, Domo, MySQL Programming, Data Visualization
- $55 hourly
- 4.8/5
- (15 jobs)
My journey in the tech world is backed by a prestigious certification in Data Analytics and Visualization from the University of Texas at Austin, complemented by a solid foundation in Mathematics. This unique blend of academic excellence and practical expertise makes me a valuable asset in the field of software development.
Core Competencies:
🐍 Programming Language Proficiency: I excel in Python, JavaScript, and shell scripting. My fluency in these languages enables me to develop robust, scalable, and efficient software solutions tailored to meet diverse business needs.
📈 Data Analysis and Visualization: With my certification from the University of Texas at Austin, I have honed my skills in data analytics. I am adept at analyzing complex data sets to extract meaningful insights and visually represent them in a clear, concise, and impactful manner.
💾 Database Management: My expertise extends to both SQL and NoSQL databases. I can efficiently manage, clean, and organize large volumes of data, ensuring data integrity and accessibility for seamless business operations.
♾️ Mathematical Acumen: My strong background in Mathematics empowers me to tackle challenging problems with analytical rigor and precision. This skill is particularly beneficial in developing algorithms and models for data analysis and software development.
Why Choose Me?
🏆 Quality and Precision: My commitment to quality and precision in software development is unwavering. I strive to deliver software products that not only meet but exceed client expectations.
🎯 Problem-Solving Mindset: My analytical skills and mathematical background enable me to approach problems creatively and find innovative solutions.
🗣️ Effective Communication: I believe in clear and concise communication, ensuring that clients are always informed and involved in the development process.
🧠 Continuous Learning: The tech field is ever-evolving, and so am I.
I continuously update my skills and knowledge to stay at the forefront of technological advancements.
Skills: ETL, CI/CD, DevOps, API, PostgreSQL, Plotly, Database Management, Data Integration, Data Scraping, Docker, Query Tuning, Data Visualization, Python, Data Analysis
- $125 hourly
- 4.8/5
- (14 jobs)
🏆 Achieved Top-Rated Freelancer status (Top 10%) with a proven track record of success. Past experience: Twitter, Spotify, and PwC. I am a certified data engineer and software developer with 5+ years of experience, familiar with almost all major tech stacks in data science/engineering and app development. If you require support in your projects, please do get in touch.
Programming Languages: Python | Java | Scala | C++ | Rust | SQL | Bash
Big Data: Airflow | Hadoop | MapReduce | Hive | Spark | Iceberg | Presto | Trino | Scio | Databricks
Cloud: GCP | AWS | Azure | Cloudera
Backend: Spring Boot | FastAPI | Flask
AI/ML: PyTorch | ChatGPT | Kubeflow | ONNX | spaCy | Vertex AI
Streaming: Apache Beam | Apache Flink | Apache Kafka | Spark Streaming
SQL Databases: MSSQL | Postgres | MySQL | BigQuery | Snowflake | Redshift | Teradata
NoSQL Databases: Bigtable | Cassandra | HBase | MongoDB | Elasticsearch
DevOps: Terraform | Docker | Git | Kubernetes | Linux | GitHub Actions | Jenkins | GitLab
Skills: ETL, Java, Apache Hadoop, Amazon Web Services, Snowflake, Microsoft Azure, Google Cloud Platform, Database Management, Linux, Apache Spark, API Integration, Scala, SQL, Python
- $50 hourly
- 5.0/5
- (5 jobs)
👋 Hi there! I'm Arjun, a dedicated data professional committed to ensuring companies and individuals harness the power of clean, high-quality data. With a diverse background spanning professional roles and freelancing projects, I bring extensive expertise to the table.
💼 My toolkit encompasses a wide array of tools and technologies tailored to address diverse data challenges effectively. Proficient in SQL and Python, I excel at crafting complex queries and optimizing database performance for seamless data retrieval and manipulation. Additionally, I specialize in database management systems like MySQL, PostgreSQL, and MongoDB, leveraging their unique features to meet specific project requirements.
🔍 When it comes to data acquisition, I'm adept at web scraping techniques to gather information from various online sources efficiently. Whether it's scraping structured data from websites or extracting insights from unstructured sources, I employ robust methodologies to ensure accurate and reliable results.
🛠️ ETL (Extract, Transform, Load) processes are a cornerstone of my skill set. I have hands-on experience designing and implementing ETL pipelines to streamline data workflows, ensuring seamless data integration across diverse systems and platforms. From data extraction and cleansing to transformation and loading into target databases, I'm equipped to handle every stage of the process with precision and efficiency.
📊 Whether your project demands database optimization, web scraping, or ETL automation, I'm here to help you unlock the full potential of your data. Let's collaborate to turn raw data into actionable insights and drive informed decision-making for your business or project!
Skills: ETL, Data Warehousing, Data Analysis, Data Mining, ETL Pipeline, Python, Data Scraping
- $115 hourly
- 5.0/5
- (17 jobs)
⭐⭐⭐⭐⭐ "Tucker proved to be an exceptional asset in our project. I look forward to partnering with him on future projects and wholeheartedly recommend Tucker to any organization seeking top-tier talent."
Do you need help leveraging your data to make better and faster business decisions? Is your data scattered across multiple applications, Excel workbooks, and systems? Do you spend more time pulling reports than actually analyzing the data?
Ranked in the top 3% of freelancers on Upwork, and with nearly half of my clients being long-term partners, I specialize in transforming raw data chaos into streamlined insights. As a former data analyst for a Fortune 500 company, I now dedicate my full-time consultancy to empowering clients like you to automate data pipelines and craft reports that save time and illuminate opportunities.
Typical engagements include:
• Data Warehouse Implementation: Over the past year, I've helped five clients create data warehouses that centralize and synthesize their data, increasing efficiency and boosting their bottom line.
• Power BI Development: Data is only as valuable as your ability to use it. I've developed over 50 Power BI reports, automating reporting and providing deeper insights into clients' business operations.
• Data Integration: I assist clients in creating automated data pipelines from various sources, including REST and GraphQL APIs, emailed reports, Snowflake shares, and systems like QuickBooks, HubSpot, and Extensiv products.
• Data Modeling: A robust data model is the linchpin of effective data analytics. I've helped clients develop standard data models within their data warehouses and Power BI semantic models following best practices.
Ready to transform your data into decisions? Let's schedule a quick call to discuss your needs and how I can help streamline your reporting process.
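API-based data integrations of the kind described above usually start by draining a paginated endpoint. A minimal sketch of that pattern, where `fetch_page` is a stand-in for a real HTTP call (e.g. via Requests) and the payload shape is an assumption:

```python
from typing import Callable, Iterator

def extract_all(fetch_page: Callable[[int], dict]) -> Iterator[dict]:
    """Drain a page-numbered REST endpoint page by page.

    `fetch_page` stands in for a real HTTP call; it is assumed to
    return {"items": [...], "has_more": bool}.
    """
    page = 0
    while True:
        payload = fetch_page(page)
        yield from payload["items"]
        if not payload["has_more"]:
            break
        page += 1

# Stubbed two-page API so the pattern is runnable without a network.
PAGES = [
    {"items": [{"id": 1}, {"id": 2}], "has_more": True},
    {"items": [{"id": 3}], "has_more": False},
]
records = list(extract_all(lambda page: PAGES[page]))
print(records)  # [{'id': 1}, {'id': 2}, {'id': 3}]
```

Keeping the fetch function injectable like this makes the extraction step easy to test before pointing it at a live API.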
I typically respond within 2 hours during business hours.
Skills: ETL, Data Warehousing & ETL Software, Database, Transact-SQL, Microsoft Azure, Azure DevOps, Alteryx, Microsoft SQL Server, SQL Programming, Analytics, SQL, Business Intelligence, pandas, Python, Microsoft Power BI
- $75 hourly
- 4.9/5
- (57 jobs)
Welcome to my profile! With a 100% Job Success rating and several years of experience, I am an expert in AI prompting and a seasoned consultant in Python, JavaScript, and Microsoft products. I specialize in leveraging the power of technology to deliver exceptional results for my clients.
In my previous projects, I have demonstrated my expertise in various areas. For instance, I collected information from over 50 websites, gathering data for multiple variables and consolidating it into a spreadsheet with over 10 million rows and 40 columns. Additionally, I have scraped course catalog data from multiple universities, ensuring accurate and reliable information.
Currently, I am finalizing a web scraping project that involves logging into a website, downloading a CSV file, and seamlessly converting it for upload via an API to another platform. The outcomes have been outstanding, showcasing my ability to navigate complex data workflows.
My experience also extends to web scraping projects for well-known websites such as Home Depot, Lowe's, HEB, NBA.com, NFL.com, Walmart, and more. I can extract, clean, and transform data effectively to derive meaningful insights.
Furthermore, I have undertaken custom SKU creation projects, utilizing advanced query techniques to consolidate data from diverse sources. This enables me to provide comprehensive reports on total inventory, shipping costs, and order requirements. I excel in managing complex datasets and generating actionable insights.
My involvement in the Ethereum ecosystem has further honed my skills. I have developed a dashboard using Bootstrap 4 and Django, integrating APIs to gather time-series data for purchases in a metaverse. I have built prediction models for trend analysis by employing statistical learning techniques. I am well-versed in version control with GitHub and have experience with Docker builds for seamless deployment.
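The download-convert-upload workflow described above hinges on the convert step. A hedged, stdlib-only sketch of it — the column names and payload shape here are invented for illustration:

```python
import csv
import io
import json

# Stand-in for the downloaded file; a real project would have fetched
# this after logging in. Column names are hypothetical.
downloaded = "sku,qty\nA-100,4\nB-200,7\n"

# Convert each CSV row into the JSON record an upload API would accept.
reader = csv.DictReader(io.StringIO(downloaded))
payload = [{"sku": row["sku"], "qty": int(row["qty"])} for row in reader]
body = json.dumps(payload)  # ready to be the body of a POST request
print(body)
```

In the real pipeline, `body` would be sent with an authenticated POST to the target platform's API.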
I am also engaged in a sports betting project, where I employ API integration to gather data from multiple bookies. I ensure accurate transaction identification and USD conversion through data cleaning, sorting, and transformation. Leveraging AWS and Google Cloud infrastructure, I create robust data pipelines that drive trend prediction models. With a proven track record of successfully deploying web applications and leveraging Streamlit, FastAPI, and AWS environments, I am adept at developing solutions tailored to specific business needs.
Skills: ETL, Google Sheets, Data Mining, Google Cloud Platform, Microsoft Excel, Google APIs, NumPy, Data Analysis, JavaScript, Data Scraping, Data Extraction, Python
- $175 hourly
- 5.0/5
- (1 job)
Hello, I'm Mike. Welcome to my profile! I am a seasoned business intelligence analyst here to help your business grow!
Are you struggling to understand what's going on with your business? Have lots of business data but not sure what to do with it? I spent over a decade as a business intelligence analyst helping companies of all sizes, from start-ups to Fortune 500, make smarter business decisions with more insightful data.
Let's face it: most business owners are overwhelmed with the volume of data being generated, especially when it comes from so many different reporting systems. Business owners turn to me to help them integrate their systems and deliver important business information in easy-to-read dashboards. From finding the leak in your sales funnel to predicting the likelihood of a new customer purchase, together we'll tackle any business problem using data.
Need to clean or collect your business data before we get started? No problem. I can create an ETL plan for your project as well. Let's schedule a chat so we can further discuss how smarter data will help your business tackle any challenge.
Skills: ETL, Big Data, Data Analysis, Data Visualization, Marketing Analytics, ETL Pipeline, MySQL Programming, Data Cleaning, Bhrigus Software IVR, Database, Analytics, Information Analysis, Computer, Data Science, Machine Learning
- $80 hourly
- 5.0/5
- (2 jobs)
I'm a passionate data engineer who loves exploring big data to find insightful information. I am an expert in developing data models and pipeline architectures and providing ETL solutions for project models. I enjoy solving problems, providing data-driven insight, and continually expanding my knowledge. Data are only as valuable as the insights gleaned from analysis, and I excel at using the Python data science ecosystem for data analysis, prediction, visualization, and storytelling. I believe the humility and integrity in my personality create the trust required to be a good team player.
Skills: ETL, Data Science, Data Engineering, Microsoft Power BI, Python, SQL, ETL Pipeline
- $80 hourly
- 5.0/5
- (18 jobs)
𝗔𝘇𝘂𝗿𝗲 𝗦𝗼𝗹𝘂𝘁𝗶𝗼𝗻𝘀 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁 | 𝗗𝗮𝘁𝗮 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿 | 𝗧-𝗦𝗤𝗟 | 𝗗𝗔𝗫 | 𝗣𝗼𝘄𝗲𝗿 𝗣𝗹𝗮𝘁𝗳𝗼𝗿𝗺 | 𝗔𝗜 𝗘𝘅𝗽𝗲𝗿𝘁
Are you 𝘀𝘁𝗿𝘂𝗴𝗴𝗹𝗶𝗻𝗴 to 𝗶𝗻𝘁𝗲𝗴𝗿𝗮𝘁𝗲, 𝗺𝗮𝗻𝗮𝗴𝗲, 𝗼𝗿 𝘀𝗰𝗮𝗹𝗲 your data infrastructure across multiple 𝗰𝗹𝗼𝘂𝗱 𝗽𝗹𝗮𝘁𝗳𝗼𝗿𝗺𝘀? Is your organization looking for an 𝗲𝘅𝗽𝗲𝗿𝗶𝗲𝗻𝗰𝗲𝗱 𝗮𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁 to streamline 𝗱𝗮𝘁𝗮 𝗽𝗶𝗽𝗲𝗹𝗶𝗻𝗲𝘀, 𝗮𝘂𝘁𝗼𝗺𝗮𝘁𝗲 𝘄𝗼𝗿𝗸𝗳𝗹𝗼𝘄𝘀, and harness the power of 𝗔𝗜 for insightful 𝗱𝗲𝗰𝗶𝘀𝗶𝗼𝗻-𝗺𝗮𝗸𝗶𝗻𝗴?
With over 10 years of experience as a Certified Solutions Architect and Data Engineer, I've successfully delivered high-impact solutions for major enterprises including Microsoft, Wells Fargo, Thomson Reuters Elite, Coca-Cola, CVS, Walgreens, Capgemini, EPAM, nThrive, Pharmavite, USDA, and Google. I've also worked with start-ups like LimeGear and SourceGroup. As a multi-certified expert in Azure, AWS, GCP, and Power Platform, I specialize in crafting robust, scalable solutions that improve efficiency, reduce costs, and enhance data security.
𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀 𝗜 𝗢𝗳𝗳𝗲𝗿:
☑️ Cloud Architecture & Integration: Designing and implementing secure, scalable architectures across Azure, AWS, GCP, and Snowflake
☑️ Data Engineering & ETL Solutions: Building optimized data pipelines using Azure Synapse, Databricks, and AWS Glue to deliver real-time insights
☑️ Data Warehousing & Analytics Dashboards: Creating comprehensive, interactive dashboards using Power BI, Tableau, and QuickSight to enable data-driven decision-making
☑️ SQL & DAX Optimization: Solving complex problems with T-SQL, DAX, and Python for highly efficient data models
☑️ Automation & AI Integration: Leveraging Azure OpenAI and Power Platform for automating workflows and boosting business productivity
☑️ Database & Storage Management: Expertise in managing Azure SQL, PostgreSQL, Amazon RDS, S3, and other cloud storage solutions
𝗤𝘂𝗮𝗻𝘁𝗶𝗳𝗶𝗲𝗱 𝗥𝗲𝘀𝘂𝗹𝘁𝘀:
✔️ Increased data processing speed by 40% for a major client through efficient ETL pipelines
✔️ Improved business reporting capabilities, driving 30% faster decision-making with optimized Power BI dashboards
✔️ Reduced cloud infrastructure costs by 25% through effective cloud architecture design and resource management
✔️ Delivered secure, high-performance data migrations from on-premises to cloud solutions for several Fortune 500 companies
𝗖𝗼𝗿𝗲 𝗘𝘅𝗽𝗲𝗿𝘁𝗶𝘀𝗲:
⭐ Cloud Platforms: Azure, AWS, GCP
⭐ Database Management: Azure SQL, PostgreSQL, MySQL, SQL Server, Oracle, Amazon RDS
⭐ Data Analytics: Power BI, Tableau, QuickSight, SQL, T-SQL, DAX
⭐ Automation & AI: Power Automate, Azure AI, Python, Kubernetes
⭐ Data Pipelines & ETL: Azure Data Factory, SSIS, Databricks, AWS Glue, Lambda
⭐ Security & Networking: Azure AD, AWS IAM, Azure Key Vault, Elastic Load Balancing
As a proud U.S. citizen by choice, I am also the Co-Founder of Data Integrity Services, Inc., a Microsoft partner and featured Fabric Partner, with over 60 years of combined experience driving digital transformation for businesses across various industries.
I'm passionate about giving back and helping others. In addition to my work, I teach T-SQL, Python, and DAX, focusing on making complex coding challenges easy to solve. I also donate a portion of our revenue to help those in need globally.
𝗟𝗲𝘁'𝘀 𝗕𝘂𝗶𝗹𝗱 𝘁𝗵𝗲 𝗙𝘂𝘁𝘂𝗿𝗲 𝗧𝗼𝗴𝗲𝘁𝗵𝗲𝗿! I'm committed to establishing long-term relationships with clients by delivering measurable results and innovative solutions. Reach out today to discuss how I can assist you.
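As a small taste of the T-SQL topics mentioned above, here is a classic teaching example: a running total via a window function. It is shown against SQLite (whose window-function syntax matches T-SQL's for this query); the table and figures are purely illustrative.

```python
import sqlite3

# Illustrative daily sales; a running total is a staple interview and
# teaching exercise for T-SQL window functions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 100.0), (2, 40.0), (3, 60.0)])

rows = conn.execute("""
    SELECT day,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
""").fetchall()
print(rows)  # [(1, 100.0), (2, 140.0), (3, 200.0)]
```

The same `SUM(...) OVER (ORDER BY ...)` shape carries over directly to SQL Server and to DAX's cumulative measures conceptually.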
𝗞𝗲𝘆𝘄𝗼𝗿𝗱𝘀: Azure Solutions Architect, Data Engineer, T-SQL, DAX, Power Platform, AI, SQL, Data Pipelines, ETL, Power BI, Tableau, Python, Automation, Databricks, AWS, GCP, Cloud Architecture, Azure Synapse, Data Warehousing, Kubernetes, SQL Server, PostgreSQL, Cloud Storage, Data Migration, SSIS, AI Integration, Data Security, Microsoft Partner, Data Engineering Solutions
Skills: ETL, Data Analysis, Azure DevOps, Microsoft PowerApps, PySpark, Azure DevOps Server, Microsoft SQL Server, DevOps, Microsoft Azure SQL Database, Microsoft Power BI Development, Microsoft Power Automate, Databricks Platform, SQL Server Integration Services, SQL, Microsoft Power BI
- $70 hourly
- 4.6/5
- (2 jobs)
As a certified data analyst specializing in Power BI, Tableau, SQL, Excel, and Python, I have a proven track record of developing insightful data visualizations that have helped stakeholders better understand and leverage big data applications.
- I have extensive experience building and managing complex data models, and I can connect to and integrate data from various sources such as Snowflake and Excel.
- My experience includes developing visual reports, KPIs, and dashboards using Power BI/Tableau Desktop. I have a strong background in data warehousing, data modeling, and building new data warehouse schemas, as well as demonstrated expertise in SQL queries, SQL Server Analysis Services (SSAS), and SQL Server Integration Services (SSIS).
- I also have a deep understanding of data processing and storage systems, as well as expertise in using Power BI/Tableau and AWS to deliver comprehensive data solutions. I am proficient in designing, building, and maintaining data pipelines, ensuring data quality and integrity through data cleansing and transformation, and performing data modeling and schema design. Additionally, I have experience with data integration and migration, including designing and implementing ETL processes to move data from source systems into data warehouses or other target systems. My experience with Power BI/Tableau and AWS includes the ability to integrate and extend these technologies to provide scalable and flexible data solutions. I am proficient in using AWS services such as Lambda, S3, and RDS, and have experience with data storage technologies such as Redshift and DynamoDB.
- I am proficient in Python libraries such as Pandas, NumPy, StatTools, and GeoPandas, and have experience using AWS Lambda and S3. Additionally, I have knowledge of machine learning techniques and experience implementing them in data analysis and modeling.
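As an illustration of the kind of statistical modeling mentioned above, ordinary least squares can be computed directly from its defining formulas with the standard library alone; the data points here are invented, chosen to lie roughly on y = 2x + 1.

```python
from statistics import mean

# Invented sample, roughly y = 2x + 1 with a little noise.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]

# Ordinary least squares: slope = Sxy / Sxx, intercept from the means.
x_bar, y_bar = mean(xs), mean(ys)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar
print(round(slope, 2), round(intercept, 2))  # 1.94 1.15
```

In practice the same fit would come from NumPy or StatsModels, but the closed form above is what those libraries compute for the one-variable case.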
In my work, I support application development and database administration by developing reusable and precise reports and dashboards based on processes and standards. I have also demonstrated the ability to integrate, embed, and extend Power BI/Tableau using APIs, enabling greater functionality and flexibility in data visualization. Overall, I am a skilled and experienced data analyst capable of utilizing a range of technical tools and techniques to drive data insights and provide valuable support to businesses and organizations.
Skills: ETL, AWS Lambda, ETL Pipeline, Data Analytics, Amazon S3, Machine Learning, Snowflake, Python Script, Business Intelligence, Microsoft Power BI Data Visualization, Power Query, SQL, Tableau, Microsoft Power BI, Python
- $125 hourly
- 5.0/5
- (4 jobs)
My magic power lies in crafting technical solutions, particularly those involving automation, AI, APIs, cloud solutions, and my preferred programming language, Python. My approach integrates a deep understanding of both coding and infrastructure.
Projects:
- ChatGPT Plugin: Developed and published in the ChatGPT plugin store. This custom plugin interacts with web3 APIs to create and manage test environments for EVMs.
- CustomGPT: Engineered an advanced CustomGPT, a GPT model that generates contracts for the oil and gas industry. It creates documents in Google Docs and provides a link in ChatGPT.
- FedEx and UPS Dropship Automation: Designed a system to generate FedEx/UPS labels based on customer preferences within the RevParts Management system. This solution ensures correct label types and billing for parts drop-shipped from third-party warehouses and communicates these details to vendors.
- Web Scraper Automation: Utilized Selenium to build a scraper that efficiently searches Google for specific keywords and retrieves relevant results.
- Picture Automation: Created a process using automations, AI, and Azure storage to streamline the cropping, storage, and linking of pictures.
Skills: Python, FastAPI, Flask, APIs, Databases, SQL, Node.js, R, M Language, Machine Learning, CustomGPT Actions, OAuth, HTML, CSS, Azure AI, Azure Vision, Azure, GCP, and more.
Education: From a technological standpoint, I am predominantly self-taught, backed by over ten years of hands-on experience. My academic foundation is a Bachelor of Arts in Psychology, where I concentrated on Human Factors and Ergonomics. This specialization delves into the broader realm of usability, encompassing UX/UI.
Notably, my honors thesis was conducted in collaboration with Dell Computers, focusing on keyboard usability, a study that elegantly blends psychological principles with practical engineering applications.
Skills: ETL, AI Agent Development, Google Cloud Platform, Flask, Node.js, Azure AI Vision, FastAPI, Database, OpenAI API, API, SQL, Microsoft Azure, Python, Artificial Intelligence, Automation
- $83 hourly
- 4.8/5
- (57 jobs)
I build software systems that make or save serious money:
💰 𝗕𝘂𝗶𝗹𝘁 𝗮𝘂𝘁𝗼𝗺𝗮𝘁𝗶𝗼𝗻𝘀/𝘀𝗰𝗿𝗮𝗽𝗲𝗿𝘀 𝗼𝗻 𝗦𝗼𝘂𝗻𝗱𝗖𝗹𝗼𝘂𝗱 𝗮𝗻𝗱 𝗠𝗶𝘅𝗰𝗹𝗼𝘂𝗱 𝗺𝗮𝗸𝗶𝗻𝗴 𝗼𝘃𝗲𝗿 $𝟱𝟬𝟬𝗞.
🏗️ 𝗗𝗲𝘀𝗶𝗴𝗻𝗲𝗱 𝗺𝗶𝘀𝘀𝗶𝗼𝗻-𝗰𝗿𝗶𝘁𝗶𝗰𝗮𝗹 𝘀𝗰𝗿𝗮𝗽𝗶𝗻𝗴 𝗳𝗼𝗿 𝗺𝗮𝗽.𝗹𝗶𝗳𝗲, 𝗽𝗼𝘄𝗲𝗿𝗶𝗻𝗴 𝘁𝗵𝗲𝗶𝗿 𝗦𝗮𝗮𝗦 𝗳𝗼𝗿 𝟲 𝘆𝗲𝗮𝗿𝘀.
⏱️ 𝗦𝗮𝘃𝗲𝗱 𝗛𝗮𝗽𝗽𝘆 𝗦𝗽𝗮 𝗗𝗼𝗴𝘀 𝟮,𝟬𝟬𝟬+ 𝗵𝗼𝘂𝗿𝘀 𝗮𝗻𝗻𝘂𝗮𝗹𝗹𝘆 𝘄𝗶𝘁𝗵 𝗮𝘂𝘁𝗼𝗺𝗮𝘁𝗲𝗱 𝗼𝗿𝗱𝗲𝗿 𝗽𝗿𝗼𝗰𝗲𝘀𝘀𝗶𝗻𝗴.
✨ What Clients Have Said:
"Sean was perfect for this job. He made sure exactly what we wanted and asked if anything more needed to be done, showcasing the professionalism anyone would want to work with. I urge you to hire him if you want your job done to the highest level."
"Sean was great to work with. He took on the job as if it was his own. I appreciated his communication and willingness to talk through my needs. We've just wrapped up the job, and I'm thinking of more jobs to hire Sean for."
🔧 Services I Offer:
- Advanced Web Scraping: Extract data from any website, regardless of format or security measures.
- Bot Detection Bypass: Overcome systems like reCAPTCHA, DataDome, and Selenium detection.
- Process Automation: Streamline your workflows with custom automation solutions in Python, VBA, and macros.
- Excel Automation: Save hours daily by automating tedious Excel tasks.
- Custom Solutions: Develop automation tools for repetitive web tasks to increase your efficiency.
🚀 How I Deliver Results:
1. Understand Your Goals: We'll discuss your objectives and define the project scope in detail.
2. Value Addition: I provide suggestions to enhance the project's value and save you money.
3. Strategic Planning: Deliver a step-by-step proposal outlining the solution.
4. Timely Execution: Develop and deliver the solution within your desired timeframe.
🛡️ Systems I Can Bypass with Expertise:
- reCAPTCHA
- DataDome
- Cloudflare Bot Management
- Selenium detection
- Akamai Bot Manager
- PerimeterX Bot Defender
- And other advanced bot detection systems
I reverse-engineer bot detection mechanisms to provide reliable and efficient solutions.
👨💻 About Me:
With nearly 10 years of software development experience specializing in automation and web scraping, I have solved complex challenges in web automation and scaled bot systems. My expertise allows me to bypass virtually any bot detection system on the internet. I build scalable solutions aimed at saving you time and helping your business grow. By leveraging my experience in mass-scale automation, I deliver powerful tools that make a real difference.
Ready to streamline your business and reclaim your time? Feel free to message me anytime to discuss how I can assist with your project. I'm always happy to chat and see if I'm the right fit for your needs.
Skills: ETL, Selenium WebDriver, System Automation, Data Mining, Automation, Natural Language Processing, NLTK, Data Scraping, Selenium, Python
- $125 hourly
- 5.0/5
- (15 jobs)
Data Mining & Machine Learning
Broad experience building recommender systems, including user- and item-based collaborative filtering, latent factor models, and binary response measures. Deep experience mining association rules from transaction data for actionable insight. Expert in analyzing and visualizing sequences in customer purchase histories. Expert in the use of permutation importance and partial dependence to uncover deep insight and causal relationships.
Data Pipeline Development
Extensive experience developing end-to-end data pipelines in the Azure cloud. Numerous projects using Azure Function apps to automate ETL and machine learning workflows.
Quantitative Analysis
Expert in developing and implementing scalable customer profitability analytics, especially in complex industrial/commercial service business models.
Visualization
Proficient in Shiny application development in R. Particularly skilled at visualizing complex data through network graphs, Sankey diagrams, and faceted plots.
Skills: ETL, Amazon Athena, dbt, Microsoft Access, Data Modeling, Azure DevOps, C#, AWS Glue, Snowflake, Machine Learning, R, SQL, Python
- $150 hourly
- 5.0/5
- (23 jobs)
CTO with over 20 years of experience in the tech space, specializing in dev team and SaaS product buildout for startups. I hold an MBA, PMP, and Six Sigma Black Belt certification. Expert in microservices architecture and API development using REST and GraphQL. Technical proficiencies include Python, Django, Flask, Ruby, Go, Docker, AWS, CI/CD, NodeJS, ReactJS, and generative AI.
Skills: ETL, AWS Lambda, CentOS, LAMP Administration, API Development, MySQL Programming, Amazon Web Services, Microsoft SQL Server Programming, Cloud Computing, Web Development, API, Python, JavaScript
- $40 hourly
- 0.0/5
- (0 jobs)
Machine Learning, Statistics, Programming, Data Analysis
Python, R, data wrangling, visualization
5 years of academic experience and 2 years of work experience, with strong math and programming skills.
Strengths:
- Time-series analysis
- Ensemble learning
- Unsupervised learning
- Analytical reports
Skills: ETL, Consumer Segmentation, Data Mining, Machine Learning, R
- $80 hourly
- 0.0/5
- (1 job)
I am an experienced Python developer with over a decade of experience across multiple industries, for organizations both small and large. I have worked with multiple Python web frameworks, in scientific computing, in data analysis and ETL, and have also done front-end web work from time to time. I am familiar with all parts of the software development lifecycle, and I am flexible enough to adapt to every client's process. I have had multiple repeat clients, which I think speaks to the fact that I write clean, readable, and maintainable code. I also have experience in R and JavaScript.
Skills: ETL, pandas, PyTorch, SciPy, R, JavaScript, Python
- $65 hourly
- 4.0/5
- (16 jobs)
I am a 7-year advertising veteran with a concentration in data engineering, managing everything from GTM containers to BI implementations. Over the years I have built several end-to-end ETL and ELT tools, using Python to initiate the extraction process and, depending on the warehouse and data size, a combination of Python and SQL to manage transformation and load logic. Connecting to API endpoints, extracting data from the web, and integrating with file-share systems are all core to the work I do. My SQL database experience includes Snowflake, Postgres, MySQL, and BigQuery. With big data becoming more and more of a challenge, I have experience managing tables over 10 GB in size with over 100 billion records. Designing database architecture and data flows through to reporting states is all within my wheelhouse.
Skills: ETL, Analytics & Tracking Setup, Data Extraction, Microsoft Power BI, Google Analytics, Data Visualization, API Integration, dbt, Snowflake, BigQuery, Python
- $35 hourly
- 0.0/5
- (3 jobs)
I am a Microsoft-certified database and Power BI developer with 10+ years of experience in designing, managing, and optimizing databases and developing insightful business intelligence solutions. I am skilled in translating complex datasets into actionable insights, enabling data-driven decision-making. I am also proficient in ETL processes and in creating interactive Power BI dashboards tailored to meet business objectives.
Skills
▮ Database Management – skilled in writing complex queries, stored procedures, and functions, and in creating and maintaining database objects in SQL Server and PostgreSQL.
▮ ETL processes – proficient in creating ETL pipelines using Python and ETL tools like SSIS and Pentaho.
▮ Power BI Development – strong ability to design and implement dashboards, reports, and KPIs with advanced DAX functions for in-depth analytics.
Glad to connect and discuss any database or Power BI related projects you may have. Cheers, Abdul
Skills: ETL, Microsoft Power BI Development, Microsoft Power BI Data Visualization, SQL, Data Extraction, ETL Pipeline, Microsoft Power BI, Database Programming, SQL Programming, PostgreSQL Programming, Database, SQL Server Reporting Services, PostgreSQL, SQL Server Integration Services, Transact-SQL
- $50 hourly
- 0.0/5
- (0 jobs)
Hands-on design and development experience on the Hadoop ecosystem (Hadoop, HBase, Pig, Hive, and MapReduce), including one or more related big data technologies (Scala, Spark, Sqoop, Flume, Kafka, and Python), with strong ETL and PostgreSQL experience as well. Strong background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
* Experience in Cloudera Stack, HortonWorks, and Amazon EMR
* Strong experience in using Excel, SQL, SAS, Python, and R to extract and analyze data based on business needs
* Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through the use of multiple ETL tools such as Ab Initio and Informatica PowerCenter
* Strong understanding and hands-on programming/scripting experience, including UNIX shell
* An excellent team player and technically strong person
Skills: ETL, Amazon S3, Data Warehousing & ETL Software, Big Data, Amazon Web Services, Hive, Data Science, Data Lake, Data Cleaning, Apache Hive, Apache Hadoop, Apache Spark, Apache Kafka, Data Migration, ETL Pipeline
- $60 hourly
- 0.0/5
- (1 job)
* 10+ years of IT experience in the business analysis, design, implementation, and testing of applications using a wide range of technologies, including data warehousing, database, and reporting systems, for manufacturing & distribution, state government, and corporate clients with Power BI, Cognos, Business Objects, Tableau, and Hadoop technologies.
* Worked on several BI data warehouse projects using the Agile-Scrum methodology and full life cycle implementations, from blueprint, requirement gathering, design, development, implementation, and user validation through go-live and post-production support.
* 10+ years of extensive experience working across databases.
* Well experienced in defining, designing, integrating, and reengineering enterprise data warehouses and data marts in environments such as Teradata and Oracle, at multiple terabytes in size and various levels of complexity.
* Develop complex SQL views and stored procedures.
* Proficient communication skills; able to train end users professionally.
* Can handle tough timelines and deliver the product end to end with complete documentation.
Skills: ETL, Data Warehousing, Data Analytics, Oracle, SAP HANA, Agile Software Development, Snowflake, Informatica, Data Lake, Teradata, Cognos, Power Query, Tableau, Microsoft Power BI, Python
- $45 hourly
- 0.0/5
- (0 jobs)
Experienced Data Engineer with 13+ years of proficiency in Snowflake, Databricks, Oracle PL/SQL, and mainframe technologies. Skilled in building production-ready data pipelines and designing high-quality data warehousing solutions on large-scale platforms. Expert in migrating data from legacy databases to cloud-based systems.
* Led assessment and proof-of-concept (POC) work for a mainframe-to-Databricks migration, overseeing Control-M job assessments and Informatica evaluation.
* Experience with data migration projects from source systems (Oracle/SQL Server) to Snowflake.
* Experienced with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading CSV, Parquet, and JSON formatted data into Snowflake tables.
* Proficient in implementing a diverse range of database objects in Snowflake (tables, views, sequences, synonyms, stored procedures, functions, tasks, streams, stages, and data sharing configurations), with a focus on meeting business requirements and adhering to best practices.
* Experienced in utilizing dbt (Data Build Tool) to streamline ETL processes and automate the creation of required tables and views, optimizing data modeling and storage for analytics and reporting.
* Utilized PySpark within Databricks to migrate mainframe code to modern data environments, ensuring efficient data processing and analysis.
* Efficiently migrated data from Oracle to Snowflake with the seamless integration of Qlik Replicate.
* Successfully migrated data and code from mainframe systems to Databricks, leveraging Delta Live Tables and automating data pipelines for efficiency.
* Adept at crafting sophisticated SQL queries to extract, manipulate, and analyze data with precision and efficiency.
* Proficient with automated testing tools such as Tosca to ensure the integrity and reliability of data.
* Proficient in developing Python-based scripts for reading/writing data to/from databases and creating automation scripts.
* Skilled in utilizing AWS services such as Redshift, CloudWatch, S3 buckets, Glue, and Lambda for data management and processing tasks.
* Experience with Apache Airflow for orchestrating and managing complex data workflows, ensuring efficient task scheduling and automation.
* Skilled in utilizing Apache NiFi to extract data from Oracle sources, transform it into files, and securely transfer them to designated S3 buckets for storage and further processing.
* Experienced in working with Linux environments for software development and data processing tasks.
* Familiar with data visualization tools such as Tableau and Power BI, enabling effective representation and analysis of data for insightful decision-making.
* Comprehensive understanding of Azure cloud services.
* Proficient in PL/SQL; adept at developing and optimizing stored procedures, triggers, functions, and packages for efficient data manipulation within Oracle databases.
* Experienced in mainframe technologies, including COBOL, JCL, VSAM, and CICS, with a track record of modernizing mainframe systems to meet evolving business needs.
* Experienced in IBM InfoSphere DataStage for ETL (Extract, Transform, Load) processes and data integration tasks in enterprise environments.
* Familiar with Informatica for data integration, ETL processes, and data management solutions.
* Familiar with Apache Kafka for real-time data streaming and event-driven architectures.
Skills: ETL, Oracle PLSQL, Mainframe, Apache Airflow, dbt, AWS Lambda, Data Warehousing & ETL Software, Cloud Migration, SQL, Oracle, COBOL, Python, AWS Glue, Databricks Platform, Snowflake
- $45 hourly
- 5.0/5
- (1 job)
I am a mid-senior level developer with a strong background in CS fundamentals, backend development, distributed systems, and AWS services. I have successfully designed and deployed microservices, APIs, ETL pipelines, Infrastructure as Code solutions, and data models, in addition to writing efficient database queries. I am looking for an opportunity to make an impact and grow into a senior software engineer.
Skills: ETL, API Development, GitHub, API, JavaScript, Amazon DynamoDB, CircleCI, Script, PySpark, Git, Python, AWS Glue, Docker, Terraform, Amazon Web Services
- $90 hourly
- 0.0/5
- (0 jobs)
Greetings! I am a seasoned GIS professional with over 16 years of experience, including more than 10 years in development and administration. I specialize in developing GIS web applications using libraries such as the ArcGIS JS API, Leaflet, OpenLayers, the Google Maps API, and Mapbox. My extensive experience with GIS APIs like the ESRI JavaScript API, Esri Leaflet, and Leaflet.js, coupled with my proficiency in web technologies like JavaScript, jQuery, TypeScript, AJAX, React, and Angular, allows me to build robust GIS frontends. I am adept at spatial ETL using FME and Python scripts. My expertise extends to ESRI GIS (ArcGIS Desktop, ArcGIS Server, ArcGIS Portal) and open-source GIS tools (e.g., QGIS, GeoServer, PostGIS, Leaflet). I have a deep understanding of spatial modeling using spatial and non-spatial data, and I have experience creating geoprocessing tools and workflow automation using Python/ArcPy. My database technology skills span both the SDE and RDBMS sides, and I have extensive experience with RDBMS (Oracle, SQL Server) management. I am well-versed in the application development lifecycle, requirement gathering, documentation, and testing, and I have successfully managed change and implemented customization in enterprise GIS environments. In my previous roles, I have been responsible for managing GIS environments, developing GIS web applications, integrating GIS systems with other systems, managing GIS databases, and creating automated processes for GIS data editing and spatial ETL. I look forward to bringing my wealth of experience and technical skills to your projects and delivering top-notch GIS solutions. Let's connect and discuss how we can work together to achieve your project goals.
Skills: ETL, Angular Material, PostGIS, Google Maps API, ArcGIS Online, ArcGIS, GIS, Bootstrap, Web Application, Leaflet, React, Angular, JavaScript, Python
- $35 hourly
- 5.0/5
- (1 job)
I have a strong background in Quality Assurance, with 8 years of experience in the Software Development Life Cycle (SDLC). My journey began with requirement gathering and documentation, where I learned to understand clients' needs thoroughly. Over the years, I have been involved in providing exceptional customer support, resolving issues at various levels, and ensuring software implementation in production environments. Throughout my career, I have found a passion for QA analysis. What I like most is the opportunity to bridge the gap between technical requirements and functional expectations. I take pride in my ability to conduct thorough testing, ensuring that the software functions as required and meets our clients' expectations. My goal is to deliver a high-quality product, free of defects, that brings value and efficiency to our clients' operations. My skills:
- ISTQB Certified Tester Foundation Level (CTFL)
- Functional Testing Certifications in Operations, Security, and D&E Management
- Experience with Scrum, Agile, and Waterfall methodologies
- Basic knowledge of automated testing with Selenium WebDriver
- Proficient in testing tools such as SoapUI, Postman, and JMeter
- Spanish: native / English: C1 level
If you give me the opportunity to work with you, I am committed to giving my best to deliver high-quality products to our clients. I will actively contribute to effective project meetings and implement best practices and strategies throughout the testing process.
Skills: ETL, Bitbucket, Git, IBM DataPower, Agile Software Development, Scrum, Quality Assurance, SQL, MantisBT, Apache JMeter, Postman, SoapUI, API Testing, Functional Testing, Software Testing
- $55 hourly
- 5.0/5
- (1 job)
Dynamic and results-driven data professional with 7 years of expertise in AWS services, business intelligence, reporting, and ML. Leveraging a solid foundation in business analytics to drive operational efficiency and deliver clear insights. Seeking opportunities to contribute to innovative projects in a collaborative environment.
Skills: ETL, Data Analytics & Visualization Software, Python, AWS Glue, Amazon QuickSight, Microsoft Power Automate, Microsoft Power BI, Tableau, ETL Pipeline, SQL, Business Intelligence, Analytics
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.