Build and manage Selenium web scrapers with Auto-Scrape

tldr: Auto-Scrape lets you focus on writing web scraping scripts while it takes care of logging, data persistence, data presentation and data export, all through a modern browser-based UI. It can be run locally or deployed remotely. Here are some screencasts of the UI. Get it on GitHub.

Why scrape the web?

Building a Selenium web scraper is almost a rite of passage for programmers starting out. Watching a computer fill out forms, click links and collect data before your eyes is a highly satisfying and suitably concrete exercise for beginners to complete. Beyond that, browser automation forms a foundation for frontend testing, supports automated research, and can of course replace those expensive and unreliable humans across a wide range of business tasks.
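As a taste of the kind of script Auto-Scrape is built to manage, here is a minimal Selenium sketch of filling a form, clicking through, and collecting data. The URL, the input name "q", and the CSS selector "a.result" are hypothetical placeholders, not part of Auto-Scrape itself:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

driver = webdriver.Firefox()
try:
    driver.get("https://example.com/search")  # hypothetical target page
    box = driver.find_element(By.NAME, "q")   # assumes a text input named "q"
    box.send_keys("selenium", Keys.RETURN)    # type a query and submit the form

    # Collect the text and href of each result link on the page
    links = driver.find_elements(By.CSS_SELECTOR, "a.result")
    data = [(a.text, a.get_attribute("href")) for a in links]
    print(data)
finally:
    driver.quit()
```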

Convert SQL query to CSV file in Python

This function takes an SQLAlchemy query object as its input (SQLAlchemy is a popular Python SQL toolkit and Object Relational Mapper). Its strength lies in not needing to hard-code column names, making it scalable and suitable to “set and forget”. While developing enterprise software, every developer has surely been faced with a client asking “how do I get the data out of the system?”, and after digging a bit deeper, it becomes clear they want the ability to save a snapshot of the data as an Excel spreadsheet.
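A minimal sketch of that approach, assuming SQLAlchemy 1.4 or later (the function name query_to_csv is illustrative). The header row is read from the query’s own metadata, so no column names are hard-coded:

```python
import csv

def query_to_csv(query, path):
    """Write the rows of an SQLAlchemy Query to a CSV file at `path`."""
    # Execute the query's underlying SELECT so every row comes back as a
    # tuple-like Row, whether the query selects ORM entities or columns.
    result = query.session.execute(query.statement)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(result.keys())  # header row taken from the query itself
        writer.writerows(result)        # Row objects behave like plain tuples
```

Called as, say, query_to_csv(session.query(User), "users.csv") (where User is any hypothetical mapped model), it writes one header row followed by one line per result, and keeps working unchanged if columns are later added to the model.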