
Python Web Scraper — Florida Court Data

Job

Guru.com

$500,000 Salary, Full-Time

Posted 8 weeks ago (Updated 8 weeks ago) • Actively hiring

Expires 5/27/2026


Job Description

Category: Programming & Development > Programming & Software
Skills: Python Programming, Web Scraping, Pandas, Playwright, Comma Separated Values (CSV), Beautiful Soup

Florida Judiciary Web Scraper — Config-Driven, Resilient Architecture

I need a Python-based web scraping application to collect judge data from all 20 Florida judicial circuits and output it to a standardized CSV. The tool must be built for long-term maintainability: when a circuit website changes layout, only minimal configuration updates should be needed, not code rewrites.
Background:
Florida has 20 circuits covering 67 counties. Each circuit publishes judge data differently: some offer Excel/CSV downloads, others publish HTML pages and subpages with varying structures.
The master data source is:
https://www.flcourts.gov/Florida-Courts/Trial-Courts-Circuit

Required Output Fields (CSV):
ID, Type, Name, Lastname, Assistant, Phone, Location, Street, City, State, Zip, County, Circuit, District, Courtroom, Hearingroom, Subdivision
(A sample CSV will be provided — format must match exactly.)
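As a rough sketch of the output step, the required columns can be pinned in one place so every circuit writes the same schema. The column order below is taken from this listing; the provided sample CSV remains authoritative, and the example row is invented for illustration.

```python
import csv

# Column order as stated in the listing; verify against the sample CSV.
FIELDS = ["ID", "Type", "Name", "Lastname", "Assistant", "Phone", "Location",
          "Street", "City", "State", "Zip", "County", "Circuit", "District",
          "Courtroom", "Hearingroom", "Subdivision"]

def write_judges_csv(rows, path):
    """Write scraped judge records (a list of dicts) in the fixed column order.

    Missing fields are emitted as empty cells; unknown keys are ignored.
    """
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)

# Illustrative record only -- not real court data.
write_judges_csv([{"ID": "1", "Name": "Jane", "Lastname": "Doe", "Circuit": "17"}],
                 "judges.csv")
```

The same schema works with pandas (`pd.DataFrame(rows, columns=FIELDS).to_csv(path, index=False)`) if that is preferred for the final deliverable.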
Architecture Requirements:
- Config-driven circuit registry — All 20 circuits must be defined in an external config file (JSON or YAML), not hardcoded. Each entry should include: circuit number, base URL(s), scraping method (HTML/table/CSV download), and field mappings. Adding or updating a circuit should require only a config change.
- Per-circuit adapter pattern — Each circuit should have its own scraping strategy/adapter to handle unique layouts. This isolates changes: if Circuit 11 redesigns their site, only that adapter needs updating.
- Change detection — On each run, compare results to the previous run and produce a diff report (new judges, removed judges, changed fields). The full output CSV is always saved, but the diff highlights what changed.
- Flexible execution — Support both a full scrape of all 20 circuits and targeted single-circuit runs (e.g., --circuit 17). This allows quick re-runs when a specific circuit fails.
- Error handling and logging — If a circuit scrape fails or returns no results, log the error with timestamp and circuit ID. Do not silently skip circuits. Optionally support email or webhook notification on failure.
- Scheduling-ready — The tool should run headlessly from the command line and be schedulable via cron or Windows Task Scheduler without manual intervention.
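The registry requirement above could look something like the following. All field names (`circuit`, `base_url`, `method`, `field_map`) and the URLs are illustrative assumptions, not part of the listing; the real config would carry one entry per circuit with that circuit's actual URL.

```python
import json

# Hypothetical registry format -- key names and URLs are placeholders.
CONFIG_JSON = """
{
  "circuits": [
    {"circuit": 11, "base_url": "https://example.org/circuit11",
     "method": "html", "field_map": {"Judge": "Name", "Room": "Courtroom"}},
    {"circuit": 17, "base_url": "https://example.org/circuit17",
     "method": "csv_download", "field_map": {}}
  ]
}
"""

def load_registry(text):
    """Parse the config file and index circuit entries by circuit number,
    so both full runs and targeted single-circuit runs can look them up."""
    data = json.loads(text)
    return {entry["circuit"]: entry for entry in data["circuits"]}

registry = load_registry(CONFIG_JSON)
print(registry[17]["method"])   # prints csv_download
```

With this layout, updating a circuit after a site redesign means editing its `field_map` or `method` in the config, with no code changes for structural tweaks the existing adapters already handle.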
Tech Stack Preferences:
Python 3.x, BeautifulSoup or Playwright (for JavaScript-rendered pages), pandas for CSV output. Deliverable should include a requirements.txt and brief setup documentation.
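The per-circuit adapter pattern and the `--circuit` flag could be wired together roughly as below. This is a structural sketch only: the class and function names are invented, and the `scrape` body is stubbed where a real adapter would fetch the circuit's URL and parse it with BeautifulSoup or Playwright.

```python
import argparse
from abc import ABC, abstractmethod

class CircuitAdapter(ABC):
    """One adapter per circuit layout; a site redesign touches only its subclass."""
    def __init__(self, config):
        self.config = config          # one entry from the circuit registry

    @abstractmethod
    def scrape(self):
        """Return a list of judge dicts in the standard output schema."""

class HtmlTableAdapter(CircuitAdapter):
    def scrape(self):
        # A real implementation would fetch self.config["base_url"] and
        # parse it (BeautifulSoup, or Playwright for JS-rendered pages).
        return [{"Name": "Example", "Circuit": str(self.config["circuit"])}]

# Registry "method" string -> adapter class (illustrative mapping).
ADAPTERS = {"html": HtmlTableAdapter}

def run(registry, circuit=None):
    """Scrape every circuit, or just one when --circuit N is given."""
    targets = [registry[circuit]] if circuit else registry.values()
    rows = []
    for entry in targets:
        adapter = ADAPTERS[entry["method"]](entry)
        rows.extend(adapter.scrape())
    return rows

parser = argparse.ArgumentParser(description="Florida circuit judge scraper")
parser.add_argument("--circuit", type=int, default=None,
                    help="scrape a single circuit, e.g. --circuit 17")
# In main(): args = parser.parse_args(); run(registry, circuit=args.circuit)
```

Failures would be caught per circuit inside the loop and logged with timestamp and circuit ID, so one broken site never aborts or silently skips the rest of the run.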
Deliverables:
- Working Python application with all 20 circuits implemented
- External config file for all circuit URLs and scraping strategies
- Sample output CSV matching the provided format
- Change-detection diff report on each run
- README with setup, usage, and instructions for updating a circuit when its site changes
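The change-detection deliverable can be sketched as a comparison of two runs keyed by the ID column; the function name and report shape here are illustrative, and the sample records are invented.

```python
def diff_runs(previous, current, key="ID"):
    """Compare two scrape runs (lists of judge dicts) keyed by ID and
    report new judges, removed judges, and records whose fields changed."""
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    return {
        "added":   [curr[k] for k in curr.keys() - prev.keys()],
        "removed": [prev[k] for k in prev.keys() - curr.keys()],
        "changed": [(prev[k], curr[k]) for k in prev.keys() & curr.keys()
                    if prev[k] != curr[k]],
    }

# Invented sample data to show the report shape.
prev = [{"ID": "1", "Name": "Smith"}]
curr = [{"ID": "1", "Name": "Smith"}, {"ID": "2", "Name": "Jones"}]
report = diff_runs(prev, curr)
print(len(report["added"]), "added;", len(report["removed"]), "removed")
# prints: 1 added; 0 removed
```

The full CSV would still be written every run; this diff is only the highlight layer on top of it.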
Additional Notes:
Some circuits render content via JavaScript and may require a headless browser (Playwright). Please flag in your proposal which circuits you identify as JS-rendered. Prior experience scraping government/court websites is a strong plus.