Building an Interactive Weather Data Scraper in Google Colab: A Code Guide to Extract, Display, and Download Live Forecast Data Using Python, BeautifulSoup, Requests, Pandas, and ipywidgets

In this tutorial, we will create an interactive web scraping project using Google Colab. This guide will help you extract live weather forecast data from the U.S. National Weather Service. You will learn how to set up your environment, write a Python script using BeautifulSoup and requests, and integrate an interactive user interface with ipywidgets. This tutorial offers a straightforward approach to collecting, displaying, and saving weather data, all within a single Colab notebook.

First, we will install three essential libraries: BeautifulSoup4 for parsing HTML content, ipywidgets for creating interactive elements, and pandas for data manipulation and analysis. Running this command in your Colab notebook ensures your environment is ready for the web scraping project.

!pip install beautifulsoup4 ipywidgets pandas

Next, we will import the necessary libraries to build our interactive web scraping project. This includes requests for handling HTTP requests, BeautifulSoup for parsing HTML, and csv for managing CSV file operations. We will also use files from google.colab for file downloads, ipywidgets and IPython display tools for creating an interactive UI, and pandas for data manipulation and display.

import requests
from bs4 import BeautifulSoup
import csv
from google.colab import files
import ipywidgets as widgets
from IPython.display import display, clear_output, FileLink
import pandas as pd

We will define a function called scrape_weather that retrieves weather forecast data for San Francisco from the National Weather Service. This function makes an HTTP request to the forecast page, parses the HTML with BeautifulSoup, and extracts details such as the forecast period, description, and temperature. The collected data is stored as a list of dictionaries and returned.

def scrape_weather():
    url = 'https://forecast.weather.gov/MapClick.php?lat=37.7772&lon=-122.4168'
    response = requests.get(url)
   
    if response.status_code != 200:
        return None
   
    soup = BeautifulSoup(response.text, 'html.parser')
    seven_day = soup.find(id="seven-day-forecast")
    if seven_day is None:  # page layout changed or unexpected response
        return None
    forecast_items = seven_day.find_all(class_="tombstone-container")
   
    weather_data = []
   
    for forecast in forecast_items:
        period = forecast.find(class_="period-name")
        short_desc = forecast.find(class_="short-desc")
        temp = forecast.find(class_="temp")

        weather_data.append({
            "period": period.get_text() if period else '',
            "short_desc": short_desc.get_text() if short_desc else '',
            "temp": temp.get_text() if temp else ''
        })
   
    return weather_data
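Before running the scraper against the live site, it can help to see the parsing logic in isolation. The sketch below runs the same BeautifulSoup selectors against a hypothetical, trimmed-down sample of the forecast markup (the real page is larger, but the `seven-day-forecast` and `tombstone-container` structure is what the function relies on):

```python
from bs4 import BeautifulSoup

# Hypothetical sample mimicking the forecast page's structure.
sample_html = """
<div id="seven-day-forecast">
  <div class="tombstone-container">
    <p class="period-name">Tonight</p>
    <p class="short-desc">Partly Cloudy</p>
    <p class="temp">Low: 52 °F</p>
  </div>
</div>
"""

soup = BeautifulSoup(sample_html, "html.parser")
items = soup.find(id="seven-day-forecast").find_all(class_="tombstone-container")

parsed = [
    {
        "period": item.find(class_="period-name").get_text(),
        "short_desc": item.find(class_="short-desc").get_text(),
        "temp": item.find(class_="temp").get_text(),
    }
    for item in items
]
print(parsed)
```

This produces the same list-of-dictionaries shape that scrape_weather returns, which makes it easy to verify the selectors without depending on the live page.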

We will also create a function called save_to_csv that takes the scraped weather data and writes it into a CSV file. This function opens the file in write mode, initializes a DictWriter with predefined field names, writes the header row, and then writes all the rows of data.

def save_to_csv(data, filename="weather.csv"):
    with open(filename, "w", newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=["period", "short_desc", "temp"])
        writer.writeheader()
        writer.writerows(data)
    return filename
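To check that the CSV round-trips correctly, you can feed save_to_csv a couple of made-up rows (the values below are illustrative, not real forecast data) and read the file back with csv.DictReader:

```python
import csv

def save_to_csv(data, filename="weather.csv"):
    # Same writer as in the tutorial: header row, then one row per forecast.
    with open(filename, "w", newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=["period", "short_desc", "temp"])
        writer.writeheader()
        writer.writerows(data)
    return filename

# Illustrative sample rows, shaped like scrape_weather's output.
sample = [
    {"period": "Tonight", "short_desc": "Partly Cloudy", "temp": "Low: 52 °F"},
    {"period": "Friday", "short_desc": "Sunny", "temp": "High: 68 °F"},
]
path = save_to_csv(sample, "sample_weather.csv")

with open(path, encoding="utf-8") as f:
    rows = list(csv.DictReader(f))
print(rows[0]["period"])  # Tonight
```

Reading the file back confirms the header and encoding are handled correctly before wiring the function into the UI.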

Next, we will set up an interactive UI in Colab using ipywidgets. When the “Scrape Weather Data” button is clicked, it will scrape the weather data, save it to a CSV file, display the data in a table, and provide a download link for the CSV file.

out = widgets.Output()

def on_button_click(b):
    with out:
        clear_output()
        data = scrape_weather()
        if data is None:
            print("Could not retrieve weather data. Please try again later.")
            return

        csv_filename = save_to_csv(data)

        df = pd.DataFrame(data)
        display(df)
        display(FileLink(csv_filename))

button = widgets.Button(description="Scrape Weather Data", button_style='success')
button.on_click(on_button_click)

display(button, out)

In conclusion, this tutorial demonstrated how to combine web scraping with an interactive UI in a Google Colab environment. We built a complete project that fetches real-time weather data, processes it using BeautifulSoup, and displays the results in an interactive table while offering a CSV download option.
