How to Scrape Data from a Website into Excel
Pulling data from a website into a spreadsheet is a superpower for anyone doing research, analyzing competitors, or tracking products. This article cuts straight to the chase, showing you three distinct methods for scraping data from a website directly into Excel, from a simple built-in tool to a fully custom script.
What is Web Scraping? (And is it legal?)
Web scraping is the process of automatically extracting information from websites. Instead of manually copying and pasting data, you use a tool or script to "read" the website's code and pull out the specific data you need, then organize it into a structured format like an Excel sheet.
Before you start, it's important to scrape responsibly. Here are two quick rules of thumb:
- Check the robots.txt file: Most websites have a file at domain.com/robots.txt that tells automated bots which pages they can and cannot access. Respect these rules.
- Review the Terms of Service: The website's terms of service often mention their policy on data scraping. A quick read can save you a lot of trouble. Scraping publicly available data is generally acceptable, but scraping personal data or copyrighted content is not. Always scrape ethically and avoid overwhelming a website's server with too many rapid requests.
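If you'd like to peek at that file without leaving Excel, here is a minimal VBA sketch that fetches it (example.com is a placeholder; substitute the domain you plan to scrape):

Sub CheckRobotsTxt()
    ' example.com is a placeholder; use the site you plan to scrape
    Dim http As Object
    Set http = CreateObject("MSXML2.XMLHTTP.6.0")
    http.Open "GET", "https://example.com/robots.txt", False
    http.send
    ' The User-agent / Disallow rules print to the Immediate window
    Debug.Print http.responseText
End Sub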
Method 1: Use Excel’s Built-In "From Web" Tool (Power Query)
The easiest way to get started is by using Excel's own data import feature, powered by Power Query. This is perfect for pulling data from simple tables on websites, like stock prices, product comparison charts, or lists of statistics.
Let's use a Wikipedia page with a table of the largest cities as an example. Our goal is to pull this table into Excel.
Step-by-Step Guide to "From Web"
1. Find and Copy the URL
First, navigate to the web page containing the data you want to scrape in your browser. For this example, we’ll use a page listing the world's most populous cities. Copy the entire URL from the address bar.
2. Open the "From Web" Data Importer
In a fresh Excel workbook, go to the Data tab in the ribbon. Look for the "Get & Transform Data" section. Click on From Web. If you don't see it, it might be under Get Data > From Other Sources > From Web.
3. Paste the URL
A dialog box will appear. Paste the URL you copied into the text field and click OK.
4. Select Your Data in the Power Query Navigator
Excel will analyze the website's structure and show you a list of all the tables it detects on the page in the Navigator pane. Click through the tables on the left, and a preview of the data will appear on the right.
Find the table that matches what you want to scrape. In our Wikipedia example, we'd select the main table listing the cities, populations, and countries. Once you've selected the right one, click the Load button.
5. Edit with Power Query (Optional)
If the data needs some cleanup before loading, you can click Transform Data instead of "Load." This opens the Power Query Editor, where you can remove unwanted columns, filter rows, change data types, and more, all without altering the original website data. Once you're done, click Close & Load.
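Behind the scenes, Power Query saves every import as a short script in the M language, which you can view via Home > Advanced Editor. If you ever want to create the same query from code, here is a hedged VBA sketch; the query name, URL, and table index are illustrative, and Power Query generates something similar for you automatically:

Sub AddWebQuery()
    ' The M formula is an example; adjust the URL and the {0} table index
    ' to match the page and table you actually want
    ThisWorkbook.Queries.Add _
        Name:="LargestCities", _
        Formula:="let Source = Web.Page(Web.Contents(""https://en.wikipedia.org/wiki/List_of_largest_cities""))," & _
                 " Data = Source{0}[Data] in Data"
End Sub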
6. Your Data is in Excel!
The web data is now imported into a new worksheet as a formatted Excel table. The best part? This is a live connection. You can right-click anywhere in the table and select Refresh to pull the latest data from the website automatically.
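If you'd rather trigger that refresh from a macro (say, before a scheduled report), one line of VBA refreshes every query in the workbook:

Sub RefreshWebData()
    ' Re-runs every query and connection, including the From Web import above
    ThisWorkbook.RefreshAll
End Sub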
Method 2: Use Power Automate for More Complex Tasks
Sometimes, data isn't in a clean HTML table. It might be spread across product cards, require clicking through multiple pages, or be loaded dynamically. For these situations, Microsoft’s free tool, Power Automate Desktop, gives you much more control. It allows you to "record" your scraping actions and build an automated flow.
Let’s imagine we want to scrape product names and prices from an e-commerce category page.
Step-by-Step Guide with Power Automate
1. Install and Set Up Power Automate Desktop
Download Power Automate Desktop from the Microsoft website (it's free) and install it. Once installed, open the application and create a New flow. Give it a descriptive name like "Product Scraper."
2. Launch a Web Browser
In the "Actions" pane on the left, search for "Launch". Drag the Launch new [browser] action (e.g., "Launch new Chrome") into the main workspace. In its properties, enter the starting URL of the website you want to scrape. Set the "Window state" to Maximized for easier viewing and click Save.
3. Use the "Extract Data" Action
Next, find the Extract data from web page action and drag it into the workspace. This will open up a live web browser window with the special "Web page recorder" tools.
4. Record Your Data Extraction Steps
On the live web page, hover your mouse over the first piece of data you want to extract (e.g., the name of the first product). A red box will appear around the element. Right-click on it and choose Extract element value > Text. Do the same for the price.
As you do this for the first product, Power Automate is working to identify the pattern. When you move to the second product on the list and extract its name and price, the software will automatically identify all similar products on the page and highlight them. The extracted data populates in real-time within the "Extract data from web page" window.
When you're happy with the preview, click Finish.
5. Write the Data to Excel
Power Automate has stored the scraped data in a variable (usually called DataFromWebPage). Next, we need to save it to Excel:
- Drag the Launch Excel action into the workspace and select "with a blank document."
- Then, drag the Write to Excel worksheet action.
- In the "Value to write" field, click the
{x}icon and select your data variable (e.g., DataFromWebPage). - Specify the "Column" and "Row" to start writing from (usually A1). Click Save.
- Finally, add a Close Excel action, and configure it to save the document.
Now, run your flow by clicking the "Run" button. Power Automate will perform all of these actions automatically and leave you with a perfectly scraped Excel file.
Method 3: Create a Custom VBA Script
For maximum control and flexibility, you can write a VBA (Visual Basic for Applications) script directly within Excel. This method is best if you're comfortable with a bit of code and need to perform a highly customized scrape. It lets you interact with the webpage's HTML source code to pinpoint exactly what to pull.
Step-by-Step Guide to VBA Scraping
1. Enable the Developer Tab
If you don’t have a Developer tab in your Excel ribbon, you'll need to enable it. Go to File > Options > Customize Ribbon, and check the box next to Developer.
2. Open the VBA Editor
Go to the Developer tab and click Visual Basic. This opens the VBA Editor. In the editor, go to Insert > Module to create a new module for your code.
3. Add Necessary References
Your script will need two external libraries: one to make HTTP requests and one to parse the returned HTML. In the VBA Editor, go to Tools > References... and make sure the following are checked:
- Microsoft XML, v6.0
- Microsoft HTML Object Library
(Microsoft Internet Controls is only needed if you automate a full browser window, which this script doesn't do.)
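If those references aren't available on your machine, a late-bound variant works without them; declare the objects As Object and create them at runtime, at the cost of IntelliSense:

' Late-bound equivalents of the typed declarations used in the script below
Dim http As Object, html As Object
Set http = CreateObject("MSXML2.XMLHTTP.6.0")
Set html = CreateObject("htmlfile")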
4. Write the VBA Code
Copy and paste the following code into your module. This script scrapes a simple HTML table from a webpage.
Sub ScrapeWebTable()
    Dim http As New MSXML2.XMLHTTP60
    Dim html As New MSHTML.HTMLDocument
    Dim targetTable As MSHTML.IHTMLElement
    Dim tableRows As MSHTML.IHTMLElementCollection
    Dim rowCells As MSHTML.IHTMLElementCollection
    Dim resultsRow As Long, rowNum As Long, colNum As Long

    ' Send a synchronous GET request to the webpage
    http.Open "GET", "https://www.w3schools.com/html/html_tables.asp", False
    http.send

    ' Load the response into an HTML document we can query
    html.body.innerHTML = http.responseText

    ' Target the specific table by its ID -- you will need to inspect
    ' the website's HTML to find this
    Set targetTable = html.getElementById("customers")

    ' Start writing results in the spreadsheet from row 2
    resultsRow = 2

    If Not targetTable Is Nothing Then
        Set tableRows = targetTable.getElementsByTagName("tr")
        ' Loop through each row in the table
        For rowNum = 0 To tableRows.Length - 1
            ' Grab the data cells in this row (the header row uses th
            ' cells, so it yields no td elements and is skipped)
            Set rowCells = tableRows(rowNum).getElementsByTagName("td")
            For colNum = 0 To rowCells.Length - 1
                ' Write the cell value to the spreadsheet
                ThisWorkbook.Sheets("Sheet1").Cells(resultsRow, colNum + 1).Value = rowCells(colNum).innerText
            Next colNum
            If rowCells.Length > 0 Then resultsRow = resultsRow + 1
        Next rowNum
        MsgBox "Scraping complete!"
    Else
        MsgBox "Table with the specified ID was not found."
    End If
End Sub

5. Run the Script
This code sends a request to the URL, loads the HTML, finds the table with the ID "customers", and then loops through its rows and columns to write the data into a sheet named Sheet1 (make sure your workbook has one). To run it, go back to the VBA Editor and press the Run button (the green arrow), or press F5.
Quick Note: This script is a basic template. The key challenge with VBA scraping is correctly identifying the element you want to scrape (like getElementById("customers")). You'll often need to right-click on a web page and choose "Inspect" to find the right ID, class, or tag for the data you're after.
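When the data you're after has no convenient ID, matching on a class name is the usual fallback. Here is a minimal sketch you could drop into the script above in place of the table loop; the span tag and "price" class name are hypothetical, so swap in whatever Inspect reveals on your target page:

Dim el As MSHTML.IHTMLElement
For Each el In html.getElementsByTagName("span")
    ' "price" is a hypothetical class name; inspect the page for the real one
    If InStr(el.className, "price") > 0 Then
        Debug.Print el.innerText
    End If
Next el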
Final Thoughts
Now you have three methods to pull data from any website into Excel, each serving a different level of complexity. Using Excel’s built-in "From Web" feature is perfect for grabbing data from straightforward tables, Power Automate adds powerful automation for more complex sites, and VBA scripting offers the ultimate level of customized control.
While mastering these skills is incredibly valuable, the reality is that many of your day-to-day reporting headaches involve data an API can already hand you. Instead of manually scraping your Shopify sales numbers, CRM pipeline data, or ad campaign performance, you often have a simpler option. That's why we built Graphed to completely automate the process of bringing data into a useful format. By connecting directly to sources like Shopify, Salesforce, HubSpot, and Google Analytics, we remove the need for you to manage complex scrapers or battle with endless CSV exports: you can just ask for the report you need in plain English.