
How to Fetch and Parse RSS Feeds in JavaScript

Say you have an RSS feed like this one. The goal is to request that RSS feed, parse it, and do something useful with the data in it. RSS is XML, and XML is arguably not as easy to work with as JSON. While a lot of APIs offer JSON responses, that's less typical for RSS, although it does exist.

Let’s get it done.

First, it's probably smart to validate the feed. That way you at least know you're working with a valid feed (parsing may fail on an invalid response).

Then we'll need to make a network request to the URL the RSS feed lives at. Let's use JavaScript's native fetch API since that's the most widely applicable. It works in any modern browser, and Node.js has shipped it natively since version 18 (with popular packages like node-fetch covering older versions).

What we’ll do is:

  1. Request the URL
  2. Parse the response as text
  3. Parse that text with DOMParser()
  4. Use the data like we would if we had a normal DOM reference

const RSS_URL = `https://codepen.io/picks/feed/`;

fetch(RSS_URL)
  .then(response => response.text())
  .then(str => new window.DOMParser().parseFromString(str, "text/xml"))
  .then(data => console.log(data))
  .catch(err => console.error("Feed fetch failed:", err));

We can do our work in that last .then(). RSS is sorta like HTML in that it's nested elements. Our data will look something like this:

<rss>
  <channel>
    <title>Feed Title</title>
    <item>
       <link>https://codepen.io/billgil/pen/ewqWzY</link>
       <title>A sad rain cloud</title>
       <dc:creator>Bill Gilmore</dc:creator>
    </item>
    <!-- a bunch more items -->
  </channel>
</rss>
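One wrinkle in that markup: `dc:creator` lives in an XML namespace, so `querySelector("dc:creator")` won't match it (the colon reads as a namespace separator, and escaping it as `"dc\\:creator"` is unreliable across parsers). The namespace-aware DOM method is the safer bet. Here's a small sketch, with a made-up sample string rather than a live feed; it's browser-side code, since DOMParser is a browser API (Node would need a DOM library such as jsdom):

```javascript
// The Dublin Core namespace that dc:creator belongs to.
const DC_NS = "http://purl.org/dc/elements/1.1/";

// A tiny stand-in for a real feed response.
const xml = `
  <rss xmlns:dc="${DC_NS}">
    <channel>
      <item>
        <title>A sad rain cloud</title>
        <dc:creator>Bill Gilmore</dc:creator>
      </item>
    </channel>
  </rss>`;

// DOMParser exists in browsers; guard so this is a no-op elsewhere.
if (typeof DOMParser !== "undefined") {
  const doc = new DOMParser().parseFromString(xml, "text/xml");
  // Look the element up by namespace URI + local name, not "dc:creator".
  const creator = doc.getElementsByTagNameNS(DC_NS, "creator")[0];
  console.log(creator.textContent); // "Bill Gilmore" in a browser
}
```
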

So, we can querySelectorAll for those <item> elements and loop over them to do what we please. Here, I'll build up a bunch of <article> elements with a template literal and then plop them onto the page:

fetch(RSS_URL)
  .then(response => response.text())
  .then(str => new window.DOMParser().parseFromString(str, "text/xml"))
  .then(data => {
    const items = data.querySelectorAll("item");
    let html = ``;
    items.forEach(el => {
      // CodePen serves a screenshot of each Pen at <pen-url>/image/large.png
      html += `
        <article>
          <img src="${el.querySelector("link").innerHTML}/image/large.png" alt="">
          <h2>
            <a href="${el.querySelector("link").innerHTML}" target="_blank" rel="noopener">
              ${el.querySelector("title").innerHTML}
            </a>
          </h2>
        </article>
      `;
    });
    document.body.insertAdjacentHTML("beforeend", html);
  });

I've always thought jQuery made for a nice Ajax library, and it has helpers all around. Here's how you'd do the same thing in jQuery:

const RSS_URL = `https://codepen.io/picks/feed/`;

$.ajax(RSS_URL, {
  accepts: {
    xml: "application/rss+xml"
  },

  dataType: "xml",

  success: function(data) {
    $(data)
      .find("item")
      .each(function() {
        const el = $(this);

        const template = `
          <article>
            <img src="${el.find("link").text()}/image/large.png" alt="">
            <h2>
              <a href="${el
                .find("link")
                .text()}" target="_blank" rel="noopener">
                ${el.find("title").text()}
              </a>
            </h2>
          </article>
        `;

        document.body.insertAdjacentHTML("beforeend", template);
      });
  }
});

If you're going to do this for real on a production site, it's a smidge weird to rely on a third-party API (and I'd consider RSS an API) to render important stuff on your site. I'd probably make the request server-side on some kind of timer (a cron job), cache the result, and have the front end use data from that cache. Safer and faster.
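That server-side approach could be sketched like this in Node. Everything here is illustrative (FEED_URL, REFRESH_MS, refreshFeed, and feedCache are names I'm making up, and the actual calls are left commented out so nothing fires on load):

```javascript
// Sketch: refresh the feed on a timer, keep the raw XML in memory,
// and let page renders read from the cache instead of hitting the feed.
const FEED_URL = "https://codepen.io/picks/feed/";
const REFRESH_MS = 15 * 60 * 1000; // re-fetch every 15 minutes

let feedCache = { xml: null, fetchedAt: 0 };

async function refreshFeed() {
  try {
    const response = await fetch(FEED_URL); // global fetch in Node 18+
    feedCache = { xml: await response.text(), fetchedAt: Date.now() };
  } catch (err) {
    // On failure, keep serving the last good copy rather than breaking pages.
    console.error("Feed refresh failed, keeping stale cache:", err);
  }
}

// Kick it off and keep it fresh:
// refreshFeed();
// setInterval(refreshFeed, REFRESH_MS);
```

A real version would likely persist the cache (a file, Redis, etc.) so a restart doesn't serve an empty page, but the shape is the same: one slow, occasional fetch instead of one per visitor.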