Many of us have many “homes” on the interwebs. Personally I use Twitter, Flickr, ScrnShots, and Facebook. These web services are cool enough to offer ways to interact with them and pull data out of them without even necessarily visiting the site itself. This is called an API (or Application Programming Interface).

You can think of an API as a lot like an RSS feed. If you read CSS-Tricks through a feed reader like Google Reader, you know that you don’t even need to visit the site to read my content, because that data is being served up another way. In the case of RSS, it’s an XML file formatted in a very specific way. APIs are often served up as XML, formatted however that particular application thinks it will be of most use to you. XML is cool, but much like HTML, it needs to be parsed before you can really do anything with it. Enter JSON.
JSON (JavaScript Object Notation) is what all the hip applications are serving up these days with their APIs as an alternative to XML. The cool part about JSON is that you don’t need to parse it in the same way you do XML. The data you get back from a JSON call comes back as an object, all ready to rock, letting you do stuff with it right away. Note: if that was way off or a bad explanation, feel free to correct me.
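To make "comes back as an object" concrete, here’s a minimal sketch with a made-up response string. jQuery’s $.getJSON does this step for you behind the scenes; you can also do it by hand with JSON.parse (built into newer browsers, or available via Douglas Crockford’s json2.js in older ones):

```javascript
// A made-up API response, received as plain text over the wire.
var response = '{ "items": [ { "text": "Hello world", "created_at": "1 day ago" } ] }';

// One call turns the text into a native JavaScript object --
// no tree-walking or separate parsing step like you'd need with XML.
var data = JSON.parse(response);

// Now it's just property access:
var first = data.items[0];
// first.text is "Hello world", first.created_at is "1 day ago"
```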
Using APIs (jQuery and JSON)
jQuery provides a dead-simple way of retrieving these JSON objects:
$.getJSON('http://url-to-api.com', function(data){
  $.each(data, function(index, item){
    // do stuff with each item
  });
});
The code above hits the URL provided (you’ll need to replace that with a real URL to a real API that spits out real JSON, of course) and then loops through each “item”, giving you a chance to do something with it. I say “item” because while that is a common name APIs use, it isn’t always the case, and you’ll need to be specific and accurate here, as JSON provides very little in the way of error handling.
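If it helps to demystify things, $.each over a JSON array is just a loop. Here’s a plain-JavaScript equivalent, with a made-up data array standing in for the API response:

```javascript
// A made-up array of items, shaped the way an API might return them.
var data = [
  { text: "First item" },
  { text: "Second item" }
];

// Equivalent to: $.each(data, function(index, item){ ... });
var results = [];
for (var index = 0; index < data.length; index++) {
  var item = data[index];
  // "do stuff with each item" -- here, just collect the text
  results.push(index + ": " + item.text);
}
// results is ["0: First item", "1: Second item"]
```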
Inserting Stuff Onto Your Page
Another reason to use jQuery here is how easy it makes it to insert HTML onto the page on the fly. Let’s take a look at the code example for grabbing recent “tweets” from Twitter and then using the built-in jQuery function append() to get them on the page.
$.getJSON('https://api.twitter.com/1/statuses/user_timeline/chriscoyier.json?count=10&include_rts=1&callback=?', function(data){
  $.each(data, function(index, item){
    $('#twitter').append('<div class="tweet"><p>' + item.text + '</p><p>' + item.created_at + '</p></div>');
  });
});
This will put a new div of class “tweet” onto the page (inside the parent div of ID “twitter”) for each “item” in the object. Notice the “count” parameter in the URL, which Twitter provides. It is set at 10, which will return 10 items, so there will be 10 divs on the page. Inside those divs we have two paragraph elements: one with “item.text” and one with “item.created_at”. These will be the actual text of my last tweet and when I submitted it.
Example of resulting HTML from one item:
<div class="tweet">
  <p>I wish position: relative; worked for table cells =P</p>
  <p>1 day ago</p>
</div>
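For clarity, the append line is plain string concatenation. Given a made-up item object shaped like Twitter’s JSON, building the markup for one tweet looks like this (created_at actually arrives as a raw timestamp; making it human-readable is covered in the "Cleaning Up Twitter" section below):

```javascript
// A made-up item, shaped like one entry in Twitter's JSON.
var item = {
  text: "I wish position: relative; worked for table cells =P",
  created_at: "1 day ago"
};

// Concatenate the item's properties into the markup for one tweet.
var html = '<div class="tweet"><p>' + item.text + '</p><p>' + item.created_at + '</p></div>';
// html now holds the one-tweet markup shown above
```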
Now that’s more like it! We can use our CSS skillz to style that up however we want. Like this perhaps:
.tweet {
  padding: 10px;
  margin: 5px 0;
  background: url(images/transpOrange25.png);
}
Extending The Idea
Let’s make three divs on our page: one for Flickr, one for Twitter, and one for ScrnShots. Then we’ll use our jQuery + JSON technique to fill them up.
Base HTML:
<body>
  <div id="page-wrap">
    <div id="flickr">
      <h1>Flickr Photos</h1>
    </div>
    <div id="twitter">
      <h1>Twitter Updates</h1>
    </div>
    <div id="scrnshots">
      <h1>Latest ScrnShots</h1>
    </div>
  </div>
</body>
Now here’s the jQuery to pull in and append all the data from all three services:
<script type="text/javascript" src="js/jquery-1.2.6.min.js"></script>
<script type="text/javascript">
  $(document).ready(function(){

    $.getJSON("http://api.flickr.com/services/feeds/[email protected]&lang=en-us&format=json&jsoncallback=?", function(data){
      $.each(data.items, function(index, item){
        $("<img/>").attr("src", item.media.m).appendTo("#flickr")
          .wrap("<a href='" + item.link + "'></a>");
      });
    });

    $.getJSON('http://twitter.com/status/user_timeline/chriscoyier.json?count=10&callback=?', function(data){
      $.each(data, function(index, item){
        $('#twitter').append('<div class="tweet"><p>' + item.text.linkify() + '</p><p>' + relative_time(item.created_at) + '</p></div>');
      });
    });

    $.getJSON("http://www.scrnshots.com/users/chriscoyier/screenshots.json?callback=?", function(screenshots){
      $.each(screenshots, function(index, screenshot){
        $("#scrnshots").append("<a href='" + screenshot.url + "'><img src='" + screenshot.images.small + "' /></a>");
      });
    });

  });
</script>
Cleaning Up Twitter
There are two little problems with the “raw” data the Twitter API spits out:
- Links come in “dead”. The full URL is there, but it’s just text, not a real anchor link.
- The date comes in as an ugly timestamp, not nice human-readable text like “2 days ago”.
Here are two little JavaScript functions I dug up to deal with these problems.
Linkify:
String.prototype.linkify = function() {
  return this.replace(/[A-Za-z]+:\/\/[A-Za-z0-9-_]+\.[A-Za-z0-9-_:%&\?\/.=]+/, function(m) {
    return m.link(m);
  });
};
relative_time:
function relative_time(time_value) {
  var values = time_value.split(" ");
  time_value = values[1] + " " + values[2] + ", " + values[5] + " " + values[3];
  var parsed_date = Date.parse(time_value);
  var relative_to = (arguments.length > 1) ? arguments[1] : new Date();
  var delta = parseInt((relative_to.getTime() - parsed_date) / 1000);
  delta = delta + (relative_to.getTimezoneOffset() * 60);
  var r = '';
  if (delta < 60) {
    r = 'a minute ago';
  } else if (delta < 120) {
    r = 'couple of minutes ago';
  } else if (delta < (45 * 60)) {
    r = (parseInt(delta / 60)).toString() + ' minutes ago';
  } else if (delta < (90 * 60)) {
    r = 'an hour ago';
  } else if (delta < (24 * 60 * 60)) {
    r = (parseInt(delta / 3600)).toString() + ' hours ago';
  } else if (delta < (48 * 60 * 60)) {
    r = '1 day ago';
  } else {
    r = (parseInt(delta / 86400)).toString() + ' days ago';
  }
  return r;
}
So now, instead of just putting “item.text” in your append statement, you can put “item.text.linkify()”. Likewise, instead of putting “item.created_at”, you can put “relative_time(item.created_at)”.
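To sanity-check the whole pipeline, here’s a standalone sketch applying both helpers to a made-up tweet object. The two functions are repeated (relative_time lightly restructured with early returns) so the snippet runs on its own:

```javascript
// The linkify helper, repeated so this snippet stands alone.
String.prototype.linkify = function() {
  return this.replace(/[A-Za-z]+:\/\/[A-Za-z0-9-_]+\.[A-Za-z0-9-_:%&\?\/.=]+/, function(m) {
    return m.link(m); // String.prototype.link wraps the match in an <a href> tag
  });
};

// relative_time, repeated in a compact early-return form.
function relative_time(time_value) {
  var values = time_value.split(" ");
  time_value = values[1] + " " + values[2] + ", " + values[5] + " " + values[3];
  var parsed_date = Date.parse(time_value);
  var relative_to = (arguments.length > 1) ? arguments[1] : new Date();
  var delta = parseInt((relative_to.getTime() - parsed_date) / 1000);
  delta = delta + (relative_to.getTimezoneOffset() * 60);
  if (delta < 60) { return 'a minute ago'; }
  if (delta < 120) { return 'couple of minutes ago'; }
  if (delta < (45 * 60)) { return parseInt(delta / 60) + ' minutes ago'; }
  if (delta < (90 * 60)) { return 'an hour ago'; }
  if (delta < (24 * 60 * 60)) { return parseInt(delta / 3600) + ' hours ago'; }
  if (delta < (48 * 60 * 60)) { return '1 day ago'; }
  return parseInt(delta / 86400) + ' days ago';
}

// A made-up tweet, shaped like Twitter's JSON.
var item = {
  text: "New post up at http://css-tricks.com",
  created_at: "Tue Apr 07 22:52:51 +0000 2009"
};

var html = '<div class="tweet"><p>' + item.text.linkify() +
           '</p><p>' + relative_time(item.created_at) + '</p></div>';
// The URL is now a real anchor, and the timestamp reads like "N days ago"
```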
Thanks
Functions for “linkify” and “relative_time” from Ralph Whitbeck.
Thanks to Greg Bell for whipping the ScrnShots API into shape for me!
Very cool post, thanks!
Is there some way to make this work for deviantART?
Not yet, there were some rumours about dA making an API. But I haven’t seen anything about it lately
@Majesticskull if there’s an RSS output you can always use something like simplepie to parse it
Great work, I was just recently thinking about creating a self-updating homepage for myself so this info will definitely help, all I need now is to become more active on the social media sites!
Great tutorial, any tips on loading in information from any rss feed straight into my html? Like the rss feed from my WordPress blog onto my html site?
@Thomas – Like Tim mentioned above, if you just need to parse RSS to insert onto a page, SimplePie is the way to go. I talk about using it in my iPhone/Mobile Interface article a few weeks back.
There are social network aggregators like FriendFeed (http://friendfeed.com/) and Soup (http://www.soup.io/) which do this sort of thing for you, and then offer their own API, so that you can embed a “lifestream” into your blog, facebook profile, etc. You don’t get the same fine-grained control, but they are easy and convenient to set up.
With more and more social networks popping up all the time it’s a nice idea to be able to create your own personal portal. My only concern is having to rely on these services being up (cough twitter cough) so as not to stall or break the page.
Great post! I’ve been wanting to display my twitter and flickr updates a little better and this will help.
i’ve had mine for quite some time, built with magpieRSS and a little php
http://status.jaredzimmerman.com
Hey Chris! Nice tut! I did something similar some time ago with my personal site and wrote a quick tut :)
I love the idea of life streams, and I actually built my own life feed this past month using Yahoo! pipes and a number of feeds, then a PHP script to parse everything.
Some of the feeds I used I needed to generate myself, so I have them cached on my own server, and that seems to alleviate some of the "feeds being down" issues (namely with Twitter) that Dave was alluding to. It would be really easy to build a feed from Twitter that would check whether Twitter is up before generating an updated cache, and if Twitter was down, grab the previously generated cache from your own server.
I’m so glad to see this trend really taking off!
@Jared: Very nice, I like it. It’s probably important to note the advantages and disadvantages to doing it with PHP/RSS vs. using the live API. With RSS, many sites will generate it as a flat file every so often, or even pass it through a service like Feedburner that ultimately hosts the RSS. This means that even if the site goes down, your social page will probably remain intact. Whereas if you go the live API route, your page is subject to the whims of that site. You may be able to fight it by doing some caching of your own, but the whole purpose of going the jQuery/Live API route is just how SIMPLE it is to use =)
@Jon: Also very cool. A combination of parsing RSS and using the provided Twitter widget. Obviously more than one way to skin a cat =)
@Terri: Ah yes, Yahoo! Pipes is yet ANOTHER way to go about this. I bet there are even more ways. This is the age we are living in, where all these brilliant developers are actively trying to think of ways to help us SPREAD information in as easy a fashion as possible. Very bright times for the web.
Very nice tutorial. Definitely going to give it a try! Thanks for sharing.
Nice Tutorial. Thanks for the link.
Great Tutorial! Thanks! Am definitely going to try this out when I get some free time ;-)
Nice tutorial, tnx!
I can’t use this if into my page there’s mootools, right?!
Great tutorial, but did you try it yourself?
The point is that when you try to retrieve the data on the fly using jQuery, you are sending an XMLHttpRequest to Twitter, Facebook, etc. And the problem here is that AFAIK Firefox (and I think other browsers do the same thing) prevents you from sending XHR to a server other than the one your page is generated from.
Late reaction! You know I was busy these last months and didn’t have the time to read all your posts, but on a Sunday afternoon like this I have some free time for them, and I really enjoyed this post. I’m still busy learning jQuery (and also a little Mootools lately) and this is very interesting stuff. Thnx!
Not sure if this is just me, but it doesn’t seem to validate. All the errors reference the script.
Validation URL
Something should be said about Sweetcron here, a new open source “lifestreaming” service that allows complete control over style, feeds, etc. It’s also hosted on your own domain!
Can I make my own Twitter with what you have written above?
Please … I plan to make my own Twitter.
Wow!! It’s getting better and better. Keep it up, man.
I love you so much! Great place to visit!
Could I change the size of the image (Flickr) to the square 75 x 75 format?
And does that work?
Hi Chris, really great tutorial. I don’t suppose you or anyone can tell me how I go about:
a) Limiting the number of screenshots pulled in from ScrnShots using your above method.
b) How to pull in just screenshots that are marked as Favorites in ScrnShots?
Many thanks in advance for any pointers :)