December 10, 2012 at 1:35 pm #41257
Megan
Member

Hi,
Can we automatically create posts from RSS/Atom feeds we choose and customize them according to our design needs? Is this possible? I'm looking to build a website where I can add tutorials from other websites, but I don't want to add each tutorial manually. Is there a way these tutorial posts can be added automatically using other websites' feeds?
Thanks
Megan

December 10, 2012 at 1:41 pm #116755
Chris Coyier
Keymaster

Hey Megan,
My first thought is that seems kinda spammy. Lots of sites do that to this site, don't credit it, and that feels gross and wrong. I think it would be better if you hand-curated things you wanted to share. Perhaps quote a little and add your own thoughts.
December 10, 2012 at 2:01 pm #116763
Megan
Member

Hi chriscoyier,
Thanks for replying, but we're looking for a search-engine-style website for finding design resources. We won't be able to add thousands of files manually, so is there a way to pull resources from other websites and show them in the search results?
December 10, 2012 at 2:07 pm #116765
Chris Coyier
Keymaster

You could make a custom search engine that only searches those websites: http://www.google.com/cse/
December 10, 2012 at 2:12 pm #116766
Megan
Member

We're looking for a site similar to http://www.freebiepsd.com. If you look at it, it's pulling all its PSD files from downloadpsd.com. We want to use the same mechanism. Google Custom Search is not what we're looking for.
January 7, 2013 at 1:02 am #120061
ooredroxoo
Member

Hi Megan.
You can build the site with PHP and MySQL. On the database side of things you'll need to store a few things:

Sites table
Site name (the owner of the RSS/Atom feed)
Feed address (the URL of the feed)
Active (just a flag for your bot)

Articles table
Article name (usually the title in the feed)
Article URL (the URL of the article on the original site, so you can link to it)
Article description (the description, an excerpt, or the first few characters)
Article category (the category; if the feed doesn't provide one, use a regular expression in your PHP logic to categorize it)

Then on the site you will need the usual stuff: a page for showing the articles, some search mechanism, and so on.
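In case it helps, here's a rough, untested sketch of what those two tables could look like, created from PHP with PDO. The connection details, table names and column names are only examples, so adjust them to your own setup:

```php
<?php
// Untested sketch: two tables matching the structure described above.
// Host, database name, credentials, table and column names are examples.
$pdo = new PDO('mysql:host=localhost;dbname=aggregator', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Sites table: one row per feed the bot should watch.
$pdo->exec("
    CREATE TABLE IF NOT EXISTS sites (
        id        INT AUTO_INCREMENT PRIMARY KEY,
        name      VARCHAR(255) NOT NULL,          -- owner of the RSS/Atom feed
        feed_url  VARCHAR(2083) NOT NULL,         -- the URL of the feed
        active    TINYINT(1) NOT NULL DEFAULT 1   -- flag for the bot
    )
");

// Articles table: one row per article the bot has pulled in.
$pdo->exec("
    CREATE TABLE IF NOT EXISTS articles (
        id          INT AUTO_INCREMENT PRIMARY KEY,
        site_id     INT NOT NULL,
        name        VARCHAR(255) NOT NULL,   -- article title from the feed
        url         VARCHAR(2083) NOT NULL,  -- link back to the original article
        description TEXT,                    -- excerpt / first few characters
        category    VARCHAR(100),            -- from the feed, or guessed later
        FOREIGN KEY (site_id) REFERENCES sites(id)
    )
");
```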
For scraping the article names, descriptions and URLs you will need a bot. The bot downloads and reads each feed, which means parsing XML; luckily feeds follow standards, so they have common fields from site to site.
Once your bot scans a feed, it checks the database to see whether each article has already been added; if it doesn't find anything, it adds the article, then moves on to the next one.
The bot uses the feed URLs stored in the sites table. You can set up a cron job on your server to run it once a day or every six hours, or you can trigger it when someone visits the site and run it as a background task, so the user isn't left waiting for the scan to finish.
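Here's a rough, untested sketch of that bot in plain PHP with SimpleXML and PDO. It assumes the example table and column names from the schema sketch above, that allow_url_fopen is enabled, and an RSS 2.0 feed layout (Atom feeds use <entry> instead of <item>, so a real bot would check both):

```php
<?php
// Untested sketch of the bot: reads each active feed, skips articles that
// are already in the database, and inserts the rest.
$pdo = new PDO('mysql:host=localhost;dbname=aggregator', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$sites  = $pdo->query("SELECT id, feed_url FROM sites WHERE active = 1");
$exists = $pdo->prepare("SELECT COUNT(*) FROM articles WHERE url = ?");
$insert = $pdo->prepare(
    "INSERT INTO articles (site_id, name, url, description, category)
     VALUES (?, ?, ?, ?, ?)"
);

foreach ($sites as $site) {
    // Download and parse the feed; skip it if it fails.
    $feed = @simplexml_load_file($site['feed_url']);
    if ($feed === false) {
        continue;
    }

    // RSS 2.0 assumed here: articles live in <channel><item>.
    foreach ($feed->channel->item as $item) {
        $url = (string) $item->link;

        // Has this article been added yet? If so, move to the next one.
        $exists->execute([$url]);
        if ((int) $exists->fetchColumn() > 0) {
            continue;
        }

        $category = isset($item->category) ? (string) $item->category : '';

        $insert->execute([
            $site['id'],
            (string) $item->title,
            $url,
            (string) $item->description,
            $category,
        ]);
    }
}
```

You can run a script like that from the command line and point a cron entry at it (once a day, or every six hours), and it will keep the articles table topped up.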
If I were you, I would build the website in PHP and, if possible, the bot in Node.js, so it can run more in real time without needing a trigger or cron job to kick it off.
Good luck with your project.
I once wrote a scraper in PHP to get a list of all the articles on Smashing Magazine, so I could read the titles and choose what to read.

January 7, 2013 at 3:11 am #120076
Megan
Member

Thanks a lot, ooredroxoo! :)