Updating RSS feed times without changing content
-
My question is just like the title reads: if I have an RSS feed in an XML file and, from time to time, I update the pubDate without changing the content, will this have a positive effect on my website? That is, will RSS aggregators visit my site thinking it was recently updated and create links to these pages, or will they be able to tell there is nothing new by comparing the feed against the old copy they may have stored, and thus do nothing, or maybe even hurt the website?
-
I'd suspect that if you change the date and ping RSS aggregators, they'll republish the content, which then becomes 99.9% duplicate content.
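Worth noting: most aggregators identify an item by its `<guid>`, not its `<pubDate>`, so bumping the date on an otherwise unchanged item usually accomplishes nothing, while changing the guid just manufactures a duplicate. A minimal sketch of an RSS item (the URLs are placeholders):

```xml
<item>
  <title>Same Old Post</title>
  <link>https://www.example.com/posts/same-old-post</link>
  <!-- Aggregators typically dedupe on guid: if this is unchanged,
       a newer pubDate alone usually won't make the item look new. -->
  <guid isPermaLink="true">https://www.example.com/posts/same-old-post</guid>
  <pubDate>Tue, 05 Jun 2012 09:00:00 GMT</pubDate>
  <description>Unchanged content with a freshly bumped date.</description>
</item>
```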
Related Questions
-
New SEO manager needs help! Currently only about 15% of our live sitemap (~4 million URL e-commerce site) is actually indexed in Google. What are best practices for sitemaps on big sites with a lot of changing content?
In Google Search Console, 4,218,017 URLs were submitted but only 402,035 URLs are indexed. What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content?
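For a site this size, the standard pattern is a sitemap index: each child sitemap stays under the protocol's 50,000-URL limit, and `<lastmod>` tells crawlers which chunks actually changed. A minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: each child file must stay under 50,000 URLs. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemaps/products-0001.xml</loc>
    <!-- lastmod lets crawlers skip chunks that have not changed -->
    <lastmod>2014-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemaps/products-0002.xml</loc>
    <lastmod>2014-06-03</lastmod>
  </sitemap>
</sitemapindex>
```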
Technical SEO | Hamish_TM1
-
Resubmit sitemaps on every change?
Hello Mozers, Our sitemaps were submitted to Google and Bing and are successfully indexed. Every time pages are added to our store (ecommerce), we regenerate the XML sitemap. My question is: should we resubmit the sitemaps every time their content changes, or, since they were submitted once, can we assume the crawlers will re-download the sitemaps by themselves (I don't like to assume)? What are best practices here? Thanks!
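One low-maintenance pattern (a suggestion, not from the thread itself): reference the sitemap in robots.txt so crawlers rediscover it on every visit, and keep each URL's `<lastmod>` current so they can tell what changed. A minimal sketch:

```
# robots.txt - crawlers re-read this file regularly, so the
# sitemap is rediscovered without manual resubmission.
Sitemap: https://www.example.com/sitemap.xml
```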
Technical SEO | yacpro131
-
Devaluing certain content to push better content forward
Hi all, I'm new to Moz, but hoping to learn a lot from it in hopes of growing my business. I have a pretty specific question and hope to get some feedback on how to proceed with some changes to my website.

First off, I'm a landscape and travel photographer. My website is at http://www.mickeyshannon.com - you can see that the navigation quickly spreads out to different photo galleries based on location. So if a user was looking for photos from California, they would find galleries for Lake Tahoe, Big Sur, the Redwoods and San Francisco. At this point, there are probably 600-800 photos on my website, and at least half of these are either older or just not quite up to par with the quality I'm starting to feel I should produce.

I've been contemplating dumbing down the galleries so they don't break down so far: instead of four sub-galleries for California, there would just be one California gallery. Where there are lots of good images in a location, I would probably keep the sub-galleries, but only if there were dozens of images to work with. The exact location is already mentioned in each photo's description, so I'm not sure there's a huge need for these sub-galleries except where there are still tons of good photos to work with.

I've also been contemplating building a sort of search archive: the best of my photos would live in the main galleries, and if a user didn't find what they were looking for, they could search the archives for older photos. That way the older photos are still around for licensing purposes, etc., while the best of the best are pushed to the front for those buying fine art prints. These archive pages would probably need to be de-valued somehow, so that the main galleries would be more important SEO-wise. So for California, four sub-galleries of perhaps 10 images each would become one main California gallery with perhaps 15 images; the other 25 images would be thrown in the search archive and could be searched by keyword.

The question I have: does this sound like a good plan, or will I really be killing my site SEO-wise by making such a large change? My end goal is to push my better content to the front while scaling back a lot of the excess. Hopefully I explained this question well; if not, I can try to elaborate further! Thanks, Mickey
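One common way to de-value archive pages like these (hedged: the right approach depends on the site) is a robots meta tag that keeps them out of the index while still letting crawlers follow links through to the photos. A minimal sketch for the archive pages' `<head>`:

```html
<!-- On archive/search pages only: keep the page out of the index,
     but still let crawlers follow links through to the photos. -->
<meta name="robots" content="noindex, follow">

<!-- Main gallery pages stay fully indexable: no tag needed there. -->
```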
Technical SEO | msphotography0
-
Database-driven content producing false duplicate content errors
How do I stop the Moz crawler from creating false duplicate content errors? I have yet to submit my website to Google's crawler because I am waiting to fix all my site optimization issues. Example: contactus.aspx?propid=200 and contactus.aspx?propid=201 are the same page, just with some old URL parameters stuck on them. How do I get Moz and Google not to consider these duplicates? I have looked at http://moz.com/learn/seo/duplicate-content with respect to rel="canonical" and I think I am just confused. Nick
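As a sketch of what rel="canonical" would look like here (the contactus.aspx path comes from the question; whether the parameter-free URL is the right canonical target is an assumption), every parameterized variant declares the same preferred URL:

```html
<!-- Served on contactus.aspx?propid=200, ?propid=201, etc.
     All variants point at one preferred URL, so crawlers fold
     them into a single page instead of flagging duplicates. -->
<link rel="canonical" href="http://www.example.com/contactus.aspx" />
```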
Technical SEO | nickcargill0
-
How long to recover from Panda Update
Hi there, I think I was affected by the recent Panda update, as I had a lot of duplicate content in my product descriptions (about 300). I'm going through and rewriting these to be both helpful and unique. I was ranking quite nicely for a big spread of keywords, but have been seeing my rankings drop day after day since the update. Is it possible to see my rankings improve again after Google re-crawls my site, or would a penalty have been applied to my site, preventing me from regaining my positions for some time? It's probably worth noting that I have a lot of unique and helpful content; it was just my product pages that had duplicate content, yet I've seen my rankings drop across the board. Any discussion and insight would be much appreciated.
Technical SEO | BlueTree_Sean0
-
Changing Domain Name
Hi all, A client has just got their .edu domain and they want to change their current domain (a .com) to this new .edu domain. The site's CMS is WordPress. Please correct me if I'm wrong, but basically I will need to set up the site on the new domain (they want to keep the current design), move everything across, and 301 redirect URL by URL? What about all the citations the old URLs have earned? The website is listed on Google listings/maps for some of their local keywords. Is there any way to preserve this? Thank you all in advance.
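A minimal sketch of the URL-per-URL 301s, assuming the old .com stays on Apache (typical for WordPress) and using placeholder domain names:

```apache
# .htaccess on the old .com: redirect every path to the same
# path on the new .edu, preserving the URL structure.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-site.edu/$1 [R=301,L]
```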
Technical SEO | EdwardDennis0
-
Does anyone know a sitemap generation tool that updates your sitemap based on changes on your website?
We have a massive site with thousands of pages, which we update every day. Is there a sitemap generator that can create Google sitemaps on the fly, changing only when the site changes? Our site is much too large to create new sitemaps by hand on a regular basis. Is there a tool that will run on the server and do this automatically?
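Beyond off-the-shelf tools, a scheduled server-side script is a common pattern. A minimal Python sketch, where get_changed_pages() is a hypothetical stand-in for querying your CMS or database:

```python
import datetime
from xml.sax.saxutils import escape

def get_changed_pages():
    """Hypothetical data source: in practice, query the CMS or
    database for every live URL and its last-modified date."""
    return [
        ("https://www.example.com/", datetime.date(2014, 6, 1)),
        ("https://www.example.com/products/widget", datetime.date(2014, 6, 3)),
    ]

def write_sitemap(path="sitemap.xml"):
    entries = []
    for url, lastmod in get_changed_pages():
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
            "  </url>"
        )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

if __name__ == "__main__":
    write_sitemap()  # run from cron so the file refreshes automatically
```

Scheduled nightly (cron on Linux, Task Scheduler on Windows), the sitemap stays current without anyone regenerating it by hand.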
Technical SEO | gwynethmarta0
-
Duplicate content
I am getting flagged for duplicate content. SEOmoz is flagging the following as duplicates: www.adgenerator.co.uk/ and www.adgenerator.co.uk/index.asp. These are obviously meant to be the same path, so what measures do I take to let the search engines know these should be considered the same page? I have used the canonical meta tag on the index.asp page.
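Since the canonical tag is already in place, the other common fix is a server-side 301 from /index.asp to the root, so only one version ever gets crawled. A hedged sketch, assuming the site runs on IIS (implied by .asp) with the URL Rewrite module installed:

```xml
<!-- web.config: permanently redirect /index.asp to the site root.
     Requires the IIS URL Rewrite module. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="index.asp to root" stopProcessing="true">
          <match url="^index\.asp$" />
          <action type="Redirect" url="/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```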
Technical SEO | IPIM0