Is There a Way to Automatically Ping Your New Content?
-
Hi, I believe that if you have WordPress you can automatically ping services with your new content, but I am using Joomla and would like to know if you can do the same with this type of site.
I write lots of articles each day and it takes time to ping each article manually, so I am wondering if there is an easier way of doing it.
-
There is a field in the WordPress dashboard (under Settings → Writing → Update Services) where you can place the URLs of the services you want to ping, and WordPress will automatically notify them whenever you publish an update.
However, if you would like to do it manually and hit a list of different services in one go, try Ping-O-Matic.
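For what it's worth, most of these ping services (Ping-O-Matic included) accept the same simple weblogUpdates.ping XML-RPC call, which is why any Joomla pinger extension can talk to them. Here is a minimal Python sketch of that request; the site name and URL are placeholders, and the actual network call is left commented out:

```python
import xmlrpc.client

def build_ping_payload(site_name: str, site_url: str) -> str:
    """Serialize a weblogUpdates.ping request body (standard XML-RPC)."""
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

# Build the XML-RPC body that a ping service receives;
# "My Joomla Site" and the URL below are example values only.
payload = build_ping_payload("My Joomla Site", "http://www.example.com/")

# To actually send the ping (a network call, so commented out here):
# server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# result = server.weblogUpdates.ping("My Joomla Site", "http://www.example.com/")
```

Whatever extension you end up with is doing essentially this under the hood, once per service URL in its list.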
-
Many thanks, this is excellent. Strange, because I did search the extensions directory but must not have used the right search term. This will make my life a lot easier.
-
Yes, of course you can. Look for pinger modules for Joomla; try searching on Google.
-
Hi Diane,
Have you tried any of these? Joomla has an extension directory that lists several pinging extensions. Also, a Bing search on [joomla ping] turned up several useful results with detailed explanations.
Related Questions
-
Managing Zendesk content, specifically redirecting/retiring content?
We've been using Zendesk to manage support content, and have old/duplicate articles that we'd like to redirect. However, Zendesk doesn't seem to have a solution for this, and the suggestions we've found (some hacky JS) have not worked. I'd like for us to not just delete/hide these articles. Has anyone else successfully navigated retiring/redirecting Zendesk content in an SEO-friendly fashion?
Technical SEO | KMStrava
SEO for a static content website
Hi everyone, We would like to ask for suggestions on how to improve the SEO of our static-content help website. With the release of each new version, our company publishes a new "help" page, which is created by an authoring system. This is the latest page: http://kilgray.com/memoq/2015/help-en/ I have a couple of questions:
1. The page has an index with many links that open subpages with content for users. It is impossible to add title tags to these subpages, as everything is held together by the mother page, so it is really hard for users to find the subpage information when doing a Google search.
2. We have previous "help" pages which usually rank better in Google search. They have the same structure (one page with a big index and many subpages) and no metadata. We obviously want the latest version to rank better; however, we are afraid to exclude the old pages from search bots because the new version is not easy to find. These are some of the previous pages: http://kilgray.com/memoq/2014R2/help-en/ http://kilgray.com/memoq/62/help-en/
I would really appreciate suggestions! Thanks
Technical SEO | Kilgray
Content Duplication - Zencart
Hi guys, based on crawler results I have 188 duplicate content pages, and for some of them I cannot understand where the duplication is. The pages created are unique: all the URLs are static, and all titles and meta tags are unique. How do I remove this duplication? I am using Zen Cart as a platform. Thanks in advance for the help! 🙂
Technical SEO | sidjain4you
Auto-generated content problem?
Hi all, I operate a Dutch website (sneeuwsporter.nl); it is a database of European ski resorts and accommodations (hotels, chalets, etc.). We launched about a month ago with a database of 1700+ accommodations. For every accommodation we collected general information, like which village it is in, how far it is from the city centre, and how many stars it has. This information is shown in a list on the right of each page (e.g. http://www.sneeuwsporter.nl/oostenrijk/zillertal-3000/mayrhofen/appartementen-meckyheim/). In addition, a text about the accommodation is auto-generated based on some of the properties that also appear in that list (distance, stars, etc.).
Below the paragraph about the accommodation is a paragraph about the village the accommodation is located in; this is a general text that is the same on all the accommodation pages in that village. Below that is a general text about the resort area, which is likewise identical on all the accommodation pages in the area. So many of these village and area texts are reused across different pages.
Things went well at first: every day we got more Google traffic and more indexed pages. But a few days ago our organic traffic took a near-100% dive; we are hardly listed anymore, and where we are, it is at very low positions. We suspect Google gave us a penalty, for two reasons: we have auto-generated texts that vary only slightly per page, and we reuse the content about villages and areas on many pages.
We quickly removed the village and resort-area content, because we are pretty sure that is something Google does not want. We are less sure about the auto-generated content: is this something we should remove as well? These are normal readable texts; they just happen to be structured more or less the same way on every page.
Finally, once we have made these and maybe some other fixes, what is the best and quickest way to get Google to look at us again and show them we have improved? Thanks in advance!
Technical SEO | sneeuwsporter
Duplicate Content - Just how killer is it?
Yesterday I received my ranking report and was extremely disappointed that my high-priority pages dropped in rank for a second week in a row for my targeted keywords. This is after running them through the grade card and getting As for each of them on the keywords I wanted. I looked at my Google Webmaster Tools and saw new duplicate content pages listed, which were the ones I had just modified to improve my keyword targeting. In my hastiness to get the keyword usage up, I neglected to prevent these descriptions from coming up when viewing the page with filter parameters, sort parameters, and page parameters, so Google saw these descriptions as duplicate content (since myurl.html and myurl.html?filter=blah are seen as different URLs). So my question: is this the likely culprit for some pretty drastic hits to ranking? I've fixed this now, but are there any ways to prevent this in the future? (I know of canonical tags, but have never used them, and am not sure if they apply in this situation.) Thanks! EDIT: One thing I forgot to ask: has anyone else inflicted this upon themselves, and how long did it take you to recover?
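The parameter situation described above is exactly what rel="canonical" is for: every filtered/sorted/paged variant declares the bare page URL as canonical. A minimal sketch in Python of how the canonical URL can be derived, using the `filter`, `sort`, and `page` parameter names from the question (any real platform would plug this into its page templates):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that create duplicate variants of the same page
# (the filter/sort/page parameters described in the question).
IGNORED_PARAMS = {"filter", "sort", "page"}

def canonical_url(url: str) -> str:
    """Return the URL with duplicate-creating query parameters stripped."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in IGNORED_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

# Every parameter variant maps back to the same canonical page:
print(canonical_url("http://example.com/myurl.html?filter=blah"))
# http://example.com/myurl.html
```

The resulting URL then goes into a `link rel="canonical"` tag in the head of every variant, so Google consolidates them instead of treating each parameter combination as a separate duplicate page.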
Technical SEO | Ask_MMM
Prospective new client hit by webspam, looking for a new resource
Background: A prospective client was recently hit by the webspam update. (I have verified hundreds of low-quality links, porn links, backlink exchanges, etc.) They want us to step in, remove the bad links, and start over.
Question: What is the best way to examine all the links and determine which need to be removed? We can create the report from Open Site Explorer, but how can we identify the bad links? Here are the site metrics: 5,000+ linking domains, so in this example we would need to research those 5,000 domains and possibly send notifications to thousands of webmasters to remove the links. Open Site Explorer states about 25,000 total links, but the root-domain figures are shown below. Yikes.
Domain Authority: 75
External Followed Links: 112,000
Total External Links: 115,000
Total Links: 150,000
Followed Linking Root Domains: 3,900
Total Linking Root Domains: 5,300
Linking C Blocks: 2,700
Technical SEO | tcmktg
Duplicate Content
Many of the pages on my site are similar in structure/content but not exactly the same. What proportion of content should be unique for Google not to consider it duplicate? If a page is only around 50% unique, would it be preferable to choose one page as the canonical instead of keeping both as separate pages?
Technical SEO | theLotter
Entry-based content and SEO
My e-commerce team is implementing functionality that allows us to display different content based on which channel, and even which keyword, customers used to reach our page. This is of course a move that we believe will strengthen our conversion rates, but how will it affect our organic search listings? Do you have any examples of how this could affect us, and are there any technology pitfalls that we absolutely need to know about?
Technical SEO | GEMoney_No