Advice on upgrading from Joomla V1.5.17 to latest
-
Hi there
I've recently taken on managing www.wetspotcharters.com.au and, to my horror, after eventually finding a way to log into the back end, I've found that it's running Joomla version 1.5.17.
Would anyone have any advice on how to upgrade this to the latest version or, preferably, migrate the site to WordPress without losing much of its current look and feel (which will eventually be replaced one page at a time)?
Thank you in advance
-
I have worked a lot with Joomla 1.5. You will need to upgrade from 1.5 to (I think) 1.7 first; then you should be able to go to 3.
But upgrading from 1.5 is going to be an absolute nightmare - do not waste your time. I HIGHLY recommend just transferring everything over to WordPress. Try to find a template on ThemeForest that is similar to your site; it will save you so much trouble.
-
I just crawled your site with a desktop crawler and found that you have just 67 internal HTML pages. I know there is an export from Joomla and an import into WP, but in reality things weren't as advertised. So here is my idea:
- Keep the current Joomla site as-is for now.
- Make a copy of the site with a website downloader (Teleport Pro, SiteSucker, etc.) as a backup, just to be safe.
- Create the new WP site and set up SEO, GA and the silo structure. Do this on an internal server.
- Create pages with "nice" URLs and make 301 redirects from the old pages to the new ones in .htaccess.
- Copy-paste content from the saved site into WP; write titles, meta descriptions and SEO image attributes (alt, title, etc.).
- Create the internal links between pages again, and do on-page SEO for the pages and links.
- Trash Joomla.
- Move WP from the internal server to the public one. Copy the .htaccess redirects too.
- Check for errors, check the web server log files for errors, etc.
- That's all!
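The 301-redirect step above can be sketched like this in .htaccess; all the paths and slugs here are hypothetical examples, not the real site's URLs, and each migrated page needs its own line:

```apache
# 301 redirects from old Joomla URLs to the new WordPress pages.
# Paths and slugs below are hypothetical examples - adjust per page.
RewriteEngine On

# Plain old paths can use a simple mod_alias Redirect:
Redirect 301 /about-us.html https://www.example.com/about/

# Joomla's non-SEF URLs carry their identity in the query string,
# which Redirect cannot match, so they need mod_rewrite conditions.
# The trailing "?" on the target strips the old query string.
RewriteCond %{QUERY_STRING} ^option=com_content&view=article&id=12$
RewriteRule ^index\.php$ https://www.example.com/fishing-charters/? [R=301,L]
```

One rule per page is tedious but fine at 67 pages, and it keeps the old URLs' search value flowing to the new ones.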
The timeframe for this will be a few days (up to six if you process 10 pages per day). You would lose almost the same amount of time migrating from 1.5 to 3.x, but I would still suggest moving to WP, because its migration process for future versions is almost one click compared to Joomla's. I don't know why Joomla didn't make an easy update process for their customers... it has frustrated me a few times.
-
Have you looked at the docs? They have an article about how to migrate from 1.5 to 3.x
https://docs.joomla.org/Joomla_1.5_to_3.x_Step_by_Step_Migration
If you've not done it before, I would suggest using a migration service (just Google "Joomla migration"). You're less likely to run into show-stopping issues. For a long-lived website I'd pay for the service just for the peace of mind.
Related Questions
-
Making Shopify URL's Simpler - Losing the words 'collection', 'product' and 'page' in a Shopify store URL. Any advice?
Hi Mozers! I have a Shopify store (which has many advantages), but one big SEO disadvantage: my URL structures contravene all Moz advice on dynamic URL structure, and what's more, I am reminded of this every week when my Moz site crawl flags a batch of URLs that are longer than 75 characters. A Shopify URL runs www.domainname.com/collections/collection-name/products/product-name, whereas according to the advice it should be www.domainname.com/collection-name/product-name - don't even get me started on sub-collections! I sell portfolio books, albums and keepsake memory boxes (so, long keywords), AND I have a long(ish) business name. So, for user experience and keyword length, do I just give up on trying to achieve a dynamic URL under 75 characters? When I have asked Shopify, they say their URLs are an integral part of the "Ruby on Rails" system, so nothing can be done. Or can it? I can't be the only Moz member with this issue, can I?
On-Page Optimization | nick_HandCo
-
Joomla Home Page Title Tag Issue
I have a site that requires a custom title tag for its home page (primarily). It's using Joomla 3.3.1, and I am not using any SEO components as of now. I have checked the global configuration; it has Polymer Resources as the site name. I am also attaching a copy of Menu Manager > Main Menu > Home. I would like the home page browser title to read: Custom Engineered Plastic Resin | Polymer Resources. But my current settings do not let me change the home page title. Any help in this regard will be highly appreciated. Thank you in advance. (Attached screenshots: 7yfHVw0.png, KzouLB7.png)
On-Page Optimization | ArthurRadtke
-
Duplicated Content with joomla multi language website
Dear SEOmoz Community, I am running a multi-language Joomla website (www.siam2nite.com) with 2 active languages. The first and primary language is English; the second language is Thai. Most of the content (articles, event descriptions, ...) is in English only. What we did is a Thai translation of the navigation bars, headers, titles etc. (a translation of all Joomla language files); those texts are static and only help users navigate and understand our site in their Thai language.

Now I am facing a problem with duplicated content. Let's take our Q&A component as an example. The URL structure looks like this:

english - www.siam2nite.com/en/questions/
thai - www.siam2nite.com/th/questions/

Every question asked will create two URLs, one for each language. The content itself (user questions & answers) is identical on both URLs; only the GUI language is different. If you take a look at this question you will understand what I mean:

ENGLISH VERSION: http://www.siam2nite.com/en/questions/where-to-celebrate-halloween-in-bangkok
THAI VERSION: http://www.siam2nite.com/th/questions/where-to-celebrate-halloween-in-bangkok

As you can see, each page has a unique title (H1) and introduction text in the correct language (same for the menu, buttons, etc.), but the questions and answers are only available in one language.

Now my question 😉 I guess Google will see these pages as duplicated content. How should I proceed: put all the Thai links (/th/questions/) in robots.txt and block them, or set a canonical tag pointing to the English versions? I'm not sure whether, if I set a canonical tag, Google will still index the Thai titles and introduction texts (they have important Thai keywords in them). Would really appreciate your help on this 😉 Regards, Menelik
On-Page Optimization | menelik
-
Advice with keywords - category - Forum
Hiya guys. Everyone has been really good to me on here; I just wanted a bit of advice about the keywords on my forum. My website is a nightlife forum for the UK, and each city has its own section. Each section has a category title like: _What's on in Birmingham? Club Nights, Upcoming Events, Promotions_. Should I drop the "Club Nights, Upcoming Events, Promotions" part and put it in the forum description instead? So the title would just be "What's on in Birmingham?" with a description like "Find club night information, upcoming events and pr.............". I'm just wondering whether the current setup stops searches like "Club nights in Birmingham" etc. from being targeted. Your thoughts please, guys. Thanks for reading. Lukescotty
On-Page Optimization | Lukescotty
-
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (I have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining: http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP, or we pull from their server; these are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs when the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually; this helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us; some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked out what it is. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even one that is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done; what they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites they consistently rank above us that have all the same content as us and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different - so this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates: some due to screw-ups because we had multiple editors who didn't see that a story was already published, and at one time a race condition in the system code - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects. I haven't sent in a reconsideration request for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't.

So, having exhausted all of the things I can think of, I'm down to my last ideas:

1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throwing all our alliances under the bus. (I really don't want to do this either.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped.

My thought is that splitting the PRs off into subdomains will have a number of effects:

1. Take most of the syndicated content onto subdomains, so it's not on the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:

1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered:

1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages:
a) all of the releases for a particular PR company - these could certainly be on the subdomain and not on the main domain;
b) various category index pages (agriculture, supermarkets, mining, etc.) - these would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea? I'm almost out of ideas. Should I add a condensed list of everything I've done already?

If you are still reading, thanks for hanging in.
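The first step of the plan above (redirecting each PR to its provider subdomain) can be sketched as mod_rewrite rules in .htaccess; the /pr/... path pattern and example.com host are hypothetical stand-ins for the site's real URL structure:

```apache
# Hypothetical: send old press-release URLs on the main domain to the
# per-provider subdomains, keeping the rest of the path intact.
RewriteEngine On

# Only rewrite requests that hit the main www host.
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^pr/prnewswire/(.*)$ http://pn.example.com/$1 [R=301,L]

# One rule per provider, e.g. Businesswire:
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^pr/businesswire/(.*)$ http://bw.example.com/$1 [R=301,L]
```

A pattern-based rule like this covers all releases from a provider at once, which matters far more at a million pages than a per-page redirect list would.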
On-Page Optimization | loopyal
-
Latest SEO Factors
What is the most recent report you have on the "factors" that impact SEO? You had one a few years ago, and I don't see a more recent report. It was based on a panel of experts, I believe.
On-Page Optimization | mommybu
-
Followed seomoz advice for on page tweaks but dropped
Hi all. We followed SEOmoz advice regarding on-page tweaks for this page: http://www.compactlaw.co.uk/agency-agreement.html and dropped 10 places in Google UK. Obviously it could be a coincidence, but any on-page critique would be very gladly received. (We also seem to have been dropped by Google across the board in the past week or so - the usual trouble of trying to find causes. But specific page advice would be a start and much appreciated.) Regards, P
On-Page Optimization | dexm10
-
Google seems upset that I took their advice. [Titles and alt tags for images.]
Hey all, I accidentally posted this as a private question and now want to post it publicly due to some updates (for the worse). I'm a photographer, and the site I'm talking about is my portfolio site. It is very image-heavy and had basically no text. Those who have consistently beaten me (positions 1, 2, etc.) in SERPs for my key search phrases have a modest amount of text on their pages. I'd been doing OK in SERPs (top 3-5 for my key search phrases) over the past couple of years, and my site has decent age and domain authority (a good number of relevant inbound links from extremely reputable sources over the years, etc.). [In case it matters, my root domain has a PageRank of 4 and I have a couple of internal pages with PR5.] For years I resisted adding any text because I was trying to obey Google's rule to design "for people, not search engines." Over the past couple of months, though, I got some advice on the SEOmoz webinar about adding (relevant) alt text and body text, and also read Google's Webmaster Central article about giving images good titles and alt tags, so I decided to take the plunge about ten days ago. I went through the site and added modest amounts of relevant text to pages where it was appropriate and where it didn't detract (too much) from the design. I made sure my images had sensible, human-readable alt tags that were descriptive, and made sure not to do any keyword stuffing. Finally, I edited some of my page titles so that they were a little more descriptive. Again, nothing extreme, radical or spammy. (But overall, especially from Google's perspective, there were some fairly significant changes in a short period of time.) Well... you're all already guessing what's next. As soon as Google saw these changes, I tanked pretty badly. I went from positions 3-5 on my key phrases to positions like 16-25 and spent a few days there. Now I'm just gone and buried somewhere in Google's boneyard.
My latest ranking report for today shows me "not in top 50" for any of my key phrases on Google. I'm #1 for many of those same terms/phrases on Bing and Yahoo. (Always have fared very well with them.) Google's webmaster tools says my sitemap is OK and most of the URLs submitted are in the index. Please tell me this is temporary, while Google deals with my changes? (Actually don't, just tell me what you really think.) 🙂 Thank you all...
On-Page Optimization | vdms