JS loading blocker
-
Is there a tool or Chrome extension I can use to load a page, identify the .js files on the page, 'uncheck' selected ones, and reload the page to check it still loads correctly? Even better would be the ability to defer selected scripts, or move them to the end of the file, to test.
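A starting point, whichever tool you end up using: the scripts a page references can be enumerated straight from the DevTools console. A minimal sketch (the function name is my own; in the browser you would pass in the real `document`):

```javascript
// List every script a document references, marking inline scripts.
// Works on the real `document` in a browser console; a stub object
// with a `scripts` array works the same way for testing.
function listScripts(doc) {
  return Array.from(doc.scripts).map(function (s) {
    return s.src ? s.src : "(inline script)";
  });
}

// In the browser console: console.table(listScripts(document));
```

That gives a checklist of candidate files to block or defer.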
-
Thanks for checking in, Mick!
-
Sorry for the delay. I got sidetracked on another project, and this client decided to leave the .js as is for the time being, so I haven't really tested. Initially I couldn't get the Chrome extension to do what I wanted, and I still need to look at Firefox.
-
Hi Mick, did you find what you were looking for? We'd love an update. Thanks!
Christy
-
Thanks. I'll give it a try and let you know.
-
Hey Mick,
I use Firebug; there is a version for Chrome, but it was originally built for Firefox.
It offers full JavaScript debugging: breakpoints, conditional breakpoints, variable watching, stepping in, and profiling.
Chrome Version Here: https://getfirebug.com/releases/lite/chrome/
Hope this helps,
Don
-
I've found this discussion about the same subject, if you want to have a look:
stackoverflow.com/questions/9698059/disable-single-javascript-file-with-addon-or-extension
Sorry, but I can't help you more than this.
Good luck
-
Thanks, that's quite handy, but not what I need in this case. This tool seems to switch off JavaScript for the whole page. I'm looking for something where I can cherry-pick the .js files on the page I want to block, or ideally move.
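For what it's worth, Chrome DevTools' Network panel offers per-URL request blocking, and the same cherry-picking can be scripted in an extension. A minimal sketch of the core matching logic, assuming a user-maintained list of 'unchecked' URL patterns (the names and patterns here are illustrative):

```javascript
// Core logic for a per-script blocker: given a request URL and a list of
// substrings the user has "unchecked", decide whether to cancel the request.
function shouldBlock(url, blockedPatterns) {
  return blockedPatterns.some(function (pattern) {
    return url.indexOf(pattern) !== -1;
  });
}

// In a Manifest V2-style extension background script, this could be wired
// into the blocking webRequest API roughly like so:
//
// chrome.webRequest.onBeforeRequest.addListener(
//   function (details) {
//     return { cancel: shouldBlock(details.url, blockedPatterns) };
//   },
//   { urls: ["<all_urls>"], types: ["script"] },
//   ["blocking"]
// );
```

Reloading the page after editing `blockedPatterns` would show how it behaves with each script removed.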
-
Hi,
You can find what you're looking for here: https://chrome.google.com/webstore/detail/quick-javascript-switcher/geddoclleiomckbhadiaipdggiiccfje
Hope it helps you.
Related Questions
-
Webmaster Tools showing 200 page load OK - all other testing tools show a 301
Hey, on https://www.xxx.co we've set up a 301 redirect to xxx.us - BUT in Webmaster Tools it's still showing a 200 load OK, whereas all other testing tools show a 301 redirect (Screaming Frog etc.); even https://dns.google.com/query?name=www.xxx.co shows that it's 301 redirected. Any ideas? We want to trigger the Change of Address tool in WMT, and it's saying it can't because it still loads the homepage...
Technical SEO | RobertN-London
-
Video & Graph That Lazy Load
Hi, Product pages on our site have a couple of elements that are lazy loaded / loaded after user action. Apart from images, which are a widely discussed topic in lazy loading, in our case videos & price graphs are lazy loaded. For videos we do something that Amit Agarwal recommended here: http://labnol.org/internet/light-youtube-embeds/27941/ - we load a thumbnail with a play button over it. When a user clicks that play button, the video embed from YouTube loads. However, we are not sure if Google gets that, and since the whole thing is under an H3 tag: will we a) lose the benefit of putting a relevant video there, or b) send any negative signals for only loading an image thumbnail under an H3 tag? We also have a price graph that lazy loads and is not seen in a cached version of our page on Google. Are we losing credit (in Google's eyes) for that content on our page? Sample page which has both the price history graph & video: http://pricebaba.com/mobile/apple-iphone-6s-16gb Appreciate your help! Thanks
Technical SEO | Maratha
-
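The light-embed technique described above can be sketched roughly as follows; the helper names are my own, but the thumbnail and embed URL formats are YouTube's standard ones:

```javascript
// URL for the static thumbnail YouTube serves for any video ID.
function thumbnailUrl(videoId) {
  return "https://img.youtube.com/vi/" + videoId + "/hqdefault.jpg";
}

// URL for the real player iframe, set to start playing immediately.
function embedUrl(videoId) {
  return "https://www.youtube.com/embed/" + videoId + "?autoplay=1";
}

// Browser-side part (needs a real DOM): on the play-button click, replace
// the thumbnail container's contents with the actual embed iframe.
function activateVideo(container, videoId) {
  var iframe = document.createElement("iframe");
  iframe.src = embedUrl(videoId);
  iframe.allowFullscreen = true;
  container.innerHTML = "";
  container.appendChild(iframe);
}
```

Until the click, the page carries only an image and a URL string, not the player.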
Fetch as Google - stylesheets and js files are temporarily unreachable
Fetch as Google often says that some of my stylesheets and js files are temporarily unreachable. Is that a problem for SEO? These stylesheets and scripts aren't blocked, and Search Console shows that a normal user would see the page just fine.
Technical SEO | WebGain
-
Site not loading on Firefox
Hello guys, I can't get my website to load in Firefox - why's that?
Technical SEO | PremioOscar
-
Will a google map loaded "on scroll" be ignored by the crawler?
One of my pages has two Google maps on it. This leads to a fairly high keyword density for words like "data", "map data", etc. Since one of the maps is basically at the bottom of the page, I thought of loading it "on scroll" as soon as its container becomes visible (before loading, the map div would be empty). Will the map then still be crawled by Google (can they execute the JS in a way that the map is loaded anyway?), or would this help reduce the keywords introduced by the maps?
Technical SEO | ddspg
-
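An on-scroll map load like the one described could be sketched with an IntersectionObserver; `initMap` is a placeholder for whatever actually builds the Google map:

```javascript
// Load the map only once its (initially empty) container scrolls into view.
// `initMap` stands in for the real map-construction code.
function watchContainer(container, initMap) {
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        observer.disconnect(); // load once, then stop watching
        initMap(container);
      }
    });
  });
  observer.observe(container);
}

// Pure helper, usable as a scroll-event fallback for older browsers:
// is the element's top edge above the bottom edge of the viewport?
function isInViewport(elementTop, scrollY, viewportHeight) {
  return elementTop < scrollY + viewportHeight;
}
```

Whether a crawler executes this and sees the map content is a separate question from whether the pattern works for users.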
Remotely Loaded Content
Hi Folks, I have a two-part question. I'd like to add a feature to our website where people can click on an ingredient (we manufacture skin care products) and a tooltip-style box pops up and describes information about the ingredient. Because many products share some of the same ingredients, I'm going to load this data from a source file via AJAX. My questions are: 1) Does this type of remotely fetched content have any effect on how a search engine views and indexes the page? Can it help contribute to the page's search engine ranking? 2) If there are multiple pages fetching the same piece of remotely fetched content, will this be seen as duplicate content? Thanks! Hal
Technical SEO | AlabuSkinCare
-
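A rough sketch of the shared-ingredient idea: wrap the AJAX call in a small cache so a description shared by many products is fetched at most once per page view. `fetchText` stands in for the actual request (e.g. something built on `fetch`):

```javascript
// Returns a lookup function that memoizes results per ingredient name,
// so repeated tooltips for the same ingredient reuse one AJAX response.
function createIngredientCache(fetchText) {
  var cache = new Map();
  return function (name) {
    if (!cache.has(name)) {
      cache.set(name, fetchText(name));
    }
    return cache.get(name);
  };
}
```

The injected `fetchText` also makes the caching logic easy to test with a fake fetcher.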
Will loading ads in an iframe increase page response time?
We are experiencing slow response times because our pages are ad-heavy. If we load ads in an iframe, will the Google bots, when indexing the page, count the time it takes for the content within the iframe to load? Or is that load time separate from the total page response/load time?
Technical SEO | kbbseo
-
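One common pattern for keeping ad iframes out of the initial page load (a sketch of the technique only, not a claim about how Google scores it): hold the real ad URL in a `data-src` attribute and only set `src` after the window load event has fired:

```javascript
// Copy data-src into src for each deferred iframe, which is what actually
// triggers the browser to load the framed content.
function activateDeferredFrames(frames) {
  frames.forEach(function (frame) {
    var src = frame.getAttribute("data-src");
    if (src) {
      frame.setAttribute("src", src);
    }
  });
}

// In the page:
// window.addEventListener("load", function () {
//   activateDeferredFrames(
//     Array.from(document.querySelectorAll("iframe[data-src]"))
//   );
// });
```

This way the main document's load event is not held up waiting for the ad servers.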
Mask links with JS that point to noindexed pages
Hi, in an effort to prepare our page for Panda we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are non-indexed via "noindex, follow" - we might de-index them with robots.txt, though, if the "site:" query doesn't show improvements. Thanks, Sebastian
Technical SEO | derderko
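A minimal sketch of the masking idea (the path prefixes and names are made up for illustration): first decide whether a link targets one of the noindexed filter pages, then swap the anchor for a span that navigates via JS, so no crawlable href remains in the markup:

```javascript
// Does this href fall under one of the noindexed filter paths?
function pointsToNoindexedPage(href, noindexedPrefixes) {
  return noindexedPrefixes.some(function (prefix) {
    return href.indexOf(prefix) === 0;
  });
}

// Browser-side part (needs a real DOM): replace a matching <a> with a
// span that navigates on click instead of exposing an href.
function maskLink(anchor) {
  var span = document.createElement("span");
  span.textContent = anchor.textContent;
  span.addEventListener("click", function () {
    window.location.href = anchor.getAttribute("href");
  });
  anchor.parentNode.replaceChild(span, anchor);
}
```

Whether masking is worthwhile given "noindex, follow" is the open question here; the sketch only shows the mechanics.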