We have a large number of product pages that contain links to a .pdf of the technical specs for that product. These are all set up to open in a new window when the end user clicks.

1. If these pages are being crawled, and a bot follows the link for the .pdf, is there any way for that bot to continue to crawl the site, or does it get stuck on that dangling page because it doesn't contain any links back to the site (it's a .pdf) and the "back" button doesn't work because the page opened in a new window?
2. If this situation effectively stops the bot in its tracks and it can't crawl any further, what's the best way to fix this?

Two fixes we've thought of: don't open the link in a new window so the back button remains functional, or create the specs on the page instead of relying on a .pdf. Here's an example page: - the technical spec .pdf is located under the "Downloads" tab. Thoughts and suggestions would be greatly appreciated.

I would spend the time needed to do an assessment of these pages:

** how many of them pull traffic from search or other sites
** how many of them are currently useful (are people looking at them)

I would delete (and redirect the URL of) any page that answers "no" to the items above. These are "dead weight" on your site.

Also, if these are .pdfs of print ads then they might simply be images in a pdf. Keep in mind that Google can read the text in some images embedded in pdfs (test this by searching for an exact phrase from one of them in quotes and include site: in the query). I had a lot of pdfs with images on one of my sites and got hit with a Panda problem, so I used rel=canonical to assign them to the most relevant page. The Panda problem was solved after a couple of months.

Also, keep in mind that you can embed "add to cart" buttons and links into pdfs and they will function just as on a web page. If any of these pdfs are pulling in tons of traffic, I would figure out how to put the pdf to better use, or create a webpage (and redirect the pdf) to best monetize/convert or whatever your business goals dictate.

Ok, so we paid the top firm in SEO to help us build an SEO strategy, and I think we have a good one. We are changing our link building tactics: making more PR-related links and creating awesome content on blogs or our own site to generate traffic and links. We have data from our engineer which should be interesting, and we are going to sponsor events, do some link baiting with some of our articles, get a PR firm to get us some good articles on major sites, and go to events around Philly where we will have unique content and a unique perspective, such as car shows etc.
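On the rel=canonical suggestion: a PDF has no HTML head to put a canonical tag in, so for non-HTML files Google reads the canonical from an HTTP `Link` header instead. A minimal Apache sketch, with hypothetical filenames and URLs (requires mod_headers):

```apache
# Serve a canonical hint alongside the PDF via an HTTP header,
# pointing search engines at the most relevant HTML page.
<Files "print-ad-2012.pdf">
  Header add Link '<https://www.example.com/print-ads/>; rel="canonical"'
</Files>
```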
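For the "delete (and redirect the URL)" step, a permanent redirect preserves any links and traffic the retired PDF had earned. A sketch in Apache config, again with made-up paths (requires mod_alias):

```apache
# 301 = moved permanently: browsers and search engines transfer
# the old PDF's URL to the replacement page.
Redirect 301 /downloads/old-spec-sheet.pdf https://www.example.com/products/widget-123/
```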
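One clarification on the crawling question: search bots don't navigate with a "back" button. They fetch pages from a queue of already-discovered URLs, so a PDF with no outbound links is a dead end for discovering *new* URLs, but it doesn't strand the crawler. A toy breadth-first sketch (all URLs hypothetical):

```python
from collections import deque

def crawl(start, link_graph):
    """Breadth-first crawl over a toy site graph.

    link_graph maps each URL to the links found on it; a PDF with no
    outlinks simply maps to an empty list. (Toy data, not a real fetcher.)
    """
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)
        for nxt in link_graph.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order

# A product page links to a dead-end PDF; another page has no links at all.
site = {
    "/": ["/product-a", "/product-b"],
    "/product-a": ["/specs-a.pdf"],
    "/specs-a.pdf": [],          # dangling page: no links back to the site
    "/product-b": [],
}
print(crawl("/", site))
```

The PDF contributes nothing new to the queue, yet every page on the site is still reached.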