This is a live blog so please be aware there will be typos and mistakes.
Aaron Sheer, Ian Lurie, Rob Garner, Eric Enge
- If you are still doing SEO 101 and have over 100K pages, stop optimizing pages one at a time. That will not scale your business. Don’t worry about H1s and alt tags; it’s a waste of your time.
- Are you doing things to scale or things to grow?
What is Google looking for?
- Site performance
- URL structure
- Site structure
- Meat! Substance. The difference between good and bad copy is not paying some company in India to write poor English. That is entertaining, but it is not good copy and will not help you.
Do you know how fast your site is?
Five seconds is not good. Go to Google Webmaster Tools, look at your site performance, and know what your score is. Google is collecting load-time information from your users’ browsers and using it to rank you. If your pages take too long to load, there is no reason to show them. Never go above four seconds. Here are some ways to lower your load time. Being in the green on the chart means you are faster than 85% of other sites; 2 to 2.5 seconds is a great place to be.
Lots of speed factors.
Leverage browser caching and set it to at least a month. It’s done on the server and hard to find: you have to tell the server that your files should be cached for a certain period. It can be done for your CSS and JavaScript files. Google can re-rank your load times within an hour when you do this. Minify your CSS and .js files: combine all of your CSS into one file so the browser doesn’t have to call the server in two places; it’s all done in one.
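For an Apache server, that caching directive might look like this minimal sketch (assumes the mod_expires module is enabled; the one-month windows follow the advice above but are otherwise illustrative):

```apache
# Tell browsers to cache static assets for at least a month,
# assuming mod_expires is loaded in this Apache install.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```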
Enable Gzip compression. Just running Apache is not enough; you have to actually activate Gzip. It was created in the ’90s and will compress all the files together so they come down at once.
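Activating it on Apache might look like this sketch (assumes the mod_deflate module; the content types listed are a typical, not exhaustive, set):

```apache
# Compress text assets before sending them, assuming mod_deflate is loaded.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript text/plain
</IfModule>
```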
- The combination and compression of images on your page.
- Instead of pulling down 50 small images, you pull down one combined file and the browser does the rest. This really helps.
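One common way to do that combining is a CSS sprite; the class names and the icons.png file below are hypothetical:

```css
/* All icons live in one combined image; each class just shifts the viewport. */
.icon        { background-image: url(/img/icons.png); width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
```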
CDN – Content Delivery Network
Are you using WordPress?
Install the W3 Total Cache plugin. It compresses and caches your files and will help reduce your load time.
Where do I look or start the process?
- Install Firefox and the Firebug add-on.
- Install the Google Page Speed Tool.
- If you don’t see lots of little green buttons, then read, test, read again, test, then do it again. Perfect for ADHD developers. The goal is to get as many greens and as few reds as you can in the reports.
- Yea yea, I know I have heard enough about the cloud. It is not Microsoft.
- I am talking about cloud computing, clustering for scale.
- Rackspace actually has both a cloud and a CDN. No, it’s not cheap, but it’s a good place to start for tuning your site to be a lot faster.
Is there more?
- I could easily write 1,000 pages about this.
- This is simply to open your eyes to the importance of site performance.
Optimize for scale, not one particular keyword.
Log files get their sexy back
Log files are the glue that keeps your seo strong and held together. Otherwise you’ll be flat on your *ss on the asphalt.
1. Get a bot list: a list of the search engine spiders, including names and IP addresses. Fantomaster Spider Spy is the best.
2. Get your server log files. Find a way to get them; he recommends torture as a great way to convince your host or webmaster to hand them over. It is nerdtastic!
3. Onsite SEO: Match bots to visits. Look for when the spiders are visiting your site. You’ll be able to see how spider crawl time is being spent on your site and what is getting crawled. Once you have the bot hits, export them into Excel and filter; look for the search engines’ patterns.
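A rough sketch of the bot-matching step: scan raw log lines for known crawler user-agents. The two patterns and the sample lines below are illustrative, not a real bot list (the talk recommends Fantomaster Spider Spy for that).

```python
import re
from collections import Counter

# Illustrative crawler patterns; a production list would come from a bot database.
BOT_PATTERNS = {
    "Googlebot": re.compile(r"Googlebot", re.I),
    "bingbot": re.compile(r"bingbot", re.I),
}

def bot_visits(log_lines):
    """Count hits per known bot across a list of raw log lines."""
    counts = Counter()
    for line in log_lines:
        for name, pattern in BOT_PATTERNS.items():
            if pattern.search(line):
                counts[name] += 1
    return counts

# Made-up sample lines in Apache combined-log style.
sample = [
    '66.249.66.1 - - [t] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '157.55.39.1 - - [t] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; bingbot/2.0)"',
    '10.0.0.5 - - [t] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 6.1)"',
]
counts = bot_visits(sample)
```

From here the per-bot counts can be exported and filtered in Excel, as the step describes.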
4. Look for image problems and patterns. Look for the Google image bot and the Bing image bot: how many images got hit by the bots, and are they getting indexed? You can see if images that don’t need to be transferred are being transferred; you’ll also see your navigation buttons showing up. Look for the transfers.
5. Onsite SEO: Page issues. Look for things like session IDs in crawl results. Session IDs mean the engines are getting duplicate content from your site: the same page under different session IDs. Look for 302 redirects, which are bad (301s are good). Then look at how many bytes of information are being transferred. Put it in a pivot table and look for changes in patterns over time, then measure that against changes (or no changes) to your site and figure out what happened.
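The page-issues step could be sketched like this: flag session IDs in URLs, count 302s, and total up bytes transferred. The regex assumes Apache combined-log format, and the sample lines are made up.

```python
import re

# Matches the request, status code, and byte count in a combined-log line.
LOG_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" (\d{3}) (\d+)')

def crawl_issues(log_lines):
    """Summarize session-ID URLs, temporary redirects, and bytes transferred."""
    issues = {"session_id_urls": [], "temp_redirects": 0, "bytes": 0}
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        url, status, size = m.group(1), m.group(2), int(m.group(3))
        issues["bytes"] += size
        if "sessionid=" in url.lower():   # duplicate-content risk
            issues["session_id_urls"].append(url)
        if status == "302":               # temporary redirect: should be a 301
            issues["temp_redirects"] += 1
    return issues

sample = [
    '1.2.3.4 - - [t] "GET /page?SESSIONID=abc HTTP/1.1" 200 5120',
    '1.2.3.4 - - [t] "GET /old-page HTTP/1.1" 302 0',
    '1.2.3.4 - - [t] "GET /home HTTP/1.1" 200 2048',
]
report = crawl_issues(sample)
```

A real run would feed these totals into the pivot table described above to watch the patterns over time.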
6. Offsite SEO: Links you have, but don’t. Look for links that are broken. Run another script that finds all of the 404s on your site that came from external sites. Then find the source, contact them, or restore the page to get the link back. Use the keywords, etc…
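That script could look like this sketch: pull out 404 hits whose referrer is an external site. The log format and the example.com hostname are assumptions for illustration.

```python
import re
from urllib.parse import urlparse

# Matches the path, status code, and quoted referrer in a combined-log line.
LOG_RE = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3}) \d+ "([^"]*)"')

def broken_inbound_links(log_lines, our_host="example.com"):
    """Return (requested_path, external_referrer) pairs for 404 responses."""
    found = []
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        path, status, referrer = m.groups()
        if status != "404" or referrer in ("-", ""):
            continue
        ref_host = urlparse(referrer).netloc
        if ref_host and ref_host != our_host:  # only links from other sites
            found.append((path, referrer))
    return found

sample = [
    '1.2.3.4 - - [t] "GET /moved-page HTTP/1.1" 404 512 "http://blog.other-site.com/post"',
    '1.2.3.4 - - [t] "GET /missing HTTP/1.1" 404 512 "http://example.com/internal"',
    '1.2.3.4 - - [t] "GET /home HTTP/1.1" 200 2048 "-"',
]
links = broken_inbound_links(sample)
```

Each pair tells you which external page is pointing at a dead URL, so you know whom to contact or which page to restore.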
7. Do it. Making a list of recommendations doesn’t do anything. Get off your *ss and do it. You don’t get top rankings because you know the information. You get the results because you do the work.
Here are the basics. Go through this list at least 3 times. Then you can become an Advanced SEO.
Digital publishing is the new marketing. Content is the new marketing… and brands and marketers need to get into the business of being a publisher. Publishing is words, thoughts, status, images, feeds, video, applications, conversation and more. Publishers must also disseminate information in networks to capture links and external influence. This means that new content should be distributed to Twitter, Facebook, email lists, etc…
Social Link Dissemination.
Shift in linking measurement.
- Links are the cornerstones of natural search algorithms.
- But the influence of links has shifted greatly from webmasters and technical influencers to the average internet user.
- As a result, the social graph has taken a massive bite out of the link graph.
- Examples of this include: tweeting, publishing on a blog, ratings, commenting and posting, bookmarking, etc…
Aug. 2010: It’s official. Tweets count as search signals.
- August SES San Francisco
- Yes, tweets with links are treated differently in the algorithms and effectively count as links (Tobias Peggs from OneRiot, Dylan Casey from Google, and Paul Yiu from Bing/Microsoft). A tweet counts even more in aggregate. Tweets have to have links. Social authority also matters: the quality of the people following you counts, and the more spammy followers you have, the worse off you are as an influencer.
Twitter link case study.
“We were hit hard in the Farmer update, due to duplicate content issues, even though we are the primary content source. The only pages that were not affected were pages that had tweeted links from real, non-spammy users.” – Site Manager.
Passive Network distribution for content.
Press releases and content syndication networks.
Active network distribution.
Social networks. Find a way to get it to the people who want to read it and share it. Try to get it to travel around the world in seconds.
What is the big deal about real time?
- To be successful, real time needs two unique elements: a crawler-based algorithm and a human-driven social layer.
- Just like image search, local search, news search, etc., real-time search is an important segment of web search.
- Trust and authority are paramount to its success. That means a quality network of followers, etc…
The new R word is Recency. Get your posts in the right time frame.
Social Relevancy: How engines look at Twitter.
Engage one bird and you might attract the whole flock.
- Engaging users in a search via content translates to spreading of content.
- Engaging users in networks allows content to spread like wildfire.
The Panda Update.
Panda targeted at content farms? Not true.
Broader than that
Real Question: How good is your content?
- Google Analytics – Did the user view only one page?
- Google Toolbar – One Page? How long on the page?
- Search Results Interaction.
- Google compares this to your competition since they have that with the toolbar.
Time on Site:
- Hard to measure with analytics, which measures time from the first page view to the start of the last page view.
- Easier to measure with the Google Toolbar, which can see when users go to another site. Analytics cannot see this, which is why analytics may not be right; the toolbar is a better resource.
- Google compares this to your competition since they have that with the toolbar.
- Chrome Blocklist Extension
- A Chrome plug-in that lets users say “I don’t want to see this site in my search results.” Google is using this data, and some companies are saying they saw it hurting certain sites with the update.
Click on the magnifying glass and show the preview. If you preview it but don’t go there then that is a bad sign for that site.
Ask yourself, is this you?
Don’t be the same. Do you really need 2.5 million results on how to make French toast? No, so what makes you authentic, and who can agree with you? (I added the last part of that sentence in.)
You must find ways to add value. Thin content is bad: e-commerce sites that only use a manufacturer’s description and nothing else. There is nothing special about it, and it probably exists everywhere else. Outsourcing to India is not good either; the quality may end up just as bad.
Rumor, there is no white list or black list. This is just an update.
eHow escaped because they were one of the first and they have great content. Because they were first, they can be seen as the original; everyone else looked like a copy and got hurt.
How Ranking Used to Work – the make-up of importance for your site.
- 24% Trust/Authority of Domain
- 22% Link Popularity of Page
- 20% InBound Anchor text
- 15% Keyword usage
- 7% Traffic Click through data
- 20% Engagement metrics – added in afterward by Eric Enge.
- 4% Registration/hosting data
How to repair?
User-generated content. If you don’t have enough traffic, try a Facebook contest or something.
How to improve
- Enticing cross links
- Great tools/content
This session kicked ass. I highly recommend seeing some of these speakers if you get a chance.