MySpace Looking to Fill its Advertising Space
MySpace, the website that once reigned as king of the social networking world, is looking for suitors to fill its ad space. News Corp., the media conglomerate that owns MySpace along with other popular entities such as the New York Post and Twentieth Century Fox, is currently trying to fill the void that will open when its contract with Google runs out at the end of August. Besides Google, suitors for the ad space include Yahoo and Microsoft.
The advertising contract began in 2006 after News Corp. chose Google's offer over Microsoft's and Yahoo's. Google agreed to make payments to News Corp. that would total $900 million for the right to sell advertising space on MySpace as well as other websites owned by the corporation. At that time, MySpace was an Internet sensation. Such is not the case now, however, and that means that the next advertising deal will likely be for much less than the current one. There were several benchmarks set in the Google contract for things such as web traffic that MySpace has failed to reach, making the site now seem less attractive and profitable than it once was.
MySpace is not only experiencing trouble with decreasing traffic numbers and the prospect of a less lucrative advertising deal. In the past few months, former CEO Owen Van Natta was forced out after just ten months on the job, and co-president Jason Hirschhorn has also left the company. Thirty percent of MySpace's workforce has been let go. To add insult to injury, News Corp. took a $450 million hit last year due to MySpace's decreasing popularity, along with some of the company's other struggling websites.
The main reason for MySpace's current slump is the growing popularity of its social networking competitor, Facebook. According to comScore Inc., a market research company, MySpace's traffic in May 2010 was down 13 percent compared to the previous year. Facebook, meanwhile, saw an increase in traffic of 74 percent, with 548 million unique visitors around the globe. MySpace had just 109 million unique visitors. Many analysts blame MySpace for failing to keep pace with Facebook and find ways to make the site better and more innovative in terms of features and design. MySpace refuses to go down without a fight, however, and is currently in a renovation stage.
At the heart of the MySpace renovation is a demographic focus on users between the ages of 13 and 34. MySpace has usually been seen as a social networking site for the younger crowd, and they hope they can capitalize on that niche. In addition to the focus on the younger demographic, MySpace hopes to capitalize on complaints by many that Facebook lacks sufficient privacy features. They also vow to make the site a place for entertainers such as musicians and other artists by giving them ways to interact with fans and receive feedback.
Although it should be interesting to see what changes are instituted on MySpace, it is highly questionable whether the site can regain its former status. Facebook is the hot item right now, and other sites like Twitter are taking pieces of the pie that once belonged to MySpace. Add in smaller advertising revenue from the upcoming contract with Google, Microsoft, or Yahoo, and MySpace has quite a lot of climbing to do to get back to the top.
-------------------------------------------------------------
Improving Onsite SEO: Tips from Google
This is the third (and last) part of our summary article on the SEO site review session panel held by Matt Cutts and other important Google representatives. Keep reading for important tips and suggestions from that panel for improving your onsite SEO. At the end of this article, you'll find a link for downloading the tips from all three articles in checklist format.
Tip #29: “Make sure there is indexable text-based content visible to Googlebot on your optimized page or website.”
This is one of the common mistakes made by SEOs. You need to remember the basics, and one of these is checking to see whether or not the text can be indexed. The best way to do this is to use a text browser, such as Lynx, to diagnose crawling-related issues.
If you are interested in learning more about Lynx and its uses for SEO, you can read the following tutorials:
“Mastering Lynx (Open Source Text Browser) for Search Engine Optimization”
“Using Lynx for SEO Analysis”
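As an illustration of what "indexable text" means, here is a minimal Python sketch of the kind of extraction a text-based crawler performs: content inside script and style tags is invisible, while ordinary body text is not. The sample HTML and the `TextExtractor` class are hypothetical, built only on the standard library's `html.parser`.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text a text-based crawler would see,
    skipping anything inside <script> and <style> tags."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

sample_html = """<html><body>
<script>renderContent();</script>
<noscript>Our products are listed below.</noscript>
<h1>Welcome</h1>
</body></html>"""

parser = TextExtractor()
parser.feed(sample_html)
print(" ".join(parser.chunks))
```

If your page's real content only appears via JavaScript (like the `renderContent()` call above), a check like this comes back nearly empty, which is exactly what a text browser such as Lynx would show you.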
Tip #30: “Add substantial text to your content. For websites in which it seems difficult to add text to your content, implement user reviews and comments to add more text to your site. Make sure your website has enabled the ability to add user comments.”
This is one of the most useful suggestions made during the SEO site review session pertaining to onsite issues (i.e. a lack of website content). So if you have a website designed to carry little text (such as a site full of images, Flash videos, MP3 streamers, etc.), why not allow user comments or reviews? When users add comments, you gain valuable added content for your website.
This is feasible to do, but the technique has a number of potential problems. I've listed them (and some suggested solutions) below.
1. If your website is not well known, you will get few user comments. This is particularly true for newly-launched websites that are not yet receiving enough traffic to drive users to comment.
Potential Solution: Tap sources of traffic other than search engines. You can network with interested friends on Facebook and with related professionals on LinkedIn; you can post to Twitter, or submit your material to StumbleUpon and similar sites if your content and services are newsworthy and worth publishing.
2. If you start receiving user comments, it might increase the chances of receiving spam.
Potential Solution: Enable comment moderation. You might also find ways to reward the best comments on your website -- perhaps a dedicated page for them, with links pointing to the commenters' own websites (if they are related to yours).
Tip #31: “Formulate the optimal title of your website based on the services you are offering. Do not just use the domain name as the title of the website. The title should be accurate and descriptive.”
There are still some website owners that optimize their website for search engines but cannot even assign an optimal title for the website. You can read the details for formulating an accurate and descriptive title tag from the Google website.
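To make the idea concrete, here is a hedged Python sketch of a simple title check. The `title_needs_work` helper and its rules are my own illustration of the tip, not an official Google heuristic: it flags titles that are missing, empty, or merely echo the domain name.

```python
import re
from urllib.parse import urlparse

def title_needs_work(url: str, html: str) -> bool:
    """Flag titles that are missing, empty, or just repeat the domain name
    (an illustrative check, not an official Google rule)."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not m:
        return True  # no title tag at all
    title = m.group(1).strip()
    if not title:
        return True  # empty title
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    # "example.com" or "example" alone says nothing about your services
    return title.lower() in {domain, domain.split(".")[0]}

print(title_needs_work("http://www.example.com",
                       "<title>example.com</title>"))        # True
print(title_needs_work("http://www.example.com",
                       "<title>Example Co. - Handmade Oak Furniture</title>"))  # False
```

The second title in the example is the kind Google recommends: it describes the services offered rather than restating the address.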
Tip #32: “Register your website in Google Webmaster Tools, so that you will know if your website has been hacked. This is particularly helpful for malware detection.”
It might be difficult to detect whether your website has been hacked, because hackers are getting sneakier with their methods. Google Webmaster Tools provides a number of detection tools which will tell you if your website has been hacked or is hosting malware.
Tip #33: “Use the Fetch as Googlebot feature in Google Webmaster Tools for malware correction or hacking detection.”
For Tip #32 and Tip #33, you can refer to this updated tutorial on the latest Google Webmaster Tools. You may want to check this page from another tutorial as well.
Tip #34: “Avoid publishing content on your website which is syndicated, because it is not really original content and won't rank very high in Google. Put more emphasis on publishing original content.”
Content is still king to Google. Great content can even beat any form of search engine marketing. So one of the most productive and profitable ways you can spend your time with your website is to write quality and original content.
Remember that this original content is what makes your website unique, which is also important if you are marketing or branding your site. If your website's content is syndicated or otherwise not original, there is no reason for Google to rank it highly, because your content can be found elsewhere on the web. Remember that Google will only rank the canonical and the original version of the content.
It is a major mistake in onsite SEO to put little importance on continuous quality content improvement. Many SEOs suggest adding content purely for search engine purposes, which is not good; most of the time, it just looks spammy.
Tip #35: “Getting a good hosting plan is a rare secret in search engine optimization. If you exceed the bandwidth limit, or any hosting issues cause your website to crash, users cannot view your website. Without views, you can't get conversions.”
How many SEOs ever review the quality of their hosting provider as a factor in SEO success?
I once worked with a free web host with very poor uptime -- 70 to 85 percent. The website experienced frequent downtime until it was suggested that the site switch to a quality web host.
Before switching web hosts, the number of average unique visitors per day was only 70. After switching to a stable, 99.95 percent uptime web host, traffic increased dramatically, to an average of 300 unique visitors per day.
This solution saved the client a lot of SEO money: instead of diverting campaign resources to SEO improvements just to recover lost traffic, simply changing web hosts made a big difference, leaving the client free to put those funds into an SEO campaign that increases traffic further.
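The arithmetic behind those uptime figures is worth spelling out. A quick Python calculation shows how much yearly downtime each percentage implies:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

for uptime in (70.0, 85.0, 99.95):
    # downtime is simply the fraction of the year the site is unreachable
    downtime_hours = HOURS_PER_YEAR * (1 - uptime / 100)
    print(f"{uptime}% uptime -> about {downtime_hours:.0f} hours of downtime per year")
```

At 70 percent uptime the site is dark for over 2,600 hours a year -- more than three months -- while a 99.95 percent host is down only a few hours a year. No amount of on-page optimization compensates for a site visitors cannot reach.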
Tip #36: “Whether or not you use a meta description, if you are automating the generation of your website's snippets (using a server-side script such as PHP), you need to make sure that the snippets shown in the Google search results are natural and relevant to the search query.”
You need to check your click through data (in Google Webmaster Tools) to see if you have this problem and take steps to improve your snippets.
Poorly produced snippets can hurt your click-through rate in search results, because your site will appear irrelevant to the user's query. To learn more about improving click-through rate in Google, you can read the tutorial at the link.
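If you do automate snippet generation, a sensible baseline is to truncate page copy at a word boundary near the typical snippet length. This Python sketch shows the idea; the `make_description` helper and the 155-character limit are illustrative assumptions, not a Google specification.

```python
def make_description(text: str, limit: int = 155) -> str:
    """Naive auto-generated meta description: collapse whitespace and
    cut at a word boundary so the snippet reads naturally."""
    text = " ".join(text.split())  # flatten newlines and runs of spaces
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit - 1)  # last space before the limit
    return text[:cut] + "…" if cut > 0 else text[:limit - 1] + "…"

article = ("Our guide covers onsite SEO basics: indexable text, descriptive "
           "titles, original content, and sane URL structures, with practical "
           "checks you can run on your own pages before a relaunch.")
print(make_description(article))
```

Even with automation in place, spot-check the results in the search listings: a description that is grammatical but unrelated to the query is still a poor snippet.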
Tip #37: “Do not hide text on the page, especially by using the scroll box technique at the bottom of the home page. Make sure your text is presented for your readers and is easy to read.”
Even if your scroll box is over 500 pixels high (so readers can still see the text), it makes no sense to place around a thousand words of content inside it. The result is stressful to read, because the visible window is too small for the amount of text it contains.
Some SEOs use this technique to add content, which is not good in terms of the content's readability and the site's usability.
Tip #38: “Do not worry about unfriendly URLs (URLs that are not rewritten to contain useful keywords) if your site is stable and has lots of content.”
Some webmasters and website owners are still concerned about their URL structure, despite the fact that their URLs have been in use for more than a year, are indexed by Google, and are receiving traffic.
Note that it can take around six months for a permanent (301) redirect to fully mature if you are rewriting a lot of URLs, and the drop-off in traffic during that period can be costly to your business.
So the best advice would be to ignore the unfriendly URLs if you have tons of them indexed by Google. Google says that their system will do its best to understand the URL structure of the website (for example, if you are using dynamically generated URLs).
But do your best also to solve issues such as duplicate content URLs. There are a lot of solutions for duplicate content issues which do not necessarily involve rewriting all of your URLs.
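One such solution is normalizing the duplicate URL variants themselves. The following Python sketch -- a hypothetical `canonicalize` helper built only on the standard library -- collapses common variants (mixed-case hosts, `index.html` suffixes, trailing slashes, tracking parameters) into a single canonical form you could use for internal links and sitemaps:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# example tracking parameters to strip; extend for your own analytics setup
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonicalize(url: str) -> str:
    """Normalize common duplicate-URL variants to one canonical form:
    lowercase host, drop 'index.html', trim trailing slashes (except the
    root path), and remove tracking query parameters."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    path = parts.path
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k.lower() not in TRACKING_PARAMS])
    return urlunsplit((parts.scheme, host, path or "/", query, ""))

print(canonicalize("http://Example.com/products/index.html?utm_source=feed"))
# -> http://example.com/products
```

Pointing a rel="canonical" link element at the normalized URL achieves the same consolidation for pages you cannot rewrite.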
Tip #39: “Do not optimize the same keywords for the home page and the internal pages. This will introduce canonical issues that can affect rankings.”
A common mistake is having a well-defined product page but optimizing the home page for the same targeted keywords. The result is that the home page ranks instead of the product page, so targeted traffic is not sent directly to the product pages and the conversion rate suffers.
Tip #40: “TLD is not that important when it comes to ranking.”
Do not be concerned about your TLD when it comes to general ranking (as opposed to geo-targeted ranking). But country-based TLDs can be important. For example, if you need your website to be associated with Canada, it makes sense to use a .ca domain.
But this is not a strict rule. If you have a website set up to draw in visitors throughout the world, Google will know that and will rank your website in other places, not only in your own country of origin.
Tip #41: “URL Shortener should pass 301 redirect. This is the best application for URL Shortener. ”
URL shortening is becoming more popular along with Twitter and similar applications. According to SEO site review session expert Matt Cutts, this is fine -- as long as the shortener passes a 301 redirect to the canonical URL.
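To illustrate the point, here is a minimal in-memory Python sketch of a shortener that resolves a short code with a 301 (permanent) redirect rather than a 302 (temporary) one. The `SHORT_URLS` data and `resolve` helper are hypothetical:

```python
# Hypothetical shortener table: short code -> long (canonical) URL
SHORT_URLS = {"abc123": "http://example.com/full-article"}

def resolve(code: str):
    """Return an (HTTP status, headers) pair for a short code."""
    target = SHORT_URLS.get(code)
    if target is None:
        return 404, {}
    # 301, not 302: a permanent redirect passes signals to the canonical URL
    return 301, {"Location": target}

status, headers = resolve("abc123")
print(status, headers["Location"])
# -> 301 http://example.com/full-article
```

Because the redirect is permanent, search engines credit links to the short URL toward the long, canonical URL instead of treating the shortener as the destination.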
All of these tips can be downloaded in checklist format for easier implementation.