Search Engine Optimization (SEO)
Search Engine Optimization (SEO) is the process of improving a website's visibility in search engine results for a targeted set of keywords.
Search Engine Marketing (SEM)
Search Engine Marketing (SEM) is a type of internet marketing that promotes your website through search engine results pages, often in combination with other online marketing techniques.
Social Media Marketing (SMM)
Social Media Marketing (SMM) is a newer form of marketing that uses social networks such as Facebook, Twitter, LinkedIn, MySpace, Foursquare, and many more. It is an effective way to build brand awareness and reach a target audience.
Pay Per Click Advertising (PPC)
Pay Per Click (PPC) is an online marketing method for bringing quality traffic to your website from search engines, traffic that ultimately converts into leads. The advertiser pays for each click on their ads in search engine results.
Google Local Places For Business
Google Places lets any business list its address on Google for free. It is an excellent place to add your business and can generate significant traffic and sales.
Wednesday, November 16, 2011
Pay Per Click (PPC) Campaigns Cash Building Methods
Tuesday, November 15, 2011
Social Media Marketing In 2011: Five Top Suggestions For Successful Social Media Campaign
Ten recent algorithm changes
- Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
- Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
- Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page’s title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page’s content.
- Length-based autocomplete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
- Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
- Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
- Fresher, more recent results: As we announced just over a week ago, we’ve made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
- Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
- Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
- Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.
Sunday, November 13, 2011
Google+ Pages Now Available For All Nonprofits
Posted by Leslie Hernandez Dinneen, Google.org Team
Search Engine Optimization Content Writing: Five Strong Tips For Writing Outstanding Content.
Never Ending Debate: SEO Freelancers Vs SEO Company for Quality SEO Services
Thursday, November 3, 2011
Googlebot Learns to Read AJAX/JavaScript Comments
Google's search robot has been improved; it can now read and understand certain dynamic comments implemented through AJAX and JavaScript. This includes Facebook comments left through services like the Facebook social plugin.
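For context, embedding the Facebook comments plugin mentioned above looked roughly like this at the time. This is an illustrative sketch, not an exact copy of Facebook's documentation: the `data-href` URL is a placeholder, and the attribute names follow the plugin markup as it was commonly published in 2011.

```html
<!-- Load the Facebook JavaScript SDK, which renders the plugin. -->
<div id="fb-root"></div>
<script src="//connect.facebook.net/en_US/all.js#xfbml=1"></script>

<!-- The comments themselves are injected here by JavaScript at load
     time, which is why older crawlers could not see the comment text. -->
<div class="fb-comments"
     data-href="http://example.com/your-article"
     data-num-posts="10"></div>
```

Because the comment text only exists in the page after the SDK script runs, a crawler that did not execute JavaScript saw an empty `<div>` — which is exactly the gap this Googlebot improvement addresses.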
As first noted by Digital Inspiration, the Googlebot seems to have picked up a neat trick: the ability to index the comments on specific pages, even when those comments are made through an AJAX/JavaScript format that wasn't previously searchable.
This change was confirmed in a tweet from Matt Cutts. His comment ("Googlebot keeps getting smarter. Now has the ability to execute AJAX/JS to index some dynamic comments.") hints at the update's limitations. The change appears to have been built for a specific sort of content: not all comments built in AJAX/JS will be included, and it is not a universal change that allows all dynamic content to be indexed.
It is a good indicator of things to come, however. As the Googlebot gets better at understanding dynamic page elements, JavaScript and Flash will become safer for designers to use. While we're not at that point of complete safety just yet, an advancement such as this one points toward a light at the end of the tunnel.
In the meantime, webmasters can approach dynamic commenting options with less concern for the potential SEO consequences, and users will gain access to the large amount of content posted directly through these AJAX/JS-based media.
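More generally, the pattern Googlebot can now handle in some cases is comment text injected by script after the page loads. A minimal sketch of that pattern, era-appropriate for 2011 and using a hypothetical `/comments.json` endpoint, looks like this:

```html
<!-- The container is empty in the static HTML; a crawler that does
     not run JavaScript sees no comment text here at all. -->
<div id="comments"></div>

<script type="text/javascript">
  // Hypothetical endpoint for illustration only. The comments are
  // fetched and inserted into the DOM after load -- the kind of
  // dynamic content Googlebot can now execute and index in some cases.
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/comments.json', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      var comments = JSON.parse(xhr.responseText);
      var container = document.getElementById('comments');
      for (var i = 0; i < comments.length; i++) {
        var p = document.createElement('p');
        p.appendChild(document.createTextNode(comments[i].text));
        container.appendChild(p);
      }
    }
  };
  xhr.send();
</script>
```

Per the limitations noted above, markup like this is not guaranteed to be indexed; the update covers only certain dynamic comment implementations.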
It's not yet certain what other SEO or user impact this will have; the possibility of link-juice being passed from these comments, the comments leading to an increase in social search rankings, or the inclusion of these posts in real-time search have yet to be tested or explored.
Article Source: http://searchenginewatch.com/article/2122137/Googlebot-Learns-to-Read-AJAXJavaScript-Comments