SEO: Objectives, Process, Tips and Tools

Get a head start on SEO process and techniques

SEO Tips

SEO is too specialized and too large a topic to be covered fully in a short article, but a few tips are given below, grouped into four categories:
  1. Content Quality
  2. Web site Structure
  3. HTML Guidelines
  4. Search-friendly URLs

Content Quality
 
Good content is the key to search engine optimization, so we have to ensure the quality of the content.

While designing an application and populating content, think of users first and the search engine second, and make sure the website has good-quality content, because most search engines evaluate pages the same way. Some tips for creating quality content are given below.
 
1. Identify original, unique, useful words and their synonyms as keyword(s) and phrases for describing each page; the selected words should be concise and specific to the page.

These keyword(s)/phrases can then be used in various HTML (for SEO) tags. Various tools are helpful in finding the right keywords for pages, e.g. the AdWords Keyword Tool, Keyword Discovery, the SEO Book keyword tool, Keyword Box, the Yahoo Keyword tool, Wordtracker, the Google Keyword Tool, etc.

It is advisable to use phrases to describe the page instead of single-word descriptions, for a couple of reasons:
a. When performing searches, people nowadays search for phrases or sets of words instead of just one word.
b. Competition is too high for single words.
c. Phrases help differentiate our web page or site from competitors.

Before finalizing keyword(s), if possible, also try to find out the KEI (Keyword Effectiveness Index). It is a rating, typically on a 0-10 scale, based on how popular a keyword is and how much competition it has on the Internet. A higher KEI number means the keyword is popular and also has less competition, so select keyword(s) with a higher KEI number.

KEI is a good starting point for selecting keywords, but in addition to KEI, keyword researchers should also consider other factors, such as the number of pages indexed, the backlinks of top sites for that keyword, etc. Good information on KEI is available at http://www.keyworddiscovery.com/kd-faq-kei.html
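As an illustration, one common KEI formulation (popularized by Wordtracker-style tools) divides the square of a keyword's search popularity by the number of competing pages. Note that the exact formula and scaling vary by tool, and the numbers below are hypothetical:

```python
# Sketch of a Wordtracker-style KEI calculation.
# Exact formulas and scaling differ between tools; this is one common form:
# KEI = popularity^2 / competition.

def kei(searches: int, competing_pages: int) -> float:
    """Return KEI; higher means popular with relatively little competition."""
    if competing_pages == 0:
        return float("inf")  # no competition at all
    return (searches ** 2) / competing_pages

# Compare two hypothetical candidate phrases:
print(kei(400, 200_000))  # broad single word: popular but very crowded -> 0.8
print(kei(150, 3_000))    # specific phrase: fewer searches, far less contested -> 7.5
```

Here the more specific phrase scores far better despite being searched less often, which matches the advice above to prefer phrases over single words.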

2. Restrict each page to one topic, i.e. avoid putting multiple topics on a single page.

3. Make sure the web pages are information-rich and useful. Pages should also include all chosen keyword(s)/phrases relevant to the topic that users would use to find your pages.

4. Avoid spelling mistakes in content pages. There are lots of spell-check tools, like MS Word, the Unix-based spell or ispell, and online tools like Net Mechanic, the SEO Workers spell checker, etc., that can be used to perform spell checking.
 
Google Trends aims to provide insights into broad search patterns. It provides statistics on the volume of keywords searched over a period of time.
 

 
Web site (or Web Application) Structure
 
1. Make sure the site conforms to W3C standards; W3C validator tools can help in achieving this.

2. Make sure the site hierarchy is flat and navigation is simple. Content pages should be no more than three clicks from the home page.

3. Categorize web pages. The better the structure of the site, the easier it is to target the market. The structure of the site always plays a key role in SEO, so before actually building the site it is advisable to plan its structure carefully, i.e. how the web pages will be categorized. For example, if you are in the business of HR consultancy for different types of industries, try to create a separate structure for each industry describing your offerings and specialization related to that industry, and incorporate very specific keyword(s)/phrases for it.
 
4. Provide web feeds (a.k.a. syndicated feeds). A web feed is a document or communications channel that contains content items with web links to longer versions. It is a mechanism for sharing content (not its visual representation) over the web. Websites or applications subscribe to these feeds and then render the content in the required layout. Two of the most widely used web feed formats are RSS and Atom.

a. RSS (Really Simple Syndication) - an XML-based format used to publish frequently updated content like news, blogs, etc. RSS allows users not just to link to a page but to subscribe to it, with notification every time the page changes, so the information is updated in an automated manner. A feed can contain a summary of content from an associated website or the full text. For more, please read the RSS Wiki.
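For illustration, a minimal RSS 2.0 feed might look like the following sketch (the site name, URLs, and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- required channel elements: title, link, description -->
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Latest articles from Example Site</description>
    <item>
      <title>New article published</title>
      <link>http://www.example.com/articles/new-article.html</link>
      <description>A short summary of the article.</description>
      <pubDate>Mon, 01 Jan 2007 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```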
 
b. Atom - also an XML-based content and metadata syndication format used for publishing frequently updated content like news, blogs, etc. Atom was developed to overcome the many incompatible versions of the RSS syndication format, all of which had shortcomings and poor interoperability. For a list of the differences between the two formats, go through http://www.intertwingly.net/wiki/pie/Rss20AndAtom10Compared and the Atom Wiki.
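The same hypothetical feed expressed as a minimal Atom document might look like this (again, names, URLs, and dates are made up for illustration):

```xml
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <!-- Atom requires title, id, updated, and an author for the feed -->
  <title>Example Site News</title>
  <id>http://www.example.com/feed</id>
  <updated>2007-01-01T09:00:00Z</updated>
  <author><name>Example Author</name></author>
  <link href="http://www.example.com/"/>
  <entry>
    <title>New article published</title>
    <id>http://www.example.com/articles/new-article.html</id>
    <updated>2007-01-01T09:00:00Z</updated>
    <link href="http://www.example.com/articles/new-article.html"/>
    <summary>A short summary of the article.</summary>
  </entry>
</feed>
```

Note that Atom uses ISO 8601 timestamps and mandatory unique IDs, which is part of how it fixes RSS's interoperability problems.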
 
There are lots of free online and downloadable RSS and Atom generators and converters available, e.g. rssfeedssbumit. W3C also provides a validator tool for RSS and Atom: the W3C RSS/ATOM Validator.
 
5. Add a sitemap to your website. A sitemap is a file (.xml, .htm, .html, .txt) that contains a structured list of URLs for a site, allowing intelligent and smooth crawling by search engines. Sitemaps are a URL inclusion protocol and complement robots.txt (a URL exclusion protocol). In addition to the independent ROR XML format, most search engines support other formats like RSS, Atom, and the Sitemap protocol; for example, Google, Yahoo and Microsoft all support the Sitemap protocol. It is advisable to keep the number of links in a sitemap within 100; if that is not feasible, break the sitemap into separate pages. A brief introduction to both ROR and the Sitemap protocol is given below.
 
a. ROR (Resources of a Resource) is an independent XML format for describing any object of your content in a generic fashion, so any search engine can better understand the content. Think of a ROR feed as a powerful structured feed for describing all objects to the search engines: sitemap, products, services, reviews, feeds, discounts, images, events, schedules, podcasts, archives and much more. Tools like the ROR Feed Generator, XML-Sitemaps, and the ROR Sitemap Generator make it easy to create a ROR feed. For information on ROR please visit http://www.rorweb.com/
 
b. The Sitemap protocol is an XML file format that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs on the site) so that search engines can crawl the site more intelligently. For more info on the Sitemap protocol please visit http://www.sitemaps.org/, XML-Sitemaps, sitemap pals, Google Sitemaps.
 
Sitemap information is typically placed in the website's main directory; e.g. the ROR feed is stored in ror.xml and the sitemap in sitemap.xml. So it would be good if every website had ror.xml, sitemap.xml and any search-engine-specific sitemap files, e.g. urllist.txt (for Yahoo).
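A minimal sitemap.xml following the Sitemap protocol could look like the sketch below (the URLs and dates are hypothetical; lastmod, changefreq, and priority are optional hints for crawlers):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/new-article.html</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```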
 
6. Robot exclusion techniques. robots.txt is a file that permits or denies robots or crawlers access to different areas of a website. By protocol, every spider/crawler/bot first looks for this file in the main directory of the website and then proceeds based on the information in it. Still, there may be some spiders/crawlers/bots that overlook this file and continue as they please. For more info on robots.txt please visit: http://www.robotstxt.org/
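For example, a simple robots.txt placed in the site's main directory might look like this (the disallowed paths are hypothetical):

```
# Apply to all robots; keep them out of the admin and temp areas
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at the sitemap (an extension supported by major engines)
Sitemap: http://www.example.com/sitemap.xml
```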
 
7. Appropriate handling of HTTP status codes. Dealing correctly with HTTP status codes not only helps in preserving link equity but also helps avoid getting delisted from search engines.

a. 404 status code. As a site ages, chances are that some information pages become unavailable over time for many reasons, say the content is no longer relevant or the offerings are no longer available. In this case, instead of displaying a "404 Page Not Found" error, it is advisable either to redirect the request to a related page or to display a customized message. Following this helps preserve link equity.

b. 301 and 302 status codes. Avoid chaining multiple redirections. By redirection here we mean that when a page is requested by a user, it is redirected to a different page because of a change in URL (HTTP codes 301, 302); redirection can also happen automatically. The idea is to avoid a scenario involving multiple redirections for one request, like a request for page A that is redirected to B, then redirected to C, and so on, with the final page displayed only after several redirections.
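For instance, on an Apache server a moved page can be sent straight to its final URL in a single 301 hop via the mod_alias Redirect directive (the paths below are hypothetical), rather than through a chain of intermediate redirects:

```
# .htaccess sketch: one permanent redirect, no chaining
Redirect 301 /old-services.html http://www.example.com/services/hr-consulting.html
```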

c. 500 status code. In case of downtime or non-availability of resources or the site, it is advisable to return an HTTP 500 status code with a relevant message instead of displaying a 404 page, a blank page, or a page full of database connection or resource access errors. That way the search engines do not index or re-index these error pages and will not delist the page or site.
 
For more detail on HTTP status codes, visit the Wikipedia page.
 
 
See next page for HTML Guidelines...


More Stories By Rahul Kumar Gupta

Rahul Kumar Gupta is a postgraduate in Computer Applications, a graduate in Business Management, and holds 10 certifications including PMP, SCEA and JCP. He has 10 years of IT industry experience and works as Sr. Technical Manager with Indian IT giant HCL Technologies, NOIDA (INDIA). He was also a co-technical reviewer for the Professional Java E-commerce, Professional EJB and Professional JSP Site Design books for Wrox. You can catch him at [email protected] He blogs at http://rahgup.blogspot.com/.


