3D Context Delivery™ – Cloaking Operative Features

 

1. SHADOW DOMAINS

We create between 5,000 and 30,000 search-engine-only Shadow Domains™ sites (root domains and sub-domains, the number depending on the competitiveness of your industry), dedicated exclusively to generating loaded context pages based on linguistic frequency parameters, keyword density optimization algorithms, extensive analysis of real-life search terms, and proprietary cloaking and dynamic page generation techniques.

We register shadow domains related to the primary core domain, e.g. “clientdomain2.com”, plus thousands of sub-domains. Each client campaign build is then fully driven by the all-important SEO phase. This includes an exceptionally aggressive backlinking program of up to 120,000 backlinks per day, ensuring traffic-grabbing rankings for your Phantom Pages.
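
By way of illustration, the following is a minimal Python sketch of what a generator of “loaded context pages” tuned to a keyword density target could look like. The keyword list, filler vocabulary, density target and page length are all invented placeholders for this sketch, not our production parameters.

    import random

    # Hypothetical inputs -- a real campaign would draw these from
    # search-term analysis rather than hard-coded lists.
    KEYWORDS = ["widgets", "gadgets", "gizmos"]
    FILLER = ("quality service customer online shop best price "
              "order fast delivery guarantee").split()

    def generate_page(keywords, target_density=0.05, length=300):
        """Build word content whose keyword density approximates the target.

        target_density is the fraction of total words that are
        keyword occurrences.
        """
        n_keywords = int(length * target_density)
        words = [random.choice(keywords) for _ in range(n_keywords)]
        words += [random.choice(FILLER) for _ in range(length - n_keywords)]
        random.shuffle(words)
        return " ".join(words)

    if __name__ == "__main__":
        print(generate_page(KEYWORDS)[:200], "...")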

 

The main reasons for this procedure are the following:

a) Search engines now rely increasingly on in-site context/content relevancy to determine search rankings. This is not, however, merely a matter of including some relevant text, as might be assumed: because theme relevancy has to be determined by automated programs (spiders, crawlers, indexers, etc.), the overall procedure is fairly sophisticated and complex.

 

Content is analyzed for linguistic-statistical plausibility; keyword overload (i.e. above-average keyword density) is penalized as spam; meta tags are losing their relevance; and the hunt is on for classical doorway pages, which are being ruthlessly purged from the engines’ indices.
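
To make the keyword-density point concrete, here is a rough sketch of the kind of density check an indexer might apply. The 5% threshold is an illustrative assumption only; real engines use statistical language models rather than a single fixed cutoff.

    import re
    from collections import Counter

    def keyword_density(text: str, keyword: str) -> float:
        """Fraction of words in `text` that match `keyword` (single word)."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        return Counter(words)[keyword.lower()] / len(words)

    def looks_like_keyword_stuffing(text, keyword, threshold=0.05):
        # Threshold is an illustrative assumption, not a documented
        # engine parameter.
        return keyword_density(text, keyword) > threshold

    text = "buy widgets widgets widgets today"
    print(keyword_density(text, "widgets"))              # 0.6
    print(looks_like_keyword_stuffing(text, "widgets"))  # True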

 

b) Search engines have switched over to context-biased ranking algorithms: it is therefore paramount that commercial web sites create a high degree of context relevancy.

 

However, under ordinary circumstances this would entail serious, if not downright intolerable, trade-offs against the layout, design, corporate identity elements, aesthetic appeal and user-friendliness of sites. It is hardly feasible or acceptable to abandon a sophisticated, aesthetically pleasing and proven layout and site design for a text-only web presence merely to please search engine spiders! Moreover, the redesign costs involved would in most cases be exorbitant, delays would ensue, etc.

 

This is where our proprietary 3D Context Delivery™ comes in: it generates artificial context fine-tuned to the search engines’ indexing requirements and feeds this optimized material to search engine spiders, while redirecting human visitors to the main site actually intended for them.

 

There is currently no other technology available that offers this degree of sophistication in the field of optimized search engine positioning.

 

We term it “3D” because the shadow domains generated actually consist of interlinked pages whose content builds upon one another: the illusion of a “real-life” domain, as perceived by automated search engine programs, is therefore perfect, leading to highly optimized rankings even in extremely competitive areas.
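
A minimal sketch of the “3D” interlinking idea follows: each generated page is assigned links to sibling pages so that a crawler perceives an ordinary, coherently linked site. Page names and the link count per page are hypothetical.

    import random

    def build_link_graph(page_ids, links_per_page=3):
        """Assign each shadow page a set of sibling pages to link to,
        so the generated domain looks like an ordinary interlinked site."""
        graph = {}
        for page in page_ids:
            others = [p for p in page_ids if p != page]
            graph[page] = random.sample(others, min(links_per_page, len(others)))
        return graph

    pages = [f"page{i}.html" for i in range(10)]
    for page, targets in build_link_graph(pages).items():
        print(page, "->", targets)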

 

2. HUMAN REDIRECTION

Normal human visitors are redirected to the client’s main site. Redirection happens in real time: no annoying “click thru!” links for visitors to follow, and no delays from slow-loading JavaScript applets or meta refresh tags (which search engines normally penalize anyway).

Instead, we use our own dedicated name servers plus state-of-the-art operating system modules to achieve this without delays or system performance issues.
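
The basic decision made at delivery time can be sketched in a few lines of Python. This is illustrative only: the UserAgent substrings and IP shown are an invented subset, and the production system resolves visitors against the full recognition database at the name-server and operating-system level rather than in an application handler.

    KNOWN_SPIDER_AGENTS = {"googlebot", "slurp", "scooter"}  # illustrative subset
    KNOWN_SPIDER_IPS = {"216.239.46.1"}                      # illustrative subset

    def render_context_page() -> str:
        return "<html>...generated context...</html>"        # placeholder

    def handle_request(user_agent: str, client_ip: str):
        """Return (status, body-or-location) depending on who is asking.

        Recognized spiders receive the locally generated context page;
        human visitors receive an immediate HTTP redirect to the client's
        main site -- no click-thru links, applets or meta refresh involved.
        """
        ua = user_agent.lower()
        if client_ip in KNOWN_SPIDER_IPS or any(bot in ua for bot in KNOWN_SPIDER_AGENTS):
            return 200, render_context_page()
        return 302, "http://clientdomain.com/"               # real-time redirect

    print(handle_request("Googlebot/2.1", "216.239.46.1"))
    print(handle_request("Mozilla/5.0", "10.0.0.1"))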

 

3. EXPLOITING THE WORLD’S LARGEST SEARCH ENGINE BOTBASE

All search engine spiders (crawlers, searchbots) must be recognized by our on-site software for this strategy to work.

To this end we have created the world’s largest and most comprehensive search engine botBase, our fantomas spiderSpy™ service, currently covering 130,000+ entries including IP addresses, URLs and UserAgent strings as maintained by the search engines. Hundreds of entries are added every week: our network of “spider trap” domains, dedicated to capturing spider data and feeding it into our recognition database before those spiders hit a client’s sites; massive third-party support from all over the world; and extensive evaluation and verification of all available spider data resources have helped make this the most ambitious and effective search engine spider reference currently available.
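
The “spider trap” idea can be sketched as follows: a page that no ordinary human visitor would reach records every hit as a candidate botBase entry for later verification. The record fields and file name here are invented for illustration.

    import json
    import time

    TRAP_LOG = "candidate_spiders.jsonl"  # invented file name

    def record_trap_hit(client_ip: str, user_agent: str) -> None:
        """Log a visitor to a trap page as a candidate botBase entry.

        Entries are verified later, before being integrated into the
        recognition database."""
        entry = {"ip": client_ip, "useragent": user_agent, "seen": time.time()}
        with open(TRAP_LOG, "a") as fh:
            fh.write(json.dumps(entry) + "\n")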

 

The fantomas spiderSpy™ service makes a special point of monitoring international search engines on a global scale.

 

This is of great importance because, global networks being what they are, the major search engines have implemented time and bandwidth sharing plus load balancing when assigning crawling tasks to their vast ranges of spiders.
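
Because load balancing means a given engine’s requests can arrive from many different addresses, matching whole network blocks is more robust than matching exact IPs. Here is a sketch using Python’s standard ipaddress module; the sample range is invented, as real ranges would come from the botBase.

    import ipaddress

    # Invented sample block -- real ranges come from the botBase.
    SPIDER_NETWORKS = [ipaddress.ip_network("216.239.46.0/24")]

    def ip_in_spider_network(client_ip: str) -> bool:
        """Match an address against known crawler network blocks, since
        load-balanced engines crawl from many IPs within the same range."""
        addr = ipaddress.ip_address(client_ip)
        return any(addr in net for net in SPIDER_NETWORKS)

    print(ip_in_spider_network("216.239.46.15"))  # True
    print(ip_in_spider_network("10.0.0.1"))       # False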