Examples of Matrix/Pyramid Building

Building a Sample (60 x 2) 12-Tier Black Hat Pyramid

“The death of black hat” will never be final without PandaBusters.com, as some black hat techniques are still necessary to overcome the over-competitiveness of “fellow spammers”. Black hat was not always fought with nofollows; before it was challenged, the goal was to create many links in only one tier below, ordered as high-Pagerank hijacked websites that already sat at the top of pyramids themselves. But resistance in the form of recaptcha robot blockers, widespread nofollows, and an increasing six-week-indexing-period mortality rate under heavy competition means the black hats now have to help the sites they are hijacking by back-linking them with many tiers. Historically this was too hard to do, until you came here so I can show you how to do it right, because black hats never wanted to help their competition. If you were a black hat, as I was, you want to spam pages that are: relatively recaptcha-free; mixed dofollow/nofollow (dofollow-only is too competitive and may reduce your Pagerank once Google knows you are a robot); randomized in article content, anchor text, and URLs; built on high-demand, low-competition, trending-upward (HDLCTU) keywords; niche-based with a political or news-media spin (editorials); and showing a low one-hour-to-six-week back-link mortality rate with long-tail keywords, with post-back-link checks and metric analysis tied into Web position (WebCEO), SERP, SEO Quake, SEO Rush, Word Tracker, and Google Trends recursive analysis.

Table for Pyramid

Posts     Description
1 – n     My websites: 1 – n sites (preferably news posts) at the top
60        60 spun articles as “teasers” to related posts (political, opinionated, no sales pitch, ex. PB), pointing to the 1 – n pages
120       120 spun articles, as above, only randomize the 60 above plus the 1 – n top URLs (my pages)
240       240 spun articles, as above, only randomize the 120 above plus the 1 – n top URLs (my pages)
480       480 spun articles, as above, only randomize the 240 above plus the 1 – n top URLs (my pages)
960       960 spun articles, as above, only randomize the 480 above plus the 1 – n top URLs (my pages)
1920      1920 spun articles, as above, only randomize the 960 above plus the 1 – n top URLs (my pages)
3840      3840 spun articles, as above, only randomize the 1920 above plus the 1 – n top URLs (my pages)
7680      7680 spun articles, as above, only randomize the 3840 above plus the 1 – n top URLs (my pages)
15360     15360 spun articles, as above, only randomize the 7680 above plus the 1 – n top URLs (my pages)
30720     30720 spun articles, as above, only randomize the 15360 above plus the 1 – n top URLs (my pages)
61440     61440 spun articles, as above, only randomize the 30720 above plus the 1 – n top URLs (my pages)
122820    Total number of posts in the 12-tier pyramid. Double each attempted number of posts when back-linking to overcome the expected 50% recaptcha failure rate. Checking the status of back-links a day later is important. This is a 10-to-11-day process. Calculate the total indexing mortality rate (TMIR) six weeks after the end of posting.

Pyramid total number of back-link posts = Sum (from n = 1 to T − 1) of A × C^(n−1) = A × (C^(T−1) − 1) / (C − 1), where A = the number of second-tier posts, B = the number of back-links each URL receives from the tier beneath it (B = C for an equilateral pyramid), C = the coefficient that multiplies the preceding tier’s post count to set the size of the tier below, and T = the total number of tiers. With A = 60, C = 2, and T = 12, the total is 60 × (2^11 − 1) = 122,820, matching the table.
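As a quick check, here is a minimal Python sketch (the variable names are mine) that reproduces the tier sizes and totals from the table, assuming the geometric growth and the double-every-attempt rule described above:

# Sketch: tier sizes and totals for the sample pyramid above, assuming
# A = 60 second-tier posts, C = 2 per-tier multiplier, T = 12 tiers,
# and the table's "double every attempt" rule for ~50% recaptcha failure.

A, C, T = 60, 2, 12

tiers = [A * C ** n for n in range(T - 1)]  # tiers 2 through 12
total_posts = sum(tiers)                    # successful posts needed
total_attempts = 2 * total_posts            # doubled to absorb recaptcha failures

print(tiers)           # [60, 120, 240, ..., 61440]
print(total_posts)     # 122820, matching the table
print(total_attempts)  # 245640 posting attempts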

[Figure: Pascal’s Triangle]


For this example, A = 60, B = 2, C = 2. For an equilateral pyramid (a Pascal-style pyramid that never changes the site-below-to-site-above link number, with a binary multiple increase, which is vulnerable because the pyramid becomes unstable when the second tier’s back-links mostly go away), I tried the above 60-link-first-tier pyramid as an example of a good, simple pyramid to build, trying to anticipate possible link removal, especially in the second tier. The number of back-links the first tier receives from the second (60) must be high enough that, with a low TMIR, you can absorb the first tier more or less failing to give you credit. If their Pagerank is too low, they may not notice you at all, may cancel post pages or remove links once they gain more visibility, or may go out of business because they are less likely to be making money. High Pagerank means nofollow is more likely, which is very problematic: more competition, and perhaps more attention paid to killing or black-listing back-links. But a back-link is also more likely to “blend in” when traffic volume, human and robot, is high, especially when the content is political with no sales pitch, so it is more likely to survive.

Long-tail keywords, however, mean tighter targeting with chances of high rankings, because partial matching of your many-tiered, well-back-linked website can rank it for more competitive keywords as well. This capitalizes on a 2014 report that the average number of keywords per search went from 3 to 4, and that 25% of searches use keywords never looked up before, territory where long-tails are the most likely to corner the market. Furthermore, the less competitive sites extracted from long-tail keywords are probably low-Pagerank, dofollow, and more cooperative, so building them up, as is my motto, is mutual symbiosis: the “little guys” help one another out and take down the inefficient, grand-fathered, big-money, high-Pagerank competition. I propose taking the middle third of the extracted URL list, ranked from high Pagerank to low (but remember, these are long-tail keywords), and placing it on top; the top third goes in the middle, and the bottom third stays where it is (see the sketch below).
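Here is a minimal sketch of that middle-third reordering, assuming the extracted list is already sorted from highest to lowest Pagerank; the function name and placeholder URLs are illustrative, not part of any existing tool:

# Sketch: reorder an extracted URL list, already sorted from highest to
# lowest Pagerank, so the middle third moves to the top, the top third
# moves to the middle, and the bottom third stays in place.

def reorder_middle_third(urls):
    third = len(urls) // 3
    top, middle, bottom = urls[:third], urls[third:2 * third], urls[2 * third:]
    return middle + top + bottom

# Hypothetical example with nine placeholder URLs:
extracted = ["http://example%d.com/blog" % i for i in range(1, 10)]
print(reorder_middle_third(extracted))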
B should equal C for simplicity, to build an equilateral pyramid with fixed area (work) and maximum volume (Pagerank, traffic, sales). “Eiffel-like towers” are good for stability and ease, on the presumption that the lower tiers have increasing robot back-linking activity of their own that grows your pyramid. C can be a fraction, where the actual URL count is rounded off, or a function of a power series, as with Pascal’s triangle (1 + 2 + 4 + 8 + …). For now, C = 2 is the simplest way to approximate a strong, close-to-equilateral pyramid, but a wide totem pole (C = 1) with a large A, such as 500, is also very good. There the number of tiers is very high, on the presumption that some of the lower tiers already have back-links and will accept your invitation to build more back-links to them with PandaBusters.com; for you, that means less work in the spin calculator, since fewer URLs are randomized in each link-spin, and you can return and create additional (parallel) pyramids, back-linking to the old second tier and creating “multiple totem poles” for more of a pyramid effect.

The highest per-page Pagerank has little to do with the goal of creating an equilateral pyramid of maximum volume; what matters is a high average number of per-tier back-links times the number of tiers, for a wide, tall, square orthogonal matrix. So if your affiliates are limited in size to 99 tiers for affiliate commissions, then, because the structure is permission-based with much less concern about terminated back-links, a 100 x 99-tier totem pole with HDLCTU, long-tail, randomized keywords and keyword-relevance grabbing is essential. It is only 9,900 back-links per project, but it takes much less time, and if you are promoting more than one site, you have x sites times 9,900 back-links, which can be quite high; with no small, more mortality-resistant tiers, you are looking at tens of thousands of back-links.

The goal with high Pagerank is an Eigenvector-style calculation, where your keyword density is as close as possible to the best frequency, prominence, and proximity of keywords, in meta-tags, the URL string, and comments, close to the best weighted average factored out with WebCEO, SEO Quake, SEO Rush, etc. With frequency, for example, 1 / |your frequency − Google’s best suggested frequency| is one factor, as is the lowest possible plagiarism score, and each is factored into a “scalar”: the scalars are the weighted averages, numbers with no direction, which are multiplied by vectors of differing values, since lower-tier back-links may be less important than higher tiers. The 12 − m scalar factors like frequency (pointing to original content is important, as a third matrix may give higher scores to pages with outbound links) are represented by a vertical column of 12 − m values, and the vector matrix is m x n in size (n = the number of back-links in the second tier, although it can be replaced by a single column of n “Pagerank” scores that apply to all back-links). Here is an example:

[Figure: Google Eigenvector]
Your total Pagerank is the transformation in which each scalar in the vertical stack on the left is multiplied by the corresponding vector entry, and the resulting terms are added, where Pagerank = 10 is the highest score you can get, probably the result of dividing by the highest score of any page. So a “cube”, equally wide and tall, also maximizes volume while minimizing effort compared to a 3-D quadrilateral with a non-1:1 aspect ratio and the same total link number, and it should generate the highest Pagerank for something truly orthogonal, with some randomization of links (see the example below) to prevent penalties for what may be suspected robot activity.
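To make the scalar-stack-times-vector idea concrete, here is a minimal sketch; the weights, factor values, and the divide-by-the-best normalization onto a 0-to-10 scale are illustrative assumptions, not Google’s actual formula:

# Sketch: a weighted-average score as a scalar stack multiplied by a
# vector of per-factor values, normalized so the best page scores 10.
# All numbers below are made up for illustration.

weights = [0.4, 0.3, 0.2, 0.1]  # e.g. frequency, prominence, proximity, outbound links

def page_score(factors):
    # Multiply each scalar weight by the matching vector entry and add the terms.
    return sum(w * f for w, f in zip(weights, factors))

pages = {
    "my-page":   [0.9, 0.8, 0.7, 0.6],
    "best-page": [1.0, 1.0, 0.9, 0.9],
}

best = max(page_score(f) for f in pages.values())
for name, factors in pages.items():
    # Divide by the best score of any page and scale to 10, as described above.
    print(name, round(10 * page_score(factors) / best, 2))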

I am writing a program, available to sellers here, so their robots will be able to take a stack of 99 x 100 URLs from willing resellers, with affiliate-commission URLs, and break them up into 100 groups of 99 for a recursive spin, so you can quickly build a 9,900-link matrix pyramid, strong and almost perfectly cubical, for the highest possible Pagerank; true 99-tier pyramids were theoretically non-existent until now. For example, the curly brackets for spinning, which normally look like:

{URL1 | URL2 | URL3}

will look more like this:

{ {URL1-1 | URL1-2 | … URL1-99} | {URL2-1 | URL2-2 | … URL2-99} | … {URL100-1 | URL100-2 | … URL100-99} }

This will produce close to the 100-back-link-wide, 99-tier-high “cube” you desire, where you can use Scrapebox to target all 9,900 back-links as the first-tier attempt, with some rare cases of links pointing to themselves, which Googlebot eliminates. Regardless of how narrow the niche you represent is, it is good to listen to news events and tie what you do to current events: go to trends.google.com, look for HDLCTU keywords and their related up-trending keywords, brainstorm, merge all possible combinations of keywords in Scrapebox, brainstorm some more, and re-iterate (recursive thinking), considering all possible fluctuations and synonyms of keywords, to produce a final list. You will not need proxy servers for unblocking, and you will not have to wait days to extract or buy URLs. WordTracker.com is still good for this purpose as well. The resellers have been advised to offer URLs with basic blog comment sections on their websites, where anchor text can be written as a hyperlink, so you can plant HTML with an automatic dofollow credit; trending-upward keywords, slightly randomized, in anchor text across many tiers are the key. Otherwise, the seller membership search engine allows you to look only for “/blog” URL strings and filter out what you don’t want.
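Returning to the nested spin syntax above, here is a rough sketch of how a flat stack of 9,900 reseller URLs could be broken into 100 groups of 99 and assembled into that spin string; the URLs here are placeholders, not real reseller addresses:

# Sketch: assemble the nested spin syntax shown above from a flat list
# of 9,900 URLs split into 100 groups of 99. All URLs are placeholders.

urls = ["http://reseller%d-%d.com/blog" % (g, i)
        for g in range(1, 101) for i in range(1, 100)]  # 100 x 99 = 9,900

groups = [urls[i:i + 99] for i in range(0, len(urls), 99)]
spin = "{" + " | ".join("{" + " | ".join(g) + "}" for g in groups) + "}"

print(spin[:120] + " ...")  # { {URL1-1 | URL1-2 | ...} | {URL2-1 | ...} ... }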

As more resellers join for free, attracted by your solicitations to do so (ScrapeBox.com, notably, allows you to seek out keyword-relevant Web pages and solicit their webmasters through form-mail posts, a feature I cannot find in competitors’ software, preferably through our affiliate program where you earn a commission), you can mix in some black hat posting in the short term to guarantee more keyword-related resellers, and then comes the collapse of black hat and the micro-managers, saving the Internet!

For white hat purposes, and for the above and many other benefits, join here as a seller.