  • How to ensure the long-term SEO health of a website

       2026-03-19

    On the issue of website SEO, Liu Ming, CEO of Yelong SEO, said that many SEOers judge a website only by its overall data, whereas the site should produce a detailed "health check" report every week, which would surface problems from a number of detailed indicators at the earliest possible moment.

    It reads as follows:

    The first point of this article is that many webmasters have no idea about their website's health:

    At Baidu events, a number of webmasters would ask Baidu staff questions that simply cannot be answered: what should we do? What should we do? What should we do?

    These questions are too general to answer even if all the back-office data were fully open. The questioners clearly lack a basic understanding of their own website's structure. What does a good question look like? Be patient; the answer comes later in this article.

    The second point of this article is that product revamps are devastating to SEO.

    The most serious problems for SEO are often not SEO problems at all, but product or technology problems. Some large websites are revamped like this every time:

    1. A whole new set of URLs replaces the old one.

    2. The old URL patterns cannot be 301-redirected to the new version because the data is incompatible.

    3. Even when the data is compatible, the 301 redirects are forgotten.

    I once asked a product manager how many times the URL patterns had changed; the answer was three or four. But looking at web.archive.org, it was at least eight, on average one change per year. Anyone with a little common sense about search engines would realize this site is a typical case of "no zuo no die" (courting disaster).

    The third point of this article is that over a long iterative development process, SEO requirements may be gradually lost.

    In product, development, and QA thinking, there is often no clear definition of a URL: as long as the page is accessible and the content is right, it is fine. URLs like the following are all considered acceptable, never mind the other basic norms of SEO. In other words, nobody but SEO cares about these things, and every release can drop or mangle them:

    1. http://www.a.com/project(category)/

    2. http://www.a.com/product.html/

    3. http://www.a.com/project/?channel=123&category=abc&brand=def&tracking=other-site
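    One way to make "the basic norms of SEO" enforceable rather than tribal knowledge is to write the URL rules down as executable checks. The sketch below is mine, not the author's tooling; the patterns are hypothetical stand-ins for whatever a real site's URL specification says.

```python
import re

# Hypothetical whitelist of URL path patterns the SEO spec allows;
# a real site's rules would differ and should come from its own spec.
ALLOWED_PATTERNS = [
    re.compile(r"^/[a-z]+/$"),               # channel page, e.g. /project/
    re.compile(r"^/[a-z]+/[a-z0-9]+/$"),     # channel + one dimension
    re.compile(r"^/[a-z]+/item\d+\.html$"),  # detail page
]

def is_seo_compliant(path: str) -> bool:
    """Return True if the path matches one of the defined URL patterns."""
    return any(p.match(path) for p in ALLOWED_PATTERNS)
```

    With such a check in a release pipeline, a URL like `/project/?channel=123&...` fails immediately instead of being discovered months later in the logs.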

    There was once a product with 30 million pages indexed in Google and 20 million in Baidu, and traffic was good; then attention shifted to another product. After one month a decline in traffic was noticed but written off as a seasonal factor and ignored; two months later the decline was dramatic. A close examination revealed an alarming change:

    1. The URLs of the list pages were http://www.a.com/product/item100.html

    2. Without anyone being informed, a technical colleague added a 301 redirect to http://www.a.com/search/?project=a&item=100

    3. The /search/ directory was disallowed in robots.txt

    Within the following two weeks, the number of indexed pages fell to around 3 million.
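    The incident above is exactly the kind of interaction a simple automated check can catch: a redirect target that falls under a robots.txt Disallow rule can never be crawled. This is a minimal sketch of such a check, using only naive prefix matching against Disallow lines (real robots.txt semantics are richer); the URLs and rules mirror the hypothetical example above.

```python
def is_disallowed(path, robots_lines):
    """Minimal robots.txt check: prefix-match the path against
    Disallow rules (assumes a single user-agent group)."""
    for line in robots_lines:
        line = line.strip()
        if line.lower().startswith("disallow:"):
            rule = line.split(":", 1)[1].strip()
            if rule and path.startswith(rule):
                return True
    return False

# The incident above: list pages 301 to /search/?..., but /search/ is blocked.
robots = ["User-agent: *", "Disallow: /search/"]
redirect_target = "/search/?project=a&item=100"
print(is_disallowed(redirect_target, robots))  # True: the 301 target cannot be crawled
```

    Running this against every 301 target added in a release would have raised the alarm on day one instead of month two.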

    I wanted a system that would help me keep track of these things, so that I would not have to worry every day about SEO requirements being lost without knowing who lost them, and so that whenever a problem appears, an immediate warning goes out to me and to the development and QA colleagues involved, instead of me spending all my time chasing them.

    Content mind maps, meta-information, page module testing, spider log monitoring

    Given the points above, my solution consists of:

    • content mind maps

    • meta-information

    • page module testing

    • spider log monitoring

    These schemes were conceived five years ago and tested on a small scale, but because of their complexity and high development cost we fell into many pits, and they were only rolled out step by step over the last two years. They are not necessarily suitable for small companies, so SEO practitioners are invited to weigh them carefully.

    • content mind maps

    From a product point of view, a website is composed of functions with reasonable flow relationships between them (the process is not discussed here), consistent with the user experience but not necessarily with the search-engine experience:

    From an SEO point of view, a website is structured as a collection of various user search needs arranged in a reasonable hierarchy:

    Mind maps differ greatly between websites, since they may depend largely on the technical architecture. It is therefore recommended that SEO practitioners draw up the map only after gaining a thorough knowledge of the site's technical architecture. Specific details cannot be given here, but at a minimum you must be clear about:

    1. Which content nodes exist on the website

    2. Which of them hit users' search needs

    3. Which have no search demand at all

    4. Which nodes are missing

    5. How the hierarchy should be deployed
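    The five questions above become much easier to answer if the mind map lives as data rather than as a drawing. This is a minimal sketch of that idea, assuming a nested-dict representation and a per-node `search_demand` annotation; both are my invention, not the author's format.

```python
# Hypothetical content mind map as a nested dict: node -> children.
# Leaves carry an annotation of whether they hit a known user search need.
mind_map = {
    "home": {
        "channel": {
            "brand-abc": {"search_demand": True},
            "internal-tool": {"search_demand": False},  # nothing to search for
        }
    }
}

def nodes_without_demand(tree, path=""):
    """Walk the map and collect node paths that hit no user search need."""
    found = []
    for name, child in tree.items():
        p = f"{path}/{name}"
        if isinstance(child, dict) and "search_demand" in child:
            if not child["search_demand"]:
                found.append(p)
        elif isinstance(child, dict):
            found.extend(nodes_without_demand(child, p))
    return found
```

    Similar walks can list all nodes (question 1) or diff the map against a list of known search needs to find missing nodes (question 4).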

    The mind map is by no means set in stone. It needs to be updated promptly whenever the product brings a new page pattern online or takes an old one offline, and whenever you discover new user search habits you should update it and send it to your product colleagues.

    • meta-information

    By meta-information I do not mean only meta tags; I mean all the routine, regular, quantifiable SEO information on a page, including title, keywords, description, h1, and so on.
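    To monitor these fields at scale you first need to extract them from each page. A minimal sketch using Python's standard-library HTML parser is below; it is my illustration of the idea, not the author's tooling, and covers only the fields just listed.

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect the routine, quantifiable SEO fields of a page:
    title, meta description/keywords, and h1 text."""
    def __init__(self):
        super().__init__()
        self.fields = {"title": "", "description": "", "keywords": "", "h1": ""}
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") in ("description", "keywords"):
            self.fields[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.fields[self._current] += data
```

    Feeding each crawled page through such an extractor yields a table of fields that can be diffed against the expected values after every release.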

    From the SEO point of view, the site's URLs look like this, corresponding to the levels of the content mind map:

    Home page: www.example.com/

    Home page - channel 1: www.example.com/channel/

    Home page - channel 1 - dimension 1: www.example.com/channel/abc/

    Home page - channel 1 - dimension 1 - dimension 2: www.example.com/channel/abc/xyz/

    Home page - channel 1 - detail page: www.example.com/channel/item12345

    From a product, development, or QA perspective, the URLs may be as disorderly as:

    • www.example.com/channel/?category=abc&brand=xyz&tracking=other-site

    • www.example.com/channel/?item=12345

    And in the next release they may simply be changed to:

    www.example.com/?channel=123&category=abc&brand=def&tracking=other-site

    Without a clearly defined set of rules, it is almost impossible to know whether the current website still matches what you once perfected. Based on the SEO content mind map, we maintain the following metadata table (only a few fields are listed, for reference):

    • page module testing

    The term "unit test" is borrowed from development, where it originally meant testing a single function or class; here I redefine it specifically for SEO testing. Our tools were likewise developed on top of RSpec. The module can run in two environments: production and testing.

    The aim of the production run, which we call the regression test, is to ensure that SEO requirements that previously went live are still in place; if not, an alarm is raised so they can be repaired promptly.

    In the testing environment, it is meant to be used by R&D staff, for purposes similar to TDD. It contains the regression tests for existing pages as well as tests for new requirements, so it can be treated as a requirements document: once the developers make it pass, your requirements are complete. When those requirements go live, the new tests are merged into the regression suite.

    The tests can cover: every detail of the meta-information, anchor text of known links, on-site URLs, off-site URLs, breadcrumbs, alt attributes, response time, page size, and so on.
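    The author's tools are built on RSpec; as an illustration only, the same idea can be sketched in Python's `unittest`. Everything here is hypothetical: the `fetch_page` stub stands in for actually crawling and parsing a page, and the thresholds are examples, not the author's rules.

```python
import unittest

# Hypothetical stand-in for fetching and parsing a page; a real
# regression run would hit live (or test-environment) pages.
def fetch_page(url):
    return {
        "status": 200,
        "title": "Channel ABC - Example",
        "h1": "Channel ABC",
        "canonical": "http://www.example.com/channel/abc/",
        "response_ms": 180,
    }

class TestChannelPageSEO(unittest.TestCase):
    def setUp(self):
        self.page = fetch_page("http://www.example.com/channel/abc/")

    def test_title_present_and_bounded(self):
        # Title exists and stays within a sane length (example threshold).
        self.assertTrue(0 < len(self.page["title"]) <= 70)

    def test_h1_matches_channel(self):
        self.assertEqual(self.page["h1"], "Channel ABC")

    def test_canonical_is_clean_url(self):
        # Canonical must be the clean hierarchical URL, not a query string.
        self.assertNotIn("?", self.page["canonical"])

    def test_response_time(self):
        self.assertLess(self.page["response_ms"], 10_000)
```

    Run in production this is the regression alarm; run against a test environment before release, a failing case doubles as a requirements document for the developers, exactly as described above.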

    • spider log monitoring

    With "meta-information" defined, watching spider logs becomes easy. Some phenomena we have encountered:

    1. 85% of all spider visits received response 301.

    2. 50% of spider crawls were of pages that should never have been crawled.

    3. The average response time for certain page patterns exceeded 10 seconds per request.

    4. Of the requests that received response 200, 60% were not URLs required by SEO.
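    Numbers like these fall out of a simple aggregation over the access logs. A minimal sketch is below; the three-field log format (path, status, response time in ms) is a simplification I invented for illustration, where a real log line would need proper parsing first.

```python
from collections import Counter

# Hypothetical simplified access-log lines: path, status, response time (ms).
log_lines = [
    "/old-url/ 301 12",
    "/channel/abc/ 200 180",
    "/search/?item=1 200 9500",
    "/old-url2/ 301 10",
]

def summarize(lines):
    """Share of each response code, and average response time per status."""
    codes, total_ms = Counter(), Counter()
    for line in lines:
        path, status, ms = line.split()
        codes[status] += 1
        total_ms[status] += int(ms)
    share = {s: n / len(lines) for s, n in codes.items()}
    avg_ms = {s: total_ms[s] / codes[s] for s in codes}
    return share, avg_ms
```

    Cross-referencing the same aggregation with the URL whitelist from the metadata table is what surfaces findings like "60% of the 200 responses were not URLs required by SEO."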

    The attached figure shows only a few fields, for reference:

    Coming back to the question at the beginning of this article, a more meaningful question looks like this: "How many spider visits does one of my page patterns get each day? How many of those return response 200? What is the average response time? The main content and every SEO element are normal, and there is no cheating, yet this pattern has recently been dropped from the index." Anyone who can ask a question in that much detail hardly needs to ask at all: with sufficient detail, most problems solve themselves.
