Welcome to Internet Early Reading Class
It has been fun watching the SEO field lately. It is always good to learn something new, and with more experience I feel less anxious than I used to, so this is a good time to study something enjoyable. Here I would like to recommend two introductory SEO books: the "Baidu SEO" primer and "Seven Days to Master SEO."
The hottest SEO book right now is Zac's "SEO实战密码" (SEO Practical Password), and its Douban rating is good, but its reading logic is scattered: with so much information it reads more like an accumulation of old blog posts, which is not really suitable for someone just getting into SEO. My personal suggestion is to build your own understanding first, starting with the two books above, and then extend the framework you have built. That is my usual way of reading; I actually do not read much, and I prefer to read whatever suits me at my current stage.
First, let's look at the focus of each of these two books.
Seven Days to Master SEO
This book focuses on on-site optimization, off-site optimization, and strategy, emphasizing practical optimization methods built on a base of general knowledge. It can be read through quickly, and the case-study chapters are worth reading alongside the main text.
The Baidu SEO book
This book focuses on keyword selection, link-optimization techniques, an understanding of web technologies, and coverage of paid search bidding and ad networks.
Next, let's comb through the SEO basics.
1. SEO defined:
Search engine optimization (SEO) is the technique of obtaining traffic from search engines. Its main task is to understand how search engines work: how they crawl pages, build indexes, and rank results for a keyword. With that understanding, page content can be optimized scientifically to match user browsing habits, improving rankings and site traffic and ultimately turning that traffic into revenue.
2. How search engines work:
There are three main stages: crawling, preprocessing, and serving results.
2.1 Crawling
Its main job is fetching web pages; there are currently three crawling methods.
2.1.1 Common spiders
A search-engine spider is the engine's automated crawler: it visits web pages, images, videos, and so on across the internet and builds an index library. A spider is usually identified by its user agent (spider + URL); by checking the URLs and other attributes in your server logs, you can trace the search engine's visits.
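As a concrete illustration of log checking, here is a minimal Python sketch that tallies spider visits by user agent. The log lines and spider names below are illustrative examples, not from the original article:

```python
import re

# Hypothetical spider user-agent patterns; extend for other engines as needed.
SPIDER_PATTERNS = {
    "Baiduspider": re.compile(r"Baiduspider", re.IGNORECASE),
    "Googlebot": re.compile(r"Googlebot", re.IGNORECASE),
}

def count_spider_visits(log_lines):
    """Tally how many log lines mention each known spider user agent."""
    counts = {name: 0 for name in SPIDER_PATTERNS}
    for line in log_lines:
        for name, pattern in SPIDER_PATTERNS.items():
            if pattern.search(line):
                counts[name] += 1
    return counts

# Invented sample access-log lines for demonstration.
sample_log = [
    '1.2.3.4 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Baiduspider/2.0)"',
    '5.6.7.8 - - "GET /a HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '9.9.9.9 - - "GET /b HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0)"',
]
print(count_spider_visits(sample_log))  # {'Baiduspider': 1, 'Googlebot': 1}
```

In practice you would feed real access-log lines into the same function instead of the sample list.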
2.1.2 Crawling strategies
2.1.3 Preprocessing
That is, indexing the fetched data. Preprocessing includes multiple sub-steps, all completed ahead of time, before any query is served.
2.1.3.1 Extracting keywords
Strip tags and code such as HTML, JS, and CSS, and extract the valid text used for ranking.
2.1.3.2 Removing stop words
That is, removing words with no retrieval value, such as the particle 啊 ("ah").
2.1.3.3 Word segmentation
This is a technology specific to Chinese-language search engines. English separates words with spaces, but Chinese does not, so the engine must cut each sentence into smaller units. There are two main segmentation methods: dictionary matching and statistical analysis.
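The dictionary-matching approach can be sketched in a few lines of Python. The toy dictionary below is a made-up example; real segmenters ship dictionaries with hundreds of thousands of entries plus statistical models:

```python
def forward_max_match(sentence, dictionary, max_len=4):
    """Dictionary-based forward maximum matching: at each position, take the
    longest dictionary word that matches; fall back to a single character."""
    words = []
    i = 0
    while i < len(sentence):
        for size in range(min(max_len, len(sentence) - i), 0, -1):
            piece = sentence[i:i + size]
            if size == 1 or piece in dictionary:
                words.append(piece)
                i += size
                break
    return words

# Toy dictionary for illustration only.
toy_dict = {"搜索", "引擎", "搜索引擎", "优化"}
print(forward_max_match("搜索引擎优化", toy_dict))  # ['搜索引擎', '优化']
```

Note how "搜索引擎" wins over "搜索" + "引擎" because the matcher greedily prefers the longest dictionary entry.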
2.1.3.4 Noise removal
Remove everything that is useless to the search engine, such as advertising text, decorative images, login boxes, and copyright notices.
2.1.3.5 Analyzing pages and building the inverted index
That is, converting the extracted terms into an inverted index so that pages can be looked up by keyword.
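To make the idea concrete, here is a minimal Python sketch of an inverted index; the sample documents are invented for illustration:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document ids that contain it.
    docs: {doc_id: [term, term, ...]}, terms as produced by segmentation."""
    index = defaultdict(set)
    for doc_id, terms in docs.items():
        for term in terms:
            index[term].add(doc_id)
    return index

# Invented sample documents, already tokenized.
docs = {
    1: ["seo", "basics", "tutorial"],
    2: ["seo", "tools"],
    3: ["keyword", "tools"],
}
index = build_inverted_index(docs)
print(sorted(index["seo"]))    # [1, 2]
print(sorted(index["tools"]))  # [2, 3]
```

A query for "seo" now returns matching pages directly instead of scanning every document, which is the whole point of the inverted file.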
2.1.3.6 Computing link relationships
Compute which links on each page point to which other pages, which inbound links each page has, and what anchor text those links use. Google's PR (PageRank) grew out of this step.
2.1.3.7 Handling special files
Scripts and programs cannot be executed for non-text content such as Flash, video, PPT, XLS, and images; images are generally handled through their tags (such as alt text).
2.2 Serving results
This is how output is presented; for example, the parts that match the search keyword are highlighted in red.
3. Website directories
A website directory is a human-edited collection of quality websites, grouped into categories by topic; entries are mostly submitted manually. hao123 is a typical example.
4. Keywords
Broadly, keywords are whatever the user enters in the search box. They can be classified conceptually as target keywords, long-tail keywords, and related keywords; by page, as homepage, category-page, and content-page keywords; and by purpose, as direct keywords versus marketing keywords.
5. Weight and PR value (PageRank)
PR is one of the methods the Google search engine uses to measure a page's importance and one of its key criteria for judging whether a site is good or bad; the biggest influencing factor is whether the site has many high-quality external links.
Site weight is how the search engine "treats" a site hierarchically, a composite performance indicator determined by inbound external links, consistently high-quality content, and a sound site structure.
Take care to distinguish these two different concepts.
6. White-hat SEO and black-hat SEO
7. Anchor text, external links, internal links, one-way links, reciprocal links, outbound links, inbound links
8. Organic listings
These are the free listings in the SERP (search results page), which can be improved with a well-designed SEO strategy.
9. The robots.txt file
The robots exclusion protocol: through robots.txt a website tells search engines which pages may be crawled and which may not. How do we view it? Enter url/robots.txt in the browser.
For example, www.taobao.com/robots.txt:
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /
The User-agent and Disallow directives are used in combination in robots files, mainly in four cases: allow all spiders, block all spiders, block a specific spider, or allow only a specific spider.
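Such rules can be tested programmatically. Here is a sketch using Python's standard urllib.robotparser to check what a given spider may fetch; the URLs and the second bot name are illustrative:

```python
import urllib.robotparser

# Rules in the style of the Taobao example: block Baiduspider and everyone else.
rules = """\
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Baiduspider", "https://www.taobao.com/item"))  # False
print(rp.can_fetch("SomeOtherBot", "https://www.taobao.com/"))     # False
```

In real use you would call rp.set_url("https://example.com/robots.txt") and rp.read() to fetch a live file instead of parsing a string.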
10. Nofollow
An attribute that tells the search engine not to vote for a link or pass weight through it; it can be used to guard against spam links.
11. Hidden links (black links)
Links that exist only in the page's source code and are invisible on the rendered page.
12. Dynamic and static pages
13. Bounce rate
The percentage of users who find a site in search, click through, and leave after viewing only one page.
14. Web page snapshots
When indexing a page, the search engine keeps a backup copy of it in its own server cache. When a user clicks the snapshot link in the results, the engine displays the page content as its spider captured and stored it at crawl time; this is the "snapshot."
15. Common HTTP status codes
16. SEO search commands
"site:url" searches the pages indexed from a specific website.
"link:url" searches for external links to a URL.
"related:url" finds websites related to your site's content.
"info:url" is a comprehensive command that looks up information about a specific site: a recent snapshot, similar pages, site links, internal links, and pages containing the domain name. Google only.
"allintext:/intext:" queries pages whose body text contains specific keywords, useful for finding the most relevant pages and potential link targets. Google only.
"allinurl:/inurl:" finds pages whose URL contains specific keywords; it can be combined with other commands. Google only.
"allintitle:/intitle:" finds pages whose title contains the entered text; useful for sizing up competitors and for finding papers or articles with particular keywords in their titles. Google only.
"allinanchor:/inanchor:" finds links whose anchor text contains specific keywords. Google only.
"define:" looks up definitions of a word.
"filetype:" searches for files with a specific extension, such as PDF or DOC.
"domain:" queries a site's Baidu-related pages, i.e. its Baidu external links; it works only on Baidu, while on Google "domain" is treated as an ordinary keyword.
17. Site maps
A site map is a page containing links to all the pages of the site that need to be crawled by search engines (note: not every page). Most visitors use a site map as a fallback when they cannot find the information they need elsewhere on the site.
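A site map intended for search engines is usually an XML file following the sitemaps.org protocol. Here is a minimal Python sketch that generates one; the URLs are placeholders, not from the article:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a URL list."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

xml_text = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml_text)
```

A real sitemap would usually also include per-URL elements such as lastmod, and would be referenced from robots.txt or submitted through webmaster tools.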
18. Website directory structure
A site's directories are the folders you create when building the site, and directory structure refers mainly to the physical and logical structure. When a site has many pages, especially thousands, a clear structure is needed so that both search engines and users can find their way, which is why directory structure matters so much in SEO. It is generally recommended that a site's directories go no more than three levels deep.
19. 301 redirect
A 301 redirect (permanent page move) is an important "automatic transfer" technique, and redirecting is the most viable option when pages move. When a user or a search engine sends a request to the web server, the server returns a 301 status code in the HTTP headers, meaning the page has been permanently moved to another address. It is used when a site changes its domain name, and it is commonly used to transfer weight.
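For illustration, a domain-change 301 might be configured like this in nginx; this is a hedged sketch, and the domain names are placeholders, not from the article:

```nginx
# Hypothetical example: permanently redirect every path on the old domain
# to the same path on the new domain with a 301 status code.
server {
    listen 80;
    server_name old-domain.example;
    return 301 https://new-domain.example$request_uri;
}
```

The $request_uri variable preserves the original path and query string, so each old URL maps to its counterpart on the new domain rather than everything landing on the homepage.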
20. Long-tail keywords
The term "long tail" captures two characteristics: thin and long. Thin means each long-tail segment is a small, previously undervalued share of the market; long means these small markets are numerous. Many small markets accumulate into a significant share of the total; that is the long-tail idea.
21. Site-wide links and homepage links. Site-wide links appear on every page of a site and all point to your homepage; homepage links exist only on the site's homepage. Distinguish the two when exchanging or buying links.
22. Reverse links
A reverse link is really a statement within the target document. A normal link in document A says "point to document B"; a reverse link instead requires document B to declare "document A points to me."
23. Pseudo-original content
24. Microformats
Microformats are an open standard for structured data: a defined format for structured xHTML code blocks that carry data. Because a microformat is an xHTML block, it is easy for humans to read; because it is structured, it is easy for machines to process and to exchange with external data. Reference site: http://microformats.org/
25. Box computing
The user simply enters a service request in a "box"; the system identifies the need precisely, routes it to the best content resource or application provider, and finally returns results that match the user's intent accurately and efficiently.
With the basic concepts covered, we can move on to the four most important SEO strategies:
how to choose keywords, how to build external links, how to optimize pages, and how to analyze data.
Both books describe approachable entry-level methods for these, and I feel they point in the right direction.
How to choose keywords
Step 1: List candidate keywords.
Step 2: Record each keyword's Baidu Index, competitiveness, and relevance in a table.
Step 3: Analyze the indicators in the table and select three target keywords, combining one high-competition keyword with one medium and one low, applying a "Tian Ji horse racing" strategy against competitors.
Step 4: Check search trends for the selected keywords, their current search-engine rankings, and the strongest relevant attributes of major competitors.
Step 5: Brainstorm around the selected keywords to pick long-tail keywords.
How to build external links
External-link optimization is mainly about balancing quality and quantity: one link from a high-weight site can do far more than ten or more ordinary links. The books describe several common methods:
Both books also include summaries of Chinese and international website resources available for link exchanges.
How to optimize pages
This refers mainly to fine-tuning that can be done within the site itself, based on the following:
How to analyze data
Site analysis usually relies on webmaster tools such as GA (Google Analytics); the books explain the key metrics an SEOer should watch when analyzing data:
Finally, a brief summary of the tools SEOers commonly use:
Not everything is covered here; additions are welcome :)
Internet Early Reading Class: eight o'clock every morning.
Focused on product, design, interaction, and operations.
Weibo: @internet early school
Group password: area-position-nickname
Internet Early Reading Class is a member of WeMedia, a self-media alliance covering 50 million people.




