A few days ago I bought the sharebravery.com domain and switched my personal website from its GitHub Pages subdomain to the custom domain. I also upgraded the site from a technical blog to a content site, and took care of SEO along the way.
This post records the whole process, as a reference for anyone else building a content site.
Why buy a domain name
The sharebravery. Github. Io of github pages is available, but there are several questions:
Brand weak. It's not like a serious content station, seo. The search engine assesses the weight of a custom domain name more independently, and secondary domain names are always attached to gthub. Io below. If later deployment patterns (e. G. Vercel, cloudflare pages) are changed, the domain name remains unchanged, and all external chains and entries are not lost
Buying domain names in cloudflare. Com is about $80 a year at the same price as the continuation fee, and there is no formula for doubling the first low price。

Domain and URL canonicalization
Once the domain is bought, the first step is to unify the URL scheme.
https://sharebravery.com and https://www.sharebravery.com must point to the same place; otherwise search engines treat them as two sites and split their weight.
In practice: add the CNAME in Cloudflare DNS and redirect www to the root domain, so there is exactly one canonical URL.
The same applies beyond the home page. For paths like /web3/ and /tech/, pick one trailing-slash convention and stick to it; VuePress handles this by default.
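One way to enforce the single canonical host is a small redirect at the edge. A minimal sketch, assuming a Cloudflare Worker sits in front of the site and sharebravery.com is the canonical apex (the `canonicalize` helper and its name are mine, not from the post):

```typescript
// Sketch: 301-redirect www.sharebravery.com to the apex domain.
// Assumption: this runs as a Cloudflare Worker in front of the site.

// Pure helper: return the canonical URL for a request URL,
// or null when the URL is already canonical.
export function canonicalize(rawUrl: string): string | null {
  const url = new URL(rawUrl);
  if (url.hostname === "www.sharebravery.com") {
    url.hostname = "sharebravery.com";
    return url.toString();
  }
  return null;
}

export default {
  async fetch(request: Request): Promise<Response> {
    const target = canonicalize(request.url);
    if (target !== null) {
      // Permanent redirect, so search engines consolidate weight on one URL.
      return Response.redirect(target, 301);
    }
    return fetch(request); // already canonical: pass through to the origin
  },
};
```

Cloudflare's dashboard redirect rules can do the same thing without code; the worker just makes the rule explicit.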
Sitemap: a map for the search engine
A new site has no backlinks, so search engines have no idea where your pages are. The sitemap is how you tell them yourself.
The vuepress-theme-hope theme ships with sitemap generation; configure the hostname and no extra work is needed.
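For reference, the theme-side setup really is just the hostname. A minimal sketch of `.vuepress/config.ts`, assuming vuepress-theme-hope v2 (check the theme docs for the exact option names):

```typescript
// .vuepress/config.ts: minimal sketch (vuepress-theme-hope v2 assumed)
import { defineUserConfig } from "vuepress";
import { hopeTheme } from "vuepress-theme-hope";

export default defineUserConfig({
  theme: hopeTheme({
    // The bundled sitemap generation reads the deploy hostname from here.
    hostname: "https://sharebravery.com",
  }),
});
```

The generated sitemap then lands at /sitemap.xml by default.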
Submit it to:

- Google Search Console: add the domain, verify ownership, submit the sitemap URL. One submission is enough; after that, crawling is automatic.
- Bing Webmaster Tools: submit the sitemap here too. Bing picks it up faster than Google, and its index also feeds search engines such as DuckDuckGo.
- Baidu's webmaster platform: only if you have readers in mainland China. Indexing there is slow, and stable inclusion requires ICP filing. I skipped it; I couldn't even find the entrance, and the upfront steps looked like far more trouble than the first two, which you can just click through. Sigh, Baidu.
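Before submitting, it's worth checking that every URL inside the generated sitemap uses the canonical host, since a stray www URL would undo the canonicalization work. A small sketch (the function name and the regex-based parsing are mine; a real XML parser would be stricter):

```typescript
// Sketch: list sitemap <loc> entries that are not on the canonical host.
// Assumption: the sitemap uses the plain <urlset>/<url>/<loc> format.
export function nonCanonicalLocs(sitemapXml: string, canonicalHost: string): string[] {
  const locs = [...sitemapXml.matchAll(/<loc>([^<]+)<\/loc>/g)].map((m) => m[1]);
  return locs.filter((loc) => new URL(loc).hostname !== canonicalHost);
}
```

Fetch https://sharebravery.com/sitemap.xml and pass the body through this; an empty result means the sitemap is safe to submit.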

Indexing takes time
After submitting the site, a site:sharebravery.com search may turn up nothing. That's normal.
A search engine's pipeline is crawl → index → rank, and every step takes time.
Roughly: the first two to four weeks bring virtually no search traffic. No need for anxiety. There is only one thing worth doing during this period: keep writing.
Traffic monitoring: just enough is fine
There's no point wiring a pile of analytics tools into a new site; the data volume is too small to show anything.
Two tools are enough in the early days:

- Google Search Console: indexing status, search queries, clicks. This is the core SEO tool, a must-have.
- Cloudflare Web Analytics: visits and referrers. Free, privacy-friendly, and needs no extra deployment since the site is already on Cloudflare.

GA4 (Google Analytics) can wait until there is steady traffic; installing it now or later makes no difference.
Tools only assist. In the early stage, focus on content.
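For reference, wiring in Cloudflare Web Analytics is a single script tag. A sketch using the VuePress `head` option; "YOUR_TOKEN" is a placeholder for the site token you copy from the Cloudflare dashboard:

```typescript
// Sketch: inject the Cloudflare Web Analytics beacon via VuePress head config.
// This fragment belongs inside the top-level VuePress config object.
head: [
  [
    "script",
    {
      defer: true,
      src: "https://static.cloudflareinsights.com/beacon.min.js",
      "data-cf-beacon": '{"token": "YOUR_TOKEN"}', // placeholder token
    },
  ],
],
```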
Content strategy matters more than SEO tricks
Only one thing really determines search ranking: content quality.
Several principles:
- Write in series. Search engines can't make sense of scattered one-off articles. Take the Polymarket quantitative trading series: 12 articles linked to one another, so the search engine treats the whole series as one in-depth topic.
- Give titles search intent. "Polymarket quantitative trading in practice: API keys and proxy wallets" performs much better than "API keys", because it covers the keyword combinations users might actually search.
- Don't chase trending junk. Write what you have genuinely experienced. Search engines keep getting better at judging depth, and AI-generated filler will only rank worse over time.

Social media: a radar, not a home
X, Xiaohongshu and similar platforms are good for sharing takes and funneling readers in, but the full articles live only on the blog. (Granted, if you have X Premium and can post long form: in my search tests, X and Zhihu pages carry higher weight and are usually what gets found.)
The reason is simple: the blog's URLs are yours, while content on X belongs to the platform. What the search engine records is your domain name, not your tweets.
Other platforms (Zhihu, Juejin) can host mirrors as long as they link back to the original: the link doubles as a copyright statement, and it is also how search engines judge which copy is the source. Readers who see the same piece twice may get bored, though, so treat it as an open choice.
So
SEO has no black magic: buy a domain → unify the URL scheme → submit a sitemap to Search Console → keep monitoring light → then write content.
The technical configuration takes half a day; the rest is a matter of time and content.




