SEO
For SEO we use Nuxt SEO. This tutorial shows the basic configuration and usage in Rshop Bootstrap. For more information, refer to the Nuxt SEO docs.
Site Title
We set our default site title in nuxt.config.ts:
{
  site: {
    url: "https://rshop-bootstrap.rshop.sk",
    name: "Rshop bootstrap",
  },
}
Then when we set metaTitle for a page, the final title will be:
Page Specific Title | Rshop bootstrap
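This pattern comes from the title template. If we ever need to change it, Nuxt lets us override it via app.head.titleTemplate, where %s is replaced by the page title. A minimal sketch that explicitly reproduces the pattern shown above:

// nuxt.config.ts
export default defineNuxtConfig({
  app: {
    head: {
      // %s is replaced by the title set on each page
      titleTemplate: "%s | Rshop bootstrap",
    },
  },
});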
Page Metadata
To set page-specific metadata we use the useSeoMeta composable.
<script setup lang="ts">
useSeoMeta({
  title: "Page title",
  description: "Page description",
  robots: "index, follow",
});
</script>
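useSeoMeta also accepts getter functions, which keeps the metadata reactive when it comes from fetched data. A minimal sketch, assuming a hypothetical /api/page endpoint standing in for the real data source:

<script setup lang="ts">
// /api/page is a hypothetical endpoint; replace with your real data source
const { data: page } = await useFetch("/api/page");

useSeoMeta({
  // getters keep the tags in sync when the data changes
  title: () => page.value?.title ?? "Fallback title",
  description: () => page.value?.description,
});
</script>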
For more details, refer to the Nuxt SEO docs.
Open Graph Image
Nuxt SEO enables us to use Vue components as templates for OG images. Currently our custom templates are located in /components/ogImages.
The templates folder is also configured in nuxt.config.ts:
{
  ogImage: {
    componentDirs: ["./components/ogImages"],
  },
}
A custom template is a plain Vue component that receives its content via props. For example, our Default template:
<script setup lang="ts">
interface Props {
  title: string;
}

const props = defineProps<Props>();
</script>

<template>
  <div
    class="h-full w-full flex items-center justify-center bg-[#322c82] relative text-white"
  >
    <div class="h-full flex flex-col items-center justify-center">
      <h1 class="text-8xl">{{ props.title }}</h1>
    </div>
    <div
      class="flex items-center justify-center py-4 px-6 rounded-xl bg-white absolute bottom-6 right-6"
    >
      <img src="/logo.svg" alt="Logo" />
    </div>
  </div>
</template>
Then, in a page, we utilize defineOgImageComponent to use our custom template, passing the template name and its props:

<script setup lang="ts">
defineOgImageComponent("Default", {
  title: "Page title",
});
</script>
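Note that OG images are rendered with Satori by default, which supports only a subset of CSS, so templates should stick to simple flexbox-based layouts like the one above.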
Sitemap
Everything concerning the sitemap file is handled automatically for us. It maps the pages and includes a separate sitemap for each defined locale.
The only thing left for us to do is to include dynamic routes in the sitemap, e.g. for blogs, products, etc. In this case we need to provide the URLs ourselves, as sketched below. Refer to the Nuxt SEO docs for more details on how to do this.
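One way the sitemap module supports this is a server route that returns the extra URLs. A minimal sketch, assuming a hypothetical fetchProducts() helper standing in for the real data source:

// server/api/__sitemap__/urls.ts
import { defineSitemapEventHandler } from "#imports";

export default defineSitemapEventHandler(async () => {
  // fetchProducts() is a hypothetical placeholder for your CMS/DB query
  const products = await fetchProducts();

  return products.map((product) => ({
    loc: `/products/${product.slug}`,
    lastmod: product.updatedAt,
  }));
});

The route is then registered as a source in nuxt.config.ts:

{
  sitemap: {
    sources: ["/api/__sitemap__/urls"],
  },
}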
Robots
All that is left for us to do concerning the robots.txt file is to set the routes we don't want robots to crawl. For example, in our nuxt.config.ts we disable crawling of the Nuxt API routes like this:
{
  robots: {
    disallow: ["/api/"],
  },
}
The sitemap.xml file is also automatically referenced in the robots.txt file.
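With the config above, the generated /robots.txt should look roughly like this (the exact output may vary by module version; the sitemap URL follows site.url):

User-agent: *
Disallow: /api/

Sitemap: https://rshop-bootstrap.rshop.sk/sitemap.xml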
How to see if all of this works?
You can inspect the results of all of these in Nuxt DevTools, which is available on the local development page.