
Improving SEO for your Contentlayer Nextjs Blog

May 31, 2024 · Jadan Jones · Blog, SEO, Nextjs, Contentlayer · ⏱ 12 min read




You've built a sick blog site with Next.js, Contentlayer, and Tailwind CSS - sweet! If you need a guide on building one first, check out my post on How to build a blog with Nextjs and contentlayer. Now it's time to make sure your hard work gets seen by making your site super search engine friendly.

In this guide, I'll walk through optimizing your Next.js blog for SEO. We'll cover the essentials like dialing in your metadata, implementing structured data, optimizing images for performance, and more tricks to make Google and other search engines fall in love with your content.

Let's get those search engine bots crawling!

Step 1: Define Metadata

When defining metadata for a blog post in a Next.js application, the Metadata API provides a powerful way to customize the HTML head section of your pages. This metadata can be leveraged to improve the search engine optimization (SEO) of your website and enhance the way your content is displayed when shared on social media platforms.

The key aspects of the Metadata API that you can focus on for your blog post include:

  1. Page Title: This is the title of the page that will be shown in the browser tab and search engine results.
  2. Description: A brief summary of the content on the page, which can be used by search engines and social media platforms to provide context about your post.
  3. Open Graph (OG) Tags: These are a set of metadata properties defined by the Open Graph protocol, allowing you to control how your content is displayed when shared on platforms like Facebook, LinkedIn, and WhatsApp.
  4. Twitter Cards: These are a set of metadata properties specific to Twitter, enabling you to customize the way your content is presented when shared on the Twitter platform.

By properly defining these metadata properties, you can ensure that your blog post is presented in an optimal way on search engines and social media, leading to increased visibility and engagement with your audience.
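
To make that concrete, here's a rough sketch of how those fields map to the tags Next.js renders into the page's head. The values below are placeholders and the exact markup can vary by Next.js version - the dynamic version we'll actually use follows in a moment:

// A rough sketch of how Metadata API fields map to rendered head tags.
// Placeholder values; exact output can vary by Next.js version.
export const metadata = {
  title: 'My post title',            // <title>My post title</title>
  description: 'A short summary',    // <meta name="description" content="A short summary">
  openGraph: {
    type: 'article',                 // <meta property="og:type" content="article">
    title: 'My post title',          // <meta property="og:title" ...>
    description: 'A short summary',  // <meta property="og:description" ...>
    images: ['/images/blog/my-post/og.png'], // <meta property="og:image" ...>
  },
  twitter: {
    card: 'summary_large_image',     // <meta name="twitter:card" content="summary_large_image">
    title: 'My post title',          // <meta name="twitter:title" ...>
    description: 'A short summary',  // <meta name="twitter:description" ...>
  },
};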

To set up metadata in our blog, we're going to add the following code to your [slug]/page.js file (or whichever file generates your individual blog pages):

import { allPosts } from 'contentlayer/generated';

export const generateMetadata = ({ params }) => {
  // Find the post whose Contentlayer path matches the requested slug
  const post = allPosts.find((post) => post._raw.flattenedPath === 'blog/' + params.slug);

  if (!post) {
    return {};
  }

  const { excerpt, title, date } = post;

  const description = excerpt;

  const ogImage = {
    url: `${process.env.HOST}/images/blog/${params.slug}/og.png`,
  };

  return {
    title,
    description,
    openGraph: {
      type: 'article',
      url: `${process.env.HOST}/blog/${params.slug}`,
      title,
      description,
      publishedTime: date,
      images: [ogImage],
    },
    twitter: {
      title,
      description,
      images: ogImage,
      card: 'summary_large_image',
    },
  };
};

Ensure your Open Graph images are placed under your public/images/blog/ folder (for example public/images/blog/{slug}/og.png) so the URL built above resolves correctly.

Structured data

Structured data is a way to organize and provide context to the information on a web page. It uses standardized formats like JSON-LD, RDFa, or microdata to mark up the data in a way that makes it easier for search engines and other applications to understand the content and its meaning.

By implementing structured data, website owners can improve their chances of appearing in enhanced search results, known as rich snippets. These are the special features that stand out in search engine results pages, such as product information, reviews, event details, and more. When search engines can easily interpret the structured data on a site, they are more likely to display that information prominently, boosting the website's visibility and helping users find the most relevant and useful content.

Structured data is an important part of search engine optimization (SEO) because it allows websites to provide search engines with a clear, machine-readable interpretation of the page's content. This helps search algorithms better understand the context and relevance of the information, leading to more accurate and valuable search results for users. By taking the time to properly implement structured data, website owners can gain a significant advantage in how their content is discovered and presented online.

npm install schema-dts

schema-dts is a library that provides TypeScript type definitions for Schema.org, a widely-used set of semantic markup standards, making it easier to work with structured data in a project.
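
If you want editor-level type checking on your JSON-LD objects even in a plain .js file, you can apply schema-dts's types through a JSDoc comment. A minimal sketch, assuming JS type checking is enabled (// @ts-check or "checkJs" in your jsconfig/tsconfig):

// Minimal sketch: type-checking a JSON-LD object with schema-dts via JSDoc.
// Only takes effect when JS type checking is enabled in your editor/config.

/** @type {import('schema-dts').WithContext<import('schema-dts').Article>} */
const exampleStructuredData = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  headline: 'My post title',      // misspelled or invalid properties get flagged
  description: 'A short summary',
  datePublished: '2024-05-31',
};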

Add this code to your [slug]/page.js:

import { allPosts } from 'contentlayer/generated';
import { notFound } from 'next/navigation';

export default async function Page({ params }) {
  const post = allPosts.find((post) => post._raw.flattenedPath === 'blog/' + params.slug);

  if (!post) {
    return notFound();
  }

  const structuredData = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    url: `${process.env.HOST}/blog/${params.slug}`,
    image: {
      '@type': 'ImageObject',
      url: `${process.env.HOST}/images/blog/${params.slug}.webp`,
    },
    description: post.excerpt,
    datePublished: post.date,
    publisher: {
      '@type': 'Person',
      name: 'Jadan Jones',
      url: process.env.HOST,
    },
    author: {
      '@type': 'Person',
      name: 'Jadan Jones',
      url: process.env.HOST,
    },
  };

  const jsonLd = {
    '@context': 'https://schema.org',
    '@graph': [structuredData],
  };

  return (
    <>
      {/* Inject the JSON-LD structured data into the page */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
      {/* Your blog content or component goes here */}
    </>
  );
}

Step 2: robots.txt

A robots.txt file is a standard text file that provides instructions to web crawlers and bots, such as search engine indexing agents, on which parts of a website should be crawled and indexed.

In a Next.js application, it's recommended to place the robots.txt file within the app folder. This allows you to easily manage the instructions for web crawlers, ensuring your website's content is properly discovered and indexed by search engines. A well-crafted robots.txt file is essential for search engine optimization, as it helps control the visibility and indexing of your website's content.

User-Agent: *
Allow: /
Disallow: /private/
Sitemap: https://{yourdomain}/sitemap.xml
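
Alternatively, Next.js can generate this file for you: instead of a static robots.txt, export a default function from app/robots.js. A minimal sketch mirroring the rules above, assuming the same HOST environment variable used earlier:

// app/robots.js - Next.js serves /robots.txt from this route
export default function robots() {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: `${process.env.HOST}/sitemap.xml`,
  };
}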

Step 3: sitemap.xml

A sitemap.xml file acts as a navigational guide for search engines, helping them discover and understand the key pages on your website. This improves the website's visibility and ranking in search engine results pages (SERPs).

In a Next.js application, it's recommended to create a sitemap.js file within the app folder to manage sitemap generation. This file lets you control exactly which pages are included, giving search engines a clear roadmap of the important content on your site - which can lead to better indexing, higher visibility in search results, and ultimately more traffic and engagement.

// app/sitemap.js
import { allPosts } from 'contentlayer/generated';

// One sitemap entry per blog post, using the post's date as lastModified
const postsSitemap = allPosts.map((post) => ({
  url: `${process.env.HOST}/${post._raw.flattenedPath}`,
  lastModified: post.date,
}));

export default function sitemap() {
  return [
    {
      url: process.env.HOST,
      lastModified: new Date(),
    },
    {
      url: `${process.env.HOST}/blog`,
      lastModified: new Date(),
    },
    ...postsSitemap,
  ];
}

Conclusion

And that's a wrap! With your metadata dialed in, structured data marking up every post, and a robots.txt and sitemap.xml pointing crawlers at the right pages, your Next.js and Contentlayer blog is primed for search engine glory. Pair that well-optimized content with Next.js's snappy performance and you'll be attracting readers from far and wide in no time.