Enhancing SEO in Single-Page Web Applications in Contrast With Multi-Page Applications

This paper comprehensively reviews methods for improving the visibility and user experience of single-page applications (SPAs), focusing on the intricacies of search engine optimisation (SEO). The research contrasts the complexities and challenges of optimising SEO in SPAs with those of conventional multi-page applications (MPAs). It identifies vital optimisation methods and evaluates their applicability in the contemporary web landscape. The research method involves implementing the explored optimisation techniques across three distinct projects utilising emerging technologies: an SPA, an MPA, and a hybrid approach using Isomorphic JavaScript. These applications are systematically examined and subjected to a comparative analysis to assess the effectiveness of the optimisation strategies before and after they are applied. The empirical results substantiate that adopting an innovative approach to Client-Side Rendering for the initial page load, combined with traditional SEO practices, performance enhancements, and tailored methodologies for specific technologies, facilitates SEO optimisation in SPAs at a level commensurate with MPAs. The findings of this work hold significant implications for web developers, offering insights and actionable strategies to augment visibility and performance in search engine results. By bridging theoretical understanding with hands-on application and empirical analysis, the research contributes to the evolving field of web application development. It underscores the critical role of SEO optimisation in the context of SPAs, highlighting its importance for search engine rankings and for overall user engagement and satisfaction. Code is available on GitHub: https://github.com/karolinakowalczyk?tab=repositories&q=TravelBLog


I. INTRODUCTION
In today's world, fierce competition is omnipresent in every branch of business, so it is crucial for all organisations and individuals to have a website that is highly ranked by search engines [1], [2]. The process of building a platform in a way that is recognisable and understandable by search engines is called Search Engine Optimization (SEO). SEO directs unpaid traffic (also known as organic traffic [3]) and does not analyse paid or direct traffic. Proper optimisation is essential, as it increases the odds that users and, most importantly, potential customers will find the page after entering a phrase. It is also a perk for regular visitors, who can quickly return to the page, directly improving user experience [4].
The associate editor coordinating the review of this manuscript and approving it for publication was Hai Dong.
Typically, users click on results available on the first page, from which an average of 71.33% of all clicks originate [1]. The second and third pages receive only 5.59% of clicks. Moreover, on the first page, the top five results account for 67.60% of all clicks, while results 6 to 10 account for only 3.73%. Additionally, appearing as the first result in Google means that a page directs 34% of organic traffic to itself [5], [6]. Moving from position 2 to 1 doubles the number of visitors. Position 1 is worth as much as positions 2, 3, 4, and 5 combined, and more than positions 5-20 together. Meanwhile, a jump from the second page to the first (from 11th to 10th position) increases traffic to the page by 143%. Naturally, the goal of SEO is to strive to display the page as the first search result. Although this is not always possible, the above numbers demonstrate how significant even a one-position leap can be [7]. Good positioning directly translates into greater reach, and being visible and recognisable online is invaluable in the face of stiff competition. Good SEO builds trust in and credibility of the brand in users' eyes and allows customer needs to be analysed. Based on SEO, tools such as Google Analytics analyse the Search Engine Results Page (SERP) and provide valuable information such as how many people visit a particular page, how long they stay, their location, the impact of keywords, etc.
Historically, the concept of SEO was first introduced in 1991, when the first page was launched on the internet. In the past, SEO was known as Search Engine Marketing (SEM) due to its application in marketing and business services [8]. Formerly, if a company wanted to be highly ranked, it had to ensure that its number of keywords was higher than its competitors'. Such practices are known today as spam and negatively affect a website's ranking.
From a business perspective, SEO is cost-effective. Not only does it provide the aforementioned vital data characterising traffic and visitors [9], [10], but it is also cheaper than outbound tactics [11]. SEO focuses on attracting users already searching for specific products or services, without significant advertising expenses. Compared with Pay Per Click (PPC) ads, SEO provides more lasting and authentic long-term results without extra spending. PPC may generate quick traffic, but it is a short-term solution with limited benefits [12].
The paper's novelty lies in its comprehensive approach to SEO optimisation for SPAs. By comparing SPAs with MPAs and emphasising the unique challenges SPAs present, the paper brings attention to a relatively unexplored area in SEO research. The experiments conducted across various technologies and the comparative analysis provide fresh insights into strategies for improving the searchability of SPA applications and their reach among potential clients.

II. STATE OF THE ART
This section provides a concise yet thorough examination of existing methodologies addressing the SEO challenges [11] specifically inherent to single-page applications, as well as the current advancements and practical approaches within this domain.

A. HOW DOES WEB INDEXING WORK?
SEO algorithms in search engines are closely guarded trade secrets and are not publicly disclosed in full detail. However, information and guidelines, e.g., those provided by Google, help us understand how these algorithms work. There are also SEO professionals who offer insights based on their experience. Broadly, search engines fall into the following categories:
• Crawler-based search engines
• Human-driven directories
• Hybrid search engines
Popular search engines like Google, Bing, and Yahoo belong to the first category. A crawler (also called a ''robot'' or ''spider'') is a program designed to automatically find and scan web pages by following URLs [13]. It searches the internet to find potential keyword matches for the search query, collecting data (images, keyword positions, backlinks, and other data) from these pages for further processing. Figure 1 depicts this search engine behaviour: the engine creates an index based on the fetched copies of web pages and ranks them using various ranking algorithms.
SEO is divided into two subgroups: on-page and off-page. On-page SEO involves optimisation techniques applied directly on the website, under the developer's control. It includes optimising various elements like the title tag, keyword usage, the meta description tag, the robots.txt file, optimised URLs, the sitemap, content organisation, headers, image optimisation, and mobile-friendliness [13].
Off-page SEO entails actions and strategies not directly controlled by the developer, including sharing on social platforms, star ratings, link building, and other promotions outside the target website.
The PageRank algorithm, devised by Google's founders Larry Page and Sergey Brin, evaluates the importance and popularity of web pages based on link structure. The more valuable and trustworthy the sites linking to a page, the higher its PageRank. The general formula for the PageRank of page p_i is:

PR(p_i) = (1 - d)/N + d · Σ_{p_j ∈ M(p_i)} PR(p_j)/L(p_j)

where d is the damping factor, N is the total number of pages, M(p_i) is the set of pages linking to p_i, and L(p_j) is the number of outbound links on page p_j. In other words, PageRank represents the probability that a person randomly clicking links will arrive at a specific page. For instance, let us assume that there are only four web pages in the world: A, B, C, and D. Initially, the probability of landing on each page is the same, 0.25. Page B links to pages A and C, page C links to page A, and page D links to all three other pages. In this scenario, page B transfers half of its existing value (0.125) to page A and the other half (0.125) to page C. Page C transfers its entire existing value (0.25) to the only page it links to, A. Since D has three outbound links, it transfers one-third of its existing value, about 0.083, to each of them, as shown in figure 2.
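The transfer rule above can be sketched in a few lines of JavaScript. This is an illustrative implementation, not Google's actual code: the function names are made up here, and the default damping factor of 0.85 is the commonly cited value, not one taken from the paper.

```javascript
// Minimal PageRank sketch. links[p] lists the pages that p links to;
// every page starts with rank 1/N, and on each iteration a page passes
// d * rank/L(p) to each page it links to, matching the formula above.
function pageRank(links, d = 0.85, iterations = 50) {
  const pages = Object.keys(links);
  const N = pages.length;
  let rank = Object.fromEntries(pages.map((p) => [p, 1 / N]));
  for (let i = 0; i < iterations; i++) {
    // (1 - d)/N is the "random jump" term from the formula
    const next = Object.fromEntries(pages.map((p) => [p, (1 - d) / N]));
    for (const p of pages) {
      const out = links[p];
      for (const q of out) next[q] += (d * rank[p]) / out.length;
    }
    rank = next;
  }
  return rank;
}

// The four-page example from the text: B -> A, C; C -> A; D -> A, B, C.
const rank = pageRank({ A: [], B: ['A', 'C'], C: ['A'], D: ['A', 'B', 'C'] });
```

Running a single iteration with d = 1 reproduces the transfers described in the text: A receives 0.125 from B, 0.25 from C, and about 0.083 from D. (Note that page A has no outbound links, so a production implementation would also need dangling-node handling, omitted here for brevity.)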
Link farms, systems designed to inflate PageRank artificially, led to a decrease in its impact on search results, though the extent of its current impact remains a trade secret [13].

B. CHARACTERISTICS OF SINGLE-PAGE AND MULTI-PAGE APPLICATIONS
Among web applications, three types are distinguished: Single Page Application (SPA), Multi Page Application (MPA), and Progressive Web Application (PWA) [14]. This paper focuses on the first two types. The primary difference between them is how they transition between web application views. For an SPA, this transition does not trigger a full page refresh, eliminating the need to load each sub-page from the server. This is because an SPA loads a single HTML file and then updates the content of this document as needed using JavaScript APIs. The whole page content is loaded simultaneously, and necessary data and content are rendered client-side, enhancing page performance and user experience. SPAs also convert more easily into applications like mobile apps or PWAs [15]. The SPA concept utilises routing, allowing links within the page to be created without reloading it.
Historically, the SPA concept was described in a 2002 patent [16] and was practically implemented in 2006 using AJAX technology, which enabled asynchronous page updates. This allowed data to be sent and received in the background without affecting the current page view. The rising importance of UX over the years led to the popularity of SPAs and the emergence of well-known frameworks like Angular, React, and Vue, with libraries like the Fetch API and Axios representing modern AJAX.
On the other hand, the traditional MPA approach is based on multiple HTML files dynamically generated by the backend. Each transition between sub-pages requires a new page request, fetching the HTML file from the server, as depicted in figure 3. This brings about a significant issue: data transmission from the server might not always be smooth, causing delays in page loading and content display.
Meanwhile, SPAs have the drawback of slower initial loading, as JavaScript needs to make network connections to load content. According to Google, the probability of a user leaving a page on a mobile device increases by 32% if the page load takes 1-3 seconds, by 90% for 1-5 seconds, and by 123% if it takes longer than ten seconds [17]. Given the current dominance of mobile devices over desktops, optimising the initial load time is crucial. Various solutions exist, such as lazy loading and the minimisation and compression of images and files. However, SPAs face a significant challenge with SEO. For a long time, Google bots ignored content delivered via JavaScript, as search engines primarily rely on HTML content for indexing and crawling web pages. Since 2014, Google's bots have been striving to understand JavaScript [13], and Googlebot itself is constantly being updated.

C. ISOMORPHIC JAVASCRIPT AND PRE-RENDERING
The first method discussed focuses on improving the initial load time of SPA pages and providing search engine bots with a more understandable, full HTML document instead of harder-to-interpret JavaScript code. Understanding this method requires an explanation of two concepts: Server-Side Rendering (SSR) and Client-Side Rendering (CSR) [18]. Server-Side Rendering means that each page request is associated with rendering the HTML file on the server. The server sends the generated HTML to the web browser, as shown in figure 4. This approach is familiar, as it is utilised by traditional Multi Page Applications (MPAs).
On the other hand, Client-Side Rendering involves rendering and updating the view through JavaScript. The browser only fetches the initial HTML file (which does not contain much content) along with JavaScript files from the server. Dynamic content changes and server requests happen on the client side, as shown in figure 5. Standard SPA apps use this approach.
In the context of single-page applications, exploring solutions that combine the advantages of both approaches is deemed valuable. Isomorphic JavaScript (also known as universal JavaScript) is a concept where the same JavaScript code can be run on both the server and the client side [19]. This is achieved by utilising SSR for the initial page load and then CSR for any subsequent changes in the interface. This process comprises the following stages [20], [21]:
• The first step is dynamically generating a static HTML file on the server side and delivering it fully rendered to the client when the user first accesses the page. At this point, the browser can fetch all meta tags and text found in the HTML and CSS files. Simultaneously, the page load time is shortened because the page is rendered on the server side before being sent to the client.
• Subsequently, after the initial page load, JavaScript takes over, retaining all the positive aspects of single-page applications. This implies that JavaScript handles further rendering and updates without server involvement. In summary, after the initial load, the application behaves like a traditional SPA.
Due to the server-side rendering of the first page load, this operation's time is reduced, and the HTML files are more understandable to bots. This behaviour positively impacts SEO [22].
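The two stages above can be sketched without any framework: the same view function produces HTML on the server for the first request and then updates the view on the client. This is a deliberately simplified illustration of the isomorphic idea, not React or Next.js code; the names render, renderPage, and window.__STATE__ are assumptions made for the example.

```javascript
// One render function, usable on both server and client (isomorphic).
function render(state) {
  return `<h1>${state.title}</h1><p>${state.posts.length} posts</p>`;
}

// Server side: produce the fully rendered document for the initial load,
// embedding the state so the client can take over ("hydrate") afterwards.
function renderPage(state) {
  return (
    `<!DOCTYPE html><html><head><title>${state.title}</title></head>` +
    `<body><div id="app">${render(state)}</div>` +
    `<script>window.__STATE__ = ${JSON.stringify(state)}</script>` +
    `</body></html>`
  );
}

// Client side: after the initial load, reuse the same render() for every
// subsequent view update, with no further HTML requests to the server.
function update(appElement, state) {
  appElement.innerHTML = render(state);
}
```

Because the bot receives the output of renderPage, all text and meta information is present in the HTML it indexes, which is precisely the SEO benefit described above.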
Another crucial concept concerning ''manipulating'' the initial page load to facilitate search engines' work is pre-rendering: pre-generating the HTML content for each page before sending it to the client's browser [23], [24]. Figure 6 shows how the cached HTML code is shown to search bots, while the ''normal'' page content containing JavaScript is displayed to real users. This means that instead of seeing an empty screen and waiting for the JavaScript code to load, the user will see the entire HTML content upon first entering the site (but without interactivity until JavaScript is loaded).
Three forms of pre-rendering exist [25]:
1) Static Site Generation (SSG) generates HTML during compilation. The pre-rendered HTML code is then used for each request. During the application build, two scenarios may occur: a) if the HTML page does not require external data, its content is rendered by default (e.g., the ''about us'' page of an IT software development company); b) conversely, when a page requires data from an external API, the data is fetched in advance as part of the static generation process and used to generate the HTML code (e.g., the five latest posts on the homepage of a blog application). All the pre-generated content is then stored in a CDN (Content Delivery Network), ensuring that a cached version is sent whenever a user requests a web page. This behaviour positively impacts performance. Generally, this method is recommended for applications with static content, like a blog's ''about me'' page.
2) Incremental Static Regeneration (ISR) is a rendering method used in Next.js applications to create or update static pages after build and deployment. In this method, the HTML file is regenerated at certain time intervals: each page is statically generated with a specified revalidation time, and after this period elapses, the page is generated again on the next request. In ISR, particular cases may lead to inconsistencies on the page.
3) Deferred Static Generation (DSG) is a rendering method available in the Gatsby framework and, similar to ISR, involves delaying the generation of some pages until users visit them.
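The regenerate-after-an-interval behaviour described for ISR can be sketched in plain JavaScript. This is a simplified, illustrative model, not Next.js code: real ISR is configured declaratively (via a revalidate interval on the page) and typically serves the stale page while regenerating in the background, whereas this sketch regenerates synchronously. All function and variable names here are assumptions.

```javascript
// Sketch of the ISR idea: serve a cached HTML page, and regenerate it
// only when it is older than `revalidateMs`. `now` is injectable so the
// behaviour can be demonstrated with a fake clock.
function makeIsrCache(generateHtml, revalidateMs, now = Date.now) {
  const cache = new Map(); // path -> { html, builtAt }
  return function serve(path) {
    const entry = cache.get(path);
    if (entry && now() - entry.builtAt < revalidateMs) {
      return entry.html; // still fresh: reuse the static page
    }
    const html = generateHtml(path); // missing or stale: regenerate
    cache.set(path, { html, builtAt: now() });
    return html;
  };
}
```

The inconsistency mentioned in the text falls out of this model naturally: two pages regenerated at different moments may embed different snapshots of the API data until both pass their revalidation time.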

D. ISR AND DSG -COMPARISON
Assume that the web application is an online store. In the ISR scenario, a situation may arise where, for instance, an ''out of stock'' message is displayed on the product page, while on the same product's main page the application claims that the product is ''in stock''. This occurs when data from the API changes between compilations [26]. On the other hand, DSG utilises data from the previous complete compilation, preventing this inconsistency, as seen in figures 7 and 8. When all the above concepts are combined, a powerful tool for SEO improvement emerges. The focus during the initial page load is on providing bots with content for indexing; subsequently, it shifts to ensuring smooth interface operation. Ho Seop Jeong's work (2022) demonstrates that the combination of isomorphic JS and pre-rendering is the most widely used approach to enhance SEO in single-page applications [24]. In his study, the author interviewed seasoned developers who were co-creators of SEO-focused SPA applications, working with various frontend technologies. Statements from the developers indicate that there are now numerous frameworks and tools implementing isomorphic JS and pre-rendering, thus eliminating the need to implement them manually.

E. HIJAX APPROACH
HiJAX (a blend of ''hijack'' and ''AJAX'') is a term proposed by Jeremy Keith, focusing on the incremental enhancement of a standard web application using Ajax [27]. To implement this approach, one should initially develop a traditional Multi-Page Application where the page is refreshed with every interaction requiring a view update. Subsequently, if the user has JavaScript enabled in their browser, the code intercepts link clicks and form submissions, handling them with Ajax [28]. In this method, the page content is injected into the layout by the server without executing JavaScript code. Hijax also enables designing a Single-Page Application with multiple sub-pages while tracking changes in both URL addresses and page blocks. In his article [28], Hatami discusses handling a page request in an application implementing Hijax. When a page is requested for the first time, it does not contain a query-string parameter children. Therefore, the server redirects the request to a new URL containing the children parameter, representing the URLs of the sub-pages. The entire process is executed recursively to load the sub-pages, and the response headers of each sub-page are appended to the header list sent to the client.
Additionally, each page's structure includes a block tree, and the URLs of all blocks are organised in a tree-based structure. When a user invokes a backward or forward action in the browser, the URLs of all blocks are compared with the URLs of the blocks specified in the DOM. If they do not match, the new URLs for the respective blocks are requested using AJAX. This process ensures that the application state is updated correctly and allows multiple sub-pages to be loaded simultaneously.
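The interception step at the heart of Hijax can be sketched as follows. This is an illustrative sketch, not code from Keith's or Hatami's work: shouldHijack and the #content container are assumed names. The decision helper is kept as a pure function so it works without a DOM; the event wiring, which only runs in a browser, is shown as a comment.

```javascript
// Decide whether a link click should be hijacked and handled with Ajax.
// Only same-origin navigation is intercepted; everything else falls back
// to the default full-page navigation, preserving the MPA behaviour.
function shouldHijack(href, origin) {
  if (!href) return false;
  try {
    const url = new URL(href, origin);
    return url.origin === origin;
  } catch {
    return false; // unparsable href: leave it to the browser
  }
}

// Browser-side wiring (runs only when JavaScript is enabled, so the
// underlying MPA keeps working without it):
//
// document.addEventListener('click', (event) => {
//   const link = event.target.closest('a');
//   if (link && shouldHijack(link.getAttribute('href'), location.origin)) {
//     event.preventDefault();
//     fetch(link.href)
//       .then((res) => res.text())
//       .then((html) => {
//         document.querySelector('#content').innerHTML = html;
//       });
//   }
// });
```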
The utilisation of Hijax in a project should be planned from the outset, but it should be introduced at the end of development. It also necessitates writing more code and sometimes duplicating implemented functionalities [29]. Currently, Hijax is not a popular approach; its concept was introduced quite some time ago, in 2006 [27], yet it is still mentioned in new articles as an approach to improve SEO in single-page applications.

F. PROGRESSIVE ENHANCEMENT WITH BROWSER FEATURE DETECTION
This method focuses on designing web pages to ensure that their content reaches users first, regardless of the browser (and its age) they use [30], [31]. Within this concept, code is first created to function in every browser, and the page is then enhanced to utilise the potential of newer browsers. Since content is the priority, styles and features specific to particular browsers or their newer versions are secondary. Adherence to global web standards is encouraged, and essential functions and standard events should be accessible in all browsers. It is essential to check browser features and handle cases where a function is incompatible. Examples of progressive enhancement implementation:
• If a browser does not have JavaScript enabled, it cannot run frameworks implementing SPAs (e.g., React, Angular). In this case, a conditional statement should be created informing the user of the necessity of enabling JavaScript.
• If a page uses custom fonts, an alternative, commonly available font should be specified in the CSS file.
Indeed, some recommendations for progressive enhancement also pertain to SEO improvement in the context of MPAs. However, it is particularly crucial for SPAs, as they rely more on JavaScript.
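The feature-checking step described above can be sketched as a small detection helper. This is an illustrative sketch only; the env parameter stands in for the browser's global object so the logic can be demonstrated outside a browser, and the function name is an assumption.

```javascript
// Progressive enhancement via feature detection: probe the environment
// and pick the best available capability instead of assuming one exists.
function pickFetcher(env) {
  if (typeof env.fetch === 'function') return 'fetch'; // modern browsers
  if (typeof env.XMLHttpRequest === 'function') return 'xhr'; // older fallback
  return 'none'; // no Ajax at all: fall back to plain links / full reloads
}

// In a browser this would be called as pickFetcher(window); when the
// result is 'none', the page keeps working as a traditional MPA.
```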

G. EACH VIEW HAS ITS URL
This concept articulates that each view should be treated as a separate URL [32]. Technically speaking, an SPA encompasses only one page (a single index.html file), yet visitors perceive it as browsing through multiple pages.
As users navigate through different sections of the same page, only the hash part of the URL changes, for instance, https://example.com/#/contact. Bots do not regard such addresses as separate URLs (listing 1).
In 2009, Google introduced a solution to this issue and described the idea of escaped fragments [28]. This concept is an agreement between the indexing bot and the website. When a bot attempts to index a page, the site indicates that it supports the indexing schema by using the symbol #! instead of just # (the hashbang method). In practice, the URL https://example.com/#/contact is transformed into https://example.com/#!/contact. The part of the address after the # symbol is termed a hash fragment, and the URL itself is referred to as a ''pretty'' URL. Subsequently, the crawler replaces the #! symbol with ?_escaped_fragment_=, creating a new URL that is sent to the server. For the earlier example, it would be: https://example.com/?_escaped_fragment_=contact.
A URL using escaped fragments is termed an ''ugly'' URL. The ''ugly'' URL is converted back to its ''pretty'' representation and invoked in a headless browser, where the page is rendered, HTML code is generated from the JavaScript code, and the result is returned to the bot for indexing. If the #! symbol cannot be used in the application, it is possible to define a meta tag in the HTML file that defines the page without a hash fragment. The implementation is shown in listing 2.
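The ''pretty''-to-''ugly'' rewrite performed by the crawler is a purely mechanical string transformation and can be expressed as a small function. This is shown for historical completeness only (the scheme was later deprecated, as discussed below in the source), and the function name is an assumption; note that a strictly mechanical rewrite URL-encodes the whole hash fragment, including its leading slash.

```javascript
// Rewrite a "pretty" hashbang URL into the "ugly" _escaped_fragment_ form
// that the crawler sends to the server.
function prettyToUgly(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2); // everything after "#!"
  const sep = base.includes('?') ? '&' : '?'; // append to an existing query if present
  return `${base}${sep}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}
```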
However, in 2015, Google announced that it no longer recommends the escaped-fragments proposal [33], stating (freely translated): ''Times have changed. Today, unless you block Google's bot from indexing JavaScript or CSS files, we can generally render and understand your web pages just like modern browsers.'' At the same time, it advises applying the principle of progressive enhancement and, within it, the History API. This API allows the URL to be changed without reloading the entire page. Currently, libraries implementing the logic of the History API for frontend frameworks exist, making its use even more convenient.
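The History API approach recommended above can be sketched as a tiny router. This is an illustrative stand-in for what SPA routing libraries build on top of history.pushState, not library code; createRouter and its parameters are assumed names, and the history object is injected so the behaviour can be demonstrated outside a browser.

```javascript
// Minimal History API router sketch: each navigation updates the address
// bar with a real path (no hash fragment) and then swaps the view
// client-side, without a full page reload.
function createRouter(historyApi, renderView) {
  return {
    navigate(path) {
      historyApi.pushState({ path }, '', path); // bots and users see a real URL
      renderView(path); // client-side view swap
    },
  };
}

// In a browser this would be: const router = createRouter(window.history, render);
// (a popstate listener would additionally re-render on back/forward actions).
```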

H. PROPER LINK CREATION
It is also essential to create links that can be parsed by Google bots [34]. In this case, the <a> tag with the href attribute should be used (listing 3), and reliance on the JavaScript onclick function should be avoided, as seen in listing 4.
Links formatted differently have no guarantee of correct parsing and, consequently, indexing. Other aspects to adhere to include:
• Links should be actual web addresses to which the Googlebot can send a request. Otherwise, they might be misunderstood (listing 4).
• Link text cannot be empty (listing 4). The text does not have to be a string only: an img element or a title attribute can be added (listing 3).
• The link text should not be generic but should describe the sub-page it redirects to; it should not be too long, and there should not be too many links placed next to each other.
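The rules above can be collected into a small validator, e.g., for a lint step in a build pipeline. This is an illustrative sketch: the plain-object shape stands in for a parsed DOM element, and isCrawlableLink is an assumed name, not a real API.

```javascript
// Check whether a link follows the crawlability rules above: it must be an
// <a> element with a real href (not "#" or a javascript: pseudo-URL) and
// must carry some link text, a title attribute, or an image.
function isCrawlableLink(link) {
  if (link.tag !== 'a') return false; // e.g. a <span> with onclick is invisible to bots
  const href = link.href || '';
  if (!href || href === '#' || href.startsWith('javascript:')) return false;
  return Boolean((link.text || '').trim() || link.title || link.hasImage);
}
```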

I. VIEWS FOR ERROR PAGES
In the case of single-page applications, the server does not handle errors correctly: it always returns status 200. Of course, this does not change the fact that errors with status 404, 500, etc., can occur (e.g., when a user makes a typo in the URL), and in such cases, appropriate redirects to a URL that triggers the proper server status code should be implemented (and a useful page for the respective error status, providing instructions on how to proceed, should be displayed to the visitor) [35]. Moreover, blocking 404 pages from indexing in the robots.txt file is a mistake. Instead, a noindex tag should be added.
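The correction described above amounts to resolving each requested path against the SPA's known routes and signalling a real status code for unknown ones. The sketch below is illustrative; resolveRoute, the route table, and the /404 view name are assumptions, not a framework API.

```javascript
// Map a requested path to a status code and view, instead of letting the
// SPA answer 200 for everything. The 404 view shown to visitors should
// offer guidance, and should carry <meta name="robots" content="noindex">
// rather than being blocked in robots.txt.
function resolveRoute(knownRoutes, path) {
  if (knownRoutes.includes(path)) {
    return { status: 200, view: path };
  }
  return { status: 404, view: '/404' }; // unknown path: real error status
}
```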

J. OTHER METHODS
The importance of titles and meta description tags is common to both SPAs and MPAs [36]. However, in an SPA they can pose a more significant challenge due to the existence of only one HTML file; at the same time, duplicated titles and descriptions are one of the most common SEO problems. The solution is to set and change the <title> element and meta tags using JavaScript code. The focus should be on views, the HTML fragments of an SPA that users perceive as separate screens. Similarly, if there is a need to add robots meta tags (name="robots") [37], this can be implemented at the programming-language level.
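Setting the title and description per view, as described above, can be sketched with a small helper called on every route change. This is illustrative only: applySeoMeta is an assumed name, and the doc parameter stands in for the browser's document so the logic can be demonstrated outside a browser.

```javascript
// Update the <title> element and the description meta tag for the current
// SPA view, creating the meta tag if the single HTML file lacks one.
function applySeoMeta(doc, { title, description }) {
  doc.title = title; // sets the <title> element
  let meta = doc.querySelector('meta[name="description"]');
  if (!meta) {
    meta = doc.createElement('meta');
    meta.setAttribute('name', 'description');
    doc.head.appendChild(meta);
  }
  meta.setAttribute('content', description);
}

// In a browser: applySeoMeta(document, { title: 'Contact | TravelBlog',
//                                        description: 'How to reach us' });
```

Calling this on each view change gives every perceived "page" its own title and description, avoiding the duplication problem mentioned above.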

K. EVALUATING THE ACCURACY OF SEARCH ENGINE ALGORITHMS
Evaluating the accuracy of search engine algorithms is a complex endeavour, particularly given the proprietary nature of the algorithms employed by major search engines like Google [38]. The evaluation process often relies on indirect methods and surrogate measures of effectiveness and relevance.
Relevance of search results is a fundamental metric. User studies, where participants rate the relevance of results for given queries, provide direct feedback on the search engine's performance [39]. Additionally, expert evaluation, involving subject-matter experts assessing the relevance of search results, can offer more nuanced insights. Precision and recall are traditional metrics used in information retrieval to assess the effectiveness of search engines. Precision measures the proportion of retrieved documents that are relevant, while recall measures the proportion of relevant documents that are retrieved. Balancing these two metrics is critical for a well-rounded search engine. Click-through rate (CTR) analysis examines which search results are selected by users, providing an indirect measure of the perceived relevance of these results [40]. However, some aspects of search engine evaluation are challenging to measure directly. For instance, A/B testing, a standard method in web development and marketing research, may not be feasible due to the lack of access to the search engine's internal mechanisms [41]. Similarly, query response time is a critical factor for user experience, yet its direct measurement is only practical with access to the search engine's infrastructure.
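The precision and recall definitions above translate directly into code. The sketch below is illustrative; the function name and document IDs are assumptions.

```javascript
// Precision = relevant retrieved / all retrieved;
// Recall    = relevant retrieved / all relevant.
function precisionRecall(retrieved, relevant) {
  const rel = new Set(relevant);
  const hits = retrieved.filter((d) => rel.has(d)).length;
  return {
    precision: retrieved.length ? hits / retrieved.length : 0,
    recall: relevant.length ? hits / relevant.length : 0,
  };
}

// Example: 4 documents retrieved, 3 truly relevant, 2 of them found.
const scores = precisionRecall(['d1', 'd2', 'd3', 'd4'], ['d2', 'd4', 'd5']);
```

Here precision is 2/4 and recall is 2/3, showing how a system can retrieve mostly irrelevant documents (low precision) while still missing relevant ones (imperfect recall), which is why the two metrics must be balanced.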
While these methods provide a comprehensive framework for assessing search engine accuracy, it is essential to recognise the limitations and challenges in measuring certain aspects, such as A/B testing, query response time, and search result ranking.These limitations necessitate a careful and nuanced approach to evaluating search engine algorithms.

III. METHODOLOGIES
Given the technical nature of the work, the focus was directed towards on-page optimization methods and techniques. The most crucial aspect introduced was the integration of Client-Side Rendering with Server-Side Rendering to ensure the application remained single-page yet provided crawlers with content and HTML files upon page load. Additionally, progressive enhancement elements were employed during implementation, and each page view was assigned a URL in line with best practices. The general SEO recommendations outlined in the previous section were also applied.
VOLUME 12, 2024

A. APPLICATION IMPLEMENTATION TOOLS
To facilitate subsequent analysis, testing, and comparison of the selected optimization methods, it was decided to implement an identical application in three different technologies. The project involved creating a blog with standard login, registration, and posting functionalities. It was assumed that the first application would be a complete Multi-Page Application, the second a full Single-Page Application, and the last one would utilize the concept of Isomorphic JavaScript.
Flask was chosen to create the MPA. Flask is a microframework written in Python [42]. It is designed for building simple, scalable, and fast web applications. Despite its minimalism (Flask does not impose many built-in features [43]), it allows extensions to handle databases, user authorization, form handling, etc., as required. The first page of the official documentation mentions that Flask relies on the Jinja template engine and the Werkzeug library for communication between server and client [44]. Well-known applications like LinkedIn and Pinterest utilize this framework. Flask version 2.2 was used in the project.
To implement the SPA using only Client-Side Rendering, React version 18.2 was used, one of the most commonly used JavaScript libraries [45]. On the StackOverflow platform, it is the most frequently appearing frontend technology in questions, as shown in figure 9, which indicates its popularity among developers. React employs the ''Virtual DOM'' technique, which minimizes direct manipulation of the real DOM and makes rendering more efficient. It was created by Facebook, and widely known commercial applications like Netflix, Facebook, Instagram, and GitHub utilize it.
Currently, many frameworks implement Server-Side Rendering and pre-rendering methods in SPAs. Hence, there is no need to implement the concepts discussed in the state-of-the-art section from scratch. After choosing React as the leading library for the SPA implementation, solutions compatible with it were sought. The most commonly encountered solutions include:
• Next.js
• Gatsby
• React Router
• Express.js
While the React Router library and Express.js require more manual implementation, the first two tools are directly dedicated and most often recommended for transforming a React application to support isomorphism [46]. In other words, both Next.js and Gatsby support Server-Side Rendering and Static Site Generation. There used to be more discrepancies between them than there are now, yet certain differences still exist:
• Data fetching: Gatsby prefers using GraphQL, while Next.js does not have a preference.
• Plugins and ecosystem: Gatsby offers more community-created plugins and facilitates template creation. With Next.js, more programming is involved, which provides greater control over code details but is more labour- and time-intensive.
• Problem-solving: in the case of Gatsby, moving away from the rich plugin ecosystem to apply a non-standard approach can be more challenging. In this regard, Next.js is more accessible.
• Next.js supports ISR (Incremental Static Regeneration), while Gatsby implements DSG (Deferred Static Generation). A discussion of both approaches can be found in the previous section.
In summary, when the latest data is most crucial for the application, it is better to use Next.js. If stability and consistency are more important, then Gatsby should be used [47].
Eventually, Next.js technology was chosen since it was decided not to use GraphQL, and it was deemed that data currency (e.g., displaying the top 5 latest posts) is a priority in a blog application.
Next.js version 13 was used in the project, released in October 2022 and stabilized in May 2023. It introduces several changes. For instance, routing is determined by the file structure instead of being programmatic, extended support for server-side executed and rendered React components was added, and asynchronous components were implemented. An update was also made to the Image component, which now ships less client-side JavaScript [48].
The Firebase platform was chosen for handling the backend logic and database, enabling quick and easy support for this project layer [49]. It comes with the Firestore database, which stores data in collections, each of which can contain many documents. Every document is stored as a set of key-value pairs, similar to traditional databases, although Firestore is a NoSQL database. Moreover, Firebase offers user authorization mechanisms, facilitating the smooth implementation of login, registration, and account management functionalities. Firebase Storage also proved helpful for cloud file storage in the project [50].
The project code was stored and updated in a remote GitHub repository. This tool was handy for storing the version history and branching the application. Therefore, during the analysis of the conducted optimizations, switching between versions for comparison purposes was possible, including reverting to application editions from before the changes.
Tools for analyzing websites in terms of SEO require providing their URLs. This means that each of the implemented applications had to be hosted on the Internet. For this purpose, the cloud platform (PaaS - Platform as a Service) Heroku was chosen, which provides an intuitive interface and enables quick application deployment. It also supports many different technologies, including Python and Node.js, and allows for convenient project deployment using git and a project repository placed on GitHub [51].

B. RESEARCH TOOLS
There are numerous SEO research tools focusing on different aspects. However, most of them are paid and offer only a few days of free trial [52]. Therefore, while choosing the platform for research, both the richness of the measurements returned by the tool and the financial aspect were considered. Eventually, Seobility and Google PageSpeed Insights were utilized. Seobility is a web platform used for analyzing and optimizing websites in terms of search engine ranking. The main functions and capabilities offered by Seobility include:
• basic performance information: response time, file size
• meta information analysis: title, meta tags (description, viewport, lang, etc.), search engine accessibility, canonical links, presence of a favicon
• page quality: content analysis, mobile device optimization, use of alternative text with images, secure HTTPS connection usage
• page structure: headers, internal and external links
• server configuration: data compression, performance
• popularity on social media and other sites (Facebook, webwiki)
• robots.txt file check
• key keyword analysis [12], [53]
Very usefully, Seobility returns a task list, upon completion of which the website should be better positioned. This tool is free for three queries per day, and upon creating an account with a free plan, this number increases to five queries per day [52].
Google PageSpeed Insights (hereafter referred to simply as PageSpeed) is a free tool developed by Google which analyzes and evaluates website performance. It separates results for the mobile and desktop versions of an application. Besides the performance and SEO sections, it also conducts measurements concerning accessibility and verified methods. It summarizes results for the key performance elements:
• First Contentful Paint - time to render the first text or image
• Total Blocking Time - the sum of all periods between the first contentful paint and time to full interactivity in which task length exceeded 50 ms (result provided in ms)
• Speed Index - indicates how quickly the page populates with visible content
• Largest Contentful Paint - time to render the largest text or image
• Cumulative Layout Shift - a measure of the movement of elements in the visible area; quantifies how often users experience unexpected layout shifts [52].

IV. EXPERIMENTS
A. APPLICATION DESCRIPTION
This section outlines the creation of a travel blog web application. The application allows unregistered users to view content and to log in or register, while registered users can add and review their posts. The focus is on travel-related content, sharing experiences and travel tips. Features like hashtags and image uploads are incorporated to enhance the user experience. The design is responsive to various device screen sizes, positively impacting SEO and user engagement. The views were crafted based on a template from wix.com, with interface readability and aesthetic appeal being the primary considerations.
The core assumption was implementing three identical pages differing only in the supporting technologies for the different application styles (MPA, SPA). This uniformity aids a more accurate comparison and analysis of technological differences, minimising evaluation errors, especially in the SEO context.

B. IMPLEMENTATION OF SELECTED SEO OPTIMISATIONS
1) ISOMORPHIC JAVASCRIPT
Next.js is utilised for its isomorphic JavaScript approach, initially pre-rendering pages using Static Site Generation or Server-Side Rendering, then transitioning to Client-Side Rendering for interactive elements. Version 13 introduces server components, ideal for handling non-reactive components and enabling better utilisation of the server infrastructure. The functionality of fetching all blog posts is demonstrated using a server component in Next.js and the useEffect hook in React, with the server-side implementation proving more efficient.
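The contrast can be sketched in plain JavaScript (a simplified model, not the project's actual code; `fetchPosts`, `renderServerHtml`, and `renderClientShell` are hypothetical names). A server-rendered page delivers HTML that already contains the post data, whereas a purely client-rendered page ships an empty shell that is filled only after JavaScript runs:

```javascript
// Hypothetical stand-in for the data source (e.g. a Firestore query).
async function fetchPosts() {
  return [
    { title: "Hiking in the Alps" },
    { title: "A Weekend in Lisbon" },
  ];
}

// Server-side approach: the HTML delivered on first load already
// contains the post data, so crawlers need no JS execution to see it.
async function renderServerHtml() {
  const posts = await fetchPosts();
  const items = posts.map((p) => `<li>${p.title}</li>`).join("");
  return `<ul id="posts">${items}</ul>`;
}

// Pure Client-Side Rendering: the initial HTML is an empty shell that
// is only filled in after the browser downloads and runs the bundle.
function renderClientShell() {
  return `<ul id="posts"></ul><script src="/bundle.js"></script>`;
}
```

In the Next.js version, the server-side variant corresponds to an async server component fetching data during rendering; in plain React, the same fetch runs in a useEffect hook after the initial, empty render.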
2) FILE ROBOTS.TXT
The robots.txt file prevents crawling of the login, registration, add-post, and user post display pages, conserving crawling resources and directing bots to more relevant pages.
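A minimal robots.txt along these lines might look as follows (the paths are illustrative; the project's actual routes may differ):

```
User-agent: *
Disallow: /login
Disallow: /register
Disallow: /add-post
Disallow: /my-posts
```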

3) IMAGE OPTIMIZATION
Image optimisation strategies, such as transitioning to the WebP format, specifying image dimensions, and implementing a responsive image strategy, are employed to improve application performance and page load times. In Next.js, the Image component is used for managing responsive images, while in React and Flask, the srcset and sizes attributes are utilised. The initial image on the main page is loaded with high priority to positively impact the Largest Contentful Paint (LCP) metric [54].
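In the React and Flask versions, a responsive image along these lines can be declared directly in the markup (file names, widths, and breakpoints are illustrative):

```html
<img
  src="/images/beach-800.webp"
  srcset="/images/beach-400.webp 400w,
          /images/beach-800.webp 800w,
          /images/beach-1600.webp 1600w"
  sizes="(max-width: 600px) 100vw, 50vw"
  width="800" height="533"
  alt="Sunset over the beach"
  fetchpriority="high">
```

Specifying width and height lets the browser reserve layout space before the image loads (helping CLS), while fetchpriority="high" on the hero image favours the LCP metric.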

4) LAZY LOADING OF COMPONENTS
Code splitting is a technique aimed at minimising the page load time. Sending less JavaScript code during the initialisation of an application positively impacts interactivity and the First Input Delay (FID) and Interaction to Next Paint (INP) metrics. The problem of delivering a large JavaScript payload is particularly severe for users with weaker devices and slower network connections.
Client-side rendered components consist of JavaScript code. Accordingly, both React and Next.js introduced mechanisms to address this issue. These mechanisms ensure a component is loaded (and its JavaScript code executed) only when necessary. React achieves this through the 'lazy()' function, which provides a built-in way of splitting the application components into separate JavaScript chunks and is relatively easy to use. However, there will always be a slight delay while an element with the code-splitting mechanism is fetched. React solves this problem with the 'Suspense' component, which displays a fallback component during loading, for example, a spinning loader indicating to the user that some content is being loaded. In the case of the blog, this is a custom 'Loading' component. In Next.js, lazy loading can be implemented similarly, but there is also a second method using 'next/dynamic', a combination of 'lazy()' and 'Suspense'. The second method was utilised while writing the Next.js application.
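The behaviour these mechanisms rely on can be modelled in a few lines of plain JavaScript (a simplified sketch, not the React API; `lazyLoad` and the load counter are illustrative):

```javascript
// Simplified model of code splitting: the module loader runs only
// when the component is first requested, and only once thereafter.
function lazyLoad(loader) {
  let cached = null;
  return async function getComponent() {
    if (!cached) cached = loader(); // kick off the dynamic import once
    return cached;                  // later calls reuse the same promise
  };
}

// Example: the loader counts invocations so lazy behaviour is observable.
let loads = 0;
const getChart = lazyLoad(async () => {
  loads += 1;
  return { name: "Chart" }; // stands in for a dynamically imported component
});
```

Until `getChart()` is first called, no loading happens at all; calling it twice still triggers the loader only once, which is the cost-saving property that 'lazy()' and 'next/dynamic' provide.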

5) ELIMINATION OF RENDER-BLOCKING RESOURCES
Render-blocking resources such as styles and JS files can be found in both SPAs (Single-Page Applications) and MPAs (Multi-Page Applications), impacting performance and SEO. Examples include additional fonts from Google, the main styles, or the Bootstrap library (in the case of Flask, included via CDN in the main HTML file). On the other hand, these are also critical files necessary for styling the initial content of the website. The solution is to preload these files.
In Flask, a script with the logic for dynamically adding and removing hashtags is also added, which could block rendering, so the 'defer' attribute is used. This makes the JS file wait for all DOM elements to load before it starts operating, preventing situations where JavaScript tries to operate on elements that have not been loaded yet. There is also the option to use the 'async' attribute to prevent render blocking, but it is unsuitable in this case because, unlike 'defer', it does not guarantee the execution order (the script could be executed before the DOM elements are loaded).
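Put together, the preload and defer techniques might look like this in the page template (file names are illustrative):

```html
<head>
  <!-- Preload critical styles so fetching them does not block first paint -->
  <link rel="preload" href="/static/main.css" as="style">
  <link rel="stylesheet" href="/static/main.css">
  <!-- defer: download in parallel, execute only after the DOM is parsed,
       preserving script order (unlike async, which runs as soon as ready) -->
  <script src="/static/hashtags.js" defer></script>
</head>
```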

6) SECURE REDIRECTS
During the deployment of the application, the HTTPS protocol was ensured for data transmission. However, there is a scenario in which the HTTP protocol might be hardcoded in the site address. Such a request should be blocked, and the user should be automatically redirected to HTTPS for security reasons. In Flask, the 'flask-talisman' library was used for the default security header settings. In React, the 'react-https-redirect' library was used, and in Next.js, the 'next/server' module was utilised, extracting the 'NextRequest' and 'NextResponse' objects. With their help, a middleware function was implemented that checks whether the request already uses HTTPS and whether the application is running locally. If the request is not HTTPS, the application is not running locally, and the current environment is production, a redirect to the HTTPS protocol is performed.
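The middleware's decision can be reduced to a pure function (a sketch, not the actual 'next/server' code; the header and environment names are assumptions based on common deployments behind a proxy such as Heroku):

```javascript
// Decide whether a request should be redirected to HTTPS.
// `proto` typically comes from the x-forwarded-proto header set by the
// hosting platform, `host` from the Host header, `nodeEnv` from NODE_ENV.
function shouldRedirectToHttps(proto, host, nodeEnv) {
  const isLocal = host.startsWith("localhost") || host.startsWith("127.0.0.1");
  return nodeEnv === "production" && !isLocal && proto !== "https";
}
```

In the real middleware, a true result would be answered with a 301/308 redirect to the same URL under the https:// scheme.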

7) LINKS COMPLIANT WITH BEST PRACTICES
Internal links on the site are designed with a proper structure: 'domain/parent directory/subdirectory'. URLs are short, simple, and descriptive, and do not contain special characters, diacritical marks, or punctuation. Hyphens separate words, only lowercase letters are used, and so-called ''stop words'' such as ''the'', ''of'', ''with'', and ''for'' are not present.
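These rules can be captured in a small helper function (an illustrative sketch; `slugify` and the abbreviated stop-word list are not part of the project code):

```javascript
// Turn an arbitrary title into an SEO-friendly URL segment:
// lowercase, hyphen-separated, no diacritics, punctuation, or stop words.
const STOP_WORDS = new Set(["the", "of", "with", "for", "a", "an", "and"]);

function slugify(title) {
  return title
    .toLowerCase()
    .normalize("NFD")                  // split letters from their diacritics
    .replace(/[\u0300-\u036f]/g, "")   // drop the combining diacritic marks
    .replace(/[^a-z0-9\s-]/g, "")      // drop punctuation and symbols
    .split(/\s+/)
    .filter((w) => w && !STOP_WORDS.has(w))
    .join("-");
}
```

For example, slugify("A Weekend with the Fjords of Norway!") yields "weekend-fjords-norway".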
One external link leading to the project repository on GitHub was also placed in the footer.In a commercial application, this is an appropriate place to put external links to social media to increase the site's reach.

8) COMPRESSION
Compression is a simple and effective way to save bandwidth and speed up the site. It makes the files transferred from the server to the browser smaller, so the site loads faster. A common form of compression for text files such as HTML, CSS, or JavaScript is gzip. In Flask, the 'flask-compress' library was used to implement this method; in React, 'react-compress'; and in Next.js, gzip compression is enabled by default.

9) MOBILE-FRIENDLY AND INTUITIVE APPEARANCE
This method is reflected in the site's responsiveness and a neat user interface template with a readable font. Mobile optimisation is further enhanced by adding a link tag rel="icon", defining an icon displayed when users add the site to their mobile device's home screen. A dedicated link tag rel="apple-touch-icon" is used for Apple devices.
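In HTML, these icon declarations might look as follows (file names are illustrative):

```html
<link rel="icon" href="/favicon.ico" sizes="any">
<link rel="icon" href="/icon.svg" type="image/svg+xml">
<link rel="apple-touch-icon" href="/apple-touch-icon.png">
```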

10) META INFORMATION
Adding meta information like title and description comes more naturally in MPA applications, where each page is a separate HTML file. However, in Flask, a template incorporating the meta tags was added and then customised to standardise the page layout.
Besides defining the appropriate meta tags, a language tag lang="en" (as the application content is in English) and alt attributes describing image content were included. Furthermore, a favicon was added: a small symbol helping users recognise the website, usually a simplified version of the logo or a graphic symbol, displayed next to the site title in the browser tab. It benefits users with multiple tabs open, facilitating orientation among currently open sites. The application was also enriched with concise descriptions containing keywords.
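A minimal head section combining these elements might look as follows (the title and description text are illustrative, not the application's actual content):

```html
<html lang="en">
<head>
  <title>TravelBlog - Stories and Tips from the Road</title>
  <meta name="description" content="A travel blog with posts, photos, hashtags, and practical travel tips.">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="icon" href="/favicon.ico">
</head>
```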

C. SEO TOOLS
The first part of the research was conducted using the Seobility tool. The implemented applications in each technology (Flask, React, Next.js) were analysed before and after the optimisation methods were applied. Table 1 shows the optimisation methods used. The X sign indicates that the action did not have to be introduced (i.e., it already existed in the application version before the changes, for example, because the default generated project already included these elements).
The Seobility report included an overall SEO score, a review of individual SEO areas, and a detailed description of the analysis. Figure 10 shows the overall SEO score achieved by the individual technologies before and after implementing the optimisation methods.

D. SEO OPTIMIZATION RESULTS
The presented chart indicates that the React application did not improve its score despite the introduced optimisation methods. Why is that, even though the methods were applied? The Seobility tool does not see changes introduced by JavaScript code, and in the case of a page using only Client-Side rendering, all changes are introduced through code. An example is the fact that the tool does not see any words, files, or links on the page at all, as seen in Figure 11. In practice, this does not mean that, for example, Google's crawler would not see a difference between the versions of the React application - it depends on whether it could analyse the JS code.
Another noticeable conclusion is that the application written in Next.js, even in the version without optimisation methods, achieves a better score than the React application. This shows that using a static site generation approach for the first load positively impacts SEO, as, at this implementation stage, the applications did not otherwise differ.
Another phenomenon, surprising at first glance, is the lowest pre-optimisation score for the application written in Flask - it is even lower than for React. This result may seem incomprehensible, as, by definition, an MPA page should achieve a higher SEO score. The explanation lies in the application implementation process. The project written in Flask was not generated like the other two. What does this mean in practice? In Flask, the entire structure, all files, and directories were created from scratch - and at this stage, mistakes were made. For instance, the CSS files were initially placed incorrectly, so the application loaded all of them simultaneously, wasting resources.
Similarly, the Flask page did not initially have a default favicon or language tag. Projects written in React and Next.js had a generated structure from the beginning, and therefore some elements were automatically implemented according to best practices. However, as the above results also show, the applications written in Flask and Next.js achieved similar results after the changes. On the one hand, this means that certain gaps resulting from the manual implementation of some fragments in Flask were ''patched''. On the other hand, an application visually behaving like a pure SPA (in Next.js) achieved the same SEO score as an MPA application.
Despite the introduction of the optimisation methods, the applications achieved an SEO score of 70%, while the aim is, of course, a score of 100%. The result could not be increased further because Seobility also takes into account factors that go beyond technical SEO improvement methods. These include, for instance, website promotion on social media, the presence of backlinks, or hosting the website on a top-level domain instead of a Heroku subdomain.
Table 2 presents the results achieved by the Flask, React, and Next.js applications in individual SEO areas before and after optimisation. The most considerable increase in the meta information area was achieved by the application written in Flask (the reason being that many elements automatically generated in the other applications were not taken into account in the initial stages of its implementation). However, each of the applications finally achieved a very high score in this sector, amounting to 98%. On the other hand, page quality (i.e., content analysis, optimisation for mobile devices, use of a secure HTTPS connection, etc.) increased the most in the case of Next.js and remained the highest of all in the end. The areas of page structure and link structure for both Flask and Next.js achieved similar initial and final values. The lack of change in React's results is due to the reason discussed earlier in section IV-D.
It was also decided to compare the results for keywords. Keywords are terms that users enter into a search engine to find information or products related to a given topic. Of course, there can be many potential keywords; this work focused on two simple and intuitive ones: the word travel and the phrase travel blog. Figure 12 compares the results before and after optimisation for each of the implemented technologies. It shows that the optimisation methods improved the result for every application. For Flask and Next.js, the results are identical (the content of headers, titles, descriptions, etc., on each page is the same). Also, for both technologies, the word travel scores higher than the phrase. In the case of React, both the phrase and the word remain at the same level.
The second part of the research was carried out using the PageSpeed tool. Here, too, the versions of the applications were examined before and after applying the optimisation methods described in Table 1. In this case, however, the SEO score for each of the technologies, both before and after the changes, was 100%. The reason is the lower accuracy of PageSpeed in measuring these SEO elements. In other words, PageSpeed takes into account a much smaller number of factors, and those it does consider are measured with less precision. For example, this tool checks whether the page contains a <title> tag but does not analyse its content. Table 3 presents all the factors assessed by PageSpeed and indicates whether the Seobility tool also took them into account (if so, almost always with greater insight). On the other hand, the application written in React also achieved a high SEO score, which means that this tool can examine content generated by JavaScript code. The reason may be that PageSpeed is a tool from Google, and the company declares that it is getting better and better at analysing JavaScript.
In the ''Verified methods'' section, the situation looked similar: in each variant, a score of 100% was achieved. All the points taken into account in this section had been implemented intuitively due to developer experience. These are not points indicated as key for SEO, but rather generally good practices in website design. Examples include checking whether the page uses HTTPS or whether it avoids deprecated APIs.
In the ''Accessibility facilitation'' section, a score of 97% was achieved in each case. The reason for not receiving the maximum result was a hint that the background and foreground colours have an insufficient contrast ratio. In the author's opinion, this is an over-the-top comment, as the UI of the page looks good, so it was decided not to address it. Other points referred to elements that affect SEO but were already correctly implemented, e.g., the notes that ''links have distinctive names'' or ''header elements appear in descending order''. Again, these are components used by developers by default, not only in the context of SEO but also under the general rules for creating applications.
*The rel=canonical attribute was not considered because the implemented blog is a relatively small application that does not have many versions of the same content. If the application were to grow and, for example, filtering and sorting were added, changing the URLs of similar pages, then it would be necessary to consider placing this attribute. The same applies if the hosted page had URL versions with and without www [55].

E. PERFORMANCE ANALYSIS
The ''Performance'' section was the subject of the main study, which was based on obtaining information about the optimisation level of each application. As previously described, performance affects SEO, and PageSpeed provides data for parameters such as FCP, LCP, and CLS. The score itself can change due to variations in internet traffic routing. Therefore, measurements were taken at different times of the day, and the results were averaged. As input data for the analysis, applications that already had the methods presented in Table 1 implemented, but not yet those listed in Table 4, were considered. Table 4 also lists which of the mentioned actions had a direct impact on which of the investigated parameters, where flagged by the tool.
Figure 13 shows the increase in the performance score before and after optimisation for each technology on mobile devices. The applications written in Flask and Next.js received similar scores after implementing the optimisations (Next.js slightly higher), with the page implemented in Next.js showing a more considerable increase in performance. The lowest final result was for the application written in React, but it also received the lowest initial score; its relative improvement is higher than in the case of Flask. The difference to React's disadvantage in the final score may result from the fact that this framework loads many JS files; it might be possible to implement delayed loading for some of them.
On the other hand, Figure 14 shows the performance increase for desktop devices. Here, React and Next.js achieve similar scores, with Flask slightly lower. The most significant increase in performance was observed for the React application, while for Flask and Next.js the increase is at a similar level.
The general conclusion for both types of devices is that the average performance on a desktop device is higher than on a mobile device for all technologies. At the same time, the mentioned optimisation methods positively impacted performance in each variant.
More detailed results considering the optimisation parameters are included in Table 5 for mobile devices and Table 6 for desktop devices. For mobile devices, the best First Contentful Paint (FCP), Total Blocking Time (TBT), and Speed Index (SI), both before and after optimisation, were achieved by the Flask application. On the other hand, the Largest Contentful Paint (LCP) takes the shortest time for the application written in Next.js. This application variant also maintained the Cumulative Layout Shift (CLS) indicator at level 0, although its FCP results are the worst. However, compared to React, its TBT, although initially worse, performs better after applying the optimisation methods. For desktop devices, the React application achieves the leading FCP result, which is generally very good for all technologies, as it reaches a value less than or equal to 1 second. The TBT indicator was already zero for each variant before optimisation. The final SI is again the best for Flask, and the LCP for Next.js. The applications in React and Next.js achieve a CLS of zero.
In summary, both selected tools allowed for a thorough examination of various factors affecting SEO. The success of the presented studies is the improvement of the examined areas and parameters after implementing the optimisation methods.

V. CONCLUSION
Applying search engine optimisation methodologies to web applications is a significant area of research in contemporary computer science. In the era of rapid internet growth, nearly every company desiring to capture customer interest with its products or services should establish a presence online. Amidst substantial competition, search engines are a primary source of website traffic, directly impacting a business's online visibility. Consequently, focusing on SEO is a crucial objective during the design phase of web development. Ultimately, even the most elegantly constructed and well-implemented web application is futile without an audience.
With the evolution of web technologies at the onset of the 21st century, single-page web applications emerged, streamlining interfaces and emphasising positive user experiences. This development presents a new challenge for the broad professional group responsible for planning, implementing, and promoting websites. The proliferation of SPAs also opened up a new area for SEO optimisation exploration. While devising methodologies, various factors must be considered, including the evolution of search engine bots over time. On the one hand, this is positive, as bots increasingly understand JavaScript code. On the other hand, previously implemented solutions may require further development to maintain the same results. Additionally, despite popular search engines providing guidelines on SEO-friendly web design, the exact algorithms utilised by their bots remain undisclosed.
This study examined contemporary optimisation methods that enhance SEO in SPAs, emphasising its critical role in web application creation and providing a general overview of crawler operations. Various solutions were discussed in historical order, as optimisation approaches have evolved over time. The core of the research lies in demonstrating that adopting an innovative approach to Client-Side rendering for the initial page load, combined with traditional SEO practices, performance enhancements, and technology-specific methodologies, can significantly improve SEO in SPAs to a level comparable with MPAs.
In summary, the main research objective of the paper is to comprehensively review and assess various methods for enhancing the SEO of SPAs, contrast these methods with those used in MPAs, implement and evaluate them in real-world scenarios using contemporary web technologies, and provide empirical evidence and actionable insights that can help web developers optimise SPAs for better visibility and performance in search engine results, thereby enhancing user engagement and satisfaction. The paper bridges theoretical concepts with practical application and empirical analysis, contributing to the field of web application development with a focus on SEO in the context of SPAs.
Selected strategies were then implemented in both MPAs and SPAs using modern frameworks, facilitating a comparative effectiveness analysis. It was demonstrated that the application of the refined methods was positively reflected in the results; in other words, these methods work. A key SPA optimisation approach is adopting a strategy other than Client-Side Rendering for the initial page load. HTML code, easily interpreted by bots, is generated immediately upon navigation to the page, while view changes and link handling function as in a traditional SPA. This approach allows users to continue enjoying the positive experiences of standard SPAs while the underlying rendering logic becomes more accessible to search engine bots. Combining this idea with traditional SEO enhancement methods, performance optimisation, and solutions tailored to specific technologies can achieve SEO results equivalent to MPAs. The conducted research and its results substantiate this hypothesis, although SPA cases often necessitate project structure overhauls and deeper architectural analysis concerning rendering and data retrieval on the client or server side. Numerous systems are available to help detect SEO issues on websites, and although most are paid, they are a worthy investment from a business perspective. Utilising various tools with different accuracies and focus areas provides diverse insights.
The current application versions across the various technologies form a solid foundation for further refinement. For instance, performance-wise, evaluating effective caching policies, examining the JavaScript code for loading delays, and considering the addition of a sitemap as the applications expand could be beneficial. A range of actions could also support commercial-level SEO, such as purchasing a domain, promoting the site online, and monitoring clicks, indexed page counts, and other statistics through tools like Google Search Console.
This paper serves as a comprehensive guide for individuals wishing to delve into SEO topics, recognise influencing factors, understand why SPAs pose a more significant challenge in this realm, and learn how to navigate it. The work could be expanded to include implementation examples in other commonly used SPA technologies, such as Angular and Angular Universal or Vue and Nuxt.js. It would also be intriguing to conduct studies among users without IT experience to gather their impressions of web applications across different technologies, satisfaction with page rendering times, etc. The web application landscape continues to evolve, necessitating an open attitude towards these changes. This work aimed to demonstrate that among the myriad considerations in IT projects, SEO warrants attention. This intriguing subject merges IT, PR, marketing, and UI/UX factors. A thoughtful SEO optimisation implementation for SPAs significantly increases the likelihood that initial efforts will yield business benefits and, above all, user satisfaction.

FIGURE 3. View update in SPA and MPA - a comparison. (Author's own work.)

LISTING 1. Former method for changing the URL without reloading the page.
LISTING 2. Defining escaped fragments using a meta tag.

LISTING 3. Correctly defined links for Googlebot.
LISTING 4. Incorrectly defined links for Googlebot.

FIGURE 9. Number of questions on Stack Overflow concerning a given frontend technology. Source: Stack Overflow metrics.

FIGURE 13. Performance score for the mobile application - analysis with the PageSpeed tool.

TABLE 2. Results for SEO areas - analysis with the Seobility tool.

TABLE 3. Optimisation elements examined by PageSpeed - comparison with the Seobility tool.

TABLE 6. Performance analysis results for desktop devices - comparison of results before and after optimisation.