Having recently worked on an Optimizely project, I gained valuable insight into the capabilities and best practices of the Optimizely DXP environment, and the experience reinforced my understanding of its scalability, security, and performance optimizations. The project was a migration to Optimizely that involved moving five complex websites, originally built on Sitecore and WordPress, onto the new platform. It required careful planning, content restructuring, and performance optimization to ensure a seamless transition, and to deliver it I collaborated with an incredible team of 25+ professionals, each contributing expertise across development, architecture, quality assurance, and deployment.
Optimizely Digital Experience Platform (DXP) is a powerful cloud-based solution designed to help businesses deliver exceptional digital experiences. It provides a scalable, secure, and high-performance environment for managing and optimizing content. This blog will explore the key components, benefits, and best practices for working with Optimizely DXP.
In this comprehensive guide, we’ll walk you through the step-by-step process of deploying a Storybook project on Vercel using an Azure DevOps pipeline.
Before diving into the repository, it’s important to note that we received a working Storybook project built with the Lit Component Library from another vendor. The project, which uses Node.js, runs locally with the following commands:
npm install
npm run build // This command builds the files and stores them in the following folder: C:\Opt\StorybookAzureDevOps\apps\storybook\.storybook
npm run deploy
Although hosting on Vercel is straightforward with GitHub, GitLab, or Bitbucket, the configuration for Azure DevOps involves a few steps that can be time-consuming to work out. We’ll guide you through each step of deploying an application to Vercel using Azure DevOps Pipelines and discuss the challenges you might encounter.
Requirements
Storybook project created with the Lit Component Library.
Repository hosted on Azure DevOps.
Deployment target: Vercel.
Solution
Vercel supports easy integration with GitHub, GitLab, and Bitbucket, but not directly with Azure DevOps. Fortunately, Vercel offers a useful “Vercel Deployment Extension” that simplifies the process.
Step 1: Log In to Vercel
Open your command prompt as an administrator and log in to Vercel:
vercel login
Follow the Vercel login verification process to authenticate.
Step 2: Create a New Vercel Project
Use the following command to create a new project on Vercel:
vercel project add
Step 3: Retrieve IDs from Vercel
User ID: Navigate to your Vercel account settings to find your User ID (for a personal account, this is the value the pipeline uses as the org ID).
Project ID: Copy the Project ID from your newly created project on Vercel.
Step 4: Generate and Copy Token from Vercel
Generate a token in Vercel and copy it for later use.
Step 5: Install Vercel Deployment Extension on Azure DevOps
Install the Vercel Deployment Extension from the Azure Marketplace. This requires administrative access; if you don’t have it, clicking “Get Free” during installation lets you submit a request with a reason, which notifies your Azure DevOps administrator. You cannot proceed without this extension installed on the Azure DevOps organization from which you are deploying to Vercel.
Step 6: Create and Configure an Azure DevOps Pipeline
Create a new pipeline in Azure DevOps and select Node.js for the pipeline configuration. Replace the generated pipeline code with the following configuration:
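The task name and inputs below come from the Vercel Deployment Extension, so treat this as a minimal sketch rather than a definitive pipeline: verify the task version and input names against the extension’s marketplace page, and prefer storing the token as a secret pipeline variable instead of hard-coding it.

```yaml
trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Install a recent Node.js for the Storybook project.
  - task: NodeTool@0
    inputs:
      versionSpec: '18.x'
    displayName: 'Install Node.js'

  # Deploy using the task provided by the Vercel Deployment Extension.
  - task: vercel-deployment-task@1
    inputs:
      vercelProjectId: 'your_vercel_project_id'
      vercelOrgId: 'your_vercel_org_id'
      vercelToken: 'your_vercel_token'
      production: true
    displayName: 'Deploy to Vercel'
```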
Replace your_vercel_project_id, your_vercel_org_id, and your_vercel_token with the values you retrieved earlier.
Note: If the extension is not installed, the pipeline will fail with an error, because Azure DevOps cannot find the command that the Vercel Deployment Extension provides.
Step 7: Run the Pipeline
Run the pipeline to deploy your project to Vercel. If you encounter errors, ensure your configurations on Vercel are correct.
Configurations for Storybook on Vercel
Set the proper configurations in Vercel. In our case, the output directory should be “/.storybook” and the root directory should be “apps/storybook”. You’ll need to set these according to your own Storybook configuration.
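The Root Directory itself is a project setting configured in the Vercel dashboard, but if you want the output directory under version control as well, a minimal vercel.json inside the project’s root directory can pin it. This is a sketch based on our paths; adjust it to your own setup:

```json
{
  "outputDirectory": ".storybook"
}
```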
Step 8: Set Environment Variables
Configure the environment variables on your Vercel project so that the cache is refreshed upon each deployment.
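The exact variables depend on your project, but as one assumption-labeled example, Vercel documents a variable that skips the build cache entirely, which is a blunt way to guarantee fresh output on every deployment:

```
VERCEL_FORCE_NO_BUILD_CACHE=1
```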
Step 9: Redeploy the Pipeline
Once the configurations are complete, rerun the pipeline on Azure DevOps. This time, the deployment should be successful, and you should see the results.
In the realm of enterprise applications, the firewall stands as a pivotal guardian against potential threats. This blog delves into a specific scenario where Sitecore support facilitated the implementation of a Web Application Firewall (WAF) on Microsoft Azure for a Sitecore managed website. With the WAF in place, access to the App Service was restricted solely to the firewall’s IP, resulting in a 403 Forbidden error. We explore the intricacies of pinpointing the exact rule within the Azure firewall that caused this restriction.
Simple Web Architecture:
Note: This diagram was created only to explain the basic scenario for this blog post.
In the ever-evolving landscape of web development, ensuring the security of your web applications is paramount. This holds especially true for platforms like Sitecore, a powerful content management system used by many organizations worldwide. To protect your Sitecore web applications from potential threats and vulnerabilities, it’s essential to implement robust security standards. In this blog, we will delve into the key aspects of security standards for Sitecore web applications and explore the best practices to fortify your digital assets.
Security plays an indispensable role in safeguarding any .NET application, keeping your servers and applications safe and their integrity intact. Regularly running security tools and site scans is essential to confirm that best practices are consistently upheld across your website and environments.
Eloqua is one of the leading email marketing platforms. Below are the Pros and Cons of Eloqua.
Pros: API Integrations – Eloqua’s REST API is incredibly straightforward to set up and customize with any system. As I stated in an earlier blog post, forms should be the main approach for the integration. There are two types of form integration with Eloqua: one for Sitecore Forms and another for system (fixed) forms such as registration, login, and newsletter subscription. In our case, we have a Sitecore submit action called “Submit to Eloqua”. We build a form in Sitecore Forms, build the exact same form in Eloqua, and connect the two through the submit action, passing the correct Eloqua form ID and the other fields in the necessary JSON format (a sketch of that payload follows below). System forms, of course, need developers to set up a one-time configuration.
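To make the JSON format concrete, here is a hedged sketch of the kind of payload Eloqua’s REST API accepts for form data. The endpoint version, form ID (1234), and field IDs are hypothetical placeholders; confirm them against your own Eloqua instance:

```
POST /api/REST/2.0/data/form/1234

{
  "type": "FormData",
  "fieldValues": [
    { "id": "10001", "value": "jane.doe@example.com" },
    { "id": "10002", "value": "Jane" },
    { "id": "10003", "value": "Doe" }
  ]
}
```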
Bulk API Integration – Earlier, I created a utility to transfer users in bulk from a SQL database to Eloqua, and the Bulk API integration proved really easy to understand and implement. It is used to send large amounts of data to Eloqua. I also created a scheduler that runs every day at midnight and sends the day’s new users to Eloqua, as outlined below.
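As a rough, assumption-labeled outline of how such a utility can drive the Eloqua Bulk API 2.0, the TypeScript below follows the documented three-step flow: create an import definition, stage the records, and trigger a sync. The base URL, credentials, and field mapping are placeholders for illustration, not our production code:

```typescript
// Sketch of an Eloqua Bulk API 2.0 contact import.
// BASE_URL, the credentials, and the field mapping are hypothetical placeholders.
const BASE_URL = "https://secure.p01.eloqua.com/api/bulk/2.0";
const AUTH = "Basic " + Buffer.from("CompanyName\\api.user:password").toString("base64");

async function bulkImportContacts(contacts: { email: string; firstName: string }[]) {
  const headers = { Authorization: AUTH, "Content-Type": "application/json" };

  // 1. Create an import definition mapping our columns to Eloqua contact fields.
  const defRes = await fetch(`${BASE_URL}/contacts/imports`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      name: "Daily new users import",
      fields: {
        email: "{{Contact.Field(C_EmailAddress)}}",
        firstName: "{{Contact.Field(C_FirstName)}}",
      },
      identifierFieldName: "email",
    }),
  });
  const definition = await defRes.json(); // includes a "uri" used in the next steps

  // 2. Stage the records against the import definition.
  await fetch(`${BASE_URL}${definition.uri}/data`, {
    method: "POST",
    headers,
    body: JSON.stringify(contacts),
  });

  // 3. Trigger a sync so Eloqua processes the staged records.
  await fetch(`${BASE_URL}/syncs`, {
    method: "POST",
    headers,
    body: JSON.stringify({ syncedInstanceUri: definition.uri }),
  });
}
```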
Nowadays, we always look for ways to automate. This blog post explains how to automate the process of exporting Eloqua reports to Google Spreadsheets and sharing them with different stakeholders.
In Eloqua Series – Part 5, we created reports using Oracle BI. In this blog post, you’ll see how you can send that data directly to a Google Spreadsheet and also create Data Studio dashboards.
By using the scheduler within the Oracle BI tool, you can send the required report by email. You may not have heard of Microsoft Power Automate, but with it you can route that report to Google Spreadsheets, Google Drive, or Microsoft OneDrive, and carry out a great deal of additional processing and automation.
This blog post will be useful for everyone who sends bulk emails. Here I describe some of the best observations and practices we have followed across all the email marketing campaigns we dispatch. Email plays a very substantial role in getting a user engaged, interacting, and converting into a buyer.
Mobile First – In this digital era, everybody is shifting toward handheld devices; in our experience, as much as 85% of traffic to websites and emails comes from mobile phones. Always aim to create mobile-friendly emails. Many email marketing platforms also provide an option to show or hide content based on desktop or mobile, as sketched below. Make use of that feature, run A/B tests, and create mobile-first emails.
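Mechanically, the show/hide option usually boils down to a CSS media query plus helper classes. A minimal hand-rolled sketch (the class names here are our own, not any particular platform’s) looks like this:

```html
<style>
  /* Hidden by default (desktop); revealed on small screens. */
  .mobile-only { display: none; }
  @media only screen and (max-width: 600px) {
    .mobile-only { display: block !important; }
    .desktop-only { display: none !important; }
  }
</style>
<div class="desktop-only">Full-width hero banner for desktop readers</div>
<div class="mobile-only">Compact, tap-friendly banner for mobile readers</div>
```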
Oracle Business Intelligence (BI) is very useful for reporting. It is one of Oracle’s most powerful tools, providing excellent reporting and insight into all the activity happening on your Eloqua account.
With Oracle BI, you can obtain reports for all email campaigns, individual emails, forms, custom data objects, contacts, and much more. The best part of Insight is that you can schedule these reports to your email inbox daily, weekly, or on a specific day as your needs dictate. We have configured 100+ reports that are sent to stakeholders in accordance with business needs, and all of it is automated. We avoid manual work as much as possible, and Insight helps us achieve that easily.
When you first load Oracle BI, navigate to the Shared Folder and you’ll see the list of all the default reporting options available. Oracle has done a great job of defining useful default reports for all business domains.
On Oracle Eloqua, Custom Data Objects (CDOs) play an important role because they allow you to store up to 25 million records.
If you already use Eloqua, you may be aware that it only allows a certain number of records in the contacts database, depending on the package you choose. Let’s suppose it’s 1 million. As you enhance your application, your contact data will keep growing, and the Eloqua contacts table won’t be able to hold all of the information.
With the help of CDOs, you can store a large set of data in an organized manner, with one-to-one or one-to-many relationships. You need to set a unique field as the key, so each form submission must carry some unique ID. For one-to-one, if you set the email address as the unique ID, the system will always update that same email address’s record with each new set of values. For one-to-many, if you set the submission ID as the unique ID, the system will always add a new record. You also have the option to set a Unique Code Field, which ties each entry to a specific email address. The example below illustrates both behaviors.
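As a hypothetical illustration, suppose jane@example.com submits the same form twice:
With Email Address as the key (one-to-one): the first submission creates a record, and the second updates that same record, leaving one record.
With Submission ID as the key (one-to-many): each submission creates its own record, leaving two records.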
In a previous blog post, we discussed Email Campaigns. We’ll take a closer look at Eloqua forms and the different post-processing options available that can help you organize contacts properly.
Eloqua’s forms are one of the key entry points for leads. Eloqua consultants recommend using the Eloqua Forms API to push data from websites to Eloqua. Within Eloqua forms, we can apply post-processing steps where we define all the necessary rules for each submission. This practice gives you more control over contact data, allowing you to create different strategies for organizing and planning user journeys.