The Plesk WordPress Toolkit 4.9 Release – What’s New?

We’re happy to announce that the Plesk WordPress Toolkit 4.9.0 release is now available to the general public. As most of you probably know, this year we’ve been pretty busy working on WordPress Toolkit for cPanel. And even though 4.9 is not a huge update in terms of customer features, it certainly has some long-awaited surprises in store. So, let’s dive into the details to see what’s new.


Limit the Number of WordPress Installations in Service Plans

Hosters could always limit access to WordPress Toolkit or some of its functionality through Plesk Service Plans. However, it wasn’t possible to set a limit on how many WordPress sites any given user could manage via WordPress Toolkit. This made things unnecessarily hard for some hosters, because many Managed WordPress providers have site limits as a part of their business model. We’ve decided to address this glaring omission in WordPress Toolkit 4.9 by adding this limit on the Resources tab of the Service Plan management screen:

Now, it’s possible to directly customize a particular subscription and change the limit. Service Plan add-ons also have this limit available. So, most kinds of possible upsell scenarios are covered.

The website limit affects the ability to install WordPress sites via WordPress Toolkit, add new sites using the Scan feature, and create clones of existing sites. Note that so-called “technical” installations – e.g. clones made by Smart Updates – don’t count towards the site limit, as they’re not visible to users in the interface.

By default, the limit is set to Unlimited, so nothing will change for users out of the box after the update to WordPress Toolkit 4.9. Some of you may ask what happens if the hoster defines a limit that’s lower than the number of sites a customer already has. In this case, the user won’t be able to add more sites, but existing sites won’t suddenly disappear from the interface.

However, if the user removes or detaches a site, it won’t be possible to add another site back once the limit is reached. In other words, you can reduce the number of sites as you see fit, but you can’t increase it beyond the limit set for your subscription:
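To make the behavior concrete, here’s a minimal Python sketch of the limit logic described above (illustrative only – the class and method names are invented, not actual WordPress Toolkit code):

```python
class Subscription:
    """Toy model of a subscription with a WordPress site limit."""

    def __init__(self, limit=None):
        self.limit = limit   # None means Unlimited
        self.sites = []      # user-visible installations only; "technical"
                             # clones are not tracked here and never count

    def can_add_site(self):
        return self.limit is None or len(self.sites) < self.limit

    def add_site(self, name):
        if not self.can_add_site():
            raise RuntimeError("Site limit reached for this subscription")
        self.sites.append(name)

    def remove_site(self, name):
        self.sites.remove(name)


plan = Subscription(limit=2)
plan.add_site("blog")
plan.add_site("shop")
assert not plan.can_add_site()   # a third site is rejected

# Lowering the limit below the current site count hides nothing:
plan.limit = 1
assert len(plan.sites) == 2      # existing sites don't disappear
plan.remove_site("shop")
assert not plan.can_add_site()   # still at the limit, can't add back
```

The key point the sketch shows: the limit only gates *adding* sites, never the visibility of sites that already exist.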

Configure Default Database Table Name Prefix

WordPress Toolkit generates a random prefix for database table names every time someone installs a new WordPress site. This is to alleviate the impact of automated bot attacks looking for vulnerable WordPress databases that use the default table prefix. For some users – especially WordPress developers – this behavior is quite annoying, so we added the ability to configure a specific default prefix for database table names to use whenever someone installs WordPress on a server:

Here comes the tricky part. Generating a random prefix for database table names is a security measure in WordPress Toolkit that’s applied automatically during the installation of WordPress. If you set the default prefix back to ‘wp_‘, WordPress Toolkit will respect your choice and will not change this prefix, but it will set the site security status to ‘Danger‘ to tell you that this isn’t secure. This shouldn’t be an insurmountable challenge, as any other predefined prefix (be it ‘wp‘ or ‘wp___‘, or whatever else that is not ‘wp_‘) won’t trigger the security warnings.

If users want to return to the old behavior with a randomized prefix, all they need to do is to leave this field empty. This small QoL (Quality of Life) improvement should provide a number of users with more control over their WordPress Toolkit experience.
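For illustration, here’s roughly what the randomized-prefix behavior and the ‘wp_’ check amount to (a hedged Python sketch – the exact prefix format and security logic WordPress Toolkit uses are assumptions, not its real implementation):

```python
import secrets
import string


def generate_table_prefix(length=4):
    """Generate a randomized database table prefix. The format here
    (a few random lowercase letters plus an underscore) is an assumed
    stand-in for whatever the Toolkit actually produces."""
    letters = "".join(secrets.choice(string.ascii_lowercase) for _ in range(length))
    return letters + "_"


def security_status(prefix):
    """Only the well-known default 'wp_' prefix is flagged as insecure;
    any other predefined prefix passes without warnings."""
    return "Danger" if prefix == "wp_" else "OK"


assert security_status("wp_") == "Danger"
assert security_status("wp") == "OK"      # not 'wp_', so no warning
assert security_status("wp___") == "OK"

custom = generate_table_prefix()
assert custom.endswith("_") and custom != "wp_"
```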

Working on WordPress Toolkit for cPanel

We’ve been doing a lot of work on the WordPress Toolkit for cPanel front during the development of WordPress Toolkit 4.9. For instance, we’ve added the capability to update the product in cPanel, and we’ve started to really dig into the security and performance aspects, addressing a lot of issues that both the WordPress Toolkit and cPanel teams found.

Features like Sets and Reseller support were also added in the scope of the current release, and we’re actively working on licensing and test infrastructure at the moment. While there’s still quite a lot of stuff left to do, we can already foresee a finishing date: WordPress Toolkit for cPanel will be ready for a demo very soon. And we’re already seeing a lot of interest from various partners – woohoo!

Testing Amazon AWS Infrastructure and Other Stuff

There’s another hidden but very important activity that has been going on behind the scenes for quite some time: the initiative to move our regression testing to Amazon AWS infrastructure for extra speed, flexibility, and on-demand availability. This should allow us to test WordPress Toolkit on cPanel as often and as thoroughly as WordPress Toolkit on Plesk.

Using AWS for testing should also allow us to run a suite of tests for each developer commit in the future, bringing us closer to the goal of our “green master” initiative – in other words, having a product that could be released in a production-ready state at any given time.

Speaking of improving the product, some of the security and performance improvements done in the scope of WordPress Toolkit for cPanel should also affect WordPress Toolkit for Plesk in a positive way. WordPress Toolkit 4.9 includes a number of important customer bugfixes as well.

Future Plans

Our next major release will be Plesk WordPress Toolkit 4.10, tentatively scheduled to launch by the end of summer 2020. This upcoming release coincides with the peak of the vacation season, so we won’t have the manpower to push any groundbreaking changes – those are reserved for subsequent releases.

However, you can rest assured that WordPress Toolkit 4.10 will include some in-demand customer features, bug fixes, and other interesting stuff on top of changes required for cPanel support. We’re also planning to release a small WordPress Toolkit 4.9.1 update very soon with a couple of new CLI utilities as a part of the CLI completeness initiative. The future of the product looks very busy, so stay tuned for updates – and especially, stay healthy! 

…So that’s all for the Plesk WordPress Toolkit 4.9 release. Remember that our teams are always on the lookout for new features to implement or bugs to squash, and here’s where your feedback is essential. You can share your suggestions or ideas for new functionality through one of our channels – UserVoice, the Plesk Community Discussion Forum, and the Plesk Online Community.

Or while you’re here, you can also leave your feedback in the comments below – our teams have eyes everywhere! Once again, thank you for reading. And cheers from the whole WordPress Toolkit team!

Next Level Ops Podcast: Using Cloud Services for Your Hosting or Website with Lukas Hertig

Hello Pleskians! This week we’re back with the sixth episode of the Official Plesk Podcast: Next Level Ops. In this installment, Superhost Joe welcomes back Lukas Hertig, our Highest Order Pleskian, to have a chat about hyperscale cloud services.

In This Episode: Cloud-Washing, Competing in a Hyperscale Cloud Environment and Specializing Your Niche

What do we mean when we’re talking about cloud services? What is a hyperscale cloud provider? How can hosting companies compete in a hyperscale cloud environment? Joe and Lukas get the ball rolling on cloud computing in this week’s Next Level Ops. “Unfortunately, there is a lot of ‘cloud-washing’ out there in the market,” says Lukas.

“If you want to use cloud services, it depends highly on your use case or your business. All the great stuff that we’re personally using today - Netflix, Uber, Shopify - is backed by cloud services.”

Lukas Hertig

The main idea behind cloud computing is that it lets you share resources. Amazon was the first to consider this idea when it wanted to scale its services back in the 2000s. Companies can now run their applications on top of technology infrastructure provided by Amazon Web Services. These days, cloud computing is available globally. And a few big competitors have entered the market. One of the biggest advantages cloud services provide is that you can keep your data and your services where your customers are.

That said, in what circumstances can a company use cloud services? “If you want to use cloud services, it depends highly on your use case or your business,” says Lukas. “All the great stuff that we’re personally using today – Netflix, Uber, Shopify – is backed by cloud services.”

Key Takeaways

  • Advantages of using cloud services. There has been concern among European companies about privacy in the cloud. However, today cloud providers are fully compliant with GDPR and local privacy regulations. This has made it easier for businesses to use such services. Using cloud services also depends on your use case. If you are a large enterprise, it allows you to spin up servers closest to your customers at the click of a button. When you are a start-up, it allows you to scale your services very fast.
  • Competing in a hyperscale cloud environment. Hyperscale cloud providers have made cloud infrastructure a commodity. So you need to find new ways to compete on a different layer, not just at the infrastructure level. For hosting companies that means moving from “generalist” to “specialist” managed services. Hosting companies should investigate what niche their customers belong to. This will enable them to provide more targeted technologies and services to their end users.
  • Partnering with hyperscale cloud providers. You can partner with companies like AWS and DigitalOcean using their partner programs and build on top of their hyperscale cloud. These companies are huge but they’re also human! It’s not all about competing but using existing services and building strategic relationships for growth.
  • Benefiting from hyperscale cloud technology. The rise of the platform plays a role here, i.e. look at platforms like Wix and Shopify who are actually using hyperscale cloud infrastructure to provide services to their users. Companies can develop more customized solutions using technology from hyperscalers. These solutions may not even be possible without hyperscaler technology!

…Alright Pleskians, it’s time to hit the play button if you want to hear the rest. If you’re interested in hearing more from Lukas, check out this episode. If you’re interested in knowing more about cloud service models, take a look at this guide. Remember you can find all episodes of the official Plesk Podcast here and here. And if you liked this episode, don’t forget to subscribe and leave a rating and review in Apple Podcast. We’ll be back soon with the next installment.

The Official Plesk Podcast: Next Level Ops Featuring

Joe Casabona

Joe is a college-accredited course developer. He is the founder of Creator Courses.

Lukas Hertig

Lukas is the SVP Business Development & Strategic Alliances at Plesk.

As always, remember to update your daily podcast playlist with Next Level Ops. And stay on the lookout for our next episode!

Interested in Multicloud Management? Read This


You may be familiar with multicloud management, since many are saying multicloud is the future of IT. But what you should know is that we’re already living in a multicloud world. This IBM study clearly shows that 85% of organizations are already using this approach and benefiting from its advantages. However, despite its popularity, as many as 60% of businesses don’t have the tools and procedures for multicloud management.

This can be a major issue for an organization, particularly in how it can unintentionally introduce risk and slow the progress of moving high-priority workloads. Here, we’re going to explore some of the major problems and challenges of the multicloud approach. We’ll also look at its advantages and the solutions that can help businesses navigate multiple cloud environments while meeting their ever-increasing demands.

What’s a Multicloud Management Platform?


Cloud computing has evolved since Google CEO Eric Schmidt introduced the term at an industry conference. Today, most organizations use cloud computing to support some aspect of their business, and cloud services have become so popular that many organizations now use two, three, or more cloud providers to meet their objectives. This led to the birth of multicloud management.

Multicloud is a cloud computing approach made up of two or more cloud environments. As we touched on above, it developed because there are organizations currently using cloud services from more than one provider. There are many reasons why businesses choose to use a Multicloud management platform – and cloud services from several providers.

Comparing Multicloud vs Single Cloud

Firstly, the main difference between multicloud and single cloud is that multicloud lets you access the best SaaS applications across multiple cloud environments – such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure. Secondly, multicloud minimizes your dependency on any single cloud provider, resulting in decreased risk and increased performance.

What’s more, most businesses are now using a multicloud management platform approach, so it’s becoming increasingly important for those who wish to remain competitive and agile in such a fast-evolving landscape.

The Multicloud Technology

The growth of multicloud so far has relied very much on containers and Kubernetes technology. One of the main multicloud problems has always been running different cloud solutions in different software environments. That’s why businesses need to be able to build applications that can move across environments without causing issues with integration. This is where Multicloud technology comes in.

As they isolate the software from the underlying environment, containers are an ideal solution to these Multicloud problems. They essentially allow developers to build applications that can be deployed when and wherever they please.

To manage and deploy these containerized applications, “container-orchestration systems” such as Kubernetes have emerged. Kubernetes, described as a platform for automating the deployment, scaling, and operations of application containers across clusters of hosts, is one of the most widely-used, open-source container platforms out there.
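As a toy illustration of the kind of decision an orchestrator automates, here’s a simplified Python scheduler that places containers on the least-loaded host. This is a deliberate oversimplification – Kubernetes’ real scheduler weighs resource requests, affinity rules, taints, and more – but it captures the basic idea of spreading workloads across a cluster:

```python
def schedule(containers, hosts):
    """Place each container on the currently least-loaded host:
    a (much simplified) stand-in for a cluster scheduler."""
    placement = {}
    load = {h: 0 for h in hosts}
    for c in containers:
        target = min(load, key=load.get)  # pick the emptiest node
        placement[c] = target
        load[target] += 1
    return placement


result = schedule(["web-1", "web-2", "api-1"], ["node-a", "node-b"])
# Three containers over two nodes end up in a 2/1 split.
assert sorted(load_count := [list(result.values()).count(h)
                             for h in ("node-a", "node-b")]) == [1, 2]
```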

An organization can be following a multicloud approach without having any solutions in place to support effective multicloud management. Multicloud management is therefore the ability to manage multiple clouds from a single, central environment. Such management solutions may be called multicloud technology or multicloud management platforms. They are increasingly common today and stand out when comparing multicloud vs single cloud.

Multicloud vs Hybrid Cloud: What’s the Difference?

They sound similar, but there are some differences between multicloud and hybrid cloud. Multicloud is about using multiple cloud services from more than one provider. Meanwhile, with hybrid cloud, an organization uses a combination of private and public clouds and on-premises services from the same provider. Therefore, the providers are key when distinguishing multicloud from hybrid cloud.

Moreover, many businesses have moved away from public cloud models toward a hybrid model, especially since the development of hybrid cloud architecture, as it means they can manage their workload across multiple cloud environments. However, since many businesses following a hybrid cloud approach are using a combination of different public cloud providers, they’re also in need of a dedicated multicloud management solution.

A hybrid cloud and multicloud management platform approach can and often do co-exist. But as they are different approaches, they each have their individual risks and challenges and so must have their own dedicated management solutions.

The Streaming Giant with Multicloud Storage


As a quick case study of the multicloud storage approach in action, we can turn to streaming giant Netflix. A business built on delivering video via the cloud, Netflix has been open about its model – and about how its main cloud provider has long been Amazon Web Services (AWS). More recently, however, the company has begun working with another provider: Google Cloud.

The move was reportedly executed to take advantage of the competing provider’s unique functions, from disaster recovery to artificial intelligence. Although switching providers may strain its long-term relationship with AWS, the multicloud benefits far outweigh the risks – and in this case, it’s multicloud storage that comes out on top. The open-source tool Netflix developed for multicloud deployment is called Spinnaker, and it’s even supported by AWS, Google, and Microsoft.

Before Spinnaker, Netflix was famous for putting all its eggs in the AWS basket. By multi-sourcing its data storage, the company has reduced its dependency on one provider, minimizing the risk of outages and improving its flexibility for sustained growth.

What Are the Multicloud Advantages?

Multicloud management clearly has many benefits. For many organizations, hitting that 99.9% availability and significantly decreasing latency are enough to convince them. But it can also give you a range of integrations that only work on specific clouds.

For instance, if you have sensitive data you don’t want in the public cloud but still need to build integrations for it, you can run workloads on the private side that take advantage of that sensitive data. All around, this makes for a more efficient workflow and a greater user experience.

Potential Multicloud Problems

Multicloud management isn’t all good news: alongside the many benefits of working in a complex multicloud environment come several challenges. IBM outlines three of them – Rapid Application Innovation, Data Overload, and DevOps and Site Reliability Engineering (SRE) best practices.

  • Rapid Application Innovation: As businesses work more with hybrid and multicloud architecture, the volume and complexity of objects and metrics increase. As a result, monitoring and securing all your operations becomes an increasingly difficult task.
  • Data Overload: New technology in big data and artificial intelligence affords businesses many advantages. But with only traditional management methods at their disposal, they often lack the capabilities to safely handle the data and implement an effective data strategy.
  • DevOps and SRE Best Practices: Employing DevOps and Site Reliability Engineering (SRE) best practices is increasingly expected in organizations. Moving to a DevOps model, however, is difficult because it is not just about technology, but also about making cultural and operational shifts within teams – particularly in learning new ways of working and introducing new roles.

When trying to address these challenges, businesses are typically stuck between choosing a management solution that provides speed or control. But not both. An effective multicloud management solution, therefore, allows businesses to strike a balance and achieve both.

“A Unified Approach Across Multiple Clouds”

A lot of the challenges of the multicloud approach are mitigated by choosing the right multicloud solutions. According to the IBM developer Sai Vennam, there are three areas in particular that an effective multicloud management solution needs to address. Together, these three areas or “pillars” form the basis of what can be a “unified approach across multiple clouds”.

  1. Automation:

It’s necessary to have a consistent and yet flexible way to deploy and manage applications. Multicloud solutions enable this by simplifying and automating application management, improving flexibility with features like multicloud backup, disaster recovery, and the ability to move workloads – and reducing costs thanks to intelligent data analysis.

  2. Visibility:

With so many clusters running in so many environments, it can be difficult to know what’s running where. A multicloud management solution enables businesses to know instantly what business application components are running where, as well as to be able to monitor the health of resources across multiple clouds, both public and private.

  3. Governance:

Increased use of cloud-native components should not also mean increased management costs and risk to your business. Yet DevOps teams have the difficult task of employing governance and security policies across multiple clouds. With a single unified dashboard, compliance policies can be pushed across multiple clusters with a single command, reducing time, cost, and risk.

Getting the Multicloud Advantages

Multicloud management can be overwhelming. Each cloud-based service inevitably comes with its own tools and advantages, so when you use services across multiple providers, managing them can become a complex and costly task.

As well as covering the three pillars of automation, visibility, and governance, multicloud solutions must be simple and efficient overall. New multicloud platforms, with central control panels and rapid access and delivery to multiple clouds, prove ideal: the multicloud solutions that companies have been waiting for.

The Big Data Hosting Dilemma: Is Your Provider Solving it?


You’ve probably heard how data is like the oil of the digital economy. Last century, everyone wanted to invest in petroleum; today, data is one of the most valuable resources a business can invest in. Once you start drilling, you may find data in limitless volumes. Hence “big data”: the field that helps businesses identify and analyze the deluge of data at their fingertips so that they can put it to effective use. So, what’s this about a big data hosting dilemma?

Big Data Needs Big Web Hosting

Unlike raw crude oil, data itself has no universal value: if you have lots of data but no means of processing it and extracting value, it’s pretty much worthless. Big data is gaining wide popularity across many industries, mainly because its capabilities for capturing, storing, and processing data let businesses gain that competitive market edge.

However, as the name suggests, big data is all about complex functioning, massive data sets, and intricate multilevel processes. And so, businesses can only get as much out of big data as their hardware allows. To complement big data, you also need strong and dynamic servers that can support sophisticated computing, processing, and storage requirements.

That’s why web hosting companies are key in determining the success of a business’s move into big data. Here we’re exploring some of the best options for big data hosting providers, as well as how each can help you boost your big data operations.

AWS (Amazon Web Services)


AWS enjoys the prime position (pun intended) in the big data hosting market. Amazon EC2 (Elastic Compute Cloud) for starters is one of Amazon’s most successful products. Clients love EC2 particularly for its exclusive capabilities and flexibility to scale.

The model lets you enjoy the maximum availability of resources to support fluctuating requirements, all without having to fork out package expenses: thanks to a PAYG (pay as you go) approach, EC2 enables seamless scalability. Plus, it covers the two main bases you need for big data: performance and cost-efficiency.

Here’s a rundown of the main features of Amazon EC2 for supporting big data processing.

Amazon Elastic MapReduce:

Purpose-built and architected for massive data processing operations, its hosted Hadoop framework is fueled by EC2 and Amazon Simple Storage Service.

Amazon DynamoDB:

A NoSQL (not only SQL) database service that’s fully managed and promises high fault tolerance. With seamless scalability and independent provisioning capabilities, DynamoDB significantly reduces any need for active human intervention. Uncomplicated administration makes the experience convenient and smooth.

Amazon Simple Storage Service (S3):

Though thin on features, the Amazon Simple Storage Service is designed for high-scale performance and massive storage capacity. It supports seamless scalability by letting you store data in buckets, and you can select specific regions for physically storing your data to address speed or availability issues.

Amazon High-Performance Computing (HPC):

This service supports sophisticated tasks with specific needs. High-end professionals like scientists and academics use HPC for its high performance and rapid delivery, and other industries are adopting it too, mainly because of the rise of big data hosting providers. Undoubtedly, easy reconfiguration and high workload capabilities are the main benefits of Amazon HPC.

Amazon Redshift:

The focus of Redshift is to provide extreme storage capabilities for massive data warehousing, supported by the strong foundation of MPP (massively parallel processing) architecture. With its high-security ecosystem and reliable performance, Redshift is a powerful substitute for in-house data warehousing. Its architecture aligns well with high-end business intelligence tools, saving businesses significant infrastructure costs and maintenance hassles – and allowing further boosts in performance.

Google Big Data Services


Internet giant Google is another major cloud services player, and one that seems especially well suited to big data hosting. Firstly, as the leading search engine, Google boasts in-depth, first-hand experience in big data processing. Secondly, it possesses some of the most sophisticated infrastructure out there to support big data operations.

Here are a few of the major features you need to know about Google Big Data services:

Google Compute Engine:

Promising a powerful combo of security and scalability, Google Compute Engine is an advanced computing solution. With its energy-efficient model, it helps enterprises to quickly complete complex computing processes with greater accuracy. It also prevents load imbalance with its reliable workload management solutions.

Google Big Query:

As the name suggests, Google Big Query is a reliable solution for data querying requirements. It supports quick and error-free processing of SQL-like queries against massive data sets. Its specific functionalities make it ideal for presenting an impromptu report or seeking deeper analysis. One limitation to note is that you can’t alter your data once it gets into Big Query.

Google Prediction API:

A powerful machine-learning tool whose advanced features discover and memorize patterns from huge volumes of data. It’s a self-evolving tool: it gains new, deeper insights about a data pattern each time you use it. Google Prediction API also allows you to use those patterns for purpose-specific analysis, such as gauging customer sentiment or detecting cyber threats.

Microsoft Azure for Big Data Hosting


One more major contender in the big data hosting market is Microsoft, whose advanced capabilities allowed it to develop sophisticated and sharp big data tech. It’s an especially good option for those who are familiar with its proprietary products like Windows, .NET, and SQL Server.

Windows Azure HDInsight:

Fully compatible with Apache Hadoop, HDInsight can connect you with various business intelligence tools as well as Microsoft Excel. You can also deploy it in Windows Server.

OpenStack for Big Data Hosting


As a popular open-source platform for cloud computing, OpenStack is big in big data applications and processing. It offers clients a choice between public and private clouds, though you do have to follow standard implementation as per the organization’s rules, and OpenStack may limit how efficiently you can customize it for more sophisticated requirements.

Although not as established as others in our big data hosting provider list, OpenStack has many advantages to offer:

  • A Democratic Approach: OpenStack has a more democratic approach to big data hosting. Once developed, this model could offer huge cost savings to clients.
  • Hardware-agnostic: OpenStack is a hardware-agnostic cloud platform capable of accommodating multiple tenants.
  • In Talks With Leaders: OpenStack is talking to leading computing businesses like IBM, Dell, and Cisco. It’s safe to say that it will spark a revolution in the big data industry.
  • Ubuntu Base: Using Ubuntu as its base, OpenStack is an open-source project that aims to enhance the benefits of big data by making it easier and more affordable to work with.
  • Backed by Rackspace: The project has backing from Rackspace and has NASA as a partner. Rackspace has plans to launch an OpenStack-based Hadoop service on the public cloud.
  • Validated by Hortonworks: The data software company Hortonworks validates OpenStack.

Choosing Your Big Data Hosting Provider

The value of data is growing. And as you would expect, the role of big data is growing along with it.

This innovative field has already helped large global companies achieve a competitive edge in the market. However, due to its heavy and complicated functionality, a small business’s servers are usually unable to support big data operations. Hence, to get the most out of big data, you need big data web hosting services.

Choosing the right web hosting provider is key in determining the efficiency of your big data operations. Use this guide to get an idea of which ecosystem and server are right for your business, and let us solve the big data dilemma once and for all.

Top 8 AWS Developer Tools You Should Know About


Amazon is launching a set of AWS Developer Tools whose goal is to simplify DevOps so IT professionals can work faster, easier, and more efficiently. The main purpose behind these AWS tools is to have the customer or developer be continuously “confident they’ll be able to find a tool for their job”, says Aron Kao, Senior Manager of Product Marketing at Amazon Web Services. So how about you – ready to code, test, and deploy automatically with the help of AWS?

AWS CodeDeploy

The CodeDeploy tool lets you automatically deploy applications and update servers and instances of any size. This means a significantly easier workload for developers and quicker releases of new products and features.

AWS CodePipeline

AWS CodePipeline essentially automates the process of releasing new software, following the release models the developer has created. CodePipeline automates the compilation, testing, and deployment phases of the launch process each time someone makes an update or modification to the code.

AWS CodeCommit

This cloud-based source control service works with Git tools so you can avoid managing and scaling a version control system. Amazon Web Services released this solution to improve software quality and reduce the time it takes to continuously release new updates.

AWS CodeBuild

AWS CodeBuild works alongside AWS CodePipeline to provide a simpler method of building and testing code. CodeBuild helps you avoid delays in producing software packages, as it continuously scales and processes several builds at the same time, removing them from the waiting queue. Also, there’s no need to set anything up or update it, as this service is fully managed from the get-go.

Blue-Green Deployment on AWS Quick Start

You can use Blue-Green Deployment on AWS Quick Start to create a CI/CD pipeline in under 20 minutes. It works with AWS Elastic Beanstalk, which is useful for deploying and managing apps in the cloud with no supporting infrastructure limits. This tool even lowers the risk of you accidentally mixing up your identical blue and green environments.
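The blue-green idea itself is easy to sketch: keep two identical environments, deploy to the idle one, and flip traffic only after it proves healthy. Here’s a hypothetical Python illustration of that pattern (not the Quick Start’s actual mechanics):

```python
class BlueGreenRouter:
    """Toy blue-green switch: traffic always goes to one of two identical
    environments; releases land on the idle one and traffic is flipped
    only after a health check passes."""

    def __init__(self):
        self.environments = {"blue": "v1", "green": "v1"}
        self.live = "blue"

    @property
    def idle(self):
        return "green" if self.live == "blue" else "blue"

    def deploy(self, version, healthy=True):
        target = self.idle
        self.environments[target] = version
        if healthy:              # health check on the idle environment
            self.live = target   # instant cutover
        # on failure, live traffic never touched the new version


router = BlueGreenRouter()
router.deploy("v2")
assert router.live == "green"                     # cutover happened
router.deploy("v3", healthy=False)                # bad release...
assert router.live == "green"                     # ...users stay on v2
```

The payoff shown in the last two lines is the whole point of the pattern: a failed release never receives production traffic, and rollback is just flipping the pointer back.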

AWS Config

We recommend using AWS Config to track configuration changes to your AWS resources and evaluate them against rules you define. It also verifies whether any change violates those rules; if one does, AWS Config flags the resource as noncompliant.
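Conceptually, a Config rule is just a predicate over a resource’s recorded configuration. This toy sketch (plain Python, not the AWS Config API; the rule names and resource IDs are made up) shows the evaluate-and-flag loop:

```python
# Toy illustration of the Config idea: evaluate each resource's recorded
# configuration against a set of rules and flag every violation.

def evaluate(resources: dict, rules: dict) -> dict:
    """Return {resource_id: [names of violated rules]}."""
    flagged = {}
    for rid, config in resources.items():
        violations = [name for name, rule in rules.items() if not rule(config)]
        if violations:
            flagged[rid] = violations
    return flagged


rules = {
    "encryption-enabled": lambda c: c.get("encrypted", False),
    "no-public-access": lambda c: not c.get("public", False),
}
resources = {
    "bucket-a": {"encrypted": True, "public": False},
    "bucket-b": {"encrypted": False, "public": True},
}
flagged = evaluate(resources, rules)
```

Only the noncompliant resource ends up flagged, together with the list of rules it breaks.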

AWS Amplify for Mobile Development Framework

Building mobile backends has never been so easy. The Amplify tool includes UI components, a command-line interface and a set of libraries to integrate your backend into any mobile and/or web app.

AWS Cloud Development Kit

Currently in developer preview, the AWS Cloud Development Kit (CDK) gives your team a high-level, object-oriented framework for defining cloud resources in the programming language you already use. You can also build your AWS infrastructure using the AWS Construct Library – a set of modules pre-built by AWS.

Which AWS Development Tool do you think is most useful? Tell us in the comments.

Cloud giant AWS is one of our top hyperscale partners, allowing Plesk users to scale into the cloud and tap into AWS resources that can help grow their business exponentially. Try Plesk on AWS to experience all these solutions and more.

Hidden Blockchain Opportunities (3): Decentralized Cloud Storage


Decentralized Cloud Storage is one use case that’s growing very fast and aims to solve one of the biggest online challenges today. Cloud storage is controlled by a few very large providers (Google, Microsoft, Dropbox, Amazon, and so on), raising questions about data protection, privacy, licensing, and the control and ownership of data. It’s yet another hidden blockchain opportunity for hosting and cloud providers.

Decentralized Cloud Storage in the Blockchain age

A few Blockchain companies have started working on proper alternatives, providing opportunities for cloud and hosting providers too! They all operate in a similar way:

(Read part 1 of the Blockchain series if this is not clear yet)

  • Instead of running storage through a company that controls it centrally, a decentralized Blockchain network stores the data.
  • The technology is open source and there’s no company controlling the data within this Blockchain network.
  • Compared to a centralized network, a decentralized storage network spans not hundreds or thousands of computers/servers, but often millions. The price to store data is lower, and the availability of such a network is significantly higher than that of a traditionally centralized network.
  • The data is encrypted and each user controls their own encryption keys, making the stored data far harder to compromise than a single, centrally held store.
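To make the mechanics concrete, here is a rough sketch of the client side: encrypt locally with a user-held key, then shard the ciphertext across several storage nodes. The hash-based keystream “cipher” is purely illustrative – a real network would use audited encryption (e.g. AES-GCM) plus erasure coding – and all names are invented.

```python
# Conceptual sketch of client-side encryption plus sharding across storage
# nodes. NOT real cryptography: the keystream below is for illustration only.
import hashlib

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def shard(data: bytes, n_nodes: int) -> dict:
    """Split ciphertext into chunks and assign one chunk per node."""
    chunk = max(1, -(-len(data) // n_nodes))   # ceiling division
    return {i: data[i * chunk:(i + 1) * chunk] for i in range(n_nodes)}

def reassemble(shards: dict) -> bytes:
    return b"".join(shards[i] for i in sorted(shards))

key = b"user-held secret key"            # only the user ever sees this
plaintext = b"sensitive document contents"
ciphertext = keystream_xor(plaintext, key)
shards = shard(ciphertext, 4)            # spread across 4 storage nodes
restored = keystream_xor(reassemble(shards), key)
```

No single node ever holds the whole file or the key, which is the property that makes the decentralized model attractive for sensitive data.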

Examples of such Blockchains are:

  • STORJ: (Funding: 35M USD) – an S3-compatible V3 will be released soon
  • Sia: (Funding: 1.5M USD) – in production
  • Filecoin: (Funding: 257M USD) – no product yet, just a file system so far
  • IPFS: (Funding unknown) – in production and already used by developers worldwide.

Where’s the opportunity for cloud and hosting providers?

Because of the way these Blockchain networks operate, there are two use cases that hosting & cloud service providers can pursue.

  1. Consume storage:
    For example, use the storage of these networks as an additional way of storing special or sensitive data, at a very low price.
  2. Contribute your spare/idle infrastructure:
    Add it into the Blockchain network to help keep it up and running. Get paid in tokens.

We’re still in the early stage of decentralized storage, but expectations are high, especially considering the investment sizes and the advantages this approach provides compared to centrally controlled cloud storage. So I recommend you have a look now and make sure you’re ready for it as early as possible.

Decentralized Computing powered by Fog Computing (aka Blockchain)

Imagine running a decentralized approach for computing power across millions of computers on a Blockchain. It’s probably one of the most complex Blockchain areas being built.

Traditional cloud computing, especially at hyperscale, is dominated by a few large companies – Amazon, Google, Microsoft, Alibaba. They have central control over thousands of machines, used by millions of users. A couple of thousand hosting providers exist alongside them, but each is orders of magnitude smaller than the global hyperscale giants.

There are now organizations, funded with millions of USD, that are trying to change this, so that cloud computing can become “Fog Computing” – a globally scalable network of computing power based on a Blockchain: millions of computers connected decentrally, without central control. This makes computing power usage on a global scale not only more secure, but also far less expensive.

Where’s the opportunity for cloud & hosting providers?

Computing power:

  1. Even if these new approaches are decentralized, the computing power behind them is still required – it is simply layered and connected across the world through a secure and scalable Blockchain layer. You can easily contribute such computing power (spare, idle infrastructure) to these networks and get paid in tokens.
  2. If you need inexpensive computing power in a secure and scalable way, these offerings will be much more cost-effective than traditional ones.

Here are a few well-funded companies working hard to launch their networks, or that have already launched them. Some even go as far as developing apps on these infrastructures using the new standard, WebAssembly.

Next Blockchain steps for Hosting & Cloud Providers?

Although this fast-growing space is still at an early stage, there already are multiple initial Blockchain use cases. So it’s definitely the right time for the cloud and hosting provider industry to be part of it. We recommend checking all the use cases mentioned above and in part 2 of our Blockchain series and seeing if they work for you. Be active, grow your business.

Recommendations for further reading:

When Plesk CTO Jan Loeffler revealed hosting tactics at Cloudfest 2018 [Video]

Successful Hosting Tactics Revealed at Cloudfest 2018

Our Plesk CTO, Jan Loeffler, made his grand entrance at CloudFest, Rust, yesterday. As he entered the main forum room bouncing a Plesk-branded football, the Fifa World Cup anthem came on for all to hear. He was going to share inside tactics on how to win, what he called, the ‘Hosting Champions League’.

Jan joined Plesk 2 years ago as CTO with the vision to have Plesk become the leading hosting control panel worldwide, and keep it that way. He’s a diehard football fan. But there’s another thing he feels absolutely passionate about – and that’s cloud hosting. At Cloudfest 2018, he brought these two worlds together.

Jan Loeffler holding a football at his Cloudfest 2018 talk - Secret tactics of how to win the Hosting Champions League

When hosting becomes a game, you need to go long and strong to win the league and increase your revenues. Now Jan reveals how to take home the gold, with collected advice from none other than our very own Pleskian community.

Who are the likely Hosting contenders?

Think of the hosting industry as if it were a football league game. Where would you put your money? And if you think of the likely contenders – how well are they playing the game, and what are they doing?

This is what Jan’s trying to gauge when we look internally. Are we being team players? Are we coaching our customers on what’s best? According to Jan, you can only win the Hosting Game by adapting to what your customers want.

Jan’s advice for all the Hosting teams out there

Offer Hyper-tailored Solutions

How can you score and win the hosting cup? Jan insists that hosters can only compete with giants like AWS by offering cloud service solutions to customers on top of their standard offers.

“Customers want simplicity and great service, which is something that’s difficult for companies like AWS to provide. This is where you, hosters, should step up and become your customers’ coaches. You have the experience and the infrastructure to “tackle the defense” and to advise them to design perfectly adapted solutions.”

Notice the current hosting scene and how many traditional hosters have ignored the winds of change and not adapted. None of these traditional hosters would have made it to the Premier League if this were a tournament. Jan suggests offering a custom solution and charging a higher price.

Address the Current Web Developer Trends

Get on the WordPress wagon. WordPress dominates as the most popular way of building websites, with 30% of all websites worldwide built on the platform. Half the WP sites out there are hosted by just 15 hosting groups. Jan says “WordPress is a must-have for all hosters and service providers”. The current hosting winner for WordPress is GoDaddy, as they host the most WordPress sites worldwide.

Plesk audience at Jan Loeffler's Cloudfest 2018 talk - Winning the hosting champions league


Another one is git. Why don’t hosters use git if 70% of web developers use it as their source control system? That’s why Jan advises hosters to include this in their offering. Go where your customers are.

The Call to Hosters

Our hosting partners continue to launch new products, and now they’re even doing so with the new Plesk Onyx and all its updates. We’re talking AI, SEO, Security – the works. So a big thank you to the following hosting partners for their immediate support with our latest product: Infortelecom, UOL, InterNetX, ZNetLive, Conetix, Codero, A2 Hosting, Axarnet, Webplus, Exabytes, Managed.com, OzHosting, 34sp.com, Ptisp. Jan quotes Van Gogh in saying that it’s all the little things that lead to success, and we all agree. 

Plesk Cloudfest 2018 Talk by Jan Loeffler - Great things are done by a series of small things brought together - Van Gogh


Finally, Jan encourages hosters to step out of their comfort zone and team up with Plesk – a proven ally that runs on all the major clouds, like Amazon AWS, Microsoft Azure, and Google Cloud. You can plan your gameplay with the all-in-one platform in your corner, and use the available extensions and features to win the Hosting Cup. Even the sky isn’t the limit when you can scale to the cloud.

So, who won the Plesk football?

You didn’t think the Plesk-branded football was just a prop, did you? What would Cloudfest be without a little competition added to the mix? Jan fired a Plesk-Trivia question to the audience, promising to award the Adidas ball to the first correct respondent. “Which is the most visited feature page in Plesk?” he asked.

Answers started shooting from all directions. “WordPress Toolkit” – good following, but no. The “Extension Catalog” – nope, even more than that.  “The Dashboard?” – close, but no. And it wasn’t the login page either! “The help page?” – no, that’s low on the most viewed list, says Jan cheekily.

Plesk Cloudfest 2018 Talk - Jan Loeffler with football winner Patrick Blank from WSPN


In the end, it was Patrick Blank from WSPN who gave the right answer and took home the football. Let’s leave a bit of mystery here. Can you guess what the most visited feature page on Plesk is? Let us know in the comments below!

Watch Jan Loeffler’s full keynote speech here

Video Update (March 20, 2018): We now have Plesk CTO – Jan Loeffler’s entire talk below. It’s 23 minutes of great insights – hit play to have a listen. Jan also shares what’s new on Plesk Onyx, justifying all the buzz around it. And the answer to the trivia question will be revealed at the end: “Which is the most visited feature page on Plesk?”

What’s new on Plesk Onyx? The March 2018 Update

Have you heard? We’re coming at you with a huge update to our all-in-one platform. You spoke, we listened. So we’ve further aligned Plesk Onyx with the way web professionals work today, and with the types of infrastructure their sites and web applications now run on. Hence, we focused on 5 main areas: Site Performance, SEO, WordPress, Security and Cloud integration. Check it out.

The Fast-Building Part

We’ve improved onboarding for you and your customers. Hello, simplified registration and social login! As soon as you’re on, you get the First Steps Advisor to guide you through the initial steps. Like adding a domain, creating mailboxes and of course enabling your security measures.

We made an SEO Toolkit. Now you can count on Plesk to help analyze your websites, without having to look elsewhere.

  • You’ll get Site Audit for common SEO issues and receive optimization recommendations.
  • Instantly review search engine crawler activity on your sites with Log File Analyzer. Then track your keyword ranking in order to adopt the right SEO strategy.
  • Finally, think smart and monitor your competitors. So that you can react to their and your ranking changes fast.

Consider the WP Toolkit enhanced with single-click NGINX caching and AI-powered Smart Updates.

  1. Let’s introduce you to Smart Updates. Using deep-learning technology, it keeps your WP instances, plugins and themes up to date safely.
  2. Configure NGINX caching to significantly speed up every WP site. And while you’re at it, configure your plugin and theme sets to come preinstalled with every new WP instance.
  3. Feel safer when updating because you can now have additional restore points before updating WP or syncing data.
  4. Speaking of safe, we’ve added pingback attack protection for extra security.
  5. With all that in place, open shop and activate your eCommerce. Choose to install WooCommerce on the new Plesk Onyx. Learn more about setting up a WooCommerce online store.
  6. You’ll also find that we’ve made WP management and UX better to accommodate more and more users.

The Tighter Security Part

Out with Security Advisor and in with the all-new Plesk Advisor. This is because we’ve expanded this system-wide. You’ll get recommendations, fixes and enhancements for security, performance, reputation, updates, backups and more.

Combine our new SSL certificate manager with the ‘Keep me secured’ feature. Breaking this down, it monitors and automatically secures Plesk, new domains, subdomains and webmail with SSL certificates. You can even choose between Let’s Encrypt or Symantec SSL certificates. Domain Validation (DV) certificates are free, but you can also choose to purchase Organization Validation (OV) or Extended Validation (EV) certificates directly from Plesk.

The Part Where You Run on Schedule

Get up close with Hyperscale Cloud services. It’s easier than ever to integrate AWS with your system using the AWS toolbox (RDS, Route53). Experience an elevated backup-to-cloud experience or integrate your own cloud storage backup. We’re talking incremental, scheduled, self-restore, granular restoration for sites, files, databases, mail accounts and more. Not to mention the improved passive FTP support and Maintenance mode.

We gave the Plesk Extensions Catalog a facelift. You’ll see the catalog is completely redesigned with intuitive navigation, rapid search, and fast auto-updates (within 24 hours). And let’s face it, our 100+ extension list is currently unmatched.

The repairing and monitoring tools are smarter than before. Yes, it’s possible. The self-repair tool can find resource-consuming processes without SSH and CLI. So you don’t need an expert to do the work. Detect and limit resources by subscription to ensure your infrastructure’s integrity.

Find your fit with the new Plesk Onyx 17.8

Your complete set of technical, security and automation tools – all in one place. We’re a leading WebOps and Web Hosting platform for a reason. Want to effortlessly build projects, secure against vulnerabilities and automate daily tasks – all in a day’s work? Then let us help with Plesk Onyx 17.8.

See which Plesk edition fits you best. If you’re already a Plesk user, get in touch – and see if we can offer you something better.

Migrate to Plesk on AWS from Plesk, cPanel or DirectAdmin

Why migrate to Plesk on AWS?

Amazon Web Services is the cloud computing platform by Amazon.com, offering over 90 key infrastructure services such as computing power, storage options, networking, and databases, delivered as on-demand resources with pay-as-you-go pricing.

As part of the hyperscale cloud revolution, increasing numbers of web professionals are now running their instances on AWS, and many Plesk hosting partners have chosen AWS to run their managed-services business. Running your instances, hosting or managed business with Plesk on AWS provides many significant benefits over the traditional hosting infrastructure:

  • Scales better than traditional shared or VPS hosting: Plesk on AWS is based on AWS’ latest innovations and integrates smoothly with AWS’ Route53 service. Support for new AWS services is constantly being added via Plesk Extensions to take advantage of the automation and customization features of the Plesk platform. Pass on the value to your website and app customers by adding AWS services to their portfolio.
  • Improved innovation: Deploy websites and apps from anywhere to everywhere. Tune, secure and optimize images that can be scaled horizontally (for high-traffic sites) with ease through AWS.
  • Improved infrastructure: Quick and cost-effective spin up of dedicated multi-server environments.
  • Increased security: Intelligent Security Advisor, free SSL with Let’s Encrypt, Fail2ban, configurable firewall, ServerShield by CloudFlare, Security Core w/ ModSecurity by Atomicorp, Patchman (Patches Vulnerabilities in CMS), Datagrid reliability & vulnerability scanner, and much more
  • Proven workflow: Deploy a domain, DNS, SSL and a simple PHP application in just a few minutes… or a multi-service, multi-stack application in about the same time. Improved CMS (WordPress, Drupal, Joomla!) and eCommerce (WooCommerce, PrestaShop, and others) workflows ensure better development velocity.
  • Increased productivity: Move from a release cycle every quarter to deploying changes on a minute-by-minute basis
  • Increased agility: Fully integrated deployment capabilities to deploy code more frequently
  • Global AWS Infrastructure: Plesk instances through the AWS marketplace are immediately available on any of Amazon’s many data center locations.

To learn more about Plesk on AWS, as well as our plug-and-play Plesk WordPress Server Solution and Plesk Business Server Solution, go to our Plesk on AWS page.

Let’s start the migration to AWS

Here’s what we’ll cover in this tutorial:

  1. Prepare your Plesk (or cPanel/DirectAdmin) source server
  2. Install Plesk on AWS as a target server + configure public IP
  3. Install Plesk Route53 Extension on Plesk on AWS + configure the extension.
  4. Install Plesk Migrator Extension on Plesk on AWS
  5. Migrate all the data (Plesk to Plesk); the domains are also created in Route53 during this step.
  6. Use the Plesk “Switch DNS” feature, so that the source server keeps serving DNS as a slave until all domains have been switched on the Route53 side (actually at the domain registrar’s). Websites stay functional during the 24–48 hours that Route53 needs for DNS sync.
  7. After that, contact your registrar to delegate your domains to the Route53 DNS.
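Before the data transfer, it can help to confirm that the target server’s Plesk XML API on port 8443 answers at all. The sketch below only builds the request object; the agent.php endpoint and packet layout follow Plesk’s XML API documentation, but verify them against your Plesk version, and the host name and credentials are placeholders.

```python
# Pre-flight sketch: build the request used to ask the target Plesk for its
# server info over the XML API (TCP 8443). Endpoint, headers and packet
# layout are taken from Plesk's XML API docs -- double-check for your version.
import urllib.request

AGENT_PATH = "/enterprise/control/agent.php"

def build_server_info_packet() -> bytes:
    return (b'<?xml version="1.0" encoding="UTF-8"?>'
            b'<packet><server><get><stat/></get></server></packet>')

def build_request(host: str, login: str, password: str) -> urllib.request.Request:
    """Prepare (but do not send) the authenticated XML API request."""
    return urllib.request.Request(
        f"https://{host}:8443{AGENT_PATH}",
        data=build_server_info_packet(),
        headers={
            "Content-Type": "text/xml",
            "HTTP_AUTH_LOGIN": login,
            "HTTP_AUTH_PASSWD": password,
        },
    )

req = build_request("target.example.com", "admin", "secret")
```

Sending it with `urllib.request.urlopen(req)` (on a machine that can reach port 8443) should return an XML packet with the server statistics; a connection error at this point means the firewall rules from step 1 still need attention.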

1. Preparing your existing Plesk or cPanel/DirectAdmin server (source)

Note: there are some limitations if you plan to migrate from Linux to a Windows server and vice versa! In general, we recommend migrating only from Linux to Linux or from Windows to Windows.

To ensure that the migration is successful, a number of TCP and UDP ports need to be opened on the source and destination servers.

Plesk offers a handy Firewall component that needs to be installed in case you don’t want to do this via the command line. If you can’t find the firewall under Tools & Settings -> Firewall, you need to install the component first. Then you can access it under Tools & Settings -> Firewall. More details on the firewall are available in the Plesk documentation for Linux and for Windows.

For Unix servers, open the following ports (in case you are migrating from Plesk using the Plesk Firewall Extension, these ports are all configured correctly for you by default!):

  • TCP port 22 for SSH connections on source server.
  • TCP port 8443 for access to Plesk XML API on the target server and on the source servers, if migrating from Plesk.
  • TCP ports 110, 143 for POP3 and IMAP, on the source and target server. These are used for post-migration checks.

For Windows servers, open the following ports:

  • TCP ports 135, 139, 445 and UDP ports 137, 138. Be sure to open these ports on the source and on the target server.
  • TCP port 1433 for MS SQL, if it is used as the default instance.
  • UDP port 1434 and all (or manually selected) TCP ports for MS SQL, if it is used as a named instance.
  • TCP port 10155 for a custom Plesk Migrator service performing miscellaneous tasks.
  • TCP port 10156 for rsync server.
  • TCP port 8443 for access to Plesk XML API on the target server and on the source servers, if migrating from Plesk.
  • TCP ports 110, 143 for POP3 and IMAP, on source and target servers. These are used for post-migration checks.

Also, make sure that https://installer.plesk.com is accessible from the destination server.
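As a quick sanity check before migrating, you can probe the required ports from a machine involved in the transfer. This small sketch covers the Linux case; the host names are placeholders, and the port list simply mirrors the lists above.

```python
# Quick reachability check for the migration ports listed above (Linux case).
# Host names are placeholders -- substitute your own source/target servers.
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_migration_ports(source: str, target: str) -> dict:
    """Map each host to {port: reachable?} for the ports the migration needs."""
    required = {
        source: [22, 8443, 110, 143],   # SSH, Plesk XML API, POP3, IMAP
        target: [8443, 110, 143],       # Plesk XML API, POP3, IMAP
    }
    return {host: {p: port_open(host, p) for p in ports}
            for host, ports in required.items()}
```

Run `check_migration_ports("source.example.com", "target.example.com")` and fix the firewall for any port that comes back `False` before starting the migration.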

We recommend that you install and configure all the necessary services and settings on the destination server before performing the transfer. For example, if you plan on migrating MySQL databases, make sure that the MySQL server is installed and running on the destination server, and so forth.

Make sure that Plesk on the destination server has a separate license. Otherwise, you may experience problems with the license validation during migration. The possible ways of obtaining a license are described in the Administrator Guide.

2. Preparing your AWS Plesk instance (target server)

Follow this guide to set up Plesk on AWS. Note: be absolutely sure that your AWS instance is configured with a public IP address as described in the installation tutorial, otherwise you might run into issues with the DNS steps later.

Image: Plesk Onyx

3. Installing the Plesk Migrator Extension

To install Plesk Migrator using the Plesk interface:

  1. Log in as administrator to Plesk on the target server.
  2. Go to Extensions -> Server Tools -> Plesk Migrator 
  3. Select Install on the Plesk Migrator detail page.

4. Installing the AWS Route53 Extension inside Plesk

In case you plan to use the DNS features of Plesk (which is highly recommended!), you need to make sure that you have the Route53 Extension installed.

  1. Log in as administrator to Plesk on the target server.
  2. Go to Extensions -> DNS -> Amazon Route 53
  3. Select Install on the Amazon Route53 detail page.

Just for reference – here is the AWS guide for Route53: http://docs.aws.amazon.com/Route53/latest/DeveloperGuide/MigratingDNS.html

5. Initiating the migration on your AWS Plesk instance (target server)

  1. Log in to Plesk on the destination server as the Plesk administrator.
  2. Go to Server Management > Extensions > Plesk Migrator > Start a New Migration. If Plesk Migrator is unavailable, install it following the instructions here.
  1. Select the hosting panel installed on the source server from the Panel type menu.
  2. Specify the following:
    • The source server’s IP address. If migrating from a Linux server, specify the SSH port as well (22 by default).
    • (Plesk for Linux) The login and password of a root user on the source server. Alternatively, you can choose to authenticate via SSH keys. For details, refer to Authentication by SSH (Linux).
    • (Plesk for Windows) The login and password of the built-in administrator accounts on both the source and the destination servers.
    • The directory for storing temporary files on the source server (make sure there is enough free disk space available to store the dump of the largest database that will be migrated).
  3. If migrating from a Windows-based server, specify the method for installing the RPC agent (an application enabling Plesk Migrator to gather data):
    • Automatic (recommended). Plesk Migrator will try to deploy and start RPC agent on the source server using the built-in administrator account. In some cases, automatic deployment may fail (for example, due to firewall settings, or because the File and Printer Sharing or RPC services are disabled). If this happens, deploy the agent manually.
    • Manual. A link to download the RPC agent package will be provided. Download the package and install the agent on the source server manually.
  4. Click Prepare Migration to proceed to the next step. Plesk Migrator will attempt to fetch the data about the different objects (domains, subscriptions, customer/reseller accounts, and hosting plans) found on the source server. If the connection fails, double-check the source server information, make sure that the connection is not blocked by firewall, and try again.

    Note that from this point onwards, you can leave the Migrator interface without losing your progress – the migration will remain in progress until you finish it explicitly. To continue from where you left off, click Server Management > Extensions > Plesk Migrator and then click the corresponding migration in the list.

  5. You now find yourself on the Add subscriptions tab.

  6. Here you must select the subscriptions to be migrated (note that you cannot migrate individual domains – the smallest unit you can migrate is a single subscription with all its domains). You can use one of the three available filters:

    • By Subscription. If you migrate a subscription owned by a customer or a reseller, the corresponding customer/reseller account will be migrated as well (unless a custom subscription owner is specified – see below). The hosting plan the subscription is based on will also be migrated.
    • By Customer/Reseller. If you migrate a customer or reseller account, all subscriptions owned by the account will be migrated together with the hosting plans they are based on. Note that migrating a reseller account does not automatically migrate the customer accounts owned by the reseller. If you select a reseller account and one or more customer accounts owned by that reseller for migration, the reseller’s ownership of the customer accounts will be preserved on the destination server.
    • By Hosting Plan. If you migrate a hosting plan, all subscriptions based on the said plan will be migrated as well. If you migrate a hosting plan belonging to a reseller, said reseller will be migrated as well, plus all subscriptions based on the selected hosting plan together with the customers who own those subscriptions.
  7. Select what types of content (mail content, web content, and databases) will be migrated.
  8. Select a custom subscription owner. By default, whenever a subscription owned by a customer or reseller is migrated, the corresponding customer or reseller account is created on the destination server as well. If you select a different subscription owner, the ownership of all subscriptions being migrated will be assigned to that account.
  9. To change the migration settings, click Settings in the upper-right corner.

  10. Here, the following controls are available:

    • Adjust application settings. By default, during migration Plesk attempts to make changes to the configuration files of a number of popular web applications to make them operable on the destination server. Clear the checkbox if you want to make the changes manually. Leaving this option enabled will increase the migration time.
    • Apache restart interval (Plesk for Linux only). Restarting the web server on the destination server is necessary for the migrated domains to become available over the network. Make sure not to set this value too low (less than 300 seconds is not recommended), as all hosted domains become temporarily unavailable every time the web server is restarted.
    • Run post-migration checks. By default, after the migration is finished, Plesk performs a number of automated tests to identify potential issues with the migrated domains. Clear the checkbox if you do not want the tests to be run. Leaving this option enabled will increase the migration time.
  11. When you are satisfied with the list of subscriptions to migrate and the migration options, click Migrate to proceed. Plesk will run pre-migration checks to detect potential issues and display a report.

  12. We advise you to fix the detected issues (if any) before continuing with the migration. Make the necessary configuration changes, then click Refresh to re-run the tests.

  13. When the pre-migration check returns a clean result, click Start migration to begin migrating. Once a migration is underway, you can monitor its progress on the Overview tab.
  14. As subscriptions are being migrated, status reports will be displayed for every subscription for which the migration was either completed successfully or failed.
    • The icon indicates that the migration was completed successfully.
    • The icon indicates that the migration was completed with errors. Click [Details] to see the list of issues that occurred during the migration.
    • The icon indicates that the migration failed. Click [Details] to see the list of issues that occurred during the migration.
  15. If you want to perform an additional sync of a subscription’s content after the migration is finished, click [Re-sync] next to the subscription’s name.
  16. If you want to migrate additional subscriptions from the source server, return to step number seven. Otherwise, unless you plan to migrate from the same source server again in the near future, you can click Finish migration to remove it from the list of ongoing migrations.

6. Going into production: switch DNS

  • Any DNS zones (domains) will be transferred to the configured Route53 correctly, without manual intervention, when your Plesk server on AWS uses a public IP address.
  • After the migration, AWS needs some time to propagate the new DNS records/domains inside their infrastructure.
  • If DNS on the source server was handled by Plesk, the migrated domains continue to work on the old DNS but point to the new IPs, thanks to the “Switch DNS” feature described earlier.

7. Ask your domain registrar or registry to switch DNS to Route53

After your changes to Amazon Route 53 resource record sets have propagated to Amazon Route 53 DNS servers (see Step 4: Check the Status of Your Changes (API Only)), update your registrar’s name server (NS) records to refer to the Amazon Route 53 name servers.
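Once the registrar change has propagated, a simple resolution check tells you whether a domain already points at the new server. The domain and IP in the usage example are placeholders:

```python
# Post-switch sanity check: does the domain now resolve to the new server's
# public IP? Uses the local resolver, so results follow normal DNS caching.
import socket

def resolves_to(domain: str, expected_ip: str) -> bool:
    """True once DNS returns the expected IPv4 address for the domain."""
    try:
        infos = socket.getaddrinfo(domain, None, family=socket.AF_INET)
    except socket.gaierror:
        return False
    return expected_ip in {info[4][0] for info in infos}
```

For example, `resolves_to("example.com", "203.0.113.10")` turning `True` tells you the Route53 records are now being served for that domain; until then, cached answers from the old DNS may still appear.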

Migration with no downtime – woohoo!


8. Additional tips

In case you are not ready to migrate your whole server yet, there is also the option to migrate just one site to a server with Plesk on AWS, or one WordPress instance into Plesk’s all-new WordPress Toolkit. For that, please refer to the separate documentation available here.


Thanks to the whole AWS team for co-authoring this write-up and for providing feedback and technical insights to optimize this tutorial.

Be well, do good, and stay Plesky!