I heard an interesting throwaway comment recently that I thought I'd share. To paraphrase...
> The system is being rewritten with new technology, so it won't be slower than what we have now.
While it's true that new and updated technology *can* improve performance, it's also very easy to misuse that technology and end up with something that performs perceptibly slower. I've seen this a few times, predominantly where simple websites have been rewritten using new technologies. One of the biggest contributing factors is that new technologies often provide "better" and more complex ways to solve the same problem. For example, it's easy to see how a simple two-tier web application could be rewritten with a web MVC framework, with a distributed middle tier added to make the architecture more fashionably SOA-like. Here, the added complexity could result in a net performance gain or loss, depending on how the technology is applied through the design and architecture.
There's nothing wrong with refreshing a software (or hardware) system with new technology, but it's naive to assume that performance won't degrade. New technology generally brings added complexity, and this complexity introduces more things that can go wrong. Any software/hardware architecture should undergo *some* non-functional testing before it goes live, and this still holds true for systems that are being rewritten as part of a technology refresh strategy. Don't assume anything ... test it.
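Even a back-of-the-envelope test beats assumption. As a minimal sketch (not tied to any particular stack), the effect of adding a tier can be modelled by measuring the same operation with and without an extra hop; the 1 ms of "work" and the 5 ms hop below are illustrative numbers, not measurements from any real system:

```python
import time

def measure_latency(fn, runs=20):
    """Return the median latency of fn() in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return samples[len(samples) // 2]

def direct_call():
    time.sleep(0.001)   # 1 ms of "work" in the existing two-tier system

def call_via_middle_tier():
    time.sleep(0.005)   # hypothetical 5 ms hop to the new middle tier
    direct_call()       # ... which then does the same work

baseline = measure_latency(direct_call)
rewritten = measure_latency(call_via_middle_tier)
print(f"baseline: {baseline:.1f} ms, with extra tier: {rewritten:.1f} ms")
```

The point isn't the specific numbers; it's that a few lines of timing code make the cost of the extra hop visible before the rewrite goes live, rather than after.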
Simon is an independent consultant specializing in software architecture, and the author of Software Architecture for Developers (a developer-friendly guide to software architecture, technical leadership and the balance with agility). He’s also the creator of the C4 software architecture model and the founder of Structurizr, which is a collection of open source and commercial tooling to help software teams visualise, document and explore their software architecture.