How a user perceives the performance of their computer helps determine whether the system just needs to be optimized, whether some parts need to be replaced, or whether the entire system does. Long gone are the days when beefing up the CPU was the only option. We will look at some common areas of concern and potential solutions. Many issues are subjective, and there is not always a single easy answer.
The CPU
The Central Processing Unit (CPU) is the brain of the computer, where most processing of data happens. Its primary role is executing the instructions of software programs: taking in input, performing work such as mathematical calculations, and producing output. Years ago you would only find multiple CPUs in servers; end-user systems had single-core, single-task processing units. Today that has changed. End-user systems still typically have only one physical CPU, but that CPU now contains multiple cores, most often 2 or 4, and even up to 8.
Do more cores equal faster speed? Not necessarily. The speed per core is often lower as the number of cores increases, and the extra cores mainly benefit programs built to take advantage of multi-tasking, where more than one core is used. When a program does not understand multiple cores, the additional cores do little to improve processing for that application. However, if you are running several tasks at once on a multi-core system, each task can be allocated its own core, getting more done at once and creating the impression of a faster machine. Upgrading the CPU is the way to go if you do a lot of processor-intensive work. The only reliable way to compare CPUs is to run CPU-specific benchmarking software.
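As a rough illustration of spreading work across cores (a sketch, not a benchmark; the workload and job sizes here are arbitrary), a CPU-bound task can be farmed out to separate processes in Python like this:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    """A deliberately CPU-heavy task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run_parallel(jobs):
    """Run each job in its own process, so each can use its own core."""
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(cpu_bound, jobs))

if __name__ == "__main__":
    jobs = [200_000] * 4
    serial = [cpu_bound(n) for n in jobs]   # one core, one task at a time
    parallel = run_parallel(jobs)           # spread across available cores
    assert serial == parallel               # same answers, done concurrently
```

A program written like the serial loop cannot benefit from extra cores; only the version that explicitly splits the work can, which is exactly why core count alone does not guarantee speed.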
Memory
You may have noticed that computers have been shipping with more and more base memory, and tend to allow for larger memory upgrades than in the past. Memory has not only grown in capacity but has also gotten faster to keep up with the demands of modern operating systems.
Years ago a desktop or laptop may have come with only 4GB out of the box, and that was the maximum allowable. The main reason was the limitation of the 32-bit hardware bus: 32-bit versions of Windows could only use about 3GB of the 4GB, as the remainder of the address space was reserved for hardware.
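The 4GB ceiling falls straight out of the arithmetic: a 32-bit system can form only 2^32 distinct memory addresses, and each address refers to one byte. A quick sketch:

```python
# Why 32-bit systems top out at 4GB: there are only 2**32 possible
# addresses, and each address points at one byte of memory.
addressable_bytes = 2 ** 32
addressable_gb = addressable_bytes / (1024 ** 3)
print(addressable_bytes)  # 4294967296 bytes total
print(addressable_gb)     # 4.0 (GB of addressable space)
```

A 64-bit system has 2^64 addresses, which is why the next paragraph can describe the limit as effectively lifted.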
Now with hardware and software running in 64-bit, that limitation has been lifted to an almost limitless amount, far more than anyone would need today. Most systems now ship with 8GB and allow for upgrading to 16GB, 24GB and even 32GB of memory. Who knows where it will be years from now? Increasing memory is probably the most noticeable area for performance improvement. The more programs you run at the same time, the more system memory they use. When physical memory runs low, the system uses hard drive space as substitute memory (called virtual memory) so that your computer does not crash. Hard drives are very slow compared to real memory, but per gigabyte they are far cheaper.
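If you want to see how much physical memory and disk-backed swap your own machine has, here is a minimal sketch, assuming a Linux system where the kernel exposes /proc/meminfo (Windows and macOS report this through different tools):

```python
# Minimal sketch: read physical RAM and swap ("virtual memory") totals.
# Assumes Linux, where /proc/meminfo lists values in kilobytes.
def read_meminfo(path="/proc/meminfo"):
    """Return a dict of meminfo fields, values in kB."""
    info = {}
    with open(path) as fh:
        for line in fh:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])
    return info

if __name__ == "__main__":
    info = read_meminfo()
    print("Physical RAM:", info["MemTotal"] // 1024, "MB")
    print("Swap (disk-backed):", info["SwapTotal"] // 1024, "MB")
```

When the "available" figure gets close to zero, the system starts leaning on swap, which is when the slowdown described above becomes noticeable.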
The Graphics Card
Graphics cards once performed only one task: allowing you to see the computer interface on a monitor, nothing more. Today graphics cards do much more than their early predecessors. They now contain what is referred to as a GPU (Graphics Processing Unit), responsible for handling 2D and 3D graphics, including your desktop display, and for outputting to various types of display connectors. Everything from gaming to movies, and even the computer desktop itself, now relies heavily on the GPU. The GPU determines not only how much graphics data can be displayed on screen but also how fast it is displayed.
The original Blu-ray movies required, at their highest quality, a display of 1920 x 1080 pixels, commonly known as 1080p, and GPUs were needed to play a movie smoothly at 30 frames per second; they offloaded that work from the CPU.
It’s also important to know that dedicated graphics cards tend to have their own physical memory on board, avoiding the use of the computer’s own physical memory. This is a huge advantage over integrated graphics chips on the computer’s motherboard, which must share the computer’s available memory and take it away from other software that needs it. Today we have 4K UHD (Ultra HD), a resolution of 3840 × 2160 pixels (8.3 megapixels, aspect ratio 16:9), one of the two ultra-high-definition resolutions targeted at consumer television. The other is 8K UHD, at 7680 × 4320 pixels (33.2 megapixels), which we will eventually see become commonplace.
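To see why higher resolutions demand more graphics memory, consider how many bytes a single uncompressed frame occupies at 32-bit colour (4 bytes per pixel); this simple calculation ignores the many other buffers a real GPU keeps, so treat it as a lower bound:

```python
def frame_bytes(width, height, bytes_per_pixel=4):
    """Memory for one uncompressed frame at 32-bit colour."""
    return width * height * bytes_per_pixel

resolutions = {"1080p": (1920, 1080),
               "4K UHD": (3840, 2160),
               "8K UHD": (7680, 4320)}

for name, (w, h) in resolutions.items():
    mb = frame_bytes(w, h) / (1024 ** 2)
    print(f"{name}: {mb:.1f} MB per frame")  # 8K needs 16x the memory of 1080p
```

Each doubling of both dimensions quadruples the memory per frame, and a card must hold several such frames plus textures, which is why 4K and multi-monitor setups push graphics memory requirements up so quickly.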
How does this compare with computer display graphics? The more programs you are running, and the more display area you use across multiple monitors, the faster the graphics card and the more graphics memory you will want, to prevent display slowdown and tearing, which looks like “lag” to many users. There is a lot more than just gaming frames per second riding on your graphics card; so much so that if you shop around for laptop computers, you will quickly learn how much more expensive one with a higher-end GPU will be.
The Hard Drive
The primary hard drive of your computer is where the operating system is installed, along with all your programs and likely some, if not all, of your data (unless you are storing data externally on the network or a USB drive). Until recently, little had changed with hard drives over the years from a physical point of view.
Traditional hard drives were classified by the rotations per minute (RPM) of the drive platters (typically 5400 RPM in 2.5” laptop drives and 7200 RPM in 3.5” desktop drives) and by the data bus speed (such as ATA-33, ATA-66, ATA-100, ATA-133, etc.). The most significant changes in recent years were, first, the transition from EIDE (Enhanced IDE) to SATA (Serial ATA) data connections, and most recently the introduction of SSDs (Solid State Drives).
It is arguable which change gives you the best upgrade. SSDs are certainly faster, as they are a purely digital medium rather than a mechanical one and do not rely on RPM like traditional drives. How you use the data on your system will determine where the performance benefits are gained; there are many factors to take into consideration, which we will not go into detail about here.
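One concrete reason SSDs feel faster: a spinning drive must wait, on average, half a rotation before the requested data passes under the read head, and that delay follows directly from the RPM. A simplified model (it ignores seek time and data transfer, so real access times are higher still):

```python
def avg_rotational_latency_ms(rpm):
    """Average wait = half a rotation, converted to milliseconds."""
    seconds_per_rotation = 60.0 / rpm
    return (seconds_per_rotation / 2) * 1000

print(avg_rotational_latency_ms(5400))  # ~5.6 ms for a typical laptop drive
print(avg_rotational_latency_ms(7200))  # ~4.2 ms for a typical desktop drive
# An SSD has no moving parts, so this mechanical delay simply disappears.
```

A few milliseconds sounds small, but it is paid on every random read, which is why an SSD transforms boot times and program launches even when raw transfer speeds look similar.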
The Network & Internet
Often people complain about their computers being slow as a general performance issue, but if you can narrow down where the real issues stem from, it may turn out that the network, Wi-Fi or Internet speed is the true culprit. If the software being used is cloud-based or has to connect to a server on a network to operate, often it is not the local computer that is at fault, but a slow network connection or a slow Internet connection.
Beyond those potential network or ISP related problems, if the programs used require large amounts of data to be transferred up and down, it is important to determine what speed the network connections run at. For example, is the cabled network running at 100Mbps or 1Gbps? Is the computer using Wi-Fi, and if so is it Wireless G, Wireless N or Wireless AC? What type of Internet service is being used, and more importantly, what is the upload speed compared to the download speed? Most providers give a much higher download speed than upload speed. Running a network or Internet speed test can help determine this for you. Usually your computer is not the cause of these types of performance issues.
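The gap between download and upload speed is easy to quantify. A back-of-the-envelope conversion (8 bits per byte, decimal units, ignoring protocol overhead; the 100/10 Mbps plan below is a made-up example) shows why sending a large file out feels so much slower than pulling one in:

```python
def transfer_seconds(size_gb, speed_mbps):
    """Time to move size_gb gigabytes over a speed_mbps link."""
    bits = size_gb * 8_000_000_000       # 1 GB = 8 billion bits
    return bits / (speed_mbps * 1_000_000)

# Hypothetical asymmetric plan: 100 Mbps down, 10 Mbps up.
print(transfer_seconds(1, 100) / 60)  # downloading 1 GB: ~1.3 minutes
print(transfer_seconds(1, 10) / 60)   # uploading the same file: ~13.3 minutes
```

Note the units trap: a "100 megabit" connection moves at most 12.5 megabytes per second, so a speed test reporting Mbps will always look eight times larger than the MB/s figure your file copy dialog shows.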
Technology Bottlenecks
When we refer to technology bottlenecks, we are describing different types of technologies interacting with each other, where data transfer slows down to the speed of the slowest point in the overall performance equation. For example, if you have a fast CPU processing data at 10Mbps but memory that can only transfer data at 1Mbps, your bottleneck is the memory, because the CPU has to slow down and wait for the memory to catch up. (This is a very crude and unrealistic example, but it makes the point.) There are many potential bottlenecks across a system that can cause performance issues. Understanding how to locate them, and how to remove or improve them with targeted upgrades, can help you avoid replacing an entire system, unless the cost of replacing parts no longer adds up against the savings and overall benefits of buying a completely new one.
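That crude CPU-versus-memory example can be put in code: the effective throughput of a chain of components is simply that of its slowest stage. The rates below are the article's illustrative figures, not real hardware numbers:

```python
def find_bottleneck(stages):
    """Return (name, rate) of the slowest stage in a pipeline."""
    return min(stages.items(), key=lambda item: item[1])

# Illustrative figures from the example above, in Mbps.
stages = {"CPU": 10.0, "Memory": 1.0}
name, rate = find_bottleneck(stages)
print(f"Bottleneck: {name}, effective throughput {rate} Mbps")
```

This is why upgrading the fastest component buys you nothing: raising the CPU's 10 to 100 leaves the system at 1, while upgrading the memory immediately lifts the whole chain.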
Knowing how your system components work together, and which aspect of computer performance each of them affects, can help you determine if and where you need to spend money upgrading. New doesn’t always mean better, especially if you end up with the same bottlenecks you had on your old system. In fact, some older systems can outperform newer ones depending on the components they use; a high-end graphics card in an older system may outperform a new low-end graphics card in a new computer. If you are not sure, ask the experts first, before wasting your money.
Paul Comtois is a Client Support Specialist at Triella, a technology consulting company specializing in providing technology audits, planning advice, project management and other CIO-related services to small and medium sized firms. Paul can be reached at 647.426.1004. For additional articles, go to www.triella.com/publications. Triella is a VMware Professional Partner, Microsoft Certified Partner, Citrix Solution Advisor – Silver, Dell Preferred Partner, Authorized Worldox Reseller and a Kaspersky Reseller.
© 2017 by Triella Corp. All rights reserved. Reproduction with credit is permitted.