Max Memory Configuration set too low [SQL Server]

We have been experiencing an issue with a clustered node where instances run extremely slowly and the only process eating CPU is “NT Kernel & System”. We have upgraded drivers and even wiped out and reinstalled Windows and SQL Server (both 2008 R2), but the issue reappears.

In the process of troubleshooting with Microsoft, we needed to fail over instances to collect log information, and we have done this several times. Because our latest cluster implementations include Active/Active nodes, we need to adjust SQL Server's max memory setting whenever all instances land on one node.

Yesterday something did not go right, and a human error caused the instances to fail. Even experienced DBAs will eventually break something, but they will do everything possible to correct it. Max memory was set to 27 MB instead of 27000 MB, and no connections could be made to the instance.

There are three ways this situation can be resolved:

  1. Opening a dedicated admin connection (DAC) to the instance and changing the max memory configuration (see the sketch after this list).
  2. Shutting down the instance and starting it from the command prompt with the -f option, which brings the instance up with minimal configuration; then changing the max memory configuration.
  3. Bouncing the instance and changing the memory configuration as soon as it starts, taking advantage of a fresh instance with no allocated cache yet.
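
For the first two options, something like the following works from a command prompt. This is a rough sketch; the server, instance, and service names are placeholders for your environment, and a clustered instance may need its resource taken offline in the cluster manager before being started manually.

REM Option 1: open a dedicated admin connection (DAC) with sqlcmd.
REM -A requests the DAC, -E uses Windows authentication.
REM (On a cluster, the 'remote admin connections' option may need to be
REM enabled beforehand for the DAC to be reachable from another machine.)
sqlcmd -S ClusterNode1\SQLINST1 -A -E

REM Option 2: stop the service and restart it with minimal configuration.
net stop MSSQLSERVER
net start MSSQLSERVER /f

Once connected through either route, the fix itself is the sp_configure change shown next.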

Changing the memory configuration:

-- Enable advanced options so 'max server memory' is visible to sp_configure.
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
-- Restore max server memory to the intended value (in MB).
sp_configure 'max server memory', 27000;
GO
RECONFIGURE;
GO
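
To confirm the change took effect, a quick check against sys.configurations shows both the configured value and the value actually in use:

-- Verify that the running value matches what was just configured.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'max server memory (MB)';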

Coaching point: double or even triple check before issuing a configuration command.

[Screenshot: SQL Server memory configuration via SSMS]