
Fatal Arithmetic Overflow Error for Minion Backups on Production and Other SQL Servers

I have Minion Backup 1.4 installed on many of my SQL Servers, and on several (including Production), I am getting fatal arithmetic overflow errors when it tries to do a full or a differential backup.  Transaction log backups continue to run successfully.


The Status column of the BackupLogDetails table says "FATAL ERROR: Arithmetic overflow occurred."
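For reference, this is roughly how I'm pulling the failing rows; Status is the column quoted above, while the Minion schema and the ordering column are assumptions on my part:

-- Rows that hit the overflow; SELECT * so no other column names are assumed.
SELECT *
FROM   Minion.BackupLogDetails
WHERE  Status LIKE 'FATAL ERROR: Arithmetic overflow%'
ORDER  BY 1 DESC;   -- assuming the first column is an incrementing ID, newest first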


Other backups on the Production server are running successfully.


There is no error message in the SQL Server Error Log around that time, so I believe this is a Minion Backup issue.  I don't see any other information about the error in the Minion logs.


Please advise as soon as possible.



Sean,


I ran a SQL Compare between the SQL Server we originally worked on for the arithmetic overflow and subsequent AG errors and the SQL Server currently having the problem with Minion Backup.  (The original is not in an AG and has Minion Backup in its own DB; the problem server is in an AG and has Minion Backup in master, but those differences shouldn't matter for the code base.  They should both be version 1.5.)


The SQL Compare showed only one stored procedure as different: Minion.DBMaintDBSizeGet.


On the problem server, I renamed the SP and then copied the SP from the original server to it.  SQL Compare then showed all Minion objects as equal between the two servers.
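For a single object, a quick alternative to SQL Compare is to script the definition on each server and diff the output (OBJECT_DEFINITION is standard SQL Server and returns NULL if the object doesn't exist):

-- Script out the current definition of the size proc for a manual compare.
SELECT OBJECT_DEFINITION(OBJECT_ID(N'Minion.DBMaintDBSizeGet')) AS ProcDefinition;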


However, when the backup job ran at its next interval, it threw the same error.


(The other SQL Servers are version 1.4 with just the changed procedures and functions applied, so they show many differences in the SQL Compare.)


What do you suggest for the next step?

Hi Sean,


We're still having a little ripple effect.  The backups on the existing SQL Servers are still working well.  After the last fix, I put all of the install files, including the latest fixes, together.


We just built a new server with an Availability Group.  The Minion Backup installed cleanly.  The system databases are backing up normally.  But when I created a new database and put it in the Availability Group, the backups returned this error:


FATAL ERROR: @BackupCmd is empty. Common causes are misconfigured BackupSettings and BackupTuningThresholds tables.


This message came up whether I used BackupMaster or BackupDB, and it happens whether the backup is full, differential, or transaction log.


A native SQL Server BACKUP DATABASE command worked without error.
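For completeness, the native command was just the basic form, along these lines (the database name and path here are placeholders, not the real ones):

-- Plain native full backup of the new AG database; this ran without error.
BACKUP DATABASE [NewAGDatabase]
TO DISK = N'D:\Backups\NewAGDatabase.bak'
WITH INIT, STATS = 10;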


I then created another database but did not put it in an Availability Group.  Backups for that database worked normally.


I checked the configurations, and they look correct (the configuration is the same whether or not the DB is in an AG).  I also scripted out the changed stored procedures from the new SQL Server and from an existing, working SQL Server and compared the two; they matched.
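For the record, the settings check was just a straight look at the two tables named in the error, comparing the rows for the AG database against a working one (the Minion schema is assumed; the table names come from the error text):

-- SELECT * so no individual column names are assumed.
SELECT * FROM Minion.BackupSettings;
SELECT * FROM Minion.BackupTuningThresholds;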


My suspicion is that it's something about a call to, or a parameter default on, one of the stored procedures, or that something was missed when packaging up the changed procedures/functions.


Do you have any suggestions as to what to look at?


Thanks!

That worked for both issues in Test.  The errors are no longer occurring there.  I will deploy to Acceptance and beyond, where we do have Availability Groups.


Thanks!

OK, let's try this... replace the input params for the AGInfo SP with these.  I think when we were tshooting the other day we didn't put all the defaults back in.

(
@DBName NVARCHAR(400),
@DBIsInAG BIT = 0 OUTPUT,
@IsPrimaryReplica BIT = 1 OUTPUT,
@IsPreferredBackup BIT = 1 OUTPUT,
@AGRole VARCHAR(10) = NULL OUTPUT,
@AGName NVARCHAR(100) = NULL OUTPUT,
@PrimaryNode VARCHAR(200) = NULL OUTPUT,
@AG2ndaryReadMode varchar(10) = 'All' OUTPUT
)
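Once that's in, a quick sanity check is to call it supplying only @DBName as an input and letting everything else default; with the defaults restored it should run without complaining about missing parameters (the DB name below is a placeholder):

-- Smoke test: only @DBName is supplied as an input, the rest default.
DECLARE @DBIsInAG BIT, @AGRole VARCHAR(10), @AGName NVARCHAR(100);

EXEC Minion.MMAGInfo
     @DBName   = N'YourDBName',    -- placeholder
     @DBIsInAG = @DBIsInAG OUTPUT,
     @AGRole   = @AGRole   OUTPUT,
     @AGName   = @AGName   OUTPUT;

SELECT @DBIsInAG AS DBIsInAG, @AGRole AS AGRole, @AGName AS AGName;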

Yes, they were copied over. I even checked the modify date. I wonder if the last change where the AG section was moved caused a problem in the non-AG SQL Server.

Ok, did you copy the new objects over from the other server... the ones that we modified?

Sean,


I've found another problem, possibly related to the last fix we did.  This is happening on a server that does not have an availability group.


On a simple recovery or full recovery database, BackupMaster works fine, but when I try to do any type of backup (full, diff, log) using BackupDB, I get this error in SSMS (three times if it's a log backup):


Msg 201, Level 16, State 4, Procedure Minion.MMAGInfo, Line 0 [Batch Start Line 4]

Procedure or function 'MMAGInfo' expects parameter '@AGRole', which was not supplied.


In the log, it says that @BackupCmd is empty.


There's a different problem with full recovery databases.  BackupMaster works fine for full and diff backups, but for log backups it looks like it works in SSMS (same output as for full and diff); in BackupLogDetails, however, there is a fatal error (the same one as above):


FATAL ERROR: HResult 0xC9, Level 16, State 4 Procedure or function 'MMAGInfo' expects parameter '@AGRole', which was not supplied. HResult 0xC9, Level 16, State 4 Procedure or function 'MMAGInfo' expects parameter '@AGRole', which was not supplied. HResult 0xC9, Level 16, State 4 Procedure or function 'MMAGInfo' expects parameter '@AGRole', which was not supplied.
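In case it's useful, the parameters the failing server's copy of the proc declares can be listed like this; note that for T-SQL procs, sys.parameters can't show whether a default exists, so that part has to be checked in the scripted-out definition:

-- List the parameters Minion.MMAGInfo declares on this server.
SELECT p.name, TYPE_NAME(p.user_type_id) AS DataType, p.is_output
FROM   sys.parameters AS p
WHERE  p.object_id = OBJECT_ID(N'Minion.MMAGInfo')
ORDER  BY p.parameter_id;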


Please advise.

Yeah, let's hook up today and do some live tshooting.  You have my email, right?  If so, send me a couple of times that work for you.

I'm afraid things are still not working correctly.


Last night, on multiple SQL Servers running Minion Backup 1.4, I updated DBMaintDBSizeGet.  To do that, I also had to add MMAGInfo, MMSQLInfoGet, and MMCmdServerNameGet because of Missing Dependency messages.
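For anyone repeating this on a 1.4 server, a quick existence and modify-date check on the objects involved looks like this (the Minion schema is assumed; the object names are the ones from the dependency messages):

-- Which of the updated objects are present, and when were they last modified?
SELECT s.name AS SchemaName, o.name AS ObjectName, o.type_desc, o.modify_date
FROM   sys.objects AS o
JOIN   sys.schemas AS s ON s.schema_id = o.schema_id
WHERE  s.name = N'Minion'
  AND  o.name IN (N'DBMaintDBSizeGet', N'MMAGInfo', N'MMSQLInfoGet', N'MMCmdServerNameGet');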


The log backups kept running normally everywhere (as they always have).  In Test (where Minionware is in its own DB), the differential backups seemed to run normally as well (those are usually smaller databases), though on the server with the path errors (Windows support is looking into that), there were still some arithmetic overflow errors.


However, on my Acceptance and Production SQL Servers, which have availability groups (and where Minionware is in the master DB), the differential backups for all databases returned this error:


"FATAL ERROR: @BackupCmd is empty. Common causes are misconfigured BackupSettings and BackupTuningThresholds tables. If you need help configuring them, consult the documentation as it has examples of proper configurations outlined."


I checked the backup settings and they look normal.


What could the problem be, how do we fix it, and why does it behave differently in Test than in Acceptance and Production?


Let me know what info you need to troubleshoot.


Thanks!



Yes, the backup is running now.


Thank you for your help.

Perfect.  Let me know if you need anything else.

Also, just replace that file in your install folder and you'll be good to go.

I found a server that is not live yet.  There were many dependencies on other SPs and functions.


Anyway, I got it to run, and it returned this value for the database whose backups are failing: 2711.35
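For a cross-check on that number, the catalog size of the same database can be pulled like this (size in sys.master_files is in 8 KB pages, so dividing by 128.0 gives MB; whether the Minion proc reports MB is an assumption on my part):

-- Total size (data + log) of the database whose backups overflow, in MB.
SELECT SUM(size) / 128.0 AS TotalSizeMB
FROM   sys.master_files
WHERE  database_id = DB_ID(N'YourDBName');   -- placeholder name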

Sure, but you could just install the new size sp under a new name and test it w/o any trouble.  This way you'll know if it'll actually fix your problem.
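Concretely, that just means editing the object name in the script you already have and running that copy against the failing database; the _Test suffix below is made up, and the proc's parameters are whatever the script defines, so I'm not repeating them here:

-- In the new install script, change only the procedure name, e.g.:
--   CREATE PROCEDURE [Minion].[DBMaintDBSizeGet]        -- original
--   CREATE PROCEDURE [Minion].[DBMaintDBSizeGet_Test]   -- renamed test copy (made-up name)
-- Run the _Test copy against the failing database, compare its output with the
-- current proc, then drop it when you're done:
IF OBJECT_ID(N'Minion.DBMaintDBSizeGet_Test', N'P') IS NOT NULL
    DROP PROCEDURE Minion.DBMaintDBSizeGet_Test;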

Sean,


I'm opening a change form to update Minion to 1.5 with the size fix on the Production SQL Servers tonight.  Those do not have the path issue that exists in Test.


I don't know if you installed that last version of DBSize I sent, but it looks like you may not have.  I've attached it again.


[Attachment: sql file, 40.3 KB]