

Out-of-core MUMPS


Hi guys,

I have a question regarding MUMPS out-of-core. What is the effect of changing the value of "In-core memory (MB)" when using MUMPS out-of-core?

Many thanks,

J

4 Replies Last Post 16 Jan 2012, 16:58 UTC−5
Jim Freels mechanical side of nuclear engineering, multiphysics analysis, COMSOL specialist


Posted: 1 decade ago 16 Jan 2012, 13:28 UTC−5
Hello Jacopo,

This is a good question. I have tried this option once, and I was not satisfied with what I obtained. Perhaps someone from COMSOL technical support can also contribute to this answer.

Here is my best guess:

Normally MUMPS keeps all of the resources it needs to solve in in-core memory. So, if you run out of memory (RAM), MUMPS will give you an error message and stop the solution process. For example, say it takes 30 GB to solve a problem with MUMPS and you only have 24 GB of RAM: without this option checked, MUMPS will halt. If it takes 20 GB to solve, MUMPS will solve fine.

So, you check this option. What I think happens is that COMSOL tells MUMPS to place as much as it can onto disk storage to solve the problem, except for what you specifically tell it to retain in in-core RAM. For example, going back to the 30 GB problem: if you check this option and specify 8 GB to be in-core, then 22 GB of the problem's memory would go to disk. I tried this several ways to test this theory, and I am not sure it is correct at all! Also, remember that other parts of COMSOL need RAM to operate too, so MUMPS can't have all of the in-core memory even if you gave it everything.
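To make the split I am describing concrete, here is a minimal sketch in plain Python (not any COMSOL or MUMPS API; the helper function and the numbers are hypothetical) that just does the bookkeeping from the 30 GB / 8 GB example above:

# Minimal sketch, not COMSOL/MUMPS code: bookkeeping for how a fixed
# in-core memory limit might split a solver's total memory need
# between RAM and disk.

def split_memory(total_needed_gb, in_core_limit_gb):
    """Return (in_core_gb, out_of_core_gb) for a given in-core limit."""
    in_core = min(total_needed_gb, in_core_limit_gb)
    out_of_core = max(0.0, total_needed_gb - in_core_limit_gb)
    return in_core, out_of_core

# Hypothetical 30 GB factorization with an 8 GB in-core setting:
in_core, on_disk = split_memory(30.0, 8.0)
print(f"in-core: {in_core} GB, out-of-core: {on_disk} GB")
# -> in-core: 8.0 GB, out-of-core: 22.0 GB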

It has been my experience that using this option slows down the solution process so much that it is not worth it for me. Also, you will be thrashing the disk drive when you use this option. There is a similar option for PARDISO.

My recommendation is to reduce the problem size, purchase more RAM, or add more compute nodes to use parallel processing before using this option.


Posted: 1 decade ago 16 Jan 2012, 15:57 UTC−5
Hi James,

Thanks for your reply.

What I actually think happens without the out-of-core option is that when the solver occupies all of the RAM, it starts swapping data to the HD, which slows the simulation down horribly (you can normally see this in the task manager: the RAM is fully occupied and the processors run at almost 0%).

With the out-of-core option the same thing would happen, but in a much more efficient way, since this time the solver "is aware" that it has to use the HD.

Another point that I would like to have clarified is the function of the "allocation factor", because I have apparently noticed that even with the out-of-core option I can run out of RAM and fall back to the first case above.

Any idea?

Many thanks

J

Jim Freels mechanical side of nuclear engineering, multiphysics analysis, COMSOL specialist


Posted: 1 decade ago 16 Jan 2012, 16:26 UTC−5
Jacopo,

The difference in behavior could be due to the operating systems. I run COMSOL exclusively on Linux servers/clients and have little to no experience running it on Windows systems. What I have observed on Linux is that when the memory is exceeded using MUMPS, it does not use virtual memory and does not swap to disk; it simply prints an error message to the screen and stops execution. This is contrary to the PARDISO solver, which does swap to hard disk using the operating system's virtual memory via the swap drive.

I think the allocation factor is a bit different. The default is 1.2; when more memory is needed, it increases to (1.2)^2 = 1.44, then (1.2)^3 = 1.728, and so forth until there is enough memory allocated to store the full matrix. I have noticed that as a problem becomes more complex, the Jacobian matrix becomes less sparse, and this allocation factor must be increased to accommodate it. You can refine the allocated memory yourself by changing the default. For example, I have used a value of 1.3 on some of my problems, and it did not automatically increase to 1.44 because 1.3 was enough, while the default of 1.2 was too small.
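To illustrate what I mean (this is only my understanding, sketched in plain Python with hypothetical numbers, not how COMSOL actually implements it), the reallocation amounts to multiplying the initial memory estimate by successive powers of the allocation factor until the matrix fits:

# Sketch of my understanding of the allocation factor, not COMSOL's code:
# grow the allocated memory by repeated multiplication with the factor
# until it covers what the factorization actually needs.

def allocate(estimate_gb, needed_gb, factor=1.2, max_tries=10):
    """Return the memory (GB) reserved after growing by 'factor' as needed."""
    allocation = estimate_gb * factor            # first attempt: estimate * 1.2
    tries = 1
    while allocation < needed_gb and tries < max_tries:
        allocation *= factor                     # estimate * 1.2^2, * 1.2^3, ...
        tries += 1
    return allocation

# Hypothetical numbers: a 10 GB initial estimate that really needs 13 GB.
print(allocate(10.0, 13.0, factor=1.2))   # 14.4 -> had to grow to 1.2^2 = 1.44
print(allocate(10.0, 13.0, factor=1.3))   # 13.0 -> 1.3 was already enough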

I don't think that changing the allocation factor will affect how much hard disk space is allocated for the solver. I think it affects the overall amount of memory (disk + RAM) that is required to load the system matrix.


Posted: 1 decade ago 16 Jan 2012, 16:58 UTC−5
What I particularly wonder (talking about Win and Mac) is what the "allocation factor" specifically represents. I understand it is a sort of indication of how much RAM the solver needs, but I'm interested in knowing exactly how it works. For example, it seems that if my model is big enough I can run out of RAM even when I use an out-of-core solver (e.g. MUMPS), and this happens because the allocation factor increases a lot.

J
