Big Computing (ish)

stenman

Beta member
Messages: 2
Location: Canada
Hello all,

This is my first time on this site, so I am looking forward to hearing from all of you. I have come here in search of an answer to one specific question:

I have to conduct some modeling. I do this by running a huge number (13 million+) of instances of one program, with small input changes for each instance. The program takes about 1 hour to run on a single core and needs no more than 2 GB of RAM. I then need to compile and plot all the results at the very end of the sequence. Since each instance is completely independent of the others, I have no need for multi-core processing within a single run; the workload is just many single-core jobs.
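
To make that concrete, here is a rough sketch (in Python) of the kind of driver loop I have in mind for one multi-core machine. The script name, the parameter, and the output paths are placeholders, not my actual setup:

```python
# A minimal sketch, assuming an OpenSees binary on the PATH and a
# hypothetical input script model.tcl that takes its varying parameter
# and an output path as command-line arguments.
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def run_instance(i: int) -> int:
    """Run one independent OpenSees instance on one core."""
    param = 0.001 * i  # stand-in for the small input change per instance
    cmd = ["OpenSees", "model.tcl", str(param), f"results/run_{i}.out"]
    return subprocess.run(cmd, capture_output=True).returncode

if __name__ == "__main__":
    Path("results").mkdir(exist_ok=True)
    # One worker per core; the instances never talk to each other.
    with ProcessPoolExecutor() as pool:
        codes = list(pool.map(run_instance, range(1000)))  # small test batch
    print(f"{codes.count(0)} of {len(codes)} runs finished cleanly")
```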

My restrictions are:

- The cheaper the better, with a maximum budget of approximately $5,000
- My own coding skills are still what I would consider basic, so I would prefer to use pre-developed systems rather than write my own
- I have access to several computer labs full of individual machines (at a university) but do not wish to run each computer by hand; remote access is obviously better (see the SSH sketch after this list)
- The simulation itself is a Tcl-based program called OpenSees; MATLAB is only used to compile and plot the results
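
Since the labs came up: a crude way to use them without touching each machine would be to fan the runs out over SSH. This is only a sketch; it assumes passwordless SSH and an OpenSees install on every machine, and the hostnames are made up:

```python
# A rough sketch of fanning runs out over SSH, assuming passwordless
# SSH and an OpenSees install on every lab machine. The hostnames and
# per-host core count are placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

HOSTS = ["lab-pc-01", "lab-pc-02", "lab-pc-03"]
CORES_PER_HOST = 4

def run_remote(i: int) -> int:
    host = HOSTS[i % len(HOSTS)]  # spread runs round-robin across the lab
    cmd = f"OpenSees model.tcl {0.001 * i} results/run_{i}.out"
    return subprocess.run(["ssh", host, cmd]).returncode

with ThreadPoolExecutor(max_workers=len(HOSTS) * CORES_PER_HOST) as pool:
    codes = list(pool.map(run_remote, range(100)))  # small test batch
print(f"{codes.count(0)} of {len(codes)} remote runs finished cleanly")
```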


Basically, my understanding is that the most efficient way of handling this operation is to have many individual cores available, each with a token amount of RAM. I was thinking of a small server, or a computer lab with an SSI (single system image) type of setup. Any comments or suggestions are greatly appreciated! I am open to any solution.
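
For the compile-and-plot step at the end, my thinking is the per-run outputs would just be stitched into one file that MATLAB can read; something along these lines (the file layout is an assumption):

```python
# Hypothetical final step: stitch the per-run outputs into one CSV
# that MATLAB can load (e.g. with readmatrix) and plot. Assumes each
# run wrote one line of comma-separated results to results/run_<i>.out.
from pathlib import Path

with open("all_results.csv", "w") as out:
    for f in sorted(Path("results").glob("run_*.out")):
        line = f.read_text().strip()
        if line:
            out.write(line + "\n")
```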

Regards
 
What about using something like Amazon Web Services? That way you can spin up a server only when you need it and not have it sitting there when you're not using it. $5k would pay for a lot of uptime on a server.
 

I have considered this, but the ideal would be to reuse the system for many years to come. We were hoping $5k would give us something with at least a 5-6 year lifespan. Furthermore, a single analysis takes approximately 1.15 hours to run on any one core, and we require more than 13 million of them. I think an in-house server ends up being cheaper that way?
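
For scale, the raw arithmetic on those figures looks like this (the 64-core machine below is just an assumption for illustration; the other numbers are the ones above):

```python
# Back-of-the-envelope total compute, using the figures in this thread;
# the 64-core machine is only an illustrative assumption.
runs = 13_000_000
hours_per_run = 1.15
core_hours = runs * hours_per_run          # ~15 million core-hours
cores = 64
years = core_hours / (cores * 24 * 365)    # years of 24/7 runtime
print(f"{core_hours:,.0f} core-hours is about {years:.1f} years on {cores} cores")
```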

Any hardware suggestions?
 
I'm afraid I'm not too clued up on the enterprise side of this lark; I specialise more in personal use and dabble in small business networks. However, I would agree that if you're utilising the server to its full capacity, then in the long run it's probably cheaper to have your own!

I'll leave it to somebody else to make hardware recommendations; I don't think I know enough in this area, sorry.
 