6toda9 · Posted June 24, 2017
Hi! I currently run a small data center with around 48 blade servers that I am not using. Each has two Intel Xeons (I forget the specific model number, but I can check later) with 6 cores / 12 threads apiece, and each blade carries around 16 GB of ECC DDR3 RAM. I also have dual WAN at 500 Mb/s up and down and some kick-ass enterprise network switches. Just out of curiosity: how many bots can I run, and how much money can I make? If anyone is interested, let me know; I'm happy to work out a partnership where I provide the computing power for a small cut.
Donald Trump · Posted June 24, 2017
You're probably aware of CPUBenchmark; that's what you should use to measure performance. Each script differs in how many instances you can run on a server, so your best bet is to follow the CPU-benchmark approach and work it out with maths.
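The "work it out with maths" step above can be sketched in a few lines. This is a rough estimate only, and every number in it is a hypothetical placeholder: you would measure your server's actual benchmark score and profile one instance of your own script to get the per-bot cost before trusting the result.

```python
# Rough per-blade capacity estimate following the CPU-benchmark approach
# described above. All scores and footprints below are hypothetical
# placeholders; measure your own hardware and script to get real numbers.

def bots_per_server(server_cpu_score, per_bot_cpu_score,
                    server_ram_gb, per_bot_ram_gb):
    """Return how many bot instances one server can plausibly host,
    taking the lower of the CPU-bound and RAM-bound limits."""
    cpu_limit = server_cpu_score // per_bot_cpu_score
    ram_limit = int(server_ram_gb // per_bot_ram_gb)
    return min(cpu_limit, ram_limit)

# Example: assume two 6-core Xeons scoring ~7000 each (hypothetical),
# and a bot that costs ~200 benchmark points and ~0.5 GB of RAM.
print(bots_per_server(14000, 200, 16, 0.5))  # prints 32 (RAM-bound)
```

Note that in this made-up example the CPU could carry 70 instances but the 16 GB of RAM caps it at 32, which is why the RAM question comes up later in the thread.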
Antonio Kala · Posted June 24, 2017
So each server has 12 cores / 24 threads and 16 GB of DDR3, and you have 48 of these?
6toda9 (Author) · Posted June 27, 2017
Quote: "So each server has 12 cores / 24 threads and 16 GB of DDR3, and you have 48 of these?"
Yes, I have 48 of these, and I can get a lot more if it is really worth it.
Antonio Kala · Posted June 27, 2017
Quote: "Yes, I have 48 of these, and I can get a lot more if it is really worth it."
That seems like a pretty poor RAM-to-CPU ratio; for 12 cores / 24 threads you would want 32 GB minimum.
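To see why the ratio matters, here is a small sketch of the bottleneck. The per-bot RAM footprint of 1 GB is a hypothetical assumption for illustration, as is the rule of thumb of one instance per thread as the CPU ceiling.

```python
# Illustrating the RAM bottleneck: with 24 threads but only 16 GB,
# RAM runs out before the CPU does. The 1 GB per-bot footprint is a
# hypothetical figure; profile your own script for a real value.

THREADS_PER_SERVER = 24
RAM_GB_PER_BOT = 1.0  # hypothetical per-instance footprint

def capacity(ram_gb):
    cpu_limit = THREADS_PER_SERVER           # assume one bot per thread, max
    ram_limit = int(ram_gb / RAM_GB_PER_BOT)
    return min(cpu_limit, ram_limit)

print(capacity(16))  # prints 16: RAM-bound at the current 16 GB
print(capacity(32))  # prints 24: with 32 GB the CPU becomes the limit
```

Under these assumptions, upgrading each blade from 16 GB to 32 GB moves the bottleneck from RAM back to the CPU, which is the point being made about wanting 32 GB minimum.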
6toda9 (Author) · Posted July 10, 2017
Quote: "That seems like a pretty poor RAM-to-CPU ratio; for 12 cores / 24 threads you would want 32 GB minimum."
My original use for these servers was data mining; that's why they don't have a lot of RAM.
Archived
This topic is now archived and is closed to further replies.