On 04/06/2020 15:44, Tom via list wrote:
> I've been looking for a very low spec MB that would provide a slot or
> three for popping in top of the range graphics cards for AI and stuff
> but I get the impression they seem to think you'll want some serious CPU
> too which I dont - just something that can support the graphics card
> really. Any ideas?

Another one from my area it seems...
Firstly, re-evaluate entirely, because this is a terrible idea. You're
going to have to clarify "top-end GPU" as well, because I don't think you
mean that at all. How many £10k Nvidia Teslas are you planning to buy,
exactly? :]

If we presume that by "top-end GPU" you really mean "basic crappy consumer
gaming card", then how many £1100 2080 Ti cards are you planning to buy? :]

My point is that to do this even slightly seriously, your
motherboard+CPU bill is going to be a drop in the ocean compared to
your GPU bill - not to mention your electricity bill.

I've built and currently run a lot of these things, from video editing
workstations to cryptocurrency mining rigs and servers stuffed full of
dedicated compute cards. Even though you might think the motherboard is
the least important part, let me assure you it most definitely is not -
it's the single point of failure that everything else goes through!

If you're actually serious about this, you need to rethink completely and
first define your specific target workload. What types of maths and
machine learning are you planning to use, and to what end? This will
inform your minimum barrier to entry: there's no point buying a
second-hand mining rig off eBay full of tired old Titan X GPUs only to
find that they can't handle anything like the FP64 throughput you're
going to need.
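
For what it's worth, a quick and dirty way to gauge that is to time a
big FP32 matrix multiply against the same thing in FP64 - consumer
cards fall off a cliff on the latter. A rough sketch with PyTorch
(assuming a CUDA build of torch is installed; the sizes and iteration
counts are just illustrative):

    # Rough FP32 vs FP64 matmul throughput check (PyTorch, CUDA build assumed).
    import time
    import torch

    def gflops(dtype, n=4096, iters=10):
        a = torch.randn(n, n, device="cuda", dtype=dtype)
        b = torch.randn(n, n, device="cuda", dtype=dtype)
        torch.cuda.synchronize()
        start = time.time()
        for _ in range(iters):
            a @ b
        torch.cuda.synchronize()
        # An n x n matmul is roughly 2*n^3 floating point operations.
        return 2 * n**3 * iters / (time.time() - start) / 1e9

    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))
        print("FP32: %.0f GFLOPS" % gflops(torch.float32))
        print("FP64: %.0f GFLOPS" % gflops(torch.float64))
    else:
        print("No CUDA device visible - nothing to test.")

On a gaming card you'll typically see FP64 come out at a tiny fraction
of FP32, whereas the Tesla-class cards are built for it - which is
exactly the gap that matters if your workload is double-precision maths.
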
I take it you've also finally reached the point of realisation that none
of those old Kickstarter vapourware ASICs and the like you were so
interested in were ever going to work out? They were doomed from the
start and outmoded before they ever arrived in anybody's hands (if
anyone ever even got them). They were super interesting and exciting for
a few minutes, but the writing was on the wall from the start there, I'm
afraid - it was clear that Linux+GPUs were the future.

Unfortunately this probably comes across as very negative - it's not
meant to, that's just my terrible personality, I'm afraid!

You do really need to restart from scratch though, and let's be
realistic: for us mere mortals without massive company budgets to spend,
the defining bound of this task is to decide how many thousands of
pounds you want to budget for it - everything else is secondary if you
can't afford it in the first place, after all.

Docker/Kubernetes + Nvidia Container Runtime support + GPUs is basically
my favourite thing at the moment. I've been playing with DeOldify to
artificially recolour black and white images via ML on my home systems,
and a much more grown-up but similar stack to modify video feeds in real
time at a current $job: it's been a hell of a fun rabbit hole to go down.
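
If you do go down that road, the first thing worth doing inside a
container started with the GPU runtime (e.g. docker run --gpus all ...
with the Nvidia Container Toolkit installed on the host) is to check
the cards are actually visible. Something like this does the job
(assuming the nvidia-ml-py / pynvml bindings are in the image - that
bit's my choice, not part of any particular stack):

    # Sanity check that the Nvidia runtime is exposing GPUs to the container.
    import pynvml  # ships in the nvidia-ml-py package

    pynvml.nvmlInit()
    count = pynvml.nvmlDeviceGetCount()
    print("GPUs visible:", count)
    for i in range(count):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print("  %d: %s, %.1f GiB" % (i, name, mem.total / 2**30))
    pynvml.nvmlShutdown()

If that prints zero devices, the runtime isn't wired up yet and no
amount of motherboard shopping will help.
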
-- 
The Mailing List for the Devon & Cornwall LUG
https://mailman.dcglug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/listfaq