Hacker News

> Something has to give...

Is training compute interchangeable with inference compute or does training vs. inference have significantly different hardware requirements?

If training and inference hardware is pooled together, I could imagine a model where training simply fills in any unused compute at any given time (?)
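That "fill in unused compute" model is essentially priority scheduling on a shared accelerator pool: inference traffic preempts, and training steps soak up whatever is left. A minimal sketch of the idea (all names here are illustrative, not any real system's API):

```python
# Hypothetical sketch of "training fills unused compute":
# inference requests always take priority; a training step runs
# only when no inference work is queued.
from collections import deque

class OpportunisticScheduler:
    def __init__(self):
        self.inference_queue = deque()
        self.training_steps_done = 0
        self.inference_served = 0

    def submit_inference(self, request):
        self.inference_queue.append(request)

    def tick(self):
        """One scheduling quantum on the shared accelerator pool."""
        if self.inference_queue:
            self.inference_queue.popleft()   # serve revenue traffic first
            self.inference_served += 1
            return "inference"
        self.training_steps_done += 1        # otherwise, soak up idle compute
        return "training"

sched = OpportunisticScheduler()
for r in ["req1", "req2"]:
    sched.submit_inference(r)
results = [sched.tick() for _ in range(4)]
print(results)  # ['inference', 'inference', 'training', 'training']
```

The catch, as the reply below notes, is that real training steps aren't this preemptible: checkpointing and restoring a large synchronous training job has real cost, so the quantum can't be arbitrarily fine-grained.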



Hardware can be the same but scheduling is a whole different beast.

Also, if you pull too many resources from training your next model to make inference revenue today, you'll fall behind in the larger race.



