
I like when I see people cite LLMs as a source because it’s an easy heuristic to just dismiss them.


Likewise, and that's despite me finding LLMs impressive.

I only trust LLMs for a first draft, where I can actually fact-check everything, or light copy-editing for tone and style.

I wouldn't expect a fresh graduate to be dotting all the i's and crossing all the t's on their research; and as LLMs are like a fresh graduate at coding, I assume they're like that at everything else, too.

Useful, sure, but not what I'd call a "high quality source".


Why? Do you dismiss people who use Google? Or Wikipedia? Or Britannica? They are all just different sources of info with strengths and weaknesses.


LLMs lie far, far more often. It would take a year or more of good experiences for me to start reconsidering my bad experiences with any technical topic outside of programming.

I similarly dismiss people's opinions that rely on LLMs to inform them on important subjects, rather than using them as a search engine to find more reliable sources.

You can't trust a person that educated themselves with an LLM, because they have filled their head with bad foundations. That's very difficult to correct later on.


Having worked in the private satellite industry, I would dismiss most of the information on Google, Wikipedia and Britannica on this topic, yes. Most of the information online about satellite operations is very, very wrong.


It’s funny — I should’ve just said I work in the satellite industry and everyone would’ve accepted my ballpark :p


Yes, if you lie people will be deceived. Good to have this in your comment history I suppose.


Not really? If one seeks basic information, these are reasonable sources, but they are not authoritative, nor are they deep enough to generate a rich understanding of most phenomena.

I was taught since at least high school that you don’t cite encyclopedias as sources because they are not sources - they are summaries of sources.

LLMs are similar but worse. They are a statistical approximation of truth, without a concept of truth. They are fuzzing sources that you and I have access to already. I find calling that hand off of agency and responsibility “being informed” deeply disturbing.



