

My shameful secret is admitting that LLMs are great for things like getting comfortable with a new programming language. They’re generally trained on the same publicly available samples as those courses, and the conversational back-and-forth is great for rediscovering concepts you’ve forgotten the technical terms for (e.g. “How would I do this in Python: [Java code]”).
Vibe coding sucks, but walking through some examples with an LLM and a REPL can save hours of navigating docs or Hello World blog posts.
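To make that concrete, here’s a hypothetical version of that kind of question and the answer you’d typically get back, using made-up sample data. Given a Java stream pipeline like `names.stream().filter(n -> n.length() > 3).map(String::toUpperCase).toList()`, the idiomatic Python it maps to is a comprehension:

```python
names = ["Ada", "Grace", "Linus", "Guido"]

# Java's filter/map stream pipeline collapses into one list comprehension
result = [n.upper() for n in names if len(n) > 3]
print(result)  # ['GRACE', 'LINUS', 'GUIDO']
```

Knowing the target construct is called a “list comprehension” is the hard part; once you have the term, the docs are easy to navigate.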

Crazy that companies will do this shady stuff with client-side code. At least it was slightly obfuscated at first, but it’s just incompetent fraud to leave it so obvious that a self-professed non-software-engineer (though clearly a smart guy) can read it and deduce what’s happening. Throw a tiny bit of random noise into the stepdown logic and it becomes much harder to find and reproduce as proof.
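A minimal sketch of what I mean, with invented names and numbers (nothing here is from the actual code in question). A deterministic cap produces the same suspicious flat line on every run; jittering it a few percent makes repeated measurements disagree just enough that a simple before/after diff no longer shows an obvious ceiling:

```python
import random

CAP = 100.0  # hypothetical stepped-down ceiling


def stepped_down(real_value: float) -> float:
    """Naive stepdown: identical output every run, trivially detectable."""
    return min(real_value, CAP)


def noisy_stepped_down(real_value: float) -> float:
    """Same cap, but jittered +/-3% so repeated runs never line up exactly."""
    jitter = random.uniform(0.97, 1.03)
    return min(real_value, CAP * jitter)
```

Call `stepped_down(150)` twice and you get 100.0 both times; call `noisy_stepped_down(150)` twice and the values differ, which is exactly what makes the fraud harder to demonstrate (and, to be clear, still fraud).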