Glad to hear. I've found it's a very welcoming community.
I'll warn you that Julia's ML ecosystem has its biggest competitive advantage on "weird" kinds of ML: lots of custom gradients and kernels, integration with other pieces of a simulation or diffeq, etc.
If you just want to throw some tensors around and train an MLP, you'll certainly find more rough edges than you would in PyTorch.
If I wanted to get into research ML, I'd pick Julia, no doubt. It supports both the conventional style, where we throw tons of parameters at the problem, and a more nimble style where we can differentiate and train through ordinary functions.
Combine that with all the cutting-edge applied math packages often being automatically compatible with the autodiff and GPU array backends, even when the library authors didn't plan for it... it's a recipe for a lot of interesting possibilities.
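To make "training over ordinary functions" concrete, here's a tiny sketch assuming the Zygote.jl autodiff package is installed (the function and hyperparameters are made up for illustration):

```julia
using Zygote  # reverse-mode autodiff over plain Julia code

# An ordinary Julia function: no tensor types, no framework-specific layers.
loss(a) = (sin(a) + a^2 - 3.0)^2

# Zygote differentiates plain functions directly, so we can run
# gradient descent on the scalar parameter `a` itself.
function fit(a; steps = 100, lr = 0.05)
    for _ in 1:steps
        g = gradient(loss, a)[1]  # d(loss)/da at the current value
        a -= lr * g
    end
    return a
end

a_opt = fit(1.0)
```

The point is that `loss` is just code; nothing about it was written "for" the autodiff, which is exactly why third-party applied math packages often compose with it for free.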
But all the normal marketing words apply too: in my opinion it's fast, expressive, and has particularly good APIs for array manipulation.