@ConnorMcCormick My guess would be, if it resolves negatively, that none of the automatic differentiation packages quite fit the bill (with desired functionality being split across them).
But OP is using Julia for a substantial project, so I think they'll get drawn in by its design (particularly as a scientific programmer who uses Fortran). I studied Physics (and learnt Fortran) and subsequently never enjoyed using Python; using Julia was a panacea for all my programming language gripes.
@finnhambly @ConnorMcCormick Just FYI, I think I'm leaning weakly to YES. (I have decided that I shouldn't place further trades on this market, so I'm sticking with my net NO position. I made it before I thought carefully about under what circumstances self-trading was sensible.)
Factors against:
- I've grown used to working with jax's autodiff and the surrounding ecosystem.
- The jax community is exceptional.
- The large startup time is driving me batty.
- The Julia ecosystem generally feels extremely "heavy" (in the sense that I can tell there are more moving parts than I can justify or understand), which I don't like, even if I don't have a great rational reason for opposing it.
Factors for:
- Jax's jit compilation time is also pretty bad; I'm expecting that Julia scales better to large programs than jax does.
- Python+jax always feels like metaprogramming: I'm not writing the (numerical) program, I'm writing a program which will generate the numerical program. Fine, I can do this, but it's not so fun, and explaining this way of thinking to my co-workers is difficult.
- The Julia ecosystem seems to be improving faster than the Python ecosystem.
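The metaprogramming feel is easy to see in a few lines: inside a jit-compiled function you're handed abstract tracers rather than numbers, and the Python code runs once to record the numerical program. A minimal sketch (the function `f` here is illustrative only):

```python
import jax

@jax.jit
def f(x):
    # At trace time, x is an abstract Tracer, not a float; this line runs
    # once while jax records the program, not on every call.
    print("tracing f with:", x)
    return x * 2.0

print(f(1.0))  # first call: traces, compiles, then runs
print(f(2.0))  # second call: reuses the compiled program, no re-trace
```

The trace-time print fires only on the first call, which is exactly the "program that generates the program" split described above.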
I only need to replace a majority of my Python use case for this to resolve YES. Right now I do a lot of autodiff-heavy stuff in Python+jax (heavy enough that getting TensorFlow or PyTorch to do the same thing was not practical), but that's not quite the majority of what I use numerical Python for. As long as "take the first derivative" doesn't cause undue headaches, Julia can replace the majority. In particular, this means that the confusing state of Julia autodiff packages doesn't prevent me from resolving YES.
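For scale, "take the first derivative" is a one-call affair in jax today; this is roughly the bar a Julia autodiff package would need to clear. A minimal sketch (the toy loss is illustrative only):

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Toy scalar objective, a stand-in for the real autodiff-heavy work.
    return jnp.sum(w ** 2)

grad_loss = jax.grad(loss)  # first derivative, one call
print(grad_loss(jnp.array([1.0, 2.0, 3.0])))  # gradient of sum(w^2) is 2w
```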
One thing to note: if it's possible for me to get Julia working on my AMD graphics card, then this market will almost certainly resolve YES. I hope to have time to try AMDGPU.jl in a couple of days.
Feel free to suggest ways to make the large startup time more manageable. I'm vaguely aware that it should be possible to reload a package from the REPL, and I guess that should be a lot faster? I just haven't found a workflow that really makes sense, other than my usual:

```shell
vim source.jl
# (make many edits, then hit Ctrl-Z to suspend vim)
./source.jl   # and wait a long time
# oh, there's a bug
fg
# repeat ad nauseam
```
@ScottLawrence Oh yeah, the startup time is annoying in that situation; might be worth considering https://github.com/dmolina/DaemonMode.jl? (I've never used it.)
You can use Revise.jl to keep a package updated while you edit it (though changes to struct definitions still require restarting the REPL).
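A rough sketch of that Revise workflow, assuming the code lives in a package (`MyPackage` and its `main` entry point are hypothetical names):

```julia
julia> using Revise        # load Revise before the package you're editing
julia> using MyPackage     # hypothetical package under development
julia> MyPackage.main()    # run it
# ...edit the package's source files in vim, then simply call again:
julia> MyPackage.main()    # Revise has already applied the edits
```

The point is that the expensive startup (and package compilation) happens once per session, not once per edit.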
I'm surprised you're using vim and re-running the whole file while editing in either case; editing in VSCode with the integrated REPL makes development much quicker and easier IME. Even if the startup time weren't an issue, you get to edit and re-run individual functions without having to run everything else that comes before.
If it's because you're working on your cluster, you might be able to connect to it through VSCode (via SSH or the Docker extension) and edit your code interactively with the REPL, which should make it easier to avoid bugs before running the program.