It was touched 9 years ago, but maybe you have ported it to current standards. I don't think we had multithreading at that time, only multiprocessing.
Is your Julia implementation available somewhere? (Sorry if it is in your paper and I missed it.)
I vaguely remember that in the past, working with threads led to some additional allocations (compared to the serial code). Maybe this is also biting us here?
As far as I know the code was ported to use FLoops.jl (the @floop macro), with minor optimisations in addition to that.
I think it's quite possible that it's an allocation issue; that's something we're looking into, although I don't have any specific results for Julia yet.
For the datasets I tried to access (e.g. the full-disc image in visible wavelengths, MTG 0 degree), it is sufficient to register at EUMETSAT to get a username and password. The eumdac Python tool is probably the easiest way to access the data:
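A rough sketch of the workflow, as I recall it from the eumdac documentation (the collection and product IDs below are placeholders, not the real MTG ones; check `eumdac --help` and `eumdac describe` for the actual identifiers and flags):

```shell
# Store the API credentials from the EUMETSAT portal once
eumdac set-credentials <consumer-key> <consumer-secret>

# List products in a collection for a time window
eumdac search -c <collection-id> -s 2024-01-01T00:00 -e 2024-01-01T01:00

# Download a specific product (arrives as a zip of NetCDF files)
eumdac download -c <collection-id> -p <product-id>
```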
(If you do not want to use Python, the --debug option is quite useful to see exactly the request being made. The output is either some JSON metadata or a large zip containing the NetCDF data.)
Unfortunately, MathWorks is quite a litigious company. I guess you are aware of MathWorks versus AccelerEyes (now the makers of ArrayFire) or Comsol.
In our department, we mostly stopped using MATLAB about 7 years ago, migrating to Python, R, or Julia. Julia fits the "executable math" niche quite well for me.
Check out PythonCall.jl and juliacall (on the Python side). Not to mention that you can now literally write Python wrappers for compiled Julia libraries like you would for C++ ones.
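A minimal sketch of the juliacall side, assuming `juliacall` is installed (`pip install juliacall`; it fetches a Julia binary on first use, so this is not runnable offline):

```python
import numpy as np
from juliacall import Main as jl  # starts an embedded Julia runtime

xs = np.arange(1.0, 6.0)  # numpy array [1.0, 2.0, 3.0, 4.0, 5.0]

# The numpy array is handed to Julia and Julia's sum runs on it
total = jl.sum(xs)

# Julia code can also be evaluated directly as strings
jl.seval("double(x) = 2x")
doubled = jl.double(3.0)
```

This is the direction the parent comment describes: a numpy array goes in, a Julia result comes back, without writing any glue code by hand.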
> you can literally write python wrappers of Julia compiled libraries like you would c++ ones
Yes, please. What do I google? Why can't julia compile down to a module easily?
No offense, but once you learn to mentally translate between whiteboard math and numpy, it's really not that hard. And if you were used to MATLAB before MathWorks added a JIT, you were doing the same translation to vectorized operations, because loops are dog slow in MATLAB (incidentally, Octave is much better than MATLAB syntax-wise).
And again, Python has numba and maybe Mojo, etc., because Julia refused to fill the gap. I don't understand why there's so much friction between Julia and Python. You should be able to trivially throw a numpy array at Julia and get a result back. I don't think the Python side of this is holding things back. At least back in the day there was a very anti-Python vibe from Julia, and an insistence that all the things should be re-implemented in Julia (web servers, etc.) because Julia was out to prove it was more than a numerical language. I don't know if that's changed, but I doubt it. Holy wars don't build communities well.
>> you can literally write python wrappers of Julia compiled libraries like you would c++ ones.
> Yes, please. What do I google? Why can't julia compile down to a module easily?
That said, Julia's original design focused on just-in-time rather than ahead-of-time compilation, so the AOT path (e.g. via PackageCompiler.jl) is still rough.
> I don't understand why there's so much friction between julia and python. You should be able to trivially throw a numpy array at julia and get a result back.
I use the command-line tool arduino-cli (with a plain Makefile) to compile and upload the code (obviously usable from any editor). It also has a --verbose mode that shows exactly what is being executed.
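A minimal sketch of that workflow; the board FQBN and serial port are examples for an Uno and will differ per setup:

```shell
# Compile the sketch in the current directory, with --verbose
# printing the underlying compiler/toolchain invocations
arduino-cli compile --fqbn arduino:avr:uno --verbose .

# Upload the compiled sketch over the given serial port
arduino-cli upload --fqbn arduino:avr:uno -p /dev/ttyACM0 .
```

Wrapping these two commands in Makefile targets is enough to build and flash from any editor.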
But I have heard a lot about PlatformIO, so I am wondering what its benefits are (besides the integration with VS Code; as an Emacs user, VS Code does not work for me).
I think PlatformIO's selling point is targeting multiple boards via its config file. That, and you can use an actual editor instead of the Arduino "IDE", although I'm not a fan of VS Code anymore either.
I also think they have some testing features built in, though I never delved too deeply.
As an experiment, I would be interested to see somebody make a 1-based Python list-like data structure (or a 0-based R array), to check how many third-party (or standard-library) functions would no longer work.
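A minimal sketch of what such an experiment could start from: a hypothetical `OneBasedList` wrapper (the class name and its restriction to integer indices are my own choices, not anything standard):

```python
class OneBasedList:
    """A 1-based, list-like wrapper for the experiment described above."""

    def __init__(self, items=()):
        self._data = list(items)

    def _shift(self, i):
        # Map a 1-based index to the underlying 0-based position
        if not isinstance(i, int):
            raise TypeError("only integer indices supported in this sketch")
        if i == 0:
            raise IndexError("index 0 is invalid in a 1-based list")
        return i - 1 if i > 0 else i  # keep Python's negative-index semantics

    def __getitem__(self, i):
        return self._data[self._shift(i)]

    def __setitem__(self, i, value):
        self._data[self._shift(i)] = value

    def __len__(self):
        return len(self._data)

    def __iter__(self):
        return iter(self._data)
```

Functions that merely iterate (`sum`, `max`, `sorted`) keep working, but anything that computes indices itself, or round-trips through `list(...)` and indexes from 0, silently disagrees with the 1-based view — which is roughly what the experiment would measure.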
I guess that Mojo's interoperability with Python is a bit better. But on the other hand, PythonCall.jl (which allows calling Python from Julia) is quite good and stable. In Julia you have quite good ML frameworks (Lux.jl and Flux.jl); I am not sure there are Mojo-native ML frameworks that are similarly usable.
https://github.com/JuliaParallel/rodinia/tree/master/julia_m...