[Python-Dev] Should standard library modules optimize for CPython?
Sturla Molden
sturla.molden at gmail.com
Tue Jun 3 17:13:11 CEST 2014
More information about the Python-Dev mailing list
- Previous message: [Python-Dev] Should standard library modules optimize for CPython?
- Next message: [Python-Dev] Should standard library modules optimize for CPython?
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Stefan Behnel <stefan_ml at behnel.de> wrote:

> Thus my proposal to compile the modules in CPython with Cython, rather
> than duplicating their code or making/keeping them CPython specific. I
> think reducing the urge to reimplement something in C is a good thing.

For algorithmic and numerical code, Numba has already proven that Python can be JIT compiled to speeds comparable to C at -O2. For non-algorithmic code, the speed determinants usually lie outside Python (e.g. the network connection). Numba is becoming what the "dead swallow" should have been.

The question is rather: should the standard library use a JIT compiler like Numba? Cython is great for writing C extensions while avoiding all the details of the Python C API, but for speeding up algorithmic code, Numba is easier to use.

Sturla
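[For readers unfamiliar with Numba's usage model, here is a minimal sketch of the kind of tight numeric loop it targets. The `dot` function and its inputs are illustrative, not from the thread; the sketch assumes NumPy is available and falls back to plain Python if Numba is not installed, so it runs either way.]

```python
import numpy as np

try:
    from numba import njit  # Numba JIT-compiles the decorated function
except ImportError:
    def njit(func):  # fallback: run as ordinary interpreted Python
        return func

@njit
def dot(a, b):
    # A tight numeric loop: the classic case where a JIT compiler
    # like Numba can approach the speed of optimized C.
    total = 0.0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

print(dot(np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])))  # 32.0
```

The point of the decorator-based model is that the source stays ordinary Python; the same file is importable whether or not the JIT is present.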