A list of puns related to "Pypy"
I have prepared this simplified code to illustrate something I have noticed.
from timeit import timeit as tic

class Vars:
    def __init__(self, vars):
        self.vars = vars
        # Build "lambda self: self.vars[0].value + self.vars[1].value + ..." as a
        # string, then eval it once so the summation is a single unrolled expression.
        self.func_text = f"lambda self: {' + '.join([f'self.vars[{i}].value' for i in range(len(vars))])}"
        self.func = eval(self.func_text)

    def sum0(self):
        # Summation with a generic loop over the list
        s = 0.0
        for var in self.vars:
            s += var.value
        return s

    def sum1(self):
        # Summation written out explicitly by hand
        return self.vars[0].value + self.vars[1].value + self.vars[2].value

    def sum2(self):
        # Summation via the eval-generated lambda
        return self.func(self)

class Var:
    def __init__(self, value):
        self.value = value

x = Var(2.3)
y = Var(5.19)
z = Var(23.91)
vars = Vars([x, y, z])

N = int(1e8)

def test0():
    s = 0.0
    for i in range(N):
        s += vars.sum0()
    return s

def test1():
    s = 0.0
    for i in range(N):
        s += vars.sum1()
    return s

def test2():
    s = 0.0
    for i in range(N):
        s += vars.sum2()
    return s

def test(func):
    f = func()  # Get function value to confirm answers are the same
    tic(func, number=1)  # Warm up JIT
    t = tic(func, number=1)  # Time func
    print(f, t)
    return f, t

test(test0)
test(test1)
test(test2)
Basically, I have a Var class which stores values (along with other metadata not shown), and a Vars class which stores a list of Var objects. The purpose of the Vars class is to sum the values of each Var in its list. I have found that if the summation is done with a for loop over the items in the list, it takes far longer (~44x) than if the sum is written out explicitly, i.e. when the number of elements is known at programming time (see sum1). One solution is to use a lambda function (see sum2) to write the summation out explicitly but dynamically, which performs just as fast as writing it out by hand.
Here is the output from the code:
3140000005.7001 3.0798800999999996 # Summation in loop
3140000005.7001 0.07164459999999995 # Summation explicit
3140000005.7001 0.07178770000000068 # Summation with Lambda
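As an aside, a stripped-down sketch of the same code-generation trick in isolation may make the comparison easier to follow (the names loop_sum and unrolled_sum are illustrative, not from the script above):

def loop_sum(vals):
    # Generic loop: the number of elements is only known at run time
    s = 0.0
    for v in vals:
        s += v
    return s

# Build "lambda vals: vals[0] + vals[1] + vals[2]" once, so the summation
# becomes a single fixed-length expression by the time it is called.
values = [2.3, 5.19, 23.91]
expr = " + ".join(f"vals[{i}]" for i in range(len(values)))
unrolled_sum = eval(f"lambda vals: {expr}")

assert abs(loop_sum(values) - unrolled_sum(values)) < 1e-12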
So my questions are:
(1) Is this a general rule, i.e. should internal loops be avoided in PyPy?
(2) Is replacing the loop with a lambda ...

Hi, I am using pypy3 on Windows 7, and I just found out it is using 15.9 of the 16 GB of RAM I have. Wow... I literally can't do anything else while it's running. It got stuck just now, probably because it needs more RAM lmao.
Is there anything I can do to limit its memory usage?
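Not from the post, but one knob worth knowing about: PyPy's garbage collector reads a handful of environment variables at startup, including PYPY_GC_MAX, which caps the heap size. A minimal sketch of launching a script with such a cap (the script name and the "8GB" value are placeholders):

import os
import subprocess

# PYPY_GC_MAX must be set before pypy3 starts; it caps the GC-managed heap.
env = dict(os.environ, PYPY_GC_MAX="8GB")
subprocess.run(["pypy3", "my_script.py"], env=env)  # "my_script.py" is a placeholder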
Because the PyPy subreddit is literally dead, I'll ask it here.
The error:
PS Q:\pygame\pypy3.8\Scripts> .\pip3.8 install pygame
Collecting pygame
Using cached pygame-2.1.0.tar.gz (5.8 MB)
Preparing metadata (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: 'Q:\pygame\pypy3.8\pypy3.exe' -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-7oltmffm\\pygame_8d9fdd8adae540bcb29eefc050c1c87a\\setup.py'"'"'; __file__='"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-7oltmffm\\pygame_8d9fdd8adae540bcb29eefc050c1c87a\\setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base 'C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-micvjlpj'
cwd: C:\Users\User\AppData\Local\Temp\pip-install-7oltmffm\pygame_8d9fdd8adae540bcb29eefc050c1c87a\
Complete output (80 lines):
WARNING, No "Setup" File Exists, Running "buildconfig/config.py"
Using WINDOWS configuration...
Making dir :prebuilt_downloads:
Downloading... https://www.libsdl.org/release/SDL2-devel-2.0.16-VC.zip 13d952c333f3c2ebe9b7bc0075b4ad2f784e7584
Unzipping :prebuilt_downloads\SDL2-devel-2.0.16-VC.zip:
Downloading... https://www.libsdl.org/projects/SDL_image/release/SDL2_image-devel-2.0.5-VC.zip 137f86474691f4e12e76e07d58d5920c8d844d5b
Unzipping :prebuilt_downloads\SDL2_image-devel-2.0.5-VC.zip:
Downloading... https://www.libsdl.org/projects/SDL_ttf/release/SDL2_ttf-devel-2.0.15-VC.zip 1436df41ebc47ac36e02ec9bda5699e80ff9bd27
Unzipping :prebuilt_downloads\SDL2_ttf-devel-2.0.15-VC.zip:
Downloading... https://www.libsdl.org/projects/SDL_mixer/release/SDL2_mixer-devel-2.0.4-VC.zip 9097148f4529cf19f805ccd007618dec280f0ecc
Unzipping :prebuilt_downloads\SDL2_mixer-devel-2.0.4-VC.zip:
Downloading... https://www.pygame.org/ftp/jpegsr9d.zip ed10aa2b5a0fcfe74f8a6f7611aeb346b06a1f99
Unzipping :prebuilt_downloads\jpegsr9d.zip:
Downloading... https://pygame.org/ftp/prebuilt-x64-pygame-1.9.2-20150922.zip 3a5af3427b3aa13a0aaf5c4cb08daaed341613ed
Unzipping :prebuilt_downloads\prebuilt-x64-pygame-1.9.2-20150922.zip:
copying into .\prebuilt-x64
...

I'm trying to test out how much PyPy can actually optimize code, so to kill two birds with one stone I'm using it to speed up a small program I wrote to automatically export a batch of PSDs to PNGs so I could preview them all more easily. Installing Pillow was a nightmare, but I think I got it figured out thanks to "Unnamedplayer21". Now, when I save the PSD as a PNG using psd_tools (psd-tools3 from pip), it crashes with "OSError: encoder zip not available". When I look it up I find several similar crashes with different formats, but I don't know how to solve this specific problem. I know my code works when run in base Python, so I'm not sure this is the right subreddit, but it's my best guess for where it should go.
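Not part of the original post, but for context: "encoder zip not available" usually means that the Pillow build in that environment was compiled without zlib support, which PNG writing needs. A quick hedged check, assuming Pillow imports under the PyPy environment in question:

from PIL import features

# True only if this Pillow build was compiled with zlib (the "zip" codec that PNG saving uses).
print(features.check("zlib"))
# features.pilinfo() prints a fuller summary of compiled-in codecs and features.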
Greetings Community,
Until just now I used Python as it comes in the most recent version. Since I have lately been doing some heavy computing (generating specialised random graphs with certain properties, in large numbers), I thought I'd check out PyPy (and later on maybe Cython). Compared to Cython, PyPy seemed to promise a speed-up without any major code changes (except for the few things not supported by PyPy's current version, e.g. passing a *args parameter to a function with further parameters following it).
After installing PyPy and running my project, it asked me to install matplotlib. I guess once I install matplotlib it will ask for every other package that is used, too.
Is there a way to make PyPy see all the packages already installed with pip under regular Python, so that I don't have to maintain duplicates?
(Running Archlinux in case this is of importance)
Thank you very much in advance and kind regards,
WhyNot7891
Hey folks, a while back I had an idea of converting a project idea of mine into a usable package that would be open sourced. Here's my first attempt at creating an actual open-source project. There are still too many issues and a lot of stupid design decisions that I made, which I realise now.
Also, I made sure to make it compatible with PyPy, which I love using! Although I realise PyPy supporting 3.8 would be so great!
Also, I have some open issues, so anyone looking to contribute towards Hacktoberfest is welcome!
Would you guys want me to make a post on the mistakes I learned from while making this package?
https://github.com/Agrover112/fliscopt
I've been hearing a lot about the PyPy project. They claim it is 6.3 times faster than the CPython interpreter on their site.
Whenever we talk about dynamic languages like Python, speed is one of the top issues. To solve this, they say PyPy is 6.3 times faster.
The second issue is parallelism, the infamous Global Interpreter Lock (GIL). For this, PyPy says it can give GIL-less Python.
If PyPy can solve these great challenges, what are its weaknesses that are preventing wider adoption? That is to say, what's preventing someone like me, a typical Python developer, from switching to PyPy right now?
Answer link: https://codehunter.cc/a/python/why-shouldnt-i-use-pypy-over-cpython-if-pypy-is-6-3-times-faster
I've looked around using Google and the search function of this subreddit, but haven't found much conversation around this topic.
Have there been conversations around PyPy and Django?
Thanks
Since PyPy is supposed to make long-running code faster with every run, is it better to run a web framework (Flask/FastAPI) on PyPy?
I'd like to be enlightened by some pro Pythonistas.
It's done!! With some muse from Twitter, my usual slap, tickle and pokery-dashery; https://imgur.com/gallery/DY142ZL and RESULT!! As you can see, 20s vs. 0.23s with the same script used on the blogs.
Now, there is no CFFI or hashlib in this particular build; I used Python 2 for the initial build, and hashlib/tk are a pain there. However, it's a Python 3 equivalent that PyPy uses, so I may now be able to use this build to rebuild itself!
I don't have a GitHub, but have hosted a bzipped tarball up on DropBox; 25.8MB big; https://www.dropbox.com/s/l0hb41swlhgacl0/pypy-377-aarch64.tar.bz2?dl=0
PyPy version 3.7.7 (7.3.4). Built on aarch64. No 32 bit devices to test on.
UPDATE
It seems this did not work on Android Pie, but it does seem to work on Android 11. On Pie, it fails to run due to an unresolved symbol, _Unwind_Backtrace(), which I suspect isn't available in Android 9's libc/bionic implementation. Will test Android 10 (LOS) soon!
UPDATE 2
Doesn't work on Android 10, either. It seems this package will only work with Android 11.
It requires a manual installation at this point, but it is portable, so it can be run from wherever you want. I can't provide any specific support, as Python isn't my area of expertise. I am, however, going to rebuild PyPy with itself to get CFFI and hashlib if possible.
Enjoy!
I have tested the following recursion code on PyPy and CPython:

import time

beg = time.time()

def fib(n):
    if n <= 2:
        return 1
    return fib(n-2) + fib(n-1)

var = fib(35)
end = time.time()

print(var)
print(end - beg)
Results:
0.12 seconds for PyPy 3.7
2.83 seconds for CPython 3.9.5
on MS Windows 10
https://preview.redd.it/pp2dm4tdgo071.png?width=578&format=png&auto=webp&s=7ebe9b8affbc701757c70aedf05e2f3c7f05a300
https://preview.redd.it/c1pvpvzpio071.png?width=642&format=png&auto=webp&s=69c674d620b02fe697f330a4f78d45f73ce6985b
I have ~100 GB worth of Python code from many projects which must be preprocessed and analyzed. To do so, I created some functions that call ast.walk(), as well as some subclasses of ast.NodeVisitor and ast.NodeTransformer. I also use ast.unparse() quite a bit.
It takes ~1 hour to process 5-6 GB in CPython. I thought PyPy would be perfect for speeding up the job, since ast is pure Python code. However, I am seeing negligible speed differences between CPython and PyPy.
Why is PyPy not providing any speed gains? Some profiling revealed that both CPython and PyPy spend a lot of time in deque.extend(). Is this hard for PyPy to optimize?
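For context, here is a minimal sketch of the kind of pass described above (the class and function names are illustrative, not from the original project; ast.unparse() requires a 3.9-level interpreter):

import ast

class NameCollector(ast.NodeVisitor):
    # Collects every identifier that appears in a module.
    def __init__(self):
        self.names = set()

    def visit_Name(self, node):
        self.names.add(node.id)
        self.generic_visit(node)

def preprocess(source):
    tree = ast.parse(source)
    collector = NameCollector()
    collector.visit(tree)
    # In the real job, ast.walk() and NodeVisitor subclasses drive the traversal;
    # ast.walk() is a thin loop over a collections.deque, which is why
    # deque.extend() shows up in profiles.
    return collector.names, ast.unparse(tree)

names, round_tripped = preprocess("x = 1\nprint(x)")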
Been trying to get PyPy compiled in Termux, since my phone has 12 GB of RAM and doesn't cause Termux to OOM. I have gotten quite far, but when it comes to the make process, it fails with https://pastebin.com/wrXsghip
If anyone is able to point me in the right direction to finish the build, I could make this a package for other aarch64 users.
Thanks.
Hey there, I'm having trouble running pygame in PyPy on a Raspberry Pi and was wondering if anyone had any insights. I've read elsewhere that pygame is compatible with PyPy, but I can't seem to get it to work, not on the Pi at least. I get an "ImportError: No module named pygame.base". Any help would be appreciated.
Hopefully this question belongs in this sub. r/conda and r/anaconda weren't related to Python.
Has anyone managed to get scapy and pypy to coexist in Anaconda? I recently wrote some code to compare PCAP files as part of a product test campaign, and while the code works well it runs too slowly to be useful - analyzing a one hour PCAP takes ~8 hours. I've confirmed that I'm CPU bound and not IO limited, and I'm hoping that PyPy can provide a substantial improvement before I start digging into multiprocessing.
Under conda 4.10.1 on Windows 10, conda create -n ethgen_pypy --no-channel-priority scapy pypy
yields
Package python conflicts for:
scapy -> python
scapy -> cryptography -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.9,<3.10.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
pypy -> python[version='3.6.12|3.6.12|3.7.9|3.6.9|3.6.9|3.6.9|3.6.9',build='1_73_pypy|2_73_pypy|3_73_pypy|4_73_pypy|5_73_pypy|0_73_pypy']
Looking at the output, it seems like Python 3.7.9 should be acceptable to both scapy and pypy, so I tried adding python=3.7.9, which yielded vc and vs2015_runtime conflicts. Adding vc=14.1 results in the addition of OpenSSL conflicts.
Package python conflicts for:
python=3.7.9
scapy -> cryptography -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.9,<3.10.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
scapy -> python
pypy ->
python[version='3.6.12|3.6.12|3.7.9|3.6.9|3.6.9|3.6.9|3.6.9',build='0_73_pypy|3_73_pypy|5_73_pypy|4_73_pypy|2_73_pypy|1_73_pypy']
Package vc conflicts for:
scapy -> cryptography -> vc[version='10.*|14.*|9.*|>=14.1,<15.0a0|>=14,<15.0a0|>=9,<10.0a0']
python=3.7.9 -> vc[version='>=14.1,<15.0a0']
vc=14.1
Package openssl conflicts for:
python=3.7.9 -> openssl[version='>=1.1.1g,<1.1.2a|>=1.1.1h,<1.1.2a|>=1.1.1i,<1.1.2a']
scapy -> cryptography -> openssl[version='1.0.*|>=1.0.2o,<1.0.3a|>=1.0.2p,<1.0.3a|>=1.1.1a,<1.1.2a|>=1.1.1g,<1.1.2a|>=1.1.1h,<1.1.2a|>=1.1.1i,<1.1.2a|>=1.1.1j,<1.1.2a|>=1.1.1k,<1.1.2a|>=1.1.1d,<1.1.2a|>=1.1.1c,<1.1.2a|>=1.1.1b,<1.1.2a|>=1.0.2n,<1.0.3a|>=1.0.2m,<1.0.3a|>=1.1.1f,&l
...

I've heard of a JIT compiler called PyPy; it claims to be faster than CPython (interpreted). My current project has over 16k SLOC (source lines of code). Would I have to change parts of my code, use an older version of Python, etc.?
The link to it if you're interested is this, hope you enjoy!
UPDATE: I REMOVED THE SYSTEM METHOD FOR INSTALLING TQDM AND INSTEAD USED SETUP.PY THANK YOU FOR THE FEEDBACK :D
UPDATE 2: INSTEAD OF USING SETUPS AND REQUIREMENTS, I WILL PUT ALL THE MODULES NEEDED TO IMPORT IN THE PACKAGE ITSELF (I WILL CREDIT THE CREATOR) THANK YOU FOR UNDERSTANDING
UPDATE 3: TURNS OUT THE SECOND UPDATE WAS GLITCHY, SO NOW WE ADDED A CONFIRMATION SYSTEM TO INSTALL PACKAGES, INSTEAD OF DOWNLOADING THEM WITHOUT ASKING
IMPORTANT: I HAVE TESTED THE MODULE AND EVERYTHING WORKS FINE, IF YOU HAVE ANY BUGS TO REPORT PLEASE REPORT THEM, (such as module not importing) ANY QUESTIONS WILL BE ANSWERED, THANK YOU!
FYI: if any of you are wondering why I don't have any repositories, it's because I don't use GitHub often and just look at other open-source projects made by other fellow creators.
Credits: github.com/tqdm/