
 
> A Python client for H@H, py_hath

 
post Jun 13 2013, 10:33
Post #1
borgler



Newcomer
*
Group: Members
Posts: 38
Joined: 21-April 11
Level 174 (Ascended)


Update:
Now fully up-to-date, with support for H@H proxying and static ranges.
Added tests and almost-complete documentation.
Support for the Apache server's X-Sendfile. This means you can serve files with Apache, which should be faster than having either the Java client or the Python client serve them itself.

I've written a Python client for Hentai@Home. (No good reason, just because I felt like it.) It's available [www.dropbox.com] here, with installation instructions below.

Info:
  • Speed depends on the server software you run it with.
  • If you run it with the right software, it's approximately as fast as the Java version. (I'm judging from the H@H stats, so this is not a well-backed estimate.)
  • Supports HTTP caching (not much of a pro, because of the H@H timestamp system, but still nice.)
  • Doesn't support gallery downloading, because I don't use it.
  • It should run with any SQL database system that SQLAlchemy (the database library it uses) supports, which is pretty much all of them. However, I've only tested it with SQLite, so don't take my word for it.
Easy Installation for Windows
Download the folder [www.dropbox.com] here and run hath.exe. If you put it in the same folder as your Java client, it can use the same cache and database files.

Programming details:
py_hath is a WSGI (a Python server specification) application, and must be run with a WSGI-compliant server. It uses the Bottle micro web framework and SQLAlchemy, an object-oriented library and abstraction layer for dealing with SQL. It runs on both Python 2 and 3, using six (a compatibility library packaged with it) to help with this. (It has only been tested on Python 2.7 and 3.3, the most recent versions of Python at the time of writing, so it may or may not run on older versions.)
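For readers unfamiliar with the stack: a Bottle application object is itself a WSGI callable, so any WSGI-compliant server can host it. Below is a minimal, self-contained sketch of that pattern; it is illustrative only, not py_hath's actual code, and the route and names are made up.
CODE
# Minimal Bottle/WSGI sketch -- illustrative only, not py_hath's actual code
from bottle import Bottle, run

app = Bottle()            # the Bottle instance is a WSGI callable

@app.route('/ping')
def ping():
    return 'pong'

if __name__ == '__main__':
    # server= accepts any adapter name from Bottle's server list,
    # e.g. 'wsgiref', 'cherrypy', 'tornado', 'gunicorn'
    run(app, server='wsgiref', host='127.0.0.1', port=8080)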

Installation from source:
  1. If you don't have Python installed, [www.python.org] install Python. (It comes installed on Unix and Macs, but not on Windows.)
  2. [www.pip-installer.org] Install pip (a Python package manager), if you don't have it installed already.
  3. Select and install a WSGI-compliant server (partial list [bottlepy.org] here).
  4. Go to a command prompt and run:
    pip install sqlalchemy -U
    pip install bottle -U
  5. [www.dropbox.com] Download py_hath
  6. py_hath can be placed in your Java client's installation directory (right next to where the Jar files are) and should be able to use the Java client's database files and login information.
Running instructions:
  1. It shares the original client's command line options, though it doesn't use all of them (todo: document this better). Two additional options are "wsgi_server", which takes a server name from the list [bottlepy.org] here, and "sql_engine", which takes a connection string such as "sqlite:///path/to/file" ([docs.sqlalchemy.org] more details here). You can probably ignore "sql_engine", but you should set "wsgi_server" to the name of the server you installed earlier; see the example invocation after this list.
  2. Run the file hath.py (making sure to use the right version of Python) with the desired command line options, and the server should start up and begin working.
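For example, an invocation might look something like the line below. The option syntax is a guess based on the description above and the database path is made up; check hath.py's actual argument handling before relying on it.
CODE
# Hypothetical invocation -- option spelling/format not verified against hath.py
python hath.py --wsgi_server=cherrypy --sql_engine=sqlite:///hath.db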

Running py_hath under Apache with X-Sendfile

To use this, install [code.google.com] mod_wsgi and [tn123.org] mod_xsendfile and use an httpd.conf something like the following:
CODE
# Load mod_wsgi (runs the WSGI application) and mod_xsendfile (lets Apache serve the files itself)
LoadModule wsgi_module modules/mod_wsgi.so
LoadModule xsendfile_module modules/mod_xsendfile.so

# Listen on the H@H client port
Listen 1201
<VirtualHost _default_>
    # Let the application hand file serving off to Apache via the X-Sendfile header
    XSendFile On

    # Route every request to py_hath's WSGI entry point
    WSGIScriptAlias / {{DIRECTORY OF PY_HATH}}/serve.wsgi

</VirtualHost>

Obviously, replace {{DIRECTORY OF PY_HATH}} with the directory that py_hath is in.
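For reference, serve.wsgi is the WSGI entry point that the Apache config above points at inside py_hath's directory. A generic Bottle entry point of that kind looks roughly like the sketch below; the module and attribute names are assumptions for illustration, not py_hath's actual ones.
CODE
# Sketch of a generic Bottle mod_wsgi entry point -- names are assumptions, not py_hath's code
import os, sys

# make the client's own directory importable
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from hath import app as application   # mod_wsgi looks for a callable named "application"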

This post has been edited by borgler: Mar 19 2014, 22:43

 
post Jun 13 2013, 16:37
Post #2
blue penguin



in umbra, igitur, pugnabimus
***********
Group: Gold Star Club
Posts: 10,044
Joined: 24-March 12
Level 500 (Godslayer)


Wow, that was some hard work. Thank you.

Python is friendlier than Java on a Linux machine. I can't promise it for the near future, but I'll test it on a Raspberry Pi (which is optimised for Python rather than Java, so the Python version might be faster there).

It's also a good reason for me to learn bottle.

 
post Jun 13 2013, 21:37
Post #3
borgler



Newcomer
*
Group: Members
Posts: 38
Joined: 21-April 11
Level 174 (Ascended)


QUOTE(blue penguin @ Jun 13 2013, 07:37) *

I can't promise it for the near future, but I'll test it on a Raspberry Pi (which is optimised for Python rather than Java, so the Python version might be faster there).

Thanks, post here if you do.

Has anyone tried this yet? Just curious. Also, I don't really expect a response, but I'm wondering what Tenboro thinks of this.

 
post Mar 13 2014, 04:32
Post #4
borgler



Newcomer
*
Group: Members
Posts: 38
Joined: 21-April 11
Level 174 (Ascended)


Update: new version, now fully up-to-date with Hentai@Home 1.2.0 (the latest version).
Added fairly complete tests and documentation.

Support for the Apache server's X-Sendfile. This means you can serve files with Apache, which should be faster than having either the Java client or the Python client serve them itself. See details above.

Now supports proxying!

This post has been edited by borgler: Mar 14 2014, 00:50

 
post Jun 15 2014, 00:00
Post #5
Duckbuster



Lurker
Group: Recruits
Posts: 2
Joined: 18-August 08
Level 19 (Novice)


Running this on my pi now.
The Java version kept crashing after running for a few days. Let's see how this goes...

 
post Jun 16 2014, 11:58
Post #6
allfoxwy



Newcomer
*
Group: Members
Posts: 24
Joined: 1-March 12


Wow, this thing is so cool! :D


 
post Jun 18 2014, 09:28
Post #7
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 11,247
Joined: 31-July 10
Level 500 (Ponyslayer)


Pretty cool, especially if there are any performance gains at all.

 
post Jun 19 2014, 04:45
Post #8
awvnx



Casual Poster
***
Group: Members
Posts: 166
Joined: 7-April 14
Level 267 (Godslayer)


Great, now that you've done X-Sendfile for Apache, do nginx next :)

I mean if you really want the best performance and all.

 
post Jun 21 2014, 01:01
Post #9
Jitsu2



Casual Poster
***
Group: Members
Posts: 180
Joined: 17-July 11
Level 262 (Godslayer)


Not really into Python, but DEAR LORD, someone STICKY THIS SOMEWHERE.

 
post Jul 30 2014, 17:37
Post #10
LostLogia4



Translating Miku's Yuri Nikki for the heck of it~~
********
Group: Gold Star Club
Posts: 2,716
Joined: 4-June 11
Level 362 (Godslayer)


Trying it now, but man, which WSGI-compliant server would anyone suggest?
And how do I install it?

EDIT: Here's the list of installed Python packages as well:
CODE
root@server:~# pip freeze
SQLAlchemy==0.9.7
argparse==1.2.1
bottle==0.12.7
wsgiref==0.1.2
How do I check Python's and pip's version numbers?

Anyway, be sure to run hash -r after upgrading pip, or the shell may keep pointing at the old location.

This post has been edited by LostLogia4: Jul 30 2014, 17:46

 
post Jul 31 2014, 00:42
Post #11
blue penguin



in umbra, igitur, pugnabimus
***********
Group: Gold Star Club
Posts: 10,044
Joined: 24-March 12
Level 500 (Godslayer)


QUOTE(LostLogia4 @ Jul 30 2014, 16:37) *
Trying it now, but man, which WSGI-compliant server would anyone suggest?
And how do I install it?
You have two decent options: [uwsgi-docs.readthedocs.org] uwsgi or [gunicorn.org] green unicorn. Both are good; uwsgi is more "unix like", whilst gunicorn is more of a "ruby on rails" philosophy adopter.

You can install both with pip, but you will certainly need gcc, glibc-devel and possibly some other packages on your system. Some of the code for those beasts is in C (not in Python), as they need to be fast.
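For reference, both are on PyPI under those names, so with the build tools in place the install is just:
CODE
pip install gunicorn
pip install uwsgi    # builds C sources, hence the need for gcc and friends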

QUOTE
How do I check Python's and pip's version numbers?
CODE
pip list
?
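For the interpreter and pip themselves, the usual version flags work too:
CODE
python --version
pip --version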

 
post Jul 31 2014, 02:05
Post #12
LostLogia4



Translating Miku's Yuri Nikki for the heck of it~~
********
Group: Gold Star Club
Posts: 2,716
Joined: 4-June 11
Level 362 (Godslayer)


QUOTE(blue penguin @ Jul 31 2014, 06:42) *
You have two decent options: [uwsgi-docs.readthedocs.org] uwsgi or [gunicorn.org] green unicorn. Both are good; uwsgi is more "unix like", whilst gunicorn is more of a "ruby on rails" philosophy adopter.

You can install both with pip, but you will certainly need gcc, glibc-devel and possibly some other packages on your system. Some of the code for those beasts is in C (not in Python), as they need to be fast.
Thanks, I'll try. Oh, but how do I execute py_hath on one specific WSGI server?

Also, about the X-Sendfile thing, how am I supposed to change Apache's listening port?

QUOTE(blue penguin @ Jul 31 2014, 06:42) *
CODE
pip list
?
CODE
root@server:~# pip list
argparse (1.2.1)
bottle (0.12.7)
Warning: cannot find svn location for distribute==0.6.24dev-r0
distribute (0.6.24dev-r0)
pip (1.5.6)
setuptools (0.6c11)
SQLAlchemy (0.9.7)
wsgiref (0.1.2)

 
post Jul 31 2014, 02:29
Post #13
blue penguin



in umbra, igitur, pugnabimus
***********
Group: Gold Star Club
Posts: 10,044
Joined: 24-March 12
Level 500 (Godslayer)


QUOTE(LostLogia4 @ Jul 31 2014, 01:05) *
Thanks, I'll try. Oh, but how do I execute py_hath on one specific WSGI server?

Also, about the X-Sendfile thing, how am I supposed to change Apache's listening port?
Each server has its own way of doing it; it's a classic case of "there's more than one way to do it". I normally use nginx (instead of Apache) and uwsgi, then configure a uwsgi emperor in /etc/uwsgi and one vassal in /etc/uwsgi/vassals.

In theory, you can get away without using Apache or nginx at all, but that might not be a good idea. The setup for a lot of web servers with uwsgi is described here: [uwsgi-docs.readthedocs.org] http://uwsgi-docs.readthedocs.org/en/latest/WebServers.html . But you will still need to do some reading on why the hell you need two web servers working together to serve a file through a web framework (there is a very good reason for that, but it's a lengthy read).
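As a rough illustration of that emperor/vassal layout, a vassal file (say /etc/uwsgi/vassals/hath.ini; the paths, socket location and process count below are assumptions for the example, not anything py_hath or uwsgi prescribes) could look something like:
CODE
[uwsgi]
chdir = /path/to/py_hath
wsgi-file = serve.wsgi
socket = /run/uwsgi/hath.sock
master = true
processes = 2
The front-end server (nginx, in this setup) then forwards requests to that socket over the uwsgi protocol.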

PS: Apache (and nginx too) has thousands of configuration parameters. You need to know (at least a little of) what you're doing to not fuck things up. On the other hand, the documentation and tutorials on how to configure those things are good.

QUOTE
CODE
root@server:~# pip list
argparse (1.2.1)
bottle (0.12.7)
Warning: cannot find svn location for distribute==0.6.24dev-r0
distribute (0.6.24dev-r0)
pip (1.5.6)
setuptools (0.6c11)
SQLAlchemy (0.9.7)
wsgiref (0.1.2)
That's about the best you will get on version numbers. From what I see, they match your freeze output.

This post has been edited by blue penguin: Jul 31 2014, 02:32

 
post Jul 31 2014, 04:34
Post #14
awvnx



Casual Poster
***
Group: Members
Posts: 166
Joined: 7-April 14
Level 267 (Godslayer)


Who's been running this client? Any issues?

I found that the Java client actually runs with only 32MB of memory usage with the cache at 1 GB. I can easily run a few of them on a 256MB VPS. Not sure what will happen once the cache starts getting big, but it seems like the Java client isn't that bad, other than it randomly sucking up huge amounts of CPU (it started using 600% on a shared seedbox during some proxy usage).

This post has been edited by awvnx: Jul 31 2014, 04:35

 
post Jul 31 2014, 05:10
Post #15
blue penguin



in umbra, igitur, pugnabimus
***********
Group: Gold Star Club
Posts: 10,044
Joined: 24-March 12
Level 500 (Godslayer)


I started setting up the Python client once, but couldn't find the time to finish the setup and test it.

 
post Jul 31 2014, 17:16
Post #16
LostLogia4



Translating Miku's Yuri Nikki for the heck of it~~
********
Group: Gold Star Club
Posts: 2,716
Joined: 4-June 11
Level 362 (Godslayer)


So, letting go of Apache for now, I have to somehow configure Bottle to run under uwsgi and make it execute py_hath on my desired IP address? That's a rather tall order for a novice Linux user like me... heh

Attached File  py_hath.zip ( 33.71k ) Number of downloads: 36


This post has been edited by LostLogia4: Aug 13 2014, 05:26

 
post Aug 1 2014, 07:48
Post #17
atomicpuppy



HV Inflation - Gotta farm like a dog to live like one
*******
Group: Catgirl Camarilla
Posts: 1,888
Joined: 2-April 06
Level 249 (Beginner)


QUOTE(awvnx @ Jul 31 2014, 10:34) *

Who's been running this client? Any issues?

I found that the Java client actually runs with only 32MB of memory usage with the cache at 1 GB. I can easily run a few of them on a 256MB VPS. Not sure what will happen once the cache starts getting big, but it seems like the Java client isn't that bad, other than it randomly sucking up huge amounts of CPU (it started using 600% on a shared seedbox during some proxy usage).


I've been testing it out with several WSGI servers, but they made my trust / quality drop like crazy, so I stopped using it. So far, cherrypy and tornado have worked best for me, though still not as good as the Java client.

I want to try the Apache-with-X-Sendfile setup but haven't got the time to yet.

Just 32MB of memory usage? Wow, is that with the --use_less_memory switch?
Mine uses around 250+MB without that switch on Ubuntu 14.04 64-bit :'( , but my cache size is much higher.

 
post Aug 1 2014, 08:07
Post #18
awvnx



Casual Poster
***
Group: Members
Posts: 166
Joined: 7-April 14
Level 267 (Godslayer)


I've seen the client use 400MB when running it on a seedbox with huge amounts of RAM, and 150MB on another machine. Both with a 20 GB cache.

But when I run it on a 32-bit VPS (Debian 7) it only uses around 32MB. The cache is now 2.5GB for each one. No switches changed.

 
post Apr 23 2021, 23:58
Post #19
Grandmasters



AFK
******
Group: Catgirl Camarilla
Posts: 760
Joined: 23-April 19
Level 500 (Godslayer)


Still alive?

 
post Apr 24 2021, 00:22
Post #20
blue penguin



in umbra, igitur, pugnabimus
***********
Group: Gold Star Club
Posts: 10,044
Joined: 24-March 12
Level 500 (Godslayer)


After 7 years, obviously not. Bad necro

/thread

