Limit on Subscribed Stocks

SAURABH
Hi!
I want to know if there is any upper limit on the number of stocks that can be subscribed to on a websocket client. I am asking because when I subscribe to up to 20 stocks I receive data smoothly, but above 20 stocks no data is received.
  • Vivek
    @SAURABH We have a hard limit of 200 subscriptions, but around 20 should be fine. Are you still receiving data for the first 20 subscriptions when you add more than 20, or do you not get data at all?
  • SAURABH
    I don't get data at all if I subscribe to more than 20 stocks.
  • SAURABH
    Is this query so unimportant to you that no one is interested in answering it?
  • RH1558
    Same here, no data received when subscribing to more than 20 stocks.
  • Vivek
    @SAURABH Sorry for the delay, I lost track of this issue. We acknowledge the issue and will release a patch by today or tomorrow.
  • Vivek
    @SAURABH @RH1558 The Python Kite client has been updated with a patch for the above issue. Please update your library and check it: https://pypi.python.org/pypi/kiteconnect/3.1.5
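
    For reference, a minimal subscription sketch after upgrading, assuming the old kiteconnect WebSocket client used elsewhere in this thread (the constructor arguments, credentials, and instrument tokens below are placeholders and assumptions, not confirmed values):

        from kiteconnect import WebSocket

        kws = WebSocket("your_api_key", "your_public_token", "your_user_id")

        # More than 20 instrument tokens; the values here are illustrative.
        tokens = [738561, 5633, 779521, 758529, 1195009, 1256193]

        def on_tick(ticks, ws):
            # Each element of `ticks` is a dict of quote fields for one instrument.
            for tick in ticks:
                print(tick)

        def on_connect(ws):
            # Subscribe only after the connection is established.
            ws.subscribe(tokens)
            ws.set_mode(ws.MODE_QUOTE, tokens)

        kws.on_tick = on_tick
        kws.on_connect = on_connect
        kws.connect()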
  • SAURABH
    Thanks Vivek. Just one more issue: I am facing frequent gateway timeout errors when I place many stocks (say, above 100). How can I fix this?
  • SAURABH
    This is the exact error: error from callback >: Gateway timed out
  • Kailash
    @SAURABH Are you referring to Websocket ticks, or placing orders against /orders?
  • SAURABH
    bound method WebSocket._on_data of WebSocket object at 0x000000C72AEAF780: Gateway timed out
  • SAURABH
    I get the gateway timeout error on both occasions.
  • Kailash
    Hm, are you sure there are no breaks in connections to websocket.kite.trade? You should not be getting a gateway timeout there. Please run `ping websocket.kite.trade` and see if you get any drops.

    If you're getting timeouts on both WebSocket and the API (which are different servers and services at our end), it's most likely a network issue.
  • SAURABH
    OK, I have checked: most of the time I get this error while running the kite.trades() command. I run this command every 3 seconds, but maybe it needs to be slowed down further.
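
    If it is the REST call that times out intermittently, a simple retry with backoff around kite.trades() may help. This is just a sketch assuming an already-authenticated KiteConnect instance; the catch-all exception handling and the retry/delay numbers are arbitrary choices for illustration:

        import time

        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")
        kite.set_access_token("your_access_token")

        def fetch_trades_with_backoff(retries=3, base_delay=2):
            # Retry kite.trades() a few times, doubling the wait after each failure.
            delay = base_delay
            for attempt in range(retries):
                try:
                    return kite.trades()
                except Exception as exc:  # e.g. a gateway timeout from the API
                    print("trades() attempt %d failed: %s" % (attempt + 1, exc))
                    time.sleep(delay)
                    delay *= 2
            return None

        while True:
            trades = fetch_trades_with_backoff()
            if trades is not None:
                print("%d trades" % len(trades))
            time.sleep(5)  # poll less aggressively than every 3 seconds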
  • SAURABH
    I fixed the above error. But now I am getting the following error: ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host
  • Vivek
    @SAURABH We had connectivity issues at our end around 10.05 for a few minutes. We are investigating the cause.
  • soumyadeep
    There is also a problem with the Python websocket: if you subscribe to more than 20 symbols and write their output to separate files, the memory usage exceeds 100% after 15 minutes and then Linux automatically stops the Python script. Python uses only one core of the CPU.
    I can't think of a way to stop Linux from killing the Python script.
  • Kailash
    @soumyadeep Investigating this.
  • Vivek
    @soumyadeep We have tested the Python websocket client with around 65 scrips for an hour and it was working fine. There were no disconnections and memory usage never went to 100%. Please check your implementation. Also note that we tested this on Ubuntu 12.04 with disable_ssl_verification on.
  • soumyadeep
    Hi @Kailash and @vivek, today I tested with different numbers of scrips (from 20 to 150).
    My implementation is:

        def on_tick(tick, ws):
            for data in tick:
                # write the data for each scrip to a separate file
                pass
    1. I tried writing all the data for each scrip to separate text files.
    2. I tried just printing the data in the terminal, to test whether the problem arises from Python memory usage, file handling, or the websocket.
    3. I tried both Python 2.7 and 3.

    A) If I subscribe to 150 scrips and continuously write the data into 150 text files, the memory usage shoots up and after 20 minutes the Python program gets killed. If I rerun the program at this stage, it now takes only 70 seconds to get killed. So something is building up which, on reaching a threshold, kills the program.

    On decreasing the number of scrips (say to 20), the time until it gets killed increases (to around 45 minutes), but the program eventually stops.

    B) I tried adding time.sleep(0.1); it helped keep the CPU usage down, and I could see that the program gets killed even at 35% CPU usage. This raised my suspicion.

    C) Then I tried just printing the output of the 150 scrips and removed the file-writing part of the code. The time it takes for the program to get killed increases (to about 50 minutes), but it eventually stops.

    So whether I write the data to files or just print it, the program gets killed eventually.
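
    For comparison, here is a minimal sketch of the per-scrip writing that keeps one file handle open per instrument and flushes each line instead of buffering. The "instrument_token" field name and the file naming are assumptions for illustration, not a confirmed fix for the crash described above:

        import json

        open_files = {}  # instrument_token -> file handle, opened once and reused

        def on_tick(ticks, ws):
            for tick in ticks:
                token = tick["instrument_token"]
                fh = open_files.get(token)
                if fh is None:
                    # Open in append mode once per instrument, not once per tick.
                    fh = open("ticks_%s.txt" % token, "a")
                    open_files[token] = fh
                fh.write(json.dumps(tick, default=str) + "\n")
                fh.flush()  # push data to disk rather than letting it accumulate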

    I should also mention that for the last 3 days I have been able to stream the websocket with the same access code and public token. That might be important for you to know.
  • Kailash
    @soumyadeep Can you please share your test script? Let me run it at our end and see.

    The public_token bit, we're aware of this. We've left public_token auth open for a limited period.
  • soumyadeep
    Yes, I will send it to you later today. An email ID would be perfect, then I can attach my Python script as well. It would be better if you try my script during market hours.
    Thanks
  • soumyadeep
    Hi @Kailash, I have sent you my script at the said email ID. Maybe you could run it tomorrow during market hours?
    Thanks in advance.
  • SAURABH
    The problem with streaming more than 20 stocks seemed to be resolved earlier, but has crept in again. Once again I am not able to get data for more than 20 stocks. Please provide a permanent resolution.
  • Vivek
    @SAURABH can you please check if you are running the latest version of pykiteconnect? I have tested the current version with around 65 scrips and it seems to be working fine. Also, please share any error stack trace or log you got.
  • SAURABH
    I upgraded my kiteconnect version just now. Now I am getting data only for the first 20 stocks, and even then I just get a single tick. No error message is shown. For up to 20 stocks, I am getting the data perfectly.
  • Kailash
    @SAURABH I'm not sure where or how this is happening at your end. There's absolutely no 20-subscription restriction in the system. @soumyadeep has a different issue where his script crashes, but he's streaming 60+ instruments.
  • SAURABH
    I was able to stream more than 150 stocks until Friday. But today, again, I couldn't stream more than 20 stocks. And I have not changed a single character of my program. So how can the problem be at my end when the same program ran successfully last week?
  • Kailash
    Hi @SAURABH, as I write this, there are thousands of connections streaming 200 instruments concurrently. I'm unable to pinpoint where exactly this is going wrong.

    Can you e-mail your test script that you're using for streaming to [email protected]?
  • SAURABH
    @Kailash @vivek I am facing a very peculiar problem while streaming data. I am able to stream well if the number of stocks is less than 25 or more than 60, but if the number is between 25 and 50, the streaming becomes very slow. I know I might sound a bit illogical, but I have checked this thoroughly and on different sets of stocks. I don't know why this is happening, but when the number of stocks is about 30, it becomes painfully slow to stream the data. I want to get this problem addressed, because right now my computer can only handle up to 30-40 stocks properly, and it is exactly in this range that the streaming does not work properly.
  • Vivek
    @SAURABH can you send me the list of tokens you are subscribing to (the 30-40 stocks you mentioned above)? I will test it at our end and let you know. Meanwhile, you can also test this on some other system or in the cloud to check whether the problem is only at your end.
  • SAURABH
    ["COALINDIA", "ONGC", "BANKBARODA", "SBIN","SAIL", "ENGINERSIN", "CROMPGREAV", "SYNDIBANK", "TATACOMM", "JINDALSTEL", "RECLTD", "ADANIPOWER", "NCC", "SUNTV", "PETRONET", "DLF", "HDIL", "GMRINFRA", "IFCI", "ASHOKLEY", "IOC", "JPASSOCIAT", "CAIRN", "ARVIND", "IDFC", "OIL", "RCOM", "JSWENERGY", "ORIENTBANK", "DISHTV", "GODREJIND"]
  • SAURABH
    I have tried it on different sets of stocks. This is only one such set.
  • Vivek
    @SAURABH can you give me the tokens instead of symbols?
  • SAURABH
    [5215745, 633601, 1195009, 779521, 758529, 1256193, 194561, 1837825, 952577, 1723649, 3930881, 4451329, 593665, 3431425, 2905857, 3771393, 3789569, 3463169, 381697, 54273, 415745, 2933761, 3580417, 49409, 3060993, 4464129, 3375873, 4574465, 636673, 3721473, 2796801]
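
    For anyone who wants to reproduce this, a sketch that subscribes to the tokens above and logs the tick rate roughly once a minute; the credentials are placeholders and the counting logic is purely illustrative:

        import time

        from kiteconnect import WebSocket

        TOKENS = [5215745, 633601, 1195009, 779521, 758529, 1256193, 194561,
                  1837825, 952577, 1723649, 3930881, 4451329, 593665, 3431425,
                  2905857, 3771393, 3789569, 3463169, 381697, 54273, 415745,
                  2933761, 3580417, 49409, 3060993, 4464129, 3375873, 4574465,
                  636673, 3721473, 2796801]

        kws = WebSocket("your_api_key", "your_public_token", "your_user_id")
        state = {"count": 0, "start": time.time()}

        def on_tick(ticks, ws):
            # Count incoming ticks and report roughly once a minute.
            state["count"] += len(ticks)
            elapsed = time.time() - state["start"]
            if elapsed >= 60:
                print("%d ticks in the last %.0f seconds" % (state["count"], elapsed))
                state["count"], state["start"] = 0, time.time()

        def on_connect(ws):
            ws.subscribe(TOKENS)

        kws.on_tick = on_tick
        kws.on_connect = on_connect
        kws.connect()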
  • SAURABH
    @vivek Did you check with the tokens?
  • Vivek
    @SAURABH Forgot to test it during market hours today. Will do it tomorrow and let you know.
  • SAURABH
    Please check the tokens today.
  • Vivek
    @SAURABH I have tested the given tokens and it seems everything is fine. While streaming ticks I was monitoring CPU and RAM and it never peaked (check the screenshot). I have tested this on both OS X and Ubuntu 16.04 and it's fine on both systems. For your reference, I have attached my test script.
  • pybull
    I am having no problems streaming over 190 instruments. However, I would love to subscribe to some more instruments. Is it possible to negotiate a higher number of subscribed instruments, at the cost of, say, market depth and other parameters? This is, of course, assuming that the size of the data transferred is the main factor behind the hard limit of 200. I understand there could be other factors too, but it would be great if I could subscribe to, say, 500 instruments if I use LTP mode only!

    On a side note, I am really impressed with the server availability and the API documentation, plus the implementation help. Keep up the good work!
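
    The client can already switch subscribed instruments to an LTP-only mode, which shrinks each tick's payload even though the 200-instrument cap stays the same. A small sketch, assuming the same WebSocket client and the set_mode/MODE_LTP names used above (tokens and credentials are placeholders):

        from kiteconnect import WebSocket

        kws = WebSocket("your_api_key", "your_public_token", "your_user_id")
        tokens = [738561, 5633]  # illustrative instrument tokens

        def on_connect(ws):
            ws.subscribe(tokens)
            ws.set_mode(ws.MODE_LTP, tokens)  # last traded price only

        def on_tick(ticks, ws):
            for tick in ticks:
                print(tick)

        kws.on_connect = on_connect
        kws.on_tick = on_tick
        kws.connect()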