MAG

About

Username
MAG
Joined
Visits
1,431
Last Active
Roles
Member

Comments

  • @Balaganga If you are planning to pursue this seriously, I would suggest setting up a Linux VM using VirtualBox and running all your code on Linux instead of Windows. It will give much better performance and will save you from a lot of pain/headaches.
  • HTTP 403 is an HTTP status code meaning access to the requested resource is forbidden. @lostbodhi Have you subscribed to the Historical API? It's a separate subscription that's an add-on to the base Kite Connect API subscription.
  • The problem is that without programming skills you cannot make much headway. I have already given you working code and all you had to do was replace the api_key, api_secret and request token with your own values. And you are unable to get that simple 10 …
  • @Balaganga Buddy, respectfully, don't take the next sentence in the wrong way. We are here to help, but we can't do all your work for you and spoon-feed you. Chuck out the Sublime whatever and run the code at the command line using the syntax pyth…
  • @Balaganga Change the line gen_ssn = kite.generate_session(request_token=req_tkn, api_secret=secret) to gen_ssn = kite.generate_session(req_tkn, api_secret=secret) This will solve the problem partially. But you still won't get the access token. In o…
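    A minimal sketch of the full token-generation flow with pykiteconnect (the key, secret and variable names below are illustrative placeholders):

        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")
        print(kite.login_url())           # open this URL, log in, copy request_token from the redirect URL
        req_tkn = "paste_request_token"   # expires within seconds, so use it immediately
        gen_ssn = kite.generate_session(req_tkn, api_secret="your_api_secret")
        kite.set_access_token(gen_ssn["access_token"])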
  • @Balaganga There should be some error message or exception output when you ran it. What is the exception/error? I think I know what's wrong but I don't want to try anything now that would invalidate my access token. I will try your code out early m…
  • @Sunilbroker_Hedge Volume = BuyQuantity + SellQuantity is wrong. Volume is the quantity of completed/executed trades for the day from market open to the current time, whereas BuyQuantity and SellQuantity are the total quantities available for buy / sell …
  • @Balaganga Seems like there is too much delay between you generating your request token, saving it to the variable in your python file and running the code to generate the access token. The request token expires in something like 30 seconds - n…
  • Update: Seems fixed.
  • @sujith This is still happening. Logged out and in of the web interface. Now I can't see or add/modify my watchlists. It's all empty. Also this seems to be an issue limited to the Kite web interface. The API seems to be working fine. At least all the…
  • Nope - the API is just that - basic building blocks, allowing one to develop any solution they want, limited only by time and skills. If someone wants the functionality of Console to be available, the API allows them the flexibility to design an…
  • @parulg014 AFAIK that's not available via the api at all. What you are asking for are trade/investment reports for your account. For this information you need to log into https://console.zerodha.com/ and then look at the menu on the top right side. Un…
  • ROFL, This is what I heard - I am an entitled highly educated jackass with an overinflated bloated ego and I think I am smarter than everyone else. Therefore while I can go insulting others, no one is allowed to question me or I will use fancy word …
  • @keerthi if you look at the documentation, all this information is available there. If you look at the pinned FAQ section of this forum, there is a login flow section. In this section the second-last item is "Websocket session flush timings : https:…
  • Please go and read the FAQs here and the API limits here. It's clearly mentioned in the second link (no. of requests to the API) that the order limit is at the account level. So whether you create one app or 10, you are still limited to 3000 orders per d…
  • Your query is still pretty vague and unclear. What is "can we also scan like pricing"? I am assuming you mean you want to know when the price hits your 5% target between 10:17:00 and 10:17:59. You can't do that from historical data. Unless you stor…
  • It's not in the documentation, which means it's not possible. The quote api documentation, for example, clearly mentions that one can pass one or more instruments as a list. The historical api documentation does not specify any such thing. So it's ob…
  • Your comments seem to indicate you are confused about historical api and what it offers. In historical data the smallest unit of time is minute. So your query is invalid as there is no data for 09:48:30. You will get either one minute OHLCV candle …
  • The market opens for trade at 9:15. And the tradeable candles are from 9:15(:00) to 15:29(:59). So what is the query again?
  • @dexter31 268041 is the instrument token for the Nifty 500 index. NIFTY 500,268041,1047,0.0,NSE,NIFTY 500,0.0,0.0,EQ,INDICES,,0 Indices are never traded directly on any exchange. What are traded are the derivative instruments based on the indices. …
  • The formulas are given on the brokerage calculator page. You need to write a method to compute the charges given four variables: 1. the segment, 2. buy price, 3. sell price and 4. quantity. You can refer to this thread for the formula and method I u…
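    A minimal skeleton of such a method. The rate constants below are illustrative placeholders only, not the actual formulas - pull the real rates for each segment from the brokerage calculator page:

        def compute_charges(segment, buy_price, sell_price, quantity):
            # All rates here are placeholder assumptions; verify against the calculator page.
            buy_turnover = buy_price * quantity
            sell_turnover = sell_price * quantity
            if segment == "EQ_INTRADAY":
                # assumed: 0.03% or Rs 20 per executed leg, whichever is lower
                brokerage = min(20.0, 0.0003 * buy_turnover) + min(20.0, 0.0003 * sell_turnover)
                stt = 0.00025 * sell_turnover          # assumed intraday STT on the sell leg
            else:
                raise NotImplementedError("add the other segments from the calculator page")
            # exchange transaction charges, GST, SEBI fees and stamp duty would be added similarly
            return round(brokerage + stt, 2)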
  • @sumegh20 you need to develop your own logic for extracting that information. If I were to do it: 1. One option would be to have the instrument list as a JSON file and do a JSON search for matching parameters. 2. Another way is to read the instru…
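    A minimal sketch of the second approach using pandas on the kiteconnect instrument dump (the column names are from the standard dump; the symbol, expiry and strike values are illustrative):

        import datetime
        import pandas as pd
        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")      # access token assumed to be set already
        instruments = pd.DataFrame(kite.instruments("NFO"))

        # e.g. the BANKNIFTY call option for a given expiry and strike
        match = instruments[
            (instruments["name"] == "BANKNIFTY")
            & (instruments["expiry"] == datetime.date(2023, 11, 30))
            & (instruments["strike"] == 43400)
            & (instruments["instrument_type"] == "CE")
        ]
        print(match[["tradingsymbol", "instrument_token", "lot_size"]])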
  • This is how I do it. I base it off the expiry date. Monthly expiry for banknifty is the last Thursday of the month - period. Once you have this, all other expiries for the month are weekly expiries. This logic should apply to any instrument in NFO-O…
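    A minimal sketch of that check, assuming the expiry is a datetime.date (exchange-holiday shifts are not handled here):

        import calendar
        import datetime

        def last_thursday(year, month):
            # walk backwards from the last day of the month to the first Thursday found
            last_day = calendar.monthrange(year, month)[1]
            d = datetime.date(year, month, last_day)
            while d.weekday() != 3:          # Monday=0 ... Thursday=3
                d -= datetime.timedelta(days=1)
            return d

        def is_monthly_expiry(expiry):
            return expiry == last_thursday(expiry.year, expiry.month)

        print(is_monthly_expiry(datetime.date(2023, 11, 30)))   # True: last Thursday of Nov 2023
        print(is_monthly_expiry(datetime.date(2023, 11, 23)))   # False: a weekly expiry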
  • @sumegh20 how did you get the tradingsymbol of BANKNIFTY30NOV43400CE? If you look up the instruments list for banknifty NFO instruments with 2023-11-30 expiry you get instruments with the 'BANKNIFTY23NOV' prefix. So the symbolname for the instrume…
  • @vikramavertatech @gautama @benny Are you guys using the historical api to access the current day's candles?
  • The way I do it is I have a dummy flag in my order placement code. If that flag is set it does everything but actually send the order to zerodha. I have my own orders and positions maintained at my end and therefore the code can do everything it wou…
  • @seshank You will have to look at compliance too. If you are out of India for more than 181 days, you are no longer a resident Indian. You qualify as an NRI. Once you are an NRI, there are limitations as well as additional compliances as mandated b…
  • If you are using Linux, you can do the following: switch to the root user (sudo su -), then edit root's crontab using crontab -e and add the following lines: @reboot /usr/bin/timedatectl set-timezone Asia/Kolkata @reboot cat Asia/Kolkata > /etc/timezo…
  • A few questions that may enable folks to help you better - 1. What is your OS? 2. What programming language are you using? In this case it obviously seems to be python. 3. What is the timezone on your computer? Simplest check - first, what is the ti…
  • Unfortunately, there is no such thing as something for nothing. If you want results, you need to put in the work.
  • You will have to write your own code to parse the instrument list. The api is just that - basic building blocks. You need to write your own code to use these building blocks to achieve whatever end goal you have. In your case the symbolname for Reli…
  • Also, I would like to add that you can set up multiple lookup tables going both ways: one for instrument token to symbolname mapping and the other for symbolname to instrument token mapping. So if you have a symbolname and want to look up its instru…
  • You have to download the instrument token file at the beginning of the day and set up an instrument token to symbolname lookup table using whatever method suits you. I use redis hashes or python dictionaries where the instrument token is the key and …
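    A minimal sketch of both lookup directions, assuming a local redis instance and the kiteconnect instrument dump:

        import redis
        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")       # access token assumed to be set already
        r = redis.Redis(host="localhost", port=6379, db=0)

        tok_to_sym = {}
        sym_to_tok = {}
        for inst in kite.instruments():                  # full dump; pass an exchange to narrow it down
            tok_to_sym[inst["instrument_token"]] = inst["tradingsymbol"]
            sym_to_tok[inst["tradingsymbol"]] = inst["instrument_token"]
            r.hset("tok_to_sym", inst["instrument_token"], inst["tradingsymbol"])
            r.hset("sym_to_tok", inst["tradingsymbol"], inst["instrument_token"])

        print(len(tok_to_sym), "instruments mapped")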
  • I would have responded but people like you are not worth my time dude. Good bye.
  • @ANL Seriously are you a guy or gal? Seems like a gal to me. Also, please stop tagging me in these useless whataboutery discussions mate. I am trying to help people out here based on decades of real world experience in working at scale. If windows…
  • Linux - I wouldn't use Windows. Especially, I wouldn't write and run programs on Windows even if someone paid me to do it. When I give interviews that is my first question. If I am expected to deal with Windows, I am out of there. Linux first, macOS second …
  • Have you guys read Zerodha's tech blog, especially the first blog by Kailash Nadh about Zerodha's tech stack? https://zerodha.tech/blog/hello-world/ Zerodha uses Postgres and redis and it works at their scale. So I do not understand the discussion and …
  • I am not sure whether it's a sliding window or based on actual timestamps, but I write my code to assume it's based on timestamps and I have never got any errors because my code will not send more than the limit per second anyway. It's best to write your …
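    A minimal sketch of such client-side throttling (a simple sleep-based limiter; the limit value is illustrative):

        import time

        class RateLimiter:
            """Allow at most `limit` calls per rolling one-second window."""
            def __init__(self, limit):
                self.limit = limit
                self.stamps = []

            def wait(self):
                now = time.monotonic()
                self.stamps = [t for t in self.stamps if now - t < 1.0]
                if len(self.stamps) >= self.limit:
                    time.sleep(1.0 - (now - self.stamps[0]))   # wait until the oldest call falls out of the window
                self.stamps.append(time.monotonic())

        limiter = RateLimiter(limit=3)        # e.g. 3 requests per second
        for _ in range(10):
            limiter.wait()
            # place the API call here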
  • @ANL I haven't used timescaledb or postgres at all. So I have no idea. I use a combination of mongodb and redis with python and it works fine for me. Like I said in some other thread - it's not the DB or solution you use that matters. It's the way y…
  • JFYI, Python is not a bottleneck. It's all in the way you code. I am subscribing to around 500 instruments and am able to process ticks and generate candles for all 500 instruments in under 0.2 seconds using python. In the last six years I have never missed …
  • The main issue that no one is mentioning is that home broadband will never be stable. And if your network is unstable, there is no point in having a server-grade system at your home. You can write the best trading strategy, you can write the most per…
  • You have to do it in batches. In python you would do something like: import datetime startdate = datetime.datetime(2008, 10, 1, 9, 15, 0) while True: enddate = startdate + datetime.timedelta(days=100) # fetch historical data he…
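    A minimal runnable sketch of that batching loop, assuming pykiteconnect with an already-authenticated client (the instrument token, dates and interval are illustrative; the maximum date range allowed per request depends on the candle interval):

        import datetime
        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")       # access token assumed to be set already
        token = 256265                                   # illustrative instrument token
        final = datetime.datetime(2009, 1, 1, 15, 30, 0)

        candles = []
        startdate = datetime.datetime(2008, 10, 1, 9, 15, 0)
        while startdate < final:
            enddate = min(startdate + datetime.timedelta(days=100), final)
            candles += kite.historical_data(token, startdate, enddate, "day")
            startdate = enddate + datetime.timedelta(days=1)

        print(len(candles), "candles fetched")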
  • I don't think this is available via the api. But it is very easy to compute. I have this method in python that I use and it works very well. You can refer to it and modify it to your needs. For eg you can add a fourth parameter for the category of trade - …
  • Can't comment on Windows RDP or the Windows OS. I never use Windows anywhere so I am sorry I can't help you here. I use AWS EC2 instances running either Debian or Ubuntu. And all access is via SSH shell. Code is primarily python3 running in terminal or c…
  • I don't think that data is provided at all to anyone. Not even the HFT guys who have million-rupee colocation setups. The only way to do that would be to see the entire market depth and then try to backtrace how many orders were executed from the availab…
  • Regarding the second para "how we can differentiate the volume: how much is from taking the offer (buying volume) versus hitting the bid (selling volume)? " Your question is irrelevant - volume is for completed transactions - where a bid matched an …
  • I am not part of the Zerodha team. But I do know that tick data is raw data that comes from the exchange and is forwarded as is. So if it does not contain computed volume, that's on the exchange. If Zerodha takes each tick and tries to compute the volume…
  • At the start of the day your volume is zero. For the first minute you get all ticks from the first tick received to 09:15:59 (this condition can be written as 1)
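    A minimal sketch of deriving per-minute candle volume from the cumulative day volume carried in each tick (the volume_traded field name assumes full-mode ticks from the kiteconnect ticker):

        def minute_volume(ticks_in_minute, prev_cumulative):
            """ticks_in_minute: ticks whose exchange_timestamp falls inside the minute.
            prev_cumulative: cumulative day volume at the end of the previous minute (0 at open).
            Returns (volume for this minute, new cumulative)."""
            last_cumulative = ticks_in_minute[-1]["volume_traded"]
            return last_cumulative - prev_cumulative, last_cumulative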
  • The first thing is obviously the latency - the lesser the better. I am in Bangalore using ACT Fibernet which, if I am not mistaken, is rated second behind Spectranet. I am getting 1 to 1.5 ms response times to any of the Zerodha endpoints. The lower th…
  • Depends on how many instruments you want to subscribe to and what you want to do exactly. I track and process all ticks for nifty and banknifty. That's about 300-350 instruments and I run it on an m6a.2xlarge (8 cpu, 32gb, 200gb SSD) with considerable…
  • @spartacus I track banknifty and nifty options and futures. The way I prevent these issues is that I do not try to generate the symbolnames in code. I use the current day's instrument list as the source of truth. I just need three things - the latest i…
  • What do you mean by "I am running the code on cloud"? Which cloud is this? You won't have this issue on any of the major cloud service providers like GCP, Azure or AWS.
  • It's not a code or server issue. It's clearly a network issue where communication between your system and the server was disrupted. All broadband connections will have these packet drops / network brownout issues from time to time and if you want to…
  • You can subscribe to 1000 instruments per websocket connection. And each API key can be used to make 3 websocket connections independent of each other for a total of 3000 instruments. All this is clearly mentioned in the frequently asked questions …
    in Tick Data Comment by MAG September 2023
  • @ANL Just came across this. AWS is any day better. When you run on your local system the main issue is the broadband internet connection. There will always be network disruptions on home broadband connections. AWS runs in best-of-class datacente…
  • It is always steady at about 1.5 ticks per second or about 100 ticks per minute. Very rarely, during days of high volatility, would you get a few more ticks. By that I mean 2 to 2.5 ticks per second or 120-150 ticks per minute. But that's very rare, may…
  • Your ping response to Zerodha servers is too high. I am getting 0.4 ms from my AWS servers and 1.2 ms from my home broadband connection. Can you specify which broadband provider you are using and whether it is a fiber, coaxial or RJ45 connection? The onl…
  • Latency is the roundtrip time for your request to travel from your computer to the remote server (in this case Zerodha's OMS/RMS) plus the time taken for the server's response to travel back to your computer. Generally it is measured in millisecon…
  • As promised, here is one tick received (exchange_timestamp: datetime.datetime(2023, 9, 1, 15, 29, 59)) printed out using the prettyprinter module. This one tick received contains individual ticks for 168 instruments
  • I don't usually run continuous checks for repeat data. I only did that during initial testing and I found that they were distinct. Also whenever I have looked at individual tick data for any instrument, I don't remember having found duplicates ever…
  • @Saleem I have been using the kite ticker API for a few years now and my observation is that I receive about 37500 ticks per day, or to be precise between 37502 and 37509 ticks per day. That means 100 ticks per minute (375 trading minutes from 9:15 to …
  • Are you doing this on a cloud server or on your home internet? If it's home internet, please open a new terminal and set up a continuous ping to google.com or any such popular site and check for irregular TTL or brownouts where there is intermittent …
  • @rakeshr @sujith @Kailash Some of us use the API but do not log into the forum often. I am sure I am not the only one in this category. Therefore these kinds of changes should be communicated to the API subscribers via email. I just happened to …
  • @amitchugh I am a Unix/Linux professional. Unix/Linux is the best OS on the planet because it follows the tenet "Keep it simple, stupid", also abbreviated as KISS. Putting multiple functionality under one script will lead to unnecessary comp…
  • @amitchugh If you read my comments you will see that I use pandas to create the candles so it's simpler to reprocess the whole ticks list and recreate all candles and overwrite the previous candle data instead of segregating the data and trying to …
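    A minimal sketch of that pandas-based candle building, assuming a list of tick dicts with exchange_timestamp, last_price and volume_traded as delivered by the ticker in full mode:

        import pandas as pd

        def build_candles(ticks):
            df = pd.DataFrame(ticks)
            df["exchange_timestamp"] = pd.to_datetime(df["exchange_timestamp"])
            df = df.set_index("exchange_timestamp").sort_index()
            ohlc = df["last_price"].resample("1min").ohlc()
            # volume_traded is cumulative for the day, so per-minute volume is the difference
            last_cum = df["volume_traded"].resample("1min").last()
            ohlc["volume"] = last_cum.diff().fillna(last_cum)   # first minute's volume is its own cumulative
            return ohlc.dropna(subset=["open"])                  # drop minutes with no ticks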
  • @amitchugh It depends on how you are running your scripts and on what OS. I use Linux and the simplest way is to run each script in a separate terminal window. Another option is to start the scripts to run in background mode ( python myscript.py & …
  • @sujith Just a small clarification. You said the rate limit is based on API_KEY. I understand that if one creates multiple apps, each one has a different API_KEY. Does that mean that if one has multiple apps, they will have a higher overall cumul…
  • Thanks Sujith. I generate the token and commit it to git, which is then pulled to a remote box automatically. Worked like clockwork for months. Which is why I didn't even look there. It failed for the first time today!! Just goes to say you can never h…
  • Hmm, I guess this is not true. Because we do not receive all ticks. What we receive is generally an aggregate. If we were to receive full ticks then what you said would be true. But then if it were true, the bandwidth requirements would go through the…
  • Ok. But aside from why or how this is happening, the very fact that I am receiving 15-20+ ticks per second is unusual right? Or would this happen under certain circumstances?
  • Thanks @krtrader for your kind comments. I was only trying to help. Btw, I have done a lot more than sending 5 orders per second :-) And this is why I avoid commenting even when I know stuff because of the incessant bullying attitude that folks hav…
    in Rate Limits Comment by MAG July 2019
  • Dude, there is a restriction per second too. Let me explain it like this: If you start at 9:15:00 and start sending 5 orders every second, at the 39th second (count starts at 00) you will end up sending orders 196-200. You won't be able to pla…
    in Rate Limits Comment by MAG July 2019
  • @sujith I thought the order limit was 10K orders per day per client. I remember reading it in some discussion earlier.
  • @ZK2689 Great. In parallel I would still suggest monitoring network connection on the side. I caught issues with one provider and switched to another when I found that the first provider used to intermittently drop packets for a few seconds at …
  • @ZK2689 Since we connect to Zerodha servers over the internet, it could be intermittent network errors at your end too, or something as simple as a DNS lookup failure. Check your DNS setting and if it's your ISP's DNS servers, set it to Google public…
  • Buddy, If you expect help, you need to be a bit more specific. What do you mean by down? Are you having login issues? Are you logged in but having issues with some api calls? Which version of the API are you using? Which language? Which call / modul…
  • @sujith: yes, it's working now. What this means is that the redirect URL had a space for a couple of users, which shouldn't have been accepted in the first place. But then, as a dev, been there done that myself. I understand that you guys have been …
  • OK, if multiple folks are getting the same error, let's not post multiple queries and instead wait for the guys' response on one thread.
  • @sujith: One connection means 1000 instruments. And you made a comment that Zerodha plans to limit each api key to one connection only. Someone may want to subscribe to more than 1000 instruments - this can easily happen if I track all the futures fo…
  • If I try either kws.close() or kws.stop() or any combination thereof I get exceptions as follows: 1520571840 Fri, 09 Mar 2018 10:34:00:041181 INFO on_ticks(): Tick written to redis key batch1 and mongo collection (count:2135) last tick: 20…
  • @deepaksinghvi please check the link you have posted. It's irrelevant. Therefore it would be best to remove it. @mailboxofrafiq Check out https://github.com/mdeverdelhan/ta4j-origins Found it using a simple google search. Check the code thoroug…
  • @sauravkedia, script1 gets ticks from the websocket. For the V2 API it's a list of up to 200 stocks in each tick. Something like: tick = [ {"instrumentToken":1, time: 9:15:00.200, other data...}, {"instrumentToken":2, time: 9:15:00.200, other data...}, {…
  • @RP3436 You can in fact skip celery too. It's not needed. If you are subscribing to just 4-5 instruments, all you need is python, the kiteconnect library, redis and pandas. The efficiencies are all in the way you code. Unfortunately writing code is …
  • @sauravkedia: sorry for the late response to your reply on 27th Feb. First of all, I do not understand why it takes you 2 seconds to load data into pandas. The Zerodha websocket library gives you a list of dictionaries. Each dictionary in the list is a…
  • You want daily close prices only which means you need to extract daily ohlc and write it to csv. Is that correct?
    in Write to CSV Comment by MAG March 2018
  • @vishwash_yadav TagStr="02281" order_id=kws.order_place( exchange=NSE, tradingsymbol=ACC, transaction_type=BUY, # or SELL …
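    For reference, a minimal sketch of tagging an order with the current (v3) python client - the symbol, quantity and tag value here are illustrative:

        from kiteconnect import KiteConnect

        kite = KiteConnect(api_key="your_api_key")        # access token assumed to be set already

        order_id = kite.place_order(
            variety=kite.VARIETY_REGULAR,
            exchange=kite.EXCHANGE_NSE,
            tradingsymbol="ACC",
            transaction_type=kite.TRANSACTION_TYPE_BUY,
            quantity=1,
            product=kite.PRODUCT_MIS,
            order_type=kite.ORDER_TYPE_MARKET,
            tag="02281",                                  # short identifier that comes back with the order
        )
        print("placed:", order_id)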
  • @phantomdrake It would help if you write to a log file within your app, which will help you and others see what is happening with the code internally and at what stage it fails or throws exceptions. The exceptions can be logged too. This is a sample …
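    A minimal sketch of such logging with the standard logging module (the file name and format string are illustrative):

        import logging

        logging.basicConfig(
            filename="kite_app.log",
            level=logging.INFO,
            format="%(asctime)s %(levelname)s %(funcName)s(): %(message)s",
        )
        log = logging.getLogger(__name__)

        try:
            log.info("placing order ...")
            raise RuntimeError("simulated failure")    # stand-in for a real api call going wrong
        except Exception:
            log.exception("order placement failed")    # writes the full traceback to the log file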
  • Check for a syntax error if you are migrating from kite 2.0 to kite 3.0: def on_ticks( ws , ticks ) : # Kite 3.0 vs def on_tick( tick , ws ) : # Kite 2.0 And the callback assignment also changes: kws.on_ticks = on_ticks # Kite 3.0 instead of kws.on…
  • Will ping you later. Will try to help. But no guarantees. Depends on how much time it takes and how much you can figure out on your own.
  • @pradeepsajjan need to understand a few things about you so that folks can understand and help you better. A bit about you and where you stand in terms of education, market experience, coding experience etc will help us evaluate your level and help …
  • @sauravkedia : Interesting points put forward by you. I have a bunch of queries / suggestions. Will reply in detail when I am free later in the day.
  • This is a programming logic issue. Are you placing a bracket order or are you entering a specific stop loss order once your entry is executed? The logic would vary depending on the answer. You could put a tag on all related orders and then write…
  • @archulysses I haven't tried postgres. In fact I haven't used postgres for years now. However, I did read up on it and it seems it supports bson and the performance is pretty good. However, whether you want to use postgres or mongo would depend on your…
  • @vivek What does this error mean on_close(): ERROR : 1006, connection was closed uncleanly (I dropped the WebSocket TCP connection: not enough arguments for format string)
  • This has been resolved. Minor issues and RTFM multiple times helped solve the issue. It was kws.on_ticks = on_ticks # in kiteconnect 3.0 instead of kws.on_tick = on_tick # in kiteconnect2.0
  • Ignore this for the moment. I think I have got this sorted.
  • @Ashok121: To be safe I always log in at 8:31 or later. Once I started doing that, I never had issues with invalidated access tokens. That also means that if one wants to do any api-related work, one really can't do it for the one-hour window between…
  • Can you give a screenshot of the error? Or what is the command you used? Permission denied when installing any package typically means you are logged in as a regular user and you need to elevate user privileges by prefixing 'sudo' to whatever comman…
  • @sujith you mean migrating kite to kite3? :-) So in simpler terms: until the migration happens, although we can use kite3 for all purposes, in order to get the request token we will need to log into kite.zerodha.com. Is that understanding correct…
  • I have tried both. Redis is obviously faster to insert and retrieve data since it's in memory. But what many people do not speak about is that redis converts the data to a utf-8 encoded str. When you read data back in from redis, you need to convert …
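    A minimal sketch of that round-trip, showing that redis hands back bytes/strings which have to be converted (assuming the redis-py client; key names and values are illustrative):

        import redis

        r = redis.Redis(host="localhost", port=6379, db=0)

        r.set("ltp:RELIANCE", 2345.6)             # stored as a string internally
        raw = r.get("ltp:RELIANCE")               # returned as bytes, e.g. b'2345.6'
        ltp = float(raw)                          # convert back to the type you need

        # decode_responses=True gives str instead of bytes, but numeric casts are still on you
        r2 = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)
        print(type(r2.get("ltp:RELIANCE")))       # <class 'str'>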